Dec 03 06:45:14 crc systemd[1]: Starting Kubernetes Kubelet...
Dec 03 06:45:14 crc restorecon[4461]: Relabeled /var/lib/kubelet/config.json from system_u:object_r:unlabeled_t:s0 to system_u:object_r:container_var_lib_t:s0
Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/device-plugins not reset as customized by admin to system_u:object_r:container_file_t:s0
Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/device-plugins/kubelet.sock not reset as customized by admin to system_u:object_r:container_file_t:s0
Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/volumes/kubernetes.io~configmap/nginx-conf/..2025_02_23_05_40_35.4114275528/nginx.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/containers/networking-console-plugin/22e96971 not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/containers/networking-console-plugin/21c98286 not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/containers/networking-console-plugin/0f1869e1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c215,c682
Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/setup/46889d52 not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/setup/5b6a5969 not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c963
Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/setup/6c7921f5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c215,c682
Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/4804f443 not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/2a46b283 not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/a6b5573e not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/4f88ee5b not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/5a4eee4b not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c963
Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/cd87c521 not reset as customized by admin to system_u:object_r:container_file_t:s0:c215,c682
Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_33_42.2574241751 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_33_42.2574241751/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/38602af4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/1483b002 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/0346718b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/d3ed4ada not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/3bb473a5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/8cd075a9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/00ab4760 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/54a21c09 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c589,c726
Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/70478888 not reset as customized by admin to system_u:object_r:container_file_t:s0:c176,c499
Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/43802770 not reset as customized by admin to system_u:object_r:container_file_t:s0:c176,c499
Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/955a0edc not reset as customized by admin to system_u:object_r:container_file_t:s0:c176,c499
Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/bca2d009 not reset as customized by admin to system_u:object_r:container_file_t:s0:c140,c1009
Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/b295f9bd not reset as customized by admin to system_u:object_r:container_file_t:s0:c589,c726
Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/..2025_02_23_05_21_22.3617465230 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/..2025_02_23_05_21_22.3617465230/cnibincopy.sh not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/cnibincopy.sh not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/..2025_02_23_05_21_22.2050650026 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/..2025_02_23_05_21_22.2050650026/allowlist.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/allowlist.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/egress-router-binary-copy/bc46ea27 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/egress-router-binary-copy/5731fc1b not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/egress-router-binary-copy/5e1b2a3c not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/cni-plugins/943f0936 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/cni-plugins/3f764ee4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/cni-plugins/8695e3f9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/bond-cni-plugin/aed7aa86 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/bond-cni-plugin/c64d7448 not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/bond-cni-plugin/0ba16bd2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/routeoverride-cni/207a939f not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/routeoverride-cni/54aa8cdb not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/routeoverride-cni/1f5fa595 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni-bincopy/bf9c8153 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni-bincopy/47fba4ea not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni-bincopy/7ae55ce9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni/7906a268 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni/ce43fa69 not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni/7fc7ea3a not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/kube-multus-additional-cni-plugins/d8c38b7d not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/kube-multus-additional-cni-plugins/9ef015fb not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/kube-multus-additional-cni-plugins/b9db6a41 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c432,c991
Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/network-metrics-daemon/b1733d79 not reset as customized by admin to system_u:object_r:container_file_t:s0:c476,c820
Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/network-metrics-daemon/afccd338 not reset as customized by admin to system_u:object_r:container_file_t:s0:c272,c818
Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/network-metrics-daemon/9df0a185 not reset as customized by admin to system_u:object_r:container_file_t:s0:c432,c991
Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/kube-rbac-proxy/18938cf8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c476,c820
Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/kube-rbac-proxy/7ab4eb23 not reset as customized by admin to system_u:object_r:container_file_t:s0:c272,c818
Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/kube-rbac-proxy/56930be6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c432,c991
Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/env-overrides not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/env-overrides/..2025_02_23_05_21_35.630010865 not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/env-overrides/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/..2025_02_23_05_21_35.1088506337 not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/..2025_02_23_05_21_35.1088506337/ovnkube.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/ovnkube.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/kube-rbac-proxy/0d8e3722 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211
Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/kube-rbac-proxy/d22b2e76 not reset as customized by admin to system_u:object_r:container_file_t:s0:c382,c850
Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/kube-rbac-proxy/e036759f not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/2734c483 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211
Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/57878fe7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211
Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/3f3c2e58 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211
Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/375bec3e not reset as customized by admin to system_u:object_r:container_file_t:s0:c382,c850
Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/7bc41e08 not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/containers/download-server/48c7a72d not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/containers/download-server/4b66701f not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/containers/download-server/a5a1c202 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..2025_02_23_05_21_40.3350632666 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..2025_02_23_05_21_40.3350632666/additional-cert-acceptance-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..2025_02_23_05_21_40.3350632666/additional-pod-admission-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/additional-cert-acceptance-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/additional-pod-admission-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/env-overrides not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/env-overrides/..2025_02_23_05_21_40.1388695756 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/env-overrides/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/webhook/26f3df5b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/webhook/6d8fb21d not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/webhook/50e94777 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/208473b3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/ec9e08ba not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/3b787c39 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/208eaed5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/93aa3a2b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/3c697968 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/containers/network-check-target-container/ba950ec9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/containers/network-check-target-container/cb5cdb37 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/containers/network-check-target-container/f2df9827 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/..2025_02_23_05_22_30.473230615 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/..2025_02_23_05_22_30.473230615/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_24_06_22_02.1904938450 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_24_06_22_02.1904938450/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/machine-config-operator/fedaa673 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/machine-config-operator/9ca2df95 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/machine-config-operator/b2d7460e not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/kube-rbac-proxy/2207853c not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/kube-rbac-proxy/241c1c29 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/kube-rbac-proxy/2d910eaf not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/..2025_02_23_05_23_49.3726007728 not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/..2025_02_23_05_23_49.3726007728/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/..2025_02_23_05_23_49.841175008 not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/..2025_02_23_05_23_49.841175008/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.843437178 not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.843437178/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/c6c0f2e7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871
Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/399edc97 not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871
Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/8049f7cc not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871
Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/0cec5484 not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871
Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/312446d0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c406,c828
Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/8e56a35d not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.133159589 not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.133159589/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Dec 03 06:45:14 crc restorecon[4461]:
/var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511 Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511 Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/2d30ddb9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c380,c909 Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/eca8053d not reset as customized by admin to system_u:object_r:container_file_t:s0:c380,c909 Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/c3a25c9a not reset as customized by admin to system_u:object_r:container_file_t:s0:c168,c522 Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/b9609c22 not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511 Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c968,c969 Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/dns-operator/e8b0eca9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c106,c418 Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/dns-operator/b36a9c3f not reset as customized by admin to system_u:object_r:container_file_t:s0:c529,c711 Dec 03 06:45:14 crc restorecon[4461]: 
/var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/dns-operator/38af7b07 not reset as customized by admin to system_u:object_r:container_file_t:s0:c968,c969 Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/kube-rbac-proxy/ae821620 not reset as customized by admin to system_u:object_r:container_file_t:s0:c106,c418 Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/kube-rbac-proxy/baa23338 not reset as customized by admin to system_u:object_r:container_file_t:s0:c529,c711 Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/kube-rbac-proxy/2c534809 not reset as customized by admin to system_u:object_r:container_file_t:s0:c968,c969 Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3532625537 not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3532625537/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c661,c999 Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/59b29eae not reset as customized by admin to system_u:object_r:container_file_t:s0:c338,c381 Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/c91a8e4f not reset as customized by admin to system_u:object_r:container_file_t:s0:c338,c381 Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/4d87494a not reset as customized by admin to system_u:object_r:container_file_t:s0:c442,c857 Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/1e33ca63 not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/kube-rbac-proxy/8dea7be2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/kube-rbac-proxy/d0b04a99 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/kube-rbac-proxy/d84f01e7 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c12,c18 Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/package-server-manager/4109059b not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/package-server-manager/a7258a3e not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/package-server-manager/05bdf2b6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/f3261b51 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/315d045e not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/5fdcf278 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/d053f757 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/c2850dc7 not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/..2025_02_23_05_22_30.2390596521 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/..2025_02_23_05_22_30.2390596521/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/fcfb0b2b not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/c7ac9b7d not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:45:14 crc 
restorecon[4461]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/fa0c0d52 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/c609b6ba not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/2be6c296 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/89a32653 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/4eb9afeb not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/13af6efa not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/containers/olm-operator/b03f9724 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/containers/olm-operator/e3d105cc not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Dec 03 06:45:14 crc restorecon[4461]: 
/var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/containers/olm-operator/3aed4d83 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1906041176 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1906041176/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/0765fa6e not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/2cefc627 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c2,c18 Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/3dcc6345 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/365af391 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-SelfManagedHA-Default.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-SelfManagedHA-TechPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-SelfManagedHA-DevPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-Hypershift-TechPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Dec 03 06:45:14 crc restorecon[4461]: 
/var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-Hypershift-DevPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-Hypershift-Default.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-api/b1130c0f not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-api/236a5913 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-api/b9432e26 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/5ddb0e3f not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/986dc4fd not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/8a23ff9a not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c9,c12 Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/9728ae68 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/665f31d0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1255385357 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1255385357/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Dec 03 06:45:14 crc restorecon[4461]: 
/var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_23_57.573792656 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_23_57.573792656/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_23_05_22_30.3254245399 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_23_05_22_30.3254245399/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Dec 03 06:45:14 crc 
restorecon[4461]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/136c9b42 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/98a1575b not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/cac69136 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/5deb77a7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/2ae53400 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3608339744 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Dec 03 
06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3608339744/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/e46f2326 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/dc688d3c not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/3497c3cd not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/177eb008 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c2,c13
Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3819292994 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3819292994/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/af5a2afa not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/d780cb1f not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/49b0f374 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/26fbb125 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.3244779536 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.3244779536/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/cf14125a not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/b7f86972 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/e51d739c not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/88ba6a69 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/669a9acf not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/5cd51231 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/75349ec7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/15c26839 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/45023dcd not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/2bb66a50 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/kube-rbac-proxy/64d03bdd not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/kube-rbac-proxy/ab8e7ca0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/kube-rbac-proxy/bb9be25f not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.2034221258 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.2034221258/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/containers/cluster-image-registry-operator/9a0b61d3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/containers/cluster-image-registry-operator/d471b9d2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/containers/cluster-image-registry-operator/8cb76b8e not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/containers/catalog-operator/11a00840 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/containers/catalog-operator/ec355a92 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/containers/catalog-operator/992f735e not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14
Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1782968797 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14
Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1782968797/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14
Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14
Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14
Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14
Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/d59cdbbc not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14
Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/72133ff0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14
Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/c56c834c not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14
Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/d13724c7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14
Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/0a498258 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14
Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/containers/machine-config-server/fa471982 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/containers/machine-config-server/fc900d92 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/containers/machine-config-server/fa7d68da not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22
Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/migrator/4bacf9b4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22
Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/migrator/424021b1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22
Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/migrator/fc2e31a3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22
Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/graceful-termination/f51eefac not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22
Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/graceful-termination/c8997f2f not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22
Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/graceful-termination/7481f599 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22
Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22
Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/..2025_02_23_05_22_49.2255460704 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22
Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/..2025_02_23_05_22_49.2255460704/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22
Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22
Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22
Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22
Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/fdafea19 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22
Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/d0e1c571 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22
Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/ee398915 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22
Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/682bb6b8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22
Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920
Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/setup/a3e67855 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884
Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/setup/a989f289 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016
Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/setup/915431bd not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920
Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-ensure-env-vars/7796fdab not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884
Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-ensure-env-vars/dcdb5f19 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016
Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-ensure-env-vars/a3aaa88c not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920
Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-resources-copy/5508e3e6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884
Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-resources-copy/160585de not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016
Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-resources-copy/e99f8da3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920
Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcdctl/8bc85570 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884
Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcdctl/a5861c91 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016
Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcdctl/84db1135 not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920
Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd/9e1a6043 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884
Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd/c1aba1c2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016
Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd/d55ccd6d not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920
Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-metrics/971cc9f6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884
Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-metrics/8f2e3dcf not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016
Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-metrics/ceb35e9c not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920
Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-readyz/1c192745 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884
Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-readyz/5209e501 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016
Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-readyz/f83de4df not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920
Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-rev/e7b978ac not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884
Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-rev/c64304a1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016
Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-rev/5384386b not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920
Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c268,c620
Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/multus-admission-controller/cce3e3ff not reset as customized by admin to system_u:object_r:container_file_t:s0:c435,c756
Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/multus-admission-controller/8fb75465 not reset as customized by admin to system_u:object_r:container_file_t:s0:c268,c620
Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/kube-rbac-proxy/740f573e not reset as customized by admin to system_u:object_r:container_file_t:s0:c435,c756
Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/kube-rbac-proxy/32fd1134 not reset as customized by admin to system_u:object_r:container_file_t:s0:c268,c620
Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24
Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/containers/serve-healthcheck-canary/0a861bd3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24
Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/containers/serve-healthcheck-canary/80363026 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24
Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/containers/serve-healthcheck-canary/bfa952a8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24
Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158
Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_23_05_33_31.2122464563 not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158
Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_23_05_33_31.2122464563/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158
Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158
Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158
Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158
Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/config/..2025_02_23_05_33_31.333075221 not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158
Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158
Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158
Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/kube-rbac-proxy/793bf43d not reset as customized by admin to system_u:object_r:container_file_t:s0:c381,c387
Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/kube-rbac-proxy/7db1bb6e not reset as customized by admin to system_u:object_r:container_file_t:s0:c142,c438
Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/kube-rbac-proxy/4f6a0368 not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158
Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/c12c7d86 not reset as customized by admin to system_u:object_r:container_file_t:s0:c381,c387
Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/36c4a773 not reset as customized by admin to system_u:object_r:container_file_t:s0:c142,c438
Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/4c1e98ae not reset as customized by admin to system_u:object_r:container_file_t:s0:c142,c438
Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/a4c8115c not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158
Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980
Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/setup/7db1802e not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980
Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver/a008a7ab not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980
Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-cert-syncer/2c836bac not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980
Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-cert-regeneration-controller/0ce62299 not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980
Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-insecure-readyz/945d2457 not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980
Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-check-endpoints/7d5c1dd8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980
Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/advanced-cluster-management not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/advanced-cluster-management/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-broker-rhel8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-broker-rhel8/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-online not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-online/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams-console not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams-console/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq7-interconnect-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq7-interconnect-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-automation-platform-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-automation-platform-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-cloud-addons-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-cloud-addons-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry-3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry-3/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-load-balancer-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-load-balancer-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-businessautomation-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-businessautomation-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-kogito-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-kogito-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator/index.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/businessautomation-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/businessautomation-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cephcsi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cephcsi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cincinnati-operator not reset as customized by admin
to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cincinnati-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-kube-descheduler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-kube-descheduler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-logging not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-logging/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-observability-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-observability-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:45:14 crc restorecon[4461]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/compliance-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/compliance-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/container-security-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/container-security-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/costmanagement-metrics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/costmanagement-metrics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cryostat-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:45:14 crc restorecon[4461]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cryostat-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datagrid not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datagrid/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devspaces not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devspaces/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devworkspace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devworkspace-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dpu-network-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dpu-network-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eap not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eap/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-dns-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-dns-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:45:14 crc restorecon[4461]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/file-integrity-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/file-integrity-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-apicurito not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-apicurito/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-console not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-console/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-online not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-online/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gatekeeper-operator-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gatekeeper-operator-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jws-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:45:14 crc restorecon[4461]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jws-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management-hub not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management-hub/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kiali-ossm not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kiali-ossm/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubevirt-hyperconverged not reset as customized by 
admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubevirt-hyperconverged/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logic-operator-rhel8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logic-operator-rhel8/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lvms-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lvms-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:45:14 crc restorecon[4461]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mcg-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mcg-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mta-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mta-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtc-operator/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtr-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtr-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-engine not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-engine/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:45:14 crc restorecon[4461]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-observability-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-observability-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:45:14 crc restorecon[4461]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-client-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-client-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-csi-addons-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-csi-addons-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-multicluster-orchestrator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-multicluster-orchestrator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-prometheus-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-prometheus-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-cluster-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-cluster-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-hub-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:45:14 crc restorecon[4461]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-hub-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator/bundle-v1.15.0.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator/channel.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator/package.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-custom-metrics-autoscaler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-custom-metrics-autoscaler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:45:14 crc restorecon[4461]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-gitops-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-gitops-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-pipelines-operator-rh not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-pipelines-operator-rh/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-secondary-scheduler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-secondary-scheduler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-bridge-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-bridge-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/recipe not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/recipe/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-camel-k not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-camel-k/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-hawtio-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-hawtio-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redhat-oadp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redhat-oadp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rh-service-binding-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rh-service-binding-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhacs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhacs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhbk-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhbk-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhdh not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhdh/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-prometheus-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-prometheus-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhpam-kogito-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhpam-kogito-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhsso-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhsso-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rook-ceph-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rook-ceph-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/run-once-duration-override-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/run-once-duration-override-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sandboxed-containers-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sandboxed-containers-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/security-profiles-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/security-profiles-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/serverless-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/serverless-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-registry-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-registry-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator3/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/submariner not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/submariner/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tang-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tang-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustee-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustee-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volsync-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volsync-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/web-terminal not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/web-terminal/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-utilities/bc8d0691 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-utilities/6b76097a not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-utilities/34d1af30 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-content/312ba61c not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-content/645d5dd1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-content/16e825f0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/registry-server/4cf51fc9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/registry-server/2a23d348 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/registry-server/075dbd49 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986
Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986
Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986
Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986
Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986
Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986
Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986
Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986
Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986
Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986
Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/containers/node-ca/dd585ddd not reset as customized by admin to system_u:object_r:container_file_t:s0:c377,c642
Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/containers/node-ca/17ebd0ab not reset as customized by admin to system_u:object_r:container_file_t:s0:c338,c343
Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/containers/node-ca/005579f4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986
Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897
Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_23_05_23_11.449897510 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897
Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_23_05_23_11.449897510/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897
Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897
Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897
Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897
Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_23_05_23_11.1287037894 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897
Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897
Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897
Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/..2025_02_23_05_23_11.1301053334 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897
Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/..2025_02_23_05_23_11.1301053334/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897
Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897
Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897
Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897
Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/fix-audit-permissions/bf5f3b9c not reset as customized by admin to system_u:object_r:container_file_t:s0:c49,c263
Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/fix-audit-permissions/af276eb7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c701
Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/fix-audit-permissions/ea28e322 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897
Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/oauth-apiserver/692e6683 not reset as customized by admin to system_u:object_r:container_file_t:s0:c49,c263
Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/oauth-apiserver/871746a7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c701
Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/oauth-apiserver/4eb2e958 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897
Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/..2025_02_24_06_09_06.2875086261 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/..2025_02_24_06_09_06.2875086261/console-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/console-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_09_06.286118152 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_09_06.286118152/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/..2025_02_24_06_09_06.3865795478 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/..2025_02_24_06_09_06.3865795478/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/..2025_02_24_06_09_06.584414814 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/..2025_02_24_06_09_06.584414814/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/containers/console/ca9b62da not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/containers/console/0edd6fce not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837 not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/openshift-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/openshift-controller-manager.openshift-global-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/openshift-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/openshift-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/openshift-controller-manager.openshift-global-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/openshift-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.1071801880 not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.1071801880/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/..2025_02_24_06_20_07.2494444877 not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/..2025_02_24_06_20_07.2494444877/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/tls-ca-bundle.pem not reset as customized by admin to
system_u:object_r:container_file_t:s0:c14,c22 Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/containers/controller-manager/89b4555f not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/..2025_02_23_05_23_22.4071100442 not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/..2025_02_23_05_23_22.4071100442/Corefile not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/Corefile not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/dns/655fcd71 not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c457,c841 Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/dns/0d43c002 not reset as customized by admin to system_u:object_r:container_file_t:s0:c55,c1022 Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/dns/e68efd17 not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/kube-rbac-proxy/9acf9b65 not reset as customized by admin to system_u:object_r:container_file_t:s0:c457,c841 Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/kube-rbac-proxy/5ae3ff11 not reset as customized by admin to system_u:object_r:container_file_t:s0:c55,c1022 Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/kube-rbac-proxy/1e59206a not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/containers/dns-node-resolver/27af16d1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c304,c1017 Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/containers/dns-node-resolver/7918e729 not reset as customized by admin to system_u:object_r:container_file_t:s0:c853,c893 Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/containers/dns-node-resolver/5d976d0e not reset as customized by admin to system_u:object_r:container_file_t:s0:c585,c981 Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c5,c25 Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/..2025_02_23_05_38_56.1112187283 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/..2025_02_23_05_38_56.1112187283/controller-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/controller-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_38_56.2839772658 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_38_56.2839772658/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c5,c25 Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/d7f55cbb not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/f0812073 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/1a56cbeb not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/7fdd437e not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/cdfb5652 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_24_06_17_29.3844392896 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c336,c787 Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_24_06_17_29.3844392896/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/..2025_02_24_06_17_29.848549803 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/..2025_02_24_06_17_29.848549803/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 
Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/..2025_02_24_06_17_29.780046231 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/..2025_02_24_06_17_29.780046231/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 03 
06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 03 06:45:14 crc restorecon[4461]: 
/var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_17_29.2729721485 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_17_29.2729721485/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/containers/fix-audit-permissions/fb93119e not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/containers/openshift-apiserver/f1e8fc0e not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/containers/openshift-apiserver-check-endpoints/218511f3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 03 06:45:14 crc restorecon[4461]: 
/var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes/kubernetes.io~empty-dir/tmpfs not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes/kubernetes.io~empty-dir/tmpfs/k8s-webhook-server not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes/kubernetes.io~empty-dir/tmpfs/k8s-webhook-server/serving-certs not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/containers/packageserver/ca8af7b3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/containers/packageserver/72cc8a75 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/containers/packageserver/6e8a3760 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/..2025_02_23_05_27_30.557428972 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Dec 03 06:45:14 crc 
restorecon[4461]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/..2025_02_23_05_27_30.557428972/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/4c3455c0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/2278acb0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/4b453e4f not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/3ec09bda not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c10,c16 Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_24_06_25_03.422633132 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_24_06_25_03.422633132/anchors not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_24_06_25_03.422633132/anchors/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/anchors not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 06:45:14 crc restorecon[4461]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c10,c16 Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/edk2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/edk2/cacerts.bin not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/java not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/java/cacerts not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/openssl not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/openssl/ca-bundle.trust.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 06:45:14 crc 
restorecon[4461]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/email-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/objsign-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2ae6433e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fde84897.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/75680d2e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/openshift-service-serving-signer_1740288168.pem not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/facfc4fa.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8f5a969c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CFCA_EV_ROOT.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9ef4a08a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ingress-operator_1740288202.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2f332aed.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/248c8271.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 06:45:14 crc restorecon[4461]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d10a21f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ACCVRAIZ1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a94d09e5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3c9a4d3b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/40193066.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AC_RAIZ_FNMT-RCM.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cd8c0d63.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b936d1c6.0 not reset as customized by 
admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CA_Disig_Root_R2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4fd49c6c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AC_RAIZ_FNMT-RCM_SERVIDORES_SEGUROS.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b81b93f0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f9a69fa.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certigna.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b30d5fda.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 06:45:14 crc restorecon[4461]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ANF_Secure_Server_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b433981b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/93851c9e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9282e51c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e7dd1bc4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Actalis_Authentication_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/930ac5d2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 06:45:14 crc restorecon[4461]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f47b495.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e113c810.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5931b5bc.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Commercial.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2b349938.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e48193cf.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/302904dd.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a716d4ed.0 not reset as customized 
by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Networking.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/93bc0acc.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/86212b19.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certigna_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Premium.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b727005e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dbc54cab.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 06:45:14 crc restorecon[4461]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f51bb24c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c28a8a30.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Premium_ECC.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9c8dfbd4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ccc52f49.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cb1c3204.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ce5e74ef.0 not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fd08c599.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_Trusted_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6d41d539.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fb5fa911.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e35234b1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 06:45:14 crc restorecon[4461]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8cb5ee0f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a7c655d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f8fc53da.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/de6d66f3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d41b5e2a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/41a3f684.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1df5a75f.0 not reset as customized by 
admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Atos_TrustedRoot_2011.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e36a6752.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b872f2b4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9576d26b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/228f89db.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Atos_TrustedRoot_Root_CA_ECC_TLS_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fb717492.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 06:45:14 crc restorecon[4461]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2d21b73c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0b1b94ef.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/595e996b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Atos_TrustedRoot_Root_CA_RSA_TLS_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9b46e03d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/128f4b91.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Buypass_Class_3_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 06:45:14 crc restorecon[4461]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/81f2d2b1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Autoridad_de_Certificacion_Firmaprofesional_CIF_A62634068.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3bde41ac.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d16a5865.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_EC-384_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/BJCA_Global_Root_CA1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0179095f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 06:45:14 crc restorecon[4461]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ffa7f1eb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9482e63a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d4dae3dd.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/BJCA_Global_Root_CA2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3e359ba6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7e067d03.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/95aff9e3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d7746a63.0 not reset as customized 
by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Baltimore_CyberTrust_Root.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/653b494a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3ad48a91.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_Trusted_Network_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Buypass_Class_2_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/54657681.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/82223c44.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 06:45:14 crc restorecon[4461]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e8de2f56.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2d9dafe4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d96b65e2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ee64a828.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/COMODO_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/40547a79.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5a3f0ff8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a780d93.0 not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/34d996fb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/COMODO_ECC_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/eed8c118.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/89c02a45.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certainly_Root_R1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b1159c4c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/COMODO_RSA_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 06:45:14 crc restorecon[4461]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d6325660.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d4c339cb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8312c4c1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certainly_Root_E1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8508e720.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5fdd185d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/48bec511.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/69105f4f.0 not reset as customized by 
admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0b9bc432.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_Trusted_Network_CA_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/32888f65.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_ECC_Root-01.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6b03dec0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 06:45:14 crc restorecon[4461]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/219d9499.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_ECC_Root-02.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5acf816d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cbf06781.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_RSA_Root-01.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dc99f41e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 06:45:14 crc restorecon[4461]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_RSA_Root-02.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AAA_Certificate_Services.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/985c1f52.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8794b4e3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_BR_Root_CA_1_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e7c037b4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 06:45:14 crc restorecon[4461]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ef954a4e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_EV_Root_CA_1_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2add47b6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/90c5a3c8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_Root_Class_3_CA_2_2009.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b0f3e76e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/53a1b57a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 06:45:14 crc restorecon[4461]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_Root_Class_3_CA_2_EV_2009.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Assured_ID_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5ad8a5d6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/68dd7389.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Assured_ID_Root_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9d04f354.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 06:45:14 crc restorecon[4461]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d6437c3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/062cdee6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bd43e1dd.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Assured_ID_Root_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7f3d5d1d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c491639e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign_Root_E46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 06:45:14 crc restorecon[4461]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Global_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3513523f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/399e7759.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/feffd413.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d18e9066.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Global_Root_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/607986c7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c90bc37d.0 not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1b0f7e5c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e08bfd1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Global_Root_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dd8e9d41.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ed39abd0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a3418fda.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bc3f2570.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 06:45:14 crc restorecon[4461]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_High_Assurance_EV_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/244b5494.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/81b9768f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4be590e0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_TLS_ECC_P384_Root_G5.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9846683b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 06:45:14 crc restorecon[4461]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/252252d2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e8e7201.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ISRG_Root_X1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_TLS_RSA4096_Root_G5.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d52c538d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c44cc0c0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign_Root_R46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 06:45:14 crc restorecon[4461]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Trusted_Root_G4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/75d1b2ed.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a2c66da8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ecccd8db.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust.net_Certification_Authority__2048_.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/aee5f10d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 06:45:14 crc restorecon[4461]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3e7271e8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b0e59380.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4c3982f2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6b99d060.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bf64f35b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0a775a30.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/002c0b4f.0 not reset 
as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cc450945.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority_-_EC1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/106f3e4d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b3fb433b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4042bcee.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 06:45:14 crc 
restorecon[4461]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/02265526.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/455f1b52.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0d69c7e1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9f727ac7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority_-_G4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5e98733a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f0cd152c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 06:45:14 crc restorecon[4461]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dc4d6a89.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6187b673.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/FIRMAPROFESIONAL_CA_ROOT-A_WEB.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ba8887ce.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/068570d1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f081611a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/48a195d8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GDCA_TrustAUTH_R5_ROOT.pem 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0f6fa695.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ab59055e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b92fd57f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GLOBALTRUST_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fa5da96b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1ec40989.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7719f463.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 06:45:14 crc restorecon[4461]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1001acf7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f013ecaf.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/626dceaf.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c559d742.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1d3472b9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9479c8c3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a81e292b.0 not reset as customized by admin 
to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4bfab552.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Go_Daddy_Class_2_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Sectigo_Public_Server_Authentication_Root_E46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Go_Daddy_Root_Certificate_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e071171e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/57bcb2da.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/HARICA_TLS_ECC_Root_CA_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 
Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ab5346f4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5046c355.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/HARICA_TLS_RSA_Root_CA_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/865fbdf9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/da0cfd1d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/85cde254.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Hellenic_Academic_and_Research_Institutions_ECC_RootCA_2015.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 06:45:14 crc restorecon[4461]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cbb3f32b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SecureSign_RootCA11.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Hellenic_Academic_and_Research_Institutions_RootCA_2015.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5860aaa6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/31188b5e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/HiPKI_Root_CA_-_G1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c7f1359b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 06:45:14 crc restorecon[4461]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f15c80c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Hongkong_Post_Root_CA_3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/09789157.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ISRG_Root_X2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/18856ac4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e09d511.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/IdenTrust_Commercial_Root_CA_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 06:45:14 crc restorecon[4461]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cf701eeb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d06393bb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/IdenTrust_Public_Sector_Root_CA_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/10531352.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Izenpe.com.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SecureTrust_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b0ed035a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 06:45:14 crc restorecon[4461]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Microsec_e-Szigno_Root_CA_2009.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8160b96c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e8651083.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2c63f966.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Security_Communication_RootCA2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Microsoft_ECC_Root_Certificate_Authority_2017.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d89cda1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 06:45:14 crc restorecon[4461]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/01419da9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_TLS_RSA_Root_CA_2022.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b7a5b843.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Microsoft_RSA_Root_Certificate_Authority_2017.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bf53fb88.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9591a472.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3afde786.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 06:45:14 crc restorecon[4461]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SwissSign_Gold_CA_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/NAVER_Global_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3fb36b73.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d39b0a2c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a89d74c2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cd58d51e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b7db1890.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 06:45:14 crc restorecon[4461]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/NetLock_Arany__Class_Gold__F__tan__s__tv__ny.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/988a38cb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/60afe812.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f39fc864.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5443e9e3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/OISTE_WISeKey_Global_Root_GB_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e73d606e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 06:45:14 crc restorecon[4461]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dfc0fe80.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b66938e9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e1eab7c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/OISTE_WISeKey_Global_Root_GC_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/773e07ad.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3c899c73.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d59297b8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ddcda989.0 not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_1_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/749e9e03.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/52b525c7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Security_Communication_RootCA3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d7e8dc79.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a819ef2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 06:45:14 crc restorecon[4461]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/08063a00.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6b483515.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_2_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/064e0aa9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1f58a078.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6f7454b3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7fa05551.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_3.pem not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/76faf6c0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9339512a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f387163d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ee37c333.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_3_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e18bfb83.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e442e424.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 06:45:14 crc restorecon[4461]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fe8a2cd8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/23f4c490.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5cd81ad7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_EV_Root_Certification_Authority_ECC.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f0c70a8d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7892ad52.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SZAFIR_ROOT_CA2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 06:45:14 crc restorecon[4461]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4f316efb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_EV_Root_Certification_Authority_RSA_R2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/06dc52d5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/583d0756.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Sectigo_Public_Server_Authentication_Root_R46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_Root_Certification_Authority_ECC.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0bf05006.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 06:45:14 crc restorecon[4461]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/88950faa.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9046744a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3c860d51.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_Root_Certification_Authority_RSA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6fa5da56.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/33ee480d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Secure_Global_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 06:45:14 crc restorecon[4461]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/63a2c897.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_TLS_ECC_Root_CA_2022.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bdacca6f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ff34af3f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dbff3a01.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Security_Communication_ECC_RootCA1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_Root_CA_-_C1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 06:45:14 crc restorecon[4461]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Starfield_Class_2_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/406c9bb1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Starfield_Root_Certificate_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_ECC_Root_CA_-_C3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Starfield_Services_Root_Certificate_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SwissSign_Silver_CA_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/99e1b953.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 06:45:14 crc restorecon[4461]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/T-TeleSec_GlobalRoot_Class_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/vTrus_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/T-TeleSec_GlobalRoot_Class_3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/14bc7599.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TUBITAK_Kamu_SM_SSL_Kok_Sertifikasi_-_Surum_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TWCA_Global_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a3adc42.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 06:45:14 crc restorecon[4461]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TWCA_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f459871d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Telekom_Security_TLS_ECC_Root_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_Root_CA_-_G1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Telekom_Security_TLS_RSA_Root_2023.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TeliaSonera_Root_CA_v1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Telia_Root_CA_v2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 06:45:14 crc restorecon[4461]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8f103249.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f058632f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ca-certificates.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TrustAsia_Global_Root_CA_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9bf03295.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/98aaf404.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 06:45:14 crc restorecon[4461]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TrustAsia_Global_Root_CA_G4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1cef98f5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/073bfcc5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2923b3f9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Trustwave_Global_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f249de83.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/edcbddb5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 06:45:14 crc restorecon[4461]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_ECC_Root_CA_-_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Trustwave_Global_ECC_P256_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9b5697b0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1ae85e5e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b74d2bd5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Trustwave_Global_ECC_P384_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d887a5bb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 06:45:14 crc restorecon[4461]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9aef356c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TunTrust_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fd64f3fc.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e13665f9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/UCA_Extended_Validation_Root.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0f5dc4f3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/da7377f6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 06:45:14 crc restorecon[4461]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/UCA_Global_G2_Root.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c01eb047.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/304d27c3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ed858448.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/USERTrust_ECC_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f30dd6ad.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/04f60c28.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 06:45:14 crc restorecon[4461]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/vTrus_ECC_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/USERTrust_RSA_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fc5a8f99.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/35105088.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ee532fd5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/XRamp_Global_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/706f604c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 06:45:14 crc restorecon[4461]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/76579174.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/certSIGN_ROOT_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d86cdd1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/882de061.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/certSIGN_ROOT_CA_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f618aec.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a9d40e02.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e-Szigno_Root_CA_2017.pem 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e868b802.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/83e9984f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ePKI_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ca6e4ad9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9d6523ce.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4b718d9b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/869fbf79.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 06:45:14 crc restorecon[4461]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/containers/registry/f8d22bdb not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator/6e8bbfac not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator/54dd7996 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator/a4f1bb05 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator-watch/207129da not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator-watch/c1df39e1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator-watch/15b8f1cd not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Dec 03 06:45:14 crc restorecon[4461]: 
/var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3523263858 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3523263858/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/..2025_02_23_05_27_49.3256605594 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/..2025_02_23_05_27_49.3256605594/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Dec 03 06:45:14 crc restorecon[4461]: 
/var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/kube-rbac-proxy/77bd6913 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/kube-rbac-proxy/2382c1b1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/kube-rbac-proxy/704ce128 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/machine-api-operator/70d16fe0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/machine-api-operator/bfb95535 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/machine-api-operator/57a8e8e2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Dec 03 06:45:14 crc restorecon[4461]: 
/var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3413793711 not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3413793711/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/containers/kube-apiserver-operator/1b9d3e5e not reset as customized by admin to system_u:object_r:container_file_t:s0:c107,c917 Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/containers/kube-apiserver-operator/fddb173c not reset as customized by admin to system_u:object_r:container_file_t:s0:c202,c983 Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/containers/kube-apiserver-operator/95d3c6c4 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c219,c404 Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/containers/check-endpoints/bfb5fff5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/containers/check-endpoints/2aef40aa not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/containers/check-endpoints/c0391cad not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager/1119e69d not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007 Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager/660608b4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager/8220bd53 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/cluster-policy-controller/85f99d5c not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007 Dec 03 06:45:14 crc restorecon[4461]: 
/var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/cluster-policy-controller/4b0225f6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-cert-syncer/9c2a3394 not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007 Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-cert-syncer/e820b243 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-recovery-controller/1ca52ea0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007 Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-recovery-controller/e6988e45 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Dec 03 06:45:14 crc restorecon[4461]: 
/var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/..2025_02_24_06_09_21.2517297950 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/..2025_02_24_06_09_21.2517297950/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/machine-config-controller/6655f00b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/machine-config-controller/98bc3986 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/machine-config-controller/08e3458a not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/kube-rbac-proxy/2a191cb0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/kube-rbac-proxy/6c4eeefb not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/kube-rbac-proxy/f61a549c not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c4,c17 Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553 Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/hostpath-provisioner/24891863 not reset as customized by admin to system_u:object_r:container_file_t:s0:c37,c572 Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/hostpath-provisioner/fbdfd89c not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553 Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/liveness-probe/9b63b3bc not reset as customized by admin to system_u:object_r:container_file_t:s0:c37,c572 Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/liveness-probe/8acde6d6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553 Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/node-driver-registrar/59ecbba3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553 Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/csi-provisioner/685d4be3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553 Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c2,c23 Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300/openshift-route-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300/openshift-route-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/openshift-route-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/openshift-route-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Dec 03 06:45:14 crc restorecon[4461]: 
/var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.2950937851 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.2950937851/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/containers/route-controller-manager/feaea55e not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abinitio-runtime-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abinitio-runtime-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/accuknox-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/accuknox-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aci-containers-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aci-containers-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:45:14 crc restorecon[4461]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airlock-microgateway not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airlock-microgateway/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ako-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ako-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloy not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloy/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anchore-engine not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 
03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anchore-engine/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-cloud-operator 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-cloud-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:45:14 crc restorecon[4461]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-dcap-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-dcap-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cfm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cfm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium-enterprise not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium-enterprise/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloud-native-postgresql not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloud-native-postgresql/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudera-streams-messaging-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:45:14 crc restorecon[4461]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudera-streams-messaging-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudnative-pg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudnative-pg/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cnfv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cnfv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/conjur-follower-operator not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/conjur-follower-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/coroot-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/coroot-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cte-k8s-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cte-k8s-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:45:14 crc restorecon[4461]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-deploy-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-deploy-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-release-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:45:14 crc restorecon[4461]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-release-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edb-hcp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edb-hcp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-eck-operator-certified not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-eck-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/federatorai-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/federatorai-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fujitsu-enterprise-postgres-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fujitsu-enterprise-postgres-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:45:14 crc restorecon[4461]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/function-mesh not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/function-mesh/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/harness-gitops-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/harness-gitops-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hcp-terraform-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hcp-terraform-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hpe-ezmeral-csi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hpe-ezmeral-csi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-application-gateway-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-application-gateway-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:45:14 crc restorecon[4461]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-directory-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-directory-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-dr-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-dr-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:45:14 crc restorecon[4461]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-licensing-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-licensing-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-sds-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-sds-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infrastructure-asset-orchestrator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infrastructure-asset-orchestrator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:45:14 crc restorecon[4461]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-device-plugins-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-device-plugins-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-kubernetes-power-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-kubernetes-power-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-openshift-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-openshift-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8s-triliovault not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8s-triliovault/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:45:14 crc restorecon[4461]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-ati-updates not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-ati-updates/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-framework not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-framework/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-ingress not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-ingress/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-licensing not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-licensing/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-sso not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-sso/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-keycloak-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-keycloak-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-load-core not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-load-core/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-loadcore-agents not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:45:14 crc restorecon[4461]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-loadcore-agents/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nats-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nats-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nimbusmosaic-dusim not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nimbusmosaic-dusim/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-rest-api-browser-v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-rest-api-browser-v1/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:45:14 crc restorecon[4461]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-appsec not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-appsec/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-core not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-core/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-db/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-diagnostics not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-diagnostics/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-logging not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-logging/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-migration not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-migration/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-msg-broker not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-msg-broker/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-notifications not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:45:14 crc restorecon[4461]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-notifications/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-stats-dashboards not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-stats-dashboards/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-storage not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-storage/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-test-core not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-test-core/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-ui not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-ui/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-websocket-service not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-websocket-service/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kong-gateway-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kong-gateway-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubearmor-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubearmor-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:45:14 crc restorecon[4461]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lenovo-locd-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lenovo-locd-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memcached-operator-ogaye not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memcached-operator-ogaye/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memory-machine-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memory-machine-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:45:14 crc restorecon[4461]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-enterprise not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-enterprise/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netapp-spark-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netapp-spark-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:45:14 crc restorecon[4461]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-adm-agent-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-adm-agent-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-repository-ha-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:45:14 crc restorecon[4461]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-repository-ha-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nginx-ingress-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nginx-ingress-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nim-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nim-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxiq-operator-certified not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxiq-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxrm-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxrm-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odigos-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odigos-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/open-liberty-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/open-liberty-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:45:14 crc restorecon[4461]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftartifactoryha-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftartifactoryha-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftxray-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftxray-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/operator-certification-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/operator-certification-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:45:14 crc restorecon[4461]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pmem-csi-operator-os not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pmem-csi-operator-os/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo-certified not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-component-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-component-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:45:14 crc restorecon[4461]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-fabric-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-fabric-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sanstoragecsi-operator-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sanstoragecsi-operator-bundle/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/smilecdr-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/smilecdr-operator/catalog.json not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sriov-fec not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sriov-fec/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-commons-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-commons-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-zookeeper-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-zookeeper-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:45:14 crc restorecon[4461]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-tsc-client-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-tsc-client-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tawon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tawon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tigera-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tigera-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-secrets-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-secrets-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vcp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vcp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/webotx-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/webotx-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:45:14 crc restorecon[4461]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:45:14 
crc restorecon[4461]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:45:14 crc 
restorecon[4461]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-utilities/63709497 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-utilities/d966b7fd not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-utilities/f5773757 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-content/81c9edb9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-content/57bf57ee not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-content/86f5e6aa not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/registry-server/0aabe31d not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/registry-server/d2af85c2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:45:14 crc restorecon[4461]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/registry-server/09d157d9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:45:14 crc restorecon[4461]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acm-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 
Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acm-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acmpca-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acmpca-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigateway-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigateway-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigatewayv2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigatewayv2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:45:14 crc restorecon[4461]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-applicationautoscaling-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-applicationautoscaling-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-athena-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-athena-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudfront-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudfront-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudtrail-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:45:14 crc restorecon[4461]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudtrail-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatch-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatch-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatchlogs-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatchlogs-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-documentdb-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-documentdb-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:45:14 crc restorecon[4461]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-dynamodb-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-dynamodb-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ec2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ec2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecr-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecr-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecs-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecs-controller/catalog.json not reset as customized by admin 
to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-efs-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-efs-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eks-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eks-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elasticache-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elasticache-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elbv2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:45:14 crc restorecon[4461]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elbv2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-emrcontainers-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:45:14 crc restorecon[4461]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-emrcontainers-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:45:15 crc restorecon[4461]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eventbridge-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:45:15 crc restorecon[4461]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eventbridge-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:45:15 crc restorecon[4461]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-iam-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:45:15 crc restorecon[4461]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-iam-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:45:15 crc restorecon[4461]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kafka-controller 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:45:15 crc restorecon[4461]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kafka-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:45:15 crc restorecon[4461]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-keyspaces-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:45:15 crc restorecon[4461]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-keyspaces-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:45:15 crc restorecon[4461]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kinesis-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:45:15 crc restorecon[4461]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kinesis-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:45:15 crc restorecon[4461]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kms-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:45:15 crc restorecon[4461]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kms-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:45:15 crc restorecon[4461]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-lambda-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:45:15 crc restorecon[4461]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-lambda-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:45:15 crc restorecon[4461]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-memorydb-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:45:15 crc restorecon[4461]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-memorydb-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:45:15 crc restorecon[4461]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-mq-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:45:15 crc restorecon[4461]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-mq-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:45:15 crc restorecon[4461]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-networkfirewall-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:45:15 crc restorecon[4461]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-networkfirewall-controller/catalog.json not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:45:15 crc restorecon[4461]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-opensearchservice-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:45:15 crc restorecon[4461]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-opensearchservice-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:45:15 crc restorecon[4461]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-organizations-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:45:15 crc restorecon[4461]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-organizations-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:45:15 crc restorecon[4461]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-pipes-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:45:15 crc restorecon[4461]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-pipes-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:45:15 crc restorecon[4461]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-prometheusservice-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:45:15 crc restorecon[4461]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-prometheusservice-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:45:15 crc restorecon[4461]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-rds-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:45:15 crc restorecon[4461]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-rds-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:45:15 crc restorecon[4461]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-recyclebin-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:45:15 crc restorecon[4461]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-recyclebin-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:45:15 crc restorecon[4461]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:45:15 crc restorecon[4461]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:45:15 crc restorecon[4461]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53resolver-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:45:15 crc restorecon[4461]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53resolver-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:45:15 crc restorecon[4461]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-s3-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:45:15 crc restorecon[4461]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-s3-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:45:15 crc restorecon[4461]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sagemaker-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:45:15 crc restorecon[4461]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sagemaker-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:45:15 crc restorecon[4461]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-secretsmanager-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:45:15 crc restorecon[4461]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-secretsmanager-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:45:15 crc restorecon[4461]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ses-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:45:15 crc restorecon[4461]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ses-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:45:15 crc restorecon[4461]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sfn-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:45:15 crc restorecon[4461]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sfn-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:45:15 crc restorecon[4461]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sns-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:45:15 crc restorecon[4461]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sns-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:45:15 crc restorecon[4461]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sqs-controller not reset as customized by 
admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:45:15 crc restorecon[4461]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sqs-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:45:15 crc restorecon[4461]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ssm-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:45:15 crc restorecon[4461]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ssm-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:45:15 crc restorecon[4461]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-wafv2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:45:15 crc restorecon[4461]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-wafv2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:45:15 crc restorecon[4461]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:45:15 crc restorecon[4461]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:45:15 crc restorecon[4461]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airflow-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:45:15 crc restorecon[4461]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airflow-helm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:45:15 crc restorecon[4461]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloydb-omni-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:45:15 crc restorecon[4461]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloydb-omni-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:45:15 crc restorecon[4461]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alvearie-imaging-ingestion not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:45:15 crc restorecon[4461]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alvearie-imaging-ingestion/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:45:15 crc restorecon[4461]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amd-gpu-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:45:15 crc restorecon[4461]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amd-gpu-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:45:15 crc restorecon[4461]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/analytics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:45:15 crc restorecon[4461]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/analytics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:45:15 crc restorecon[4461]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/annotationlab not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:45:15 crc restorecon[4461]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/annotationlab/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:45:15 crc restorecon[4461]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:45:15 crc restorecon[4461]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:45:15 crc restorecon[4461]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-api-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:45:15 crc restorecon[4461]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-api-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:45:15 crc restorecon[4461]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:45:15 crc restorecon[4461]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:45:15 crc restorecon[4461]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurito not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:45:15 crc restorecon[4461]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurito/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:45:15 crc restorecon[4461]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apimatic-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:45:15 crc restorecon[4461]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apimatic-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:45:15 crc restorecon[4461]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/application-services-metering-operator not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:45:15 crc restorecon[4461]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/application-services-metering-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:45:15 crc restorecon[4461]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:45:15 crc restorecon[4461]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:45:15 crc restorecon[4461]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/argocd-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:45:15 crc restorecon[4461]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/argocd-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:45:15 crc restorecon[4461]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/assisted-service-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:45:15 crc restorecon[4461]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/assisted-service-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:45:15 crc restorecon[4461]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:45:15 crc restorecon[4461]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:45:15 crc restorecon[4461]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/automotive-infra not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:45:15 crc restorecon[4461]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/automotive-infra/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:45:15 crc restorecon[4461]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-efs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:45:15 crc restorecon[4461]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-efs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:45:15 crc restorecon[4461]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/awss3-operator-registry not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:45:15 crc restorecon[4461]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/awss3-operator-registry/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:45:15 crc restorecon[4461]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/azure-service-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:45:15 crc restorecon[4461]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/azure-service-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:45:15 crc restorecon[4461]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/beegfs-csi-driver-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:45:15 crc restorecon[4461]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/beegfs-csi-driver-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:45:15 crc restorecon[4461]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:45:15 crc restorecon[4461]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:45:15 crc restorecon[4461]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-k not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:45:15 crc restorecon[4461]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-k/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:45:15 crc restorecon[4461]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-karavan-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:45:15 crc restorecon[4461]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-karavan-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:45:15 crc restorecon[4461]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator-community not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:45:15 crc restorecon[4461]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator-community/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:45:15 crc restorecon[4461]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:45:15 crc restorecon[4461]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:45:15 crc restorecon[4461]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-utils-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:45:15 crc restorecon[4461]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-utils-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:45:15 crc restorecon[4461]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-aas-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:45:15 crc restorecon[4461]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-aas-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:45:15 crc restorecon[4461]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-impairment-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:45:15 crc restorecon[4461]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-impairment-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:45:15 crc restorecon[4461]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:45:15 crc restorecon[4461]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:45:15 crc restorecon[4461]: 
Dec 03 06:45:15 crc restorecon[4461]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 06:45:15 crc restorecon[4461]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 06:45:15 crc restorecon[4461]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/codeflare-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 06:45:15 crc restorecon[4461]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/codeflare-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 06:45:15 crc restorecon[4461]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-kubevirt-hyperconverged not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 06:45:15 crc restorecon[4461]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-kubevirt-hyperconverged/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 06:45:15 crc restorecon[4461]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-trivy-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 06:45:15 crc restorecon[4461]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-trivy-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 06:45:15 crc restorecon[4461]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-windows-machine-config-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 06:45:15 crc restorecon[4461]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-windows-machine-config-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 06:45:15 crc restorecon[4461]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/customized-user-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 06:45:15 crc restorecon[4461]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/customized-user-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 06:45:15 crc restorecon[4461]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cxl-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 06:45:15 crc restorecon[4461]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cxl-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 06:45:15 crc restorecon[4461]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dapr-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 06:45:15 crc restorecon[4461]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dapr-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 06:45:15 crc restorecon[4461]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 06:45:15 crc restorecon[4461]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 06:45:15 crc restorecon[4461]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datatrucker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 06:45:15 crc restorecon[4461]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datatrucker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 06:45:15 crc restorecon[4461]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dbaas-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 06:45:15 crc restorecon[4461]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dbaas-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 06:45:15 crc restorecon[4461]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/debezium-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 06:45:15 crc restorecon[4461]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/debezium-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 06:45:15 crc restorecon[4461]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 06:45:15 crc restorecon[4461]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 06:45:15 crc restorecon[4461]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/deployment-validation-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 06:45:15 crc restorecon[4461]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/deployment-validation-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 06:45:15 crc restorecon[4461]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devopsinabox not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 06:45:15 crc restorecon[4461]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devopsinabox/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 06:45:15 crc restorecon[4461]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dns-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 06:45:15 crc restorecon[4461]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dns-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 06:45:15 crc restorecon[4461]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 06:45:15 crc restorecon[4461]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 06:45:15 crc restorecon[4461]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-amlen-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 06:45:15 crc restorecon[4461]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-amlen-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 06:45:15 crc restorecon[4461]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-che not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 06:45:15 crc restorecon[4461]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-che/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 06:45:15 crc restorecon[4461]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ecr-secret-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 06:45:15 crc restorecon[4461]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ecr-secret-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 06:45:15 crc restorecon[4461]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edp-keycloak-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 06:45:15 crc restorecon[4461]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edp-keycloak-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 06:45:15 crc restorecon[4461]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 06:45:15 crc restorecon[4461]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 06:45:15 crc restorecon[4461]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/egressip-ipam-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 06:45:15 crc restorecon[4461]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/egressip-ipam-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 06:45:15 crc restorecon[4461]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ember-csi-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 06:45:15 crc restorecon[4461]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ember-csi-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 06:45:15 crc restorecon[4461]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/etcd not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 06:45:15 crc restorecon[4461]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/etcd/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 06:45:15 crc restorecon[4461]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eventing-kogito not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 06:45:15 crc restorecon[4461]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eventing-kogito/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 06:45:15 crc restorecon[4461]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-secrets-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 06:45:15 crc restorecon[4461]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-secrets-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 06:45:15 crc restorecon[4461]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 06:45:15 crc restorecon[4461]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 06:45:15 crc restorecon[4461]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 06:45:15 crc restorecon[4461]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 06:45:15 crc restorecon[4461]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flink-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 06:45:15 crc restorecon[4461]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flink-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 06:45:15 crc restorecon[4461]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 06:45:15 crc restorecon[4461]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 06:45:15 crc restorecon[4461]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8gb not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 06:45:15 crc restorecon[4461]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8gb/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 06:45:15 crc restorecon[4461]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fossul-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 06:45:15 crc restorecon[4461]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fossul-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 06:45:15 crc restorecon[4461]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/github-arc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 06:45:15 crc restorecon[4461]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/github-arc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 06:45:15 crc restorecon[4461]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitops-primer not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 06:45:15 crc restorecon[4461]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitops-primer/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 06:45:15 crc restorecon[4461]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitwebhook-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 06:45:15 crc restorecon[4461]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitwebhook-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 06:45:15 crc restorecon[4461]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/global-load-balancer-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 06:45:15 crc restorecon[4461]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/global-load-balancer-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 06:45:15 crc restorecon[4461]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/grafana-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 06:45:15 crc restorecon[4461]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/grafana-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 06:45:15 crc restorecon[4461]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/group-sync-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 06:45:15 crc restorecon[4461]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/group-sync-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 06:45:15 crc restorecon[4461]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hawtio-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 06:45:15 crc restorecon[4461]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hawtio-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 06:45:15 crc restorecon[4461]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 06:45:15 crc restorecon[4461]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 06:45:15 crc restorecon[4461]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hedvig-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 06:45:15 crc restorecon[4461]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hedvig-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 06:45:15 crc restorecon[4461]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hive-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 06:45:15 crc restorecon[4461]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hive-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 06:45:15 crc restorecon[4461]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/horreum-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 06:45:15 crc restorecon[4461]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/horreum-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 06:45:15 crc restorecon[4461]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hyperfoil-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 06:45:15 crc restorecon[4461]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hyperfoil-bundle/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 06:45:15 crc restorecon[4461]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator-community not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 06:45:15 crc restorecon[4461]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator-community/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 06:45:15 crc restorecon[4461]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 06:45:15 crc restorecon[4461]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 06:45:15 crc restorecon[4461]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-spectrum-scale-csi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 06:45:15 crc restorecon[4461]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-spectrum-scale-csi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 06:45:15 crc restorecon[4461]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibmcloud-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 06:45:15 crc restorecon[4461]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibmcloud-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 06:45:15 crc restorecon[4461]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infinispan not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 06:45:15 crc restorecon[4461]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infinispan/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 06:45:15 crc restorecon[4461]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/integrity-shield-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 06:45:15 crc restorecon[4461]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/integrity-shield-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 06:45:15 crc restorecon[4461]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ipfs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 06:45:15 crc restorecon[4461]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ipfs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 06:45:15 crc restorecon[4461]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/istio-workspace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 06:45:15 crc restorecon[4461]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/istio-workspace-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 06:45:15 crc restorecon[4461]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 06:45:15 crc restorecon[4461]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 06:45:15 crc restorecon[4461]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kaoto-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 06:45:15 crc restorecon[4461]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kaoto-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 06:45:15 crc restorecon[4461]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keda not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 06:45:15 crc restorecon[4461]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keda/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 06:45:15 crc restorecon[4461]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keepalived-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 06:45:15 crc restorecon[4461]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keepalived-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 06:45:15 crc restorecon[4461]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 06:45:15 crc restorecon[4461]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 06:45:15 crc restorecon[4461]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-permissions-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 06:45:15 crc restorecon[4461]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-permissions-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 06:45:15 crc restorecon[4461]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/klusterlet not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 06:45:15 crc restorecon[4461]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/klusterlet/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 06:45:15 crc restorecon[4461]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kogito-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 06:45:15 crc restorecon[4461]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kogito-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 06:45:15 crc restorecon[4461]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/koku-metrics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 06:45:15 crc restorecon[4461]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/koku-metrics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 06:45:15 crc restorecon[4461]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/konveyor-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 06:45:15 crc restorecon[4461]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/konveyor-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 06:45:15 crc restorecon[4461]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/korrel8r not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 06:45:15 crc restorecon[4461]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/korrel8r/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 06:45:15 crc restorecon[4461]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kuadrant-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 06:45:15 crc restorecon[4461]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kuadrant-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 06:45:15 crc restorecon[4461]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kube-green not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 06:45:15 crc restorecon[4461]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kube-green/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 06:45:15 crc restorecon[4461]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 06:45:15 crc restorecon[4461]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 06:45:15 crc restorecon[4461]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubernetes-imagepuller-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 06:45:15 crc restorecon[4461]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubernetes-imagepuller-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 06:45:15 crc restorecon[4461]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 06:45:15 crc restorecon[4461]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 06:45:15 crc restorecon[4461]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/l5-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 06:45:15 crc restorecon[4461]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/l5-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 06:45:15 crc restorecon[4461]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/layer7-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 06:45:15 crc restorecon[4461]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/layer7-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 06:45:15 crc restorecon[4461]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lbconfig-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 06:45:15 crc restorecon[4461]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lbconfig-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 06:45:15 crc restorecon[4461]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lib-bucket-provisioner not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 06:45:15 crc restorecon[4461]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lib-bucket-provisioner/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 06:45:15 crc restorecon[4461]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/limitador-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 06:45:15 crc restorecon[4461]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/limitador-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 06:45:15 crc restorecon[4461]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logging-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 06:45:15 crc restorecon[4461]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logging-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 06:45:15 crc restorecon[4461]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 06:45:15 crc restorecon[4461]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-helm-operator/catalog.json
not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:45:15 crc restorecon[4461]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:45:15 crc restorecon[4461]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:45:15 crc restorecon[4461]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:45:15 crc restorecon[4461]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:45:15 crc restorecon[4461]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mariadb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:45:15 crc restorecon[4461]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mariadb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:45:15 crc restorecon[4461]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marin3r not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:45:15 crc restorecon[4461]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marin3r/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:45:15 crc restorecon[4461]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mercury-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:45:15 crc restorecon[4461]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mercury-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:45:15 crc restorecon[4461]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/microcks not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:45:15 crc restorecon[4461]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/microcks/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:45:15 crc restorecon[4461]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:45:15 crc restorecon[4461]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:45:15 crc restorecon[4461]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:45:15 crc restorecon[4461]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:45:15 crc restorecon[4461]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/move2kube-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:45:15 crc restorecon[4461]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/move2kube-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:45:15 crc restorecon[4461]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multi-nic-cni-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:45:15 crc restorecon[4461]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multi-nic-cni-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:45:15 crc restorecon[4461]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-global-hub-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:45:15 crc restorecon[4461]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-global-hub-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:45:15 crc restorecon[4461]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-operators-subscription not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:45:15 crc restorecon[4461]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-operators-subscription/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:45:15 crc restorecon[4461]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/must-gather-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:45:15 crc restorecon[4461]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/must-gather-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:45:15 crc restorecon[4461]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/namespace-configuration-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:45:15 crc restorecon[4461]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/namespace-configuration-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:45:15 crc restorecon[4461]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ncn-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:45:15 crc restorecon[4461]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ncn-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:45:15 crc restorecon[4461]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ndmspc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:45:15 crc restorecon[4461]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ndmspc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:45:15 crc restorecon[4461]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:45:15 crc restorecon[4461]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:45:15 crc restorecon[4461]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:45:15 crc restorecon[4461]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:45:15 crc restorecon[4461]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:45:15 crc restorecon[4461]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:45:15 crc restorecon[4461]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator-m88i not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:45:15 crc restorecon[4461]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator-m88i/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:45:15 crc restorecon[4461]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nfs-provisioner-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:45:15 crc restorecon[4461]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nfs-provisioner-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:45:15 crc restorecon[4461]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nlp-server not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:45:15 crc restorecon[4461]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nlp-server/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:45:15 crc restorecon[4461]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-discovery-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:45:15 crc restorecon[4461]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-discovery-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:45:15 crc restorecon[4461]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:45:15 crc restorecon[4461]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:45:15 crc restorecon[4461]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:45:15 crc restorecon[4461]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:45:15 crc restorecon[4461]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nsm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:45:15 crc restorecon[4461]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nsm-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:45:15 crc restorecon[4461]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oadp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:45:15 crc restorecon[4461]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oadp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:45:15 crc restorecon[4461]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/observability-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:45:15 crc restorecon[4461]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/observability-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:45:15 crc restorecon[4461]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oci-ccm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:45:15 crc restorecon[4461]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oci-ccm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:45:15 crc restorecon[4461]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:45:15 crc restorecon[4461]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:45:15 crc restorecon[4461]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odoo-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:45:15 crc restorecon[4461]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odoo-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:45:15 crc restorecon[4461]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opendatahub-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:45:15 crc restorecon[4461]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opendatahub-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:45:15 crc restorecon[4461]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openebs not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:45:15 crc restorecon[4461]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openebs/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:45:15 crc restorecon[4461]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-nfd-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:45:15 crc restorecon[4461]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-nfd-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:45:15 crc restorecon[4461]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-node-upgrade-mutex-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:45:15 crc restorecon[4461]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-node-upgrade-mutex-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:45:15 crc restorecon[4461]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-qiskit-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:45:15 crc restorecon[4461]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-qiskit-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:45:15 crc restorecon[4461]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:45:15 crc restorecon[4461]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:45:15 crc restorecon[4461]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patch-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:45:15 crc restorecon[4461]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patch-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:45:15 crc restorecon[4461]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patterns-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:45:15 crc restorecon[4461]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patterns-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:45:15 crc restorecon[4461]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:45:15 crc restorecon[4461]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:45:15 crc restorecon[4461]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pelorus-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:45:15 crc restorecon[4461]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pelorus-operator/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:45:15 crc restorecon[4461]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/percona-xtradb-cluster-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:45:15 crc restorecon[4461]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/percona-xtradb-cluster-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:45:15 crc restorecon[4461]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-essentials not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:45:15 crc restorecon[4461]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-essentials/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:45:15 crc restorecon[4461]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/postgresql not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:45:15 crc restorecon[4461]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/postgresql/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:45:15 crc restorecon[4461]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/proactive-node-scaling-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:45:15 crc restorecon[4461]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/proactive-node-scaling-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:45:15 crc restorecon[4461]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/project-quay not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:45:15 crc restorecon[4461]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/project-quay/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:45:15 crc restorecon[4461]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:45:15 crc restorecon[4461]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:45:15 crc restorecon[4461]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus-exporter-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:45:15 crc restorecon[4461]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus-exporter-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:45:15 crc restorecon[4461]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:45:15 crc restorecon[4461]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:45:15 crc restorecon[4461]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:45:15 crc restorecon[4461]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:45:15 crc restorecon[4461]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pulp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:45:15 crc restorecon[4461]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pulp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:45:15 crc restorecon[4461]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-cluster-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:45:15 crc restorecon[4461]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-cluster-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:45:15 crc restorecon[4461]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-messaging-topology-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:45:15 crc restorecon[4461]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-messaging-topology-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:45:15 crc restorecon[4461]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:45:15 crc restorecon[4461]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:45:15 crc restorecon[4461]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/reportportal-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:45:15 crc restorecon[4461]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/reportportal-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:45:15 crc restorecon[4461]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/resource-locker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:45:15 crc restorecon[4461]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/resource-locker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:45:15 crc restorecon[4461]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhoas-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:45:15 crc restorecon[4461]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhoas-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:45:15 crc restorecon[4461]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ripsaw not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:45:15 crc restorecon[4461]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ripsaw/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:45:15 crc restorecon[4461]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sailoperator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:45:15 crc restorecon[4461]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sailoperator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:45:15 crc restorecon[4461]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-commerce-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:45:15 crc restorecon[4461]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-commerce-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:45:15 crc restorecon[4461]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-data-intelligence-observer-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:45:15 crc restorecon[4461]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-data-intelligence-observer-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:45:15 crc restorecon[4461]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-hana-express-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:45:15 crc restorecon[4461]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-hana-express-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:45:15 crc restorecon[4461]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:45:15 crc restorecon[4461]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:45:15 crc restorecon[4461]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:45:15 crc restorecon[4461]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:45:15 crc restorecon[4461]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-binding-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:45:15 crc restorecon[4461]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-binding-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:45:15 crc restorecon[4461]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/shipwright-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:45:15 crc restorecon[4461]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/shipwright-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:45:15 crc restorecon[4461]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sigstore-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:45:15 crc restorecon[4461]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sigstore-helm-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:45:15 crc restorecon[4461]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:45:15 crc restorecon[4461]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:45:15 crc restorecon[4461]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:45:15 crc restorecon[4461]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:45:15 crc restorecon[4461]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snapscheduler not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:45:15 crc restorecon[4461]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snapscheduler/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:45:15 crc restorecon[4461]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snyk-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:45:15 crc restorecon[4461]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snyk-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:45:15 crc restorecon[4461]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/socmmd not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:45:15 crc restorecon[4461]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/socmmd/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:45:15 crc restorecon[4461]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonar-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:45:15 crc restorecon[4461]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonar-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:45:15 crc restorecon[4461]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosivio not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:45:15 crc restorecon[4461]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosivio/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:45:15 crc restorecon[4461]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonataflow-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:45:15 crc 
restorecon[4461]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonataflow-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:45:15 crc restorecon[4461]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosreport-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:45:15 crc restorecon[4461]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosreport-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:45:15 crc restorecon[4461]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/spark-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:45:15 crc restorecon[4461]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/spark-helm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:45:15 crc restorecon[4461]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/special-resource-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:45:15 crc restorecon[4461]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/special-resource-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:45:15 crc restorecon[4461]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:45:15 crc restorecon[4461]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:45:15 crc restorecon[4461]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron-engine not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:45:15 crc restorecon[4461]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron-engine/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:45:15 crc restorecon[4461]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/strimzi-kafka-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:45:15 crc restorecon[4461]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/strimzi-kafka-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:45:15 crc restorecon[4461]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/syndesis not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:45:15 crc restorecon[4461]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/syndesis/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:45:15 crc restorecon[4461]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:45:15 crc restorecon[4461]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:45:15 crc restorecon[4461]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tagger not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:45:15 crc restorecon[4461]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tagger/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:45:15 crc restorecon[4461]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:45:15 crc restorecon[4461]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:45:15 crc restorecon[4461]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tf-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:45:15 crc restorecon[4461]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tf-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:45:15 crc 
restorecon[4461]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tidb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:45:15 crc restorecon[4461]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tidb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:45:15 crc restorecon[4461]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trident-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:45:15 crc restorecon[4461]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trident-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:45:15 crc restorecon[4461]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustify-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:45:15 crc restorecon[4461]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustify-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:45:15 crc restorecon[4461]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ucs-ci-solutions-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:45:15 crc restorecon[4461]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ucs-ci-solutions-operator/catalog.json not reset as customized by 
admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:45:15 crc restorecon[4461]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/universal-crossplane not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:45:15 crc restorecon[4461]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/universal-crossplane/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:45:15 crc restorecon[4461]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/varnish-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:45:15 crc restorecon[4461]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/varnish-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:45:15 crc restorecon[4461]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-config-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:45:15 crc restorecon[4461]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-config-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:45:15 crc restorecon[4461]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/verticadb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:45:15 crc restorecon[4461]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/verticadb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:45:15 crc restorecon[4461]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volume-expander-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:45:15 crc restorecon[4461]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volume-expander-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:45:15 crc restorecon[4461]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/wandb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:45:15 crc restorecon[4461]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/wandb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:45:15 crc restorecon[4461]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/windup-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:45:15 crc restorecon[4461]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/windup-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:45:15 crc restorecon[4461]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yaks not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:45:15 crc restorecon[4461]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yaks/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:45:15 crc restorecon[4461]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:45:15 crc restorecon[4461]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:45:15 crc restorecon[4461]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:45:15 crc restorecon[4461]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-utilities/c0fe7256 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:45:15 crc restorecon[4461]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-utilities/c30319e4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:45:15 crc restorecon[4461]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-utilities/e6b1dd45 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:45:15 crc restorecon[4461]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-content/2bb643f0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:45:15 crc restorecon[4461]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-content/920de426 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:45:15 crc restorecon[4461]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-content/70fa1e87 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:45:15 crc restorecon[4461]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/registry-server/a1c12a2f not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:45:15 crc restorecon[4461]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/registry-server/9442e6c7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:45:15 crc restorecon[4461]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/registry-server/5b45ec72 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:45:15 crc restorecon[4461]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:45:15 crc restorecon[4461]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:45:15 crc restorecon[4461]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abot-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:45:15 crc restorecon[4461]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abot-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:45:15 crc restorecon[4461]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:45:15 crc restorecon[4461]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:45:15 crc restorecon[4461]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:45:15 crc restorecon[4461]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:45:15 crc restorecon[4461]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:45:15 crc restorecon[4461]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:45:15 crc restorecon[4461]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:45:15 crc restorecon[4461]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:45:15 crc restorecon[4461]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:45:15 crc restorecon[4461]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:45:15 crc restorecon[4461]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:45:15 crc restorecon[4461]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:45:15 crc restorecon[4461]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:45:15 crc restorecon[4461]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:45:15 crc restorecon[4461]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:45:15 crc restorecon[4461]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:45:15 crc restorecon[4461]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:45:15 crc restorecon[4461]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:45:15 crc restorecon[4461]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:45:15 crc restorecon[4461]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:45:15 crc restorecon[4461]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/entando-k8s-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:45:15 crc restorecon[4461]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/entando-k8s-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:45:15 crc restorecon[4461]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:45:15 crc restorecon[4461]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:45:15 crc restorecon[4461]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:45:15 crc restorecon[4461]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:45:15 crc restorecon[4461]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:45:15 crc restorecon[4461]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:45:15 crc restorecon[4461]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator-rhmp not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:45:15 crc restorecon[4461]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:45:15 crc restorecon[4461]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:45:15 crc restorecon[4461]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:45:15 crc restorecon[4461]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-paygo-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:45:15 crc restorecon[4461]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-paygo-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:45:15 crc restorecon[4461]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:45:15 crc restorecon[4461]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:45:15 crc restorecon[4461]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-term-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:45:15 crc restorecon[4461]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-term-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:45:15 crc restorecon[4461]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:45:15 crc restorecon[4461]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:45:15 crc restorecon[4461]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:45:15 crc restorecon[4461]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:45:15 crc restorecon[4461]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/linstor-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:45:15 crc restorecon[4461]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/linstor-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:45:15 crc restorecon[4461]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:45:15 crc restorecon[4461]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:45:15 crc restorecon[4461]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:45:15 crc restorecon[4461]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:45:15 crc restorecon[4461]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:45:15 crc restorecon[4461]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:45:15 crc restorecon[4461]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:45:15 crc restorecon[4461]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:45:15 crc restorecon[4461]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:45:15 crc restorecon[4461]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:45:15 crc restorecon[4461]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:45:15 crc restorecon[4461]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:45:15 crc restorecon[4461]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-deploy-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:45:15 crc restorecon[4461]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-deploy-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:45:15 crc restorecon[4461]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-paygo-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:45:15 crc restorecon[4461]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-paygo-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:45:15 crc restorecon[4461]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:45:15 crc restorecon[4461]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:45:15 crc restorecon[4461]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:45:15 crc restorecon[4461]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:45:15 crc restorecon[4461]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:45:15 crc restorecon[4461]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:45:15 crc restorecon[4461]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vfunction-server-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:45:15 crc restorecon[4461]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vfunction-server-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:45:15 crc restorecon[4461]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:45:15 crc restorecon[4461]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:45:15 crc restorecon[4461]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yugabyte-platform-operator-bundle-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:45:15 crc restorecon[4461]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yugabyte-platform-operator-bundle-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:45:15 crc restorecon[4461]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:45:15 crc restorecon[4461]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:45:15 crc restorecon[4461]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:45:15 crc restorecon[4461]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:45:15 crc restorecon[4461]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:45:15 crc restorecon[4461]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:45:15 crc restorecon[4461]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:45:15 crc restorecon[4461]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:45:15 crc restorecon[4461]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:45:15 crc restorecon[4461]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:45:15 crc restorecon[4461]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:45:15 crc restorecon[4461]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:45:15 crc restorecon[4461]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:45:15 crc restorecon[4461]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:45:15 crc restorecon[4461]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:45:15 crc restorecon[4461]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-utilities/3c9f3a59 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:45:15 crc restorecon[4461]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-utilities/1091c11b not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:45:15 crc restorecon[4461]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-utilities/9a6821c6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:45:15 crc restorecon[4461]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-content/ec0c35e2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:45:15 crc restorecon[4461]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-content/517f37e7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:45:15 crc restorecon[4461]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-content/6214fe78 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:45:15 crc restorecon[4461]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/registry-server/ba189c8b not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:45:15 crc restorecon[4461]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/registry-server/351e4f31 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:45:15 crc restorecon[4461]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/registry-server/c0f219ff not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:45:15 crc restorecon[4461]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/etc-hosts 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Dec 03 06:45:15 crc restorecon[4461]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/wait-for-host-port/8069f607 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723 Dec 03 06:45:15 crc restorecon[4461]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/wait-for-host-port/559c3d82 not reset as customized by admin to system_u:object_r:container_file_t:s0:c133,c223 Dec 03 06:45:15 crc restorecon[4461]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/wait-for-host-port/605ad488 not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Dec 03 06:45:15 crc restorecon[4461]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler/148df488 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723 Dec 03 06:45:15 crc restorecon[4461]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler/3bf6dcb4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c133,c223 Dec 03 06:45:15 crc restorecon[4461]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler/022a2feb not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Dec 03 06:45:15 crc restorecon[4461]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-cert-syncer/938c3924 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723 Dec 03 06:45:15 crc restorecon[4461]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-cert-syncer/729fe23e not reset as customized by admin to system_u:object_r:container_file_t:s0:c133,c223 Dec 03 06:45:15 crc restorecon[4461]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-cert-syncer/1fd5cbd4 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c247,c522 Dec 03 06:45:15 crc restorecon[4461]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-recovery-controller/a96697e1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723 Dec 03 06:45:15 crc restorecon[4461]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-recovery-controller/e155ddca not reset as customized by admin to system_u:object_r:container_file_t:s0:c133,c223 Dec 03 06:45:15 crc restorecon[4461]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-recovery-controller/10dd0e0f not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Dec 03 06:45:15 crc restorecon[4461]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 03 06:45:15 crc restorecon[4461]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/..2025_02_24_06_09_35.3018472960 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 03 06:45:15 crc restorecon[4461]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/..2025_02_24_06_09_35.3018472960/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 03 06:45:15 crc restorecon[4461]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 03 06:45:15 crc restorecon[4461]: 
/var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 03 06:45:15 crc restorecon[4461]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 03 06:45:15 crc restorecon[4461]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/..2025_02_24_06_09_35.4262376737 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 03 06:45:15 crc restorecon[4461]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/..2025_02_24_06_09_35.4262376737/audit.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 03 06:45:15 crc restorecon[4461]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 03 06:45:15 crc restorecon[4461]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/audit.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 03 06:45:15 crc restorecon[4461]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 03 06:45:15 crc restorecon[4461]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/..2025_02_24_06_09_35.2630275752 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 03 06:45:15 crc 
restorecon[4461]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/..2025_02_24_06_09_35.2630275752/v4-0-config-system-cliconfig not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 03 06:45:15 crc restorecon[4461]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 03 06:45:15 crc restorecon[4461]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/v4-0-config-system-cliconfig not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 03 06:45:15 crc restorecon[4461]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 03 06:45:15 crc restorecon[4461]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/..2025_02_24_06_09_35.2376963788 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 03 06:45:15 crc restorecon[4461]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/..2025_02_24_06_09_35.2376963788/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 03 06:45:15 crc restorecon[4461]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 03 06:45:15 crc restorecon[4461]: 
/var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 03 06:45:15 crc restorecon[4461]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 03 06:45:15 crc restorecon[4461]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/containers/oauth-openshift/6f2c8392 not reset as customized by admin to system_u:object_r:container_file_t:s0:c267,c588 Dec 03 06:45:15 crc restorecon[4461]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/containers/oauth-openshift/bd241ad9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 03 06:45:15 crc restorecon[4461]: /var/lib/kubelet/plugins not reset as customized by admin to system_u:object_r:container_file_t:s0 Dec 03 06:45:15 crc restorecon[4461]: /var/lib/kubelet/plugins/csi-hostpath not reset as customized by admin to system_u:object_r:container_file_t:s0 Dec 03 06:45:15 crc restorecon[4461]: /var/lib/kubelet/plugins/csi-hostpath/csi.sock not reset as customized by admin to system_u:object_r:container_file_t:s0 Dec 03 06:45:15 crc restorecon[4461]: /var/lib/kubelet/plugins/kubernetes.io not reset as customized by admin to system_u:object_r:container_file_t:s0 Dec 03 06:45:15 crc restorecon[4461]: /var/lib/kubelet/plugins/kubernetes.io/csi not reset as customized by admin to system_u:object_r:container_file_t:s0 Dec 03 06:45:15 crc restorecon[4461]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner not reset as customized by admin to system_u:object_r:container_file_t:s0 Dec 03 06:45:15 crc restorecon[4461]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983 not reset as customized by admin to 
system_u:object_r:container_file_t:s0 Dec 03 06:45:15 crc restorecon[4461]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/globalmount not reset as customized by admin to system_u:object_r:container_file_t:s0 Dec 03 06:45:15 crc restorecon[4461]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/vol_data.json not reset as customized by admin to system_u:object_r:container_file_t:s0 Dec 03 06:45:15 crc restorecon[4461]: /var/lib/kubelet/plugins_registry not reset as customized by admin to system_u:object_r:container_file_t:s0 Dec 03 06:45:15 crc restorecon[4461]: Relabeled /var/usrlocal/bin/kubenswrapper from system_u:object_r:bin_t:s0 to system_u:object_r:kubelet_exec_t:s0 Dec 03 06:45:15 crc kubenswrapper[4475]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Dec 03 06:45:15 crc kubenswrapper[4475]: Flag --minimum-container-ttl-duration has been deprecated, Use --eviction-hard or --eviction-soft instead. Will be removed in a future version. Dec 03 06:45:15 crc kubenswrapper[4475]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Dec 03 06:45:15 crc kubenswrapper[4475]: Flag --register-with-taints has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. 
Dec 03 06:45:15 crc kubenswrapper[4475]: Flag --pod-infra-container-image has been deprecated, will be removed in a future release. Image garbage collector will get sandbox image information from CRI. Dec 03 06:45:15 crc kubenswrapper[4475]: Flag --system-reserved has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Dec 03 06:45:15 crc kubenswrapper[4475]: I1203 06:45:15.369349 4475 server.go:211] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime" Dec 03 06:45:15 crc kubenswrapper[4475]: W1203 06:45:15.374418 4475 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements Dec 03 06:45:15 crc kubenswrapper[4475]: W1203 06:45:15.374467 4475 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP Dec 03 06:45:15 crc kubenswrapper[4475]: W1203 06:45:15.374472 4475 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI Dec 03 06:45:15 crc kubenswrapper[4475]: W1203 06:45:15.374477 4475 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup Dec 03 06:45:15 crc kubenswrapper[4475]: W1203 06:45:15.374480 4475 feature_gate.go:330] unrecognized feature gate: PinnedImages Dec 03 06:45:15 crc kubenswrapper[4475]: W1203 06:45:15.374485 4475 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets Dec 03 06:45:15 crc kubenswrapper[4475]: W1203 06:45:15.374489 4475 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager Dec 03 06:45:15 crc kubenswrapper[4475]: W1203 06:45:15.374493 4475 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController Dec 03 06:45:15 crc kubenswrapper[4475]: W1203 06:45:15.374496 4475 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall Dec 03 06:45:15 crc kubenswrapper[4475]: W1203 
06:45:15.374500 4475 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags Dec 03 06:45:15 crc kubenswrapper[4475]: W1203 06:45:15.374503 4475 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission Dec 03 06:45:15 crc kubenswrapper[4475]: W1203 06:45:15.374506 4475 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion Dec 03 06:45:15 crc kubenswrapper[4475]: W1203 06:45:15.374510 4475 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS Dec 03 06:45:15 crc kubenswrapper[4475]: W1203 06:45:15.374513 4475 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration Dec 03 06:45:15 crc kubenswrapper[4475]: W1203 06:45:15.374516 4475 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration Dec 03 06:45:15 crc kubenswrapper[4475]: W1203 06:45:15.374520 4475 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud Dec 03 06:45:15 crc kubenswrapper[4475]: W1203 06:45:15.374523 4475 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota Dec 03 06:45:15 crc kubenswrapper[4475]: W1203 06:45:15.374527 4475 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes Dec 03 06:45:15 crc kubenswrapper[4475]: W1203 06:45:15.374530 4475 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer Dec 03 06:45:15 crc kubenswrapper[4475]: W1203 06:45:15.374533 4475 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification Dec 03 06:45:15 crc kubenswrapper[4475]: W1203 06:45:15.374536 4475 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform Dec 03 06:45:15 crc kubenswrapper[4475]: W1203 06:45:15.374539 4475 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy Dec 03 06:45:15 crc kubenswrapper[4475]: W1203 06:45:15.374543 4475 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather Dec 03 06:45:15 crc kubenswrapper[4475]: W1203 06:45:15.374546 4475 feature_gate.go:330] unrecognized feature 
gate: AdditionalRoutingCapabilities Dec 03 06:45:15 crc kubenswrapper[4475]: W1203 06:45:15.374553 4475 feature_gate.go:330] unrecognized feature gate: ManagedBootImages Dec 03 06:45:15 crc kubenswrapper[4475]: W1203 06:45:15.374557 4475 feature_gate.go:330] unrecognized feature gate: ExternalOIDC Dec 03 06:45:15 crc kubenswrapper[4475]: W1203 06:45:15.374560 4475 feature_gate.go:330] unrecognized feature gate: DNSNameResolver Dec 03 06:45:15 crc kubenswrapper[4475]: W1203 06:45:15.374563 4475 feature_gate.go:330] unrecognized feature gate: GatewayAPI Dec 03 06:45:15 crc kubenswrapper[4475]: W1203 06:45:15.374567 4475 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS Dec 03 06:45:15 crc kubenswrapper[4475]: W1203 06:45:15.374570 4475 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS Dec 03 06:45:15 crc kubenswrapper[4475]: W1203 06:45:15.374573 4475 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy Dec 03 06:45:15 crc kubenswrapper[4475]: W1203 06:45:15.374576 4475 feature_gate.go:330] unrecognized feature gate: NewOLM Dec 03 06:45:15 crc kubenswrapper[4475]: W1203 06:45:15.374579 4475 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS Dec 03 06:45:15 crc kubenswrapper[4475]: W1203 06:45:15.374584 4475 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release. 
Dec 03 06:45:15 crc kubenswrapper[4475]: W1203 06:45:15.374589 4475 feature_gate.go:330] unrecognized feature gate: SignatureStores Dec 03 06:45:15 crc kubenswrapper[4475]: W1203 06:45:15.374592 4475 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy Dec 03 06:45:15 crc kubenswrapper[4475]: W1203 06:45:15.374596 4475 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization Dec 03 06:45:15 crc kubenswrapper[4475]: W1203 06:45:15.374599 4475 feature_gate.go:330] unrecognized feature gate: HardwareSpeed Dec 03 06:45:15 crc kubenswrapper[4475]: W1203 06:45:15.374603 4475 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation Dec 03 06:45:15 crc kubenswrapper[4475]: W1203 06:45:15.374607 4475 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics Dec 03 06:45:15 crc kubenswrapper[4475]: W1203 06:45:15.374610 4475 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure Dec 03 06:45:15 crc kubenswrapper[4475]: W1203 06:45:15.374615 4475 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release. Dec 03 06:45:15 crc kubenswrapper[4475]: W1203 06:45:15.374619 4475 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource Dec 03 06:45:15 crc kubenswrapper[4475]: W1203 06:45:15.374623 4475 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig Dec 03 06:45:15 crc kubenswrapper[4475]: W1203 06:45:15.374626 4475 feature_gate.go:330] unrecognized feature gate: OVNObservability Dec 03 06:45:15 crc kubenswrapper[4475]: W1203 06:45:15.374630 4475 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes Dec 03 06:45:15 crc kubenswrapper[4475]: W1203 06:45:15.374634 4475 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release. 
Dec 03 06:45:15 crc kubenswrapper[4475]: W1203 06:45:15.374638 4475 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS Dec 03 06:45:15 crc kubenswrapper[4475]: W1203 06:45:15.374642 4475 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity Dec 03 06:45:15 crc kubenswrapper[4475]: W1203 06:45:15.374645 4475 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB Dec 03 06:45:15 crc kubenswrapper[4475]: W1203 06:45:15.374648 4475 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters Dec 03 06:45:15 crc kubenswrapper[4475]: W1203 06:45:15.374653 4475 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. Dec 03 06:45:15 crc kubenswrapper[4475]: W1203 06:45:15.374657 4475 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet Dec 03 06:45:15 crc kubenswrapper[4475]: W1203 06:45:15.374660 4475 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement Dec 03 06:45:15 crc kubenswrapper[4475]: W1203 06:45:15.374664 4475 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack Dec 03 06:45:15 crc kubenswrapper[4475]: W1203 06:45:15.374667 4475 feature_gate.go:330] unrecognized feature gate: UpgradeStatus Dec 03 06:45:15 crc kubenswrapper[4475]: W1203 06:45:15.374670 4475 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles Dec 03 06:45:15 crc kubenswrapper[4475]: W1203 06:45:15.374674 4475 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode Dec 03 06:45:15 crc kubenswrapper[4475]: W1203 06:45:15.374677 4475 feature_gate.go:330] unrecognized feature gate: PlatformOperators Dec 03 06:45:15 crc kubenswrapper[4475]: W1203 06:45:15.374680 4475 feature_gate.go:330] unrecognized feature gate: Example Dec 03 06:45:15 crc kubenswrapper[4475]: W1203 06:45:15.374683 4475 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks Dec 03 06:45:15 crc kubenswrapper[4475]: W1203 
06:45:15.374686 4475 feature_gate.go:330] unrecognized feature gate: OnClusterBuild Dec 03 06:45:15 crc kubenswrapper[4475]: W1203 06:45:15.374689 4475 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation Dec 03 06:45:15 crc kubenswrapper[4475]: W1203 06:45:15.374693 4475 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration Dec 03 06:45:15 crc kubenswrapper[4475]: W1203 06:45:15.374696 4475 feature_gate.go:330] unrecognized feature gate: InsightsConfig Dec 03 06:45:15 crc kubenswrapper[4475]: W1203 06:45:15.374699 4475 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS Dec 03 06:45:15 crc kubenswrapper[4475]: W1203 06:45:15.374703 4475 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig Dec 03 06:45:15 crc kubenswrapper[4475]: W1203 06:45:15.374706 4475 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs Dec 03 06:45:15 crc kubenswrapper[4475]: W1203 06:45:15.374709 4475 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Dec 03 06:45:15 crc kubenswrapper[4475]: W1203 06:45:15.374712 4475 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor Dec 03 06:45:15 crc kubenswrapper[4475]: W1203 06:45:15.374717 4475 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot Dec 03 06:45:15 crc kubenswrapper[4475]: I1203 06:45:15.374788 4475 flags.go:64] FLAG: --address="0.0.0.0" Dec 03 06:45:15 crc kubenswrapper[4475]: I1203 06:45:15.374797 4475 flags.go:64] FLAG: --allowed-unsafe-sysctls="[]" Dec 03 06:45:15 crc kubenswrapper[4475]: I1203 06:45:15.374805 4475 flags.go:64] FLAG: --anonymous-auth="true" Dec 03 06:45:15 crc kubenswrapper[4475]: I1203 06:45:15.374810 4475 flags.go:64] FLAG: --application-metrics-count-limit="100" Dec 03 06:45:15 crc kubenswrapper[4475]: I1203 06:45:15.374815 4475 flags.go:64] FLAG: --authentication-token-webhook="false" Dec 03 06:45:15 crc kubenswrapper[4475]: I1203 06:45:15.374819 4475 
flags.go:64] FLAG: --authentication-token-webhook-cache-ttl="2m0s" Dec 03 06:45:15 crc kubenswrapper[4475]: I1203 06:45:15.374824 4475 flags.go:64] FLAG: --authorization-mode="AlwaysAllow" Dec 03 06:45:15 crc kubenswrapper[4475]: I1203 06:45:15.374829 4475 flags.go:64] FLAG: --authorization-webhook-cache-authorized-ttl="5m0s" Dec 03 06:45:15 crc kubenswrapper[4475]: I1203 06:45:15.374834 4475 flags.go:64] FLAG: --authorization-webhook-cache-unauthorized-ttl="30s" Dec 03 06:45:15 crc kubenswrapper[4475]: I1203 06:45:15.374838 4475 flags.go:64] FLAG: --boot-id-file="/proc/sys/kernel/random/boot_id" Dec 03 06:45:15 crc kubenswrapper[4475]: I1203 06:45:15.374843 4475 flags.go:64] FLAG: --bootstrap-kubeconfig="/etc/kubernetes/kubeconfig" Dec 03 06:45:15 crc kubenswrapper[4475]: I1203 06:45:15.374852 4475 flags.go:64] FLAG: --cert-dir="/var/lib/kubelet/pki" Dec 03 06:45:15 crc kubenswrapper[4475]: I1203 06:45:15.374856 4475 flags.go:64] FLAG: --cgroup-driver="cgroupfs" Dec 03 06:45:15 crc kubenswrapper[4475]: I1203 06:45:15.374859 4475 flags.go:64] FLAG: --cgroup-root="" Dec 03 06:45:15 crc kubenswrapper[4475]: I1203 06:45:15.374863 4475 flags.go:64] FLAG: --cgroups-per-qos="true" Dec 03 06:45:15 crc kubenswrapper[4475]: I1203 06:45:15.374867 4475 flags.go:64] FLAG: --client-ca-file="" Dec 03 06:45:15 crc kubenswrapper[4475]: I1203 06:45:15.374870 4475 flags.go:64] FLAG: --cloud-config="" Dec 03 06:45:15 crc kubenswrapper[4475]: I1203 06:45:15.374874 4475 flags.go:64] FLAG: --cloud-provider="" Dec 03 06:45:15 crc kubenswrapper[4475]: I1203 06:45:15.374877 4475 flags.go:64] FLAG: --cluster-dns="[]" Dec 03 06:45:15 crc kubenswrapper[4475]: I1203 06:45:15.374882 4475 flags.go:64] FLAG: --cluster-domain="" Dec 03 06:45:15 crc kubenswrapper[4475]: I1203 06:45:15.374886 4475 flags.go:64] FLAG: --config="/etc/kubernetes/kubelet.conf" Dec 03 06:45:15 crc kubenswrapper[4475]: I1203 06:45:15.374890 4475 flags.go:64] FLAG: --config-dir="" Dec 03 06:45:15 crc kubenswrapper[4475]: 
I1203 06:45:15.374893 4475 flags.go:64] FLAG: --container-hints="/etc/cadvisor/container_hints.json" Dec 03 06:45:15 crc kubenswrapper[4475]: I1203 06:45:15.374897 4475 flags.go:64] FLAG: --container-log-max-files="5" Dec 03 06:45:15 crc kubenswrapper[4475]: I1203 06:45:15.374902 4475 flags.go:64] FLAG: --container-log-max-size="10Mi" Dec 03 06:45:15 crc kubenswrapper[4475]: I1203 06:45:15.374906 4475 flags.go:64] FLAG: --container-runtime-endpoint="/var/run/crio/crio.sock" Dec 03 06:45:15 crc kubenswrapper[4475]: I1203 06:45:15.374910 4475 flags.go:64] FLAG: --containerd="/run/containerd/containerd.sock" Dec 03 06:45:15 crc kubenswrapper[4475]: I1203 06:45:15.374914 4475 flags.go:64] FLAG: --containerd-namespace="k8s.io" Dec 03 06:45:15 crc kubenswrapper[4475]: I1203 06:45:15.374918 4475 flags.go:64] FLAG: --contention-profiling="false" Dec 03 06:45:15 crc kubenswrapper[4475]: I1203 06:45:15.374921 4475 flags.go:64] FLAG: --cpu-cfs-quota="true" Dec 03 06:45:15 crc kubenswrapper[4475]: I1203 06:45:15.374925 4475 flags.go:64] FLAG: --cpu-cfs-quota-period="100ms" Dec 03 06:45:15 crc kubenswrapper[4475]: I1203 06:45:15.374929 4475 flags.go:64] FLAG: --cpu-manager-policy="none" Dec 03 06:45:15 crc kubenswrapper[4475]: I1203 06:45:15.374933 4475 flags.go:64] FLAG: --cpu-manager-policy-options="" Dec 03 06:45:15 crc kubenswrapper[4475]: I1203 06:45:15.374948 4475 flags.go:64] FLAG: --cpu-manager-reconcile-period="10s" Dec 03 06:45:15 crc kubenswrapper[4475]: I1203 06:45:15.374952 4475 flags.go:64] FLAG: --enable-controller-attach-detach="true" Dec 03 06:45:15 crc kubenswrapper[4475]: I1203 06:45:15.374956 4475 flags.go:64] FLAG: --enable-debugging-handlers="true" Dec 03 06:45:15 crc kubenswrapper[4475]: I1203 06:45:15.374959 4475 flags.go:64] FLAG: --enable-load-reader="false" Dec 03 06:45:15 crc kubenswrapper[4475]: I1203 06:45:15.374963 4475 flags.go:64] FLAG: --enable-server="true" Dec 03 06:45:15 crc kubenswrapper[4475]: I1203 06:45:15.374967 4475 flags.go:64] FLAG: 
--enforce-node-allocatable="[pods]" Dec 03 06:45:15 crc kubenswrapper[4475]: I1203 06:45:15.374973 4475 flags.go:64] FLAG: --event-burst="100" Dec 03 06:45:15 crc kubenswrapper[4475]: I1203 06:45:15.374977 4475 flags.go:64] FLAG: --event-qps="50" Dec 03 06:45:15 crc kubenswrapper[4475]: I1203 06:45:15.374981 4475 flags.go:64] FLAG: --event-storage-age-limit="default=0" Dec 03 06:45:15 crc kubenswrapper[4475]: I1203 06:45:15.374985 4475 flags.go:64] FLAG: --event-storage-event-limit="default=0" Dec 03 06:45:15 crc kubenswrapper[4475]: I1203 06:45:15.374990 4475 flags.go:64] FLAG: --eviction-hard="" Dec 03 06:45:15 crc kubenswrapper[4475]: I1203 06:45:15.374995 4475 flags.go:64] FLAG: --eviction-max-pod-grace-period="0" Dec 03 06:45:15 crc kubenswrapper[4475]: I1203 06:45:15.374999 4475 flags.go:64] FLAG: --eviction-minimum-reclaim="" Dec 03 06:45:15 crc kubenswrapper[4475]: I1203 06:45:15.375003 4475 flags.go:64] FLAG: --eviction-pressure-transition-period="5m0s" Dec 03 06:45:15 crc kubenswrapper[4475]: I1203 06:45:15.375007 4475 flags.go:64] FLAG: --eviction-soft="" Dec 03 06:45:15 crc kubenswrapper[4475]: I1203 06:45:15.375011 4475 flags.go:64] FLAG: --eviction-soft-grace-period="" Dec 03 06:45:15 crc kubenswrapper[4475]: I1203 06:45:15.375015 4475 flags.go:64] FLAG: --exit-on-lock-contention="false" Dec 03 06:45:15 crc kubenswrapper[4475]: I1203 06:45:15.375018 4475 flags.go:64] FLAG: --experimental-allocatable-ignore-eviction="false" Dec 03 06:45:15 crc kubenswrapper[4475]: I1203 06:45:15.375022 4475 flags.go:64] FLAG: --experimental-mounter-path="" Dec 03 06:45:15 crc kubenswrapper[4475]: I1203 06:45:15.375025 4475 flags.go:64] FLAG: --fail-cgroupv1="false" Dec 03 06:45:15 crc kubenswrapper[4475]: I1203 06:45:15.375029 4475 flags.go:64] FLAG: --fail-swap-on="true" Dec 03 06:45:15 crc kubenswrapper[4475]: I1203 06:45:15.375033 4475 flags.go:64] FLAG: --feature-gates="" Dec 03 06:45:15 crc kubenswrapper[4475]: I1203 06:45:15.375037 4475 flags.go:64] FLAG: 
--file-check-frequency="20s" Dec 03 06:45:15 crc kubenswrapper[4475]: I1203 06:45:15.375046 4475 flags.go:64] FLAG: --global-housekeeping-interval="1m0s" Dec 03 06:45:15 crc kubenswrapper[4475]: I1203 06:45:15.375050 4475 flags.go:64] FLAG: --hairpin-mode="promiscuous-bridge" Dec 03 06:45:15 crc kubenswrapper[4475]: I1203 06:45:15.375054 4475 flags.go:64] FLAG: --healthz-bind-address="127.0.0.1" Dec 03 06:45:15 crc kubenswrapper[4475]: I1203 06:45:15.375058 4475 flags.go:64] FLAG: --healthz-port="10248" Dec 03 06:45:15 crc kubenswrapper[4475]: I1203 06:45:15.375062 4475 flags.go:64] FLAG: --help="false" Dec 03 06:45:15 crc kubenswrapper[4475]: I1203 06:45:15.375066 4475 flags.go:64] FLAG: --hostname-override="" Dec 03 06:45:15 crc kubenswrapper[4475]: I1203 06:45:15.375069 4475 flags.go:64] FLAG: --housekeeping-interval="10s" Dec 03 06:45:15 crc kubenswrapper[4475]: I1203 06:45:15.375073 4475 flags.go:64] FLAG: --http-check-frequency="20s" Dec 03 06:45:15 crc kubenswrapper[4475]: I1203 06:45:15.375077 4475 flags.go:64] FLAG: --image-credential-provider-bin-dir="" Dec 03 06:45:15 crc kubenswrapper[4475]: I1203 06:45:15.375082 4475 flags.go:64] FLAG: --image-credential-provider-config="" Dec 03 06:45:15 crc kubenswrapper[4475]: I1203 06:45:15.375085 4475 flags.go:64] FLAG: --image-gc-high-threshold="85" Dec 03 06:45:15 crc kubenswrapper[4475]: I1203 06:45:15.375089 4475 flags.go:64] FLAG: --image-gc-low-threshold="80" Dec 03 06:45:15 crc kubenswrapper[4475]: I1203 06:45:15.375093 4475 flags.go:64] FLAG: --image-service-endpoint="" Dec 03 06:45:15 crc kubenswrapper[4475]: I1203 06:45:15.375097 4475 flags.go:64] FLAG: --kernel-memcg-notification="false" Dec 03 06:45:15 crc kubenswrapper[4475]: I1203 06:45:15.375100 4475 flags.go:64] FLAG: --kube-api-burst="100" Dec 03 06:45:15 crc kubenswrapper[4475]: I1203 06:45:15.375104 4475 flags.go:64] FLAG: --kube-api-content-type="application/vnd.kubernetes.protobuf" Dec 03 06:45:15 crc kubenswrapper[4475]: I1203 06:45:15.375109 
4475 flags.go:64] FLAG: --kube-api-qps="50" Dec 03 06:45:15 crc kubenswrapper[4475]: I1203 06:45:15.375113 4475 flags.go:64] FLAG: --kube-reserved="" Dec 03 06:45:15 crc kubenswrapper[4475]: I1203 06:45:15.375117 4475 flags.go:64] FLAG: --kube-reserved-cgroup="" Dec 03 06:45:15 crc kubenswrapper[4475]: I1203 06:45:15.375121 4475 flags.go:64] FLAG: --kubeconfig="/var/lib/kubelet/kubeconfig" Dec 03 06:45:15 crc kubenswrapper[4475]: I1203 06:45:15.375124 4475 flags.go:64] FLAG: --kubelet-cgroups="" Dec 03 06:45:15 crc kubenswrapper[4475]: I1203 06:45:15.375128 4475 flags.go:64] FLAG: --local-storage-capacity-isolation="true" Dec 03 06:45:15 crc kubenswrapper[4475]: I1203 06:45:15.375132 4475 flags.go:64] FLAG: --lock-file="" Dec 03 06:45:15 crc kubenswrapper[4475]: I1203 06:45:15.375135 4475 flags.go:64] FLAG: --log-cadvisor-usage="false" Dec 03 06:45:15 crc kubenswrapper[4475]: I1203 06:45:15.375139 4475 flags.go:64] FLAG: --log-flush-frequency="5s" Dec 03 06:45:15 crc kubenswrapper[4475]: I1203 06:45:15.375143 4475 flags.go:64] FLAG: --log-json-info-buffer-size="0" Dec 03 06:45:15 crc kubenswrapper[4475]: I1203 06:45:15.375153 4475 flags.go:64] FLAG: --log-json-split-stream="false" Dec 03 06:45:15 crc kubenswrapper[4475]: I1203 06:45:15.375157 4475 flags.go:64] FLAG: --log-text-info-buffer-size="0" Dec 03 06:45:15 crc kubenswrapper[4475]: I1203 06:45:15.375161 4475 flags.go:64] FLAG: --log-text-split-stream="false" Dec 03 06:45:15 crc kubenswrapper[4475]: I1203 06:45:15.375165 4475 flags.go:64] FLAG: --logging-format="text" Dec 03 06:45:15 crc kubenswrapper[4475]: I1203 06:45:15.375168 4475 flags.go:64] FLAG: --machine-id-file="/etc/machine-id,/var/lib/dbus/machine-id" Dec 03 06:45:15 crc kubenswrapper[4475]: I1203 06:45:15.375172 4475 flags.go:64] FLAG: --make-iptables-util-chains="true" Dec 03 06:45:15 crc kubenswrapper[4475]: I1203 06:45:15.375176 4475 flags.go:64] FLAG: --manifest-url="" Dec 03 06:45:15 crc kubenswrapper[4475]: I1203 06:45:15.375180 4475 
flags.go:64] FLAG: --manifest-url-header="" Dec 03 06:45:15 crc kubenswrapper[4475]: I1203 06:45:15.375185 4475 flags.go:64] FLAG: --max-housekeeping-interval="15s" Dec 03 06:45:15 crc kubenswrapper[4475]: I1203 06:45:15.375189 4475 flags.go:64] FLAG: --max-open-files="1000000" Dec 03 06:45:15 crc kubenswrapper[4475]: I1203 06:45:15.375194 4475 flags.go:64] FLAG: --max-pods="110" Dec 03 06:45:15 crc kubenswrapper[4475]: I1203 06:45:15.375197 4475 flags.go:64] FLAG: --maximum-dead-containers="-1" Dec 03 06:45:15 crc kubenswrapper[4475]: I1203 06:45:15.375201 4475 flags.go:64] FLAG: --maximum-dead-containers-per-container="1" Dec 03 06:45:15 crc kubenswrapper[4475]: I1203 06:45:15.375205 4475 flags.go:64] FLAG: --memory-manager-policy="None" Dec 03 06:45:15 crc kubenswrapper[4475]: I1203 06:45:15.375209 4475 flags.go:64] FLAG: --minimum-container-ttl-duration="6m0s" Dec 03 06:45:15 crc kubenswrapper[4475]: I1203 06:45:15.375213 4475 flags.go:64] FLAG: --minimum-image-ttl-duration="2m0s" Dec 03 06:45:15 crc kubenswrapper[4475]: I1203 06:45:15.375216 4475 flags.go:64] FLAG: --node-ip="192.168.126.11" Dec 03 06:45:15 crc kubenswrapper[4475]: I1203 06:45:15.375220 4475 flags.go:64] FLAG: --node-labels="node-role.kubernetes.io/control-plane=,node-role.kubernetes.io/master=,node.openshift.io/os_id=rhcos" Dec 03 06:45:15 crc kubenswrapper[4475]: I1203 06:45:15.375230 4475 flags.go:64] FLAG: --node-status-max-images="50" Dec 03 06:45:15 crc kubenswrapper[4475]: I1203 06:45:15.375234 4475 flags.go:64] FLAG: --node-status-update-frequency="10s" Dec 03 06:45:15 crc kubenswrapper[4475]: I1203 06:45:15.375237 4475 flags.go:64] FLAG: --oom-score-adj="-999" Dec 03 06:45:15 crc kubenswrapper[4475]: I1203 06:45:15.375241 4475 flags.go:64] FLAG: --pod-cidr="" Dec 03 06:45:15 crc kubenswrapper[4475]: I1203 06:45:15.375245 4475 flags.go:64] FLAG: 
--pod-infra-container-image="quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:33549946e22a9ffa738fd94b1345f90921bc8f92fa6137784cb33c77ad806f9d" Dec 03 06:45:15 crc kubenswrapper[4475]: I1203 06:45:15.375251 4475 flags.go:64] FLAG: --pod-manifest-path="" Dec 03 06:45:15 crc kubenswrapper[4475]: I1203 06:45:15.375254 4475 flags.go:64] FLAG: --pod-max-pids="-1" Dec 03 06:45:15 crc kubenswrapper[4475]: I1203 06:45:15.375258 4475 flags.go:64] FLAG: --pods-per-core="0" Dec 03 06:45:15 crc kubenswrapper[4475]: I1203 06:45:15.375262 4475 flags.go:64] FLAG: --port="10250" Dec 03 06:45:15 crc kubenswrapper[4475]: I1203 06:45:15.375265 4475 flags.go:64] FLAG: --protect-kernel-defaults="false" Dec 03 06:45:15 crc kubenswrapper[4475]: I1203 06:45:15.375269 4475 flags.go:64] FLAG: --provider-id="" Dec 03 06:45:15 crc kubenswrapper[4475]: I1203 06:45:15.375273 4475 flags.go:64] FLAG: --qos-reserved="" Dec 03 06:45:15 crc kubenswrapper[4475]: I1203 06:45:15.375276 4475 flags.go:64] FLAG: --read-only-port="10255" Dec 03 06:45:15 crc kubenswrapper[4475]: I1203 06:45:15.375280 4475 flags.go:64] FLAG: --register-node="true" Dec 03 06:45:15 crc kubenswrapper[4475]: I1203 06:45:15.375284 4475 flags.go:64] FLAG: --register-schedulable="true" Dec 03 06:45:15 crc kubenswrapper[4475]: I1203 06:45:15.375287 4475 flags.go:64] FLAG: --register-with-taints="node-role.kubernetes.io/master=:NoSchedule" Dec 03 06:45:15 crc kubenswrapper[4475]: I1203 06:45:15.375294 4475 flags.go:64] FLAG: --registry-burst="10" Dec 03 06:45:15 crc kubenswrapper[4475]: I1203 06:45:15.375298 4475 flags.go:64] FLAG: --registry-qps="5" Dec 03 06:45:15 crc kubenswrapper[4475]: I1203 06:45:15.375301 4475 flags.go:64] FLAG: --reserved-cpus="" Dec 03 06:45:15 crc kubenswrapper[4475]: I1203 06:45:15.375305 4475 flags.go:64] FLAG: --reserved-memory="" Dec 03 06:45:15 crc kubenswrapper[4475]: I1203 06:45:15.375310 4475 flags.go:64] FLAG: --resolv-conf="/etc/resolv.conf" Dec 03 06:45:15 crc kubenswrapper[4475]: I1203 
06:45:15.375314 4475 flags.go:64] FLAG: --root-dir="/var/lib/kubelet" Dec 03 06:45:15 crc kubenswrapper[4475]: I1203 06:45:15.375317 4475 flags.go:64] FLAG: --rotate-certificates="false" Dec 03 06:45:15 crc kubenswrapper[4475]: I1203 06:45:15.375321 4475 flags.go:64] FLAG: --rotate-server-certificates="false" Dec 03 06:45:15 crc kubenswrapper[4475]: I1203 06:45:15.375325 4475 flags.go:64] FLAG: --runonce="false" Dec 03 06:45:15 crc kubenswrapper[4475]: I1203 06:45:15.375328 4475 flags.go:64] FLAG: --runtime-cgroups="/system.slice/crio.service" Dec 03 06:45:15 crc kubenswrapper[4475]: I1203 06:45:15.375332 4475 flags.go:64] FLAG: --runtime-request-timeout="2m0s" Dec 03 06:45:15 crc kubenswrapper[4475]: I1203 06:45:15.375336 4475 flags.go:64] FLAG: --seccomp-default="false" Dec 03 06:45:15 crc kubenswrapper[4475]: I1203 06:45:15.375340 4475 flags.go:64] FLAG: --serialize-image-pulls="true" Dec 03 06:45:15 crc kubenswrapper[4475]: I1203 06:45:15.375344 4475 flags.go:64] FLAG: --storage-driver-buffer-duration="1m0s" Dec 03 06:45:15 crc kubenswrapper[4475]: I1203 06:45:15.375348 4475 flags.go:64] FLAG: --storage-driver-db="cadvisor" Dec 03 06:45:15 crc kubenswrapper[4475]: I1203 06:45:15.375352 4475 flags.go:64] FLAG: --storage-driver-host="localhost:8086" Dec 03 06:45:15 crc kubenswrapper[4475]: I1203 06:45:15.375356 4475 flags.go:64] FLAG: --storage-driver-password="root" Dec 03 06:45:15 crc kubenswrapper[4475]: I1203 06:45:15.375360 4475 flags.go:64] FLAG: --storage-driver-secure="false" Dec 03 06:45:15 crc kubenswrapper[4475]: I1203 06:45:15.375364 4475 flags.go:64] FLAG: --storage-driver-table="stats" Dec 03 06:45:15 crc kubenswrapper[4475]: I1203 06:45:15.375367 4475 flags.go:64] FLAG: --storage-driver-user="root" Dec 03 06:45:15 crc kubenswrapper[4475]: I1203 06:45:15.375371 4475 flags.go:64] FLAG: --streaming-connection-idle-timeout="4h0m0s" Dec 03 06:45:15 crc kubenswrapper[4475]: I1203 06:45:15.375375 4475 flags.go:64] FLAG: --sync-frequency="1m0s" Dec 03 
06:45:15 crc kubenswrapper[4475]: I1203 06:45:15.375379 4475 flags.go:64] FLAG: --system-cgroups="" Dec 03 06:45:15 crc kubenswrapper[4475]: I1203 06:45:15.375383 4475 flags.go:64] FLAG: --system-reserved="cpu=200m,ephemeral-storage=350Mi,memory=350Mi" Dec 03 06:45:15 crc kubenswrapper[4475]: I1203 06:45:15.375388 4475 flags.go:64] FLAG: --system-reserved-cgroup="" Dec 03 06:45:15 crc kubenswrapper[4475]: I1203 06:45:15.375392 4475 flags.go:64] FLAG: --tls-cert-file="" Dec 03 06:45:15 crc kubenswrapper[4475]: I1203 06:45:15.375396 4475 flags.go:64] FLAG: --tls-cipher-suites="[]" Dec 03 06:45:15 crc kubenswrapper[4475]: I1203 06:45:15.375401 4475 flags.go:64] FLAG: --tls-min-version="" Dec 03 06:45:15 crc kubenswrapper[4475]: I1203 06:45:15.375405 4475 flags.go:64] FLAG: --tls-private-key-file="" Dec 03 06:45:15 crc kubenswrapper[4475]: I1203 06:45:15.375408 4475 flags.go:64] FLAG: --topology-manager-policy="none" Dec 03 06:45:15 crc kubenswrapper[4475]: I1203 06:45:15.375412 4475 flags.go:64] FLAG: --topology-manager-policy-options="" Dec 03 06:45:15 crc kubenswrapper[4475]: I1203 06:45:15.375416 4475 flags.go:64] FLAG: --topology-manager-scope="container" Dec 03 06:45:15 crc kubenswrapper[4475]: I1203 06:45:15.375420 4475 flags.go:64] FLAG: --v="2" Dec 03 06:45:15 crc kubenswrapper[4475]: I1203 06:45:15.375425 4475 flags.go:64] FLAG: --version="false" Dec 03 06:45:15 crc kubenswrapper[4475]: I1203 06:45:15.375429 4475 flags.go:64] FLAG: --vmodule="" Dec 03 06:45:15 crc kubenswrapper[4475]: I1203 06:45:15.375435 4475 flags.go:64] FLAG: --volume-plugin-dir="/etc/kubernetes/kubelet-plugins/volume/exec" Dec 03 06:45:15 crc kubenswrapper[4475]: I1203 06:45:15.375439 4475 flags.go:64] FLAG: --volume-stats-agg-period="1m0s" Dec 03 06:45:15 crc kubenswrapper[4475]: W1203 06:45:15.375548 4475 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig Dec 03 06:45:15 crc kubenswrapper[4475]: W1203 06:45:15.375553 4475 feature_gate.go:330] unrecognized feature 
gate: SetEIPForNLBIngressController Dec 03 06:45:15 crc kubenswrapper[4475]: W1203 06:45:15.375557 4475 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig Dec 03 06:45:15 crc kubenswrapper[4475]: W1203 06:45:15.375560 4475 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation Dec 03 06:45:15 crc kubenswrapper[4475]: W1203 06:45:15.375564 4475 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS Dec 03 06:45:15 crc kubenswrapper[4475]: W1203 06:45:15.375567 4475 feature_gate.go:330] unrecognized feature gate: SignatureStores Dec 03 06:45:15 crc kubenswrapper[4475]: W1203 06:45:15.375573 4475 feature_gate.go:330] unrecognized feature gate: ManagedBootImages Dec 03 06:45:15 crc kubenswrapper[4475]: W1203 06:45:15.375577 4475 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles Dec 03 06:45:15 crc kubenswrapper[4475]: W1203 06:45:15.375580 4475 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs Dec 03 06:45:15 crc kubenswrapper[4475]: W1203 06:45:15.375583 4475 feature_gate.go:330] unrecognized feature gate: OnClusterBuild Dec 03 06:45:15 crc kubenswrapper[4475]: W1203 06:45:15.375587 4475 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification Dec 03 06:45:15 crc kubenswrapper[4475]: W1203 06:45:15.375590 4475 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration Dec 03 06:45:15 crc kubenswrapper[4475]: W1203 06:45:15.375593 4475 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters Dec 03 06:45:15 crc kubenswrapper[4475]: W1203 06:45:15.375596 4475 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Dec 03 06:45:15 crc kubenswrapper[4475]: W1203 06:45:15.375599 4475 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy Dec 03 06:45:15 crc kubenswrapper[4475]: W1203 06:45:15.375603 4475 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup Dec 03 06:45:15 crc 
kubenswrapper[4475]: W1203 06:45:15.375606 4475 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization Dec 03 06:45:15 crc kubenswrapper[4475]: W1203 06:45:15.375609 4475 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota Dec 03 06:45:15 crc kubenswrapper[4475]: W1203 06:45:15.375612 4475 feature_gate.go:330] unrecognized feature gate: UpgradeStatus Dec 03 06:45:15 crc kubenswrapper[4475]: W1203 06:45:15.375615 4475 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI Dec 03 06:45:15 crc kubenswrapper[4475]: W1203 06:45:15.375619 4475 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot Dec 03 06:45:15 crc kubenswrapper[4475]: W1203 06:45:15.375622 4475 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements Dec 03 06:45:15 crc kubenswrapper[4475]: W1203 06:45:15.375625 4475 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS Dec 03 06:45:15 crc kubenswrapper[4475]: W1203 06:45:15.375628 4475 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration Dec 03 06:45:15 crc kubenswrapper[4475]: W1203 06:45:15.375632 4475 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud Dec 03 06:45:15 crc kubenswrapper[4475]: W1203 06:45:15.375635 4475 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission Dec 03 06:45:15 crc kubenswrapper[4475]: W1203 06:45:15.375638 4475 feature_gate.go:330] unrecognized feature gate: InsightsConfig Dec 03 06:45:15 crc kubenswrapper[4475]: W1203 06:45:15.375641 4475 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform Dec 03 06:45:15 crc kubenswrapper[4475]: W1203 06:45:15.375645 4475 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics Dec 03 06:45:15 crc kubenswrapper[4475]: W1203 06:45:15.375648 4475 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS Dec 03 06:45:15 crc kubenswrapper[4475]: W1203 06:45:15.375651 4475 feature_gate.go:330] 
unrecognized feature gate: ChunkSizeMiB Dec 03 06:45:15 crc kubenswrapper[4475]: W1203 06:45:15.375656 4475 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. Dec 03 06:45:15 crc kubenswrapper[4475]: W1203 06:45:15.375660 4475 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS Dec 03 06:45:15 crc kubenswrapper[4475]: W1203 06:45:15.375663 4475 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall Dec 03 06:45:15 crc kubenswrapper[4475]: W1203 06:45:15.375666 4475 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack Dec 03 06:45:15 crc kubenswrapper[4475]: W1203 06:45:15.375669 4475 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes Dec 03 06:45:15 crc kubenswrapper[4475]: W1203 06:45:15.375672 4475 feature_gate.go:330] unrecognized feature gate: HardwareSpeed Dec 03 06:45:15 crc kubenswrapper[4475]: W1203 06:45:15.375675 4475 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy Dec 03 06:45:15 crc kubenswrapper[4475]: W1203 06:45:15.375683 4475 feature_gate.go:330] unrecognized feature gate: PinnedImages Dec 03 06:45:15 crc kubenswrapper[4475]: W1203 06:45:15.375686 4475 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet Dec 03 06:45:15 crc kubenswrapper[4475]: W1203 06:45:15.375690 4475 feature_gate.go:330] unrecognized feature gate: NewOLM Dec 03 06:45:15 crc kubenswrapper[4475]: W1203 06:45:15.375693 4475 feature_gate.go:330] unrecognized feature gate: GatewayAPI Dec 03 06:45:15 crc kubenswrapper[4475]: W1203 06:45:15.375696 4475 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor Dec 03 06:45:15 crc kubenswrapper[4475]: W1203 06:45:15.375699 4475 feature_gate.go:330] unrecognized feature gate: Example Dec 03 06:45:15 crc kubenswrapper[4475]: W1203 06:45:15.375703 4475 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP Dec 03 06:45:15 crc kubenswrapper[4475]: W1203 
06:45:15.375707 4475 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release. Dec 03 06:45:15 crc kubenswrapper[4475]: W1203 06:45:15.375711 4475 feature_gate.go:330] unrecognized feature gate: DNSNameResolver Dec 03 06:45:15 crc kubenswrapper[4475]: W1203 06:45:15.375715 4475 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration Dec 03 06:45:15 crc kubenswrapper[4475]: W1203 06:45:15.375719 4475 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity Dec 03 06:45:15 crc kubenswrapper[4475]: W1203 06:45:15.375723 4475 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks Dec 03 06:45:15 crc kubenswrapper[4475]: W1203 06:45:15.375726 4475 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy Dec 03 06:45:15 crc kubenswrapper[4475]: W1203 06:45:15.375730 4475 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release. 
Dec 03 06:45:15 crc kubenswrapper[4475]: W1203 06:45:15.375734 4475 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets Dec 03 06:45:15 crc kubenswrapper[4475]: W1203 06:45:15.375738 4475 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager Dec 03 06:45:15 crc kubenswrapper[4475]: W1203 06:45:15.375741 4475 feature_gate.go:330] unrecognized feature gate: OVNObservability Dec 03 06:45:15 crc kubenswrapper[4475]: W1203 06:45:15.375745 4475 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS Dec 03 06:45:15 crc kubenswrapper[4475]: W1203 06:45:15.375748 4475 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes Dec 03 06:45:15 crc kubenswrapper[4475]: W1203 06:45:15.375752 4475 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement Dec 03 06:45:15 crc kubenswrapper[4475]: W1203 06:45:15.375755 4475 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure Dec 03 06:45:15 crc kubenswrapper[4475]: W1203 06:45:15.375759 4475 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release. 
Dec 03 06:45:15 crc kubenswrapper[4475]: W1203 06:45:15.375764 4475 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer
Dec 03 06:45:15 crc kubenswrapper[4475]: W1203 06:45:15.375769 4475 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource
Dec 03 06:45:15 crc kubenswrapper[4475]: W1203 06:45:15.375772 4475 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS
Dec 03 06:45:15 crc kubenswrapper[4475]: W1203 06:45:15.375776 4475 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation
Dec 03 06:45:15 crc kubenswrapper[4475]: W1203 06:45:15.375779 4475 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion
Dec 03 06:45:15 crc kubenswrapper[4475]: W1203 06:45:15.375782 4475 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities
Dec 03 06:45:15 crc kubenswrapper[4475]: W1203 06:45:15.375786 4475 feature_gate.go:330] unrecognized feature gate: ExternalOIDC
Dec 03 06:45:15 crc kubenswrapper[4475]: W1203 06:45:15.375789 4475 feature_gate.go:330] unrecognized feature gate: PlatformOperators
Dec 03 06:45:15 crc kubenswrapper[4475]: W1203 06:45:15.375792 4475 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags
Dec 03 06:45:15 crc kubenswrapper[4475]: W1203 06:45:15.375795 4475 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode
Dec 03 06:45:15 crc kubenswrapper[4475]: W1203 06:45:15.375800 4475 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather
Dec 03 06:45:15 crc kubenswrapper[4475]: I1203 06:45:15.375810 4475 feature_gate.go:386] feature gates: {map[CloudDualStackNodeIPs:true DisableKubeletCloudCredentialProviders:true DynamicResourceAllocation:false EventedPLEG:false KMSv1:true MaxUnavailableStatefulSet:false NodeSwap:false ProcMountType:false RouteExternalCertificate:false ServiceAccountTokenNodeBinding:false TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:false UserNamespacesSupport:false ValidatingAdmissionPolicy:true VolumeAttributesClass:false]}
Dec 03 06:45:15 crc kubenswrapper[4475]: I1203 06:45:15.381303 4475 server.go:491] "Kubelet version" kubeletVersion="v1.31.5"
Dec 03 06:45:15 crc kubenswrapper[4475]: I1203 06:45:15.381324 4475 server.go:493] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK=""
Dec 03 06:45:15 crc kubenswrapper[4475]: W1203 06:45:15.381391 4475 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Dec 03 06:45:15 crc kubenswrapper[4475]: W1203 06:45:15.381402 4475 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization
Dec 03 06:45:15 crc kubenswrapper[4475]: W1203 06:45:15.381406 4475 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig
Dec 03 06:45:15 crc kubenswrapper[4475]: W1203 06:45:15.381410 4475 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs
Dec 03 06:45:15 crc kubenswrapper[4475]: W1203 06:45:15.381414 4475 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI
Dec 03 06:45:15 crc kubenswrapper[4475]: W1203 06:45:15.381417 4475 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup
Dec 03 06:45:15 crc kubenswrapper[4475]: W1203 06:45:15.381420 4475 feature_gate.go:330] unrecognized feature gate: SignatureStores
Dec 03 06:45:15 crc kubenswrapper[4475]: W1203 06:45:15.381424 4475 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation
Dec 03 06:45:15 crc kubenswrapper[4475]: W1203 06:45:15.381427 4475 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB
Dec 03 06:45:15 crc kubenswrapper[4475]: W1203 06:45:15.381430 4475 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration
Dec 03 06:45:15 crc kubenswrapper[4475]: W1203 06:45:15.381434 4475 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota
Dec 03 06:45:15 crc kubenswrapper[4475]: W1203 06:45:15.381438 4475 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy
Dec 03 06:45:15 crc kubenswrapper[4475]: W1203 06:45:15.381441 4475 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes
Dec 03 06:45:15 crc kubenswrapper[4475]: W1203 06:45:15.381445 4475 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement
Dec 03 06:45:15 crc kubenswrapper[4475]: W1203 06:45:15.381460 4475 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Dec 03 06:45:15 crc kubenswrapper[4475]: W1203 06:45:15.381464 4475 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes
Dec 03 06:45:15 crc kubenswrapper[4475]: W1203 06:45:15.381467 4475 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform
Dec 03 06:45:15 crc kubenswrapper[4475]: W1203 06:45:15.381470 4475 feature_gate.go:330] unrecognized feature gate: ManagedBootImages
Dec 03 06:45:15 crc kubenswrapper[4475]: W1203 06:45:15.381474 4475 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release.
Dec 03 06:45:15 crc kubenswrapper[4475]: W1203 06:45:15.381479 4475 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor
Dec 03 06:45:15 crc kubenswrapper[4475]: W1203 06:45:15.381483 4475 feature_gate.go:330] unrecognized feature gate: PinnedImages
Dec 03 06:45:15 crc kubenswrapper[4475]: W1203 06:45:15.381486 4475 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification
Dec 03 06:45:15 crc kubenswrapper[4475]: W1203 06:45:15.381490 4475 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather
Dec 03 06:45:15 crc kubenswrapper[4475]: W1203 06:45:15.381493 4475 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles
Dec 03 06:45:15 crc kubenswrapper[4475]: W1203 06:45:15.381496 4475 feature_gate.go:330] unrecognized feature gate: Example
Dec 03 06:45:15 crc kubenswrapper[4475]: W1203 06:45:15.381499 4475 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements
Dec 03 06:45:15 crc kubenswrapper[4475]: W1203 06:45:15.381503 4475 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController
Dec 03 06:45:15 crc kubenswrapper[4475]: W1203 06:45:15.381507 4475 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release.
Dec 03 06:45:15 crc kubenswrapper[4475]: W1203 06:45:15.381511 4475 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS
Dec 03 06:45:15 crc kubenswrapper[4475]: W1203 06:45:15.381515 4475 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release.
Dec 03 06:45:15 crc kubenswrapper[4475]: W1203 06:45:15.381519 4475 feature_gate.go:330] unrecognized feature gate: OVNObservability
Dec 03 06:45:15 crc kubenswrapper[4475]: W1203 06:45:15.381523 4475 feature_gate.go:330] unrecognized feature gate: OnClusterBuild
Dec 03 06:45:15 crc kubenswrapper[4475]: W1203 06:45:15.381527 4475 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot
Dec 03 06:45:15 crc kubenswrapper[4475]: W1203 06:45:15.381531 4475 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall
Dec 03 06:45:15 crc kubenswrapper[4475]: W1203 06:45:15.381541 4475 feature_gate.go:330] unrecognized feature gate: UpgradeStatus
Dec 03 06:45:15 crc kubenswrapper[4475]: W1203 06:45:15.381545 4475 feature_gate.go:330] unrecognized feature gate: GatewayAPI
Dec 03 06:45:15 crc kubenswrapper[4475]: W1203 06:45:15.381548 4475 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet
Dec 03 06:45:15 crc kubenswrapper[4475]: W1203 06:45:15.381552 4475 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode
Dec 03 06:45:15 crc kubenswrapper[4475]: W1203 06:45:15.381555 4475 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission
Dec 03 06:45:15 crc kubenswrapper[4475]: W1203 06:45:15.381558 4475 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion
Dec 03 06:45:15 crc kubenswrapper[4475]: W1203 06:45:15.381561 4475 feature_gate.go:330] unrecognized feature gate: NewOLM
Dec 03 06:45:15 crc kubenswrapper[4475]: W1203 06:45:15.381564 4475 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics
Dec 03 06:45:15 crc kubenswrapper[4475]: W1203 06:45:15.381567 4475 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Dec 03 06:45:15 crc kubenswrapper[4475]: W1203 06:45:15.381570 4475 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity
Dec 03 06:45:15 crc kubenswrapper[4475]: W1203 06:45:15.381573 4475 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters
Dec 03 06:45:15 crc kubenswrapper[4475]: W1203 06:45:15.381577 4475 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets
Dec 03 06:45:15 crc kubenswrapper[4475]: W1203 06:45:15.381580 4475 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure
Dec 03 06:45:15 crc kubenswrapper[4475]: W1203 06:45:15.381583 4475 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS
Dec 03 06:45:15 crc kubenswrapper[4475]: W1203 06:45:15.381586 4475 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks
Dec 03 06:45:15 crc kubenswrapper[4475]: W1203 06:45:15.381590 4475 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags
Dec 03 06:45:15 crc kubenswrapper[4475]: W1203 06:45:15.381593 4475 feature_gate.go:330] unrecognized feature gate: DNSNameResolver
Dec 03 06:45:15 crc kubenswrapper[4475]: W1203 06:45:15.381596 4475 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS
Dec 03 06:45:15 crc kubenswrapper[4475]: W1203 06:45:15.381600 4475 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Dec 03 06:45:15 crc kubenswrapper[4475]: W1203 06:45:15.381604 4475 feature_gate.go:330] unrecognized feature gate: HardwareSpeed
Dec 03 06:45:15 crc kubenswrapper[4475]: W1203 06:45:15.381607 4475 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig
Dec 03 06:45:15 crc kubenswrapper[4475]: W1203 06:45:15.381610 4475 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS
Dec 03 06:45:15 crc kubenswrapper[4475]: W1203 06:45:15.381613 4475 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration
Dec 03 06:45:15 crc kubenswrapper[4475]: W1203 06:45:15.381618 4475 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities
Dec 03 06:45:15 crc kubenswrapper[4475]: W1203 06:45:15.381621 4475 feature_gate.go:330] unrecognized feature gate: ExternalOIDC
Dec 03 06:45:15 crc kubenswrapper[4475]: W1203 06:45:15.381625 4475 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource
Dec 03 06:45:15 crc kubenswrapper[4475]: W1203 06:45:15.381628 4475 feature_gate.go:330] unrecognized feature gate: PlatformOperators
Dec 03 06:45:15 crc kubenswrapper[4475]: W1203 06:45:15.381631 4475 feature_gate.go:330] unrecognized feature gate: InsightsConfig
Dec 03 06:45:15 crc kubenswrapper[4475]: W1203 06:45:15.381634 4475 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP
Dec 03 06:45:15 crc kubenswrapper[4475]: W1203 06:45:15.381637 4475 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy
Dec 03 06:45:15 crc kubenswrapper[4475]: W1203 06:45:15.381640 4475 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation
Dec 03 06:45:15 crc kubenswrapper[4475]: W1203 06:45:15.381643 4475 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack
Dec 03 06:45:15 crc kubenswrapper[4475]: W1203 06:45:15.381646 4475 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration
Dec 03 06:45:15 crc kubenswrapper[4475]: W1203 06:45:15.381649 4475 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud
Dec 03 06:45:15 crc kubenswrapper[4475]: W1203 06:45:15.381652 4475 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer
Dec 03 06:45:15 crc kubenswrapper[4475]: W1203 06:45:15.381655 4475 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS
Dec 03 06:45:15 crc kubenswrapper[4475]: W1203 06:45:15.381660 4475 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS
Dec 03 06:45:15 crc kubenswrapper[4475]: I1203 06:45:15.381666 4475 feature_gate.go:386] feature gates: {map[CloudDualStackNodeIPs:true DisableKubeletCloudCredentialProviders:true DynamicResourceAllocation:false EventedPLEG:false KMSv1:true MaxUnavailableStatefulSet:false NodeSwap:false ProcMountType:false RouteExternalCertificate:false ServiceAccountTokenNodeBinding:false TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:false UserNamespacesSupport:false ValidatingAdmissionPolicy:true VolumeAttributesClass:false]}
Dec 03 06:45:15 crc kubenswrapper[4475]: W1203 06:45:15.381932 4475 feature_gate.go:330] unrecognized feature gate: PinnedImages
Dec 03 06:45:15 crc kubenswrapper[4475]: W1203 06:45:15.381952 4475 feature_gate.go:330] unrecognized feature gate: SignatureStores
Dec 03 06:45:15 crc kubenswrapper[4475]: W1203 06:45:15.381956 4475 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Dec 03 06:45:15 crc kubenswrapper[4475]: W1203 06:45:15.381960 4475 feature_gate.go:330] unrecognized feature gate: HardwareSpeed
Dec 03 06:45:15 crc kubenswrapper[4475]: W1203 06:45:15.381964 4475 feature_gate.go:330] unrecognized feature gate: ManagedBootImages
Dec 03 06:45:15 crc kubenswrapper[4475]: W1203 06:45:15.381967 4475 feature_gate.go:330] unrecognized feature gate: ExternalOIDC
Dec 03 06:45:15 crc kubenswrapper[4475]: W1203 06:45:15.381971 4475 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor
Dec 03 06:45:15 crc kubenswrapper[4475]: W1203 06:45:15.381974 4475 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI
Dec 03 06:45:15 crc kubenswrapper[4475]: W1203 06:45:15.381979 4475 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release.
Dec 03 06:45:15 crc kubenswrapper[4475]: W1203 06:45:15.381984 4475 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota
Dec 03 06:45:15 crc kubenswrapper[4475]: W1203 06:45:15.381987 4475 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode
Dec 03 06:45:15 crc kubenswrapper[4475]: W1203 06:45:15.381991 4475 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy
Dec 03 06:45:15 crc kubenswrapper[4475]: W1203 06:45:15.381994 4475 feature_gate.go:330] unrecognized feature gate: NewOLM
Dec 03 06:45:15 crc kubenswrapper[4475]: W1203 06:45:15.381998 4475 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities
Dec 03 06:45:15 crc kubenswrapper[4475]: W1203 06:45:15.382002 4475 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Dec 03 06:45:15 crc kubenswrapper[4475]: W1203 06:45:15.382005 4475 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet
Dec 03 06:45:15 crc kubenswrapper[4475]: W1203 06:45:15.382008 4475 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement
Dec 03 06:45:15 crc kubenswrapper[4475]: W1203 06:45:15.382012 4475 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS
Dec 03 06:45:15 crc kubenswrapper[4475]: W1203 06:45:15.382016 4475 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles
Dec 03 06:45:15 crc kubenswrapper[4475]: W1203 06:45:15.382019 4475 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack
Dec 03 06:45:15 crc kubenswrapper[4475]: W1203 06:45:15.382022 4475 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS
Dec 03 06:45:15 crc kubenswrapper[4475]: W1203 06:45:15.382025 4475 feature_gate.go:330] unrecognized feature gate: DNSNameResolver
Dec 03 06:45:15 crc kubenswrapper[4475]: W1203 06:45:15.382029 4475 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation
Dec 03 06:45:15 crc kubenswrapper[4475]: W1203 06:45:15.382032 4475 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy
Dec 03 06:45:15 crc kubenswrapper[4475]: W1203 06:45:15.382035 4475 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot
Dec 03 06:45:15 crc kubenswrapper[4475]: W1203 06:45:15.382039 4475 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation
Dec 03 06:45:15 crc kubenswrapper[4475]: W1203 06:45:15.382042 4475 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration
Dec 03 06:45:15 crc kubenswrapper[4475]: W1203 06:45:15.382045 4475 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity
Dec 03 06:45:15 crc kubenswrapper[4475]: W1203 06:45:15.382048 4475 feature_gate.go:330] unrecognized feature gate: PlatformOperators
Dec 03 06:45:15 crc kubenswrapper[4475]: W1203 06:45:15.382051 4475 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization
Dec 03 06:45:15 crc kubenswrapper[4475]: W1203 06:45:15.382055 4475 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Dec 03 06:45:15 crc kubenswrapper[4475]: W1203 06:45:15.382058 4475 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure
Dec 03 06:45:15 crc kubenswrapper[4475]: W1203 06:45:15.382062 4475 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB
Dec 03 06:45:15 crc kubenswrapper[4475]: W1203 06:45:15.382065 4475 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Dec 03 06:45:15 crc kubenswrapper[4475]: W1203 06:45:15.382068 4475 feature_gate.go:330] unrecognized feature gate: OnClusterBuild
Dec 03 06:45:15 crc kubenswrapper[4475]: W1203 06:45:15.382072 4475 feature_gate.go:330] unrecognized feature gate: InsightsConfig
Dec 03 06:45:15 crc kubenswrapper[4475]: W1203 06:45:15.382075 4475 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics
Dec 03 06:45:15 crc kubenswrapper[4475]: W1203 06:45:15.382078 4475 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags
Dec 03 06:45:15 crc kubenswrapper[4475]: W1203 06:45:15.382081 4475 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion
Dec 03 06:45:15 crc kubenswrapper[4475]: W1203 06:45:15.382084 4475 feature_gate.go:330] unrecognized feature gate: OVNObservability
Dec 03 06:45:15 crc kubenswrapper[4475]: W1203 06:45:15.382088 4475 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks
Dec 03 06:45:15 crc kubenswrapper[4475]: W1203 06:45:15.382091 4475 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS
Dec 03 06:45:15 crc kubenswrapper[4475]: W1203 06:45:15.382094 4475 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification
Dec 03 06:45:15 crc kubenswrapper[4475]: W1203 06:45:15.382097 4475 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig
Dec 03 06:45:15 crc kubenswrapper[4475]: W1203 06:45:15.382100 4475 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs
Dec 03 06:45:15 crc kubenswrapper[4475]: W1203 06:45:15.382104 4475 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform
Dec 03 06:45:15 crc kubenswrapper[4475]: W1203 06:45:15.382107 4475 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall
Dec 03 06:45:15 crc kubenswrapper[4475]: W1203 06:45:15.382110 4475 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController
Dec 03 06:45:15 crc kubenswrapper[4475]: W1203 06:45:15.382113 4475 feature_gate.go:330] unrecognized feature gate: Example
Dec 03 06:45:15 crc kubenswrapper[4475]: W1203 06:45:15.382116 4475 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS
Dec 03 06:45:15 crc kubenswrapper[4475]: W1203 06:45:15.382119 4475 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission
Dec 03 06:45:15 crc kubenswrapper[4475]: W1203 06:45:15.382122 4475 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets
Dec 03 06:45:15 crc kubenswrapper[4475]: W1203 06:45:15.382126 4475 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters
Dec 03 06:45:15 crc kubenswrapper[4475]: W1203 06:45:15.382129 4475 feature_gate.go:330] unrecognized feature gate: UpgradeStatus
Dec 03 06:45:15 crc kubenswrapper[4475]: W1203 06:45:15.382132 4475 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration
Dec 03 06:45:15 crc kubenswrapper[4475]: W1203 06:45:15.382135 4475 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS
Dec 03 06:45:15 crc kubenswrapper[4475]: W1203 06:45:15.382139 4475 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release.
Dec 03 06:45:15 crc kubenswrapper[4475]: W1203 06:45:15.382143 4475 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup
Dec 03 06:45:15 crc kubenswrapper[4475]: W1203 06:45:15.382147 4475 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather
Dec 03 06:45:15 crc kubenswrapper[4475]: W1203 06:45:15.382150 4475 feature_gate.go:330] unrecognized feature gate: GatewayAPI
Dec 03 06:45:15 crc kubenswrapper[4475]: W1203 06:45:15.382153 4475 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig
Dec 03 06:45:15 crc kubenswrapper[4475]: W1203 06:45:15.382157 4475 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes
Dec 03 06:45:15 crc kubenswrapper[4475]: W1203 06:45:15.382160 4475 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer
Dec 03 06:45:15 crc kubenswrapper[4475]: W1203 06:45:15.382163 4475 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes
Dec 03 06:45:15 crc kubenswrapper[4475]: W1203 06:45:15.382166 4475 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration
Dec 03 06:45:15 crc kubenswrapper[4475]: W1203 06:45:15.382169 4475 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource
Dec 03 06:45:15 crc kubenswrapper[4475]: W1203 06:45:15.382172 4475 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements
Dec 03 06:45:15 crc kubenswrapper[4475]: W1203 06:45:15.382176 4475 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release.
Dec 03 06:45:15 crc kubenswrapper[4475]: W1203 06:45:15.382180 4475 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud
Dec 03 06:45:15 crc kubenswrapper[4475]: W1203 06:45:15.382184 4475 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS
Dec 03 06:45:15 crc kubenswrapper[4475]: W1203 06:45:15.382188 4475 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP
Dec 03 06:45:15 crc kubenswrapper[4475]: I1203 06:45:15.382194 4475 feature_gate.go:386] feature gates: {map[CloudDualStackNodeIPs:true DisableKubeletCloudCredentialProviders:true DynamicResourceAllocation:false EventedPLEG:false KMSv1:true MaxUnavailableStatefulSet:false NodeSwap:false ProcMountType:false RouteExternalCertificate:false ServiceAccountTokenNodeBinding:false TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:false UserNamespacesSupport:false ValidatingAdmissionPolicy:true VolumeAttributesClass:false]}
Dec 03 06:45:15 crc kubenswrapper[4475]: I1203 06:45:15.382302 4475 server.go:940] "Client rotation is on, will bootstrap in background"
Dec 03 06:45:15 crc kubenswrapper[4475]: I1203 06:45:15.384878 4475 bootstrap.go:85] "Current kubeconfig file contents are still valid, no bootstrap necessary"
Dec 03 06:45:15 crc kubenswrapper[4475]: I1203 06:45:15.384952 4475 certificate_store.go:130] Loading cert/key pair from "/var/lib/kubelet/pki/kubelet-client-current.pem".
Dec 03 06:45:15 crc kubenswrapper[4475]: I1203 06:45:15.385709 4475 server.go:997] "Starting client certificate rotation"
Dec 03 06:45:15 crc kubenswrapper[4475]: I1203 06:45:15.385736 4475 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Certificate rotation is enabled
Dec 03 06:45:15 crc kubenswrapper[4475]: I1203 06:45:15.385860 4475 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Certificate expiration is 2026-02-24 05:52:08 +0000 UTC, rotation deadline is 2026-01-18 06:41:01.127186877 +0000 UTC
Dec 03 06:45:15 crc kubenswrapper[4475]: I1203 06:45:15.385905 4475 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Waiting 1103h55m45.741283703s for next certificate rotation
Dec 03 06:45:15 crc kubenswrapper[4475]: I1203 06:45:15.396292 4475 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt"
Dec 03 06:45:15 crc kubenswrapper[4475]: I1203 06:45:15.398535 4475 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt"
Dec 03 06:45:15 crc kubenswrapper[4475]: I1203 06:45:15.405994 4475 log.go:25] "Validated CRI v1 runtime API"
Dec 03 06:45:15 crc kubenswrapper[4475]: I1203 06:45:15.425127 4475 log.go:25] "Validated CRI v1 image API"
Dec 03 06:45:15 crc kubenswrapper[4475]: I1203 06:45:15.426040 4475 server.go:1437] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd"
Dec 03 06:45:15 crc kubenswrapper[4475]: I1203 06:45:15.429796 4475 fs.go:133] Filesystem UUIDs: map[0b076daa-c26a-46d2-b3a6-72a8dbc6e257:/dev/vda4 2025-12-03-06-41-47-00:/dev/sr0 7B77-95E7:/dev/vda2 de0497b0-db1b-465a-b278-03db02455c71:/dev/vda3]
Dec 03 06:45:15 crc kubenswrapper[4475]: I1203 06:45:15.429820 4475 fs.go:134] Filesystem partitions: map[/dev/shm:{mountpoint:/dev/shm major:0 minor:22 fsType:tmpfs blockSize:0} /dev/vda3:{mountpoint:/boot major:252 minor:3 fsType:ext4 blockSize:0} /dev/vda4:{mountpoint:/var major:252 minor:4 fsType:xfs blockSize:0} /run:{mountpoint:/run major:0 minor:24 fsType:tmpfs blockSize:0} /run/user/1000:{mountpoint:/run/user/1000 major:0 minor:49 fsType:tmpfs blockSize:0} /tmp:{mountpoint:/tmp major:0 minor:30 fsType:tmpfs blockSize:0} /var/lib/containers/storage/overlay-containers/75d81934760b26101869fbd8e4b5954c62b019c1cc3e5a0c9f82ed8de46b3b22/userdata/shm:{mountpoint:/var/lib/containers/storage/overlay-containers/75d81934760b26101869fbd8e4b5954c62b019c1cc3e5a0c9f82ed8de46b3b22/userdata/shm major:0 minor:42 fsType:tmpfs blockSize:0} /var/lib/etcd:{mountpoint:/var/lib/etcd major:0 minor:50 fsType:tmpfs blockSize:0} overlay_0-43:{mountpoint:/var/lib/containers/storage/overlay/94b752e0a51c0134b00ddef6dc7a933a9d7c1d9bdc88a18dae4192a0d557d623/merged major:0 minor:43 fsType:overlay blockSize:0}]
Dec 03 06:45:15 crc kubenswrapper[4475]: I1203 06:45:15.438527 4475 manager.go:217] Machine: {Timestamp:2025-12-03 06:45:15.437411865 +0000 UTC m=+0.242310219 CPUVendorID:AuthenticAMD NumCores:8 NumPhysicalCores:1 NumSockets:8 CpuFrequency:2445404 MemoryCapacity:25199480832 SwapCapacity:0 MemoryByType:map[] NVMInfo:{MemoryModeCapacity:0 AppDirectModeCapacity:0 AvgPowerBudget:0} HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] MachineID:21801e6708c44f15b81395eb736a7cec SystemUUID:6c3f70a9-a9d8-4b80-a825-7a6426aa17aa BootID:b860fac6-8533-4b4b-bdad-0cb0561d1495 Filesystems:[{Device:overlay_0-43 DeviceMajor:0 DeviceMinor:43 Capacity:85292941312 Type:vfs Inodes:41679680 HasInodes:true} {Device:/run/user/1000 DeviceMajor:0 DeviceMinor:49 Capacity:2519945216 Type:vfs Inodes:615221 HasInodes:true} {Device:/var/lib/etcd DeviceMajor:0 DeviceMinor:50 Capacity:1073741824 Type:vfs Inodes:3076108 HasInodes:true} {Device:/dev/shm DeviceMajor:0 DeviceMinor:22 Capacity:12599738368 Type:vfs Inodes:3076108 HasInodes:true} {Device:/run DeviceMajor:0 DeviceMinor:24 Capacity:5039898624 Type:vfs Inodes:819200 HasInodes:true} {Device:/dev/vda4 DeviceMajor:252 DeviceMinor:4 Capacity:85292941312 Type:vfs Inodes:41679680 HasInodes:true} {Device:/tmp DeviceMajor:0 DeviceMinor:30 Capacity:12599742464 Type:vfs Inodes:1048576 HasInodes:true} {Device:/dev/vda3 DeviceMajor:252 DeviceMinor:3 Capacity:366869504 Type:vfs Inodes:98304 HasInodes:true} {Device:/var/lib/containers/storage/overlay-containers/75d81934760b26101869fbd8e4b5954c62b019c1cc3e5a0c9f82ed8de46b3b22/userdata/shm DeviceMajor:0 DeviceMinor:42 Capacity:65536000 Type:vfs Inodes:3076108 HasInodes:true}] DiskMap:map[252:0:{Name:vda Major:252 Minor:0 Size:429496729600 Scheduler:none}] NetworkDevices:[{Name:br-ex MacAddress:fa:16:3e:12:26:77 Speed:0 Mtu:1500} {Name:br-int MacAddress:d6:39:55:2e:22:71 Speed:0 Mtu:1400} {Name:enp3s0 MacAddress:fa:16:3e:12:26:77 Speed:-1 Mtu:1500} {Name:enp7s0 MacAddress:fa:16:3e:33:ee:ff Speed:-1 Mtu:1440} {Name:enp7s0.20 MacAddress:52:54:00:72:a8:7a Speed:-1 Mtu:1436} {Name:enp7s0.21 MacAddress:52:54:00:6d:47:98 Speed:-1 Mtu:1436} {Name:enp7s0.22 MacAddress:52:54:00:94:0a:43 Speed:-1 Mtu:1436} {Name:eth10 MacAddress:36:29:68:5c:db:24 Speed:0 Mtu:1500} {Name:ovn-k8s-mp0 MacAddress:0a:58:0a:d9:00:02 Speed:0 Mtu:1400} {Name:ovs-system MacAddress:0e:d8:c0:1d:76:43 Speed:0 Mtu:1500}] Topology:[{Id:0 Memory:25199480832 HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] Cores:[{Id:0 Threads:[0] Caches:[{Id:0 Size:65536 Type:Data Level:1} {Id:0 Size:65536 Type:Instruction Level:1} {Id:0 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:0 Size:16777216 Type:Unified Level:3}] SocketID:0 BookID: DrawerID:} {Id:0 Threads:[1] Caches:[{Id:1 Size:65536 Type:Data Level:1} {Id:1 Size:65536 Type:Instruction Level:1} {Id:1 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:1 Size:16777216 Type:Unified Level:3}] SocketID:1 BookID: DrawerID:} {Id:0 Threads:[2] Caches:[{Id:2 Size:65536 Type:Data Level:1} {Id:2 Size:65536 Type:Instruction Level:1} {Id:2 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:2 Size:16777216 Type:Unified Level:3}] SocketID:2 BookID: DrawerID:} {Id:0 Threads:[3] Caches:[{Id:3 Size:65536 Type:Data Level:1} {Id:3 Size:65536 Type:Instruction Level:1} {Id:3 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:3 Size:16777216 Type:Unified Level:3}] SocketID:3 BookID: DrawerID:} {Id:0 Threads:[4] Caches:[{Id:4 Size:65536 Type:Data Level:1} {Id:4 Size:65536 Type:Instruction Level:1} {Id:4 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:4 Size:16777216 Type:Unified Level:3}] SocketID:4 BookID: DrawerID:} {Id:0 Threads:[5] Caches:[{Id:5 Size:65536 Type:Data Level:1} {Id:5 Size:65536 Type:Instruction Level:1} {Id:5 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:5 Size:16777216 Type:Unified Level:3}] SocketID:5 BookID: DrawerID:} {Id:0 Threads:[6] Caches:[{Id:6 Size:65536 Type:Data Level:1} {Id:6 Size:65536 Type:Instruction Level:1} {Id:6 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:6 Size:16777216 Type:Unified Level:3}] SocketID:6 BookID: DrawerID:} {Id:0 Threads:[7] Caches:[{Id:7 Size:65536 Type:Data Level:1} {Id:7 Size:65536 Type:Instruction Level:1} {Id:7 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:7 Size:16777216 Type:Unified Level:3}] SocketID:7 BookID: DrawerID:}] Caches:[] Distances:[10]}] CloudProvider:Unknown InstanceType:Unknown InstanceID:None}
Dec 03 06:45:15 crc kubenswrapper[4475]: I1203 06:45:15.438660 4475 manager_no_libpfm.go:29] cAdvisor is build without cgo and/or libpfm support. Perf event counters are not available.
Dec 03 06:45:15 crc kubenswrapper[4475]: I1203 06:45:15.438767 4475 manager.go:233] Version: {KernelVersion:5.14.0-427.50.2.el9_4.x86_64 ContainerOsVersion:Red Hat Enterprise Linux CoreOS 418.94.202502100215-0 DockerVersion: DockerAPIVersion: CadvisorVersion: CadvisorRevision:}
Dec 03 06:45:15 crc kubenswrapper[4475]: I1203 06:45:15.439407 4475 swap_util.go:113] "Swap is on" /proc/swaps contents="Filename\t\t\t\tType\t\tSize\t\tUsed\t\tPriority"
Dec 03 06:45:15 crc kubenswrapper[4475]: I1203 06:45:15.439565 4475 container_manager_linux.go:267] "Container manager verified user specified cgroup-root exists" cgroupRoot=[]
Dec 03 06:45:15 crc kubenswrapper[4475]: I1203 06:45:15.439592 4475 container_manager_linux.go:272] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"crc","RuntimeCgroupsName":"/system.slice/crio.service","SystemCgroupsName":"/system.slice","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":true,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":{"cpu":"200m","ephemeral-storage":"350Mi","memory":"350Mi"},"HardEvictionThresholds":[{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"ExperimentalMemoryManagerPolicy":"None","ExperimentalMemoryManagerReservedMemory":null,"PodPidsLimit":4096,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2}
Dec 03 06:45:15 crc kubenswrapper[4475]: I1203 06:45:15.439728 4475 topology_manager.go:138] "Creating topology manager with none policy"
Dec 03 06:45:15 crc kubenswrapper[4475]: I1203 06:45:15.439736 4475 container_manager_linux.go:303] "Creating device plugin manager"
Dec 03 06:45:15 crc kubenswrapper[4475]: I1203 06:45:15.440119 4475 manager.go:142] "Creating Device Plugin manager" path="/var/lib/kubelet/device-plugins/kubelet.sock"
Dec 03 06:45:15 crc kubenswrapper[4475]: I1203 06:45:15.440138 4475 server.go:66] "Creating device plugin registration server" version="v1beta1" socket="/var/lib/kubelet/device-plugins/kubelet.sock"
Dec 03 06:45:15 crc kubenswrapper[4475]: I1203 06:45:15.440521 4475 state_mem.go:36] "Initialized new in-memory state store"
Dec 03 06:45:15 crc kubenswrapper[4475]: I1203 06:45:15.440590 4475 server.go:1245] "Using root directory" path="/var/lib/kubelet"
Dec 03 06:45:15 crc kubenswrapper[4475]: I1203 06:45:15.442613 4475 kubelet.go:418] "Attempting to sync node with API server"
Dec 03 06:45:15 crc kubenswrapper[4475]: I1203 06:45:15.442630 4475 kubelet.go:313] "Adding static pod path" path="/etc/kubernetes/manifests"
Dec 03 06:45:15 crc kubenswrapper[4475]: I1203 06:45:15.442666 4475 file.go:69] "Watching path" path="/etc/kubernetes/manifests"
Dec 03 06:45:15 crc kubenswrapper[4475]: I1203 06:45:15.442677 4475 kubelet.go:324] "Adding apiserver pod source"
Dec 03 06:45:15 crc kubenswrapper[4475]: I1203 06:45:15.442686 4475 apiserver.go:42] "Waiting for node sync before watching apiserver pods"
Dec 03 06:45:15 crc kubenswrapper[4475]: I1203 06:45:15.444404 4475 kuberuntime_manager.go:262] "Container runtime initialized" containerRuntime="cri-o" version="1.31.5-4.rhaos4.18.gitdad78d5.el9" apiVersion="v1"
Dec 03 06:45:15 crc kubenswrapper[4475]: I1203 06:45:15.444831 4475 certificate_store.go:130] Loading cert/key pair from "/var/lib/kubelet/pki/kubelet-server-current.pem".
Dec 03 06:45:15 crc kubenswrapper[4475]: I1203 06:45:15.446015 4475 kubelet.go:854] "Not starting ClusterTrustBundle informer because we are in static kubelet mode"
Dec 03 06:45:15 crc kubenswrapper[4475]: W1203 06:45:15.446048 4475 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": dial tcp 192.168.25.177:6443: connect: connection refused
Dec 03 06:45:15 crc kubenswrapper[4475]: W1203 06:45:15.446064 4475 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": dial tcp 192.168.25.177:6443: connect: connection refused
Dec 03 06:45:15 crc kubenswrapper[4475]: E1203 06:45:15.446386 4475 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0\": dial tcp 192.168.25.177:6443: connect: connection refused" logger="UnhandledError"
Dec 03 06:45:15 crc kubenswrapper[4475]: E1203 06:45:15.446390 4475 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 192.168.25.177:6443: connect: connection refused" logger="UnhandledError"
Dec 03 06:45:15 crc kubenswrapper[4475]: I1203 06:45:15.446915 4475 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/portworx-volume"
Dec 03 06:45:15 crc kubenswrapper[4475]: I1203 06:45:15.446948 4475 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/empty-dir"
Dec 03 06:45:15 crc kubenswrapper[4475]: I1203 06:45:15.446955 4475 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/git-repo"
Dec 03 06:45:15 crc kubenswrapper[4475]: I1203 06:45:15.446962 4475 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/host-path"
Dec 03 06:45:15 crc kubenswrapper[4475]: I1203 06:45:15.446972 4475 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/nfs"
Dec 03 06:45:15 crc kubenswrapper[4475]: I1203 06:45:15.446978 4475 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/secret"
Dec 03 06:45:15 crc kubenswrapper[4475]: I1203 06:45:15.446990 4475 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/iscsi"
Dec 03 06:45:15 crc kubenswrapper[4475]: I1203 06:45:15.446999 4475 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/downward-api"
Dec 03 06:45:15 crc kubenswrapper[4475]: I1203 06:45:15.447005 4475 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/fc"
Dec 03 06:45:15 crc kubenswrapper[4475]: I1203 06:45:15.447012 4475 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/configmap"
Dec 03 06:45:15 crc kubenswrapper[4475]: I1203 06:45:15.447032 4475 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/projected"
Dec 03 06:45:15 crc kubenswrapper[4475]: I1203 06:45:15.447039 4475 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/local-volume"
Dec 03 06:45:15 crc kubenswrapper[4475]: I1203 06:45:15.447392 4475 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/csi"
Dec 03 06:45:15 crc kubenswrapper[4475]: I1203 06:45:15.447732 4475 server.go:1280] "Started kubelet"
Dec 03 06:45:15 crc kubenswrapper[4475]: I1203 
06:45:15.448031 4475 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 192.168.25.177:6443: connect: connection refused Dec 03 06:45:15 crc kubenswrapper[4475]: I1203 06:45:15.448220 4475 server.go:163] "Starting to listen" address="0.0.0.0" port=10250 Dec 03 06:45:15 crc kubenswrapper[4475]: I1203 06:45:15.448229 4475 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10 Dec 03 06:45:15 crc systemd[1]: Started Kubernetes Kubelet. Dec 03 06:45:15 crc kubenswrapper[4475]: I1203 06:45:15.451174 4475 server.go:236] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock" Dec 03 06:45:15 crc kubenswrapper[4475]: I1203 06:45:15.452012 4475 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate rotation is enabled Dec 03 06:45:15 crc kubenswrapper[4475]: I1203 06:45:15.452030 4475 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer" Dec 03 06:45:15 crc kubenswrapper[4475]: I1203 06:45:15.453644 4475 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-12 07:42:18.236863903 +0000 UTC Dec 03 06:45:15 crc kubenswrapper[4475]: E1203 06:45:15.453876 4475 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Dec 03 06:45:15 crc kubenswrapper[4475]: E1203 06:45:15.453974 4475 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 192.168.25.177:6443: connect: connection refused" interval="200ms" Dec 03 06:45:15 crc kubenswrapper[4475]: I1203 06:45:15.454087 4475 volume_manager.go:287] "The desired_state_of_world populator starts" Dec 03 06:45:15 crc 
kubenswrapper[4475]: I1203 06:45:15.455174 4475 volume_manager.go:289] "Starting Kubelet Volume Manager" Dec 03 06:45:15 crc kubenswrapper[4475]: I1203 06:45:15.455276 4475 desired_state_of_world_populator.go:146] "Desired state populator starts to run" Dec 03 06:45:15 crc kubenswrapper[4475]: W1203 06:45:15.455524 4475 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 192.168.25.177:6443: connect: connection refused Dec 03 06:45:15 crc kubenswrapper[4475]: E1203 06:45:15.455561 4475 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 192.168.25.177:6443: connect: connection refused" logger="UnhandledError" Dec 03 06:45:15 crc kubenswrapper[4475]: E1203 06:45:15.455240 4475 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/default/events\": dial tcp 192.168.25.177:6443: connect: connection refused" event="&Event{ObjectMeta:{crc.187da19ad9a69547 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2025-12-03 06:45:15.447711047 +0000 UTC m=+0.252609381,LastTimestamp:2025-12-03 06:45:15.447711047 +0000 UTC m=+0.252609381,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Dec 03 06:45:15 crc kubenswrapper[4475]: I1203 06:45:15.456630 4475 factory.go:153] Registering CRI-O factory Dec 03 06:45:15 crc kubenswrapper[4475]: I1203 
06:45:15.456655 4475 factory.go:221] Registration of the crio container factory successfully Dec 03 06:45:15 crc kubenswrapper[4475]: I1203 06:45:15.456711 4475 factory.go:219] Registration of the containerd container factory failed: unable to create containerd client: containerd: cannot unix dial containerd api service: dial unix /run/containerd/containerd.sock: connect: no such file or directory Dec 03 06:45:15 crc kubenswrapper[4475]: I1203 06:45:15.456720 4475 factory.go:55] Registering systemd factory Dec 03 06:45:15 crc kubenswrapper[4475]: I1203 06:45:15.456726 4475 factory.go:221] Registration of the systemd container factory successfully Dec 03 06:45:15 crc kubenswrapper[4475]: I1203 06:45:15.456747 4475 factory.go:103] Registering Raw factory Dec 03 06:45:15 crc kubenswrapper[4475]: I1203 06:45:15.456759 4475 manager.go:1196] Started watching for new ooms in manager Dec 03 06:45:15 crc kubenswrapper[4475]: I1203 06:45:15.459326 4475 server.go:460] "Adding debug handlers to kubelet server" Dec 03 06:45:15 crc kubenswrapper[4475]: I1203 06:45:15.462244 4475 manager.go:319] Starting recovery of all containers Dec 03 06:45:15 crc kubenswrapper[4475]: I1203 06:45:15.462621 4475 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1386a44e-36a2-460c-96d0-0359d2b6f0f5" volumeName="kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config" seLinuxMountContext="" Dec 03 06:45:15 crc kubenswrapper[4475]: I1203 06:45:15.462652 4475 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5" seLinuxMountContext="" Dec 03 06:45:15 crc kubenswrapper[4475]: I1203 06:45:15.462664 4475 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" 
volumeName="kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls" seLinuxMountContext="" Dec 03 06:45:15 crc kubenswrapper[4475]: I1203 06:45:15.462672 4475 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config" seLinuxMountContext="" Dec 03 06:45:15 crc kubenswrapper[4475]: I1203 06:45:15.462681 4475 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="96b93a3a-6083-4aea-8eab-fe1aa8245ad9" volumeName="kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7" seLinuxMountContext="" Dec 03 06:45:15 crc kubenswrapper[4475]: I1203 06:45:15.462690 4475 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth" seLinuxMountContext="" Dec 03 06:45:15 crc kubenswrapper[4475]: I1203 06:45:15.462698 4475 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="efdd0498-1daa-4136-9a4a-3b948c2293fc" volumeName="kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs" seLinuxMountContext="" Dec 03 06:45:15 crc kubenswrapper[4475]: I1203 06:45:15.462725 4475 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn" seLinuxMountContext="" Dec 03 06:45:15 crc kubenswrapper[4475]: I1203 06:45:15.462736 4475 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert" 
seLinuxMountContext="" Dec 03 06:45:15 crc kubenswrapper[4475]: I1203 06:45:15.462744 4475 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images" seLinuxMountContext="" Dec 03 06:45:15 crc kubenswrapper[4475]: I1203 06:45:15.462752 4475 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3ab1a177-2de0-46d9-b765-d0d0649bb42e" volumeName="kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj" seLinuxMountContext="" Dec 03 06:45:15 crc kubenswrapper[4475]: I1203 06:45:15.462760 4475 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3ab1a177-2de0-46d9-b765-d0d0649bb42e" volumeName="kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert" seLinuxMountContext="" Dec 03 06:45:15 crc kubenswrapper[4475]: I1203 06:45:15.462768 4475 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca" seLinuxMountContext="" Dec 03 06:45:15 crc kubenswrapper[4475]: I1203 06:45:15.462777 4475 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6731426b-95fe-49ff-bb5f-40441049fde2" volumeName="kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh" seLinuxMountContext="" Dec 03 06:45:15 crc kubenswrapper[4475]: I1203 06:45:15.462784 4475 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7539238d-5fe0-46ed-884e-1c3b566537ec" volumeName="kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c" seLinuxMountContext="" Dec 03 06:45:15 crc 
kubenswrapper[4475]: I1203 06:45:15.462791 4475 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b78653f-4ff9-4508-8672-245ed9b561e3" volumeName="kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert" seLinuxMountContext="" Dec 03 06:45:15 crc kubenswrapper[4475]: I1203 06:45:15.462821 4475 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config" seLinuxMountContext="" Dec 03 06:45:15 crc kubenswrapper[4475]: I1203 06:45:15.462829 4475 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="4bb40260-dbaa-4fb0-84df-5e680505d512" volumeName="kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh" seLinuxMountContext="" Dec 03 06:45:15 crc kubenswrapper[4475]: I1203 06:45:15.462837 4475 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="57a731c4-ef35-47a8-b875-bfb08a7f8011" volumeName="kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct" seLinuxMountContext="" Dec 03 06:45:15 crc kubenswrapper[4475]: I1203 06:45:15.462846 4475 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle" seLinuxMountContext="" Dec 03 06:45:15 crc kubenswrapper[4475]: I1203 06:45:15.462854 4475 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7539238d-5fe0-46ed-884e-1c3b566537ec" volumeName="kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config" seLinuxMountContext="" Dec 03 06:45:15 crc kubenswrapper[4475]: I1203 06:45:15.462862 4475 reconstruct.go:130] "Volume is marked 
as uncertain and added into the actual state" pod="" podName="bd23aa5c-e532-4e53-bccf-e79f130c5ae8" volumeName="kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2" seLinuxMountContext="" Dec 03 06:45:15 crc kubenswrapper[4475]: I1203 06:45:15.462869 4475 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="496e6271-fb68-4057-954e-a0d97a4afa3f" volumeName="kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access" seLinuxMountContext="" Dec 03 06:45:15 crc kubenswrapper[4475]: I1203 06:45:15.462877 4475 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert" seLinuxMountContext="" Dec 03 06:45:15 crc kubenswrapper[4475]: I1203 06:45:15.462886 4475 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5225d0e4-402f-4861-b410-819f433b1803" volumeName="kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7" seLinuxMountContext="" Dec 03 06:45:15 crc kubenswrapper[4475]: I1203 06:45:15.462893 4475 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe579f8-e8a6-4643-bce5-a661393c4dde" volumeName="kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs" seLinuxMountContext="" Dec 03 06:45:15 crc kubenswrapper[4475]: I1203 06:45:15.462903 4475 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="87cf06ed-a83f-41a7-828d-70653580a8cb" volumeName="kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx" seLinuxMountContext="" Dec 03 06:45:15 crc kubenswrapper[4475]: I1203 06:45:15.462911 4475 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" 
podName="fda69060-fa79-4696-b1a6-7980f124bf7c" volumeName="kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config" seLinuxMountContext="" Dec 03 06:45:15 crc kubenswrapper[4475]: I1203 06:45:15.462919 4475 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert" seLinuxMountContext="" Dec 03 06:45:15 crc kubenswrapper[4475]: I1203 06:45:15.462927 4475 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5225d0e4-402f-4861-b410-819f433b1803" volumeName="kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities" seLinuxMountContext="" Dec 03 06:45:15 crc kubenswrapper[4475]: I1203 06:45:15.462957 4475 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" volumeName="kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782" seLinuxMountContext="" Dec 03 06:45:15 crc kubenswrapper[4475]: I1203 06:45:15.462965 4475 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token" seLinuxMountContext="" Dec 03 06:45:15 crc kubenswrapper[4475]: I1203 06:45:15.462972 4475 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1386a44e-36a2-460c-96d0-0359d2b6f0f5" volumeName="kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert" seLinuxMountContext="" Dec 03 06:45:15 crc kubenswrapper[4475]: I1203 06:45:15.462979 4475 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" 
volumeName="kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp" seLinuxMountContext="" Dec 03 06:45:15 crc kubenswrapper[4475]: I1203 06:45:15.462986 4475 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls" seLinuxMountContext="" Dec 03 06:45:15 crc kubenswrapper[4475]: I1203 06:45:15.462995 4475 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs" seLinuxMountContext="" Dec 03 06:45:15 crc kubenswrapper[4475]: I1203 06:45:15.463003 4475 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3b6479f0-333b-4a96-9adf-2099afdc2447" volumeName="kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr" seLinuxMountContext="" Dec 03 06:45:15 crc kubenswrapper[4475]: I1203 06:45:15.463010 4475 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="87cf06ed-a83f-41a7-828d-70653580a8cb" volumeName="kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume" seLinuxMountContext="" Dec 03 06:45:15 crc kubenswrapper[4475]: I1203 06:45:15.463017 4475 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert" seLinuxMountContext="" Dec 03 06:45:15 crc kubenswrapper[4475]: I1203 06:45:15.463025 4475 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" 
volumeName="kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert" seLinuxMountContext="" Dec 03 06:45:15 crc kubenswrapper[4475]: I1203 06:45:15.463032 4475 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config" seLinuxMountContext="" Dec 03 06:45:15 crc kubenswrapper[4475]: I1203 06:45:15.463041 4475 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client" seLinuxMountContext="" Dec 03 06:45:15 crc kubenswrapper[4475]: I1203 06:45:15.463048 4475 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7bb08738-c794-4ee8-9972-3a62ca171029" volumeName="kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist" seLinuxMountContext="" Dec 03 06:45:15 crc kubenswrapper[4475]: I1203 06:45:15.463056 4475 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca" seLinuxMountContext="" Dec 03 06:45:15 crc kubenswrapper[4475]: I1203 06:45:15.463073 4475 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert" seLinuxMountContext="" Dec 03 06:45:15 crc kubenswrapper[4475]: I1203 06:45:15.463081 4475 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token" 
seLinuxMountContext="" Dec 03 06:45:15 crc kubenswrapper[4475]: I1203 06:45:15.463088 4475 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/secret/ef543e1b-8068-4ea3-b32a-61027b32e95d-webhook-cert" seLinuxMountContext="" Dec 03 06:45:15 crc kubenswrapper[4475]: I1203 06:45:15.463098 4475 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca" seLinuxMountContext="" Dec 03 06:45:15 crc kubenswrapper[4475]: I1203 06:45:15.463106 4475 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies" seLinuxMountContext="" Dec 03 06:45:15 crc kubenswrapper[4475]: I1203 06:45:15.463114 4475 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="4bb40260-dbaa-4fb0-84df-5e680505d512" volumeName="kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy" seLinuxMountContext="" Dec 03 06:45:15 crc kubenswrapper[4475]: I1203 06:45:15.463122 4475 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl" seLinuxMountContext="" Dec 03 06:45:15 crc kubenswrapper[4475]: I1203 06:45:15.463129 4475 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d751cbb-f2e2-430d-9754-c882a5e924a5" volumeName="kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl" seLinuxMountContext="" Dec 03 06:45:15 crc kubenswrapper[4475]: I1203 
06:45:15.463141 4475 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" volumeName="kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd" seLinuxMountContext="" Dec 03 06:45:15 crc kubenswrapper[4475]: I1203 06:45:15.463149 4475 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="01ab3dd5-8196-46d0-ad33-122e2ca51def" volumeName="kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j" seLinuxMountContext="" Dec 03 06:45:15 crc kubenswrapper[4475]: I1203 06:45:15.463157 4475 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert" seLinuxMountContext="" Dec 03 06:45:15 crc kubenswrapper[4475]: I1203 06:45:15.463166 4475 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca" seLinuxMountContext="" Dec 03 06:45:15 crc kubenswrapper[4475]: I1203 06:45:15.463173 4475 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert" seLinuxMountContext="" Dec 03 06:45:15 crc kubenswrapper[4475]: I1203 06:45:15.463181 4475 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config" seLinuxMountContext="" Dec 03 06:45:15 crc kubenswrapper[4475]: I1203 06:45:15.463188 4475 reconstruct.go:130] "Volume is marked as uncertain and added into the 
actual state" pod="" podName="7bb08738-c794-4ee8-9972-3a62ca171029" volumeName="kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy" seLinuxMountContext="" Dec 03 06:45:15 crc kubenswrapper[4475]: I1203 06:45:15.463195 4475 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" volumeName="kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content" seLinuxMountContext="" Dec 03 06:45:15 crc kubenswrapper[4475]: I1203 06:45:15.463204 4475 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="e7e6199b-1264-4501-8953-767f51328d08" volumeName="kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert" seLinuxMountContext="" Dec 03 06:45:15 crc kubenswrapper[4475]: I1203 06:45:15.463211 4475 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca" seLinuxMountContext="" Dec 03 06:45:15 crc kubenswrapper[4475]: I1203 06:45:15.463218 4475 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="25e176fe-21b4-4974-b1ed-c8b94f112a7f" volumeName="kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle" seLinuxMountContext="" Dec 03 06:45:15 crc kubenswrapper[4475]: I1203 06:45:15.463226 4475 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="37a5e44f-9a88-4405-be8a-b645485e7312" volumeName="kubernetes.io/secret/37a5e44f-9a88-4405-be8a-b645485e7312-metrics-tls" seLinuxMountContext="" Dec 03 06:45:15 crc kubenswrapper[4475]: I1203 06:45:15.463233 4475 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="496e6271-fb68-4057-954e-a0d97a4afa3f" 
volumeName="kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert" seLinuxMountContext="" Dec 03 06:45:15 crc kubenswrapper[4475]: I1203 06:45:15.463241 4475 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="e7e6199b-1264-4501-8953-767f51328d08" volumeName="kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access" seLinuxMountContext="" Dec 03 06:45:15 crc kubenswrapper[4475]: I1203 06:45:15.463248 4475 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config" seLinuxMountContext="" Dec 03 06:45:15 crc kubenswrapper[4475]: I1203 06:45:15.463255 4475 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert" seLinuxMountContext="" Dec 03 06:45:15 crc kubenswrapper[4475]: I1203 06:45:15.463263 4475 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" volumeName="kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh" seLinuxMountContext="" Dec 03 06:45:15 crc kubenswrapper[4475]: I1203 06:45:15.463270 4475 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca" seLinuxMountContext="" Dec 03 06:45:15 crc kubenswrapper[4475]: I1203 06:45:15.463277 4475 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="01ab3dd5-8196-46d0-ad33-122e2ca51def" volumeName="kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config" seLinuxMountContext="" 
Dec 03 06:45:15 crc kubenswrapper[4475]: I1203 06:45:15.463286 4475 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config" seLinuxMountContext="" Dec 03 06:45:15 crc kubenswrapper[4475]: I1203 06:45:15.463293 4475 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" volumeName="kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca" seLinuxMountContext="" Dec 03 06:45:15 crc kubenswrapper[4475]: I1203 06:45:15.463301 4475 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig" seLinuxMountContext="" Dec 03 06:45:15 crc kubenswrapper[4475]: I1203 06:45:15.463309 4475 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz" seLinuxMountContext="" Dec 03 06:45:15 crc kubenswrapper[4475]: I1203 06:45:15.463317 4475 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" volumeName="kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf" seLinuxMountContext="" Dec 03 06:45:15 crc kubenswrapper[4475]: I1203 06:45:15.463324 4475 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle" seLinuxMountContext="" Dec 03 06:45:15 crc kubenswrapper[4475]: I1203 06:45:15.463332 4475 
reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles" seLinuxMountContext="" Dec 03 06:45:15 crc kubenswrapper[4475]: I1203 06:45:15.463339 4475 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7" seLinuxMountContext="" Dec 03 06:45:15 crc kubenswrapper[4475]: I1203 06:45:15.463346 4475 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6312bbd-5731-4ea0-a20f-81d5a57df44a" volumeName="kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert" seLinuxMountContext="" Dec 03 06:45:15 crc kubenswrapper[4475]: I1203 06:45:15.463353 4475 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle" seLinuxMountContext="" Dec 03 06:45:15 crc kubenswrapper[4475]: I1203 06:45:15.463361 4475 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" volumeName="kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config" seLinuxMountContext="" Dec 03 06:45:15 crc kubenswrapper[4475]: I1203 06:45:15.463368 4475 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="01ab3dd5-8196-46d0-ad33-122e2ca51def" volumeName="kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert" seLinuxMountContext="" Dec 03 06:45:15 crc kubenswrapper[4475]: I1203 06:45:15.463375 4475 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" 
pod="" podName="0b574797-001e-440a-8f4e-c0be86edad0f" volumeName="kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config" seLinuxMountContext="" Dec 03 06:45:15 crc kubenswrapper[4475]: I1203 06:45:15.463383 4475 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b574797-001e-440a-8f4e-c0be86edad0f" volumeName="kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88" seLinuxMountContext="" Dec 03 06:45:15 crc kubenswrapper[4475]: I1203 06:45:15.463390 4475 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client" seLinuxMountContext="" Dec 03 06:45:15 crc kubenswrapper[4475]: I1203 06:45:15.463398 4475 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="25e176fe-21b4-4974-b1ed-c8b94f112a7f" volumeName="kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv" seLinuxMountContext="" Dec 03 06:45:15 crc kubenswrapper[4475]: I1203 06:45:15.463406 4475 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca" seLinuxMountContext="" Dec 03 06:45:15 crc kubenswrapper[4475]: I1203 06:45:15.463413 4475 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides" seLinuxMountContext="" Dec 03 06:45:15 crc kubenswrapper[4475]: I1203 06:45:15.463421 4475 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" 
volumeName="kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs" seLinuxMountContext="" Dec 03 06:45:15 crc kubenswrapper[4475]: I1203 06:45:15.463428 4475 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle" seLinuxMountContext="" Dec 03 06:45:15 crc kubenswrapper[4475]: I1203 06:45:15.463435 4475 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert" seLinuxMountContext="" Dec 03 06:45:15 crc kubenswrapper[4475]: I1203 06:45:15.463443 4475 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls" seLinuxMountContext="" Dec 03 06:45:15 crc kubenswrapper[4475]: I1203 06:45:15.463490 4475 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert" seLinuxMountContext="" Dec 03 06:45:15 crc kubenswrapper[4475]: I1203 06:45:15.463498 4475 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config" seLinuxMountContext="" Dec 03 06:45:15 crc kubenswrapper[4475]: I1203 06:45:15.463505 4475 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a0128f3a-b052-44ed-a84e-c4c8aaf17c13" volumeName="kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls" 
seLinuxMountContext="" Dec 03 06:45:15 crc kubenswrapper[4475]: I1203 06:45:15.463512 4475 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5" seLinuxMountContext="" Dec 03 06:45:15 crc kubenswrapper[4475]: I1203 06:45:15.463520 4475 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert" seLinuxMountContext="" Dec 03 06:45:15 crc kubenswrapper[4475]: I1203 06:45:15.463527 4475 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates" seLinuxMountContext="" Dec 03 06:45:15 crc kubenswrapper[4475]: I1203 06:45:15.463535 4475 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls" seLinuxMountContext="" Dec 03 06:45:15 crc kubenswrapper[4475]: I1203 06:45:15.463542 4475 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token" seLinuxMountContext="" Dec 03 06:45:15 crc kubenswrapper[4475]: I1203 06:45:15.463549 4475 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session" seLinuxMountContext="" Dec 03 06:45:15 crc kubenswrapper[4475]: 
I1203 06:45:15.463557 4475 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls" seLinuxMountContext="" Dec 03 06:45:15 crc kubenswrapper[4475]: I1203 06:45:15.463568 4475 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/projected/ef543e1b-8068-4ea3-b32a-61027b32e95d-kube-api-access-s2kz5" seLinuxMountContext="" Dec 03 06:45:15 crc kubenswrapper[4475]: I1203 06:45:15.463585 4475 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca" seLinuxMountContext="" Dec 03 06:45:15 crc kubenswrapper[4475]: I1203 06:45:15.463594 4475 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf" seLinuxMountContext="" Dec 03 06:45:15 crc kubenswrapper[4475]: I1203 06:45:15.463603 4475 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5225d0e4-402f-4861-b410-819f433b1803" volumeName="kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content" seLinuxMountContext="" Dec 03 06:45:15 crc kubenswrapper[4475]: I1203 06:45:15.463613 4475 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="57a731c4-ef35-47a8-b875-bfb08a7f8011" volumeName="kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities" seLinuxMountContext="" Dec 03 06:45:15 crc kubenswrapper[4475]: I1203 06:45:15.463621 4475 reconstruct.go:130] "Volume is marked as 
uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets" seLinuxMountContext="" Dec 03 06:45:15 crc kubenswrapper[4475]: I1203 06:45:15.463629 4475 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" volumeName="kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics" seLinuxMountContext="" Dec 03 06:45:15 crc kubenswrapper[4475]: I1203 06:45:15.463637 4475 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="d75a4c96-2883-4a0b-bab2-0fab2b6c0b49" volumeName="kubernetes.io/projected/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-kube-api-access-rczfb" seLinuxMountContext="" Dec 03 06:45:15 crc kubenswrapper[4475]: I1203 06:45:15.463645 4475 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" volumeName="kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert" seLinuxMountContext="" Dec 03 06:45:15 crc kubenswrapper[4475]: I1203 06:45:15.463653 4475 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca" seLinuxMountContext="" Dec 03 06:45:15 crc kubenswrapper[4475]: I1203 06:45:15.463661 4475 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" volumeName="kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert" seLinuxMountContext="" Dec 03 06:45:15 crc kubenswrapper[4475]: I1203 06:45:15.463669 4475 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" 
podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle" seLinuxMountContext="" Dec 03 06:45:15 crc kubenswrapper[4475]: I1203 06:45:15.463676 4475 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="44663579-783b-4372-86d6-acf235a62d72" volumeName="kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc" seLinuxMountContext="" Dec 03 06:45:15 crc kubenswrapper[4475]: I1203 06:45:15.463684 4475 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config" seLinuxMountContext="" Dec 03 06:45:15 crc kubenswrapper[4475]: I1203 06:45:15.463692 4475 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config" seLinuxMountContext="" Dec 03 06:45:15 crc kubenswrapper[4475]: I1203 06:45:15.463700 4475 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config" seLinuxMountContext="" Dec 03 06:45:15 crc kubenswrapper[4475]: I1203 06:45:15.463708 4475 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="496e6271-fb68-4057-954e-a0d97a4afa3f" volumeName="kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config" seLinuxMountContext="" Dec 03 06:45:15 crc kubenswrapper[4475]: I1203 06:45:15.463716 4475 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe579f8-e8a6-4643-bce5-a661393c4dde" 
volumeName="kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token" seLinuxMountContext="" Dec 03 06:45:15 crc kubenswrapper[4475]: I1203 06:45:15.463723 4475 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52" seLinuxMountContext="" Dec 03 06:45:15 crc kubenswrapper[4475]: I1203 06:45:15.463731 4475 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" volumeName="kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config" seLinuxMountContext="" Dec 03 06:45:15 crc kubenswrapper[4475]: I1203 06:45:15.463739 4475 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config" seLinuxMountContext="" Dec 03 06:45:15 crc kubenswrapper[4475]: I1203 06:45:15.463745 4475 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="57a731c4-ef35-47a8-b875-bfb08a7f8011" volumeName="kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content" seLinuxMountContext="" Dec 03 06:45:15 crc kubenswrapper[4475]: I1203 06:45:15.463753 4475 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="87cf06ed-a83f-41a7-828d-70653580a8cb" volumeName="kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls" seLinuxMountContext="" Dec 03 06:45:15 crc kubenswrapper[4475]: I1203 06:45:15.463760 4475 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides" 
seLinuxMountContext="" Dec 03 06:45:15 crc kubenswrapper[4475]: I1203 06:45:15.463767 4475 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5b88f790-22fa-440e-b583-365168c0b23d" volumeName="kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn" seLinuxMountContext="" Dec 03 06:45:15 crc kubenswrapper[4475]: I1203 06:45:15.463774 4475 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7bb08738-c794-4ee8-9972-3a62ca171029" volumeName="kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb" seLinuxMountContext="" Dec 03 06:45:15 crc kubenswrapper[4475]: I1203 06:45:15.463782 4475 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6312bbd-5731-4ea0-a20f-81d5a57df44a" volumeName="kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr" seLinuxMountContext="" Dec 03 06:45:15 crc kubenswrapper[4475]: I1203 06:45:15.463790 4475 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6312bbd-5731-4ea0-a20f-81d5a57df44a" volumeName="kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert" seLinuxMountContext="" Dec 03 06:45:15 crc kubenswrapper[4475]: I1203 06:45:15.463797 4475 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk" seLinuxMountContext="" Dec 03 06:45:15 crc kubenswrapper[4475]: I1203 06:45:15.463804 4475 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-ovnkube-identity-cm" seLinuxMountContext="" Dec 03 06:45:15 crc kubenswrapper[4475]: 
I1203 06:45:15.463812 4475 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7" seLinuxMountContext="" Dec 03 06:45:15 crc kubenswrapper[4475]: I1203 06:45:15.463819 4475 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1d611f23-29be-4491-8495-bee1670e935f" volumeName="kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities" seLinuxMountContext="" Dec 03 06:45:15 crc kubenswrapper[4475]: I1203 06:45:15.463829 4475 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5b88f790-22fa-440e-b583-365168c0b23d" volumeName="kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs" seLinuxMountContext="" Dec 03 06:45:15 crc kubenswrapper[4475]: I1203 06:45:15.463837 4475 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" volumeName="kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert" seLinuxMountContext="" Dec 03 06:45:15 crc kubenswrapper[4475]: I1203 06:45:15.463844 4475 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config" seLinuxMountContext="" Dec 03 06:45:15 crc kubenswrapper[4475]: I1203 06:45:15.463852 4475 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" volumeName="kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert" seLinuxMountContext="" Dec 03 06:45:15 crc kubenswrapper[4475]: I1203 06:45:15.463859 4475 reconstruct.go:130] "Volume is marked as uncertain and 
added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate" seLinuxMountContext="" Dec 03 06:45:15 crc kubenswrapper[4475]: I1203 06:45:15.463866 4475 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" volumeName="kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca" seLinuxMountContext="" Dec 03 06:45:15 crc kubenswrapper[4475]: I1203 06:45:15.463873 4475 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv" seLinuxMountContext="" Dec 03 06:45:15 crc kubenswrapper[4475]: I1203 06:45:15.463880 4475 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert" seLinuxMountContext="" Dec 03 06:45:15 crc kubenswrapper[4475]: I1203 06:45:15.463887 4475 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="25e176fe-21b4-4974-b1ed-c8b94f112a7f" volumeName="kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key" seLinuxMountContext="" Dec 03 06:45:15 crc kubenswrapper[4475]: I1203 06:45:15.463894 4475 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="37a5e44f-9a88-4405-be8a-b645485e7312" volumeName="kubernetes.io/projected/37a5e44f-9a88-4405-be8a-b645485e7312-kube-api-access-rdwmf" seLinuxMountContext="" Dec 03 06:45:15 crc kubenswrapper[4475]: I1203 06:45:15.463901 4475 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" 
podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config" seLinuxMountContext="" Dec 03 06:45:15 crc kubenswrapper[4475]: I1203 06:45:15.463908 4475 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7539238d-5fe0-46ed-884e-1c3b566537ec" volumeName="kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert" seLinuxMountContext="" Dec 03 06:45:15 crc kubenswrapper[4475]: I1203 06:45:15.463915 4475 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a0128f3a-b052-44ed-a84e-c4c8aaf17c13" volumeName="kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m" seLinuxMountContext="" Dec 03 06:45:15 crc kubenswrapper[4475]: I1203 06:45:15.463922 4475 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" volumeName="kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg" seLinuxMountContext="" Dec 03 06:45:15 crc kubenswrapper[4475]: I1203 06:45:15.463930 4475 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b78653f-4ff9-4508-8672-245ed9b561e3" volumeName="kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca" seLinuxMountContext="" Dec 03 06:45:15 crc kubenswrapper[4475]: I1203 06:45:15.463949 4475 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit" seLinuxMountContext="" Dec 03 06:45:15 crc kubenswrapper[4475]: I1203 06:45:15.463957 4475 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" 
volumeName="kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp" seLinuxMountContext="" Dec 03 06:45:15 crc kubenswrapper[4475]: I1203 06:45:15.463964 4475 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection" seLinuxMountContext="" Dec 03 06:45:15 crc kubenswrapper[4475]: I1203 06:45:15.463971 4475 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" seLinuxMountContext="" Dec 03 06:45:15 crc kubenswrapper[4475]: I1203 06:45:15.464918 4475 reconstruct.go:144] "Volume is marked device as uncertain and added into the actual state" volumeName="kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" deviceMountPath="/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/globalmount" Dec 03 06:45:15 crc kubenswrapper[4475]: I1203 06:45:15.464950 4475 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca" seLinuxMountContext="" Dec 03 06:45:15 crc kubenswrapper[4475]: I1203 06:45:15.464961 4475 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="d75a4c96-2883-4a0b-bab2-0fab2b6c0b49" volumeName="kubernetes.io/configmap/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-iptables-alerter-script" seLinuxMountContext="" Dec 03 06:45:15 crc kubenswrapper[4475]: I1203 06:45:15.464970 4475 reconstruct.go:130] "Volume is marked as uncertain and 
added into the actual state" pod="" podName="fda69060-fa79-4696-b1a6-7980f124bf7c" volumeName="kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh" seLinuxMountContext="" Dec 03 06:45:15 crc kubenswrapper[4475]: I1203 06:45:15.464977 4475 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert" seLinuxMountContext="" Dec 03 06:45:15 crc kubenswrapper[4475]: I1203 06:45:15.464985 4475 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz" seLinuxMountContext="" Dec 03 06:45:15 crc kubenswrapper[4475]: I1203 06:45:15.464992 4475 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6" seLinuxMountContext="" Dec 03 06:45:15 crc kubenswrapper[4475]: I1203 06:45:15.465000 4475 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="efdd0498-1daa-4136-9a4a-3b948c2293fc" volumeName="kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt" seLinuxMountContext="" Dec 03 06:45:15 crc kubenswrapper[4475]: I1203 06:45:15.465007 4475 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz" seLinuxMountContext="" Dec 03 06:45:15 crc kubenswrapper[4475]: I1203 06:45:15.465014 4475 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" 
podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client" seLinuxMountContext="" Dec 03 06:45:15 crc kubenswrapper[4475]: I1203 06:45:15.465021 4475 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle" seLinuxMountContext="" Dec 03 06:45:15 crc kubenswrapper[4475]: I1203 06:45:15.465029 4475 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1d611f23-29be-4491-8495-bee1670e935f" volumeName="kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content" seLinuxMountContext="" Dec 03 06:45:15 crc kubenswrapper[4475]: I1203 06:45:15.465036 4475 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted" seLinuxMountContext="" Dec 03 06:45:15 crc kubenswrapper[4475]: I1203 06:45:15.465044 4475 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs" seLinuxMountContext="" Dec 03 06:45:15 crc kubenswrapper[4475]: I1203 06:45:15.465052 4475 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config" seLinuxMountContext="" Dec 03 06:45:15 crc kubenswrapper[4475]: I1203 06:45:15.465058 4475 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" 
volumeName="kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config" seLinuxMountContext="" Dec 03 06:45:15 crc kubenswrapper[4475]: I1203 06:45:15.465066 4475 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49ef4625-1d3a-4a9f-b595-c2433d32326d" volumeName="kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v" seLinuxMountContext="" Dec 03 06:45:15 crc kubenswrapper[4475]: I1203 06:45:15.465073 4475 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images" seLinuxMountContext="" Dec 03 06:45:15 crc kubenswrapper[4475]: I1203 06:45:15.465080 4475 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca" seLinuxMountContext="" Dec 03 06:45:15 crc kubenswrapper[4475]: I1203 06:45:15.465087 4475 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1d611f23-29be-4491-8495-bee1670e935f" volumeName="kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz" seLinuxMountContext="" Dec 03 06:45:15 crc kubenswrapper[4475]: I1203 06:45:15.465094 4475 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle" seLinuxMountContext="" Dec 03 06:45:15 crc kubenswrapper[4475]: I1203 06:45:15.465101 4475 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" 
volumeName="kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities" seLinuxMountContext="" Dec 03 06:45:15 crc kubenswrapper[4475]: I1203 06:45:15.465108 4475 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d" volumeName="kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85" seLinuxMountContext="" Dec 03 06:45:15 crc kubenswrapper[4475]: I1203 06:45:15.465115 4475 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error" seLinuxMountContext="" Dec 03 06:45:15 crc kubenswrapper[4475]: I1203 06:45:15.465122 4475 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="4bb40260-dbaa-4fb0-84df-5e680505d512" volumeName="kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config" seLinuxMountContext="" Dec 03 06:45:15 crc kubenswrapper[4475]: I1203 06:45:15.465129 4475 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb" seLinuxMountContext="" Dec 03 06:45:15 crc kubenswrapper[4475]: I1203 06:45:15.465137 4475 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert" seLinuxMountContext="" Dec 03 06:45:15 crc kubenswrapper[4475]: I1203 06:45:15.465144 4475 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="96b93a3a-6083-4aea-8eab-fe1aa8245ad9" 
volumeName="kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls" seLinuxMountContext="" Dec 03 06:45:15 crc kubenswrapper[4475]: I1203 06:45:15.465162 4475 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf" seLinuxMountContext="" Dec 03 06:45:15 crc kubenswrapper[4475]: I1203 06:45:15.465172 4475 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="e7e6199b-1264-4501-8953-767f51328d08" volumeName="kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config" seLinuxMountContext="" Dec 03 06:45:15 crc kubenswrapper[4475]: I1203 06:45:15.465180 4475 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" volumeName="kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates" seLinuxMountContext="" Dec 03 06:45:15 crc kubenswrapper[4475]: I1203 06:45:15.465187 4475 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b574797-001e-440a-8f4e-c0be86edad0f" volumeName="kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls" seLinuxMountContext="" Dec 03 06:45:15 crc kubenswrapper[4475]: I1203 06:45:15.465195 4475 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1386a44e-36a2-460c-96d0-0359d2b6f0f5" volumeName="kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access" seLinuxMountContext="" Dec 03 06:45:15 crc kubenswrapper[4475]: I1203 06:45:15.465202 4475 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config" 
seLinuxMountContext="" Dec 03 06:45:15 crc kubenswrapper[4475]: I1203 06:45:15.465209 4475 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies" seLinuxMountContext="" Dec 03 06:45:15 crc kubenswrapper[4475]: I1203 06:45:15.465216 4475 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs" seLinuxMountContext="" Dec 03 06:45:15 crc kubenswrapper[4475]: I1203 06:45:15.465223 4475 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login" seLinuxMountContext="" Dec 03 06:45:15 crc kubenswrapper[4475]: I1203 06:45:15.465230 4475 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" volumeName="kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4" seLinuxMountContext="" Dec 03 06:45:15 crc kubenswrapper[4475]: I1203 06:45:15.465237 4475 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" volumeName="kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert" seLinuxMountContext="" Dec 03 06:45:15 crc kubenswrapper[4475]: I1203 06:45:15.465245 4475 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-env-overrides" seLinuxMountContext="" Dec 03 06:45:15 crc 
kubenswrapper[4475]: I1203 06:45:15.465252 4475 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca" seLinuxMountContext="" Dec 03 06:45:15 crc kubenswrapper[4475]: I1203 06:45:15.465259 4475 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" volumeName="kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8" seLinuxMountContext="" Dec 03 06:45:15 crc kubenswrapper[4475]: I1203 06:45:15.465266 4475 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data" seLinuxMountContext="" Dec 03 06:45:15 crc kubenswrapper[4475]: I1203 06:45:15.465274 4475 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template" seLinuxMountContext="" Dec 03 06:45:15 crc kubenswrapper[4475]: I1203 06:45:15.465282 4475 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib" seLinuxMountContext="" Dec 03 06:45:15 crc kubenswrapper[4475]: I1203 06:45:15.465290 4475 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="fda69060-fa79-4696-b1a6-7980f124bf7c" volumeName="kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls" seLinuxMountContext="" Dec 03 06:45:15 crc kubenswrapper[4475]: I1203 06:45:15.465297 4475 
reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls" seLinuxMountContext="" Dec 03 06:45:15 crc kubenswrapper[4475]: I1203 06:45:15.465305 4475 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b78653f-4ff9-4508-8672-245ed9b561e3" volumeName="kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access" seLinuxMountContext="" Dec 03 06:45:15 crc kubenswrapper[4475]: I1203 06:45:15.465312 4475 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="20b0d48f-5fd6-431c-a545-e3c800c7b866" volumeName="kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds" seLinuxMountContext="" Dec 03 06:45:15 crc kubenswrapper[4475]: I1203 06:45:15.465321 4475 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="20b0d48f-5fd6-431c-a545-e3c800c7b866" volumeName="kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert" seLinuxMountContext="" Dec 03 06:45:15 crc kubenswrapper[4475]: I1203 06:45:15.465328 4475 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8" seLinuxMountContext="" Dec 03 06:45:15 crc kubenswrapper[4475]: I1203 06:45:15.465335 4475 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe579f8-e8a6-4643-bce5-a661393c4dde" volumeName="kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp" seLinuxMountContext="" Dec 03 06:45:15 crc kubenswrapper[4475]: I1203 06:45:15.465342 4475 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" 
pod="" podName="6731426b-95fe-49ff-bb5f-40441049fde2" volumeName="kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls" seLinuxMountContext="" Dec 03 06:45:15 crc kubenswrapper[4475]: I1203 06:45:15.465349 4475 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca" seLinuxMountContext="" Dec 03 06:45:15 crc kubenswrapper[4475]: I1203 06:45:15.465356 4475 reconstruct.go:97] "Volume reconstruction finished" Dec 03 06:45:15 crc kubenswrapper[4475]: I1203 06:45:15.465362 4475 reconciler.go:26] "Reconciler: start to sync state" Dec 03 06:45:15 crc kubenswrapper[4475]: I1203 06:45:15.480751 4475 manager.go:324] Recovery completed Dec 03 06:45:15 crc kubenswrapper[4475]: I1203 06:45:15.488618 4475 kubelet_network_linux.go:50] "Initialized iptables rules." protocol="IPv4" Dec 03 06:45:15 crc kubenswrapper[4475]: I1203 06:45:15.489964 4475 kubelet_network_linux.go:50] "Initialized iptables rules." 
protocol="IPv6" Dec 03 06:45:15 crc kubenswrapper[4475]: I1203 06:45:15.489994 4475 status_manager.go:217] "Starting to sync pod status with apiserver" Dec 03 06:45:15 crc kubenswrapper[4475]: I1203 06:45:15.490011 4475 kubelet.go:2335] "Starting kubelet main sync loop" Dec 03 06:45:15 crc kubenswrapper[4475]: E1203 06:45:15.490047 4475 kubelet.go:2359] "Skipping pod synchronization" err="[container runtime status check may not have completed yet, PLEG is not healthy: pleg has yet to be successful]" Dec 03 06:45:15 crc kubenswrapper[4475]: W1203 06:45:15.490530 4475 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 192.168.25.177:6443: connect: connection refused Dec 03 06:45:15 crc kubenswrapper[4475]: E1203 06:45:15.490733 4475 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 192.168.25.177:6443: connect: connection refused" logger="UnhandledError" Dec 03 06:45:15 crc kubenswrapper[4475]: I1203 06:45:15.492045 4475 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 03 06:45:15 crc kubenswrapper[4475]: I1203 06:45:15.493019 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:45:15 crc kubenswrapper[4475]: I1203 06:45:15.493046 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:45:15 crc kubenswrapper[4475]: I1203 06:45:15.493055 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:45:15 crc kubenswrapper[4475]: I1203 06:45:15.493736 4475 
cpu_manager.go:225] "Starting CPU manager" policy="none" Dec 03 06:45:15 crc kubenswrapper[4475]: I1203 06:45:15.493751 4475 cpu_manager.go:226] "Reconciling" reconcilePeriod="10s" Dec 03 06:45:15 crc kubenswrapper[4475]: I1203 06:45:15.493782 4475 state_mem.go:36] "Initialized new in-memory state store" Dec 03 06:45:15 crc kubenswrapper[4475]: I1203 06:45:15.498135 4475 policy_none.go:49] "None policy: Start" Dec 03 06:45:15 crc kubenswrapper[4475]: I1203 06:45:15.499255 4475 memory_manager.go:170] "Starting memorymanager" policy="None" Dec 03 06:45:15 crc kubenswrapper[4475]: I1203 06:45:15.499280 4475 state_mem.go:35] "Initializing new in-memory state store" Dec 03 06:45:15 crc kubenswrapper[4475]: I1203 06:45:15.538113 4475 manager.go:334] "Starting Device Plugin manager" Dec 03 06:45:15 crc kubenswrapper[4475]: I1203 06:45:15.538165 4475 manager.go:513] "Failed to read data from checkpoint" checkpoint="kubelet_internal_checkpoint" err="checkpoint is not found" Dec 03 06:45:15 crc kubenswrapper[4475]: I1203 06:45:15.538177 4475 server.go:79] "Starting device plugin registration server" Dec 03 06:45:15 crc kubenswrapper[4475]: I1203 06:45:15.538424 4475 eviction_manager.go:189] "Eviction manager: starting control loop" Dec 03 06:45:15 crc kubenswrapper[4475]: I1203 06:45:15.538437 4475 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s" Dec 03 06:45:15 crc kubenswrapper[4475]: I1203 06:45:15.538561 4475 plugin_watcher.go:51] "Plugin Watcher Start" path="/var/lib/kubelet/plugins_registry" Dec 03 06:45:15 crc kubenswrapper[4475]: I1203 06:45:15.538662 4475 plugin_manager.go:116] "The desired_state_of_world populator (plugin watcher) starts" Dec 03 06:45:15 crc kubenswrapper[4475]: I1203 06:45:15.538672 4475 plugin_manager.go:118] "Starting Kubelet Plugin Manager" Dec 03 06:45:15 crc kubenswrapper[4475]: E1203 06:45:15.543422 4475 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed 
to get node info: node \"crc\" not found" Dec 03 06:45:15 crc kubenswrapper[4475]: I1203 06:45:15.590808 4475 kubelet.go:2421] "SyncLoop ADD" source="file" pods=["openshift-kube-apiserver/kube-apiserver-crc","openshift-kube-controller-manager/kube-controller-manager-crc","openshift-kube-scheduler/openshift-kube-scheduler-crc","openshift-machine-config-operator/kube-rbac-proxy-crio-crc","openshift-etcd/etcd-crc"] Dec 03 06:45:15 crc kubenswrapper[4475]: I1203 06:45:15.590900 4475 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 03 06:45:15 crc kubenswrapper[4475]: I1203 06:45:15.591529 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:45:15 crc kubenswrapper[4475]: I1203 06:45:15.591567 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:45:15 crc kubenswrapper[4475]: I1203 06:45:15.591600 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:45:15 crc kubenswrapper[4475]: I1203 06:45:15.591702 4475 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 03 06:45:15 crc kubenswrapper[4475]: I1203 06:45:15.591909 4475 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 03 06:45:15 crc kubenswrapper[4475]: I1203 06:45:15.591976 4475 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 03 06:45:15 crc kubenswrapper[4475]: I1203 06:45:15.592290 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:45:15 crc kubenswrapper[4475]: I1203 06:45:15.592312 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:45:15 crc kubenswrapper[4475]: I1203 06:45:15.592320 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:45:15 crc kubenswrapper[4475]: I1203 06:45:15.592373 4475 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 03 06:45:15 crc kubenswrapper[4475]: I1203 06:45:15.592472 4475 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 03 06:45:15 crc kubenswrapper[4475]: I1203 06:45:15.592502 4475 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 03 06:45:15 crc kubenswrapper[4475]: I1203 06:45:15.592479 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:45:15 crc kubenswrapper[4475]: I1203 06:45:15.592568 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:45:15 crc kubenswrapper[4475]: I1203 06:45:15.592580 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:45:15 crc kubenswrapper[4475]: I1203 06:45:15.592796 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:45:15 crc kubenswrapper[4475]: I1203 06:45:15.592820 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:45:15 crc kubenswrapper[4475]: I1203 06:45:15.592829 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:45:15 crc kubenswrapper[4475]: I1203 06:45:15.592886 4475 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 03 06:45:15 crc kubenswrapper[4475]: I1203 06:45:15.592966 4475 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Dec 03 06:45:15 crc kubenswrapper[4475]: I1203 06:45:15.592989 4475 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 03 06:45:15 crc kubenswrapper[4475]: I1203 06:45:15.593119 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:45:15 crc kubenswrapper[4475]: I1203 06:45:15.593146 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:45:15 crc kubenswrapper[4475]: I1203 06:45:15.593156 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:45:15 crc kubenswrapper[4475]: I1203 06:45:15.593446 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:45:15 crc kubenswrapper[4475]: I1203 06:45:15.593478 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:45:15 crc kubenswrapper[4475]: I1203 06:45:15.593487 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:45:15 crc kubenswrapper[4475]: I1203 06:45:15.593510 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:45:15 crc kubenswrapper[4475]: I1203 06:45:15.593523 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:45:15 crc kubenswrapper[4475]: I1203 06:45:15.593531 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:45:15 crc kubenswrapper[4475]: I1203 06:45:15.593616 4475 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 03 06:45:15 crc 
kubenswrapper[4475]: I1203 06:45:15.593710 4475 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Dec 03 06:45:15 crc kubenswrapper[4475]: I1203 06:45:15.593731 4475 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 03 06:45:15 crc kubenswrapper[4475]: I1203 06:45:15.594131 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:45:15 crc kubenswrapper[4475]: I1203 06:45:15.594148 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:45:15 crc kubenswrapper[4475]: I1203 06:45:15.594155 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:45:15 crc kubenswrapper[4475]: I1203 06:45:15.594208 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:45:15 crc kubenswrapper[4475]: I1203 06:45:15.594224 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:45:15 crc kubenswrapper[4475]: I1203 06:45:15.594232 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:45:15 crc kubenswrapper[4475]: I1203 06:45:15.594267 4475 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-etcd/etcd-crc" Dec 03 06:45:15 crc kubenswrapper[4475]: I1203 06:45:15.594297 4475 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 03 06:45:15 crc kubenswrapper[4475]: I1203 06:45:15.595314 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:45:15 crc kubenswrapper[4475]: I1203 06:45:15.595337 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:45:15 crc kubenswrapper[4475]: I1203 06:45:15.595346 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:45:15 crc kubenswrapper[4475]: I1203 06:45:15.638859 4475 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 03 06:45:15 crc kubenswrapper[4475]: I1203 06:45:15.639458 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:45:15 crc kubenswrapper[4475]: I1203 06:45:15.639476 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:45:15 crc kubenswrapper[4475]: I1203 06:45:15.639483 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:45:15 crc kubenswrapper[4475]: I1203 06:45:15.639496 4475 kubelet_node_status.go:76] "Attempting to register node" node="crc" Dec 03 06:45:15 crc kubenswrapper[4475]: E1203 06:45:15.639750 4475 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 192.168.25.177:6443: connect: connection refused" node="crc" Dec 03 06:45:15 crc kubenswrapper[4475]: E1203 06:45:15.655100 4475 controller.go:145] "Failed to ensure lease exists, will retry" err="Get 
\"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 192.168.25.177:6443: connect: connection refused" interval="400ms" Dec 03 06:45:15 crc kubenswrapper[4475]: I1203 06:45:15.667406 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-cert-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 03 06:45:15 crc kubenswrapper[4475]: I1203 06:45:15.667433 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-log-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Dec 03 06:45:15 crc kubenswrapper[4475]: I1203 06:45:15.667463 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 03 06:45:15 crc kubenswrapper[4475]: I1203 06:45:15.667477 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 03 06:45:15 crc kubenswrapper[4475]: I1203 06:45:15.667491 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-etc-kube\") pod 
\"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Dec 03 06:45:15 crc kubenswrapper[4475]: I1203 06:45:15.667509 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Dec 03 06:45:15 crc kubenswrapper[4475]: I1203 06:45:15.667533 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-resource-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Dec 03 06:45:15 crc kubenswrapper[4475]: I1203 06:45:15.667547 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-resource-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 03 06:45:15 crc kubenswrapper[4475]: I1203 06:45:15.667589 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-resource-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Dec 03 06:45:15 crc kubenswrapper[4475]: I1203 06:45:15.667640 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-local-bin\" (UniqueName: 
\"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-usr-local-bin\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Dec 03 06:45:15 crc kubenswrapper[4475]: I1203 06:45:15.667657 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 03 06:45:15 crc kubenswrapper[4475]: I1203 06:45:15.667669 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-cert-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Dec 03 06:45:15 crc kubenswrapper[4475]: I1203 06:45:15.667681 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"static-pod-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-static-pod-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Dec 03 06:45:15 crc kubenswrapper[4475]: I1203 06:45:15.667710 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-cert-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Dec 03 06:45:15 crc kubenswrapper[4475]: I1203 06:45:15.667728 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-data-dir\") pod \"etcd-crc\" (UID: 
\"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Dec 03 06:45:15 crc kubenswrapper[4475]: I1203 06:45:15.768690 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 03 06:45:15 crc kubenswrapper[4475]: I1203 06:45:15.768721 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 03 06:45:15 crc kubenswrapper[4475]: I1203 06:45:15.768737 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-cert-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Dec 03 06:45:15 crc kubenswrapper[4475]: I1203 06:45:15.768757 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"static-pod-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-static-pod-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Dec 03 06:45:15 crc kubenswrapper[4475]: I1203 06:45:15.768767 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-cert-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Dec 03 06:45:15 crc kubenswrapper[4475]: I1203 06:45:15.768776 4475 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-cert-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Dec 03 06:45:15 crc kubenswrapper[4475]: I1203 06:45:15.768791 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-data-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Dec 03 06:45:15 crc kubenswrapper[4475]: I1203 06:45:15.768805 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"static-pod-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-static-pod-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Dec 03 06:45:15 crc kubenswrapper[4475]: I1203 06:45:15.768805 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-cert-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 03 06:45:15 crc kubenswrapper[4475]: I1203 06:45:15.768829 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-log-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Dec 03 06:45:15 crc kubenswrapper[4475]: I1203 06:45:15.768834 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-cert-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " 
pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 03 06:45:15 crc kubenswrapper[4475]: I1203 06:45:15.768841 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 03 06:45:15 crc kubenswrapper[4475]: I1203 06:45:15.768853 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 03 06:45:15 crc kubenswrapper[4475]: I1203 06:45:15.768855 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-cert-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Dec 03 06:45:15 crc kubenswrapper[4475]: I1203 06:45:15.768879 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-data-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Dec 03 06:45:15 crc kubenswrapper[4475]: I1203 06:45:15.768870 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 03 06:45:15 crc kubenswrapper[4475]: I1203 06:45:15.768880 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-dir\" (UniqueName: 
\"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-log-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Dec 03 06:45:15 crc kubenswrapper[4475]: I1203 06:45:15.768860 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 03 06:45:15 crc kubenswrapper[4475]: I1203 06:45:15.768958 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-etc-kube\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Dec 03 06:45:15 crc kubenswrapper[4475]: I1203 06:45:15.768981 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Dec 03 06:45:15 crc kubenswrapper[4475]: I1203 06:45:15.768998 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-resource-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Dec 03 06:45:15 crc kubenswrapper[4475]: I1203 06:45:15.769012 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-resource-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " 
pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 03 06:45:15 crc kubenswrapper[4475]: I1203 06:45:15.769015 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-etc-kube\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Dec 03 06:45:15 crc kubenswrapper[4475]: I1203 06:45:15.769040 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-resource-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Dec 03 06:45:15 crc kubenswrapper[4475]: I1203 06:45:15.769042 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-resource-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Dec 03 06:45:15 crc kubenswrapper[4475]: I1203 06:45:15.769027 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-resource-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Dec 03 06:45:15 crc kubenswrapper[4475]: I1203 06:45:15.769070 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"usr-local-bin\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-usr-local-bin\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Dec 03 06:45:15 crc kubenswrapper[4475]: I1203 06:45:15.769061 4475 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Dec 03 06:45:15 crc kubenswrapper[4475]: I1203 06:45:15.769121 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-resource-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 03 06:45:15 crc kubenswrapper[4475]: I1203 06:45:15.769167 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"usr-local-bin\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-usr-local-bin\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Dec 03 06:45:15 crc kubenswrapper[4475]: I1203 06:45:15.840536 4475 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 03 06:45:15 crc kubenswrapper[4475]: I1203 06:45:15.841435 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:45:15 crc kubenswrapper[4475]: I1203 06:45:15.841485 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:45:15 crc kubenswrapper[4475]: I1203 06:45:15.841495 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:45:15 crc kubenswrapper[4475]: I1203 06:45:15.841514 4475 kubelet_node_status.go:76] "Attempting to register node" node="crc" Dec 03 06:45:15 crc kubenswrapper[4475]: E1203 06:45:15.841740 4475 kubelet_node_status.go:99] "Unable to register node with API server" err="Post 
\"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 192.168.25.177:6443: connect: connection refused" node="crc" Dec 03 06:45:15 crc kubenswrapper[4475]: I1203 06:45:15.918613 4475 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 03 06:45:15 crc kubenswrapper[4475]: I1203 06:45:15.934147 4475 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 03 06:45:15 crc kubenswrapper[4475]: W1203 06:45:15.942291 4475 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf4b27818a5e8e43d0dc095d08835c792.slice/crio-e0379896f6d5486f68ed836515ede5242677bda15e39b7191a45f8a2dea97799 WatchSource:0}: Error finding container e0379896f6d5486f68ed836515ede5242677bda15e39b7191a45f8a2dea97799: Status 404 returned error can't find the container with id e0379896f6d5486f68ed836515ede5242677bda15e39b7191a45f8a2dea97799 Dec 03 06:45:15 crc kubenswrapper[4475]: W1203 06:45:15.946672 4475 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf614b9022728cf315e60c057852e563e.slice/crio-05e3119f7ad231b6529c38fa2f5fd05ba51209c37e116b54f90384fa4df1699d WatchSource:0}: Error finding container 05e3119f7ad231b6529c38fa2f5fd05ba51209c37e116b54f90384fa4df1699d: Status 404 returned error can't find the container with id 05e3119f7ad231b6529c38fa2f5fd05ba51209c37e116b54f90384fa4df1699d Dec 03 06:45:15 crc kubenswrapper[4475]: I1203 06:45:15.959898 4475 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Dec 03 06:45:15 crc kubenswrapper[4475]: W1203 06:45:15.967034 4475 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3dcd261975c3d6b9a6ad6367fd4facd3.slice/crio-2b194b4c068f4649821d7ffb4c96b7a3e63c476947777bce1f52e4f742d2b9fc WatchSource:0}: Error finding container 2b194b4c068f4649821d7ffb4c96b7a3e63c476947777bce1f52e4f742d2b9fc: Status 404 returned error can't find the container with id 2b194b4c068f4649821d7ffb4c96b7a3e63c476947777bce1f52e4f742d2b9fc Dec 03 06:45:15 crc kubenswrapper[4475]: I1203 06:45:15.977565 4475 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Dec 03 06:45:15 crc kubenswrapper[4475]: I1203 06:45:15.982086 4475 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-etcd/etcd-crc" Dec 03 06:45:15 crc kubenswrapper[4475]: W1203 06:45:15.983356 4475 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd1b160f5dda77d281dd8e69ec8d817f9.slice/crio-4b7e418d49daaee50e5810f48332ae9ade281f63df43fae3bee22b833c6932db WatchSource:0}: Error finding container 4b7e418d49daaee50e5810f48332ae9ade281f63df43fae3bee22b833c6932db: Status 404 returned error can't find the container with id 4b7e418d49daaee50e5810f48332ae9ade281f63df43fae3bee22b833c6932db Dec 03 06:45:15 crc kubenswrapper[4475]: W1203 06:45:15.991562 4475 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2139d3e2895fc6797b9c76a1b4c9886d.slice/crio-c6a85a067cee53d23476f4f4afa170af0a765f483cb462f9d18d1fbdf020d4a2 WatchSource:0}: Error finding container c6a85a067cee53d23476f4f4afa170af0a765f483cb462f9d18d1fbdf020d4a2: Status 404 returned error can't find the container with id 
c6a85a067cee53d23476f4f4afa170af0a765f483cb462f9d18d1fbdf020d4a2 Dec 03 06:45:16 crc kubenswrapper[4475]: E1203 06:45:16.055989 4475 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 192.168.25.177:6443: connect: connection refused" interval="800ms" Dec 03 06:45:16 crc kubenswrapper[4475]: I1203 06:45:16.241968 4475 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 03 06:45:16 crc kubenswrapper[4475]: I1203 06:45:16.242978 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:45:16 crc kubenswrapper[4475]: I1203 06:45:16.243009 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:45:16 crc kubenswrapper[4475]: I1203 06:45:16.243020 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:45:16 crc kubenswrapper[4475]: I1203 06:45:16.243038 4475 kubelet_node_status.go:76] "Attempting to register node" node="crc" Dec 03 06:45:16 crc kubenswrapper[4475]: E1203 06:45:16.243318 4475 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 192.168.25.177:6443: connect: connection refused" node="crc" Dec 03 06:45:16 crc kubenswrapper[4475]: W1203 06:45:16.426770 4475 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": dial tcp 192.168.25.177:6443: connect: connection refused Dec 03 06:45:16 crc kubenswrapper[4475]: E1203 06:45:16.426831 4475 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch 
*v1.Service: failed to list *v1.Service: Get \"https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 192.168.25.177:6443: connect: connection refused" logger="UnhandledError" Dec 03 06:45:16 crc kubenswrapper[4475]: W1203 06:45:16.428432 4475 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 192.168.25.177:6443: connect: connection refused Dec 03 06:45:16 crc kubenswrapper[4475]: E1203 06:45:16.428921 4475 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 192.168.25.177:6443: connect: connection refused" logger="UnhandledError" Dec 03 06:45:16 crc kubenswrapper[4475]: I1203 06:45:16.448707 4475 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 192.168.25.177:6443: connect: connection refused Dec 03 06:45:16 crc kubenswrapper[4475]: I1203 06:45:16.453748 4475 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-03 03:09:34.015351603 +0000 UTC Dec 03 06:45:16 crc kubenswrapper[4475]: I1203 06:45:16.453793 4475 certificate_manager.go:356] kubernetes.io/kubelet-serving: Waiting 740h24m17.561560576s for next certificate rotation Dec 03 06:45:16 crc kubenswrapper[4475]: I1203 06:45:16.497393 4475 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="5da1155d7b5e933e5db3acc4c1a3fa1b3b90fd79289641f9a3d1290956128628" exitCode=0 Dec 03 06:45:16 crc 
kubenswrapper[4475]: I1203 06:45:16.497490 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerDied","Data":"5da1155d7b5e933e5db3acc4c1a3fa1b3b90fd79289641f9a3d1290956128628"} Dec 03 06:45:16 crc kubenswrapper[4475]: I1203 06:45:16.497583 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"e0379896f6d5486f68ed836515ede5242677bda15e39b7191a45f8a2dea97799"} Dec 03 06:45:16 crc kubenswrapper[4475]: I1203 06:45:16.497679 4475 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 03 06:45:16 crc kubenswrapper[4475]: I1203 06:45:16.499511 4475 generic.go:334] "Generic (PLEG): container finished" podID="2139d3e2895fc6797b9c76a1b4c9886d" containerID="d5e0ad88c2e55994f952b46c2e806792d8fcbd79a901810aef92e46067cc7b92" exitCode=0 Dec 03 06:45:16 crc kubenswrapper[4475]: I1203 06:45:16.500051 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerDied","Data":"d5e0ad88c2e55994f952b46c2e806792d8fcbd79a901810aef92e46067cc7b92"} Dec 03 06:45:16 crc kubenswrapper[4475]: I1203 06:45:16.500077 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:45:16 crc kubenswrapper[4475]: I1203 06:45:16.500099 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:45:16 crc kubenswrapper[4475]: I1203 06:45:16.500081 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"c6a85a067cee53d23476f4f4afa170af0a765f483cb462f9d18d1fbdf020d4a2"} Dec 03 06:45:16 crc 
kubenswrapper[4475]: I1203 06:45:16.500113 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:45:16 crc kubenswrapper[4475]: I1203 06:45:16.501482 4475 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 03 06:45:16 crc kubenswrapper[4475]: I1203 06:45:16.502233 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:45:16 crc kubenswrapper[4475]: I1203 06:45:16.502261 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:45:16 crc kubenswrapper[4475]: I1203 06:45:16.502275 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:45:16 crc kubenswrapper[4475]: I1203 06:45:16.503161 4475 generic.go:334] "Generic (PLEG): container finished" podID="d1b160f5dda77d281dd8e69ec8d817f9" containerID="5b6792eda45350de7bec3afa78ce07a0e2cdfbd590de58c7760b52a6262d31be" exitCode=0 Dec 03 06:45:16 crc kubenswrapper[4475]: I1203 06:45:16.503213 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" event={"ID":"d1b160f5dda77d281dd8e69ec8d817f9","Type":"ContainerDied","Data":"5b6792eda45350de7bec3afa78ce07a0e2cdfbd590de58c7760b52a6262d31be"} Dec 03 06:45:16 crc kubenswrapper[4475]: I1203 06:45:16.503237 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" event={"ID":"d1b160f5dda77d281dd8e69ec8d817f9","Type":"ContainerStarted","Data":"4b7e418d49daaee50e5810f48332ae9ade281f63df43fae3bee22b833c6932db"} Dec 03 06:45:16 crc kubenswrapper[4475]: I1203 06:45:16.503286 4475 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 03 06:45:16 crc kubenswrapper[4475]: I1203 06:45:16.504015 4475 
kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 03 06:45:16 crc kubenswrapper[4475]: I1203 06:45:16.505936 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:45:16 crc kubenswrapper[4475]: I1203 06:45:16.505963 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:45:16 crc kubenswrapper[4475]: I1203 06:45:16.505973 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:45:16 crc kubenswrapper[4475]: I1203 06:45:16.506934 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:45:16 crc kubenswrapper[4475]: I1203 06:45:16.506952 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:45:16 crc kubenswrapper[4475]: I1203 06:45:16.506960 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:45:16 crc kubenswrapper[4475]: I1203 06:45:16.507205 4475 generic.go:334] "Generic (PLEG): container finished" podID="3dcd261975c3d6b9a6ad6367fd4facd3" containerID="b9ea750eb608c854e92aa32dfc7d2085a0c00c3554368c7119487e4a730fdc1c" exitCode=0 Dec 03 06:45:16 crc kubenswrapper[4475]: I1203 06:45:16.507230 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerDied","Data":"b9ea750eb608c854e92aa32dfc7d2085a0c00c3554368c7119487e4a730fdc1c"} Dec 03 06:45:16 crc kubenswrapper[4475]: I1203 06:45:16.507255 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" 
event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"2b194b4c068f4649821d7ffb4c96b7a3e63c476947777bce1f52e4f742d2b9fc"} Dec 03 06:45:16 crc kubenswrapper[4475]: I1203 06:45:16.507311 4475 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 03 06:45:16 crc kubenswrapper[4475]: I1203 06:45:16.508072 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:45:16 crc kubenswrapper[4475]: I1203 06:45:16.508094 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:45:16 crc kubenswrapper[4475]: I1203 06:45:16.508103 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:45:16 crc kubenswrapper[4475]: I1203 06:45:16.508804 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"fe3e0d5fed18fddd7a1174f7a9f12290ce318e9a0de40fe432c79f6f2e24a608"} Dec 03 06:45:16 crc kubenswrapper[4475]: I1203 06:45:16.508828 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"05e3119f7ad231b6529c38fa2f5fd05ba51209c37e116b54f90384fa4df1699d"} Dec 03 06:45:16 crc kubenswrapper[4475]: W1203 06:45:16.765921 4475 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": dial tcp 192.168.25.177:6443: connect: connection refused Dec 03 06:45:16 crc kubenswrapper[4475]: E1203 06:45:16.765978 4475 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch 
*v1.Node: failed to list *v1.Node: Get \"https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0\": dial tcp 192.168.25.177:6443: connect: connection refused" logger="UnhandledError" Dec 03 06:45:16 crc kubenswrapper[4475]: E1203 06:45:16.856473 4475 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 192.168.25.177:6443: connect: connection refused" interval="1.6s" Dec 03 06:45:16 crc kubenswrapper[4475]: W1203 06:45:16.889321 4475 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 192.168.25.177:6443: connect: connection refused Dec 03 06:45:16 crc kubenswrapper[4475]: E1203 06:45:16.889388 4475 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 192.168.25.177:6443: connect: connection refused" logger="UnhandledError" Dec 03 06:45:17 crc kubenswrapper[4475]: I1203 06:45:17.043931 4475 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 03 06:45:17 crc kubenswrapper[4475]: I1203 06:45:17.047543 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:45:17 crc kubenswrapper[4475]: I1203 06:45:17.047571 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:45:17 crc kubenswrapper[4475]: I1203 06:45:17.047579 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:45:17 crc 
kubenswrapper[4475]: I1203 06:45:17.047599 4475 kubelet_node_status.go:76] "Attempting to register node" node="crc" Dec 03 06:45:17 crc kubenswrapper[4475]: E1203 06:45:17.048217 4475 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 192.168.25.177:6443: connect: connection refused" node="crc" Dec 03 06:45:17 crc kubenswrapper[4475]: I1203 06:45:17.448408 4475 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 192.168.25.177:6443: connect: connection refused Dec 03 06:45:17 crc kubenswrapper[4475]: I1203 06:45:17.512509 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"ef987b2e9a0fa630edf6d5c06d5f47c5debd1b75d4626aefe7d8ef44bb974eb8"} Dec 03 06:45:17 crc kubenswrapper[4475]: I1203 06:45:17.512709 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"43c05977da6544bc781a279fcddb3279dfee510fdd0a6f4f1a22b8629f17475f"} Dec 03 06:45:17 crc kubenswrapper[4475]: I1203 06:45:17.512720 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"fa848c68a20d5db5c603cafa808518de84e427cbeea4bbc1be31151e6f839b3e"} Dec 03 06:45:17 crc kubenswrapper[4475]: I1203 06:45:17.512542 4475 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 03 06:45:17 crc kubenswrapper[4475]: I1203 06:45:17.513321 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasSufficientMemory" Dec 03 06:45:17 crc kubenswrapper[4475]: I1203 06:45:17.513347 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:45:17 crc kubenswrapper[4475]: I1203 06:45:17.513357 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:45:17 crc kubenswrapper[4475]: I1203 06:45:17.514913 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"7fc0ee9e5a408a0a9e701afaf1db7bc3f58fd1830044730e9c680664642b5e4e"} Dec 03 06:45:17 crc kubenswrapper[4475]: I1203 06:45:17.514974 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"1dd8bd42f01469966b55416fc8af1dd71d341c774263bb3a56190af4cd9e7daa"} Dec 03 06:45:17 crc kubenswrapper[4475]: I1203 06:45:17.514985 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"d66a9136874b2e25c94cd291aa6d7f4694ac409f16766fd69c8aab8068a441fb"} Dec 03 06:45:17 crc kubenswrapper[4475]: I1203 06:45:17.514993 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"e40c4f29925f494c0f5f01e2ecbcd2e4db2a5f3911a55a874c6d0006f01982de"} Dec 03 06:45:17 crc kubenswrapper[4475]: I1203 06:45:17.515001 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"822cdbfb2e81d80c5de0253daa42f2a5c89e9cd0eb8a5c3cf620780d17f9a6d0"} Dec 03 06:45:17 crc kubenswrapper[4475]: 
I1203 06:45:17.515070 4475 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Dec 03 06:45:17 crc kubenswrapper[4475]: I1203 06:45:17.515694 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 03 06:45:17 crc kubenswrapper[4475]: I1203 06:45:17.515715 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 03 06:45:17 crc kubenswrapper[4475]: I1203 06:45:17.515724 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 03 06:45:17 crc kubenswrapper[4475]: I1203 06:45:17.516193 4475 generic.go:334] "Generic (PLEG): container finished" podID="2139d3e2895fc6797b9c76a1b4c9886d" containerID="22796f78d3d551f1ee271ca8581e196f142e70622944154f7d408a88c098f53b" exitCode=0
Dec 03 06:45:17 crc kubenswrapper[4475]: I1203 06:45:17.516234 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerDied","Data":"22796f78d3d551f1ee271ca8581e196f142e70622944154f7d408a88c098f53b"}
Dec 03 06:45:17 crc kubenswrapper[4475]: I1203 06:45:17.516253 4475 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Dec 03 06:45:17 crc kubenswrapper[4475]: I1203 06:45:17.516762 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 03 06:45:17 crc kubenswrapper[4475]: I1203 06:45:17.516781 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 03 06:45:17 crc kubenswrapper[4475]: I1203 06:45:17.516802 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 03 06:45:17 crc kubenswrapper[4475]: I1203 06:45:17.518552 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" event={"ID":"d1b160f5dda77d281dd8e69ec8d817f9","Type":"ContainerStarted","Data":"072fa6c7aacc3c3456def3a63e3be47a7f7240db68bf852c99d20a3129858fad"}
Dec 03 06:45:17 crc kubenswrapper[4475]: I1203 06:45:17.518630 4475 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Dec 03 06:45:17 crc kubenswrapper[4475]: I1203 06:45:17.519245 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 03 06:45:17 crc kubenswrapper[4475]: I1203 06:45:17.519264 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 03 06:45:17 crc kubenswrapper[4475]: I1203 06:45:17.519272 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 03 06:45:17 crc kubenswrapper[4475]: I1203 06:45:17.520635 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"d17d8916a080f159b25abbfd9575bdc197c58bf256dbeb6367e74368f5b7f1fc"}
Dec 03 06:45:17 crc kubenswrapper[4475]: I1203 06:45:17.520660 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"0f26dedc0507a8675c0dc842b67772e84b5276713808e656bcf620ebb7bd3f5c"}
Dec 03 06:45:17 crc kubenswrapper[4475]: I1203 06:45:17.520670 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"32947be5ce5a85090284dbc3edd8ad437495db9f0b4a7310656e38ecf5c649de"}
Dec 03 06:45:17 crc kubenswrapper[4475]: I1203 06:45:17.520718 4475 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Dec 03 06:45:17 crc kubenswrapper[4475]: I1203 06:45:17.521136 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 03 06:45:17 crc kubenswrapper[4475]: I1203 06:45:17.521156 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 03 06:45:17 crc kubenswrapper[4475]: I1203 06:45:17.521164 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 03 06:45:17 crc kubenswrapper[4475]: I1203 06:45:17.547710 4475 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Dec 03 06:45:18 crc kubenswrapper[4475]: I1203 06:45:18.450946 4475 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc"
Dec 03 06:45:18 crc kubenswrapper[4475]: I1203 06:45:18.523690 4475 generic.go:334] "Generic (PLEG): container finished" podID="2139d3e2895fc6797b9c76a1b4c9886d" containerID="9fb973559072f07252dcf50bda74d422ea2ed50000c02105381f8d21e5ff9888" exitCode=0
Dec 03 06:45:18 crc kubenswrapper[4475]: I1203 06:45:18.523783 4475 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness"
Dec 03 06:45:18 crc kubenswrapper[4475]: I1203 06:45:18.523810 4475 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Dec 03 06:45:18 crc kubenswrapper[4475]: I1203 06:45:18.524180 4475 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Dec 03 06:45:18 crc kubenswrapper[4475]: I1203 06:45:18.524335 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerDied","Data":"9fb973559072f07252dcf50bda74d422ea2ed50000c02105381f8d21e5ff9888"}
Dec 03 06:45:18 crc kubenswrapper[4475]: I1203 06:45:18.524407 4475 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Dec 03 06:45:18 crc kubenswrapper[4475]: I1203 06:45:18.524533 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 03 06:45:18 crc kubenswrapper[4475]: I1203 06:45:18.524559 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 03 06:45:18 crc kubenswrapper[4475]: I1203 06:45:18.524569 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 03 06:45:18 crc kubenswrapper[4475]: I1203 06:45:18.525234 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 03 06:45:18 crc kubenswrapper[4475]: I1203 06:45:18.525262 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 03 06:45:18 crc kubenswrapper[4475]: I1203 06:45:18.525271 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 03 06:45:18 crc kubenswrapper[4475]: I1203 06:45:18.525236 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 03 06:45:18 crc kubenswrapper[4475]: I1203 06:45:18.525294 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 03 06:45:18 crc kubenswrapper[4475]: I1203 06:45:18.525303 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 03 06:45:18 crc kubenswrapper[4475]: I1203 06:45:18.649097 4475 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Dec 03 06:45:18 crc kubenswrapper[4475]: I1203 06:45:18.649723 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 03 06:45:18 crc kubenswrapper[4475]: I1203 06:45:18.649753 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 03 06:45:18 crc kubenswrapper[4475]: I1203 06:45:18.649762 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 03 06:45:18 crc kubenswrapper[4475]: I1203 06:45:18.649794 4475 kubelet_node_status.go:76] "Attempting to register node" node="crc"
Dec 03 06:45:19 crc kubenswrapper[4475]: I1203 06:45:19.529014 4475 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Dec 03 06:45:19 crc kubenswrapper[4475]: I1203 06:45:19.529014 4475 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Dec 03 06:45:19 crc kubenswrapper[4475]: I1203 06:45:19.529012 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"e045b99328661616ea0e44cd50bd394a403836eede05459d117567c191401172"}
Dec 03 06:45:19 crc kubenswrapper[4475]: I1203 06:45:19.529391 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"6bf56315b6ad05ea9af0319db29b919ed0332d2a671c5ba94ea325bd45ef5703"}
Dec 03 06:45:19 crc kubenswrapper[4475]: I1203 06:45:19.529405 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"da747a5ea4f790c71d99693c4bd79a1074f756a20f628fa63e8bad9a713645fc"}
Dec 03 06:45:19 crc kubenswrapper[4475]: I1203 06:45:19.529413 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"fbb015d3e05f9f94fc225cce6e24bc4a5df0bfc5aaea15fe120e2cc4b8f02902"}
Dec 03 06:45:19 crc kubenswrapper[4475]: I1203 06:45:19.529429 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"054b1d2565cc9690152740f71682028595283525344a38ccea66c1f072eae92b"}
Dec 03 06:45:19 crc kubenswrapper[4475]: I1203 06:45:19.529713 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 03 06:45:19 crc kubenswrapper[4475]: I1203 06:45:19.529733 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 03 06:45:19 crc kubenswrapper[4475]: I1203 06:45:19.529741 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 03 06:45:19 crc kubenswrapper[4475]: I1203 06:45:19.530197 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 03 06:45:19 crc kubenswrapper[4475]: I1203 06:45:19.530209 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 03 06:45:19 crc kubenswrapper[4475]: I1203 06:45:19.530215 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 03 06:45:20 crc kubenswrapper[4475]: I1203 06:45:20.530703 4475 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Dec 03 06:45:20 crc kubenswrapper[4475]: I1203 06:45:20.531287 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 03 06:45:20 crc kubenswrapper[4475]: I1203 06:45:20.531321 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 03 06:45:20 crc kubenswrapper[4475]: I1203 06:45:20.531331 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 03 06:45:20 crc kubenswrapper[4475]: I1203 06:45:20.972668 4475 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-etcd/etcd-crc"
Dec 03 06:45:21 crc kubenswrapper[4475]: I1203 06:45:21.532329 4475 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Dec 03 06:45:21 crc kubenswrapper[4475]: I1203 06:45:21.533371 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 03 06:45:21 crc kubenswrapper[4475]: I1203 06:45:21.533404 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 03 06:45:21 crc kubenswrapper[4475]: I1203 06:45:21.533413 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 03 06:45:21 crc kubenswrapper[4475]: I1203 06:45:21.682572 4475 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-apiserver/kube-apiserver-crc"
Dec 03 06:45:21 crc kubenswrapper[4475]: I1203 06:45:21.682663 4475 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness"
Dec 03 06:45:21 crc kubenswrapper[4475]: I1203 06:45:21.682702 4475 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Dec 03 06:45:21 crc kubenswrapper[4475]: I1203 06:45:21.683508 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 03 06:45:21 crc kubenswrapper[4475]: I1203 06:45:21.683533 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 03 06:45:21 crc kubenswrapper[4475]: I1203 06:45:21.683543 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 03 06:45:22 crc kubenswrapper[4475]: I1203 06:45:22.538377 4475 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Dec 03 06:45:22 crc kubenswrapper[4475]: I1203 06:45:22.538543 4475 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Dec 03 06:45:22 crc kubenswrapper[4475]: I1203 06:45:22.539390 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 03 06:45:22 crc kubenswrapper[4475]: I1203 06:45:22.539425 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 03 06:45:22 crc kubenswrapper[4475]: I1203 06:45:22.539434 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 03 06:45:22 crc kubenswrapper[4475]: I1203 06:45:22.602732 4475 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc"
Dec 03 06:45:22 crc kubenswrapper[4475]: I1203 06:45:22.602819 4475 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Dec 03 06:45:22 crc kubenswrapper[4475]: I1203 06:45:22.603543 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 03 06:45:22 crc kubenswrapper[4475]: I1203 06:45:22.603573 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 03 06:45:22 crc kubenswrapper[4475]: I1203 06:45:22.603581 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 03 06:45:23 crc kubenswrapper[4475]: I1203 06:45:23.732477 4475 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc"
Dec 03 06:45:23 crc kubenswrapper[4475]: I1203 06:45:23.732618 4475 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Dec 03 06:45:23 crc kubenswrapper[4475]: I1203 06:45:23.733356 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 03 06:45:23 crc kubenswrapper[4475]: I1203 06:45:23.733386 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 03 06:45:23 crc kubenswrapper[4475]: I1203 06:45:23.733394 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 03 06:45:25 crc kubenswrapper[4475]: I1203 06:45:25.442064 4475 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-etcd/etcd-crc"
Dec 03 06:45:25 crc kubenswrapper[4475]: I1203 06:45:25.442202 4475 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Dec 03 06:45:25 crc kubenswrapper[4475]: I1203 06:45:25.443010 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 03 06:45:25 crc kubenswrapper[4475]: I1203 06:45:25.443036 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 03 06:45:25 crc kubenswrapper[4475]: I1203 06:45:25.443044 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 03 06:45:25 crc kubenswrapper[4475]: I1203 06:45:25.538395 4475 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/cluster-policy-controller namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body=
Dec 03 06:45:25 crc kubenswrapper[4475]: I1203 06:45:25.538433 4475 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="cluster-policy-controller" probeResult="failure" output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)"
Dec 03 06:45:25 crc kubenswrapper[4475]: E1203 06:45:25.543555 4475 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found"
Dec 03 06:45:25 crc kubenswrapper[4475]: I1203 06:45:25.946437 4475 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Dec 03 06:45:25 crc kubenswrapper[4475]: I1203 06:45:25.946569 4475 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Dec 03 06:45:25 crc kubenswrapper[4475]: I1203 06:45:25.947282 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 03 06:45:25 crc kubenswrapper[4475]: I1203 06:45:25.947309 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 03 06:45:25 crc kubenswrapper[4475]: I1203 06:45:25.947318 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 03 06:45:26 crc kubenswrapper[4475]: I1203 06:45:26.547288 4475 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Dec 03 06:45:26 crc kubenswrapper[4475]: I1203 06:45:26.547867 4475 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Dec 03 06:45:26 crc kubenswrapper[4475]: I1203 06:45:26.548727 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 03 06:45:26 crc kubenswrapper[4475]: I1203 06:45:26.548761 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 03 06:45:26 crc kubenswrapper[4475]: I1203 06:45:26.548771 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 03 06:45:26 crc kubenswrapper[4475]: I1203 06:45:26.551238 4475 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Dec 03 06:45:27 crc kubenswrapper[4475]: I1203 06:45:27.368093 4475 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver namespace/openshift-kube-apiserver: Startup probe status=failure output="HTTP probe failed with statuscode: 403" start-of-body={"kind":"Status","apiVersion":"v1","metadata":{},"status":"Failure","message":"forbidden: User \"system:anonymous\" cannot get path \"/livez\"","reason":"Forbidden","details":{},"code":403}
Dec 03 06:45:27 crc kubenswrapper[4475]: I1203 06:45:27.368323 4475 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" probeResult="failure" output="HTTP probe failed with statuscode: 403"
Dec 03 06:45:27 crc kubenswrapper[4475]: I1203 06:45:27.372096 4475 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver namespace/openshift-kube-apiserver: Startup probe status=failure output="HTTP probe failed with statuscode: 403" start-of-body={"kind":"Status","apiVersion":"v1","metadata":{},"status":"Failure","message":"forbidden: User \"system:anonymous\" cannot get path \"/livez\"","reason":"Forbidden","details":{},"code":403}
Dec 03 06:45:27 crc kubenswrapper[4475]: I1203 06:45:27.372126 4475 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" probeResult="failure" output="HTTP probe failed with statuscode: 403"
Dec 03 06:45:27 crc kubenswrapper[4475]: I1203 06:45:27.542885 4475 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Dec 03 06:45:27 crc kubenswrapper[4475]: I1203 06:45:27.543692 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 03 06:45:27 crc kubenswrapper[4475]: I1203 06:45:27.543721 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 03 06:45:27 crc kubenswrapper[4475]: I1203 06:45:27.543731 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 03 06:45:27 crc kubenswrapper[4475]: I1203 06:45:27.547265 4475 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Dec 03 06:45:28 crc kubenswrapper[4475]: I1203 06:45:28.544177 4475 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Dec 03 06:45:28 crc kubenswrapper[4475]: I1203 06:45:28.544794 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 03 06:45:28 crc kubenswrapper[4475]: I1203 06:45:28.544834 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 03 06:45:28 crc kubenswrapper[4475]: I1203 06:45:28.544845 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 03 06:45:31 crc kubenswrapper[4475]: I1203 06:45:31.686754 4475 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-apiserver/kube-apiserver-crc"
Dec 03 06:45:31 crc kubenswrapper[4475]: I1203 06:45:31.686873 4475 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Dec 03 06:45:31 crc kubenswrapper[4475]: I1203 06:45:31.687229 4475 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver-check-endpoints namespace/openshift-kube-apiserver: Readiness probe status=failure output="Get \"https://192.168.126.11:17697/healthz\": dial tcp 192.168.126.11:17697: connect: connection refused" start-of-body=
Dec 03 06:45:31 crc kubenswrapper[4475]: I1203 06:45:31.687278 4475 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" probeResult="failure" output="Get \"https://192.168.126.11:17697/healthz\": dial tcp 192.168.126.11:17697: connect: connection refused"
Dec 03 06:45:31 crc kubenswrapper[4475]: I1203 06:45:31.687615 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 03 06:45:31 crc kubenswrapper[4475]: I1203 06:45:31.687639 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 03 06:45:31 crc kubenswrapper[4475]: I1203 06:45:31.687648 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 03 06:45:31 crc kubenswrapper[4475]: I1203 06:45:31.689916 4475 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc"
Dec 03 06:45:32 crc kubenswrapper[4475]: E1203 06:45:32.356121 4475 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": context deadline exceeded" interval="3.2s"
Dec 03 06:45:32 crc kubenswrapper[4475]: I1203 06:45:32.357314 4475 trace.go:236] Trace[1393465951]: "Reflector ListAndWatch" name:k8s.io/client-go/informers/factory.go:160 (03-Dec-2025 06:45:19.553) (total time: 12803ms):
Dec 03 06:45:32 crc kubenswrapper[4475]: Trace[1393465951]: ---"Objects listed" error: 12803ms (06:45:32.357)
Dec 03 06:45:32 crc kubenswrapper[4475]: Trace[1393465951]: [12.803411402s] [12.803411402s] END
Dec 03 06:45:32 crc kubenswrapper[4475]: I1203 06:45:32.357340 4475 reflector.go:368] Caches populated for *v1.RuntimeClass from k8s.io/client-go/informers/factory.go:160
Dec 03 06:45:32 crc kubenswrapper[4475]: I1203 06:45:32.358642 4475 trace.go:236] Trace[2097618418]: "Reflector ListAndWatch" name:k8s.io/client-go/informers/factory.go:160 (03-Dec-2025 06:45:18.544) (total time: 13814ms):
Dec 03 06:45:32 crc kubenswrapper[4475]: Trace[2097618418]: ---"Objects listed" error: 13814ms (06:45:32.358)
Dec 03 06:45:32 crc kubenswrapper[4475]: Trace[2097618418]: [13.814054104s] [13.814054104s] END
Dec 03 06:45:32 crc kubenswrapper[4475]: I1203 06:45:32.358660 4475 reflector.go:368] Caches populated for *v1.Node from k8s.io/client-go/informers/factory.go:160
Dec 03 06:45:32 crc kubenswrapper[4475]: I1203 06:45:32.358700 4475 trace.go:236] Trace[223948837]: "Reflector ListAndWatch" name:k8s.io/client-go/informers/factory.go:160 (03-Dec-2025 06:45:19.234) (total time: 13123ms):
Dec 03 06:45:32 crc kubenswrapper[4475]: Trace[223948837]: ---"Objects listed" error: 13123ms (06:45:32.358)
Dec 03 06:45:32 crc kubenswrapper[4475]: Trace[223948837]: [13.123724526s] [13.123724526s] END
Dec 03 06:45:32 crc kubenswrapper[4475]: I1203 06:45:32.358714 4475 reflector.go:368] Caches populated for *v1.CSIDriver from k8s.io/client-go/informers/factory.go:160
Dec 03 06:45:32 crc kubenswrapper[4475]: I1203 06:45:32.359077 4475 reconstruct.go:205] "DevicePaths of reconstructed volumes updated"
Dec 03 06:45:32 crc kubenswrapper[4475]: I1203 06:45:32.359133 4475 trace.go:236] Trace[162538887]: "Reflector ListAndWatch" name:k8s.io/client-go/informers/factory.go:160 (03-Dec-2025 06:45:18.078) (total time: 14281ms):
Dec 03 06:45:32 crc kubenswrapper[4475]: Trace[162538887]: ---"Objects listed" error: 14280ms (06:45:32.359)
Dec 03 06:45:32 crc kubenswrapper[4475]: Trace[162538887]: [14.281032867s] [14.281032867s] END
Dec 03 06:45:32 crc kubenswrapper[4475]: I1203 06:45:32.359145 4475 reflector.go:368] Caches populated for *v1.Service from k8s.io/client-go/informers/factory.go:160
Dec 03 06:45:32 crc kubenswrapper[4475]: E1203 06:45:32.360293 4475 kubelet_node_status.go:99] "Unable to register node with API server" err="nodes \"crc\" is forbidden: autoscaling.openshift.io/ManagedNode infra config cache not synchronized" node="crc"
Dec 03 06:45:32 crc kubenswrapper[4475]: I1203 06:45:32.451678 4475 apiserver.go:52] "Watching apiserver"
Dec 03 06:45:32 crc kubenswrapper[4475]: I1203 06:45:32.453635 4475 reflector.go:368] Caches populated for *v1.Pod from pkg/kubelet/config/apiserver.go:66
Dec 03 06:45:32 crc kubenswrapper[4475]: I1203 06:45:32.453772 4475 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-network-console/networking-console-plugin-85b44fc459-gdk6g","openshift-network-diagnostics/network-check-source-55646444c4-trplf","openshift-network-diagnostics/network-check-target-xd92c","openshift-network-node-identity/network-node-identity-vrzqb","openshift-network-operator/iptables-alerter-4ln5h","openshift-network-operator/network-operator-58b4c7f79c-55gtf"]
Dec 03 06:45:32 crc kubenswrapper[4475]: I1203 06:45:32.454149 4475 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf"
Dec 03 06:45:32 crc kubenswrapper[4475]: I1203 06:45:32.454159 4475 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Dec 03 06:45:32 crc kubenswrapper[4475]: I1203 06:45:32.454261 4475 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-node-identity/network-node-identity-vrzqb"
Dec 03 06:45:32 crc kubenswrapper[4475]: I1203 06:45:32.454278 4475 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Dec 03 06:45:32 crc kubenswrapper[4475]: E1203 06:45:32.454389 4475 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Dec 03 06:45:32 crc kubenswrapper[4475]: E1203 06:45:32.454672 4475 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Dec 03 06:45:32 crc kubenswrapper[4475]: I1203 06:45:32.454828 4475 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-4ln5h"
Dec 03 06:45:32 crc kubenswrapper[4475]: I1203 06:45:32.454836 4475 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Dec 03 06:45:32 crc kubenswrapper[4475]: E1203 06:45:32.454985 4475 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Dec 03 06:45:32 crc kubenswrapper[4475]: I1203 06:45:32.455700 4475 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-operator"/"metrics-tls"
Dec 03 06:45:32 crc kubenswrapper[4475]: I1203 06:45:32.455918 4475 desired_state_of_world_populator.go:154] "Finished populating initial desired state of world"
Dec 03 06:45:32 crc kubenswrapper[4475]: I1203 06:45:32.456009 4475 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"ovnkube-identity-cm"
Dec 03 06:45:32 crc kubenswrapper[4475]: I1203 06:45:32.456163 4475 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"iptables-alerter-script"
Dec 03 06:45:32 crc kubenswrapper[4475]: I1203 06:45:32.456330 4475 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"openshift-service-ca.crt"
Dec 03 06:45:32 crc kubenswrapper[4475]: I1203 06:45:32.456427 4475 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"kube-root-ca.crt"
Dec 03 06:45:32 crc kubenswrapper[4475]: I1203 06:45:32.456633 4475 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"openshift-service-ca.crt"
Dec 03 06:45:32 crc kubenswrapper[4475]: I1203 06:45:32.456790 4475 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"env-overrides"
Dec 03 06:45:32 crc kubenswrapper[4475]: I1203 06:45:32.457101 4475 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"kube-root-ca.crt"
Dec 03 06:45:32 crc kubenswrapper[4475]: I1203 06:45:32.457148 4475 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-node-identity"/"network-node-identity-cert"
Dec 03 06:45:32 crc kubenswrapper[4475]: I1203 06:45:32.459405 4475 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config\") pod \"4bb40260-dbaa-4fb0-84df-5e680505d512\" (UID: \"4bb40260-dbaa-4fb0-84df-5e680505d512\") "
Dec 03 06:45:32 crc kubenswrapper[4475]: I1203 06:45:32.459430 4475 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6g6sz\" (UniqueName: \"kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") "
Dec 03 06:45:32 crc kubenswrapper[4475]: I1203 06:45:32.459470 4475 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities\") pod \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\" (UID: \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\") "
Dec 03 06:45:32 crc kubenswrapper[4475]: I1203 06:45:32.459487 4475 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") "
Dec 03 06:45:32 crc kubenswrapper[4475]: I1203 06:45:32.459501 4475 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert\") pod \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\" (UID: \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\") "
Dec 03 06:45:32 crc kubenswrapper[4475]: I1203 06:45:32.459516 4475 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tk88c\" (UniqueName: \"kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c\") pod \"7539238d-5fe0-46ed-884e-1c3b566537ec\" (UID: \"7539238d-5fe0-46ed-884e-1c3b566537ec\") "
Dec 03 06:45:32 crc kubenswrapper[4475]: I1203 06:45:32.459531 4475 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7c4vf\" (UniqueName: \"kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") "
Dec 03 06:45:32 crc kubenswrapper[4475]: I1203 06:45:32.459544 4475 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bf2bz\" (UniqueName: \"kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz\") pod \"1d611f23-29be-4491-8495-bee1670e935f\" (UID: \"1d611f23-29be-4491-8495-bee1670e935f\") "
Dec 03 06:45:32 crc kubenswrapper[4475]: I1203 06:45:32.459559 4475 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") "
Dec 03 06:45:32 crc kubenswrapper[4475]: I1203 06:45:32.459574 4475 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zgdk5\" (UniqueName: \"kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") "
Dec 03 06:45:32 crc kubenswrapper[4475]: I1203 06:45:32.459587 4475 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") "
Dec 03 06:45:32 crc kubenswrapper[4475]: I1203 06:45:32.459616 4475 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content\") pod \"5225d0e4-402f-4861-b410-819f433b1803\" (UID: \"5225d0e4-402f-4861-b410-819f433b1803\") "
Dec 03 06:45:32 crc kubenswrapper[4475]: I1203 06:45:32.459630 4475 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4d4hj\" (UniqueName: \"kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj\") pod \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\" (UID: \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\") "
Dec 03 06:45:32 crc kubenswrapper[4475]: I1203 06:45:32.459644 4475 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") "
Dec 03 06:45:32 crc kubenswrapper[4475]: I1203 06:45:32.459659 4475 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") "
Dec 03 06:45:32 crc kubenswrapper[4475]: I1203 06:45:32.459677 4475 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Dec 03 06:45:32 crc kubenswrapper[4475]: I1203 06:45:32.459688 4475 operation_generator.go:803] UnmountVolume.TearDown succeeded for
volume "kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz" (OuterVolumeSpecName: "kube-api-access-6g6sz") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "kube-api-access-6g6sz". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 06:45:32 crc kubenswrapper[4475]: I1203 06:45:32.459702 4475 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist\") pod \"7bb08738-c794-4ee8-9972-3a62ca171029\" (UID: \"7bb08738-c794-4ee8-9972-3a62ca171029\") " Dec 03 06:45:32 crc kubenswrapper[4475]: I1203 06:45:32.459754 4475 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert\") pod \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\" (UID: \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\") " Dec 03 06:45:32 crc kubenswrapper[4475]: I1203 06:45:32.459777 4475 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert\") pod \"01ab3dd5-8196-46d0-ad33-122e2ca51def\" (UID: \"01ab3dd5-8196-46d0-ad33-122e2ca51def\") " Dec 03 06:45:32 crc kubenswrapper[4475]: I1203 06:45:32.459791 4475 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Dec 03 06:45:32 crc kubenswrapper[4475]: I1203 06:45:32.459804 4475 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies\") pod 
\"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Dec 03 06:45:32 crc kubenswrapper[4475]: I1203 06:45:32.459825 4475 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config" (OuterVolumeSpecName: "multus-daemon-config") pod "4bb40260-dbaa-4fb0-84df-5e680505d512" (UID: "4bb40260-dbaa-4fb0-84df-5e680505d512"). InnerVolumeSpecName "multus-daemon-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 06:45:32 crc kubenswrapper[4475]: I1203 06:45:32.459848 4475 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Dec 03 06:45:32 crc kubenswrapper[4475]: I1203 06:45:32.459864 4475 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x4zgh\" (UniqueName: \"kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh\") pod \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\" (UID: \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\") " Dec 03 06:45:32 crc kubenswrapper[4475]: I1203 06:45:32.459879 4475 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config\") pod \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\" (UID: \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\") " Dec 03 06:45:32 crc kubenswrapper[4475]: I1203 06:45:32.459875 4475 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities" (OuterVolumeSpecName: "utilities") pod "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" (UID: "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 06:45:32 crc kubenswrapper[4475]: I1203 06:45:32.459894 4475 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lz9wn\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Dec 03 06:45:32 crc kubenswrapper[4475]: I1203 06:45:32.459910 4475 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pcxfs\" (UniqueName: \"kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Dec 03 06:45:32 crc kubenswrapper[4475]: I1203 06:45:32.459924 4475 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Dec 03 06:45:32 crc kubenswrapper[4475]: I1203 06:45:32.459938 4475 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 03 06:45:32 crc kubenswrapper[4475]: I1203 06:45:32.459952 4475 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w4xd4\" (UniqueName: \"kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4\") pod \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\" (UID: \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\") " Dec 03 06:45:32 crc kubenswrapper[4475]: I1203 06:45:32.459970 4475 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Dec 03 06:45:32 crc kubenswrapper[4475]: I1203 06:45:32.459983 4475 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert\") pod \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\" (UID: \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\") " Dec 03 06:45:32 crc kubenswrapper[4475]: I1203 06:45:32.459986 4475 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5" (OuterVolumeSpecName: "kube-api-access-zgdk5") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "kube-api-access-zgdk5". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 06:45:32 crc kubenswrapper[4475]: I1203 06:45:32.459997 4475 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Dec 03 06:45:32 crc kubenswrapper[4475]: I1203 06:45:32.460012 4475 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Dec 03 06:45:32 crc kubenswrapper[4475]: I1203 06:45:32.460025 4475 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 03 06:45:32 crc kubenswrapper[4475]: I1203 06:45:32.460038 4475 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Dec 03 06:45:32 crc kubenswrapper[4475]: I1203 06:45:32.460051 4475 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Dec 03 06:45:32 crc kubenswrapper[4475]: I1203 06:45:32.460065 4475 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"webhook-certs\" (UniqueName: 
\"kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs\") pod \"efdd0498-1daa-4136-9a4a-3b948c2293fc\" (UID: \"efdd0498-1daa-4136-9a4a-3b948c2293fc\") " Dec 03 06:45:32 crc kubenswrapper[4475]: I1203 06:45:32.460080 4475 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8tdtz\" (UniqueName: \"kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Dec 03 06:45:32 crc kubenswrapper[4475]: I1203 06:45:32.460105 4475 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert\") pod \"0b78653f-4ff9-4508-8672-245ed9b561e3\" (UID: \"0b78653f-4ff9-4508-8672-245ed9b561e3\") " Dec 03 06:45:32 crc kubenswrapper[4475]: I1203 06:45:32.460120 4475 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Dec 03 06:45:32 crc kubenswrapper[4475]: I1203 06:45:32.460133 4475 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Dec 03 06:45:32 crc kubenswrapper[4475]: I1203 06:45:32.460147 4475 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mg5zb\" (UniqueName: \"kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Dec 03 06:45:32 crc 
kubenswrapper[4475]: I1203 06:45:32.460153 4475 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist" (OuterVolumeSpecName: "cni-sysctl-allowlist") pod "7bb08738-c794-4ee8-9972-3a62ca171029" (UID: "7bb08738-c794-4ee8-9972-3a62ca171029"). InnerVolumeSpecName "cni-sysctl-allowlist". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 06:45:32 crc kubenswrapper[4475]: I1203 06:45:32.460160 4475 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2w9zh\" (UniqueName: \"kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh\") pod \"4bb40260-dbaa-4fb0-84df-5e680505d512\" (UID: \"4bb40260-dbaa-4fb0-84df-5e680505d512\") " Dec 03 06:45:32 crc kubenswrapper[4475]: I1203 06:45:32.460191 4475 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config\") pod \"0b574797-001e-440a-8f4e-c0be86edad0f\" (UID: \"0b574797-001e-440a-8f4e-c0be86edad0f\") " Dec 03 06:45:32 crc kubenswrapper[4475]: I1203 06:45:32.460207 4475 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls\") pod \"6731426b-95fe-49ff-bb5f-40441049fde2\" (UID: \"6731426b-95fe-49ff-bb5f-40441049fde2\") " Dec 03 06:45:32 crc kubenswrapper[4475]: I1203 06:45:32.460221 4475 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Dec 03 06:45:32 crc kubenswrapper[4475]: I1203 
06:45:32.460253 4475 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config\") pod \"01ab3dd5-8196-46d0-ad33-122e2ca51def\" (UID: \"01ab3dd5-8196-46d0-ad33-122e2ca51def\") " Dec 03 06:45:32 crc kubenswrapper[4475]: I1203 06:45:32.460262 4475 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies" (OuterVolumeSpecName: "audit-policies") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "audit-policies". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 06:45:32 crc kubenswrapper[4475]: I1203 06:45:32.460268 4475 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Dec 03 06:45:32 crc kubenswrapper[4475]: I1203 06:45:32.460297 4475 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cfbct\" (UniqueName: \"kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct\") pod \"57a731c4-ef35-47a8-b875-bfb08a7f8011\" (UID: \"57a731c4-ef35-47a8-b875-bfb08a7f8011\") " Dec 03 06:45:32 crc kubenswrapper[4475]: I1203 06:45:32.460304 4475 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh" (OuterVolumeSpecName: "kube-api-access-2w9zh") pod "4bb40260-dbaa-4fb0-84df-5e680505d512" (UID: "4bb40260-dbaa-4fb0-84df-5e680505d512"). InnerVolumeSpecName "kube-api-access-2w9zh". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 06:45:32 crc kubenswrapper[4475]: I1203 06:45:32.460314 4475 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Dec 03 06:45:32 crc kubenswrapper[4475]: I1203 06:45:32.460330 4475 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kfwg7\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 03 06:45:32 crc kubenswrapper[4475]: I1203 06:45:32.460345 4475 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Dec 03 06:45:32 crc kubenswrapper[4475]: I1203 06:45:32.460362 4475 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Dec 03 06:45:32 crc kubenswrapper[4475]: I1203 06:45:32.460376 4475 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Dec 03 06:45:32 crc kubenswrapper[4475]: I1203 06:45:32.460399 4475 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca\") pod \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\" (UID: \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\") " Dec 03 06:45:32 crc kubenswrapper[4475]: I1203 06:45:32.460413 4475 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Dec 03 06:45:32 crc kubenswrapper[4475]: I1203 06:45:32.460428 4475 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x2m85\" (UniqueName: \"kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85\") pod \"cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d\" (UID: \"cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d\") " Dec 03 06:45:32 crc kubenswrapper[4475]: I1203 06:45:32.460442 4475 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mnrrd\" (UniqueName: \"kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd\") pod \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\" (UID: \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\") " Dec 03 06:45:32 crc kubenswrapper[4475]: I1203 06:45:32.460476 4475 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access\") pod \"496e6271-fb68-4057-954e-a0d97a4afa3f\" (UID: \"496e6271-fb68-4057-954e-a0d97a4afa3f\") " Dec 03 06:45:32 crc kubenswrapper[4475]: I1203 06:45:32.460492 4475 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert\") pod \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\" 
(UID: \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\") " Dec 03 06:45:32 crc kubenswrapper[4475]: I1203 06:45:32.460505 4475 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zkvpv\" (UniqueName: \"kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Dec 03 06:45:32 crc kubenswrapper[4475]: I1203 06:45:32.460520 4475 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Dec 03 06:45:32 crc kubenswrapper[4475]: I1203 06:45:32.460535 4475 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert\") pod \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\" (UID: \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\") " Dec 03 06:45:32 crc kubenswrapper[4475]: I1203 06:45:32.460549 4475 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca\") pod \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\" (UID: \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\") " Dec 03 06:45:32 crc kubenswrapper[4475]: I1203 06:45:32.460563 4475 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert\") pod \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\" (UID: \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\") " Dec 03 06:45:32 crc kubenswrapper[4475]: I1203 06:45:32.460576 4475 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ngvvp\" (UniqueName: \"kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Dec 03 06:45:32 crc kubenswrapper[4475]: I1203 06:45:32.460590 4475 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 03 06:45:32 crc kubenswrapper[4475]: I1203 06:45:32.460602 4475 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Dec 03 06:45:32 crc kubenswrapper[4475]: I1203 06:45:32.460619 4475 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Dec 03 06:45:32 crc kubenswrapper[4475]: I1203 06:45:32.460635 4475 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Dec 03 06:45:32 crc kubenswrapper[4475]: I1203 06:45:32.460650 4475 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config\") pod 
\"e7e6199b-1264-4501-8953-767f51328d08\" (UID: \"e7e6199b-1264-4501-8953-767f51328d08\") " Dec 03 06:45:32 crc kubenswrapper[4475]: I1203 06:45:32.460666 4475 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Dec 03 06:45:32 crc kubenswrapper[4475]: I1203 06:45:32.460679 4475 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert\") pod \"496e6271-fb68-4057-954e-a0d97a4afa3f\" (UID: \"496e6271-fb68-4057-954e-a0d97a4afa3f\") " Dec 03 06:45:32 crc kubenswrapper[4475]: I1203 06:45:32.460693 4475 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Dec 03 06:45:32 crc kubenswrapper[4475]: I1203 06:45:32.460762 4475 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 03 06:45:32 crc kubenswrapper[4475]: I1203 06:45:32.460781 4475 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls\") pod \"fda69060-fa79-4696-b1a6-7980f124bf7c\" (UID: \"fda69060-fa79-4696-b1a6-7980f124bf7c\") " Dec 03 06:45:32 crc kubenswrapper[4475]: I1203 06:45:32.460796 4475 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Dec 03 06:45:32 crc kubenswrapper[4475]: I1203 06:45:32.460811 4475 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Dec 03 06:45:32 crc kubenswrapper[4475]: I1203 06:45:32.460827 4475 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nzwt7\" (UniqueName: \"kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7\") pod \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\" (UID: \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\") " Dec 03 06:45:32 crc kubenswrapper[4475]: I1203 06:45:32.460841 4475 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert\") pod \"e7e6199b-1264-4501-8953-767f51328d08\" (UID: \"e7e6199b-1264-4501-8953-767f51328d08\") " Dec 03 06:45:32 crc kubenswrapper[4475]: I1203 06:45:32.460854 4475 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sb6h7\" (UniqueName: \"kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Dec 03 06:45:32 crc kubenswrapper[4475]: I1203 06:45:32.460874 4475 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d6qdx\" (UniqueName: \"kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx\") pod 
\"87cf06ed-a83f-41a7-828d-70653580a8cb\" (UID: \"87cf06ed-a83f-41a7-828d-70653580a8cb\") " Dec 03 06:45:32 crc kubenswrapper[4475]: I1203 06:45:32.460888 4475 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Dec 03 06:45:32 crc kubenswrapper[4475]: I1203 06:45:32.460901 4475 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Dec 03 06:45:32 crc kubenswrapper[4475]: I1203 06:45:32.460938 4475 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert\") pod \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\" (UID: \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\") " Dec 03 06:45:32 crc kubenswrapper[4475]: I1203 06:45:32.460954 4475 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Dec 03 06:45:32 crc kubenswrapper[4475]: I1203 06:45:32.460969 4475 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs\") pod \"5fe579f8-e8a6-4643-bce5-a661393c4dde\" (UID: \"5fe579f8-e8a6-4643-bce5-a661393c4dde\") " Dec 03 06:45:32 crc kubenswrapper[4475]: I1203 06:45:32.460983 4475 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"kube-api-access-pj782\" (UniqueName: \"kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782\") pod \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\" (UID: \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\") " Dec 03 06:45:32 crc kubenswrapper[4475]: I1203 06:45:32.460996 4475 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Dec 03 06:45:32 crc kubenswrapper[4475]: I1203 06:45:32.461023 4475 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config\") pod \"496e6271-fb68-4057-954e-a0d97a4afa3f\" (UID: \"496e6271-fb68-4057-954e-a0d97a4afa3f\") " Dec 03 06:45:32 crc kubenswrapper[4475]: I1203 06:45:32.461037 4475 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-s4n52\" (UniqueName: \"kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Dec 03 06:45:32 crc kubenswrapper[4475]: I1203 06:45:32.461053 4475 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content\") pod \"57a731c4-ef35-47a8-b875-bfb08a7f8011\" (UID: \"57a731c4-ef35-47a8-b875-bfb08a7f8011\") " Dec 03 06:45:32 crc kubenswrapper[4475]: I1203 06:45:32.461071 4475 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9xfj7\" (UniqueName: \"kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7\") pod \"5225d0e4-402f-4861-b410-819f433b1803\" (UID: 
\"5225d0e4-402f-4861-b410-819f433b1803\") " Dec 03 06:45:32 crc kubenswrapper[4475]: I1203 06:45:32.461096 4475 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config\") pod \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\" (UID: \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\") " Dec 03 06:45:32 crc kubenswrapper[4475]: I1203 06:45:32.461111 4475 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Dec 03 06:45:32 crc kubenswrapper[4475]: I1203 06:45:32.461125 4475 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume\") pod \"87cf06ed-a83f-41a7-828d-70653580a8cb\" (UID: \"87cf06ed-a83f-41a7-828d-70653580a8cb\") " Dec 03 06:45:32 crc kubenswrapper[4475]: I1203 06:45:32.461139 4475 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 03 06:45:32 crc kubenswrapper[4475]: I1203 06:45:32.461153 4475 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities\") pod \"1d611f23-29be-4491-8495-bee1670e935f\" (UID: \"1d611f23-29be-4491-8495-bee1670e935f\") " Dec 03 06:45:32 crc kubenswrapper[4475]: I1203 06:45:32.461170 4475 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" 
(UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Dec 03 06:45:32 crc kubenswrapper[4475]: I1203 06:45:32.461184 4475 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Dec 03 06:45:32 crc kubenswrapper[4475]: I1203 06:45:32.461199 4475 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fcqwp\" (UniqueName: \"kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp\") pod \"5fe579f8-e8a6-4643-bce5-a661393c4dde\" (UID: \"5fe579f8-e8a6-4643-bce5-a661393c4dde\") " Dec 03 06:45:32 crc kubenswrapper[4475]: I1203 06:45:32.461213 4475 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content\") pod \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\" (UID: \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\") " Dec 03 06:45:32 crc kubenswrapper[4475]: I1203 06:45:32.461226 4475 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Dec 03 06:45:32 crc kubenswrapper[4475]: I1203 06:45:32.461240 4475 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Dec 03 06:45:32 crc 
kubenswrapper[4475]: I1203 06:45:32.461254 4475 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config\") pod \"fda69060-fa79-4696-b1a6-7980f124bf7c\" (UID: \"fda69060-fa79-4696-b1a6-7980f124bf7c\") " Dec 03 06:45:32 crc kubenswrapper[4475]: I1203 06:45:32.461290 4475 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Dec 03 06:45:32 crc kubenswrapper[4475]: I1203 06:45:32.461304 4475 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Dec 03 06:45:32 crc kubenswrapper[4475]: I1203 06:45:32.461331 4475 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Dec 03 06:45:32 crc kubenswrapper[4475]: I1203 06:45:32.461345 4475 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 03 06:45:32 crc kubenswrapper[4475]: I1203 06:45:32.461360 4475 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate\") pod 
\"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Dec 03 06:45:32 crc kubenswrapper[4475]: I1203 06:45:32.461375 4475 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fqsjt\" (UniqueName: \"kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt\") pod \"efdd0498-1daa-4136-9a4a-3b948c2293fc\" (UID: \"efdd0498-1daa-4136-9a4a-3b948c2293fc\") " Dec 03 06:45:32 crc kubenswrapper[4475]: I1203 06:45:32.461390 4475 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config\") pod \"7539238d-5fe0-46ed-884e-1c3b566537ec\" (UID: \"7539238d-5fe0-46ed-884e-1c3b566537ec\") " Dec 03 06:45:32 crc kubenswrapper[4475]: I1203 06:45:32.461404 4475 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lzf88\" (UniqueName: \"kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88\") pod \"0b574797-001e-440a-8f4e-c0be86edad0f\" (UID: \"0b574797-001e-440a-8f4e-c0be86edad0f\") " Dec 03 06:45:32 crc kubenswrapper[4475]: I1203 06:45:32.461419 4475 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Dec 03 06:45:32 crc kubenswrapper[4475]: I1203 06:45:32.461433 4475 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Dec 03 06:45:32 crc kubenswrapper[4475]: I1203 06:45:32.461473 4475 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"kube-api-access-jkwtn\" (UniqueName: \"kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn\") pod \"5b88f790-22fa-440e-b583-365168c0b23d\" (UID: \"5b88f790-22fa-440e-b583-365168c0b23d\") " Dec 03 06:45:32 crc kubenswrapper[4475]: I1203 06:45:32.461490 4475 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle\") pod \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\" (UID: \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\") " Dec 03 06:45:32 crc kubenswrapper[4475]: I1203 06:45:32.461506 4475 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pjr6v\" (UniqueName: \"kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v\") pod \"49ef4625-1d3a-4a9f-b595-c2433d32326d\" (UID: \"49ef4625-1d3a-4a9f-b595-c2433d32326d\") " Dec 03 06:45:32 crc kubenswrapper[4475]: I1203 06:45:32.461520 4475 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs\") pod \"5b88f790-22fa-440e-b583-365168c0b23d\" (UID: \"5b88f790-22fa-440e-b583-365168c0b23d\") " Dec 03 06:45:32 crc kubenswrapper[4475]: I1203 06:45:32.461534 4475 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Dec 03 06:45:32 crc kubenswrapper[4475]: I1203 06:45:32.461548 4475 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" 
(UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Dec 03 06:45:32 crc kubenswrapper[4475]: I1203 06:45:32.461562 4475 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Dec 03 06:45:32 crc kubenswrapper[4475]: I1203 06:45:32.461577 4475 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Dec 03 06:45:32 crc kubenswrapper[4475]: I1203 06:45:32.461591 4475 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls\") pod \"87cf06ed-a83f-41a7-828d-70653580a8cb\" (UID: \"87cf06ed-a83f-41a7-828d-70653580a8cb\") " Dec 03 06:45:32 crc kubenswrapper[4475]: I1203 06:45:32.461605 4475 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls\") pod \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\" (UID: \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\") " Dec 03 06:45:32 crc kubenswrapper[4475]: I1203 06:45:32.461619 4475 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access\") pod \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\" (UID: \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\") " Dec 03 06:45:32 crc kubenswrapper[4475]: I1203 06:45:32.461637 4475 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics\") pod \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\" (UID: \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\") " Dec 03 06:45:32 crc kubenswrapper[4475]: I1203 06:45:32.461651 4475 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Dec 03 06:45:32 crc kubenswrapper[4475]: I1203 06:45:32.461668 4475 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities\") pod \"57a731c4-ef35-47a8-b875-bfb08a7f8011\" (UID: \"57a731c4-ef35-47a8-b875-bfb08a7f8011\") " Dec 03 06:45:32 crc kubenswrapper[4475]: I1203 06:45:32.461681 4475 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert\") pod \"7539238d-5fe0-46ed-884e-1c3b566537ec\" (UID: \"7539238d-5fe0-46ed-884e-1c3b566537ec\") " Dec 03 06:45:32 crc kubenswrapper[4475]: I1203 06:45:32.461695 4475 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Dec 03 06:45:32 crc kubenswrapper[4475]: I1203 06:45:32.461729 4475 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Dec 03 06:45:32 crc kubenswrapper[4475]: I1203 
06:45:32.461748 4475 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Dec 03 06:45:32 crc kubenswrapper[4475]: I1203 06:45:32.461763 4475 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Dec 03 06:45:32 crc kubenswrapper[4475]: I1203 06:45:32.461778 4475 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities\") pod \"5225d0e4-402f-4861-b410-819f433b1803\" (UID: \"5225d0e4-402f-4861-b410-819f433b1803\") " Dec 03 06:45:32 crc kubenswrapper[4475]: I1203 06:45:32.461793 4475 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Dec 03 06:45:32 crc kubenswrapper[4475]: I1203 06:45:32.461809 4475 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Dec 03 06:45:32 crc kubenswrapper[4475]: I1203 06:45:32.461822 4475 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles\") pod 
\"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Dec 03 06:45:32 crc kubenswrapper[4475]: I1203 06:45:32.461837 4475 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Dec 03 06:45:32 crc kubenswrapper[4475]: I1203 06:45:32.461851 4475 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Dec 03 06:45:32 crc kubenswrapper[4475]: I1203 06:45:32.461866 4475 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wxkg8\" (UniqueName: \"kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8\") pod \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\" (UID: \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\") " Dec 03 06:45:32 crc kubenswrapper[4475]: I1203 06:45:32.461881 4475 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-249nr\" (UniqueName: \"kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr\") pod \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\" (UID: \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\") " Dec 03 06:45:32 crc kubenswrapper[4475]: I1203 06:45:32.461896 4475 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access\") pod \"e7e6199b-1264-4501-8953-767f51328d08\" (UID: \"e7e6199b-1264-4501-8953-767f51328d08\") " Dec 03 06:45:32 crc kubenswrapper[4475]: I1203 06:45:32.461915 4475 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Dec 03 06:45:32 crc kubenswrapper[4475]: I1203 06:45:32.461930 4475 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token\") pod \"5fe579f8-e8a6-4643-bce5-a661393c4dde\" (UID: \"5fe579f8-e8a6-4643-bce5-a661393c4dde\") " Dec 03 06:45:32 crc kubenswrapper[4475]: I1203 06:45:32.461946 4475 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xcgwh\" (UniqueName: \"kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh\") pod \"fda69060-fa79-4696-b1a6-7980f124bf7c\" (UID: \"fda69060-fa79-4696-b1a6-7980f124bf7c\") " Dec 03 06:45:32 crc kubenswrapper[4475]: I1203 06:45:32.461963 4475 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x7zkh\" (UniqueName: \"kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh\") pod \"6731426b-95fe-49ff-bb5f-40441049fde2\" (UID: \"6731426b-95fe-49ff-bb5f-40441049fde2\") " Dec 03 06:45:32 crc kubenswrapper[4475]: I1203 06:45:32.461979 4475 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xcphl\" (UniqueName: \"kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Dec 03 06:45:32 crc kubenswrapper[4475]: I1203 06:45:32.461994 4475 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key\") pod 
\"25e176fe-21b4-4974-b1ed-c8b94f112a7f\" (UID: \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\") " Dec 03 06:45:32 crc kubenswrapper[4475]: I1203 06:45:32.462009 4475 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-279lb\" (UniqueName: \"kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb\") pod \"7bb08738-c794-4ee8-9972-3a62ca171029\" (UID: \"7bb08738-c794-4ee8-9972-3a62ca171029\") " Dec 03 06:45:32 crc kubenswrapper[4475]: I1203 06:45:32.462023 4475 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls\") pod \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\" (UID: \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\") " Dec 03 06:45:32 crc kubenswrapper[4475]: I1203 06:45:32.462037 4475 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert\") pod \"20b0d48f-5fd6-431c-a545-e3c800c7b866\" (UID: \"20b0d48f-5fd6-431c-a545-e3c800c7b866\") " Dec 03 06:45:32 crc kubenswrapper[4475]: I1203 06:45:32.462052 4475 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6ccd8\" (UniqueName: \"kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Dec 03 06:45:32 crc kubenswrapper[4475]: I1203 06:45:32.462067 4475 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Dec 03 06:45:32 crc kubenswrapper[4475]: I1203 06:45:32.462094 4475 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Dec 03 06:45:32 crc kubenswrapper[4475]: I1203 06:45:32.462109 4475 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gf66m\" (UniqueName: \"kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m\") pod \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\" (UID: \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\") " Dec 03 06:45:32 crc kubenswrapper[4475]: I1203 06:45:32.462123 4475 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca\") pod \"0b78653f-4ff9-4508-8672-245ed9b561e3\" (UID: \"0b78653f-4ff9-4508-8672-245ed9b561e3\") " Dec 03 06:45:32 crc kubenswrapper[4475]: I1203 06:45:32.462138 4475 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls\") pod \"0b574797-001e-440a-8f4e-c0be86edad0f\" (UID: \"0b574797-001e-440a-8f4e-c0be86edad0f\") " Dec 03 06:45:32 crc kubenswrapper[4475]: I1203 06:45:32.462153 4475 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Dec 03 06:45:32 crc kubenswrapper[4475]: I1203 06:45:32.462169 4475 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-htfz6\" (UniqueName: \"kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " 
Dec 03 06:45:32 crc kubenswrapper[4475]: I1203 06:45:32.462185 4475 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Dec 03 06:45:32 crc kubenswrapper[4475]: I1203 06:45:32.462200 4475 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Dec 03 06:45:32 crc kubenswrapper[4475]: I1203 06:45:32.462215 4475 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2d4wz\" (UniqueName: \"kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Dec 03 06:45:32 crc kubenswrapper[4475]: I1203 06:45:32.462229 4475 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Dec 03 06:45:32 crc kubenswrapper[4475]: I1203 06:45:32.462244 4475 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Dec 03 06:45:32 crc kubenswrapper[4475]: I1203 06:45:32.462260 4475 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: 
\"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content\") pod \"1d611f23-29be-4491-8495-bee1670e935f\" (UID: \"1d611f23-29be-4491-8495-bee1670e935f\") " Dec 03 06:45:32 crc kubenswrapper[4475]: I1203 06:45:32.462274 4475 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy\") pod \"7bb08738-c794-4ee8-9972-3a62ca171029\" (UID: \"7bb08738-c794-4ee8-9972-3a62ca171029\") " Dec 03 06:45:32 crc kubenswrapper[4475]: I1203 06:45:32.462288 4475 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy\") pod \"4bb40260-dbaa-4fb0-84df-5e680505d512\" (UID: \"4bb40260-dbaa-4fb0-84df-5e680505d512\") " Dec 03 06:45:32 crc kubenswrapper[4475]: I1203 06:45:32.462302 4475 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Dec 03 06:45:32 crc kubenswrapper[4475]: I1203 06:45:32.462319 4475 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d4lsv\" (UniqueName: \"kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv\") pod \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\" (UID: \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\") " Dec 03 06:45:32 crc kubenswrapper[4475]: I1203 06:45:32.462333 4475 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Dec 03 06:45:32 crc kubenswrapper[4475]: I1203 06:45:32.462349 
4475 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Dec 03 06:45:32 crc kubenswrapper[4475]: I1203 06:45:32.462365 4475 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Dec 03 06:45:32 crc kubenswrapper[4475]: I1203 06:45:32.462380 4475 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Dec 03 06:45:32 crc kubenswrapper[4475]: I1203 06:45:32.462396 4475 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jhbk2\" (UniqueName: \"kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2\") pod \"bd23aa5c-e532-4e53-bccf-e79f130c5ae8\" (UID: \"bd23aa5c-e532-4e53-bccf-e79f130c5ae8\") " Dec 03 06:45:32 crc kubenswrapper[4475]: I1203 06:45:32.462411 4475 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vt5rc\" (UniqueName: \"kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc\") pod \"44663579-783b-4372-86d6-acf235a62d72\" (UID: \"44663579-783b-4372-86d6-acf235a62d72\") " Dec 03 06:45:32 crc kubenswrapper[4475]: I1203 06:45:32.462430 4475 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rnphk\" (UniqueName: 
\"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Dec 03 06:45:32 crc kubenswrapper[4475]: I1203 06:45:32.462471 4475 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates\") pod \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\" (UID: \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\") " Dec 03 06:45:32 crc kubenswrapper[4475]: I1203 06:45:32.462488 4475 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Dec 03 06:45:32 crc kubenswrapper[4475]: I1203 06:45:32.462505 4475 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qs4fp\" (UniqueName: \"kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp\") pod \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\" (UID: \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\") " Dec 03 06:45:32 crc kubenswrapper[4475]: I1203 06:45:32.462520 4475 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Dec 03 06:45:32 crc kubenswrapper[4475]: I1203 06:45:32.462535 4475 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Dec 03 06:45:32 crc 
kubenswrapper[4475]: I1203 06:45:32.462549 4475 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Dec 03 06:45:32 crc kubenswrapper[4475]: I1203 06:45:32.462564 4475 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dbsvg\" (UniqueName: \"kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg\") pod \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\" (UID: \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\") " Dec 03 06:45:32 crc kubenswrapper[4475]: I1203 06:45:32.462579 4475 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert\") pod \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\" (UID: \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\") " Dec 03 06:45:32 crc kubenswrapper[4475]: I1203 06:45:32.462593 4475 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w7l8j\" (UniqueName: \"kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j\") pod \"01ab3dd5-8196-46d0-ad33-122e2ca51def\" (UID: \"01ab3dd5-8196-46d0-ad33-122e2ca51def\") " Dec 03 06:45:32 crc kubenswrapper[4475]: I1203 06:45:32.462608 4475 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Dec 03 06:45:32 crc kubenswrapper[4475]: I1203 06:45:32.462622 4475 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qg5z5\" (UniqueName: 
\"kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Dec 03 06:45:32 crc kubenswrapper[4475]: I1203 06:45:32.462637 4475 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert\") pod \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\" (UID: \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\") " Dec 03 06:45:32 crc kubenswrapper[4475]: I1203 06:45:32.462666 4475 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Dec 03 06:45:32 crc kubenswrapper[4475]: I1203 06:45:32.462682 4475 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Dec 03 06:45:32 crc kubenswrapper[4475]: I1203 06:45:32.462697 4475 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-v47cf\" (UniqueName: \"kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Dec 03 06:45:32 crc kubenswrapper[4475]: I1203 06:45:32.462712 4475 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access\") pod \"0b78653f-4ff9-4508-8672-245ed9b561e3\" (UID: \"0b78653f-4ff9-4508-8672-245ed9b561e3\") " Dec 03 06:45:32 crc 
kubenswrapper[4475]: I1203 06:45:32.462728 4475 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w9rds\" (UniqueName: \"kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds\") pod \"20b0d48f-5fd6-431c-a545-e3c800c7b866\" (UID: \"20b0d48f-5fd6-431c-a545-e3c800c7b866\") " Dec 03 06:45:32 crc kubenswrapper[4475]: I1203 06:45:32.462743 4475 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config\") pod \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\" (UID: \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\") " Dec 03 06:45:32 crc kubenswrapper[4475]: I1203 06:45:32.462770 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-env-overrides\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Dec 03 06:45:32 crc kubenswrapper[4475]: I1203 06:45:32.462789 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2kz5\" (UniqueName: \"kubernetes.io/projected/ef543e1b-8068-4ea3-b32a-61027b32e95d-kube-api-access-s2kz5\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Dec 03 06:45:32 crc kubenswrapper[4475]: I1203 06:45:32.462806 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 03 06:45:32 crc 
kubenswrapper[4475]: I1203 06:45:32.462826 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/37a5e44f-9a88-4405-be8a-b645485e7312-host-etc-kube\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Dec 03 06:45:32 crc kubenswrapper[4475]: I1203 06:45:32.462842 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-iptables-alerter-script\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Dec 03 06:45:32 crc kubenswrapper[4475]: I1203 06:45:32.462857 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-host-slash\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Dec 03 06:45:32 crc kubenswrapper[4475]: I1203 06:45:32.462874 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rczfb\" (UniqueName: \"kubernetes.io/projected/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-kube-api-access-rczfb\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Dec 03 06:45:32 crc kubenswrapper[4475]: I1203 06:45:32.462891 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rdwmf\" (UniqueName: \"kubernetes.io/projected/37a5e44f-9a88-4405-be8a-b645485e7312-kube-api-access-rdwmf\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") 
" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Dec 03 06:45:32 crc kubenswrapper[4475]: I1203 06:45:32.462906 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-identity-cm\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-ovnkube-identity-cm\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Dec 03 06:45:32 crc kubenswrapper[4475]: I1203 06:45:32.462923 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 03 06:45:32 crc kubenswrapper[4475]: I1203 06:45:32.462938 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/ef543e1b-8068-4ea3-b32a-61027b32e95d-webhook-cert\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Dec 03 06:45:32 crc kubenswrapper[4475]: I1203 06:45:32.462953 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/37a5e44f-9a88-4405-be8a-b645485e7312-metrics-tls\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Dec 03 06:45:32 crc kubenswrapper[4475]: I1203 06:45:32.462991 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: 
\"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 03 06:45:32 crc kubenswrapper[4475]: I1203 06:45:32.463007 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 03 06:45:32 crc kubenswrapper[4475]: I1203 06:45:32.463040 4475 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2w9zh\" (UniqueName: \"kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh\") on node \"crc\" DevicePath \"\"" Dec 03 06:45:32 crc kubenswrapper[4475]: I1203 06:45:32.463051 4475 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6g6sz\" (UniqueName: \"kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz\") on node \"crc\" DevicePath \"\"" Dec 03 06:45:32 crc kubenswrapper[4475]: I1203 06:45:32.463060 4475 reconciler_common.go:293] "Volume detached for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config\") on node \"crc\" DevicePath \"\"" Dec 03 06:45:32 crc kubenswrapper[4475]: I1203 06:45:32.463070 4475 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities\") on node \"crc\" DevicePath \"\"" Dec 03 06:45:32 crc kubenswrapper[4475]: I1203 06:45:32.463078 4475 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zgdk5\" (UniqueName: 
\"kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5\") on node \"crc\" DevicePath \"\"" Dec 03 06:45:32 crc kubenswrapper[4475]: I1203 06:45:32.463098 4475 reconciler_common.go:293] "Volume detached for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist\") on node \"crc\" DevicePath \"\"" Dec 03 06:45:32 crc kubenswrapper[4475]: I1203 06:45:32.463107 4475 reconciler_common.go:293] "Volume detached for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies\") on node \"crc\" DevicePath \"\"" Dec 03 06:45:32 crc kubenswrapper[4475]: I1203 06:45:32.460381 4475 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config" (OuterVolumeSpecName: "config") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 06:45:32 crc kubenswrapper[4475]: I1203 06:45:32.460467 4475 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client" (OuterVolumeSpecName: "etcd-client") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "etcd-client". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 06:45:32 crc kubenswrapper[4475]: I1203 06:45:32.460525 4475 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config" (OuterVolumeSpecName: "ovnkube-config") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "ovnkube-config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 06:45:32 crc kubenswrapper[4475]: I1203 06:45:32.460526 4475 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" (UID: "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 06:45:32 crc kubenswrapper[4475]: I1203 06:45:32.460591 4475 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh" (OuterVolumeSpecName: "kube-api-access-x4zgh") pod "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" (UID: "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d"). InnerVolumeSpecName "kube-api-access-x4zgh". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 06:45:32 crc kubenswrapper[4475]: I1203 06:45:32.460667 4475 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c" (OuterVolumeSpecName: "kube-api-access-tk88c") pod "7539238d-5fe0-46ed-884e-1c3b566537ec" (UID: "7539238d-5fe0-46ed-884e-1c3b566537ec"). InnerVolumeSpecName "kube-api-access-tk88c". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 06:45:32 crc kubenswrapper[4475]: I1203 06:45:32.460745 4475 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig" (OuterVolumeSpecName: "v4-0-config-system-cliconfig") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-cliconfig". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 06:45:32 crc kubenswrapper[4475]: I1203 06:45:32.460784 4475 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf" (OuterVolumeSpecName: "kube-api-access-7c4vf") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "kube-api-access-7c4vf". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 06:45:32 crc kubenswrapper[4475]: I1203 06:45:32.460887 4475 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 06:45:32 crc kubenswrapper[4475]: I1203 06:45:32.460912 4475 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz" (OuterVolumeSpecName: "kube-api-access-bf2bz") pod "1d611f23-29be-4491-8495-bee1670e935f" (UID: "1d611f23-29be-4491-8495-bee1670e935f"). InnerVolumeSpecName "kube-api-access-bf2bz". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 06:45:32 crc kubenswrapper[4475]: I1203 06:45:32.461117 4475 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config" (OuterVolumeSpecName: "config") pod "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" (UID: "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 06:45:32 crc kubenswrapper[4475]: I1203 06:45:32.461246 4475 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn" (OuterVolumeSpecName: "kube-api-access-lz9wn") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "kube-api-access-lz9wn". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 06:45:32 crc kubenswrapper[4475]: I1203 06:45:32.461385 4475 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs" (OuterVolumeSpecName: "kube-api-access-pcxfs") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "kube-api-access-pcxfs". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 06:45:32 crc kubenswrapper[4475]: I1203 06:45:32.461535 4475 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca" (OuterVolumeSpecName: "image-import-ca") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "image-import-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 06:45:32 crc kubenswrapper[4475]: I1203 06:45:32.461552 4475 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session" (OuterVolumeSpecName: "v4-0-config-system-session") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-session". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 06:45:32 crc kubenswrapper[4475]: I1203 06:45:32.461696 4475 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert" (OuterVolumeSpecName: "profile-collector-cert") pod "b6312bbd-5731-4ea0-a20f-81d5a57df44a" (UID: "b6312bbd-5731-4ea0-a20f-81d5a57df44a"). InnerVolumeSpecName "profile-collector-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 06:45:32 crc kubenswrapper[4475]: I1203 06:45:32.461862 4475 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs" (OuterVolumeSpecName: "tmpfs") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "tmpfs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 06:45:32 crc kubenswrapper[4475]: I1203 06:45:32.461937 4475 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca" (OuterVolumeSpecName: "client-ca") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 06:45:32 crc kubenswrapper[4475]: I1203 06:45:32.462031 4475 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs" (OuterVolumeSpecName: "webhook-certs") pod "efdd0498-1daa-4136-9a4a-3b948c2293fc" (UID: "efdd0498-1daa-4136-9a4a-3b948c2293fc"). InnerVolumeSpecName "webhook-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 06:45:32 crc kubenswrapper[4475]: I1203 06:45:32.462320 4475 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config" (OuterVolumeSpecName: "encryption-config") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "encryption-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 06:45:32 crc kubenswrapper[4475]: I1203 06:45:32.462463 4475 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz" (OuterVolumeSpecName: "kube-api-access-8tdtz") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "kube-api-access-8tdtz". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 06:45:32 crc kubenswrapper[4475]: I1203 06:45:32.462886 4475 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets" (OuterVolumeSpecName: "installation-pull-secrets") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "installation-pull-secrets". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 06:45:32 crc kubenswrapper[4475]: I1203 06:45:32.463042 4475 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 06:45:32 crc kubenswrapper[4475]: I1203 06:45:32.463105 4475 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4" (OuterVolumeSpecName: "kube-api-access-w4xd4") pod "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" (UID: "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b"). InnerVolumeSpecName "kube-api-access-w4xd4". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 06:45:32 crc kubenswrapper[4475]: I1203 06:45:32.463157 4475 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "0b78653f-4ff9-4508-8672-245ed9b561e3" (UID: "0b78653f-4ff9-4508-8672-245ed9b561e3"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 06:45:32 crc kubenswrapper[4475]: E1203 06:45:32.463164 4475 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Dec 03 06:45:32 crc kubenswrapper[4475]: E1203 06:45:32.475076 4475 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-12-03 06:45:32.975064701 +0000 UTC m=+17.779963035 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Dec 03 06:45:32 crc kubenswrapper[4475]: I1203 06:45:32.475617 4475 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb" (OuterVolumeSpecName: "kube-api-access-mg5zb") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "kube-api-access-mg5zb". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 06:45:32 crc kubenswrapper[4475]: I1203 06:45:32.475954 4475 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "1d611f23-29be-4491-8495-bee1670e935f" (UID: "1d611f23-29be-4491-8495-bee1670e935f"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 06:45:32 crc kubenswrapper[4475]: I1203 06:45:32.475967 4475 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca" (OuterVolumeSpecName: "serviceca") pod "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" (UID: "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59"). InnerVolumeSpecName "serviceca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 06:45:32 crc kubenswrapper[4475]: I1203 06:45:32.476247 4475 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 06:45:32 crc kubenswrapper[4475]: I1203 06:45:32.476489 4475 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls" (OuterVolumeSpecName: "machine-approver-tls") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "machine-approver-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 06:45:32 crc kubenswrapper[4475]: I1203 06:45:32.476704 4475 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert" (OuterVolumeSpecName: "profile-collector-cert") pod "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" (UID: "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9"). InnerVolumeSpecName "profile-collector-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 06:45:32 crc kubenswrapper[4475]: I1203 06:45:32.476795 4475 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "proxy-ca-bundles". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 06:45:32 crc kubenswrapper[4475]: I1203 06:45:32.476952 4475 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7" (OuterVolumeSpecName: "kube-api-access-sb6h7") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "kube-api-access-sb6h7". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 06:45:32 crc kubenswrapper[4475]: I1203 06:45:32.477030 4475 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy" (OuterVolumeSpecName: "cni-binary-copy") pod "7bb08738-c794-4ee8-9972-3a62ca171029" (UID: "7bb08738-c794-4ee8-9972-3a62ca171029"). InnerVolumeSpecName "cni-binary-copy". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 06:45:32 crc kubenswrapper[4475]: I1203 06:45:32.477165 4475 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images" (OuterVolumeSpecName: "images") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "images". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 06:45:32 crc kubenswrapper[4475]: I1203 06:45:32.477170 4475 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx" (OuterVolumeSpecName: "kube-api-access-d6qdx") pod "87cf06ed-a83f-41a7-828d-70653580a8cb" (UID: "87cf06ed-a83f-41a7-828d-70653580a8cb"). InnerVolumeSpecName "kube-api-access-d6qdx". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 06:45:32 crc kubenswrapper[4475]: I1203 06:45:32.463422 4475 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config" (OuterVolumeSpecName: "mcd-auth-proxy-config") pod "fda69060-fa79-4696-b1a6-7980f124bf7c" (UID: "fda69060-fa79-4696-b1a6-7980f124bf7c"). InnerVolumeSpecName "mcd-auth-proxy-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 06:45:32 crc kubenswrapper[4475]: I1203 06:45:32.463594 4475 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 06:45:32 crc kubenswrapper[4475]: I1203 06:45:32.463559 4475 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle" (OuterVolumeSpecName: "service-ca-bundle") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "service-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 06:45:32 crc kubenswrapper[4475]: I1203 06:45:32.463720 4475 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth" (OuterVolumeSpecName: "stats-auth") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "stats-auth". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 06:45:32 crc kubenswrapper[4475]: I1203 06:45:32.463847 4475 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 06:45:32 crc kubenswrapper[4475]: I1203 06:45:32.463919 4475 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client" (OuterVolumeSpecName: "etcd-client") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "etcd-client". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 06:45:32 crc kubenswrapper[4475]: I1203 06:45:32.463983 4475 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls" (OuterVolumeSpecName: "registry-tls") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "registry-tls". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 06:45:32 crc kubenswrapper[4475]: I1203 06:45:32.464130 4475 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6" (OuterVolumeSpecName: "kube-api-access-htfz6") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "kube-api-access-htfz6". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 06:45:32 crc kubenswrapper[4475]: I1203 06:45:32.464248 4475 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls" (OuterVolumeSpecName: "machine-api-operator-tls") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "machine-api-operator-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 06:45:32 crc kubenswrapper[4475]: I1203 06:45:32.465017 4475 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config" (OuterVolumeSpecName: "console-config") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 06:45:32 crc kubenswrapper[4475]: I1203 06:45:32.465234 4475 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz" (OuterVolumeSpecName: "kube-api-access-2d4wz") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "kube-api-access-2d4wz". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 06:45:32 crc kubenswrapper[4475]: I1203 06:45:32.465380 4475 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs" (OuterVolumeSpecName: "metrics-certs") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "metrics-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 06:45:32 crc kubenswrapper[4475]: I1203 06:45:32.465719 4475 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 06:45:32 crc kubenswrapper[4475]: I1203 06:45:32.477338 4475 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 06:45:32 crc kubenswrapper[4475]: I1203 06:45:32.466926 4475 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate" (OuterVolumeSpecName: "default-certificate") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "default-certificate". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 06:45:32 crc kubenswrapper[4475]: I1203 06:45:32.467120 4475 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt" (OuterVolumeSpecName: "kube-api-access-fqsjt") pod "efdd0498-1daa-4136-9a4a-3b948c2293fc" (UID: "efdd0498-1daa-4136-9a4a-3b948c2293fc"). InnerVolumeSpecName "kube-api-access-fqsjt". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 06:45:32 crc kubenswrapper[4475]: I1203 06:45:32.467671 4475 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config" (OuterVolumeSpecName: "config") pod "7539238d-5fe0-46ed-884e-1c3b566537ec" (UID: "7539238d-5fe0-46ed-884e-1c3b566537ec"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 06:45:32 crc kubenswrapper[4475]: I1203 06:45:32.467898 4475 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88" (OuterVolumeSpecName: "kube-api-access-lzf88") pod "0b574797-001e-440a-8f4e-c0be86edad0f" (UID: "0b574797-001e-440a-8f4e-c0be86edad0f"). InnerVolumeSpecName "kube-api-access-lzf88". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 06:45:32 crc kubenswrapper[4475]: I1203 06:45:32.468058 4475 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data" (OuterVolumeSpecName: "v4-0-config-user-idp-0-file-data") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-idp-0-file-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 06:45:32 crc kubenswrapper[4475]: I1203 06:45:32.468468 4475 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies" (OuterVolumeSpecName: "audit-policies") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "audit-policies". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 06:45:32 crc kubenswrapper[4475]: I1203 06:45:32.468690 4475 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn" (OuterVolumeSpecName: "kube-api-access-jkwtn") pod "5b88f790-22fa-440e-b583-365168c0b23d" (UID: "5b88f790-22fa-440e-b583-365168c0b23d"). InnerVolumeSpecName "kube-api-access-jkwtn". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 06:45:32 crc kubenswrapper[4475]: I1203 06:45:32.469004 4475 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle" (OuterVolumeSpecName: "signing-cabundle") pod "25e176fe-21b4-4974-b1ed-c8b94f112a7f" (UID: "25e176fe-21b4-4974-b1ed-c8b94f112a7f"). InnerVolumeSpecName "signing-cabundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 06:45:32 crc kubenswrapper[4475]: I1203 06:45:32.469204 4475 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v" (OuterVolumeSpecName: "kube-api-access-pjr6v") pod "49ef4625-1d3a-4a9f-b595-c2433d32326d" (UID: "49ef4625-1d3a-4a9f-b595-c2433d32326d"). InnerVolumeSpecName "kube-api-access-pjr6v". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 06:45:32 crc kubenswrapper[4475]: I1203 06:45:32.469344 4475 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs" (OuterVolumeSpecName: "metrics-certs") pod "5b88f790-22fa-440e-b583-365168c0b23d" (UID: "5b88f790-22fa-440e-b583-365168c0b23d"). InnerVolumeSpecName "metrics-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 06:45:32 crc kubenswrapper[4475]: I1203 06:45:32.469558 4475 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert" (OuterVolumeSpecName: "webhook-cert") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "webhook-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 06:45:32 crc kubenswrapper[4475]: I1203 06:45:32.477421 4475 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy" (OuterVolumeSpecName: "cni-binary-copy") pod "4bb40260-dbaa-4fb0-84df-5e680505d512" (UID: "4bb40260-dbaa-4fb0-84df-5e680505d512"). InnerVolumeSpecName "cni-binary-copy". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 06:45:32 crc kubenswrapper[4475]: I1203 06:45:32.469755 4475 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides" (OuterVolumeSpecName: "env-overrides") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "env-overrides". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 06:45:32 crc kubenswrapper[4475]: I1203 06:45:32.470068 4475 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "trusted-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 06:45:32 crc kubenswrapper[4475]: I1203 06:45:32.470280 4475 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert" (OuterVolumeSpecName: "ovn-control-plane-metrics-cert") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "ovn-control-plane-metrics-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 06:45:32 crc kubenswrapper[4475]: I1203 06:45:32.470428 4475 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls" (OuterVolumeSpecName: "metrics-tls") pod "87cf06ed-a83f-41a7-828d-70653580a8cb" (UID: "87cf06ed-a83f-41a7-828d-70653580a8cb"). InnerVolumeSpecName "metrics-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 06:45:32 crc kubenswrapper[4475]: I1203 06:45:32.470594 4475 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls" (OuterVolumeSpecName: "samples-operator-tls") pod "a0128f3a-b052-44ed-a84e-c4c8aaf17c13" (UID: "a0128f3a-b052-44ed-a84e-c4c8aaf17c13"). InnerVolumeSpecName "samples-operator-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 06:45:32 crc kubenswrapper[4475]: I1203 06:45:32.470737 4475 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "1386a44e-36a2-460c-96d0-0359d2b6f0f5" (UID: "1386a44e-36a2-460c-96d0-0359d2b6f0f5"). InnerVolumeSpecName "kube-api-access". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 06:45:32 crc kubenswrapper[4475]: I1203 06:45:32.470880 4475 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics" (OuterVolumeSpecName: "marketplace-operator-metrics") pod "b6cd30de-2eeb-49a2-ab40-9167f4560ff5" (UID: "b6cd30de-2eeb-49a2-ab40-9167f4560ff5"). InnerVolumeSpecName "marketplace-operator-metrics". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 06:45:32 crc kubenswrapper[4475]: I1203 06:45:32.471024 4475 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 06:45:32 crc kubenswrapper[4475]: I1203 06:45:32.471208 4475 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities" (OuterVolumeSpecName: "utilities") pod "57a731c4-ef35-47a8-b875-bfb08a7f8011" (UID: "57a731c4-ef35-47a8-b875-bfb08a7f8011"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 06:45:32 crc kubenswrapper[4475]: I1203 06:45:32.472552 4475 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "console-serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 06:45:32 crc kubenswrapper[4475]: I1203 06:45:32.472777 4475 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "7539238d-5fe0-46ed-884e-1c3b566537ec" (UID: "7539238d-5fe0-46ed-884e-1c3b566537ec"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 06:45:32 crc kubenswrapper[4475]: I1203 06:45:32.472874 4475 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj" (OuterVolumeSpecName: "kube-api-access-4d4hj") pod "3ab1a177-2de0-46d9-b765-d0d0649bb42e" (UID: "3ab1a177-2de0-46d9-b765-d0d0649bb42e"). InnerVolumeSpecName "kube-api-access-4d4hj". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 06:45:32 crc kubenswrapper[4475]: I1203 06:45:32.472880 4475 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca" (OuterVolumeSpecName: "etcd-service-ca") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "etcd-service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 06:45:32 crc kubenswrapper[4475]: I1203 06:45:32.472977 4475 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client" (OuterVolumeSpecName: "etcd-client") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "etcd-client". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 06:45:32 crc kubenswrapper[4475]: I1203 06:45:32.473047 4475 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert" (OuterVolumeSpecName: "srv-cert") pod "b6312bbd-5731-4ea0-a20f-81d5a57df44a" (UID: "b6312bbd-5731-4ea0-a20f-81d5a57df44a"). InnerVolumeSpecName "srv-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 06:45:32 crc kubenswrapper[4475]: I1203 06:45:32.473181 4475 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images" (OuterVolumeSpecName: "images") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "images". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 06:45:32 crc kubenswrapper[4475]: I1203 06:45:32.473218 4475 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "01ab3dd5-8196-46d0-ad33-122e2ca51def" (UID: "01ab3dd5-8196-46d0-ad33-122e2ca51def"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 06:45:32 crc kubenswrapper[4475]: I1203 06:45:32.473428 4475 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "trusted-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 06:45:32 crc kubenswrapper[4475]: I1203 06:45:32.473572 4475 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct" (OuterVolumeSpecName: "kube-api-access-cfbct") pod "57a731c4-ef35-47a8-b875-bfb08a7f8011" (UID: "57a731c4-ef35-47a8-b875-bfb08a7f8011"). InnerVolumeSpecName "kube-api-access-cfbct". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 06:45:32 crc kubenswrapper[4475]: I1203 06:45:32.473580 4475 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit" (OuterVolumeSpecName: "audit") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "audit". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 06:45:32 crc kubenswrapper[4475]: I1203 06:45:32.473886 4475 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 06:45:32 crc kubenswrapper[4475]: I1203 06:45:32.473940 4475 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config" (OuterVolumeSpecName: "auth-proxy-config") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "auth-proxy-config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 06:45:32 crc kubenswrapper[4475]: I1203 06:45:32.474008 4475 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7" (OuterVolumeSpecName: "kube-api-access-kfwg7") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "kube-api-access-kfwg7". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 06:45:32 crc kubenswrapper[4475]: I1203 06:45:32.474057 4475 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle" (OuterVolumeSpecName: "v4-0-config-system-trusted-ca-bundle") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 06:45:32 crc kubenswrapper[4475]: I1203 06:45:32.474120 4475 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls" (OuterVolumeSpecName: "control-plane-machine-set-operator-tls") pod "6731426b-95fe-49ff-bb5f-40441049fde2" (UID: "6731426b-95fe-49ff-bb5f-40441049fde2"). InnerVolumeSpecName "control-plane-machine-set-operator-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 06:45:32 crc kubenswrapper[4475]: I1203 06:45:32.474248 4475 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert" (OuterVolumeSpecName: "v4-0-config-system-serving-cert") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 06:45:32 crc kubenswrapper[4475]: I1203 06:45:32.474269 4475 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides" (OuterVolumeSpecName: "env-overrides") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "env-overrides". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 06:45:32 crc kubenswrapper[4475]: I1203 06:45:32.474382 4475 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config" (OuterVolumeSpecName: "mcc-auth-proxy-config") pod "0b574797-001e-440a-8f4e-c0be86edad0f" (UID: "0b574797-001e-440a-8f4e-c0be86edad0f"). InnerVolumeSpecName "mcc-auth-proxy-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 06:45:32 crc kubenswrapper[4475]: I1203 06:45:32.474397 4475 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls" (OuterVolumeSpecName: "metrics-tls") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "metrics-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 06:45:32 crc kubenswrapper[4475]: I1203 06:45:32.474417 4475 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection" (OuterVolumeSpecName: "v4-0-config-user-template-provider-selection") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-template-provider-selection". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 06:45:32 crc kubenswrapper[4475]: I1203 06:45:32.474430 4475 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities" (OuterVolumeSpecName: "utilities") pod "5225d0e4-402f-4861-b410-819f433b1803" (UID: "5225d0e4-402f-4861-b410-819f433b1803"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 06:45:32 crc kubenswrapper[4475]: I1203 06:45:32.474691 4475 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52" (OuterVolumeSpecName: "kube-api-access-s4n52") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "kube-api-access-s4n52". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 06:45:32 crc kubenswrapper[4475]: I1203 06:45:32.474758 4475 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 06:45:32 crc kubenswrapper[4475]: I1203 06:45:32.474858 4475 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7" (OuterVolumeSpecName: "kube-api-access-nzwt7") pod "96b93a3a-6083-4aea-8eab-fe1aa8245ad9" (UID: "96b93a3a-6083-4aea-8eab-fe1aa8245ad9"). InnerVolumeSpecName "kube-api-access-nzwt7". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 06:45:32 crc kubenswrapper[4475]: I1203 06:45:32.474867 4475 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs" (OuterVolumeSpecName: "v4-0-config-system-router-certs") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-router-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 06:45:32 crc kubenswrapper[4475]: I1203 06:45:32.466120 4475 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert" (OuterVolumeSpecName: "ovn-node-metrics-cert") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "ovn-node-metrics-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 06:45:32 crc kubenswrapper[4475]: I1203 06:45:32.477575 4475 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85" (OuterVolumeSpecName: "kube-api-access-x2m85") pod "cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d" (UID: "cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d"). InnerVolumeSpecName "kube-api-access-x2m85". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 06:45:32 crc kubenswrapper[4475]: I1203 06:45:32.477431 4475 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib" (OuterVolumeSpecName: "ovnkube-script-lib") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "ovnkube-script-lib". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 06:45:32 crc kubenswrapper[4475]: I1203 06:45:32.477730 4475 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config" (OuterVolumeSpecName: "config") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 06:45:32 crc kubenswrapper[4475]: I1203 06:45:32.477779 4475 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd" (OuterVolumeSpecName: "kube-api-access-mnrrd") pod "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" (UID: "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d"). InnerVolumeSpecName "kube-api-access-mnrrd". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 06:45:32 crc kubenswrapper[4475]: I1203 06:45:32.477989 4475 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "496e6271-fb68-4057-954e-a0d97a4afa3f" (UID: "496e6271-fb68-4057-954e-a0d97a4afa3f"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 06:45:32 crc kubenswrapper[4475]: I1203 06:45:32.478046 4475 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca" (OuterVolumeSpecName: "v4-0-config-system-service-ca") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-service-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 06:45:32 crc kubenswrapper[4475]: I1203 06:45:32.478334 4475 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert" (OuterVolumeSpecName: "srv-cert") pod "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" (UID: "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9"). InnerVolumeSpecName "srv-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 06:45:32 crc kubenswrapper[4475]: I1203 06:45:32.478523 4475 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8" (OuterVolumeSpecName: "kube-api-access-wxkg8") pod "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" (UID: "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59"). InnerVolumeSpecName "kube-api-access-wxkg8". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 06:45:32 crc kubenswrapper[4475]: I1203 06:45:32.478576 4475 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv" (OuterVolumeSpecName: "kube-api-access-zkvpv") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "kube-api-access-zkvpv". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 06:45:32 crc kubenswrapper[4475]: I1203 06:45:32.478814 4475 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config" (OuterVolumeSpecName: "config") pod "01ab3dd5-8196-46d0-ad33-122e2ca51def" (UID: "01ab3dd5-8196-46d0-ad33-122e2ca51def"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 06:45:32 crc kubenswrapper[4475]: I1203 06:45:32.478826 4475 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv" (OuterVolumeSpecName: "kube-api-access-d4lsv") pod "25e176fe-21b4-4974-b1ed-c8b94f112a7f" (UID: "25e176fe-21b4-4974-b1ed-c8b94f112a7f"). InnerVolumeSpecName "kube-api-access-d4lsv". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 06:45:32 crc kubenswrapper[4475]: I1203 06:45:32.478857 4475 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template" (OuterVolumeSpecName: "v4-0-config-system-ocp-branding-template") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-ocp-branding-template". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 06:45:32 crc kubenswrapper[4475]: I1203 06:45:32.478927 4475 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr" (OuterVolumeSpecName: "kube-api-access-249nr") pod "b6312bbd-5731-4ea0-a20f-81d5a57df44a" (UID: "b6312bbd-5731-4ea0-a20f-81d5a57df44a"). InnerVolumeSpecName "kube-api-access-249nr". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 06:45:32 crc kubenswrapper[4475]: I1203 06:45:32.479129 4475 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert" (OuterVolumeSpecName: "package-server-manager-serving-cert") pod "3ab1a177-2de0-46d9-b765-d0d0649bb42e" (UID: "3ab1a177-2de0-46d9-b765-d0d0649bb42e"). InnerVolumeSpecName "package-server-manager-serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 06:45:32 crc kubenswrapper[4475]: I1203 06:45:32.477181 4475 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "e7e6199b-1264-4501-8953-767f51328d08" (UID: "e7e6199b-1264-4501-8953-767f51328d08"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 06:45:32 crc kubenswrapper[4475]: I1203 06:45:32.479329 4475 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "e7e6199b-1264-4501-8953-767f51328d08" (UID: "e7e6199b-1264-4501-8953-767f51328d08"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 06:45:32 crc kubenswrapper[4475]: I1203 06:45:32.479636 4475 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 06:45:32 crc kubenswrapper[4475]: I1203 06:45:32.480258 4475 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config" (OuterVolumeSpecName: "config") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 06:45:32 crc kubenswrapper[4475]: I1203 06:45:32.480586 4475 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities" (OuterVolumeSpecName: "utilities") pod "1d611f23-29be-4491-8495-bee1670e935f" (UID: "1d611f23-29be-4491-8495-bee1670e935f"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 06:45:32 crc kubenswrapper[4475]: I1203 06:45:32.480596 4475 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca" (OuterVolumeSpecName: "marketplace-trusted-ca") pod "b6cd30de-2eeb-49a2-ab40-9167f4560ff5" (UID: "b6cd30de-2eeb-49a2-ab40-9167f4560ff5"). InnerVolumeSpecName "marketplace-trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 06:45:32 crc kubenswrapper[4475]: I1203 06:45:32.480908 4475 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 06:45:32 crc kubenswrapper[4475]: I1203 06:45:32.481314 4475 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" (UID: "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 06:45:32 crc kubenswrapper[4475]: I1203 06:45:32.481427 4475 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config" (OuterVolumeSpecName: "config") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 06:45:32 crc kubenswrapper[4475]: I1203 06:45:32.481579 4475 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config" (OuterVolumeSpecName: "config") pod "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" (UID: "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 06:45:32 crc kubenswrapper[4475]: I1203 06:45:32.481710 4475 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config" (OuterVolumeSpecName: "config") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 06:45:32 crc kubenswrapper[4475]: I1203 06:45:32.481975 4475 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config" (OuterVolumeSpecName: "auth-proxy-config") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "auth-proxy-config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 06:45:32 crc kubenswrapper[4475]: I1203 06:45:32.482677 4475 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782" (OuterVolumeSpecName: "kube-api-access-pj782") pod "b6cd30de-2eeb-49a2-ab40-9167f4560ff5" (UID: "b6cd30de-2eeb-49a2-ab40-9167f4560ff5"). InnerVolumeSpecName "kube-api-access-pj782". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 06:45:32 crc kubenswrapper[4475]: I1203 06:45:32.482665 4475 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:32Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:32Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 03 06:45:32 crc kubenswrapper[4475]: I1203 06:45:32.482753 4475 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh" (OuterVolumeSpecName: "kube-api-access-xcgwh") pod "fda69060-fa79-4696-b1a6-7980f124bf7c" (UID: "fda69060-fa79-4696-b1a6-7980f124bf7c"). InnerVolumeSpecName "kube-api-access-xcgwh". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 06:45:32 crc kubenswrapper[4475]: I1203 06:45:32.482736 4475 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2" (OuterVolumeSpecName: "kube-api-access-jhbk2") pod "bd23aa5c-e532-4e53-bccf-e79f130c5ae8" (UID: "bd23aa5c-e532-4e53-bccf-e79f130c5ae8"). InnerVolumeSpecName "kube-api-access-jhbk2". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 06:45:32 crc kubenswrapper[4475]: I1203 06:45:32.482788 4475 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7" (OuterVolumeSpecName: "kube-api-access-9xfj7") pod "5225d0e4-402f-4861-b410-819f433b1803" (UID: "5225d0e4-402f-4861-b410-819f433b1803"). InnerVolumeSpecName "kube-api-access-9xfj7". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 06:45:32 crc kubenswrapper[4475]: I1203 06:45:32.482895 4475 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca" (OuterVolumeSpecName: "client-ca") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 06:45:32 crc kubenswrapper[4475]: I1203 06:45:32.482976 4475 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh" (OuterVolumeSpecName: "kube-api-access-x7zkh") pod "6731426b-95fe-49ff-bb5f-40441049fde2" (UID: "6731426b-95fe-49ff-bb5f-40441049fde2"). InnerVolumeSpecName "kube-api-access-x7zkh". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 06:45:32 crc kubenswrapper[4475]: I1203 06:45:32.483003 4475 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert" (OuterVolumeSpecName: "apiservice-cert") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "apiservice-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 06:45:32 crc kubenswrapper[4475]: I1203 06:45:32.483217 4475 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl" (OuterVolumeSpecName: "kube-api-access-xcphl") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "kube-api-access-xcphl". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 06:45:32 crc kubenswrapper[4475]: I1203 06:45:32.483266 4475 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 06:45:32 crc kubenswrapper[4475]: I1203 06:45:32.483419 4475 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key" (OuterVolumeSpecName: "signing-key") pod "25e176fe-21b4-4974-b1ed-c8b94f112a7f" (UID: "25e176fe-21b4-4974-b1ed-c8b94f112a7f"). InnerVolumeSpecName "signing-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 06:45:32 crc kubenswrapper[4475]: I1203 06:45:32.483841 4475 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "trusted-ca-bundle". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 06:45:32 crc kubenswrapper[4475]: I1203 06:45:32.484010 4475 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb" (OuterVolumeSpecName: "kube-api-access-279lb") pod "7bb08738-c794-4ee8-9972-3a62ca171029" (UID: "7bb08738-c794-4ee8-9972-3a62ca171029"). InnerVolumeSpecName "kube-api-access-279lb". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 06:45:32 crc kubenswrapper[4475]: I1203 06:45:32.484299 4475 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls" (OuterVolumeSpecName: "metrics-tls") pod "96b93a3a-6083-4aea-8eab-fe1aa8245ad9" (UID: "96b93a3a-6083-4aea-8eab-fe1aa8245ad9"). InnerVolumeSpecName "metrics-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 06:45:32 crc kubenswrapper[4475]: I1203 06:45:32.484312 4475 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca" (OuterVolumeSpecName: "service-ca") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 06:45:32 crc kubenswrapper[4475]: I1203 06:45:32.484549 4475 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert" (OuterVolumeSpecName: "cert") pod "20b0d48f-5fd6-431c-a545-e3c800c7b866" (UID: "20b0d48f-5fd6-431c-a545-e3c800c7b866"). InnerVolumeSpecName "cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 06:45:32 crc kubenswrapper[4475]: I1203 06:45:32.484626 4475 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "496e6271-fb68-4057-954e-a0d97a4afa3f" (UID: "496e6271-fb68-4057-954e-a0d97a4afa3f"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 06:45:32 crc kubenswrapper[4475]: I1203 06:45:32.484679 4475 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config" (OuterVolumeSpecName: "config") pod "496e6271-fb68-4057-954e-a0d97a4afa3f" (UID: "496e6271-fb68-4057-954e-a0d97a4afa3f"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 06:45:32 crc kubenswrapper[4475]: I1203 06:45:32.484868 4475 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config" (OuterVolumeSpecName: "encryption-config") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "encryption-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 06:45:32 crc kubenswrapper[4475]: I1203 06:45:32.484954 4475 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8" (OuterVolumeSpecName: "kube-api-access-6ccd8") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "kube-api-access-6ccd8". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 06:45:32 crc kubenswrapper[4475]: I1203 06:45:32.485049 4475 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 06:45:32 crc kubenswrapper[4475]: E1203 06:45:32.485163 4475 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-03 06:45:32.985150234 +0000 UTC m=+17.790048568 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 03 06:45:32 crc kubenswrapper[4475]: I1203 06:45:32.485514 4475 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume" (OuterVolumeSpecName: "config-volume") pod "87cf06ed-a83f-41a7-828d-70653580a8cb" (UID: "87cf06ed-a83f-41a7-828d-70653580a8cb"). InnerVolumeSpecName "config-volume". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 06:45:32 crc kubenswrapper[4475]: I1203 06:45:32.485685 4475 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 06:45:32 crc kubenswrapper[4475]: I1203 06:45:32.485783 4475 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates" (OuterVolumeSpecName: "available-featuregates") pod "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" (UID: "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d"). InnerVolumeSpecName "available-featuregates". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 06:45:32 crc kubenswrapper[4475]: I1203 06:45:32.485907 4475 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 06:45:32 crc kubenswrapper[4475]: I1203 06:45:32.485962 4475 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates" (OuterVolumeSpecName: "registry-certificates") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "registry-certificates". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 06:45:32 crc kubenswrapper[4475]: I1203 06:45:32.463279 4475 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 06:45:32 crc kubenswrapper[4475]: I1203 06:45:32.486475 4475 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp" (OuterVolumeSpecName: "kube-api-access-ngvvp") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "kube-api-access-ngvvp". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 06:45:32 crc kubenswrapper[4475]: I1203 06:45:32.486643 4475 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp" (OuterVolumeSpecName: "kube-api-access-qs4fp") pod "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" (UID: "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c"). InnerVolumeSpecName "kube-api-access-qs4fp". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 06:45:32 crc kubenswrapper[4475]: I1203 06:45:32.488075 4475 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j" (OuterVolumeSpecName: "kube-api-access-w7l8j") pod "01ab3dd5-8196-46d0-ad33-122e2ca51def" (UID: "01ab3dd5-8196-46d0-ad33-122e2ca51def"). InnerVolumeSpecName "kube-api-access-w7l8j". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 06:45:32 crc kubenswrapper[4475]: I1203 06:45:32.488348 4475 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "1386a44e-36a2-460c-96d0-0359d2b6f0f5" (UID: "1386a44e-36a2-460c-96d0-0359d2b6f0f5"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 06:45:32 crc kubenswrapper[4475]: I1203 06:45:32.488348 4475 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "fda69060-fa79-4696-b1a6-7980f124bf7c" (UID: "fda69060-fa79-4696-b1a6-7980f124bf7c"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 06:45:32 crc kubenswrapper[4475]: I1203 06:45:32.488364 4475 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc" (OuterVolumeSpecName: "kube-api-access-vt5rc") pod "44663579-783b-4372-86d6-acf235a62d72" (UID: "44663579-783b-4372-86d6-acf235a62d72"). InnerVolumeSpecName "kube-api-access-vt5rc". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 06:45:32 crc kubenswrapper[4475]: I1203 06:45:32.488375 4475 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token" (OuterVolumeSpecName: "node-bootstrap-token") pod "5fe579f8-e8a6-4643-bce5-a661393c4dde" (UID: "5fe579f8-e8a6-4643-bce5-a661393c4dde"). InnerVolumeSpecName "node-bootstrap-token". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 06:45:32 crc kubenswrapper[4475]: I1203 06:45:32.488685 4475 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config" (OuterVolumeSpecName: "config") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 06:45:32 crc kubenswrapper[4475]: I1203 06:45:32.488700 4475 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca" (OuterVolumeSpecName: "etcd-ca") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "etcd-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 06:45:32 crc kubenswrapper[4475]: I1203 06:45:32.489072 4475 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs" (OuterVolumeSpecName: "certs") pod "5fe579f8-e8a6-4643-bce5-a661393c4dde" (UID: "5fe579f8-e8a6-4643-bce5-a661393c4dde"). InnerVolumeSpecName "certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 06:45:32 crc kubenswrapper[4475]: I1203 06:45:32.489159 4475 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config" (OuterVolumeSpecName: "config") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 06:45:32 crc kubenswrapper[4475]: I1203 06:45:32.489176 4475 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error" (OuterVolumeSpecName: "v4-0-config-user-template-error") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-template-error". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 06:45:32 crc kubenswrapper[4475]: I1203 06:45:32.489186 4475 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login" (OuterVolumeSpecName: "v4-0-config-user-template-login") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-template-login". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 06:45:32 crc kubenswrapper[4475]: I1203 06:45:32.489215 4475 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config" (OuterVolumeSpecName: "config") pod "e7e6199b-1264-4501-8953-767f51328d08" (UID: "e7e6199b-1264-4501-8953-767f51328d08"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 06:45:32 crc kubenswrapper[4475]: I1203 06:45:32.489222 4475 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config" (OuterVolumeSpecName: "config") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 06:45:32 crc kubenswrapper[4475]: I1203 06:45:32.489232 4475 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca" (OuterVolumeSpecName: "etcd-serving-ca") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "etcd-serving-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 06:45:32 crc kubenswrapper[4475]: I1203 06:45:32.489420 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-iptables-alerter-script\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Dec 03 06:45:32 crc kubenswrapper[4475]: I1203 06:45:32.489514 4475 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk" (OuterVolumeSpecName: "kube-api-access-rnphk") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "kube-api-access-rnphk". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 06:45:32 crc kubenswrapper[4475]: I1203 06:45:32.489620 4475 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg" (OuterVolumeSpecName: "kube-api-access-dbsvg") pod "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" (UID: "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9"). InnerVolumeSpecName "kube-api-access-dbsvg". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 06:45:32 crc kubenswrapper[4475]: I1203 06:45:32.489663 4475 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca" (OuterVolumeSpecName: "etcd-serving-ca") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "etcd-serving-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 06:45:32 crc kubenswrapper[4475]: I1203 06:45:32.489799 4475 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp" (OuterVolumeSpecName: "kube-api-access-fcqwp") pod "5fe579f8-e8a6-4643-bce5-a661393c4dde" (UID: "5fe579f8-e8a6-4643-bce5-a661393c4dde"). InnerVolumeSpecName "kube-api-access-fcqwp". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 06:45:32 crc kubenswrapper[4475]: I1203 06:45:32.490044 4475 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m" (OuterVolumeSpecName: "kube-api-access-gf66m") pod "a0128f3a-b052-44ed-a84e-c4c8aaf17c13" (UID: "a0128f3a-b052-44ed-a84e-c4c8aaf17c13"). InnerVolumeSpecName "kube-api-access-gf66m". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 06:45:32 crc kubenswrapper[4475]: I1203 06:45:32.490201 4475 swap_util.go:74] "error creating dir to test if tmpfs noswap is enabled. 
Assuming not supported" mount path="" error="stat /var/lib/kubelet/plugins/kubernetes.io/empty-dir: no such file or directory" Dec 03 06:45:32 crc kubenswrapper[4475]: I1203 06:45:32.490365 4475 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf" (OuterVolumeSpecName: "kube-api-access-v47cf") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "kube-api-access-v47cf". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 06:45:32 crc kubenswrapper[4475]: I1203 06:45:32.490617 4475 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "0b78653f-4ff9-4508-8672-245ed9b561e3" (UID: "0b78653f-4ff9-4508-8672-245ed9b561e3"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 06:45:32 crc kubenswrapper[4475]: I1203 06:45:32.490820 4475 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds" (OuterVolumeSpecName: "kube-api-access-w9rds") pod "20b0d48f-5fd6-431c-a545-e3c800c7b866" (UID: "20b0d48f-5fd6-431c-a545-e3c800c7b866"). InnerVolumeSpecName "kube-api-access-w9rds". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 06:45:32 crc kubenswrapper[4475]: I1203 06:45:32.490952 4475 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle" (OuterVolumeSpecName: "service-ca-bundle") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "service-ca-bundle". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 06:45:32 crc kubenswrapper[4475]: I1203 06:45:32.491290 4475 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config" (OuterVolumeSpecName: "config") pod "1386a44e-36a2-460c-96d0-0359d2b6f0f5" (UID: "1386a44e-36a2-460c-96d0-0359d2b6f0f5"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 06:45:32 crc kubenswrapper[4475]: I1203 06:45:32.493082 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-env-overrides\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Dec 03 06:45:32 crc kubenswrapper[4475]: I1203 06:45:32.493615 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-identity-cm\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-ovnkube-identity-cm\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Dec 03 06:45:32 crc kubenswrapper[4475]: I1203 06:45:32.494601 4475 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls" (OuterVolumeSpecName: "image-registry-operator-tls") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "image-registry-operator-tls". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 06:45:32 crc kubenswrapper[4475]: E1203 06:45:32.494590 4475 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Dec 03 06:45:32 crc kubenswrapper[4475]: E1203 06:45:32.494849 4475 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-12-03 06:45:32.994832988 +0000 UTC m=+17.799731322 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Dec 03 06:45:32 crc kubenswrapper[4475]: I1203 06:45:32.494976 4475 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config" (OuterVolumeSpecName: "ovnkube-config") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "ovnkube-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 06:45:32 crc kubenswrapper[4475]: I1203 06:45:32.495048 4475 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" (UID: "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 06:45:32 crc kubenswrapper[4475]: I1203 06:45:32.495123 4475 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5" (OuterVolumeSpecName: "kube-api-access-qg5z5") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "kube-api-access-qg5z5". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 06:45:32 crc kubenswrapper[4475]: I1203 06:45:32.503350 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/ef543e1b-8068-4ea3-b32a-61027b32e95d-webhook-cert\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Dec 03 06:45:32 crc kubenswrapper[4475]: I1203 06:45:32.505352 4475 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca" (OuterVolumeSpecName: "service-ca") pod "0b78653f-4ff9-4508-8672-245ed9b561e3" (UID: "0b78653f-4ff9-4508-8672-245ed9b561e3"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 06:45:32 crc kubenswrapper[4475]: I1203 06:45:32.505717 4475 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "0b574797-001e-440a-8f4e-c0be86edad0f" (UID: "0b574797-001e-440a-8f4e-c0be86edad0f"). InnerVolumeSpecName "proxy-tls". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 06:45:32 crc kubenswrapper[4475]: I1203 06:45:32.506111 4475 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 06:45:32 crc kubenswrapper[4475]: I1203 06:45:32.506334 4475 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:32Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:32Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 03 06:45:32 crc kubenswrapper[4475]: I1203 06:45:32.510221 4475 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" (UID: "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 06:45:32 crc kubenswrapper[4475]: E1203 06:45:32.511166 4475 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Dec 03 06:45:32 crc kubenswrapper[4475]: E1203 06:45:32.511186 4475 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Dec 03 06:45:32 crc kubenswrapper[4475]: E1203 06:45:32.511197 4475 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 03 06:45:32 crc kubenswrapper[4475]: E1203 06:45:32.511249 4475 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-12-03 06:45:33.011236772 +0000 UTC m=+17.816135106 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 03 06:45:32 crc kubenswrapper[4475]: I1203 06:45:32.511276 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/37a5e44f-9a88-4405-be8a-b645485e7312-metrics-tls\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Dec 03 06:45:32 crc kubenswrapper[4475]: E1203 06:45:32.511443 4475 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Dec 03 06:45:32 crc kubenswrapper[4475]: E1203 06:45:32.511490 4475 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Dec 03 06:45:32 crc kubenswrapper[4475]: E1203 06:45:32.511499 4475 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 03 06:45:32 crc kubenswrapper[4475]: E1203 06:45:32.511528 4475 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. 
No retries permitted until 2025-12-03 06:45:33.011519747 +0000 UTC m=+17.816418081 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 03 06:45:32 crc kubenswrapper[4475]: I1203 06:45:32.517334 4475 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted" (OuterVolumeSpecName: "ca-trust-extracted") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "ca-trust-extracted". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 06:45:32 crc kubenswrapper[4475]: I1203 06:45:32.521108 4475 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:32Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:32Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook 
approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 03 06:45:32 crc kubenswrapper[4475]: I1203 06:45:32.521420 4475 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "57a731c4-ef35-47a8-b875-bfb08a7f8011" (UID: "57a731c4-ef35-47a8-b875-bfb08a7f8011"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 06:45:32 crc kubenswrapper[4475]: I1203 06:45:32.521750 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rczfb\" (UniqueName: \"kubernetes.io/projected/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-kube-api-access-rczfb\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Dec 03 06:45:32 crc kubenswrapper[4475]: I1203 06:45:32.521783 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s2kz5\" (UniqueName: \"kubernetes.io/projected/ef543e1b-8068-4ea3-b32a-61027b32e95d-kube-api-access-s2kz5\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Dec 03 06:45:32 crc kubenswrapper[4475]: I1203 06:45:32.522069 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rdwmf\" (UniqueName: \"kubernetes.io/projected/37a5e44f-9a88-4405-be8a-b645485e7312-kube-api-access-rdwmf\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Dec 03 06:45:32 crc kubenswrapper[4475]: I1203 06:45:32.524080 4475 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "5225d0e4-402f-4861-b410-819f433b1803" (UID: "5225d0e4-402f-4861-b410-819f433b1803"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 06:45:32 crc kubenswrapper[4475]: I1203 06:45:32.529830 4475 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:32Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:32Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 03 06:45:32 crc kubenswrapper[4475]: I1203 06:45:32.536524 4475 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:32Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:32Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 03 06:45:32 crc kubenswrapper[4475]: I1203 06:45:32.541583 4475 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 03 06:45:32 crc kubenswrapper[4475]: I1203 06:45:32.542864 4475 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:32Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:32Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: 
connect: connection refused" Dec 03 06:45:32 crc kubenswrapper[4475]: I1203 06:45:32.546246 4475 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 03 06:45:32 crc kubenswrapper[4475]: I1203 06:45:32.549219 4475 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-controller-manager/kube-controller-manager-crc"] Dec 03 06:45:32 crc kubenswrapper[4475]: I1203 06:45:32.549514 4475 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:32Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:32Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 03 06:45:32 crc kubenswrapper[4475]: I1203 06:45:32.554761 4475 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:32Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:32Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 03 06:45:32 crc kubenswrapper[4475]: I1203 06:45:32.561229 4475 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:32Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:32Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 03 06:45:32 crc kubenswrapper[4475]: I1203 06:45:32.563914 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/37a5e44f-9a88-4405-be8a-b645485e7312-host-etc-kube\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Dec 03 06:45:32 crc kubenswrapper[4475]: I1203 06:45:32.563948 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-host-slash\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Dec 03 06:45:32 crc kubenswrapper[4475]: I1203 06:45:32.564019 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/37a5e44f-9a88-4405-be8a-b645485e7312-host-etc-kube\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " 
pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Dec 03 06:45:32 crc kubenswrapper[4475]: I1203 06:45:32.564111 4475 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session\") on node \"crc\" DevicePath \"\"" Dec 03 06:45:32 crc kubenswrapper[4475]: I1203 06:45:32.564128 4475 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pcxfs\" (UniqueName: \"kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs\") on node \"crc\" DevicePath \"\"" Dec 03 06:45:32 crc kubenswrapper[4475]: I1203 06:45:32.564138 4475 reconciler_common.go:293] "Volume detached for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 03 06:45:32 crc kubenswrapper[4475]: I1203 06:45:32.564146 4475 reconciler_common.go:293] "Volume detached for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets\") on node \"crc\" DevicePath \"\"" Dec 03 06:45:32 crc kubenswrapper[4475]: I1203 06:45:32.564154 4475 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w4xd4\" (UniqueName: \"kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4\") on node \"crc\" DevicePath \"\"" Dec 03 06:45:32 crc kubenswrapper[4475]: I1203 06:45:32.564163 4475 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 03 06:45:32 crc kubenswrapper[4475]: I1203 06:45:32.564171 4475 reconciler_common.go:293] "Volume detached for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert\") on node 
\"crc\" DevicePath \"\"" Dec 03 06:45:32 crc kubenswrapper[4475]: I1203 06:45:32.564179 4475 reconciler_common.go:293] "Volume detached for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config\") on node \"crc\" DevicePath \"\"" Dec 03 06:45:32 crc kubenswrapper[4475]: I1203 06:45:32.564186 4475 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca\") on node \"crc\" DevicePath \"\"" Dec 03 06:45:32 crc kubenswrapper[4475]: I1203 06:45:32.564193 4475 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token\") on node \"crc\" DevicePath \"\"" Dec 03 06:45:32 crc kubenswrapper[4475]: I1203 06:45:32.564201 4475 reconciler_common.go:293] "Volume detached for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs\") on node \"crc\" DevicePath \"\"" Dec 03 06:45:32 crc kubenswrapper[4475]: I1203 06:45:32.564208 4475 reconciler_common.go:293] "Volume detached for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs\") on node \"crc\" DevicePath \"\"" Dec 03 06:45:32 crc kubenswrapper[4475]: I1203 06:45:32.564215 4475 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 03 06:45:32 crc kubenswrapper[4475]: I1203 06:45:32.564223 4475 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8tdtz\" (UniqueName: \"kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz\") on node \"crc\" DevicePath \"\"" Dec 03 06:45:32 crc kubenswrapper[4475]: I1203 06:45:32.564226 4475 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-host-slash\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Dec 03 06:45:32 crc kubenswrapper[4475]: I1203 06:45:32.564230 4475 reconciler_common.go:293] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 03 06:45:32 crc kubenswrapper[4475]: I1203 06:45:32.564249 4475 reconciler_common.go:293] "Volume detached for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client\") on node \"crc\" DevicePath \"\"" Dec 03 06:45:32 crc kubenswrapper[4475]: I1203 06:45:32.564260 4475 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mg5zb\" (UniqueName: \"kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb\") on node \"crc\" DevicePath \"\"" Dec 03 06:45:32 crc kubenswrapper[4475]: I1203 06:45:32.564268 4475 reconciler_common.go:293] "Volume detached for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls\") on node \"crc\" DevicePath \"\"" Dec 03 06:45:32 crc kubenswrapper[4475]: I1203 06:45:32.564275 4475 reconciler_common.go:293] "Volume detached for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Dec 03 06:45:32 crc kubenswrapper[4475]: I1203 06:45:32.564283 4475 reconciler_common.go:293] "Volume detached for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls\") on node \"crc\" DevicePath \"\"" Dec 03 06:45:32 crc kubenswrapper[4475]: I1203 
06:45:32.564292 4475 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cfbct\" (UniqueName: \"kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct\") on node \"crc\" DevicePath \"\"" Dec 03 06:45:32 crc kubenswrapper[4475]: I1203 06:45:32.564302 4475 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config\") on node \"crc\" DevicePath \"\"" Dec 03 06:45:32 crc kubenswrapper[4475]: I1203 06:45:32.564310 4475 reconciler_common.go:293] "Volume detached for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config\") on node \"crc\" DevicePath \"\"" Dec 03 06:45:32 crc kubenswrapper[4475]: I1203 06:45:32.564317 4475 reconciler_common.go:293] "Volume detached for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca\") on node \"crc\" DevicePath \"\"" Dec 03 06:45:32 crc kubenswrapper[4475]: I1203 06:45:32.564324 4475 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 03 06:45:32 crc kubenswrapper[4475]: I1203 06:45:32.564332 4475 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kfwg7\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7\") on node \"crc\" DevicePath \"\"" Dec 03 06:45:32 crc kubenswrapper[4475]: I1203 06:45:32.564339 4475 reconciler_common.go:293] "Volume detached for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides\") on node \"crc\" DevicePath \"\"" Dec 03 06:45:32 crc kubenswrapper[4475]: I1203 06:45:32.564347 4475 reconciler_common.go:293] "Volume detached for volume 
\"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection\") on node \"crc\" DevicePath \"\"" Dec 03 06:45:32 crc kubenswrapper[4475]: I1203 06:45:32.564354 4475 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs\") on node \"crc\" DevicePath \"\"" Dec 03 06:45:32 crc kubenswrapper[4475]: I1203 06:45:32.564363 4475 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mnrrd\" (UniqueName: \"kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd\") on node \"crc\" DevicePath \"\"" Dec 03 06:45:32 crc kubenswrapper[4475]: I1203 06:45:32.564371 4475 reconciler_common.go:293] "Volume detached for volume \"images\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images\") on node \"crc\" DevicePath \"\"" Dec 03 06:45:32 crc kubenswrapper[4475]: I1203 06:45:32.564396 4475 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x2m85\" (UniqueName: \"kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85\") on node \"crc\" DevicePath \"\"" Dec 03 06:45:32 crc kubenswrapper[4475]: I1203 06:45:32.564404 4475 reconciler_common.go:293] "Volume detached for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 03 06:45:32 crc kubenswrapper[4475]: I1203 06:45:32.564411 4475 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access\") on node \"crc\" DevicePath \"\"" Dec 03 06:45:32 crc kubenswrapper[4475]: I1203 06:45:32.564419 4475 reconciler_common.go:293] "Volume 
detached for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert\") on node \"crc\" DevicePath \"\"" Dec 03 06:45:32 crc kubenswrapper[4475]: I1203 06:45:32.564427 4475 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zkvpv\" (UniqueName: \"kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv\") on node \"crc\" DevicePath \"\"" Dec 03 06:45:32 crc kubenswrapper[4475]: I1203 06:45:32.564436 4475 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template\") on node \"crc\" DevicePath \"\"" Dec 03 06:45:32 crc kubenswrapper[4475]: I1203 06:45:32.564444 4475 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 03 06:45:32 crc kubenswrapper[4475]: I1203 06:45:32.564467 4475 reconciler_common.go:293] "Volume detached for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca\") on node \"crc\" DevicePath \"\"" Dec 03 06:45:32 crc kubenswrapper[4475]: I1203 06:45:32.564475 4475 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ngvvp\" (UniqueName: \"kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp\") on node \"crc\" DevicePath \"\"" Dec 03 06:45:32 crc kubenswrapper[4475]: I1203 06:45:32.564486 4475 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca\") on node \"crc\" DevicePath \"\"" Dec 03 06:45:32 crc kubenswrapper[4475]: I1203 06:45:32.564493 4475 reconciler_common.go:293] "Volume detached for volume \"service-ca\" (UniqueName: 
\"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca\") on node \"crc\" DevicePath \"\"" Dec 03 06:45:32 crc kubenswrapper[4475]: I1203 06:45:32.564500 4475 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca\") on node \"crc\" DevicePath \"\"" Dec 03 06:45:32 crc kubenswrapper[4475]: I1203 06:45:32.564508 4475 reconciler_common.go:293] "Volume detached for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Dec 03 06:45:32 crc kubenswrapper[4475]: I1203 06:45:32.564515 4475 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error\") on node \"crc\" DevicePath \"\"" Dec 03 06:45:32 crc kubenswrapper[4475]: I1203 06:45:32.564522 4475 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config\") on node \"crc\" DevicePath \"\"" Dec 03 06:45:32 crc kubenswrapper[4475]: I1203 06:45:32.564529 4475 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 03 06:45:32 crc kubenswrapper[4475]: I1203 06:45:32.564536 4475 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 03 06:45:32 crc kubenswrapper[4475]: I1203 06:45:32.564543 4475 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle\") on 
node \"crc\" DevicePath \"\"" Dec 03 06:45:32 crc kubenswrapper[4475]: I1203 06:45:32.564551 4475 reconciler_common.go:293] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls\") on node \"crc\" DevicePath \"\"" Dec 03 06:45:32 crc kubenswrapper[4475]: I1203 06:45:32.564559 4475 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nzwt7\" (UniqueName: \"kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7\") on node \"crc\" DevicePath \"\"" Dec 03 06:45:32 crc kubenswrapper[4475]: I1203 06:45:32.564566 4475 reconciler_common.go:293] "Volume detached for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls\") on node \"crc\" DevicePath \"\"" Dec 03 06:45:32 crc kubenswrapper[4475]: I1203 06:45:32.564574 4475 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d6qdx\" (UniqueName: \"kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx\") on node \"crc\" DevicePath \"\"" Dec 03 06:45:32 crc kubenswrapper[4475]: I1203 06:45:32.564581 4475 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 03 06:45:32 crc kubenswrapper[4475]: I1203 06:45:32.564589 4475 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sb6h7\" (UniqueName: \"kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7\") on node \"crc\" DevicePath \"\"" Dec 03 06:45:32 crc kubenswrapper[4475]: I1203 06:45:32.564597 4475 reconciler_common.go:293] "Volume detached for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config\") on node \"crc\" DevicePath \"\"" Dec 03 06:45:32 crc kubenswrapper[4475]: I1203 06:45:32.564605 4475 
reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 03 06:45:32 crc kubenswrapper[4475]: I1203 06:45:32.564612 4475 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config\") on node \"crc\" DevicePath \"\"" Dec 03 06:45:32 crc kubenswrapper[4475]: I1203 06:45:32.564619 4475 reconciler_common.go:293] "Volume detached for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert\") on node \"crc\" DevicePath \"\"" Dec 03 06:45:32 crc kubenswrapper[4475]: I1203 06:45:32.564626 4475 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 03 06:45:32 crc kubenswrapper[4475]: I1203 06:45:32.564633 4475 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9xfj7\" (UniqueName: \"kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7\") on node \"crc\" DevicePath \"\"" Dec 03 06:45:32 crc kubenswrapper[4475]: I1203 06:45:32.564654 4475 reconciler_common.go:293] "Volume detached for volume \"certs\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs\") on node \"crc\" DevicePath \"\"" Dec 03 06:45:32 crc kubenswrapper[4475]: I1203 06:45:32.564661 4475 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pj782\" (UniqueName: \"kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782\") on node \"crc\" DevicePath \"\"" Dec 03 06:45:32 crc kubenswrapper[4475]: I1203 06:45:32.564669 4475 reconciler_common.go:293] "Volume detached for volume \"apiservice-cert\" (UniqueName: 
\"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert\") on node \"crc\" DevicePath \"\"" Dec 03 06:45:32 crc kubenswrapper[4475]: I1203 06:45:32.564681 4475 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config\") on node \"crc\" DevicePath \"\"" Dec 03 06:45:32 crc kubenswrapper[4475]: I1203 06:45:32.564689 4475 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-s4n52\" (UniqueName: \"kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52\") on node \"crc\" DevicePath \"\"" Dec 03 06:45:32 crc kubenswrapper[4475]: I1203 06:45:32.564703 4475 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume\") on node \"crc\" DevicePath \"\"" Dec 03 06:45:32 crc kubenswrapper[4475]: I1203 06:45:32.564710 4475 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config\") on node \"crc\" DevicePath \"\"" Dec 03 06:45:32 crc kubenswrapper[4475]: I1203 06:45:32.564717 4475 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login\") on node \"crc\" DevicePath \"\"" Dec 03 06:45:32 crc kubenswrapper[4475]: I1203 06:45:32.564725 4475 reconciler_common.go:293] "Volume detached for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates\") on node \"crc\" DevicePath \"\"" Dec 03 06:45:32 crc kubenswrapper[4475]: I1203 06:45:32.564732 4475 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content\") on node \"crc\" 
DevicePath \"\"" Dec 03 06:45:32 crc kubenswrapper[4475]: I1203 06:45:32.564740 4475 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities\") on node \"crc\" DevicePath \"\"" Dec 03 06:45:32 crc kubenswrapper[4475]: I1203 06:45:32.564747 4475 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config\") on node \"crc\" DevicePath \"\"" Dec 03 06:45:32 crc kubenswrapper[4475]: I1203 06:45:32.564755 4475 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 03 06:45:32 crc kubenswrapper[4475]: I1203 06:45:32.564762 4475 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fcqwp\" (UniqueName: \"kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp\") on node \"crc\" DevicePath \"\"" Dec 03 06:45:32 crc kubenswrapper[4475]: I1203 06:45:32.564769 4475 reconciler_common.go:293] "Volume detached for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca\") on node \"crc\" DevicePath \"\"" Dec 03 06:45:32 crc kubenswrapper[4475]: I1203 06:45:32.564777 4475 reconciler_common.go:293] "Volume detached for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca\") on node \"crc\" DevicePath \"\"" Dec 03 06:45:32 crc kubenswrapper[4475]: I1203 06:45:32.564783 4475 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 03 06:45:32 crc kubenswrapper[4475]: I1203 06:45:32.564790 4475 reconciler_common.go:293] "Volume detached for 
volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Dec 03 06:45:32 crc kubenswrapper[4475]: I1203 06:45:32.564799 4475 reconciler_common.go:293] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls\") on node \"crc\" DevicePath \"\"" Dec 03 06:45:32 crc kubenswrapper[4475]: I1203 06:45:32.564805 4475 reconciler_common.go:293] "Volume detached for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth\") on node \"crc\" DevicePath \"\"" Dec 03 06:45:32 crc kubenswrapper[4475]: I1203 06:45:32.564813 4475 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config\") on node \"crc\" DevicePath \"\"" Dec 03 06:45:32 crc kubenswrapper[4475]: I1203 06:45:32.564820 4475 reconciler_common.go:293] "Volume detached for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls\") on node \"crc\" DevicePath \"\"" Dec 03 06:45:32 crc kubenswrapper[4475]: I1203 06:45:32.564827 4475 reconciler_common.go:293] "Volume detached for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate\") on node \"crc\" DevicePath \"\"" Dec 03 06:45:32 crc kubenswrapper[4475]: I1203 06:45:32.564834 4475 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fqsjt\" (UniqueName: \"kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt\") on node \"crc\" DevicePath \"\"" Dec 03 06:45:32 crc kubenswrapper[4475]: I1203 06:45:32.564841 4475 reconciler_common.go:293] "Volume detached for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies\") on node 
\"crc\" DevicePath \"\"" Dec 03 06:45:32 crc kubenswrapper[4475]: I1203 06:45:32.564849 4475 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lzf88\" (UniqueName: \"kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88\") on node \"crc\" DevicePath \"\"" Dec 03 06:45:32 crc kubenswrapper[4475]: I1203 06:45:32.564856 4475 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data\") on node \"crc\" DevicePath \"\"" Dec 03 06:45:32 crc kubenswrapper[4475]: I1203 06:45:32.564864 4475 reconciler_common.go:293] "Volume detached for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle\") on node \"crc\" DevicePath \"\"" Dec 03 06:45:32 crc kubenswrapper[4475]: I1203 06:45:32.564872 4475 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jkwtn\" (UniqueName: \"kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn\") on node \"crc\" DevicePath \"\"" Dec 03 06:45:32 crc kubenswrapper[4475]: I1203 06:45:32.564879 4475 reconciler_common.go:293] "Volume detached for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs\") on node \"crc\" DevicePath \"\"" Dec 03 06:45:32 crc kubenswrapper[4475]: I1203 06:45:32.564886 4475 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pjr6v\" (UniqueName: \"kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v\") on node \"crc\" DevicePath \"\"" Dec 03 06:45:32 crc kubenswrapper[4475]: I1203 06:45:32.564894 4475 reconciler_common.go:293] "Volume detached for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert\") on node \"crc\" DevicePath 
\"\"" Dec 03 06:45:32 crc kubenswrapper[4475]: I1203 06:45:32.564902 4475 reconciler_common.go:293] "Volume detached for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert\") on node \"crc\" DevicePath \"\"" Dec 03 06:45:32 crc kubenswrapper[4475]: I1203 06:45:32.564909 4475 reconciler_common.go:293] "Volume detached for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides\") on node \"crc\" DevicePath \"\"" Dec 03 06:45:32 crc kubenswrapper[4475]: I1203 06:45:32.564916 4475 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca\") on node \"crc\" DevicePath \"\"" Dec 03 06:45:32 crc kubenswrapper[4475]: I1203 06:45:32.564923 4475 reconciler_common.go:293] "Volume detached for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls\") on node \"crc\" DevicePath \"\"" Dec 03 06:45:32 crc kubenswrapper[4475]: I1203 06:45:32.564930 4475 reconciler_common.go:293] "Volume detached for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls\") on node \"crc\" DevicePath \"\"" Dec 03 06:45:32 crc kubenswrapper[4475]: I1203 06:45:32.564939 4475 reconciler_common.go:293] "Volume detached for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics\") on node \"crc\" DevicePath \"\"" Dec 03 06:45:32 crc kubenswrapper[4475]: I1203 06:45:32.564946 4475 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access\") on node \"crc\" DevicePath \"\"" Dec 03 06:45:32 crc kubenswrapper[4475]: I1203 06:45:32.564957 4475 reconciler_common.go:293] "Volume 
detached for volume \"images\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images\") on node \"crc\" DevicePath \"\"" Dec 03 06:45:32 crc kubenswrapper[4475]: I1203 06:45:32.564964 4475 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token\") on node \"crc\" DevicePath \"\"" Dec 03 06:45:32 crc kubenswrapper[4475]: I1203 06:45:32.564971 4475 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities\") on node \"crc\" DevicePath \"\"" Dec 03 06:45:32 crc kubenswrapper[4475]: I1203 06:45:32.564978 4475 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 03 06:45:32 crc kubenswrapper[4475]: I1203 06:45:32.564985 4475 reconciler_common.go:293] "Volume detached for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert\") on node \"crc\" DevicePath \"\"" Dec 03 06:45:32 crc kubenswrapper[4475]: I1203 06:45:32.564992 4475 reconciler_common.go:293] "Volume detached for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit\") on node \"crc\" DevicePath \"\"" Dec 03 06:45:32 crc kubenswrapper[4475]: I1203 06:45:32.564998 4475 reconciler_common.go:293] "Volume detached for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Dec 03 06:45:32 crc kubenswrapper[4475]: I1203 06:45:32.565006 4475 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-service-ca\" (UniqueName: 
\"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca\") on node \"crc\" DevicePath \"\"" Dec 03 06:45:32 crc kubenswrapper[4475]: I1203 06:45:32.565013 4475 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities\") on node \"crc\" DevicePath \"\"" Dec 03 06:45:32 crc kubenswrapper[4475]: I1203 06:45:32.565020 4475 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 03 06:45:32 crc kubenswrapper[4475]: I1203 06:45:32.565026 4475 reconciler_common.go:293] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 03 06:45:32 crc kubenswrapper[4475]: I1203 06:45:32.565033 4475 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Dec 03 06:45:32 crc kubenswrapper[4475]: I1203 06:45:32.565041 4475 reconciler_common.go:293] "Volume detached for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib\") on node \"crc\" DevicePath \"\"" Dec 03 06:45:32 crc kubenswrapper[4475]: I1203 06:45:32.565048 4475 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-249nr\" (UniqueName: \"kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr\") on node \"crc\" DevicePath \"\"" Dec 03 06:45:32 crc kubenswrapper[4475]: I1203 06:45:32.565055 4475 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wxkg8\" (UniqueName: 
\"kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8\") on node \"crc\" DevicePath \"\"" Dec 03 06:45:32 crc kubenswrapper[4475]: I1203 06:45:32.565063 4475 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x7zkh\" (UniqueName: \"kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh\") on node \"crc\" DevicePath \"\"" Dec 03 06:45:32 crc kubenswrapper[4475]: I1203 06:45:32.565070 4475 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access\") on node \"crc\" DevicePath \"\"" Dec 03 06:45:32 crc kubenswrapper[4475]: I1203 06:45:32.565076 4475 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config\") on node \"crc\" DevicePath \"\"" Dec 03 06:45:32 crc kubenswrapper[4475]: I1203 06:45:32.565091 4475 reconciler_common.go:293] "Volume detached for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token\") on node \"crc\" DevicePath \"\"" Dec 03 06:45:32 crc kubenswrapper[4475]: I1203 06:45:32.565099 4475 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xcgwh\" (UniqueName: \"kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh\") on node \"crc\" DevicePath \"\"" Dec 03 06:45:32 crc kubenswrapper[4475]: I1203 06:45:32.565106 4475 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xcphl\" (UniqueName: \"kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl\") on node \"crc\" DevicePath \"\"" Dec 03 06:45:32 crc kubenswrapper[4475]: I1203 06:45:32.565113 4475 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-279lb\" (UniqueName: 
\"kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb\") on node \"crc\" DevicePath \"\"" Dec 03 06:45:32 crc kubenswrapper[4475]: I1203 06:45:32.565120 4475 reconciler_common.go:293] "Volume detached for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key\") on node \"crc\" DevicePath \"\"" Dec 03 06:45:32 crc kubenswrapper[4475]: I1203 06:45:32.565133 4475 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config\") on node \"crc\" DevicePath \"\"" Dec 03 06:45:32 crc kubenswrapper[4475]: I1203 06:45:32.565148 4475 reconciler_common.go:293] "Volume detached for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls\") on node \"crc\" DevicePath \"\"" Dec 03 06:45:32 crc kubenswrapper[4475]: I1203 06:45:32.565155 4475 reconciler_common.go:293] "Volume detached for volume \"cert\" (UniqueName: \"kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert\") on node \"crc\" DevicePath \"\"" Dec 03 06:45:32 crc kubenswrapper[4475]: I1203 06:45:32.565162 4475 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6ccd8\" (UniqueName: \"kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8\") on node \"crc\" DevicePath \"\"" Dec 03 06:45:32 crc kubenswrapper[4475]: I1203 06:45:32.565169 4475 reconciler_common.go:293] "Volume detached for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client\") on node \"crc\" DevicePath \"\"" Dec 03 06:45:32 crc kubenswrapper[4475]: I1203 06:45:32.565177 4475 reconciler_common.go:293] "Volume detached for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls\") on node \"crc\" DevicePath \"\"" Dec 03 06:45:32 crc 
kubenswrapper[4475]: I1203 06:45:32.565184 4475 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gf66m\" (UniqueName: \"kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m\") on node \"crc\" DevicePath \"\"" Dec 03 06:45:32 crc kubenswrapper[4475]: I1203 06:45:32.565192 4475 reconciler_common.go:293] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca\") on node \"crc\" DevicePath \"\"" Dec 03 06:45:32 crc kubenswrapper[4475]: I1203 06:45:32.565199 4475 reconciler_common.go:293] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls\") on node \"crc\" DevicePath \"\"" Dec 03 06:45:32 crc kubenswrapper[4475]: I1203 06:45:32.565206 4475 reconciler_common.go:293] "Volume detached for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls\") on node \"crc\" DevicePath \"\"" Dec 03 06:45:32 crc kubenswrapper[4475]: I1203 06:45:32.565215 4475 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-htfz6\" (UniqueName: \"kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6\") on node \"crc\" DevicePath \"\"" Dec 03 06:45:32 crc kubenswrapper[4475]: I1203 06:45:32.565222 4475 reconciler_common.go:293] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config\") on node \"crc\" DevicePath \"\"" Dec 03 06:45:32 crc kubenswrapper[4475]: I1203 06:45:32.565229 4475 reconciler_common.go:293] "Volume detached for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy\") on node \"crc\" DevicePath \"\"" Dec 03 06:45:32 crc kubenswrapper[4475]: I1203 06:45:32.565237 4475 reconciler_common.go:293] "Volume detached for 
volume \"kube-api-access-2d4wz\" (UniqueName: \"kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz\") on node \"crc\" DevicePath \"\"" Dec 03 06:45:32 crc kubenswrapper[4475]: I1203 06:45:32.565243 4475 reconciler_common.go:293] "Volume detached for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs\") on node \"crc\" DevicePath \"\"" Dec 03 06:45:32 crc kubenswrapper[4475]: I1203 06:45:32.565250 4475 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca\") on node \"crc\" DevicePath \"\"" Dec 03 06:45:32 crc kubenswrapper[4475]: I1203 06:45:32.565257 4475 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 03 06:45:32 crc kubenswrapper[4475]: I1203 06:45:32.565264 4475 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config\") on node \"crc\" DevicePath \"\"" Dec 03 06:45:32 crc kubenswrapper[4475]: I1203 06:45:32.565274 4475 reconciler_common.go:293] "Volume detached for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy\") on node \"crc\" DevicePath \"\"" Dec 03 06:45:32 crc kubenswrapper[4475]: I1203 06:45:32.565282 4475 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config\") on node \"crc\" DevicePath \"\"" Dec 03 06:45:32 crc kubenswrapper[4475]: I1203 06:45:32.565290 4475 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d4lsv\" (UniqueName: \"kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv\") on node \"crc\" 
DevicePath \"\"" Dec 03 06:45:32 crc kubenswrapper[4475]: I1203 06:45:32.565297 4475 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 03 06:45:32 crc kubenswrapper[4475]: I1203 06:45:32.565304 4475 reconciler_common.go:293] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config\") on node \"crc\" DevicePath \"\"" Dec 03 06:45:32 crc kubenswrapper[4475]: I1203 06:45:32.565311 4475 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 03 06:45:32 crc kubenswrapper[4475]: I1203 06:45:32.565319 4475 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jhbk2\" (UniqueName: \"kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2\") on node \"crc\" DevicePath \"\"" Dec 03 06:45:32 crc kubenswrapper[4475]: I1203 06:45:32.565327 4475 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vt5rc\" (UniqueName: \"kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc\") on node \"crc\" DevicePath \"\"" Dec 03 06:45:32 crc kubenswrapper[4475]: I1203 06:45:32.565335 4475 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rnphk\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk\") on node \"crc\" DevicePath \"\"" Dec 03 06:45:32 crc kubenswrapper[4475]: I1203 06:45:32.565342 4475 reconciler_common.go:293] "Volume detached for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates\") on node \"crc\" DevicePath \"\"" Dec 03 06:45:32 crc kubenswrapper[4475]: 
I1203 06:45:32.565350 4475 reconciler_common.go:293] "Volume detached for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca\") on node \"crc\" DevicePath \"\"" Dec 03 06:45:32 crc kubenswrapper[4475]: I1203 06:45:32.565356 4475 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 03 06:45:32 crc kubenswrapper[4475]: I1203 06:45:32.565364 4475 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qs4fp\" (UniqueName: \"kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp\") on node \"crc\" DevicePath \"\"" Dec 03 06:45:32 crc kubenswrapper[4475]: I1203 06:45:32.565371 4475 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 03 06:45:32 crc kubenswrapper[4475]: I1203 06:45:32.565378 4475 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config\") on node \"crc\" DevicePath \"\"" Dec 03 06:45:32 crc kubenswrapper[4475]: I1203 06:45:32.565385 4475 reconciler_common.go:293] "Volume detached for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config\") on node \"crc\" DevicePath \"\"" Dec 03 06:45:32 crc kubenswrapper[4475]: I1203 06:45:32.565392 4475 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dbsvg\" (UniqueName: \"kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg\") on node \"crc\" DevicePath \"\"" Dec 03 06:45:32 crc kubenswrapper[4475]: I1203 06:45:32.565400 4475 reconciler_common.go:293] "Volume detached for volume \"service-ca-bundle\" (UniqueName: 
\"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 03 06:45:32 crc kubenswrapper[4475]: I1203 06:45:32.565408 4475 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w7l8j\" (UniqueName: \"kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j\") on node \"crc\" DevicePath \"\"" Dec 03 06:45:32 crc kubenswrapper[4475]: I1203 06:45:32.565416 4475 reconciler_common.go:293] "Volume detached for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca\") on node \"crc\" DevicePath \"\"" Dec 03 06:45:32 crc kubenswrapper[4475]: I1203 06:45:32.565422 4475 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qg5z5\" (UniqueName: \"kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5\") on node \"crc\" DevicePath \"\"" Dec 03 06:45:32 crc kubenswrapper[4475]: I1203 06:45:32.565429 4475 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 03 06:45:32 crc kubenswrapper[4475]: I1203 06:45:32.565437 4475 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config\") on node \"crc\" DevicePath \"\"" Dec 03 06:45:32 crc kubenswrapper[4475]: I1203 06:45:32.565444 4475 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 03 06:45:32 crc kubenswrapper[4475]: I1203 06:45:32.565470 4475 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-v47cf\" (UniqueName: \"kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf\") on node \"crc\" DevicePath \"\"" Dec 03 
06:45:32 crc kubenswrapper[4475]: I1203 06:45:32.565478 4475 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access\") on node \"crc\" DevicePath \"\"" Dec 03 06:45:32 crc kubenswrapper[4475]: I1203 06:45:32.565485 4475 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w9rds\" (UniqueName: \"kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds\") on node \"crc\" DevicePath \"\"" Dec 03 06:45:32 crc kubenswrapper[4475]: I1203 06:45:32.565492 4475 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tk88c\" (UniqueName: \"kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c\") on node \"crc\" DevicePath \"\"" Dec 03 06:45:32 crc kubenswrapper[4475]: I1203 06:45:32.565499 4475 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config\") on node \"crc\" DevicePath \"\"" Dec 03 06:45:32 crc kubenswrapper[4475]: I1203 06:45:32.565506 4475 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 03 06:45:32 crc kubenswrapper[4475]: I1203 06:45:32.565513 4475 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7c4vf\" (UniqueName: \"kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf\") on node \"crc\" DevicePath \"\"" Dec 03 06:45:32 crc kubenswrapper[4475]: I1203 06:45:32.565520 4475 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bf2bz\" (UniqueName: \"kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz\") on node \"crc\" DevicePath \"\"" Dec 03 06:45:32 crc kubenswrapper[4475]: I1203 06:45:32.565529 4475 reconciler_common.go:293] 
"Volume detached for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca\") on node \"crc\" DevicePath \"\"" Dec 03 06:45:32 crc kubenswrapper[4475]: I1203 06:45:32.565537 4475 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4d4hj\" (UniqueName: \"kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj\") on node \"crc\" DevicePath \"\"" Dec 03 06:45:32 crc kubenswrapper[4475]: I1203 06:45:32.565543 4475 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token\") on node \"crc\" DevicePath \"\"" Dec 03 06:45:32 crc kubenswrapper[4475]: I1203 06:45:32.565550 4475 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 03 06:45:32 crc kubenswrapper[4475]: I1203 06:45:32.565558 4475 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 03 06:45:32 crc kubenswrapper[4475]: I1203 06:45:32.565566 4475 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig\") on node \"crc\" DevicePath \"\"" Dec 03 06:45:32 crc kubenswrapper[4475]: I1203 06:45:32.565574 4475 reconciler_common.go:293] "Volume detached for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted\") on node \"crc\" DevicePath \"\"" Dec 03 06:45:32 crc kubenswrapper[4475]: I1203 06:45:32.565581 4475 reconciler_common.go:293] "Volume detached for 
volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 03 06:45:32 crc kubenswrapper[4475]: I1203 06:45:32.565588 4475 reconciler_common.go:293] "Volume detached for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert\") on node \"crc\" DevicePath \"\"" Dec 03 06:45:32 crc kubenswrapper[4475]: I1203 06:45:32.565596 4475 reconciler_common.go:293] "Volume detached for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client\") on node \"crc\" DevicePath \"\"" Dec 03 06:45:32 crc kubenswrapper[4475]: I1203 06:45:32.565603 4475 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca\") on node \"crc\" DevicePath \"\"" Dec 03 06:45:32 crc kubenswrapper[4475]: I1203 06:45:32.565609 4475 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lz9wn\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn\") on node \"crc\" DevicePath \"\"" Dec 03 06:45:32 crc kubenswrapper[4475]: I1203 06:45:32.565616 4475 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x4zgh\" (UniqueName: \"kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh\") on node \"crc\" DevicePath \"\"" Dec 03 06:45:32 crc kubenswrapper[4475]: I1203 06:45:32.565623 4475 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config\") on node \"crc\" DevicePath \"\"" Dec 03 06:45:32 crc kubenswrapper[4475]: I1203 06:45:32.567242 4475 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:32Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:32Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 03 06:45:32 crc kubenswrapper[4475]: I1203 06:45:32.574015 4475 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a0af3d80-5aae-4d3b-a974-490687df49f3\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fa848c68a20d5db5c603cafa808518de84e427cbeea4bbc1be31151e6f839b3e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:45:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fe3e0d5fed18fddd7a1174f7a9f12290ce318e9a0de40fe432c79f6f2e24a608\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:45:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://43c05977da6544bc781a279fcddb3279dfee510fdd0a6f4f1a22b8629f17475f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:45:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ef987b2e9a0fa630edf6d5c06d5f47c5debd1b75d4626aefe7d8ef44bb974eb8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-12-03T06:45:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T06:45:15Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 03 06:45:32 crc kubenswrapper[4475]: I1203 06:45:32.581067 4475 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:32Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:32Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 03 06:45:32 crc kubenswrapper[4475]: I1203 06:45:32.587404 4475 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:32Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:32Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 03 06:45:32 crc kubenswrapper[4475]: I1203 06:45:32.594142 4475 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:32Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:32Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook 
approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 03 06:45:32 crc kubenswrapper[4475]: I1203 06:45:32.605438 4475 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:32Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:32Z\\\",\\\"message\\\":\\\"containers with unready status: 
[iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 03 06:45:32 crc kubenswrapper[4475]: I1203 06:45:32.612115 4475 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:32Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:32Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 03 06:45:32 crc kubenswrapper[4475]: I1203 06:45:32.650358 4475 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/kube-apiserver-crc"] Dec 03 06:45:32 crc kubenswrapper[4475]: I1203 06:45:32.776396 4475 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Dec 03 06:45:32 crc kubenswrapper[4475]: I1203 06:45:32.785985 4475 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 03 06:45:32 crc kubenswrapper[4475]: W1203 06:45:32.788700 4475 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod37a5e44f_9a88_4405_be8a_b645485e7312.slice/crio-6f6f0ad7a419ac5bb57a732f9bfeec15f3c2c9bc08a46457acefd5d4f582577d WatchSource:0}: Error finding container 6f6f0ad7a419ac5bb57a732f9bfeec15f3c2c9bc08a46457acefd5d4f582577d: Status 404 returned error can't find the container with id 6f6f0ad7a419ac5bb57a732f9bfeec15f3c2c9bc08a46457acefd5d4f582577d Dec 03 06:45:32 crc kubenswrapper[4475]: I1203 06:45:32.793002 4475 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-node-identity/network-node-identity-vrzqb" Dec 03 06:45:32 crc kubenswrapper[4475]: I1203 06:45:32.796833 4475 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"57d2f580-9528-4200-b0a4-797fed1ae972\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://822cdbfb2e81d80c5de0253daa42f2a5c89e9cd0eb8a5c3cf620780d17f9a6d0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:45:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d66a9136874b2e25c94cd291aa6d7f4694ac409f16766fd69c8aab8068a441fb\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:45:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e40c4f29925f494c0f5f01e2ecbcd2e4db2a5f3911a55a874c6d0006f01982de\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:45:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7fc0ee9e5a408a0a9e701afaf1db7bc3f58fd1830044730e9c680664642b5e4e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:45:1
7Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1dd8bd42f01469966b55416fc8af1dd71d341c774263bb3a56190af4cd9e7daa\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:45:17Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5da1155d7b5e933e5db3acc4c1a3fa1b3b90fd79289641f9a3d1290956128628\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5da1155d7b5e933e5db3acc4c1a3fa1b3b90fd79289641f9a3d1290956128628\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T06:45:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T06:45:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startT
ime\\\":\\\"2025-12-03T06:45:15Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 03 06:45:32 crc kubenswrapper[4475]: W1203 06:45:32.801348 4475 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podef543e1b_8068_4ea3_b32a_61027b32e95d.slice/crio-874a15a7252fa5575d66800baef4f549991c11a27ee5a446b14da773c55fd64c WatchSource:0}: Error finding container 874a15a7252fa5575d66800baef4f549991c11a27ee5a446b14da773c55fd64c: Status 404 returned error can't find the container with id 874a15a7252fa5575d66800baef4f549991c11a27ee5a446b14da773c55fd64c Dec 03 06:45:32 crc kubenswrapper[4475]: I1203 06:45:32.807478 4475 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-4ln5h" Dec 03 06:45:32 crc kubenswrapper[4475]: I1203 06:45:32.807688 4475 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a0af3d80-5aae-4d3b-a974-490687df49f3\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fa848c68a20d5db5c603cafa808518de84e427cbeea4bbc1be31151e6f839b3e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:45:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fe3e0d5fed18fddd7a1174f7a9f12290ce318e9a0de40fe432c79f6f2e24a608\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:45:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://43c05977da6544bc781a279fcddb3279dfee510fdd0a6f4f1a22b8629f17475f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:45:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ef987b2e9a0fa630edf6d5c06d5f47c5debd1b75d4626aefe7d8ef44bb974eb8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-12-03T06:45:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T06:45:15Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 03 06:45:32 crc kubenswrapper[4475]: I1203 06:45:32.827306 4475 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:32Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:32Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 03 06:45:32 crc kubenswrapper[4475]: W1203 06:45:32.834258 4475 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd75a4c96_2883_4a0b_bab2_0fab2b6c0b49.slice/crio-68fdb0d6df9cca11128104901c82cb70a69c6bbd845fe0f19cd78eb280d109b8 WatchSource:0}: Error finding container 68fdb0d6df9cca11128104901c82cb70a69c6bbd845fe0f19cd78eb280d109b8: Status 404 returned error can't find the container with id 
68fdb0d6df9cca11128104901c82cb70a69c6bbd845fe0f19cd78eb280d109b8 Dec 03 06:45:32 crc kubenswrapper[4475]: I1203 06:45:32.850713 4475 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:32Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:32Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 03 06:45:32 crc kubenswrapper[4475]: I1203 06:45:32.861155 4475 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:32Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:32Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook 
approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 03 06:45:32 crc kubenswrapper[4475]: I1203 06:45:32.886260 4475 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:32Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:32Z\\\",\\\"message\\\":\\\"containers with unready status: 
[iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 03 06:45:32 crc kubenswrapper[4475]: I1203 06:45:32.906849 4475 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:32Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:32Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 03 06:45:32 crc kubenswrapper[4475]: I1203 06:45:32.918353 4475 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:32Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:32Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 03 06:45:33 crc kubenswrapper[4475]: I1203 06:45:33.067632 4475 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 03 06:45:33 crc kubenswrapper[4475]: I1203 06:45:33.067699 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: 
\"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 03 06:45:33 crc kubenswrapper[4475]: I1203 06:45:33.067720 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 03 06:45:33 crc kubenswrapper[4475]: I1203 06:45:33.067736 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 03 06:45:33 crc kubenswrapper[4475]: I1203 06:45:33.067754 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 03 06:45:33 crc kubenswrapper[4475]: E1203 06:45:33.067780 4475 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-03 06:45:34.067755221 +0000 UTC m=+18.872653565 (durationBeforeRetry 1s). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 03 06:45:33 crc kubenswrapper[4475]: E1203 06:45:33.067828 4475 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Dec 03 06:45:33 crc kubenswrapper[4475]: E1203 06:45:33.067841 4475 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Dec 03 06:45:33 crc kubenswrapper[4475]: E1203 06:45:33.067900 4475 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Dec 03 06:45:33 crc kubenswrapper[4475]: E1203 06:45:33.067856 4475 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-12-03 06:45:34.067849058 +0000 UTC m=+18.872747392 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Dec 03 06:45:33 crc kubenswrapper[4475]: E1203 06:45:33.067913 4475 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Dec 03 06:45:33 crc kubenswrapper[4475]: E1203 06:45:33.067925 4475 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 03 06:45:33 crc kubenswrapper[4475]: E1203 06:45:33.067934 4475 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-12-03 06:45:34.067917488 +0000 UTC m=+18.872815832 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Dec 03 06:45:33 crc kubenswrapper[4475]: E1203 06:45:33.067859 4475 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Dec 03 06:45:33 crc kubenswrapper[4475]: E1203 06:45:33.067952 4475 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-12-03 06:45:34.067943908 +0000 UTC m=+18.872842252 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 03 06:45:33 crc kubenswrapper[4475]: E1203 06:45:33.067959 4475 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Dec 03 06:45:33 crc kubenswrapper[4475]: E1203 06:45:33.067973 4475 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 03 06:45:33 crc 
kubenswrapper[4475]: E1203 06:45:33.068012 4475 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-12-03 06:45:34.068000454 +0000 UTC m=+18.872898788 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 03 06:45:33 crc kubenswrapper[4475]: I1203 06:45:33.493498 4475 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="01ab3dd5-8196-46d0-ad33-122e2ca51def" path="/var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes" Dec 03 06:45:33 crc kubenswrapper[4475]: I1203 06:45:33.493932 4475 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" path="/var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes" Dec 03 06:45:33 crc kubenswrapper[4475]: I1203 06:45:33.494563 4475 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="09efc573-dbb6-4249-bd59-9b87aba8dd28" path="/var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes" Dec 03 06:45:33 crc kubenswrapper[4475]: I1203 06:45:33.495077 4475 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0b574797-001e-440a-8f4e-c0be86edad0f" path="/var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes" Dec 03 06:45:33 crc kubenswrapper[4475]: I1203 06:45:33.495581 4475 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0b78653f-4ff9-4508-8672-245ed9b561e3" 
path="/var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes" Dec 03 06:45:33 crc kubenswrapper[4475]: I1203 06:45:33.495987 4475 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1386a44e-36a2-460c-96d0-0359d2b6f0f5" path="/var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes" Dec 03 06:45:33 crc kubenswrapper[4475]: I1203 06:45:33.496506 4475 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1bf7eb37-55a3-4c65-b768-a94c82151e69" path="/var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes" Dec 03 06:45:33 crc kubenswrapper[4475]: I1203 06:45:33.496957 4475 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1d611f23-29be-4491-8495-bee1670e935f" path="/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes" Dec 03 06:45:33 crc kubenswrapper[4475]: I1203 06:45:33.497498 4475 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="20b0d48f-5fd6-431c-a545-e3c800c7b866" path="/var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/volumes" Dec 03 06:45:33 crc kubenswrapper[4475]: I1203 06:45:33.497930 4475 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" path="/var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes" Dec 03 06:45:33 crc kubenswrapper[4475]: I1203 06:45:33.498358 4475 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="22c825df-677d-4ca6-82db-3454ed06e783" path="/var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes" Dec 03 06:45:33 crc kubenswrapper[4475]: I1203 06:45:33.500012 4475 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="25e176fe-21b4-4974-b1ed-c8b94f112a7f" path="/var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes" Dec 03 06:45:33 crc kubenswrapper[4475]: I1203 06:45:33.500522 4475 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" 
path="/var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes" Dec 03 06:45:33 crc kubenswrapper[4475]: I1203 06:45:33.500974 4475 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="31d8b7a1-420e-4252-a5b7-eebe8a111292" path="/var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes" Dec 03 06:45:33 crc kubenswrapper[4475]: I1203 06:45:33.501427 4475 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3ab1a177-2de0-46d9-b765-d0d0649bb42e" path="/var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/volumes" Dec 03 06:45:33 crc kubenswrapper[4475]: I1203 06:45:33.501883 4475 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" path="/var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes" Dec 03 06:45:33 crc kubenswrapper[4475]: I1203 06:45:33.502383 4475 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="43509403-f426-496e-be36-56cef71462f5" path="/var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes" Dec 03 06:45:33 crc kubenswrapper[4475]: I1203 06:45:33.502748 4475 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="44663579-783b-4372-86d6-acf235a62d72" path="/var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/volumes" Dec 03 06:45:33 crc kubenswrapper[4475]: I1203 06:45:33.503215 4475 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="496e6271-fb68-4057-954e-a0d97a4afa3f" path="/var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes" Dec 03 06:45:33 crc kubenswrapper[4475]: I1203 06:45:33.503724 4475 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" path="/var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes" Dec 03 06:45:33 crc kubenswrapper[4475]: I1203 06:45:33.504154 4475 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="49ef4625-1d3a-4a9f-b595-c2433d32326d" 
path="/var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/volumes" Dec 03 06:45:33 crc kubenswrapper[4475]: I1203 06:45:33.504688 4475 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4bb40260-dbaa-4fb0-84df-5e680505d512" path="/var/lib/kubelet/pods/4bb40260-dbaa-4fb0-84df-5e680505d512/volumes" Dec 03 06:45:33 crc kubenswrapper[4475]: I1203 06:45:33.505055 4475 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5225d0e4-402f-4861-b410-819f433b1803" path="/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes" Dec 03 06:45:33 crc kubenswrapper[4475]: I1203 06:45:33.507492 4475 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5441d097-087c-4d9a-baa8-b210afa90fc9" path="/var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes" Dec 03 06:45:33 crc kubenswrapper[4475]: I1203 06:45:33.507853 4475 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="57a731c4-ef35-47a8-b875-bfb08a7f8011" path="/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes" Dec 03 06:45:33 crc kubenswrapper[4475]: I1203 06:45:33.508773 4475 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5b88f790-22fa-440e-b583-365168c0b23d" path="/var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/volumes" Dec 03 06:45:33 crc kubenswrapper[4475]: I1203 06:45:33.509323 4475 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5fe579f8-e8a6-4643-bce5-a661393c4dde" path="/var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/volumes" Dec 03 06:45:33 crc kubenswrapper[4475]: I1203 06:45:33.510297 4475 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6402fda4-df10-493c-b4e5-d0569419652d" path="/var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes" Dec 03 06:45:33 crc kubenswrapper[4475]: I1203 06:45:33.511193 4475 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6509e943-70c6-444c-bc41-48a544e36fbd" 
path="/var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes" Dec 03 06:45:33 crc kubenswrapper[4475]: I1203 06:45:33.511604 4475 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6731426b-95fe-49ff-bb5f-40441049fde2" path="/var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/volumes" Dec 03 06:45:33 crc kubenswrapper[4475]: I1203 06:45:33.512322 4475 kubelet_volumes.go:152] "Cleaned up orphaned volume subpath from pod" podUID="6ea678ab-3438-413e-bfe3-290ae7725660" path="/var/lib/kubelet/pods/6ea678ab-3438-413e-bfe3-290ae7725660/volume-subpaths/run-systemd/ovnkube-controller/6" Dec 03 06:45:33 crc kubenswrapper[4475]: I1203 06:45:33.512416 4475 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6ea678ab-3438-413e-bfe3-290ae7725660" path="/var/lib/kubelet/pods/6ea678ab-3438-413e-bfe3-290ae7725660/volumes" Dec 03 06:45:33 crc kubenswrapper[4475]: I1203 06:45:33.513805 4475 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7539238d-5fe0-46ed-884e-1c3b566537ec" path="/var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes" Dec 03 06:45:33 crc kubenswrapper[4475]: I1203 06:45:33.514832 4475 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7583ce53-e0fe-4a16-9e4d-50516596a136" path="/var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes" Dec 03 06:45:33 crc kubenswrapper[4475]: I1203 06:45:33.515223 4475 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7bb08738-c794-4ee8-9972-3a62ca171029" path="/var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes" Dec 03 06:45:33 crc kubenswrapper[4475]: I1203 06:45:33.516710 4475 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="87cf06ed-a83f-41a7-828d-70653580a8cb" path="/var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes" Dec 03 06:45:33 crc kubenswrapper[4475]: I1203 06:45:33.517321 4475 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes 
dir" podUID="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" path="/var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes" Dec 03 06:45:33 crc kubenswrapper[4475]: I1203 06:45:33.518128 4475 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="925f1c65-6136-48ba-85aa-3a3b50560753" path="/var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes" Dec 03 06:45:33 crc kubenswrapper[4475]: I1203 06:45:33.518721 4475 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="96b93a3a-6083-4aea-8eab-fe1aa8245ad9" path="/var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/volumes" Dec 03 06:45:33 crc kubenswrapper[4475]: I1203 06:45:33.519629 4475 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9d4552c7-cd75-42dd-8880-30dd377c49a4" path="/var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes" Dec 03 06:45:33 crc kubenswrapper[4475]: I1203 06:45:33.520161 4475 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a0128f3a-b052-44ed-a84e-c4c8aaf17c13" path="/var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/volumes" Dec 03 06:45:33 crc kubenswrapper[4475]: I1203 06:45:33.521160 4475 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a31745f5-9847-4afe-82a5-3161cc66ca93" path="/var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes" Dec 03 06:45:33 crc kubenswrapper[4475]: I1203 06:45:33.522035 4475 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" path="/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes" Dec 03 06:45:33 crc kubenswrapper[4475]: I1203 06:45:33.522735 4475 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b6312bbd-5731-4ea0-a20f-81d5a57df44a" path="/var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/volumes" Dec 03 06:45:33 crc kubenswrapper[4475]: I1203 06:45:33.523235 4475 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes 
dir" podUID="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" path="/var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes" Dec 03 06:45:33 crc kubenswrapper[4475]: I1203 06:45:33.524010 4475 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" path="/var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes" Dec 03 06:45:33 crc kubenswrapper[4475]: I1203 06:45:33.524821 4475 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bd23aa5c-e532-4e53-bccf-e79f130c5ae8" path="/var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/volumes" Dec 03 06:45:33 crc kubenswrapper[4475]: I1203 06:45:33.525640 4475 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bf126b07-da06-4140-9a57-dfd54fc6b486" path="/var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes" Dec 03 06:45:33 crc kubenswrapper[4475]: I1203 06:45:33.526210 4475 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c03ee662-fb2f-4fc4-a2c1-af487c19d254" path="/var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes" Dec 03 06:45:33 crc kubenswrapper[4475]: I1203 06:45:33.527030 4475 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d" path="/var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/volumes" Dec 03 06:45:33 crc kubenswrapper[4475]: I1203 06:45:33.527555 4475 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e7e6199b-1264-4501-8953-767f51328d08" path="/var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes" Dec 03 06:45:33 crc kubenswrapper[4475]: I1203 06:45:33.528373 4475 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="efdd0498-1daa-4136-9a4a-3b948c2293fc" path="/var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/volumes" Dec 03 06:45:33 crc kubenswrapper[4475]: I1203 06:45:33.528864 4475 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes 
dir" podUID="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" path="/var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/volumes" Dec 03 06:45:33 crc kubenswrapper[4475]: I1203 06:45:33.529275 4475 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fda69060-fa79-4696-b1a6-7980f124bf7c" path="/var/lib/kubelet/pods/fda69060-fa79-4696-b1a6-7980f124bf7c/volumes" Dec 03 06:45:33 crc kubenswrapper[4475]: I1203 06:45:33.555192 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" event={"ID":"37a5e44f-9a88-4405-be8a-b645485e7312","Type":"ContainerStarted","Data":"2f651c16a4a98ff0a9b4783e60ece4c410d5fcb7d05ad42bf7842d8bb8a99f27"} Dec 03 06:45:33 crc kubenswrapper[4475]: I1203 06:45:33.555228 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" event={"ID":"37a5e44f-9a88-4405-be8a-b645485e7312","Type":"ContainerStarted","Data":"6f6f0ad7a419ac5bb57a732f9bfeec15f3c2c9bc08a46457acefd5d4f582577d"} Dec 03 06:45:33 crc kubenswrapper[4475]: I1203 06:45:33.555792 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" event={"ID":"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49","Type":"ContainerStarted","Data":"68fdb0d6df9cca11128104901c82cb70a69c6bbd845fe0f19cd78eb280d109b8"} Dec 03 06:45:33 crc kubenswrapper[4475]: I1203 06:45:33.556993 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" event={"ID":"ef543e1b-8068-4ea3-b32a-61027b32e95d","Type":"ContainerStarted","Data":"30d9a05de148a1dbe0fa8f07bbc5f4f2c3cba395d686af03f2da63f8cdfe431c"} Dec 03 06:45:33 crc kubenswrapper[4475]: I1203 06:45:33.557034 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" 
event={"ID":"ef543e1b-8068-4ea3-b32a-61027b32e95d","Type":"ContainerStarted","Data":"07cf8d993193bca34b30ea77c473af45652fde6e73d0586efb78c14b9d003e22"} Dec 03 06:45:33 crc kubenswrapper[4475]: I1203 06:45:33.557046 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" event={"ID":"ef543e1b-8068-4ea3-b32a-61027b32e95d","Type":"ContainerStarted","Data":"874a15a7252fa5575d66800baef4f549991c11a27ee5a446b14da773c55fd64c"} Dec 03 06:45:33 crc kubenswrapper[4475]: I1203 06:45:33.564052 4475 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a0af3d80-5aae-4d3b-a974-490687df49f3\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fa848c68a20d5db5c603cafa808518de84e427cbeea4bbc1be31151e6f839b3e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{
},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:45:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fe3e0d5fed18fddd7a1174f7a9f12290ce318e9a0de40fe432c79f6f2e24a608\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:45:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://43c05977da6544bc781a279fcddb3279dfee510fdd0a6f4f1a22b8629f17475f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:45:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mou
ntPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ef987b2e9a0fa630edf6d5c06d5f47c5debd1b75d4626aefe7d8ef44bb974eb8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:45:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T06:45:15Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:45:33Z is after 2025-08-24T17:21:41Z" Dec 03 06:45:33 crc kubenswrapper[4475]: I1203 06:45:33.576359 4475 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:33Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:33Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2f651c16a4a98ff0a9b4783e60ece4c410d5fcb7d05ad42bf7842d8bb8a99f27\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:45:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-12-03T06:45:33Z is after 2025-08-24T17:21:41Z" Dec 03 06:45:33 crc kubenswrapper[4475]: I1203 06:45:33.586044 4475 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:32Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:32Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:45:33Z is after 2025-08-24T17:21:41Z" Dec 03 06:45:33 crc kubenswrapper[4475]: I1203 06:45:33.595506 4475 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:32Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:32Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook 
approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:45:33Z is after 2025-08-24T17:21:41Z" Dec 03 06:45:33 crc kubenswrapper[4475]: I1203 06:45:33.606597 4475 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:32Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:32Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:45:33Z is after 2025-08-24T17:21:41Z" Dec 03 06:45:33 crc kubenswrapper[4475]: I1203 06:45:33.616033 4475 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" 
err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:32Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:32Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:45:33Z is after 2025-08-24T17:21:41Z" Dec 03 06:45:33 crc kubenswrapper[4475]: I1203 06:45:33.626505 4475 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"57d2f580-9528-4200-b0a4-797fed1ae972\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://822cdbfb2e81d80c5de0253daa42f2a5c89e9cd0eb8a5c3cf620780d17f9a6d0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:45:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d66a9136874b2e25c94cd291aa6d7f4694ac409f16766fd69c8aab8068a441fb\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:45:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e40c4f29925f494c0f5f01e2ecbcd2e4db2a5f3911a55a874c6d0006f01982de\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:45:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7fc0ee9e5a408a0a9e701afaf1db7bc3f58fd1830044730e9c680664642b5e4e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:45:1
7Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1dd8bd42f01469966b55416fc8af1dd71d341c774263bb3a56190af4cd9e7daa\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:45:17Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5da1155d7b5e933e5db3acc4c1a3fa1b3b90fd79289641f9a3d1290956128628\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5da1155d7b5e933e5db3acc4c1a3fa1b3b90fd79289641f9a3d1290956128628\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T06:45:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T06:45:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startT
ime\\\":\\\"2025-12-03T06:45:15Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:45:33Z is after 2025-08-24T17:21:41Z" Dec 03 06:45:33 crc kubenswrapper[4475]: I1203 06:45:33.633925 4475 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:32Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:32Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:45:33Z is after 2025-08-24T17:21:41Z" Dec 03 06:45:33 crc kubenswrapper[4475]: I1203 06:45:33.641620 4475 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:32Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:32Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:45:33Z is after 2025-08-24T17:21:41Z" Dec 03 06:45:33 crc kubenswrapper[4475]: I1203 06:45:33.650122 4475 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"57d2f580-9528-4200-b0a4-797fed1ae972\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://822cdbfb2e81d80c5de0253daa42f2a5c89e9cd0eb8a5c3cf620780d17f9a6d0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:45:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d66a9136874b2e25c94cd291aa6d7f4694ac409f16766fd69c8aab8068a441fb\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:45:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e40c4f29925f494c0f5f01e2ecbcd2e4db2a5f3911a55a874c6d0006f01982de\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:45:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7fc0ee9e5a408a0a9e701afaf1db7bc3f58fd1830044730e9c680664642b5e4e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:45:1
7Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1dd8bd42f01469966b55416fc8af1dd71d341c774263bb3a56190af4cd9e7daa\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:45:17Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5da1155d7b5e933e5db3acc4c1a3fa1b3b90fd79289641f9a3d1290956128628\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5da1155d7b5e933e5db3acc4c1a3fa1b3b90fd79289641f9a3d1290956128628\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T06:45:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T06:45:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startT
ime\\\":\\\"2025-12-03T06:45:15Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:45:33Z is after 2025-08-24T17:21:41Z" Dec 03 06:45:33 crc kubenswrapper[4475]: I1203 06:45:33.657869 4475 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a0af3d80-5aae-4d3b-a974-490687df49f3\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fa848c68a20d5db5c603cafa808518de84e427cbeea4bbc1be31151e6f839b3e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\
\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:45:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fe3e0d5fed18fddd7a1174f7a9f12290ce318e9a0de40fe432c79f6f2e24a608\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:45:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://43c05977da6544bc781a279fcddb3279dfee510fdd0a6f4f1a22b8629f17475f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:45:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"c
ert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ef987b2e9a0fa630edf6d5c06d5f47c5debd1b75d4626aefe7d8ef44bb974eb8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:45:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T06:45:15Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:45:33Z is after 2025-08-24T17:21:41Z" Dec 03 06:45:33 crc kubenswrapper[4475]: I1203 06:45:33.665830 4475 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:33Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:33Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2f651c16a4a98ff0a9b4783e60ece4c410d5fcb7d05ad42bf7842d8bb8a99f27\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:45:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-12-03T06:45:33Z is after 2025-08-24T17:21:41Z" Dec 03 06:45:33 crc kubenswrapper[4475]: I1203 06:45:33.675389 4475 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:32Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:32Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:45:33Z is after 2025-08-24T17:21:41Z" Dec 03 06:45:33 crc kubenswrapper[4475]: I1203 06:45:33.683000 4475 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:33Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:33Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://30d9a05de148a1dbe0fa8f07bbc5f4f2c3cba395d686af03f2da63f8cdfe431c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:45:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://07cf8d993193bca34b30ea77c473af45652fde6e73d0586efb78c14b9d003e22\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:45:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:45:33Z is after 2025-08-24T17:21:41Z" Dec 03 06:45:33 crc kubenswrapper[4475]: I1203 06:45:33.696274 4475 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:32Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:32Z\\\",\\\"message\\\":\\\"containers with 
unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:45:33Z is after 2025-08-24T17:21:41Z" Dec 03 06:45:33 crc kubenswrapper[4475]: I1203 06:45:33.711687 4475 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:32Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:32Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:45:33Z is after 2025-08-24T17:21:41Z" Dec 03 06:45:34 crc kubenswrapper[4475]: I1203 06:45:34.073192 4475 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 03 06:45:34 crc kubenswrapper[4475]: I1203 06:45:34.073267 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 03 06:45:34 crc kubenswrapper[4475]: I1203 06:45:34.073286 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: 
\"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 03 06:45:34 crc kubenswrapper[4475]: I1203 06:45:34.073303 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 03 06:45:34 crc kubenswrapper[4475]: I1203 06:45:34.073325 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 03 06:45:34 crc kubenswrapper[4475]: E1203 06:45:34.073372 4475 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-03 06:45:36.073351746 +0000 UTC m=+20.878250079 (durationBeforeRetry 2s). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 03 06:45:34 crc kubenswrapper[4475]: E1203 06:45:34.073392 4475 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Dec 03 06:45:34 crc kubenswrapper[4475]: E1203 06:45:34.073410 4475 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Dec 03 06:45:34 crc kubenswrapper[4475]: E1203 06:45:34.073420 4475 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Dec 03 06:45:34 crc kubenswrapper[4475]: E1203 06:45:34.073436 4475 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Dec 03 06:45:34 crc kubenswrapper[4475]: E1203 06:45:34.073473 4475 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Dec 03 06:45:34 crc kubenswrapper[4475]: E1203 06:45:34.073484 4475 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not 
registered] Dec 03 06:45:34 crc kubenswrapper[4475]: E1203 06:45:34.073483 4475 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-12-03 06:45:36.073466844 +0000 UTC m=+20.878365167 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Dec 03 06:45:34 crc kubenswrapper[4475]: E1203 06:45:34.073427 4475 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Dec 03 06:45:34 crc kubenswrapper[4475]: E1203 06:45:34.073524 4475 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 03 06:45:34 crc kubenswrapper[4475]: E1203 06:45:34.073503 4475 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-12-03 06:45:36.07349687 +0000 UTC m=+20.878395204 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Dec 03 06:45:34 crc kubenswrapper[4475]: E1203 06:45:34.073563 4475 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-12-03 06:45:36.073550081 +0000 UTC m=+20.878448414 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 03 06:45:34 crc kubenswrapper[4475]: E1203 06:45:34.073575 4475 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-12-03 06:45:36.073569657 +0000 UTC m=+20.878467991 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 03 06:45:34 crc kubenswrapper[4475]: I1203 06:45:34.490868 4475 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 03 06:45:34 crc kubenswrapper[4475]: E1203 06:45:34.490963 4475 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 03 06:45:34 crc kubenswrapper[4475]: I1203 06:45:34.490987 4475 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 03 06:45:34 crc kubenswrapper[4475]: I1203 06:45:34.491069 4475 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 03 06:45:34 crc kubenswrapper[4475]: E1203 06:45:34.491212 4475 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 03 06:45:34 crc kubenswrapper[4475]: E1203 06:45:34.491280 4475 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 03 06:45:34 crc kubenswrapper[4475]: I1203 06:45:34.560531 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" event={"ID":"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49","Type":"ContainerStarted","Data":"6444fe7571ebb90d4ff4b30dc1a397023310b50b1816d0197cb545b4f5f7480f"} Dec 03 06:45:34 crc kubenswrapper[4475]: I1203 06:45:34.569264 4475 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:33Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:33Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2f651c16a4a98ff0a9b4783e60ece4c410d5fcb7d05ad42bf7842d8bb8a99f27\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:45:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-12-03T06:45:34Z is after 2025-08-24T17:21:41Z" Dec 03 06:45:34 crc kubenswrapper[4475]: I1203 06:45:34.577913 4475 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:32Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:32Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:45:34Z is after 2025-08-24T17:21:41Z" Dec 03 06:45:34 crc kubenswrapper[4475]: I1203 06:45:34.586063 4475 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:33Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:33Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://30d9a05de148a1dbe0fa8f07bbc5f4f2c3cba395d686af03f2da63f8cdfe431c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:45:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://07cf8d993193bca34b30ea77c473af45652fde6e73d0586efb78c14b9d003e22\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:45:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:45:34Z is after 2025-08-24T17:21:41Z" Dec 03 06:45:34 crc kubenswrapper[4475]: I1203 06:45:34.594662 4475 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:34Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:34Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6444fe7571ebb90d4ff4b30dc1a397023310b50b1816d0197cb545b4f5f7480f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:45:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-12-03T06:45:34Z is after 2025-08-24T17:21:41Z" Dec 03 06:45:34 crc kubenswrapper[4475]: I1203 06:45:34.607585 4475 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:32Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:32Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:45:34Z is after 2025-08-24T17:21:41Z" Dec 03 06:45:34 crc kubenswrapper[4475]: I1203 06:45:34.619702 4475 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"57d2f580-9528-4200-b0a4-797fed1ae972\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://822cdbfb2e81d80c5de0253daa42f2a5c89e9cd0eb8a5c3cf620780d17f9a6d0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:45:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d66a9136874b2e25c94cd291aa6d7f4694ac409f16766fd69c8aab8068a441fb\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:45:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e40c4f29925f494c0f5f01e2ecbcd2e4db2a5f3911a55a874c6d0006f01982de\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:45:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7fc0ee9e5a408a0a9e701afaf1db7bc3f58fd1830044730e9c680664642b5e4e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:45:1
7Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1dd8bd42f01469966b55416fc8af1dd71d341c774263bb3a56190af4cd9e7daa\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:45:17Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5da1155d7b5e933e5db3acc4c1a3fa1b3b90fd79289641f9a3d1290956128628\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5da1155d7b5e933e5db3acc4c1a3fa1b3b90fd79289641f9a3d1290956128628\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T06:45:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T06:45:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startT
ime\\\":\\\"2025-12-03T06:45:15Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:45:34Z is after 2025-08-24T17:21:41Z" Dec 03 06:45:34 crc kubenswrapper[4475]: I1203 06:45:34.633096 4475 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a0af3d80-5aae-4d3b-a974-490687df49f3\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fa848c68a20d5db5c603cafa808518de84e427cbeea4bbc1be31151e6f839b3e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\
\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:45:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fe3e0d5fed18fddd7a1174f7a9f12290ce318e9a0de40fe432c79f6f2e24a608\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:45:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://43c05977da6544bc781a279fcddb3279dfee510fdd0a6f4f1a22b8629f17475f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:45:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"c
ert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ef987b2e9a0fa630edf6d5c06d5f47c5debd1b75d4626aefe7d8ef44bb974eb8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:45:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T06:45:15Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:45:34Z is after 2025-08-24T17:21:41Z" Dec 03 06:45:34 crc kubenswrapper[4475]: I1203 06:45:34.646234 4475 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:32Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:32Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:45:34Z is after 2025-08-24T17:21:41Z" Dec 03 06:45:34 crc kubenswrapper[4475]: I1203 06:45:34.861207 4475 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-dns/node-resolver-pcw7j"] Dec 03 06:45:34 crc kubenswrapper[4475]: I1203 06:45:34.861474 4475 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/multus-9b2j8"] Dec 03 06:45:34 crc kubenswrapper[4475]: I1203 06:45:34.861614 4475 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-pcw7j" Dec 03 06:45:34 crc kubenswrapper[4475]: I1203 06:45:34.861655 4475 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-9b2j8" Dec 03 06:45:34 crc kubenswrapper[4475]: I1203 06:45:34.866764 4475 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"node-resolver-dockercfg-kz9s7" Dec 03 06:45:34 crc kubenswrapper[4475]: I1203 06:45:34.866977 4475 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"openshift-service-ca.crt" Dec 03 06:45:34 crc kubenswrapper[4475]: I1203 06:45:34.867090 4475 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"openshift-service-ca.crt" Dec 03 06:45:34 crc kubenswrapper[4475]: I1203 06:45:34.867309 4475 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"kube-root-ca.crt" Dec 03 06:45:34 crc kubenswrapper[4475]: I1203 06:45:34.867482 4475 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"cni-copy-resources" Dec 03 06:45:34 crc kubenswrapper[4475]: I1203 06:45:34.867497 4475 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"default-dockercfg-2q5b6" Dec 03 06:45:34 crc kubenswrapper[4475]: I1203 06:45:34.867695 4475 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"multus-daemon-config" Dec 03 06:45:34 crc kubenswrapper[4475]: I1203 06:45:34.868320 4475 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"kube-root-ca.crt" Dec 03 06:45:34 crc kubenswrapper[4475]: I1203 06:45:34.879912 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/f3a17c67-95e0-4889-8a30-64c08b6720f4-system-cni-dir\") pod \"multus-9b2j8\" (UID: \"f3a17c67-95e0-4889-8a30-64c08b6720f4\") " pod="openshift-multus/multus-9b2j8" Dec 03 06:45:34 crc kubenswrapper[4475]: I1203 06:45:34.879937 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/f3a17c67-95e0-4889-8a30-64c08b6720f4-os-release\") pod \"multus-9b2j8\" (UID: \"f3a17c67-95e0-4889-8a30-64c08b6720f4\") " pod="openshift-multus/multus-9b2j8" Dec 03 06:45:34 crc kubenswrapper[4475]: I1203 06:45:34.879956 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/f3a17c67-95e0-4889-8a30-64c08b6720f4-host-var-lib-cni-multus\") pod \"multus-9b2j8\" (UID: \"f3a17c67-95e0-4889-8a30-64c08b6720f4\") " pod="openshift-multus/multus-9b2j8" Dec 03 06:45:34 crc kubenswrapper[4475]: I1203 06:45:34.879970 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/f3a17c67-95e0-4889-8a30-64c08b6720f4-hostroot\") pod \"multus-9b2j8\" (UID: \"f3a17c67-95e0-4889-8a30-64c08b6720f4\") " pod="openshift-multus/multus-9b2j8" Dec 03 06:45:34 crc kubenswrapper[4475]: I1203 06:45:34.879992 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/8c1979d0-303c-4cf6-9087-3cb2e1aac73b-hosts-file\") pod \"node-resolver-pcw7j\" (UID: \"8c1979d0-303c-4cf6-9087-3cb2e1aac73b\") " pod="openshift-dns/node-resolver-pcw7j" Dec 03 06:45:34 crc kubenswrapper[4475]: I1203 06:45:34.880031 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/f3a17c67-95e0-4889-8a30-64c08b6720f4-multus-cni-dir\") pod \"multus-9b2j8\" (UID: \"f3a17c67-95e0-4889-8a30-64c08b6720f4\") " pod="openshift-multus/multus-9b2j8" Dec 03 06:45:34 crc kubenswrapper[4475]: I1203 06:45:34.880071 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: 
\"kubernetes.io/configmap/f3a17c67-95e0-4889-8a30-64c08b6720f4-cni-binary-copy\") pod \"multus-9b2j8\" (UID: \"f3a17c67-95e0-4889-8a30-64c08b6720f4\") " pod="openshift-multus/multus-9b2j8" Dec 03 06:45:34 crc kubenswrapper[4475]: I1203 06:45:34.880133 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/f3a17c67-95e0-4889-8a30-64c08b6720f4-multus-socket-dir-parent\") pod \"multus-9b2j8\" (UID: \"f3a17c67-95e0-4889-8a30-64c08b6720f4\") " pod="openshift-multus/multus-9b2j8" Dec 03 06:45:34 crc kubenswrapper[4475]: I1203 06:45:34.880163 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nrf7r\" (UniqueName: \"kubernetes.io/projected/8c1979d0-303c-4cf6-9087-3cb2e1aac73b-kube-api-access-nrf7r\") pod \"node-resolver-pcw7j\" (UID: \"8c1979d0-303c-4cf6-9087-3cb2e1aac73b\") " pod="openshift-dns/node-resolver-pcw7j" Dec 03 06:45:34 crc kubenswrapper[4475]: I1203 06:45:34.880209 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/f3a17c67-95e0-4889-8a30-64c08b6720f4-cnibin\") pod \"multus-9b2j8\" (UID: \"f3a17c67-95e0-4889-8a30-64c08b6720f4\") " pod="openshift-multus/multus-9b2j8" Dec 03 06:45:34 crc kubenswrapper[4475]: I1203 06:45:34.880223 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6pdk6\" (UniqueName: \"kubernetes.io/projected/f3a17c67-95e0-4889-8a30-64c08b6720f4-kube-api-access-6pdk6\") pod \"multus-9b2j8\" (UID: \"f3a17c67-95e0-4889-8a30-64c08b6720f4\") " pod="openshift-multus/multus-9b2j8" Dec 03 06:45:34 crc kubenswrapper[4475]: I1203 06:45:34.880282 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: 
\"kubernetes.io/host-path/f3a17c67-95e0-4889-8a30-64c08b6720f4-host-run-k8s-cni-cncf-io\") pod \"multus-9b2j8\" (UID: \"f3a17c67-95e0-4889-8a30-64c08b6720f4\") " pod="openshift-multus/multus-9b2j8" Dec 03 06:45:34 crc kubenswrapper[4475]: I1203 06:45:34.880314 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/f3a17c67-95e0-4889-8a30-64c08b6720f4-multus-conf-dir\") pod \"multus-9b2j8\" (UID: \"f3a17c67-95e0-4889-8a30-64c08b6720f4\") " pod="openshift-multus/multus-9b2j8" Dec 03 06:45:34 crc kubenswrapper[4475]: I1203 06:45:34.880336 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/f3a17c67-95e0-4889-8a30-64c08b6720f4-host-run-netns\") pod \"multus-9b2j8\" (UID: \"f3a17c67-95e0-4889-8a30-64c08b6720f4\") " pod="openshift-multus/multus-9b2j8" Dec 03 06:45:34 crc kubenswrapper[4475]: I1203 06:45:34.880350 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/f3a17c67-95e0-4889-8a30-64c08b6720f4-host-var-lib-cni-bin\") pod \"multus-9b2j8\" (UID: \"f3a17c67-95e0-4889-8a30-64c08b6720f4\") " pod="openshift-multus/multus-9b2j8" Dec 03 06:45:34 crc kubenswrapper[4475]: I1203 06:45:34.880367 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/f3a17c67-95e0-4889-8a30-64c08b6720f4-multus-daemon-config\") pod \"multus-9b2j8\" (UID: \"f3a17c67-95e0-4889-8a30-64c08b6720f4\") " pod="openshift-multus/multus-9b2j8" Dec 03 06:45:34 crc kubenswrapper[4475]: I1203 06:45:34.880383 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: 
\"kubernetes.io/host-path/f3a17c67-95e0-4889-8a30-64c08b6720f4-etc-kubernetes\") pod \"multus-9b2j8\" (UID: \"f3a17c67-95e0-4889-8a30-64c08b6720f4\") " pod="openshift-multus/multus-9b2j8" Dec 03 06:45:34 crc kubenswrapper[4475]: I1203 06:45:34.880397 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/f3a17c67-95e0-4889-8a30-64c08b6720f4-host-run-multus-certs\") pod \"multus-9b2j8\" (UID: \"f3a17c67-95e0-4889-8a30-64c08b6720f4\") " pod="openshift-multus/multus-9b2j8" Dec 03 06:45:34 crc kubenswrapper[4475]: I1203 06:45:34.880434 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/f3a17c67-95e0-4889-8a30-64c08b6720f4-host-var-lib-kubelet\") pod \"multus-9b2j8\" (UID: \"f3a17c67-95e0-4889-8a30-64c08b6720f4\") " pod="openshift-multus/multus-9b2j8" Dec 03 06:45:34 crc kubenswrapper[4475]: I1203 06:45:34.882439 4475 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:32Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:32Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:45:34Z is after 2025-08-24T17:21:41Z" Dec 03 06:45:34 crc kubenswrapper[4475]: I1203 06:45:34.892995 4475 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:32Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:32Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:45:34Z is after 2025-08-24T17:21:41Z" Dec 03 06:45:34 crc kubenswrapper[4475]: I1203 06:45:34.904904 4475 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"57d2f580-9528-4200-b0a4-797fed1ae972\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://822cdbfb2e81d80c5de0253daa42f2a5c89e9cd0eb8a5c3cf620780d17f9a6d0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:45:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d66a9136874b2e25c94cd291aa6d7f4694ac409f16766fd69c8aab8068a441fb\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:45:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e40c4f29925f494c0f5f01e2ecbcd2e4db2a5f3911a55a874c6d0006f01982de\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:45:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7fc0ee9e5a408a0a9e701afaf1db7bc3f58fd1830044730e9c680664642b5e4e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:45:1
7Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1dd8bd42f01469966b55416fc8af1dd71d341c774263bb3a56190af4cd9e7daa\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:45:17Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5da1155d7b5e933e5db3acc4c1a3fa1b3b90fd79289641f9a3d1290956128628\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5da1155d7b5e933e5db3acc4c1a3fa1b3b90fd79289641f9a3d1290956128628\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T06:45:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T06:45:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startT
ime\\\":\\\"2025-12-03T06:45:15Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:45:34Z is after 2025-08-24T17:21:41Z" Dec 03 06:45:34 crc kubenswrapper[4475]: I1203 06:45:34.914767 4475 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a0af3d80-5aae-4d3b-a974-490687df49f3\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fa848c68a20d5db5c603cafa808518de84e427cbeea4bbc1be31151e6f839b3e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\
\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:45:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fe3e0d5fed18fddd7a1174f7a9f12290ce318e9a0de40fe432c79f6f2e24a608\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:45:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://43c05977da6544bc781a279fcddb3279dfee510fdd0a6f4f1a22b8629f17475f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:45:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"c
ert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ef987b2e9a0fa630edf6d5c06d5f47c5debd1b75d4626aefe7d8ef44bb974eb8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:45:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T06:45:15Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:45:34Z is after 2025-08-24T17:21:41Z" Dec 03 06:45:34 crc kubenswrapper[4475]: I1203 06:45:34.924590 4475 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:33Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:33Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2f651c16a4a98ff0a9b4783e60ece4c410d5fcb7d05ad42bf7842d8bb8a99f27\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:45:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-12-03T06:45:34Z is after 2025-08-24T17:21:41Z" Dec 03 06:45:34 crc kubenswrapper[4475]: I1203 06:45:34.933339 4475 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:32Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:32Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:45:34Z is after 2025-08-24T17:21:41Z" Dec 03 06:45:34 crc kubenswrapper[4475]: I1203 06:45:34.942715 4475 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:33Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:33Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://30d9a05de148a1dbe0fa8f07bbc5f4f2c3cba395d686af03f2da63f8cdfe431c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:45:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://07cf8d993193bca34b30ea77c473af45652fde6e73d0586efb78c14b9d003e22\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:45:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:45:34Z is after 2025-08-24T17:21:41Z" Dec 03 06:45:34 crc kubenswrapper[4475]: I1203 06:45:34.950232 4475 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:34Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:34Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6444fe7571ebb90d4ff4b30dc1a397023310b50b1816d0197cb545b4f5f7480f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:45:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-12-03T06:45:34Z is after 2025-08-24T17:21:41Z" Dec 03 06:45:34 crc kubenswrapper[4475]: I1203 06:45:34.957100 4475 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-pcw7j" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8c1979d0-303c-4cf6-9087-3cb2e1aac73b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:34Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:34Z\\\",\\\"message\\\":\\\"containers with unready status: 
[dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nrf7r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T06:45:34Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-pcw7j\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:45:34Z is after 2025-08-24T17:21:41Z" Dec 03 06:45:34 crc kubenswrapper[4475]: I1203 06:45:34.965386 4475 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-9b2j8" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f3a17c67-95e0-4889-8a30-64c08b6720f4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:34Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:34Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/
kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6pdk6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T06:45:34Z\\\"}}\" for pod \"openshift-multus\"/\"multus-9b2j8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:45:34Z is after 2025-08-24T17:21:41Z" Dec 03 06:45:34 crc kubenswrapper[4475]: I1203 06:45:34.973399 4475 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:32Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:32Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:45:34Z is after 2025-08-24T17:21:41Z" Dec 03 06:45:34 crc kubenswrapper[4475]: I1203 06:45:34.980951 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/f3a17c67-95e0-4889-8a30-64c08b6720f4-multus-socket-dir-parent\") pod \"multus-9b2j8\" (UID: \"f3a17c67-95e0-4889-8a30-64c08b6720f4\") " pod="openshift-multus/multus-9b2j8" Dec 03 06:45:34 crc kubenswrapper[4475]: I1203 
06:45:34.980984 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nrf7r\" (UniqueName: \"kubernetes.io/projected/8c1979d0-303c-4cf6-9087-3cb2e1aac73b-kube-api-access-nrf7r\") pod \"node-resolver-pcw7j\" (UID: \"8c1979d0-303c-4cf6-9087-3cb2e1aac73b\") " pod="openshift-dns/node-resolver-pcw7j" Dec 03 06:45:34 crc kubenswrapper[4475]: I1203 06:45:34.981000 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/f3a17c67-95e0-4889-8a30-64c08b6720f4-cnibin\") pod \"multus-9b2j8\" (UID: \"f3a17c67-95e0-4889-8a30-64c08b6720f4\") " pod="openshift-multus/multus-9b2j8" Dec 03 06:45:34 crc kubenswrapper[4475]: I1203 06:45:34.981015 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6pdk6\" (UniqueName: \"kubernetes.io/projected/f3a17c67-95e0-4889-8a30-64c08b6720f4-kube-api-access-6pdk6\") pod \"multus-9b2j8\" (UID: \"f3a17c67-95e0-4889-8a30-64c08b6720f4\") " pod="openshift-multus/multus-9b2j8" Dec 03 06:45:34 crc kubenswrapper[4475]: I1203 06:45:34.981040 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/f3a17c67-95e0-4889-8a30-64c08b6720f4-host-run-k8s-cni-cncf-io\") pod \"multus-9b2j8\" (UID: \"f3a17c67-95e0-4889-8a30-64c08b6720f4\") " pod="openshift-multus/multus-9b2j8" Dec 03 06:45:34 crc kubenswrapper[4475]: I1203 06:45:34.981056 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/f3a17c67-95e0-4889-8a30-64c08b6720f4-multus-conf-dir\") pod \"multus-9b2j8\" (UID: \"f3a17c67-95e0-4889-8a30-64c08b6720f4\") " pod="openshift-multus/multus-9b2j8" Dec 03 06:45:34 crc kubenswrapper[4475]: I1203 06:45:34.981073 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: 
\"kubernetes.io/host-path/f3a17c67-95e0-4889-8a30-64c08b6720f4-host-run-netns\") pod \"multus-9b2j8\" (UID: \"f3a17c67-95e0-4889-8a30-64c08b6720f4\") " pod="openshift-multus/multus-9b2j8" Dec 03 06:45:34 crc kubenswrapper[4475]: I1203 06:45:34.981086 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/f3a17c67-95e0-4889-8a30-64c08b6720f4-host-var-lib-cni-bin\") pod \"multus-9b2j8\" (UID: \"f3a17c67-95e0-4889-8a30-64c08b6720f4\") " pod="openshift-multus/multus-9b2j8" Dec 03 06:45:34 crc kubenswrapper[4475]: I1203 06:45:34.981120 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/f3a17c67-95e0-4889-8a30-64c08b6720f4-multus-daemon-config\") pod \"multus-9b2j8\" (UID: \"f3a17c67-95e0-4889-8a30-64c08b6720f4\") " pod="openshift-multus/multus-9b2j8" Dec 03 06:45:34 crc kubenswrapper[4475]: I1203 06:45:34.981127 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/f3a17c67-95e0-4889-8a30-64c08b6720f4-host-run-k8s-cni-cncf-io\") pod \"multus-9b2j8\" (UID: \"f3a17c67-95e0-4889-8a30-64c08b6720f4\") " pod="openshift-multus/multus-9b2j8" Dec 03 06:45:34 crc kubenswrapper[4475]: I1203 06:45:34.981138 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/f3a17c67-95e0-4889-8a30-64c08b6720f4-cnibin\") pod \"multus-9b2j8\" (UID: \"f3a17c67-95e0-4889-8a30-64c08b6720f4\") " pod="openshift-multus/multus-9b2j8" Dec 03 06:45:34 crc kubenswrapper[4475]: I1203 06:45:34.981166 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/f3a17c67-95e0-4889-8a30-64c08b6720f4-multus-conf-dir\") pod \"multus-9b2j8\" (UID: \"f3a17c67-95e0-4889-8a30-64c08b6720f4\") " 
pod="openshift-multus/multus-9b2j8" Dec 03 06:45:34 crc kubenswrapper[4475]: I1203 06:45:34.981213 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/f3a17c67-95e0-4889-8a30-64c08b6720f4-host-run-netns\") pod \"multus-9b2j8\" (UID: \"f3a17c67-95e0-4889-8a30-64c08b6720f4\") " pod="openshift-multus/multus-9b2j8" Dec 03 06:45:34 crc kubenswrapper[4475]: I1203 06:45:34.981245 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/f3a17c67-95e0-4889-8a30-64c08b6720f4-multus-socket-dir-parent\") pod \"multus-9b2j8\" (UID: \"f3a17c67-95e0-4889-8a30-64c08b6720f4\") " pod="openshift-multus/multus-9b2j8" Dec 03 06:45:34 crc kubenswrapper[4475]: I1203 06:45:34.981266 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/f3a17c67-95e0-4889-8a30-64c08b6720f4-host-var-lib-cni-bin\") pod \"multus-9b2j8\" (UID: \"f3a17c67-95e0-4889-8a30-64c08b6720f4\") " pod="openshift-multus/multus-9b2j8" Dec 03 06:45:34 crc kubenswrapper[4475]: I1203 06:45:34.981273 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/f3a17c67-95e0-4889-8a30-64c08b6720f4-etc-kubernetes\") pod \"multus-9b2j8\" (UID: \"f3a17c67-95e0-4889-8a30-64c08b6720f4\") " pod="openshift-multus/multus-9b2j8" Dec 03 06:45:34 crc kubenswrapper[4475]: I1203 06:45:34.981296 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/f3a17c67-95e0-4889-8a30-64c08b6720f4-host-var-lib-kubelet\") pod \"multus-9b2j8\" (UID: \"f3a17c67-95e0-4889-8a30-64c08b6720f4\") " pod="openshift-multus/multus-9b2j8" Dec 03 06:45:34 crc kubenswrapper[4475]: I1203 06:45:34.981313 4475 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/f3a17c67-95e0-4889-8a30-64c08b6720f4-host-run-multus-certs\") pod \"multus-9b2j8\" (UID: \"f3a17c67-95e0-4889-8a30-64c08b6720f4\") " pod="openshift-multus/multus-9b2j8" Dec 03 06:45:34 crc kubenswrapper[4475]: I1203 06:45:34.981330 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/f3a17c67-95e0-4889-8a30-64c08b6720f4-system-cni-dir\") pod \"multus-9b2j8\" (UID: \"f3a17c67-95e0-4889-8a30-64c08b6720f4\") " pod="openshift-multus/multus-9b2j8" Dec 03 06:45:34 crc kubenswrapper[4475]: I1203 06:45:34.981344 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/f3a17c67-95e0-4889-8a30-64c08b6720f4-os-release\") pod \"multus-9b2j8\" (UID: \"f3a17c67-95e0-4889-8a30-64c08b6720f4\") " pod="openshift-multus/multus-9b2j8" Dec 03 06:45:34 crc kubenswrapper[4475]: I1203 06:45:34.981377 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/f3a17c67-95e0-4889-8a30-64c08b6720f4-host-run-multus-certs\") pod \"multus-9b2j8\" (UID: \"f3a17c67-95e0-4889-8a30-64c08b6720f4\") " pod="openshift-multus/multus-9b2j8" Dec 03 06:45:34 crc kubenswrapper[4475]: I1203 06:45:34.981380 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/f3a17c67-95e0-4889-8a30-64c08b6720f4-host-var-lib-kubelet\") pod \"multus-9b2j8\" (UID: \"f3a17c67-95e0-4889-8a30-64c08b6720f4\") " pod="openshift-multus/multus-9b2j8" Dec 03 06:45:34 crc kubenswrapper[4475]: I1203 06:45:34.981381 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/8c1979d0-303c-4cf6-9087-3cb2e1aac73b-hosts-file\") pod 
\"node-resolver-pcw7j\" (UID: \"8c1979d0-303c-4cf6-9087-3cb2e1aac73b\") " pod="openshift-dns/node-resolver-pcw7j" Dec 03 06:45:34 crc kubenswrapper[4475]: I1203 06:45:34.981398 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/f3a17c67-95e0-4889-8a30-64c08b6720f4-system-cni-dir\") pod \"multus-9b2j8\" (UID: \"f3a17c67-95e0-4889-8a30-64c08b6720f4\") " pod="openshift-multus/multus-9b2j8" Dec 03 06:45:34 crc kubenswrapper[4475]: I1203 06:45:34.981412 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/8c1979d0-303c-4cf6-9087-3cb2e1aac73b-hosts-file\") pod \"node-resolver-pcw7j\" (UID: \"8c1979d0-303c-4cf6-9087-3cb2e1aac73b\") " pod="openshift-dns/node-resolver-pcw7j" Dec 03 06:45:34 crc kubenswrapper[4475]: I1203 06:45:34.981416 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/f3a17c67-95e0-4889-8a30-64c08b6720f4-etc-kubernetes\") pod \"multus-9b2j8\" (UID: \"f3a17c67-95e0-4889-8a30-64c08b6720f4\") " pod="openshift-multus/multus-9b2j8" Dec 03 06:45:34 crc kubenswrapper[4475]: I1203 06:45:34.981428 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/f3a17c67-95e0-4889-8a30-64c08b6720f4-host-var-lib-cni-multus\") pod \"multus-9b2j8\" (UID: \"f3a17c67-95e0-4889-8a30-64c08b6720f4\") " pod="openshift-multus/multus-9b2j8" Dec 03 06:45:34 crc kubenswrapper[4475]: I1203 06:45:34.981445 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/f3a17c67-95e0-4889-8a30-64c08b6720f4-host-var-lib-cni-multus\") pod \"multus-9b2j8\" (UID: \"f3a17c67-95e0-4889-8a30-64c08b6720f4\") " pod="openshift-multus/multus-9b2j8" Dec 03 06:45:34 crc kubenswrapper[4475]: I1203 
06:45:34.981447 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/f3a17c67-95e0-4889-8a30-64c08b6720f4-hostroot\") pod \"multus-9b2j8\" (UID: \"f3a17c67-95e0-4889-8a30-64c08b6720f4\") " pod="openshift-multus/multus-9b2j8" Dec 03 06:45:34 crc kubenswrapper[4475]: I1203 06:45:34.981487 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/f3a17c67-95e0-4889-8a30-64c08b6720f4-multus-cni-dir\") pod \"multus-9b2j8\" (UID: \"f3a17c67-95e0-4889-8a30-64c08b6720f4\") " pod="openshift-multus/multus-9b2j8" Dec 03 06:45:34 crc kubenswrapper[4475]: I1203 06:45:34.981503 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/f3a17c67-95e0-4889-8a30-64c08b6720f4-cni-binary-copy\") pod \"multus-9b2j8\" (UID: \"f3a17c67-95e0-4889-8a30-64c08b6720f4\") " pod="openshift-multus/multus-9b2j8" Dec 03 06:45:34 crc kubenswrapper[4475]: I1203 06:45:34.981565 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/f3a17c67-95e0-4889-8a30-64c08b6720f4-hostroot\") pod \"multus-9b2j8\" (UID: \"f3a17c67-95e0-4889-8a30-64c08b6720f4\") " pod="openshift-multus/multus-9b2j8" Dec 03 06:45:34 crc kubenswrapper[4475]: I1203 06:45:34.981617 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/f3a17c67-95e0-4889-8a30-64c08b6720f4-os-release\") pod \"multus-9b2j8\" (UID: \"f3a17c67-95e0-4889-8a30-64c08b6720f4\") " pod="openshift-multus/multus-9b2j8" Dec 03 06:45:34 crc kubenswrapper[4475]: I1203 06:45:34.981651 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/f3a17c67-95e0-4889-8a30-64c08b6720f4-multus-cni-dir\") pod \"multus-9b2j8\" 
(UID: \"f3a17c67-95e0-4889-8a30-64c08b6720f4\") " pod="openshift-multus/multus-9b2j8" Dec 03 06:45:34 crc kubenswrapper[4475]: I1203 06:45:34.982292 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/f3a17c67-95e0-4889-8a30-64c08b6720f4-multus-daemon-config\") pod \"multus-9b2j8\" (UID: \"f3a17c67-95e0-4889-8a30-64c08b6720f4\") " pod="openshift-multus/multus-9b2j8" Dec 03 06:45:34 crc kubenswrapper[4475]: I1203 06:45:34.982382 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/f3a17c67-95e0-4889-8a30-64c08b6720f4-cni-binary-copy\") pod \"multus-9b2j8\" (UID: \"f3a17c67-95e0-4889-8a30-64c08b6720f4\") " pod="openshift-multus/multus-9b2j8" Dec 03 06:45:34 crc kubenswrapper[4475]: I1203 06:45:34.982595 4475 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"57d2f580-9528-4200-b0a4-797fed1ae972\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://822cdbfb2e81d80c5de0253daa42f2a5c89e9cd0eb8a5c3cf620780d17f9a6d0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:45:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d66a9136874b2e25c94cd291aa6d7f4694ac409f16766fd69c8aab8068a441fb\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:45:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e40c4f29925f494c0f5f01e2ecbcd2e4db2a5f3911a55a874c6d0006f01982de\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:45:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7fc0ee9e5a408a0a9e701afaf1db7bc3f58fd1830044730e9c680664642b5e4e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:45:1
7Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1dd8bd42f01469966b55416fc8af1dd71d341c774263bb3a56190af4cd9e7daa\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:45:17Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5da1155d7b5e933e5db3acc4c1a3fa1b3b90fd79289641f9a3d1290956128628\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5da1155d7b5e933e5db3acc4c1a3fa1b3b90fd79289641f9a3d1290956128628\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T06:45:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T06:45:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startT
ime\\\":\\\"2025-12-03T06:45:15Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:45:34Z is after 2025-08-24T17:21:41Z" Dec 03 06:45:34 crc kubenswrapper[4475]: I1203 06:45:34.990567 4475 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a0af3d80-5aae-4d3b-a974-490687df49f3\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fa848c68a20d5db5c603cafa808518de84e427cbeea4bbc1be31151e6f839b3e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\
\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:45:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fe3e0d5fed18fddd7a1174f7a9f12290ce318e9a0de40fe432c79f6f2e24a608\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:45:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://43c05977da6544bc781a279fcddb3279dfee510fdd0a6f4f1a22b8629f17475f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:45:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"c
ert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ef987b2e9a0fa630edf6d5c06d5f47c5debd1b75d4626aefe7d8ef44bb974eb8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:45:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T06:45:15Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:45:34Z is after 2025-08-24T17:21:41Z" Dec 03 06:45:34 crc kubenswrapper[4475]: I1203 06:45:34.999909 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6pdk6\" (UniqueName: \"kubernetes.io/projected/f3a17c67-95e0-4889-8a30-64c08b6720f4-kube-api-access-6pdk6\") pod \"multus-9b2j8\" (UID: \"f3a17c67-95e0-4889-8a30-64c08b6720f4\") " pod="openshift-multus/multus-9b2j8" Dec 03 06:45:35 crc kubenswrapper[4475]: I1203 06:45:35.001781 4475 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nrf7r\" (UniqueName: \"kubernetes.io/projected/8c1979d0-303c-4cf6-9087-3cb2e1aac73b-kube-api-access-nrf7r\") pod \"node-resolver-pcw7j\" (UID: \"8c1979d0-303c-4cf6-9087-3cb2e1aac73b\") " pod="openshift-dns/node-resolver-pcw7j" Dec 03 06:45:35 crc kubenswrapper[4475]: I1203 06:45:35.010867 4475 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:33Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:33Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2f651c16a4a98ff0a9b4783e60ece4c410d5fcb7d05ad42bf7842d8bb8a99f27\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:45:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/ru
n/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:45:35Z is after 2025-08-24T17:21:41Z" Dec 03 06:45:35 crc kubenswrapper[4475]: I1203 06:45:35.026003 4475 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:32Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:32Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:45:35Z is after 2025-08-24T17:21:41Z" Dec 03 06:45:35 crc kubenswrapper[4475]: I1203 06:45:35.039689 4475 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:33Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:33Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://30d9a05de148a1dbe0fa8f07bbc5f4f2c3cba395d686af03f2da63f8cdfe431c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:45:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://07cf8d993193bca34b30ea77c473af45652fde6e73d0586efb78c14b9d003e22\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:45:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:45:35Z is after 2025-08-24T17:21:41Z" Dec 03 06:45:35 crc kubenswrapper[4475]: I1203 06:45:35.052556 4475 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:34Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:34Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6444fe7571ebb90d4ff4b30dc1a397023310b50b1816d0197cb545b4f5f7480f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:45:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-12-03T06:45:35Z is after 2025-08-24T17:21:41Z" Dec 03 06:45:35 crc kubenswrapper[4475]: I1203 06:45:35.060344 4475 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-pcw7j" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8c1979d0-303c-4cf6-9087-3cb2e1aac73b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:34Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:34Z\\\",\\\"message\\\":\\\"containers with unready status: 
[dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nrf7r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T06:45:34Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-pcw7j\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:45:35Z is after 2025-08-24T17:21:41Z" Dec 03 06:45:35 crc kubenswrapper[4475]: I1203 06:45:35.074149 4475 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-9b2j8" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f3a17c67-95e0-4889-8a30-64c08b6720f4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:34Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:34Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/
kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6pdk6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T06:45:34Z\\\"}}\" for pod \"openshift-multus\"/\"multus-9b2j8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:45:35Z is after 2025-08-24T17:21:41Z" Dec 03 06:45:35 crc kubenswrapper[4475]: I1203 06:45:35.087021 4475 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:32Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:32Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:45:35Z is after 2025-08-24T17:21:41Z" Dec 03 06:45:35 crc kubenswrapper[4475]: I1203 06:45:35.172206 4475 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-pcw7j" Dec 03 06:45:35 crc kubenswrapper[4475]: I1203 06:45:35.176149 4475 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-9b2j8" Dec 03 06:45:35 crc kubenswrapper[4475]: W1203 06:45:35.185425 4475 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf3a17c67_95e0_4889_8a30_64c08b6720f4.slice/crio-70cf24213ce32c0fcc2c7efd74f4b27325c99e398a4c036b90cb1cfb706e2070 WatchSource:0}: Error finding container 70cf24213ce32c0fcc2c7efd74f4b27325c99e398a4c036b90cb1cfb706e2070: Status 404 returned error can't find the container with id 70cf24213ce32c0fcc2c7efd74f4b27325c99e398a4c036b90cb1cfb706e2070 Dec 03 06:45:35 crc kubenswrapper[4475]: I1203 06:45:35.235140 4475 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/multus-additional-cni-plugins-k9cmc"] Dec 03 06:45:35 crc kubenswrapper[4475]: I1203 06:45:35.235654 4475 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-k9cmc" Dec 03 06:45:35 crc kubenswrapper[4475]: I1203 06:45:35.237352 4475 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-g9t4l"] Dec 03 06:45:35 crc kubenswrapper[4475]: I1203 06:45:35.237937 4475 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-daemon-tjbzg"] Dec 03 06:45:35 crc kubenswrapper[4475]: I1203 06:45:35.238052 4475 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-g9t4l" Dec 03 06:45:35 crc kubenswrapper[4475]: I1203 06:45:35.238723 4475 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ancillary-tools-dockercfg-vnmsz" Dec 03 06:45:35 crc kubenswrapper[4475]: I1203 06:45:35.238764 4475 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/machine-config-daemon-tjbzg" Dec 03 06:45:35 crc kubenswrapper[4475]: I1203 06:45:35.239233 4475 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"default-cni-sysctl-allowlist" Dec 03 06:45:35 crc kubenswrapper[4475]: I1203 06:45:35.239344 4475 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"env-overrides" Dec 03 06:45:35 crc kubenswrapper[4475]: I1203 06:45:35.239643 4475 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-script-lib" Dec 03 06:45:35 crc kubenswrapper[4475]: I1203 06:45:35.239819 4475 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"openshift-service-ca.crt" Dec 03 06:45:35 crc kubenswrapper[4475]: I1203 06:45:35.239858 4475 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"kube-root-ca.crt" Dec 03 06:45:35 crc kubenswrapper[4475]: I1203 06:45:35.240048 4475 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-rbac-proxy" Dec 03 06:45:35 crc kubenswrapper[4475]: I1203 06:45:35.240096 4475 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-node-dockercfg-pwtwl" Dec 03 06:45:35 crc kubenswrapper[4475]: I1203 06:45:35.240238 4475 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-daemon-dockercfg-r5tcq" Dec 03 06:45:35 crc kubenswrapper[4475]: I1203 06:45:35.240471 4475 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"proxy-tls" Dec 03 06:45:35 crc kubenswrapper[4475]: I1203 06:45:35.240539 4475 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-node-metrics-cert" Dec 03 06:45:35 crc kubenswrapper[4475]: 
I1203 06:45:35.240580 4475 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"openshift-service-ca.crt" Dec 03 06:45:35 crc kubenswrapper[4475]: I1203 06:45:35.242155 4475 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-root-ca.crt" Dec 03 06:45:35 crc kubenswrapper[4475]: I1203 06:45:35.242416 4475 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-config" Dec 03 06:45:35 crc kubenswrapper[4475]: I1203 06:45:35.251593 4475 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-9b2j8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f3a17c67-95e0-4889-8a30-64c08b6720f4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:34Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:34Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/
kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6pdk6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T06:45:34Z\\\"}}\" for pod \"openshift-multus\"/\"multus-9b2j8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:45:35Z is after 2025-08-24T17:21:41Z" Dec 03 06:45:35 crc kubenswrapper[4475]: I1203 06:45:35.263402 4475 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:33Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:33Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2f651c16a4a98ff0a9b4783e60ece4c410d5fcb7d05ad42bf7842d8bb8a99f27\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d6
08d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:45:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:45:35Z is after 2025-08-24T17:21:41Z" Dec 03 06:45:35 crc kubenswrapper[4475]: I1203 06:45:35.272694 4475 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:33Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:33Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://30d9a05de148a1dbe0fa8f07bbc5f4f2c3cba395d686af03f2da63f8cdfe431c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:45:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://07cf8d993193bca34b30ea77c473af45652fde6e73d0586efb78c14b9d003e22\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:45:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:45:35Z is after 2025-08-24T17:21:41Z" Dec 03 06:45:35 crc kubenswrapper[4475]: I1203 06:45:35.279982 4475 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:34Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:34Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6444fe7571ebb90d4ff4b30dc1a397023310b50b1816d0197cb545b4f5f7480f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:45:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-12-03T06:45:35Z is after 2025-08-24T17:21:41Z" Dec 03 06:45:35 crc kubenswrapper[4475]: I1203 06:45:35.282968 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/8f42839e-dbc4-445a-a15b-c3aa14813958-run-openvswitch\") pod \"ovnkube-node-g9t4l\" (UID: \"8f42839e-dbc4-445a-a15b-c3aa14813958\") " pod="openshift-ovn-kubernetes/ovnkube-node-g9t4l" Dec 03 06:45:35 crc kubenswrapper[4475]: I1203 06:45:35.282993 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ppdm2\" (UniqueName: \"kubernetes.io/projected/8f42839e-dbc4-445a-a15b-c3aa14813958-kube-api-access-ppdm2\") pod \"ovnkube-node-g9t4l\" (UID: \"8f42839e-dbc4-445a-a15b-c3aa14813958\") " pod="openshift-ovn-kubernetes/ovnkube-node-g9t4l" Dec 03 06:45:35 crc kubenswrapper[4475]: I1203 06:45:35.283011 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/8f42839e-dbc4-445a-a15b-c3aa14813958-host-run-ovn-kubernetes\") pod \"ovnkube-node-g9t4l\" (UID: \"8f42839e-dbc4-445a-a15b-c3aa14813958\") " pod="openshift-ovn-kubernetes/ovnkube-node-g9t4l" Dec 03 06:45:35 crc kubenswrapper[4475]: I1203 06:45:35.283025 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/8f42839e-dbc4-445a-a15b-c3aa14813958-host-cni-bin\") pod \"ovnkube-node-g9t4l\" (UID: \"8f42839e-dbc4-445a-a15b-c3aa14813958\") " pod="openshift-ovn-kubernetes/ovnkube-node-g9t4l" Dec 03 06:45:35 crc kubenswrapper[4475]: I1203 06:45:35.283043 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: 
\"kubernetes.io/host-path/8f42839e-dbc4-445a-a15b-c3aa14813958-host-run-netns\") pod \"ovnkube-node-g9t4l\" (UID: \"8f42839e-dbc4-445a-a15b-c3aa14813958\") " pod="openshift-ovn-kubernetes/ovnkube-node-g9t4l" Dec 03 06:45:35 crc kubenswrapper[4475]: I1203 06:45:35.283064 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/7168f008-1b03-40cf-94fa-a71d470454bf-system-cni-dir\") pod \"multus-additional-cni-plugins-k9cmc\" (UID: \"7168f008-1b03-40cf-94fa-a71d470454bf\") " pod="openshift-multus/multus-additional-cni-plugins-k9cmc" Dec 03 06:45:35 crc kubenswrapper[4475]: I1203 06:45:35.283136 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/8f42839e-dbc4-445a-a15b-c3aa14813958-host-slash\") pod \"ovnkube-node-g9t4l\" (UID: \"8f42839e-dbc4-445a-a15b-c3aa14813958\") " pod="openshift-ovn-kubernetes/ovnkube-node-g9t4l" Dec 03 06:45:35 crc kubenswrapper[4475]: I1203 06:45:35.283157 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/8f42839e-dbc4-445a-a15b-c3aa14813958-host-cni-netd\") pod \"ovnkube-node-g9t4l\" (UID: \"8f42839e-dbc4-445a-a15b-c3aa14813958\") " pod="openshift-ovn-kubernetes/ovnkube-node-g9t4l" Dec 03 06:45:35 crc kubenswrapper[4475]: I1203 06:45:35.283215 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/7168f008-1b03-40cf-94fa-a71d470454bf-tuning-conf-dir\") pod \"multus-additional-cni-plugins-k9cmc\" (UID: \"7168f008-1b03-40cf-94fa-a71d470454bf\") " pod="openshift-multus/multus-additional-cni-plugins-k9cmc" Dec 03 06:45:35 crc kubenswrapper[4475]: I1203 06:45:35.283231 4475 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/8f42839e-dbc4-445a-a15b-c3aa14813958-ovn-node-metrics-cert\") pod \"ovnkube-node-g9t4l\" (UID: \"8f42839e-dbc4-445a-a15b-c3aa14813958\") " pod="openshift-ovn-kubernetes/ovnkube-node-g9t4l" Dec 03 06:45:35 crc kubenswrapper[4475]: I1203 06:45:35.283284 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/8f42839e-dbc4-445a-a15b-c3aa14813958-ovnkube-script-lib\") pod \"ovnkube-node-g9t4l\" (UID: \"8f42839e-dbc4-445a-a15b-c3aa14813958\") " pod="openshift-ovn-kubernetes/ovnkube-node-g9t4l" Dec 03 06:45:35 crc kubenswrapper[4475]: I1203 06:45:35.283299 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rootfs\" (UniqueName: \"kubernetes.io/host-path/91aee7be-4a52-4598-803f-2deebe0674de-rootfs\") pod \"machine-config-daemon-tjbzg\" (UID: \"91aee7be-4a52-4598-803f-2deebe0674de\") " pod="openshift-machine-config-operator/machine-config-daemon-tjbzg" Dec 03 06:45:35 crc kubenswrapper[4475]: I1203 06:45:35.283313 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/7168f008-1b03-40cf-94fa-a71d470454bf-os-release\") pod \"multus-additional-cni-plugins-k9cmc\" (UID: \"7168f008-1b03-40cf-94fa-a71d470454bf\") " pod="openshift-multus/multus-additional-cni-plugins-k9cmc" Dec 03 06:45:35 crc kubenswrapper[4475]: I1203 06:45:35.283360 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/8f42839e-dbc4-445a-a15b-c3aa14813958-var-lib-openvswitch\") pod \"ovnkube-node-g9t4l\" (UID: \"8f42839e-dbc4-445a-a15b-c3aa14813958\") " pod="openshift-ovn-kubernetes/ovnkube-node-g9t4l" Dec 03 06:45:35 crc 
kubenswrapper[4475]: I1203 06:45:35.283374 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/8f42839e-dbc4-445a-a15b-c3aa14813958-node-log\") pod \"ovnkube-node-g9t4l\" (UID: \"8f42839e-dbc4-445a-a15b-c3aa14813958\") " pod="openshift-ovn-kubernetes/ovnkube-node-g9t4l" Dec 03 06:45:35 crc kubenswrapper[4475]: I1203 06:45:35.283386 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/8f42839e-dbc4-445a-a15b-c3aa14813958-env-overrides\") pod \"ovnkube-node-g9t4l\" (UID: \"8f42839e-dbc4-445a-a15b-c3aa14813958\") " pod="openshift-ovn-kubernetes/ovnkube-node-g9t4l" Dec 03 06:45:35 crc kubenswrapper[4475]: I1203 06:45:35.283398 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/91aee7be-4a52-4598-803f-2deebe0674de-proxy-tls\") pod \"machine-config-daemon-tjbzg\" (UID: \"91aee7be-4a52-4598-803f-2deebe0674de\") " pod="openshift-machine-config-operator/machine-config-daemon-tjbzg" Dec 03 06:45:35 crc kubenswrapper[4475]: I1203 06:45:35.283438 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/8f42839e-dbc4-445a-a15b-c3aa14813958-host-kubelet\") pod \"ovnkube-node-g9t4l\" (UID: \"8f42839e-dbc4-445a-a15b-c3aa14813958\") " pod="openshift-ovn-kubernetes/ovnkube-node-g9t4l" Dec 03 06:45:35 crc kubenswrapper[4475]: I1203 06:45:35.283486 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xvqvg\" (UniqueName: \"kubernetes.io/projected/91aee7be-4a52-4598-803f-2deebe0674de-kube-api-access-xvqvg\") pod \"machine-config-daemon-tjbzg\" (UID: \"91aee7be-4a52-4598-803f-2deebe0674de\") " 
pod="openshift-machine-config-operator/machine-config-daemon-tjbzg" Dec 03 06:45:35 crc kubenswrapper[4475]: I1203 06:45:35.283500 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/7168f008-1b03-40cf-94fa-a71d470454bf-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-k9cmc\" (UID: \"7168f008-1b03-40cf-94fa-a71d470454bf\") " pod="openshift-multus/multus-additional-cni-plugins-k9cmc" Dec 03 06:45:35 crc kubenswrapper[4475]: I1203 06:45:35.283513 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s6bbq\" (UniqueName: \"kubernetes.io/projected/7168f008-1b03-40cf-94fa-a71d470454bf-kube-api-access-s6bbq\") pod \"multus-additional-cni-plugins-k9cmc\" (UID: \"7168f008-1b03-40cf-94fa-a71d470454bf\") " pod="openshift-multus/multus-additional-cni-plugins-k9cmc" Dec 03 06:45:35 crc kubenswrapper[4475]: I1203 06:45:35.283561 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/8f42839e-dbc4-445a-a15b-c3aa14813958-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-g9t4l\" (UID: \"8f42839e-dbc4-445a-a15b-c3aa14813958\") " pod="openshift-ovn-kubernetes/ovnkube-node-g9t4l" Dec 03 06:45:35 crc kubenswrapper[4475]: I1203 06:45:35.283575 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/91aee7be-4a52-4598-803f-2deebe0674de-mcd-auth-proxy-config\") pod \"machine-config-daemon-tjbzg\" (UID: \"91aee7be-4a52-4598-803f-2deebe0674de\") " pod="openshift-machine-config-operator/machine-config-daemon-tjbzg" Dec 03 06:45:35 crc kubenswrapper[4475]: I1203 06:45:35.283588 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/7168f008-1b03-40cf-94fa-a71d470454bf-cni-binary-copy\") pod \"multus-additional-cni-plugins-k9cmc\" (UID: \"7168f008-1b03-40cf-94fa-a71d470454bf\") " pod="openshift-multus/multus-additional-cni-plugins-k9cmc" Dec 03 06:45:35 crc kubenswrapper[4475]: I1203 06:45:35.283607 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/8f42839e-dbc4-445a-a15b-c3aa14813958-run-ovn\") pod \"ovnkube-node-g9t4l\" (UID: \"8f42839e-dbc4-445a-a15b-c3aa14813958\") " pod="openshift-ovn-kubernetes/ovnkube-node-g9t4l" Dec 03 06:45:35 crc kubenswrapper[4475]: I1203 06:45:35.283646 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/8f42839e-dbc4-445a-a15b-c3aa14813958-log-socket\") pod \"ovnkube-node-g9t4l\" (UID: \"8f42839e-dbc4-445a-a15b-c3aa14813958\") " pod="openshift-ovn-kubernetes/ovnkube-node-g9t4l" Dec 03 06:45:35 crc kubenswrapper[4475]: I1203 06:45:35.283660 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/8f42839e-dbc4-445a-a15b-c3aa14813958-etc-openvswitch\") pod \"ovnkube-node-g9t4l\" (UID: \"8f42839e-dbc4-445a-a15b-c3aa14813958\") " pod="openshift-ovn-kubernetes/ovnkube-node-g9t4l" Dec 03 06:45:35 crc kubenswrapper[4475]: I1203 06:45:35.283673 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/8f42839e-dbc4-445a-a15b-c3aa14813958-run-systemd\") pod \"ovnkube-node-g9t4l\" (UID: \"8f42839e-dbc4-445a-a15b-c3aa14813958\") " pod="openshift-ovn-kubernetes/ovnkube-node-g9t4l" Dec 03 06:45:35 crc kubenswrapper[4475]: I1203 06:45:35.283755 4475 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/8f42839e-dbc4-445a-a15b-c3aa14813958-ovnkube-config\") pod \"ovnkube-node-g9t4l\" (UID: \"8f42839e-dbc4-445a-a15b-c3aa14813958\") " pod="openshift-ovn-kubernetes/ovnkube-node-g9t4l" Dec 03 06:45:35 crc kubenswrapper[4475]: I1203 06:45:35.283772 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/7168f008-1b03-40cf-94fa-a71d470454bf-cnibin\") pod \"multus-additional-cni-plugins-k9cmc\" (UID: \"7168f008-1b03-40cf-94fa-a71d470454bf\") " pod="openshift-multus/multus-additional-cni-plugins-k9cmc" Dec 03 06:45:35 crc kubenswrapper[4475]: I1203 06:45:35.283830 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/8f42839e-dbc4-445a-a15b-c3aa14813958-systemd-units\") pod \"ovnkube-node-g9t4l\" (UID: \"8f42839e-dbc4-445a-a15b-c3aa14813958\") " pod="openshift-ovn-kubernetes/ovnkube-node-g9t4l" Dec 03 06:45:35 crc kubenswrapper[4475]: I1203 06:45:35.288430 4475 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:32Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:32Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:45:35Z is after 2025-08-24T17:21:41Z" Dec 03 06:45:35 crc kubenswrapper[4475]: I1203 06:45:35.296023 4475 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-pcw7j" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8c1979d0-303c-4cf6-9087-3cb2e1aac73b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:34Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:34Z\\\",\\\"message\\\":\\\"containers with unready status: 
[dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nrf7r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T06:45:34Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-pcw7j\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:45:35Z is after 2025-08-24T17:21:41Z" Dec 03 06:45:35 crc kubenswrapper[4475]: I1203 06:45:35.305795 4475 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:32Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:32Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:45:35Z is after 2025-08-24T17:21:41Z" Dec 03 06:45:35 crc kubenswrapper[4475]: I1203 06:45:35.316204 4475 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-k9cmc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7168f008-1b03-40cf-94fa-a71d470454bf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:35Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:35Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy 
whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:35Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:35Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s6bbq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\
\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s6bbq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s6bbq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\
":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s6bbq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s6bbq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s6bbq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v
4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s6bbq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T06:45:35Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-k9cmc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:45:35Z is after 2025-08-24T17:21:41Z" Dec 03 06:45:35 crc kubenswrapper[4475]: I1203 06:45:35.326349 4475 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"57d2f580-9528-4200-b0a4-797fed1ae972\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://822cdbfb2e81d80c5de0253daa42f2a5c89e9cd0eb8a5c3cf620780d17f9a6d0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:45:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d66a9136874b2e25c94cd291aa6d7f4694ac409f16766fd69c8aab8068a441fb\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:45:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e40c4f29925f494c0f5f01e2ecbcd2e4db2a5f3911a55a874c6d0006f01982de\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:45:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7fc0ee9e5a408a0a9e701afaf1db7bc3f58fd1830044730e9c680664642b5e4e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:45:1
7Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1dd8bd42f01469966b55416fc8af1dd71d341c774263bb3a56190af4cd9e7daa\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:45:17Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5da1155d7b5e933e5db3acc4c1a3fa1b3b90fd79289641f9a3d1290956128628\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5da1155d7b5e933e5db3acc4c1a3fa1b3b90fd79289641f9a3d1290956128628\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T06:45:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T06:45:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startT
ime\\\":\\\"2025-12-03T06:45:15Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:45:35Z is after 2025-08-24T17:21:41Z" Dec 03 06:45:35 crc kubenswrapper[4475]: I1203 06:45:35.336222 4475 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a0af3d80-5aae-4d3b-a974-490687df49f3\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fa848c68a20d5db5c603cafa808518de84e427cbeea4bbc1be31151e6f839b3e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\
\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:45:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fe3e0d5fed18fddd7a1174f7a9f12290ce318e9a0de40fe432c79f6f2e24a608\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:45:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://43c05977da6544bc781a279fcddb3279dfee510fdd0a6f4f1a22b8629f17475f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:45:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"c
ert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ef987b2e9a0fa630edf6d5c06d5f47c5debd1b75d4626aefe7d8ef44bb974eb8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:45:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T06:45:15Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:45:35Z is after 2025-08-24T17:21:41Z" Dec 03 06:45:35 crc kubenswrapper[4475]: I1203 06:45:35.344055 4475 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:32Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:32Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:45:35Z is after 2025-08-24T17:21:41Z" Dec 03 06:45:35 crc kubenswrapper[4475]: I1203 06:45:35.352193 4475 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-pcw7j" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8c1979d0-303c-4cf6-9087-3cb2e1aac73b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:34Z\\\",\\\"message\\\":\\\"containers with unready status: 
[dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:34Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nrf7r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T06:45:34Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-pcw7j\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:45:35Z is after 2025-08-24T17:21:41Z" Dec 03 06:45:35 crc kubenswrapper[4475]: I1203 06:45:35.361724 4475 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-9b2j8" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f3a17c67-95e0-4889-8a30-64c08b6720f4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:34Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:34Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/
kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6pdk6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T06:45:34Z\\\"}}\" for pod \"openshift-multus\"/\"multus-9b2j8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:45:35Z is after 2025-08-24T17:21:41Z" Dec 03 06:45:35 crc kubenswrapper[4475]: I1203 06:45:35.375211 4475 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-g9t4l" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8f42839e-dbc4-445a-a15b-c3aa14813958\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:35Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:35Z\\\",\\\"message\\\":\\\"containers with incomplete status: [kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:35Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:35Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ppdm2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/v
ar/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ppdm2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ppdm2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ppdm2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},
{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ppdm2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ppdm2\\\",\\\"readOnly\\\":true,\\\
"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mount
Path\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ppdm2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ppdm2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name
\\\":\\\"kube-api-access-ppdm2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T06:45:35Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-g9t4l\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:45:35Z is after 2025-08-24T17:21:41Z" Dec 03 06:45:35 crc kubenswrapper[4475]: I1203 06:45:35.383691 4475 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:33Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:33Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2f651c16a4a98ff0a9b4783e60ece4c410d5fcb7d05ad42bf7842d8bb8a99f27\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartC
ount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:45:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:45:35Z is after 2025-08-24T17:21:41Z" Dec 03 06:45:35 crc kubenswrapper[4475]: I1203 06:45:35.384847 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/8f42839e-dbc4-445a-a15b-c3aa14813958-var-lib-openvswitch\") pod \"ovnkube-node-g9t4l\" (UID: \"8f42839e-dbc4-445a-a15b-c3aa14813958\") " pod="openshift-ovn-kubernetes/ovnkube-node-g9t4l" Dec 03 06:45:35 crc kubenswrapper[4475]: I1203 06:45:35.384876 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/91aee7be-4a52-4598-803f-2deebe0674de-proxy-tls\") pod \"machine-config-daemon-tjbzg\" (UID: \"91aee7be-4a52-4598-803f-2deebe0674de\") " pod="openshift-machine-config-operator/machine-config-daemon-tjbzg" Dec 03 06:45:35 crc kubenswrapper[4475]: I1203 06:45:35.384893 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/8f42839e-dbc4-445a-a15b-c3aa14813958-host-kubelet\") 
pod \"ovnkube-node-g9t4l\" (UID: \"8f42839e-dbc4-445a-a15b-c3aa14813958\") " pod="openshift-ovn-kubernetes/ovnkube-node-g9t4l" Dec 03 06:45:35 crc kubenswrapper[4475]: I1203 06:45:35.384908 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/8f42839e-dbc4-445a-a15b-c3aa14813958-node-log\") pod \"ovnkube-node-g9t4l\" (UID: \"8f42839e-dbc4-445a-a15b-c3aa14813958\") " pod="openshift-ovn-kubernetes/ovnkube-node-g9t4l" Dec 03 06:45:35 crc kubenswrapper[4475]: I1203 06:45:35.384922 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/8f42839e-dbc4-445a-a15b-c3aa14813958-env-overrides\") pod \"ovnkube-node-g9t4l\" (UID: \"8f42839e-dbc4-445a-a15b-c3aa14813958\") " pod="openshift-ovn-kubernetes/ovnkube-node-g9t4l" Dec 03 06:45:35 crc kubenswrapper[4475]: I1203 06:45:35.384936 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xvqvg\" (UniqueName: \"kubernetes.io/projected/91aee7be-4a52-4598-803f-2deebe0674de-kube-api-access-xvqvg\") pod \"machine-config-daemon-tjbzg\" (UID: \"91aee7be-4a52-4598-803f-2deebe0674de\") " pod="openshift-machine-config-operator/machine-config-daemon-tjbzg" Dec 03 06:45:35 crc kubenswrapper[4475]: I1203 06:45:35.384950 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/7168f008-1b03-40cf-94fa-a71d470454bf-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-k9cmc\" (UID: \"7168f008-1b03-40cf-94fa-a71d470454bf\") " pod="openshift-multus/multus-additional-cni-plugins-k9cmc" Dec 03 06:45:35 crc kubenswrapper[4475]: I1203 06:45:35.384973 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: 
\"kubernetes.io/host-path/8f42839e-dbc4-445a-a15b-c3aa14813958-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-g9t4l\" (UID: \"8f42839e-dbc4-445a-a15b-c3aa14813958\") " pod="openshift-ovn-kubernetes/ovnkube-node-g9t4l" Dec 03 06:45:35 crc kubenswrapper[4475]: I1203 06:45:35.384988 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s6bbq\" (UniqueName: \"kubernetes.io/projected/7168f008-1b03-40cf-94fa-a71d470454bf-kube-api-access-s6bbq\") pod \"multus-additional-cni-plugins-k9cmc\" (UID: \"7168f008-1b03-40cf-94fa-a71d470454bf\") " pod="openshift-multus/multus-additional-cni-plugins-k9cmc" Dec 03 06:45:35 crc kubenswrapper[4475]: I1203 06:45:35.385008 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/8f42839e-dbc4-445a-a15b-c3aa14813958-run-ovn\") pod \"ovnkube-node-g9t4l\" (UID: \"8f42839e-dbc4-445a-a15b-c3aa14813958\") " pod="openshift-ovn-kubernetes/ovnkube-node-g9t4l" Dec 03 06:45:35 crc kubenswrapper[4475]: I1203 06:45:35.385021 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/8f42839e-dbc4-445a-a15b-c3aa14813958-log-socket\") pod \"ovnkube-node-g9t4l\" (UID: \"8f42839e-dbc4-445a-a15b-c3aa14813958\") " pod="openshift-ovn-kubernetes/ovnkube-node-g9t4l" Dec 03 06:45:35 crc kubenswrapper[4475]: I1203 06:45:35.385034 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/91aee7be-4a52-4598-803f-2deebe0674de-mcd-auth-proxy-config\") pod \"machine-config-daemon-tjbzg\" (UID: \"91aee7be-4a52-4598-803f-2deebe0674de\") " pod="openshift-machine-config-operator/machine-config-daemon-tjbzg" Dec 03 06:45:35 crc kubenswrapper[4475]: I1203 06:45:35.385048 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" 
(UniqueName: \"kubernetes.io/configmap/7168f008-1b03-40cf-94fa-a71d470454bf-cni-binary-copy\") pod \"multus-additional-cni-plugins-k9cmc\" (UID: \"7168f008-1b03-40cf-94fa-a71d470454bf\") " pod="openshift-multus/multus-additional-cni-plugins-k9cmc" Dec 03 06:45:35 crc kubenswrapper[4475]: I1203 06:45:35.385061 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/8f42839e-dbc4-445a-a15b-c3aa14813958-etc-openvswitch\") pod \"ovnkube-node-g9t4l\" (UID: \"8f42839e-dbc4-445a-a15b-c3aa14813958\") " pod="openshift-ovn-kubernetes/ovnkube-node-g9t4l" Dec 03 06:45:35 crc kubenswrapper[4475]: I1203 06:45:35.385087 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/8f42839e-dbc4-445a-a15b-c3aa14813958-run-systemd\") pod \"ovnkube-node-g9t4l\" (UID: \"8f42839e-dbc4-445a-a15b-c3aa14813958\") " pod="openshift-ovn-kubernetes/ovnkube-node-g9t4l" Dec 03 06:45:35 crc kubenswrapper[4475]: I1203 06:45:35.385102 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/8f42839e-dbc4-445a-a15b-c3aa14813958-ovnkube-config\") pod \"ovnkube-node-g9t4l\" (UID: \"8f42839e-dbc4-445a-a15b-c3aa14813958\") " pod="openshift-ovn-kubernetes/ovnkube-node-g9t4l" Dec 03 06:45:35 crc kubenswrapper[4475]: I1203 06:45:35.385124 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/8f42839e-dbc4-445a-a15b-c3aa14813958-var-lib-openvswitch\") pod \"ovnkube-node-g9t4l\" (UID: \"8f42839e-dbc4-445a-a15b-c3aa14813958\") " pod="openshift-ovn-kubernetes/ovnkube-node-g9t4l" Dec 03 06:45:35 crc kubenswrapper[4475]: I1203 06:45:35.385133 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: 
\"kubernetes.io/host-path/7168f008-1b03-40cf-94fa-a71d470454bf-cnibin\") pod \"multus-additional-cni-plugins-k9cmc\" (UID: \"7168f008-1b03-40cf-94fa-a71d470454bf\") " pod="openshift-multus/multus-additional-cni-plugins-k9cmc" Dec 03 06:45:35 crc kubenswrapper[4475]: I1203 06:45:35.385158 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/7168f008-1b03-40cf-94fa-a71d470454bf-cnibin\") pod \"multus-additional-cni-plugins-k9cmc\" (UID: \"7168f008-1b03-40cf-94fa-a71d470454bf\") " pod="openshift-multus/multus-additional-cni-plugins-k9cmc" Dec 03 06:45:35 crc kubenswrapper[4475]: I1203 06:45:35.385168 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/8f42839e-dbc4-445a-a15b-c3aa14813958-systemd-units\") pod \"ovnkube-node-g9t4l\" (UID: \"8f42839e-dbc4-445a-a15b-c3aa14813958\") " pod="openshift-ovn-kubernetes/ovnkube-node-g9t4l" Dec 03 06:45:35 crc kubenswrapper[4475]: I1203 06:45:35.385181 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/8f42839e-dbc4-445a-a15b-c3aa14813958-host-kubelet\") pod \"ovnkube-node-g9t4l\" (UID: \"8f42839e-dbc4-445a-a15b-c3aa14813958\") " pod="openshift-ovn-kubernetes/ovnkube-node-g9t4l" Dec 03 06:45:35 crc kubenswrapper[4475]: I1203 06:45:35.385195 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/8f42839e-dbc4-445a-a15b-c3aa14813958-run-openvswitch\") pod \"ovnkube-node-g9t4l\" (UID: \"8f42839e-dbc4-445a-a15b-c3aa14813958\") " pod="openshift-ovn-kubernetes/ovnkube-node-g9t4l" Dec 03 06:45:35 crc kubenswrapper[4475]: I1203 06:45:35.385200 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/8f42839e-dbc4-445a-a15b-c3aa14813958-node-log\") pod 
\"ovnkube-node-g9t4l\" (UID: \"8f42839e-dbc4-445a-a15b-c3aa14813958\") " pod="openshift-ovn-kubernetes/ovnkube-node-g9t4l" Dec 03 06:45:35 crc kubenswrapper[4475]: I1203 06:45:35.385209 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ppdm2\" (UniqueName: \"kubernetes.io/projected/8f42839e-dbc4-445a-a15b-c3aa14813958-kube-api-access-ppdm2\") pod \"ovnkube-node-g9t4l\" (UID: \"8f42839e-dbc4-445a-a15b-c3aa14813958\") " pod="openshift-ovn-kubernetes/ovnkube-node-g9t4l" Dec 03 06:45:35 crc kubenswrapper[4475]: I1203 06:45:35.385224 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/8f42839e-dbc4-445a-a15b-c3aa14813958-host-run-netns\") pod \"ovnkube-node-g9t4l\" (UID: \"8f42839e-dbc4-445a-a15b-c3aa14813958\") " pod="openshift-ovn-kubernetes/ovnkube-node-g9t4l" Dec 03 06:45:35 crc kubenswrapper[4475]: I1203 06:45:35.385238 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/8f42839e-dbc4-445a-a15b-c3aa14813958-host-run-ovn-kubernetes\") pod \"ovnkube-node-g9t4l\" (UID: \"8f42839e-dbc4-445a-a15b-c3aa14813958\") " pod="openshift-ovn-kubernetes/ovnkube-node-g9t4l" Dec 03 06:45:35 crc kubenswrapper[4475]: I1203 06:45:35.385252 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/8f42839e-dbc4-445a-a15b-c3aa14813958-host-cni-bin\") pod \"ovnkube-node-g9t4l\" (UID: \"8f42839e-dbc4-445a-a15b-c3aa14813958\") " pod="openshift-ovn-kubernetes/ovnkube-node-g9t4l" Dec 03 06:45:35 crc kubenswrapper[4475]: I1203 06:45:35.385265 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/8f42839e-dbc4-445a-a15b-c3aa14813958-host-slash\") pod \"ovnkube-node-g9t4l\" (UID: 
\"8f42839e-dbc4-445a-a15b-c3aa14813958\") " pod="openshift-ovn-kubernetes/ovnkube-node-g9t4l" Dec 03 06:45:35 crc kubenswrapper[4475]: I1203 06:45:35.385280 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/8f42839e-dbc4-445a-a15b-c3aa14813958-host-cni-netd\") pod \"ovnkube-node-g9t4l\" (UID: \"8f42839e-dbc4-445a-a15b-c3aa14813958\") " pod="openshift-ovn-kubernetes/ovnkube-node-g9t4l" Dec 03 06:45:35 crc kubenswrapper[4475]: I1203 06:45:35.385293 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/7168f008-1b03-40cf-94fa-a71d470454bf-system-cni-dir\") pod \"multus-additional-cni-plugins-k9cmc\" (UID: \"7168f008-1b03-40cf-94fa-a71d470454bf\") " pod="openshift-multus/multus-additional-cni-plugins-k9cmc" Dec 03 06:45:35 crc kubenswrapper[4475]: I1203 06:45:35.385308 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/7168f008-1b03-40cf-94fa-a71d470454bf-tuning-conf-dir\") pod \"multus-additional-cni-plugins-k9cmc\" (UID: \"7168f008-1b03-40cf-94fa-a71d470454bf\") " pod="openshift-multus/multus-additional-cni-plugins-k9cmc" Dec 03 06:45:35 crc kubenswrapper[4475]: I1203 06:45:35.385321 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/8f42839e-dbc4-445a-a15b-c3aa14813958-ovn-node-metrics-cert\") pod \"ovnkube-node-g9t4l\" (UID: \"8f42839e-dbc4-445a-a15b-c3aa14813958\") " pod="openshift-ovn-kubernetes/ovnkube-node-g9t4l" Dec 03 06:45:35 crc kubenswrapper[4475]: I1203 06:45:35.385336 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/8f42839e-dbc4-445a-a15b-c3aa14813958-ovnkube-script-lib\") pod \"ovnkube-node-g9t4l\" (UID: 
\"8f42839e-dbc4-445a-a15b-c3aa14813958\") " pod="openshift-ovn-kubernetes/ovnkube-node-g9t4l" Dec 03 06:45:35 crc kubenswrapper[4475]: I1203 06:45:35.385349 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rootfs\" (UniqueName: \"kubernetes.io/host-path/91aee7be-4a52-4598-803f-2deebe0674de-rootfs\") pod \"machine-config-daemon-tjbzg\" (UID: \"91aee7be-4a52-4598-803f-2deebe0674de\") " pod="openshift-machine-config-operator/machine-config-daemon-tjbzg" Dec 03 06:45:35 crc kubenswrapper[4475]: I1203 06:45:35.385364 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/7168f008-1b03-40cf-94fa-a71d470454bf-os-release\") pod \"multus-additional-cni-plugins-k9cmc\" (UID: \"7168f008-1b03-40cf-94fa-a71d470454bf\") " pod="openshift-multus/multus-additional-cni-plugins-k9cmc" Dec 03 06:45:35 crc kubenswrapper[4475]: I1203 06:45:35.385416 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/7168f008-1b03-40cf-94fa-a71d470454bf-os-release\") pod \"multus-additional-cni-plugins-k9cmc\" (UID: \"7168f008-1b03-40cf-94fa-a71d470454bf\") " pod="openshift-multus/multus-additional-cni-plugins-k9cmc" Dec 03 06:45:35 crc kubenswrapper[4475]: I1203 06:45:35.385438 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/8f42839e-dbc4-445a-a15b-c3aa14813958-systemd-units\") pod \"ovnkube-node-g9t4l\" (UID: \"8f42839e-dbc4-445a-a15b-c3aa14813958\") " pod="openshift-ovn-kubernetes/ovnkube-node-g9t4l" Dec 03 06:45:35 crc kubenswrapper[4475]: I1203 06:45:35.385474 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/8f42839e-dbc4-445a-a15b-c3aa14813958-run-openvswitch\") pod \"ovnkube-node-g9t4l\" (UID: \"8f42839e-dbc4-445a-a15b-c3aa14813958\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-g9t4l" Dec 03 06:45:35 crc kubenswrapper[4475]: I1203 06:45:35.385635 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/8f42839e-dbc4-445a-a15b-c3aa14813958-host-run-netns\") pod \"ovnkube-node-g9t4l\" (UID: \"8f42839e-dbc4-445a-a15b-c3aa14813958\") " pod="openshift-ovn-kubernetes/ovnkube-node-g9t4l" Dec 03 06:45:35 crc kubenswrapper[4475]: I1203 06:45:35.385650 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/8f42839e-dbc4-445a-a15b-c3aa14813958-env-overrides\") pod \"ovnkube-node-g9t4l\" (UID: \"8f42839e-dbc4-445a-a15b-c3aa14813958\") " pod="openshift-ovn-kubernetes/ovnkube-node-g9t4l" Dec 03 06:45:35 crc kubenswrapper[4475]: I1203 06:45:35.385658 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/8f42839e-dbc4-445a-a15b-c3aa14813958-host-run-ovn-kubernetes\") pod \"ovnkube-node-g9t4l\" (UID: \"8f42839e-dbc4-445a-a15b-c3aa14813958\") " pod="openshift-ovn-kubernetes/ovnkube-node-g9t4l" Dec 03 06:45:35 crc kubenswrapper[4475]: I1203 06:45:35.385675 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/8f42839e-dbc4-445a-a15b-c3aa14813958-host-cni-bin\") pod \"ovnkube-node-g9t4l\" (UID: \"8f42839e-dbc4-445a-a15b-c3aa14813958\") " pod="openshift-ovn-kubernetes/ovnkube-node-g9t4l" Dec 03 06:45:35 crc kubenswrapper[4475]: I1203 06:45:35.385692 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/8f42839e-dbc4-445a-a15b-c3aa14813958-host-slash\") pod \"ovnkube-node-g9t4l\" (UID: \"8f42839e-dbc4-445a-a15b-c3aa14813958\") " pod="openshift-ovn-kubernetes/ovnkube-node-g9t4l" Dec 03 06:45:35 crc kubenswrapper[4475]: I1203 06:45:35.385709 
4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/8f42839e-dbc4-445a-a15b-c3aa14813958-host-cni-netd\") pod \"ovnkube-node-g9t4l\" (UID: \"8f42839e-dbc4-445a-a15b-c3aa14813958\") " pod="openshift-ovn-kubernetes/ovnkube-node-g9t4l" Dec 03 06:45:35 crc kubenswrapper[4475]: I1203 06:45:35.385727 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/7168f008-1b03-40cf-94fa-a71d470454bf-system-cni-dir\") pod \"multus-additional-cni-plugins-k9cmc\" (UID: \"7168f008-1b03-40cf-94fa-a71d470454bf\") " pod="openshift-multus/multus-additional-cni-plugins-k9cmc" Dec 03 06:45:35 crc kubenswrapper[4475]: I1203 06:45:35.385772 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/7168f008-1b03-40cf-94fa-a71d470454bf-tuning-conf-dir\") pod \"multus-additional-cni-plugins-k9cmc\" (UID: \"7168f008-1b03-40cf-94fa-a71d470454bf\") " pod="openshift-multus/multus-additional-cni-plugins-k9cmc" Dec 03 06:45:35 crc kubenswrapper[4475]: I1203 06:45:35.386045 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/8f42839e-dbc4-445a-a15b-c3aa14813958-run-ovn\") pod \"ovnkube-node-g9t4l\" (UID: \"8f42839e-dbc4-445a-a15b-c3aa14813958\") " pod="openshift-ovn-kubernetes/ovnkube-node-g9t4l" Dec 03 06:45:35 crc kubenswrapper[4475]: I1203 06:45:35.386058 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/8f42839e-dbc4-445a-a15b-c3aa14813958-etc-openvswitch\") pod \"ovnkube-node-g9t4l\" (UID: \"8f42839e-dbc4-445a-a15b-c3aa14813958\") " pod="openshift-ovn-kubernetes/ovnkube-node-g9t4l" Dec 03 06:45:35 crc kubenswrapper[4475]: I1203 06:45:35.386081 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"log-socket\" (UniqueName: \"kubernetes.io/host-path/8f42839e-dbc4-445a-a15b-c3aa14813958-log-socket\") pod \"ovnkube-node-g9t4l\" (UID: \"8f42839e-dbc4-445a-a15b-c3aa14813958\") " pod="openshift-ovn-kubernetes/ovnkube-node-g9t4l" Dec 03 06:45:35 crc kubenswrapper[4475]: I1203 06:45:35.386085 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/8f42839e-dbc4-445a-a15b-c3aa14813958-run-systemd\") pod \"ovnkube-node-g9t4l\" (UID: \"8f42839e-dbc4-445a-a15b-c3aa14813958\") " pod="openshift-ovn-kubernetes/ovnkube-node-g9t4l" Dec 03 06:45:35 crc kubenswrapper[4475]: I1203 06:45:35.386233 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/7168f008-1b03-40cf-94fa-a71d470454bf-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-k9cmc\" (UID: \"7168f008-1b03-40cf-94fa-a71d470454bf\") " pod="openshift-multus/multus-additional-cni-plugins-k9cmc" Dec 03 06:45:35 crc kubenswrapper[4475]: I1203 06:45:35.386271 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rootfs\" (UniqueName: \"kubernetes.io/host-path/91aee7be-4a52-4598-803f-2deebe0674de-rootfs\") pod \"machine-config-daemon-tjbzg\" (UID: \"91aee7be-4a52-4598-803f-2deebe0674de\") " pod="openshift-machine-config-operator/machine-config-daemon-tjbzg" Dec 03 06:45:35 crc kubenswrapper[4475]: I1203 06:45:35.386272 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/8f42839e-dbc4-445a-a15b-c3aa14813958-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-g9t4l\" (UID: \"8f42839e-dbc4-445a-a15b-c3aa14813958\") " pod="openshift-ovn-kubernetes/ovnkube-node-g9t4l" Dec 03 06:45:35 crc kubenswrapper[4475]: I1203 06:45:35.386578 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: 
\"kubernetes.io/configmap/8f42839e-dbc4-445a-a15b-c3aa14813958-ovnkube-config\") pod \"ovnkube-node-g9t4l\" (UID: \"8f42839e-dbc4-445a-a15b-c3aa14813958\") " pod="openshift-ovn-kubernetes/ovnkube-node-g9t4l" Dec 03 06:45:35 crc kubenswrapper[4475]: I1203 06:45:35.386579 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/7168f008-1b03-40cf-94fa-a71d470454bf-cni-binary-copy\") pod \"multus-additional-cni-plugins-k9cmc\" (UID: \"7168f008-1b03-40cf-94fa-a71d470454bf\") " pod="openshift-multus/multus-additional-cni-plugins-k9cmc" Dec 03 06:45:35 crc kubenswrapper[4475]: I1203 06:45:35.386611 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/91aee7be-4a52-4598-803f-2deebe0674de-mcd-auth-proxy-config\") pod \"machine-config-daemon-tjbzg\" (UID: \"91aee7be-4a52-4598-803f-2deebe0674de\") " pod="openshift-machine-config-operator/machine-config-daemon-tjbzg" Dec 03 06:45:35 crc kubenswrapper[4475]: I1203 06:45:35.386660 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/8f42839e-dbc4-445a-a15b-c3aa14813958-ovnkube-script-lib\") pod \"ovnkube-node-g9t4l\" (UID: \"8f42839e-dbc4-445a-a15b-c3aa14813958\") " pod="openshift-ovn-kubernetes/ovnkube-node-g9t4l" Dec 03 06:45:35 crc kubenswrapper[4475]: I1203 06:45:35.388593 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/8f42839e-dbc4-445a-a15b-c3aa14813958-ovn-node-metrics-cert\") pod \"ovnkube-node-g9t4l\" (UID: \"8f42839e-dbc4-445a-a15b-c3aa14813958\") " pod="openshift-ovn-kubernetes/ovnkube-node-g9t4l" Dec 03 06:45:35 crc kubenswrapper[4475]: I1203 06:45:35.389741 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: 
\"kubernetes.io/secret/91aee7be-4a52-4598-803f-2deebe0674de-proxy-tls\") pod \"machine-config-daemon-tjbzg\" (UID: \"91aee7be-4a52-4598-803f-2deebe0674de\") " pod="openshift-machine-config-operator/machine-config-daemon-tjbzg" Dec 03 06:45:35 crc kubenswrapper[4475]: I1203 06:45:35.399247 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ppdm2\" (UniqueName: \"kubernetes.io/projected/8f42839e-dbc4-445a-a15b-c3aa14813958-kube-api-access-ppdm2\") pod \"ovnkube-node-g9t4l\" (UID: \"8f42839e-dbc4-445a-a15b-c3aa14813958\") " pod="openshift-ovn-kubernetes/ovnkube-node-g9t4l" Dec 03 06:45:35 crc kubenswrapper[4475]: I1203 06:45:35.399286 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s6bbq\" (UniqueName: \"kubernetes.io/projected/7168f008-1b03-40cf-94fa-a71d470454bf-kube-api-access-s6bbq\") pod \"multus-additional-cni-plugins-k9cmc\" (UID: \"7168f008-1b03-40cf-94fa-a71d470454bf\") " pod="openshift-multus/multus-additional-cni-plugins-k9cmc" Dec 03 06:45:35 crc kubenswrapper[4475]: I1203 06:45:35.400394 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xvqvg\" (UniqueName: \"kubernetes.io/projected/91aee7be-4a52-4598-803f-2deebe0674de-kube-api-access-xvqvg\") pod \"machine-config-daemon-tjbzg\" (UID: \"91aee7be-4a52-4598-803f-2deebe0674de\") " pod="openshift-machine-config-operator/machine-config-daemon-tjbzg" Dec 03 06:45:35 crc kubenswrapper[4475]: I1203 06:45:35.402845 4475 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:33Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:33Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://30d9a05de148a1dbe0fa8f07bbc5f4f2c3cba395d686af03f2da63f8cdfe431c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:45:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://07cf8d993193bca34b30ea77c473af45652fde6e73d0586efb78c14b9d003e22\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:45:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:45:35Z is after 2025-08-24T17:21:41Z" Dec 03 06:45:35 crc kubenswrapper[4475]: I1203 06:45:35.410475 4475 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:34Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:34Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6444fe7571ebb90d4ff4b30dc1a397023310b50b1816d0197cb545b4f5f7480f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:45:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-12-03T06:45:35Z is after 2025-08-24T17:21:41Z" Dec 03 06:45:35 crc kubenswrapper[4475]: I1203 06:45:35.418556 4475 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:32Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:32Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:45:35Z is after 2025-08-24T17:21:41Z" Dec 03 06:45:35 crc kubenswrapper[4475]: I1203 06:45:35.426157 4475 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:32Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:32Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:45:35Z is after 2025-08-24T17:21:41Z" Dec 03 06:45:35 crc kubenswrapper[4475]: I1203 06:45:35.436428 4475 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-k9cmc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7168f008-1b03-40cf-94fa-a71d470454bf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:35Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:35Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:35Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:35Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s6bbq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s6bbq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\"
:false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s6bbq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s6bbq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\
\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s6bbq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s6bbq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/s
erviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s6bbq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T06:45:35Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-k9cmc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:45:35Z is after 2025-08-24T17:21:41Z" Dec 03 06:45:35 crc kubenswrapper[4475]: I1203 06:45:35.445303 4475 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"57d2f580-9528-4200-b0a4-797fed1ae972\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://822cdbfb2e81d80c5de0253daa42f2a5c89e9cd0eb8a5c3cf620780d17f9a6d0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID
\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:45:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d66a9136874b2e25c94cd291aa6d7f4694ac409f16766fd69c8aab8068a441fb\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:45:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e40c4f29925f494c0f5f01e2ecbcd2e4db2a5f3911a55a874c6d0006f01982de\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"202
5-12-03T06:45:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7fc0ee9e5a408a0a9e701afaf1db7bc3f58fd1830044730e9c680664642b5e4e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:45:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1dd8bd42f01469966b55416fc8af1dd71d341c774263bb3a56190af4cd9e7daa\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:45:17Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5da1155d7b5e933e5db3acc4c1a3fa1b3b90fd79289641f9a3d1290956128628\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5da1155d7b5e933e5db3acc4c1a3fa1b3b90fd79289641f9a3d1290956128628\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T06:45:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T06:45:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T06:45:15Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:45:35Z is after 2025-08-24T17:21:41Z" Dec 03 06:45:35 crc kubenswrapper[4475]: I1203 06:45:35.454697 4475 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a0af3d80-5aae-4d3b-a974-490687df49f3\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fa848c68a20d5db5c603cafa808518de84e427cbeea4bbc1be31151e6f839b3e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:45:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fe3e0d5fed18fddd7a1174f7a9f12290ce318e9a0de40fe432c79f6f2e24a608\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:45:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://43c05977da6544bc781a279fcddb3279dfee510fdd0a6f4f1a22b8629f17475f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:45:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ef987b2e9a0fa630edf6d5c06d5f47c5debd1b75d4626aefe7d8ef44bb974eb8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-12-03T06:45:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T06:45:15Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:45:35Z is after 2025-08-24T17:21:41Z" Dec 03 06:45:35 crc kubenswrapper[4475]: I1203 06:45:35.461826 4475 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-etcd/etcd-crc" Dec 03 06:45:35 crc kubenswrapper[4475]: I1203 06:45:35.462769 4475 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:32Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:32Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:45:35Z is after 2025-08-24T17:21:41Z" Dec 03 06:45:35 crc kubenswrapper[4475]: I1203 06:45:35.471105 4475 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-etcd/etcd-crc" Dec 03 06:45:35 crc kubenswrapper[4475]: I1203 06:45:35.471863 4475 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-etcd/etcd-crc"] Dec 03 06:45:35 crc kubenswrapper[4475]: I1203 
06:45:35.472846 4475 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-tjbzg" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"91aee7be-4a52-4598-803f-2deebe0674de\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:35Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:35Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:35Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xvqvg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xvqvg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T06:45:35Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-tjbzg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:45:35Z is after 2025-08-24T17:21:41Z" Dec 03 06:45:35 crc kubenswrapper[4475]: I1203 06:45:35.481296 4475 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:32Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:32Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:45:35Z is after 2025-08-24T17:21:41Z" Dec 03 06:45:35 crc kubenswrapper[4475]: I1203 06:45:35.494208 4475 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-k9cmc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7168f008-1b03-40cf-94fa-a71d470454bf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:35Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:35Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy 
whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:35Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:35Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s6bbq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\
\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s6bbq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s6bbq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\
":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s6bbq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s6bbq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s6bbq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v
4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s6bbq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T06:45:35Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-k9cmc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:45:35Z is after 2025-08-24T17:21:41Z" Dec 03 06:45:35 crc kubenswrapper[4475]: I1203 06:45:35.502462 4475 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"57d2f580-9528-4200-b0a4-797fed1ae972\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://822cdbfb2e81d80c5de0253daa42f2a5c89e9cd0eb8a5c3cf620780d17f9a6d0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:45:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d66a9136874b2e25c94cd291aa6d7f4694ac409f16766fd69c8aab8068a441fb\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:45:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e40c4f29925f494c0f5f01e2ecbcd2e4db2a5f3911a55a874c6d0006f01982de\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:45:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7fc0ee9e5a408a0a9e701afaf1db7bc3f58fd1830044730e9c680664642b5e4e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:45:1
7Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1dd8bd42f01469966b55416fc8af1dd71d341c774263bb3a56190af4cd9e7daa\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:45:17Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5da1155d7b5e933e5db3acc4c1a3fa1b3b90fd79289641f9a3d1290956128628\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5da1155d7b5e933e5db3acc4c1a3fa1b3b90fd79289641f9a3d1290956128628\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T06:45:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T06:45:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startT
ime\\\":\\\"2025-12-03T06:45:15Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:45:35Z is after 2025-08-24T17:21:41Z" Dec 03 06:45:35 crc kubenswrapper[4475]: I1203 06:45:35.510746 4475 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a0af3d80-5aae-4d3b-a974-490687df49f3\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fa848c68a20d5db5c603cafa808518de84e427cbeea4bbc1be31151e6f839b3e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\
\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:45:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fe3e0d5fed18fddd7a1174f7a9f12290ce318e9a0de40fe432c79f6f2e24a608\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:45:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://43c05977da6544bc781a279fcddb3279dfee510fdd0a6f4f1a22b8629f17475f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:45:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"c
ert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ef987b2e9a0fa630edf6d5c06d5f47c5debd1b75d4626aefe7d8ef44bb974eb8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:45:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T06:45:15Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:45:35Z is after 2025-08-24T17:21:41Z" Dec 03 06:45:35 crc kubenswrapper[4475]: I1203 06:45:35.518727 4475 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:32Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:32Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:45:35Z is after 2025-08-24T17:21:41Z" Dec 03 06:45:35 crc kubenswrapper[4475]: I1203 06:45:35.526406 4475 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-tjbzg" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"91aee7be-4a52-4598-803f-2deebe0674de\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:35Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:35Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:35Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xvqvg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xvqvg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],
\\\"startTime\\\":\\\"2025-12-03T06:45:35Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-tjbzg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:45:35Z is after 2025-08-24T17:21:41Z" Dec 03 06:45:35 crc kubenswrapper[4475]: I1203 06:45:35.540144 4475 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"897f1a97-930a-4c3c-8804-d7cd6006ae9c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fbb015d3e05f9f94fc225cce6e24bc4a5df0bfc5aaea15fe120e2cc4b8f02902\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"starte
d\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:45:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://da747a5ea4f790c71d99693c4bd79a1074f756a20f628fa63e8bad9a713645fc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:45:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6bf56315b6ad05ea9af0319db29b919ed0332d2a671c5ba94ea325bd45ef5703\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:45:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\
":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e045b99328661616ea0e44cd50bd394a403836eede05459d117567c191401172\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:45:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://054b1d2565cc9690152740f71682028595283525344a38ccea66c1f072eae92b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:45:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d5e0ad
88c2e55994f952b46c2e806792d8fcbd79a901810aef92e46067cc7b92\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d5e0ad88c2e55994f952b46c2e806792d8fcbd79a901810aef92e46067cc7b92\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T06:45:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T06:45:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://22796f78d3d551f1ee271ca8581e196f142e70622944154f7d408a88c098f53b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://22796f78d3d551f1ee271ca8581e196f142e70622944154f7d408a88c098f53b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T06:45:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T06:45:16Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://9fb973559072f07252dcf50bda74d422ea2ed50000c02105381f8d21e5ff9888\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49
117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9fb973559072f07252dcf50bda74d422ea2ed50000c02105381f8d21e5ff9888\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T06:45:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T06:45:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T06:45:15Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:45:35Z is after 2025-08-24T17:21:41Z" Dec 03 06:45:35 crc kubenswrapper[4475]: I1203 06:45:35.550313 4475 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-k9cmc" Dec 03 06:45:35 crc kubenswrapper[4475]: I1203 06:45:35.556018 4475 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:33Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:33Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2f651c16a4a98ff0a9b4783e60ece4c410d5fcb7d05ad42bf7842d8bb8a99f27\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:45:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"
Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:45:35Z is after 2025-08-24T17:21:41Z" Dec 03 06:45:35 crc kubenswrapper[4475]: I1203 06:45:35.560061 4475 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-g9t4l" Dec 03 06:45:35 crc kubenswrapper[4475]: I1203 06:45:35.560509 4475 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 03 06:45:35 crc kubenswrapper[4475]: W1203 06:45:35.561344 4475 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7168f008_1b03_40cf_94fa_a71d470454bf.slice/crio-04e36bf79a39757ef034e665a5ac8636a89e5e1dfe572d1ee2f5853814d46fca WatchSource:0}: Error finding container 04e36bf79a39757ef034e665a5ac8636a89e5e1dfe572d1ee2f5853814d46fca: Status 404 returned error can't find the container with id 04e36bf79a39757ef034e665a5ac8636a89e5e1dfe572d1ee2f5853814d46fca Dec 03 06:45:35 crc kubenswrapper[4475]: I1203 06:45:35.564904 4475 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/machine-config-daemon-tjbzg" Dec 03 06:45:35 crc kubenswrapper[4475]: I1203 06:45:35.567441 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:45:35 crc kubenswrapper[4475]: I1203 06:45:35.567478 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:45:35 crc kubenswrapper[4475]: I1203 06:45:35.567487 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:45:35 crc kubenswrapper[4475]: I1203 06:45:35.567573 4475 kubelet_node_status.go:76] "Attempting to register node" node="crc" Dec 03 06:45:35 crc kubenswrapper[4475]: I1203 06:45:35.567807 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-pcw7j" event={"ID":"8c1979d0-303c-4cf6-9087-3cb2e1aac73b","Type":"ContainerStarted","Data":"eebaa73cf4e1efd781b258dd26910dc004392716180b14a7e64e89a03f2032a1"} Dec 03 06:45:35 crc kubenswrapper[4475]: I1203 06:45:35.567849 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-pcw7j" event={"ID":"8c1979d0-303c-4cf6-9087-3cb2e1aac73b","Type":"ContainerStarted","Data":"3bbacde6b35e722b1ea1d7fe1b2afa4ab1f0fd340b97fcbafe10c45103f4d536"} Dec 03 06:45:35 crc kubenswrapper[4475]: I1203 06:45:35.572920 4475 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:33Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:33Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://30d9a05de148a1dbe0fa8f07bbc5f4f2c3cba395d686af03f2da63f8cdfe431c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:45:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://07cf8d993193bca34b30ea77c473af45652fde6e73d0586efb78c14b9d003e22\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:45:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:45:35Z is after 2025-08-24T17:21:41Z" Dec 03 06:45:35 crc kubenswrapper[4475]: W1203 06:45:35.579388 4475 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8f42839e_dbc4_445a_a15b_c3aa14813958.slice/crio-b4b141100ea052faea009e86b4836d44db60d742453d01254879de450e50a718 WatchSource:0}: Error finding container b4b141100ea052faea009e86b4836d44db60d742453d01254879de450e50a718: Status 404 returned error can't find the container with id b4b141100ea052faea009e86b4836d44db60d742453d01254879de450e50a718 Dec 03 06:45:35 crc kubenswrapper[4475]: I1203 06:45:35.581333 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-9b2j8" event={"ID":"f3a17c67-95e0-4889-8a30-64c08b6720f4","Type":"ContainerStarted","Data":"d2d627e2c307a8db9c86e8020f2b1c25c6e061e0c6460be63e231566488beaca"} Dec 03 06:45:35 crc kubenswrapper[4475]: I1203 06:45:35.581370 4475 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-9b2j8" event={"ID":"f3a17c67-95e0-4889-8a30-64c08b6720f4","Type":"ContainerStarted","Data":"70cf24213ce32c0fcc2c7efd74f4b27325c99e398a4c036b90cb1cfb706e2070"} Dec 03 06:45:35 crc kubenswrapper[4475]: W1203 06:45:35.587328 4475 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod91aee7be_4a52_4598_803f_2deebe0674de.slice/crio-d98f25bd81a27453176faf52e7fa9bb2833bb61bde5650f095f2e7549cc89c7d WatchSource:0}: Error finding container d98f25bd81a27453176faf52e7fa9bb2833bb61bde5650f095f2e7549cc89c7d: Status 404 returned error can't find the container with id d98f25bd81a27453176faf52e7fa9bb2833bb61bde5650f095f2e7549cc89c7d Dec 03 06:45:35 crc kubenswrapper[4475]: E1203 06:45:35.628076 4475 kubelet.go:1929] "Failed creating a mirror pod for" err="pods \"etcd-crc\" already exists" pod="openshift-etcd/etcd-crc" Dec 03 06:45:35 crc kubenswrapper[4475]: I1203 06:45:35.642927 4475 kubelet_node_status.go:115] "Node was previously registered" node="crc" Dec 03 06:45:35 crc kubenswrapper[4475]: I1203 06:45:35.643124 4475 kubelet_node_status.go:79] "Successfully registered node" node="crc" Dec 03 06:45:35 crc kubenswrapper[4475]: I1203 06:45:35.643882 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:45:35 crc kubenswrapper[4475]: I1203 06:45:35.643911 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:45:35 crc kubenswrapper[4475]: I1203 06:45:35.643919 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:45:35 crc kubenswrapper[4475]: I1203 06:45:35.643933 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:45:35 crc kubenswrapper[4475]: I1203 06:45:35.643942 4475 
setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:45:35Z","lastTransitionTime":"2025-12-03T06:45:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 06:45:35 crc kubenswrapper[4475]: E1203 06:45:35.668321 4475 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"7800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"24148068Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"8\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"24608868Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T06:45:35Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:35Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T06:45:35Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:35Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T06:45:35Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:35Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID 
available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T06:45:35Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:35Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"
registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\
"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb617
3ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"reg
istry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@s
ha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"b860fac6-8533-4b4b-bdad-0cb0561d1495\\\",\\\"systemUUID\\\":\\\"6c3f70a9-a9d8-4b80-a825-7a6426aa17aa\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:45:35Z is after 2025-08-24T17:21:41Z" Dec 03 06:45:35 crc kubenswrapper[4475]: I1203 06:45:35.670331 4475 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:34Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:34Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6444fe7571ebb90d4ff4b30dc1a397023310b50b1816d0197cb545b4f5f7480f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f
4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:45:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:45:35Z is after 2025-08-24T17:21:41Z" Dec 03 06:45:35 crc kubenswrapper[4475]: I1203 06:45:35.671865 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:45:35 crc kubenswrapper[4475]: I1203 06:45:35.671888 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:45:35 crc kubenswrapper[4475]: I1203 06:45:35.671896 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:45:35 crc kubenswrapper[4475]: I1203 06:45:35.671909 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:45:35 crc kubenswrapper[4475]: I1203 06:45:35.671917 4475 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:45:35Z","lastTransitionTime":"2025-12-03T06:45:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 06:45:35 crc kubenswrapper[4475]: E1203 06:45:35.680566 4475 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"7800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"24148068Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"8\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"24608868Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T06:45:35Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:35Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T06:45:35Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:35Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T06:45:35Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:35Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID 
available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T06:45:35Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:35Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"
registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\
"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb617
3ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"reg
istry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@s
ha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"b860fac6-8533-4b4b-bdad-0cb0561d1495\\\",\\\"systemUUID\\\":\\\"6c3f70a9-a9d8-4b80-a825-7a6426aa17aa\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:45:35Z is after 2025-08-24T17:21:41Z" Dec 03 06:45:35 crc kubenswrapper[4475]: I1203 06:45:35.682951 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:45:35 crc kubenswrapper[4475]: I1203 06:45:35.682978 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:45:35 crc kubenswrapper[4475]: I1203 06:45:35.682987 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:45:35 crc kubenswrapper[4475]: I1203 06:45:35.682999 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:45:35 crc kubenswrapper[4475]: I1203 06:45:35.683008 4475 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:45:35Z","lastTransitionTime":"2025-12-03T06:45:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:45:35 crc kubenswrapper[4475]: E1203 06:45:35.692240 4475 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"7800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"24148068Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"8\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"24608868Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T06:45:35Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:35Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T06:45:35Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:35Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T06:45:35Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:35Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T06:45:35Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:35Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"b860fac6-8533-4b4b-bdad-0cb0561d1495\\\",\\\"systemUUID\\\":\\\"6c3f70a9-a9d8-4b80-a825-7a6426aa17aa\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:45:35Z is after 2025-08-24T17:21:41Z" Dec 03 06:45:35 crc kubenswrapper[4475]: I1203 06:45:35.694528 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:45:35 crc kubenswrapper[4475]: I1203 06:45:35.694557 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:45:35 crc kubenswrapper[4475]: I1203 06:45:35.694567 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:45:35 crc kubenswrapper[4475]: I1203 06:45:35.694580 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:45:35 crc kubenswrapper[4475]: I1203 06:45:35.694588 4475 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:45:35Z","lastTransitionTime":"2025-12-03T06:45:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:45:35 crc kubenswrapper[4475]: E1203 06:45:35.702352 4475 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"7800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"24148068Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"8\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"24608868Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T06:45:35Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:35Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T06:45:35Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:35Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T06:45:35Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:35Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T06:45:35Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:35Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"b860fac6-8533-4b4b-bdad-0cb0561d1495\\\",\\\"systemUUID\\\":\\\"6c3f70a9-a9d8-4b80-a825-7a6426aa17aa\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:45:35Z is after 2025-08-24T17:21:41Z" Dec 03 06:45:35 crc kubenswrapper[4475]: I1203 06:45:35.706762 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:45:35 crc kubenswrapper[4475]: I1203 06:45:35.706785 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:45:35 crc kubenswrapper[4475]: I1203 06:45:35.706794 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:45:35 crc kubenswrapper[4475]: I1203 06:45:35.706804 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:45:35 crc kubenswrapper[4475]: I1203 06:45:35.706811 4475 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:45:35Z","lastTransitionTime":"2025-12-03T06:45:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:45:35 crc kubenswrapper[4475]: I1203 06:45:35.707983 4475 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:32Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:32Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:45:35Z is after 2025-08-24T17:21:41Z" Dec 03 06:45:35 crc kubenswrapper[4475]: E1203 06:45:35.715019 4475 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"7800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"24148068Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"8\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"24608868Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T06:45:35Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:35Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory 
available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T06:45:35Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:35Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T06:45:35Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:35Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T06:45:35Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:35Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"b860fac6-8533-4b4b-bdad-0cb0561d1495\\\",\\\"systemUUID\\\":\\\"6c3f70a9-a9d8-4b80-a825-7a6426aa17aa\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:45:35Z is after 2025-08-24T17:21:41Z" Dec 03 06:45:35 crc kubenswrapper[4475]: E1203 06:45:35.715124 4475 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Dec 03 06:45:35 crc kubenswrapper[4475]: I1203 06:45:35.716105 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:45:35 crc kubenswrapper[4475]: I1203 06:45:35.716296 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:45:35 crc kubenswrapper[4475]: I1203 06:45:35.716305 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:45:35 crc kubenswrapper[4475]: I1203 06:45:35.716314 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:45:35 crc kubenswrapper[4475]: I1203 06:45:35.716321 4475 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:45:35Z","lastTransitionTime":"2025-12-03T06:45:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:45:35 crc kubenswrapper[4475]: I1203 06:45:35.748801 4475 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-pcw7j" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8c1979d0-303c-4cf6-9087-3cb2e1aac73b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:34Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:34Z\\\",\\\"message\\\":\\\"containers with unready status: 
[dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nrf7r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T06:45:34Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-pcw7j\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:45:35Z is after 2025-08-24T17:21:41Z" Dec 03 06:45:35 crc kubenswrapper[4475]: I1203 06:45:35.788411 4475 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-9b2j8" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f3a17c67-95e0-4889-8a30-64c08b6720f4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:34Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:34Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/
kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6pdk6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T06:45:34Z\\\"}}\" for pod \"openshift-multus\"/\"multus-9b2j8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:45:35Z is after 2025-08-24T17:21:41Z" Dec 03 06:45:35 crc kubenswrapper[4475]: I1203 06:45:35.818344 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:45:35 crc kubenswrapper[4475]: I1203 06:45:35.818376 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:45:35 crc kubenswrapper[4475]: I1203 06:45:35.818384 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:45:35 crc kubenswrapper[4475]: I1203 06:45:35.818397 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:45:35 crc kubenswrapper[4475]: I1203 06:45:35.818405 4475 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:45:35Z","lastTransitionTime":"2025-12-03T06:45:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:45:35 crc kubenswrapper[4475]: I1203 06:45:35.834189 4475 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-g9t4l" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8f42839e-dbc4-445a-a15b-c3aa14813958\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:35Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:35Z\\\",\\\"message\\\":\\\"containers with incomplete status: [kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:35Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:35Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ppdm2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ppdm2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ppdm2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ppdm2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ppdm2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ppdm2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ppdm2\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ppdm2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ppdm2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T06:45:35Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-g9t4l\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:45:35Z is after 2025-08-24T17:21:41Z" Dec 03 06:45:35 crc kubenswrapper[4475]: I1203 06:45:35.870590 4475 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:34Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:34Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6444fe7571ebb90d4ff4b30dc1a397023310b50b1816d0197cb545b4f5f7480f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:45:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:45:35Z is after 2025-08-24T17:21:41Z" Dec 03 06:45:35 crc kubenswrapper[4475]: I1203 06:45:35.907563 4475 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:32Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:32Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be 
located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:45:35Z is after 2025-08-24T17:21:41Z" Dec 03 06:45:35 crc kubenswrapper[4475]: I1203 06:45:35.919745 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:45:35 crc kubenswrapper[4475]: I1203 06:45:35.919770 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:45:35 crc kubenswrapper[4475]: I1203 06:45:35.919780 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:45:35 crc kubenswrapper[4475]: I1203 06:45:35.919792 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:45:35 crc kubenswrapper[4475]: I1203 06:45:35.919801 4475 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:45:35Z","lastTransitionTime":"2025-12-03T06:45:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 06:45:35 crc kubenswrapper[4475]: I1203 06:45:35.948974 4475 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-pcw7j" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8c1979d0-303c-4cf6-9087-3cb2e1aac73b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://eebaa73cf4e1efd781b258dd26910dc004392716180b14a7e64e89a03f2032a1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\
\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:45:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nrf7r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T06:45:34Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-pcw7j\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:45:35Z is after 2025-08-24T17:21:41Z" Dec 03 06:45:35 crc kubenswrapper[4475]: I1203 06:45:35.988404 4475 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-9b2j8" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f3a17c67-95e0-4889-8a30-64c08b6720f4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d2d627e2c307a8db9c86e8020f2b1c25c6e061e0c6460be63e231566488beaca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:45:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6pdk6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T06:45:34Z\\\"}}\" for pod \"openshift-multus\"/\"multus-9b2j8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:45:35Z is after 2025-08-24T17:21:41Z" Dec 03 06:45:36 crc kubenswrapper[4475]: I1203 06:45:36.021391 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:45:36 crc 
kubenswrapper[4475]: I1203 06:45:36.021424 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:45:36 crc kubenswrapper[4475]: I1203 06:45:36.021435 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:45:36 crc kubenswrapper[4475]: I1203 06:45:36.021466 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:45:36 crc kubenswrapper[4475]: I1203 06:45:36.021475 4475 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:45:36Z","lastTransitionTime":"2025-12-03T06:45:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 06:45:36 crc kubenswrapper[4475]: I1203 06:45:36.032337 4475 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-g9t4l" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8f42839e-dbc4-445a-a15b-c3aa14813958\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:35Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:35Z\\\",\\\"message\\\":\\\"containers with incomplete status: 
[kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:35Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:35Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ppdm2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":fals
e,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ppdm2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ppdm2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/v
ar/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ppdm2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ppdm2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-op
envswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ppdm2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvs
witch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ppdm2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ppdm2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"las
tState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ppdm2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T06:45:35Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-g9t4l\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:45:36Z is after 2025-08-24T17:21:41Z" Dec 03 06:45:36 crc kubenswrapper[4475]: I1203 06:45:36.068750 4475 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:33Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:33Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2f651c16a4a98ff0a9b4783e60ece4c410d5fcb7d05ad42bf7842d8bb8a99f27\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:45:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-12-03T06:45:36Z is after 2025-08-24T17:21:41Z" Dec 03 06:45:36 crc kubenswrapper[4475]: I1203 06:45:36.090239 4475 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 03 06:45:36 crc kubenswrapper[4475]: I1203 06:45:36.090333 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 03 06:45:36 crc kubenswrapper[4475]: E1203 06:45:36.090362 4475 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-03 06:45:40.090348494 +0000 UTC m=+24.895246828 (durationBeforeRetry 4s). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 03 06:45:36 crc kubenswrapper[4475]: I1203 06:45:36.090385 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 03 06:45:36 crc kubenswrapper[4475]: E1203 06:45:36.090390 4475 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Dec 03 06:45:36 crc kubenswrapper[4475]: I1203 06:45:36.090409 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 03 06:45:36 crc kubenswrapper[4475]: E1203 06:45:36.090418 4475 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-12-03 06:45:40.090411112 +0000 UTC m=+24.895309445 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Dec 03 06:45:36 crc kubenswrapper[4475]: I1203 06:45:36.090428 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 03 06:45:36 crc kubenswrapper[4475]: E1203 06:45:36.090518 4475 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Dec 03 06:45:36 crc kubenswrapper[4475]: E1203 06:45:36.090543 4475 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-12-03 06:45:40.090535727 +0000 UTC m=+24.895434061 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Dec 03 06:45:36 crc kubenswrapper[4475]: E1203 06:45:36.090545 4475 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Dec 03 06:45:36 crc kubenswrapper[4475]: E1203 06:45:36.090572 4475 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Dec 03 06:45:36 crc kubenswrapper[4475]: E1203 06:45:36.090584 4475 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 03 06:45:36 crc kubenswrapper[4475]: E1203 06:45:36.090592 4475 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Dec 03 06:45:36 crc kubenswrapper[4475]: E1203 06:45:36.090603 4475 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Dec 03 06:45:36 crc kubenswrapper[4475]: E1203 06:45:36.090612 4475 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object 
"openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 03 06:45:36 crc kubenswrapper[4475]: E1203 06:45:36.090632 4475 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-12-03 06:45:40.090615939 +0000 UTC m=+24.895514272 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 03 06:45:36 crc kubenswrapper[4475]: E1203 06:45:36.090649 4475 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-12-03 06:45:40.090642879 +0000 UTC m=+24.895541214 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 03 06:45:36 crc kubenswrapper[4475]: I1203 06:45:36.109975 4475 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:33Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:33Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://30d9a05de148a1dbe0fa8f07bbc5f4f2c3cba395d686af03f2da63f8cdfe431c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:45:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/
var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://07cf8d993193bca34b30ea77c473af45652fde6e73d0586efb78c14b9d003e22\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:45:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:45:36Z is after 2025-08-24T17:21:41Z" Dec 03 06:45:36 crc kubenswrapper[4475]: I1203 06:45:36.123596 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:45:36 crc kubenswrapper[4475]: I1203 06:45:36.123705 4475 kubelet_node_status.go:724] "Recording event message for 
node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:45:36 crc kubenswrapper[4475]: I1203 06:45:36.123769 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:45:36 crc kubenswrapper[4475]: I1203 06:45:36.123830 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:45:36 crc kubenswrapper[4475]: I1203 06:45:36.123881 4475 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:45:36Z","lastTransitionTime":"2025-12-03T06:45:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 06:45:36 crc kubenswrapper[4475]: I1203 06:45:36.148437 4475 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:32Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:32Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:45:36Z is after 2025-08-24T17:21:41Z" Dec 03 06:45:36 crc kubenswrapper[4475]: I1203 06:45:36.194395 4475 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-k9cmc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7168f008-1b03-40cf-94fa-a71d470454bf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:35Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:35Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:35Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:35Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s6bbq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s6bbq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\"
:false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s6bbq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s6bbq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\
\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s6bbq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s6bbq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/s
erviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s6bbq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T06:45:35Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-k9cmc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:45:36Z is after 2025-08-24T17:21:41Z" Dec 03 06:45:36 crc kubenswrapper[4475]: I1203 06:45:36.225321 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:45:36 crc kubenswrapper[4475]: I1203 06:45:36.225354 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:45:36 crc kubenswrapper[4475]: I1203 06:45:36.225363 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:45:36 crc kubenswrapper[4475]: I1203 06:45:36.225376 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:45:36 crc kubenswrapper[4475]: I1203 06:45:36.225385 4475 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:45:36Z","lastTransitionTime":"2025-12-03T06:45:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:45:36 crc kubenswrapper[4475]: I1203 06:45:36.228358 4475 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-tjbzg" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"91aee7be-4a52-4598-803f-2deebe0674de\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:35Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:35Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:35Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xvqvg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xvqvg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T06:45:35Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-tjbzg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:45:36Z is after 2025-08-24T17:21:41Z" Dec 03 06:45:36 crc kubenswrapper[4475]: I1203 06:45:36.268745 4475 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"57d2f580-9528-4200-b0a4-797fed1ae972\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://822cdbfb2e81d80c5de0253daa42f2a5c89e9cd0eb8a5c3cf620780d17f9a6d0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:45:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mou
ntPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d66a9136874b2e25c94cd291aa6d7f4694ac409f16766fd69c8aab8068a441fb\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:45:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e40c4f29925f494c0f5f01e2ecbcd2e4db2a5f3911a55a874c6d0006f01982de\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:45:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7fc0ee9e5a408a0a9e701afaf1db7bc3f58fd1830044730e9c680664642b5e4e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f894
5c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:45:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1dd8bd42f01469966b55416fc8af1dd71d341c774263bb3a56190af4cd9e7daa\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:45:17Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5da1155d7b5e933e5db3acc4c1a3fa1b3b90fd79289641f9a3d1290956128628\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5da1155d7b5e933e5db3acc4c1a3fa1b3b90fd79289641f9a3d1290956
128628\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T06:45:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T06:45:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T06:45:15Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:45:36Z is after 2025-08-24T17:21:41Z" Dec 03 06:45:36 crc kubenswrapper[4475]: I1203 06:45:36.308092 4475 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a0af3d80-5aae-4d3b-a974-490687df49f3\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fa848c68a20d5db5c603cafa808518de84e427cbeea4bbc1be31151e6f839b3e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:45:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fe3e0d5fed18fddd7a1174f7a9f12290ce318e9a0de40fe432c79f6f2e24a608\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:45:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://43c05977da6544bc781a279fcddb3279dfee510fdd0a6f4f1a22b8629f17475f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:45:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ef987b2e9a0fa630edf6d5c06d5f47c5debd1b75d4626aefe7d8ef44bb974eb8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-12-03T06:45:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T06:45:15Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:45:36Z is after 2025-08-24T17:21:41Z" Dec 03 06:45:36 crc kubenswrapper[4475]: I1203 06:45:36.326927 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:45:36 crc kubenswrapper[4475]: I1203 06:45:36.326946 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:45:36 crc kubenswrapper[4475]: I1203 06:45:36.326954 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:45:36 crc kubenswrapper[4475]: I1203 06:45:36.326965 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:45:36 crc kubenswrapper[4475]: I1203 06:45:36.326973 4475 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:45:36Z","lastTransitionTime":"2025-12-03T06:45:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns 
error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 06:45:36 crc kubenswrapper[4475]: I1203 06:45:36.348201 4475 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:32Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:32Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:45:36Z is after 2025-08-24T17:21:41Z" Dec 03 06:45:36 crc kubenswrapper[4475]: I1203 06:45:36.391422 4475 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"897f1a97-930a-4c3c-8804-d7cd6006ae9c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fbb015d3e05f9f94fc225cce6e24bc4a5df0bfc5aaea15fe120e2cc4b8f02902\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:45:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://da747a5ea4f790c71d99693c4bd79a1074f756a20f628fa63e8bad9a713645fc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:45:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6bf56315b6ad05ea9af0319db29b919ed0332d2a671c5ba94ea325bd45ef5703\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:45:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e045b99328661616ea0e44cd50bd394a403836eede05459d117567c191401172\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:45:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://054b1d2565cc9690152740f71682028595283525344a38ccea66c1f072eae92b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:45:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d5e0ad88c2e55994f952b46c2e806792d8fcbd79a901810aef92e46067cc7b92\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d5e0ad88c2e55994f952b46c2e806792d8fcbd79a901810aef92e46067cc7b92\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2025-12-03T06:45:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T06:45:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://22796f78d3d551f1ee271ca8581e196f142e70622944154f7d408a88c098f53b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://22796f78d3d551f1ee271ca8581e196f142e70622944154f7d408a88c098f53b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T06:45:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T06:45:16Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://9fb973559072f07252dcf50bda74d422ea2ed50000c02105381f8d21e5ff9888\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9fb973559072f07252dcf50bda74d422ea2ed50000c02105381f8d21e5ff9888\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T06:45:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T06:45:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T06:45:15Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:45:36Z is after 2025-08-24T17:21:41Z" Dec 03 06:45:36 crc kubenswrapper[4475]: I1203 06:45:36.429032 4475 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:32Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:32Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:45:36Z is after 2025-08-24T17:21:41Z" Dec 03 06:45:36 crc kubenswrapper[4475]: I1203 06:45:36.430289 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:45:36 crc kubenswrapper[4475]: I1203 06:45:36.430377 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:45:36 crc kubenswrapper[4475]: I1203 06:45:36.430432 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:45:36 crc 
kubenswrapper[4475]: I1203 06:45:36.430501 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:45:36 crc kubenswrapper[4475]: I1203 06:45:36.430550 4475 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:45:36Z","lastTransitionTime":"2025-12-03T06:45:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 06:45:36 crc kubenswrapper[4475]: I1203 06:45:36.469597 4475 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-k9cmc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7168f008-1b03-40cf-94fa-a71d470454bf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:35Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:35Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:35Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:35Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s6bbq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s6bbq\\\",\\\"readOnly\\\":true,\\\"
recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s6bbq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s6bbq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-de
v@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s6bbq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s6bbq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{
\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s6bbq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T06:45:35Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-k9cmc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:45:36Z is after 2025-08-24T17:21:41Z" Dec 03 06:45:36 crc kubenswrapper[4475]: I1203 06:45:36.490405 4475 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 03 06:45:36 crc kubenswrapper[4475]: E1203 06:45:36.490505 4475 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 03 06:45:36 crc kubenswrapper[4475]: I1203 06:45:36.490554 4475 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 03 06:45:36 crc kubenswrapper[4475]: E1203 06:45:36.490666 4475 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 03 06:45:36 crc kubenswrapper[4475]: I1203 06:45:36.490560 4475 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 03 06:45:36 crc kubenswrapper[4475]: E1203 06:45:36.490782 4475 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 03 06:45:36 crc kubenswrapper[4475]: I1203 06:45:36.508701 4475 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-tjbzg" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"91aee7be-4a52-4598-803f-2deebe0674de\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:35Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:35Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:35Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xvqvg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xvqvg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T06:45:35Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-tjbzg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:45:36Z is after 2025-08-24T17:21:41Z" Dec 03 06:45:36 crc kubenswrapper[4475]: I1203 06:45:36.532810 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:45:36 crc kubenswrapper[4475]: I1203 06:45:36.532919 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:45:36 crc kubenswrapper[4475]: I1203 06:45:36.532976 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:45:36 crc kubenswrapper[4475]: I1203 06:45:36.533030 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:45:36 crc kubenswrapper[4475]: I1203 06:45:36.533105 4475 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:45:36Z","lastTransitionTime":"2025-12-03T06:45:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:45:36 crc kubenswrapper[4475]: I1203 06:45:36.548833 4475 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"57d2f580-9528-4200-b0a4-797fed1ae972\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://822cdbfb2e81d80c5de0253daa42f2a5c89e9cd0eb8a5c3cf620780d17f9a6d0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:45:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-d
ir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d66a9136874b2e25c94cd291aa6d7f4694ac409f16766fd69c8aab8068a441fb\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:45:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e40c4f29925f494c0f5f01e2ecbcd2e4db2a5f3911a55a874c6d0006f01982de\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:45:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7fc0ee9e5a408a0a9e701afaf1db7bc3f58fd1830044730e9c680664642b5e4e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945
c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:45:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1dd8bd42f01469966b55416fc8af1dd71d341c774263bb3a56190af4cd9e7daa\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:45:17Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5da1155d7b5e933e5db3acc4c1a3fa1b3b90fd79289641f9a3d1290956128628\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5da1155d7b5e933e5db3acc4c1a3fa1b3b90fd79289641f9a3d1290956128628\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T06:45:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T06:45:16Z\\\"}
},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T06:45:15Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:45:36Z is after 2025-08-24T17:21:41Z" Dec 03 06:45:36 crc kubenswrapper[4475]: I1203 06:45:36.584245 4475 generic.go:334] "Generic (PLEG): container finished" podID="8f42839e-dbc4-445a-a15b-c3aa14813958" containerID="400610ebcdc7d47ecc1345287847a1909871411a12cdb3cbf895e05039b81c2b" exitCode=0 Dec 03 06:45:36 crc kubenswrapper[4475]: I1203 06:45:36.584317 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-g9t4l" event={"ID":"8f42839e-dbc4-445a-a15b-c3aa14813958","Type":"ContainerDied","Data":"400610ebcdc7d47ecc1345287847a1909871411a12cdb3cbf895e05039b81c2b"} Dec 03 06:45:36 crc kubenswrapper[4475]: I1203 06:45:36.584359 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-g9t4l" event={"ID":"8f42839e-dbc4-445a-a15b-c3aa14813958","Type":"ContainerStarted","Data":"b4b141100ea052faea009e86b4836d44db60d742453d01254879de450e50a718"} Dec 03 06:45:36 crc kubenswrapper[4475]: I1203 06:45:36.586835 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-tjbzg" event={"ID":"91aee7be-4a52-4598-803f-2deebe0674de","Type":"ContainerStarted","Data":"f13f644093fd1214d8fb39853857b4113dd7fde64f1a60ff6848fd4c5350f5b6"} Dec 03 06:45:36 crc kubenswrapper[4475]: I1203 06:45:36.586957 4475 kubelet.go:2453] "SyncLoop (PLEG): event 
for pod" pod="openshift-machine-config-operator/machine-config-daemon-tjbzg" event={"ID":"91aee7be-4a52-4598-803f-2deebe0674de","Type":"ContainerStarted","Data":"159d103ae2d5d19ea94c57a59b534773f0e32f4cb379a412b63ca743e221096e"} Dec 03 06:45:36 crc kubenswrapper[4475]: I1203 06:45:36.587016 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-tjbzg" event={"ID":"91aee7be-4a52-4598-803f-2deebe0674de","Type":"ContainerStarted","Data":"d98f25bd81a27453176faf52e7fa9bb2833bb61bde5650f095f2e7549cc89c7d"} Dec 03 06:45:36 crc kubenswrapper[4475]: I1203 06:45:36.587864 4475 generic.go:334] "Generic (PLEG): container finished" podID="7168f008-1b03-40cf-94fa-a71d470454bf" containerID="31584b054f88aa7f7e4f1096e2b11acf6f106b7f2e4ced19768808e5df1a6acc" exitCode=0 Dec 03 06:45:36 crc kubenswrapper[4475]: I1203 06:45:36.587990 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-k9cmc" event={"ID":"7168f008-1b03-40cf-94fa-a71d470454bf","Type":"ContainerDied","Data":"31584b054f88aa7f7e4f1096e2b11acf6f106b7f2e4ced19768808e5df1a6acc"} Dec 03 06:45:36 crc kubenswrapper[4475]: I1203 06:45:36.588066 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-k9cmc" event={"ID":"7168f008-1b03-40cf-94fa-a71d470454bf","Type":"ContainerStarted","Data":"04e36bf79a39757ef034e665a5ac8636a89e5e1dfe572d1ee2f5853814d46fca"} Dec 03 06:45:36 crc kubenswrapper[4475]: I1203 06:45:36.590425 4475 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a0af3d80-5aae-4d3b-a974-490687df49f3\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fa848c68a20d5db5c603cafa808518de84e427cbeea4bbc1be31151e6f839b3e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:45:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fe3e0d5fed18fddd7a1174f7a9f12290ce318e9a0de40fe432c79f6f2e24a608\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:45:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://43c05977da6544bc781a279fcddb3279dfee510fdd0a6f4f1a22b8629f17475f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:45:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ef987b2e9a0fa630edf6d5c06d5f47c5debd1b75d4626aefe7d8ef44bb974eb8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-12-03T06:45:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T06:45:15Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:45:36Z is after 2025-08-24T17:21:41Z" Dec 03 06:45:36 crc kubenswrapper[4475]: I1203 06:45:36.628610 4475 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:32Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:32Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:45:36Z is after 2025-08-24T17:21:41Z" Dec 03 06:45:36 crc kubenswrapper[4475]: I1203 06:45:36.635432 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:45:36 crc kubenswrapper[4475]: I1203 06:45:36.635578 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:45:36 crc kubenswrapper[4475]: I1203 06:45:36.635588 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:45:36 crc kubenswrapper[4475]: I1203 
06:45:36.635600 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:45:36 crc kubenswrapper[4475]: I1203 06:45:36.635608 4475 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:45:36Z","lastTransitionTime":"2025-12-03T06:45:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 06:45:36 crc kubenswrapper[4475]: I1203 06:45:36.680166 4475 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"897f1a97-930a-4c3c-8804-d7cd6006ae9c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fbb015d3e05f9f94fc225cce6e24bc4a5df0bfc5aaea15fe120e2cc4b8f02902\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@s
ha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:45:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://da747a5ea4f790c71d99693c4bd79a1074f756a20f628fa63e8bad9a713645fc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:45:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6bf56315b6ad05ea9af0319db29b919ed0332d2a671c5ba94ea325bd45ef5703\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\
"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:45:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e045b99328661616ea0e44cd50bd394a403836eede05459d117567c191401172\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:45:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://054b1d2565cc9690152740f71682028595283525344a38ccea66c1f072eae92b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:45:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-
dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d5e0ad88c2e55994f952b46c2e806792d8fcbd79a901810aef92e46067cc7b92\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d5e0ad88c2e55994f952b46c2e806792d8fcbd79a901810aef92e46067cc7b92\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T06:45:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T06:45:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://22796f78d3d551f1ee271ca8581e196f142e70622944154f7d408a88c098f53b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://22796f78d3d551f1ee271ca8581e196f142e70622944154f7d408a88c098f53b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T06:45:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T06:45:16Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://9fb973559072f07252dcf50bda74d422ea2ed50000c02105381f8d21e5ff9888\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ec
d6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9fb973559072f07252dcf50bda74d422ea2ed50000c02105381f8d21e5ff9888\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T06:45:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T06:45:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T06:45:15Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:45:36Z is after 2025-08-24T17:21:41Z" Dec 03 06:45:36 crc kubenswrapper[4475]: I1203 06:45:36.707846 4475 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:34Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:34Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6444fe7571ebb90d4ff4b30dc1a397023310b50b1816d0197cb545b4f5f7480f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:45:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-12-03T06:45:36Z is after 2025-08-24T17:21:41Z" Dec 03 06:45:36 crc kubenswrapper[4475]: I1203 06:45:36.737329 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:45:36 crc kubenswrapper[4475]: I1203 06:45:36.737352 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:45:36 crc kubenswrapper[4475]: I1203 06:45:36.737360 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:45:36 crc kubenswrapper[4475]: I1203 06:45:36.737371 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:45:36 crc kubenswrapper[4475]: I1203 06:45:36.737379 4475 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:45:36Z","lastTransitionTime":"2025-12-03T06:45:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:45:36 crc kubenswrapper[4475]: I1203 06:45:36.749859 4475 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:32Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:32Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:45:36Z is after 2025-08-24T17:21:41Z" Dec 03 06:45:36 crc kubenswrapper[4475]: I1203 06:45:36.786750 4475 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-pcw7j" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8c1979d0-303c-4cf6-9087-3cb2e1aac73b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://eebaa73cf4e1efd781b258dd26910dc004392716180b14a7e64e89a03f2032a1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:45:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nrf7r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T06:45:34Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-pcw7j\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:45:36Z is after 2025-08-24T17:21:41Z" Dec 03 06:45:36 crc kubenswrapper[4475]: I1203 06:45:36.829471 4475 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-9b2j8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f3a17c67-95e0-4889-8a30-64c08b6720f4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d2d627e2c307a8db9c86e8020f2b1c25c6e061e0c6460be63e231566488beaca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\
\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:45:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6pdk6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\
\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T06:45:34Z\\\"}}\" for pod \"openshift-multus\"/\"multus-9b2j8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:45:36Z is after 2025-08-24T17:21:41Z" Dec 03 06:45:36 crc kubenswrapper[4475]: I1203 06:45:36.839059 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:45:36 crc kubenswrapper[4475]: I1203 06:45:36.839092 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:45:36 crc kubenswrapper[4475]: I1203 06:45:36.839102 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:45:36 crc kubenswrapper[4475]: I1203 06:45:36.839115 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:45:36 crc kubenswrapper[4475]: I1203 06:45:36.839133 4475 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:45:36Z","lastTransitionTime":"2025-12-03T06:45:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:45:36 crc kubenswrapper[4475]: I1203 06:45:36.873011 4475 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-g9t4l" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8f42839e-dbc4-445a-a15b-c3aa14813958\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:35Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:35Z\\\",\\\"message\\\":\\\"containers with incomplete status: [kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:35Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:35Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ppdm2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ppdm2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ppdm2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ppdm2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ppdm2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ppdm2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ppdm2\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ppdm2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ppdm2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T06:45:35Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-g9t4l\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:45:36Z is after 2025-08-24T17:21:41Z" Dec 03 06:45:36 crc kubenswrapper[4475]: I1203 06:45:36.908570 4475 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:33Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:33Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2f651c16a4a98ff0a9b4783e60ece4c410d5fcb7d05ad42bf7842d8bb8a99f27\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:45:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"m
etrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:45:36Z is after 2025-08-24T17:21:41Z" Dec 03 06:45:36 crc kubenswrapper[4475]: I1203 06:45:36.941014 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:45:36 crc kubenswrapper[4475]: I1203 06:45:36.941038 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:45:36 crc kubenswrapper[4475]: I1203 06:45:36.941046 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:45:36 crc kubenswrapper[4475]: I1203 06:45:36.941058 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:45:36 crc kubenswrapper[4475]: I1203 06:45:36.941066 4475 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:45:36Z","lastTransitionTime":"2025-12-03T06:45:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:45:36 crc kubenswrapper[4475]: I1203 06:45:36.947605 4475 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:33Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:33Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://30d9a05de148a1dbe0fa8f07bbc5f4f2c3cba395d686af03f2da63f8cdfe431c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:45:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://07cf8d993193bca34b30ea77c473af45652fde6e73d0586efb78c14b9d003e22\\\",\\\
"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:45:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:45:36Z is after 2025-08-24T17:21:41Z" Dec 03 06:45:36 crc kubenswrapper[4475]: I1203 06:45:36.988208 4475 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"57d2f580-9528-4200-b0a4-797fed1ae972\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://822cdbfb2e81d80c5de0253daa42f2a5c89e9cd0eb8a5c3cf620780d17f9a6d0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:45:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d66a9136874b2e25c94cd291aa6d7f4694ac409f16766fd69c8aab8068a441fb\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:45:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e40c4f29925f494c0f5f01e2ecbcd2e4db2a5f3911a55a874c6d0006f01982de\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:45:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7fc0ee9e5a408a0a9e701afaf1db7bc3f58fd1830044730e9c680664642b5e4e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:45:1
7Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1dd8bd42f01469966b55416fc8af1dd71d341c774263bb3a56190af4cd9e7daa\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:45:17Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5da1155d7b5e933e5db3acc4c1a3fa1b3b90fd79289641f9a3d1290956128628\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5da1155d7b5e933e5db3acc4c1a3fa1b3b90fd79289641f9a3d1290956128628\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T06:45:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T06:45:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startT
ime\\\":\\\"2025-12-03T06:45:15Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:45:36Z is after 2025-08-24T17:21:41Z" Dec 03 06:45:37 crc kubenswrapper[4475]: I1203 06:45:37.027731 4475 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a0af3d80-5aae-4d3b-a974-490687df49f3\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fa848c68a20d5db5c603cafa808518de84e427cbeea4bbc1be31151e6f839b3e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\
\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:45:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fe3e0d5fed18fddd7a1174f7a9f12290ce318e9a0de40fe432c79f6f2e24a608\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:45:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://43c05977da6544bc781a279fcddb3279dfee510fdd0a6f4f1a22b8629f17475f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:45:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"c
ert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ef987b2e9a0fa630edf6d5c06d5f47c5debd1b75d4626aefe7d8ef44bb974eb8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:45:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T06:45:15Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:45:37Z is after 2025-08-24T17:21:41Z" Dec 03 06:45:37 crc kubenswrapper[4475]: I1203 06:45:37.043431 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:45:37 crc kubenswrapper[4475]: I1203 06:45:37.043469 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:45:37 crc kubenswrapper[4475]: I1203 06:45:37.043477 4475 kubelet_node_status.go:724] 
"Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:45:37 crc kubenswrapper[4475]: I1203 06:45:37.043492 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:45:37 crc kubenswrapper[4475]: I1203 06:45:37.043500 4475 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:45:37Z","lastTransitionTime":"2025-12-03T06:45:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 06:45:37 crc kubenswrapper[4475]: I1203 06:45:37.067699 4475 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:32Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:32Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:45:37Z is after 2025-08-24T17:21:41Z" Dec 03 06:45:37 crc kubenswrapper[4475]: I1203 06:45:37.106013 4475 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-tjbzg" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"91aee7be-4a52-4598-803f-2deebe0674de\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f13f644093fd1214d8fb39853857b4113dd7fde64f1a60ff6848fd4c5350f5b6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:45:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xvqvg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://159d103ae2d5d19ea94c57a59b534773f0e32f4c
b379a412b63ca743e221096e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:45:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xvqvg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T06:45:35Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-tjbzg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:45:37Z is after 2025-08-24T17:21:41Z" Dec 03 06:45:37 crc kubenswrapper[4475]: I1203 06:45:37.144883 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:45:37 crc kubenswrapper[4475]: I1203 06:45:37.144909 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:45:37 crc kubenswrapper[4475]: I1203 06:45:37.144917 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:45:37 crc 
kubenswrapper[4475]: I1203 06:45:37.144928 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:45:37 crc kubenswrapper[4475]: I1203 06:45:37.144936 4475 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:45:37Z","lastTransitionTime":"2025-12-03T06:45:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 06:45:37 crc kubenswrapper[4475]: I1203 06:45:37.153104 4475 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"897f1a97-930a-4c3c-8804-d7cd6006ae9c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fbb015d3e05f9f94fc225cce6e24bc4a5df0bfc5aaea15fe120e2cc4b8f02902\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-rel
ease-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:45:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://da747a5ea4f790c71d99693c4bd79a1074f756a20f628fa63e8bad9a713645fc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:45:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6bf56315b6ad05ea9af0319db29b919ed0332d2a671c5ba94ea325bd45ef5703\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true
,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:45:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e045b99328661616ea0e44cd50bd394a403836eede05459d117567c191401172\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:45:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://054b1d2565cc9690152740f71682028595283525344a38ccea66c1f072eae92b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:45:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/
\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d5e0ad88c2e55994f952b46c2e806792d8fcbd79a901810aef92e46067cc7b92\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d5e0ad88c2e55994f952b46c2e806792d8fcbd79a901810aef92e46067cc7b92\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T06:45:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T06:45:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://22796f78d3d551f1ee271ca8581e196f142e70622944154f7d408a88c098f53b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://22796f78d3d551f1ee271ca8581e196f142e70622944154f7d408a88c098f53b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T06:45:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T06:45:16Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://9fb973559072f07252dcf50bda74d422ea2ed50000c02105381f8d21e5ff9888\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9fb973559072f07252dcf50bda74d422ea2ed50000c02105381f8d21e5ff9888\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T06:45:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T06:45:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T06:45:15Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:45:37Z is after 2025-08-24T17:21:41Z" Dec 03 06:45:37 crc kubenswrapper[4475]: I1203 06:45:37.185904 4475 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-pcw7j" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8c1979d0-303c-4cf6-9087-3cb2e1aac73b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://eebaa73cf4e1efd781b258dd26910dc004392716180b14a7e64e89a03f2032a1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:45:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nrf7r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T06:45:34Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-pcw7j\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:45:37Z is after 2025-08-24T17:21:41Z" Dec 03 06:45:37 crc kubenswrapper[4475]: I1203 06:45:37.227882 4475 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-9b2j8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f3a17c67-95e0-4889-8a30-64c08b6720f4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d2d627e2c307a8db9c86e8020f2b1c25c6e061e0c6460be63e231566488beaca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\
\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:45:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6pdk6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\
\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T06:45:34Z\\\"}}\" for pod \"openshift-multus\"/\"multus-9b2j8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:45:37Z is after 2025-08-24T17:21:41Z" Dec 03 06:45:37 crc kubenswrapper[4475]: I1203 06:45:37.246745 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:45:37 crc kubenswrapper[4475]: I1203 06:45:37.246769 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:45:37 crc kubenswrapper[4475]: I1203 06:45:37.246778 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:45:37 crc kubenswrapper[4475]: I1203 06:45:37.246789 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:45:37 crc kubenswrapper[4475]: I1203 06:45:37.246798 4475 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:45:37Z","lastTransitionTime":"2025-12-03T06:45:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:45:37 crc kubenswrapper[4475]: I1203 06:45:37.271273 4475 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-g9t4l" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8f42839e-dbc4-445a-a15b-c3aa14813958\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:35Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:35Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ppdm2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ppdm2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ppdm2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ppdm2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ppdm2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ppdm2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ppdm2\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ppdm2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://400610ebcdc7d47ecc1345287847a1909871411a12cdb3cbf895e05039b81c2b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://400610ebcdc7d47ecc1345287847a1909871411a12cdb3cbf895e05039b81c2b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T06:45:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T06:45:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ppdm2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T06:45:35Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-g9t4l\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:45:37Z is after 2025-08-24T17:21:41Z" Dec 03 06:45:37 crc kubenswrapper[4475]: I1203 06:45:37.308258 4475 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:33Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:33Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2f651c16a4a98ff0a9b4783e60ece4c410d5fcb7d05ad42bf7842d8bb8a99f27\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\"
:{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:45:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:45:37Z is after 2025-08-24T17:21:41Z" Dec 03 06:45:37 crc kubenswrapper[4475]: I1203 06:45:37.347717 4475 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:33Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:33Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://30d9a05de148a1dbe0fa8f07bbc5f4f2c3cba395d686af03f2da63f8cdfe431c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:45:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://07cf8d993193bca34b30ea77c473af45652fde6e73d0586efb78c14b9d003e22\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:45:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:45:37Z is after 2025-08-24T17:21:41Z" Dec 03 06:45:37 crc kubenswrapper[4475]: I1203 06:45:37.348416 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:45:37 crc kubenswrapper[4475]: I1203 06:45:37.348442 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:45:37 crc kubenswrapper[4475]: I1203 06:45:37.348469 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:45:37 crc kubenswrapper[4475]: I1203 06:45:37.348481 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:45:37 crc kubenswrapper[4475]: I1203 06:45:37.348488 4475 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:45:37Z","lastTransitionTime":"2025-12-03T06:45:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 06:45:37 crc kubenswrapper[4475]: I1203 06:45:37.387893 4475 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:34Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:34Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6444fe7571ebb90d4ff4b30dc1a397023310b50b1816d0197cb545b4f5f7480f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:45:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\
"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:45:37Z is after 2025-08-24T17:21:41Z" Dec 03 06:45:37 crc kubenswrapper[4475]: I1203 06:45:37.427875 4475 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:32Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:32Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:45:37Z is after 2025-08-24T17:21:41Z" Dec 03 06:45:37 crc kubenswrapper[4475]: I1203 06:45:37.450724 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:45:37 crc kubenswrapper[4475]: I1203 06:45:37.450756 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:45:37 crc kubenswrapper[4475]: I1203 06:45:37.450765 4475 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:45:37 crc kubenswrapper[4475]: I1203 06:45:37.450778 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:45:37 crc kubenswrapper[4475]: I1203 06:45:37.450786 4475 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:45:37Z","lastTransitionTime":"2025-12-03T06:45:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 06:45:37 crc kubenswrapper[4475]: I1203 06:45:37.467292 4475 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:32Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:32Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:45:37Z is after 2025-08-24T17:21:41Z" Dec 03 06:45:37 crc kubenswrapper[4475]: I1203 06:45:37.508141 4475 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-k9cmc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7168f008-1b03-40cf-94fa-a71d470454bf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:35Z\\\",\\\"message\\\":\\\"containers with incomplete status: [cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:35Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:35Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s6bbq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://31584b054f88aa7f7e4f1096e2b11acf6f106b7f2e4ced19768808e5df1a6acc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://31584b054f88aa7f7e4f1096e2b11acf6f106b7f2e4ced19768808e5df1a6acc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T06:45:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T06:45:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s6bbq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s6bbq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube
-api-access-s6bbq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s6bbq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s6bbq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\
\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s6bbq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T06:45:35Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-k9cmc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:45:37Z is after 2025-08-24T17:21:41Z" Dec 03 06:45:37 crc kubenswrapper[4475]: I1203 06:45:37.552868 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:45:37 crc kubenswrapper[4475]: I1203 06:45:37.552892 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:45:37 crc kubenswrapper[4475]: I1203 06:45:37.552902 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:45:37 crc kubenswrapper[4475]: I1203 06:45:37.552913 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:45:37 crc kubenswrapper[4475]: I1203 06:45:37.552922 4475 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:45:37Z","lastTransitionTime":"2025-12-03T06:45:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 06:45:37 crc kubenswrapper[4475]: I1203 06:45:37.590863 4475 generic.go:334] "Generic (PLEG): container finished" podID="7168f008-1b03-40cf-94fa-a71d470454bf" containerID="9a644e827feb786d7298e41022ef3bc0d2483279c106dddea8e2c7a3c62c3c0d" exitCode=0 Dec 03 06:45:37 crc kubenswrapper[4475]: I1203 06:45:37.590906 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-k9cmc" event={"ID":"7168f008-1b03-40cf-94fa-a71d470454bf","Type":"ContainerDied","Data":"9a644e827feb786d7298e41022ef3bc0d2483279c106dddea8e2c7a3c62c3c0d"} Dec 03 06:45:37 crc kubenswrapper[4475]: I1203 06:45:37.595499 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-g9t4l" event={"ID":"8f42839e-dbc4-445a-a15b-c3aa14813958","Type":"ContainerStarted","Data":"60d3ec7cab1f249e81ae1db9ab97fa02e8b3c9d8376af4c6682dc3fc6f9d6d92"} Dec 03 06:45:37 crc kubenswrapper[4475]: I1203 06:45:37.595521 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-g9t4l" event={"ID":"8f42839e-dbc4-445a-a15b-c3aa14813958","Type":"ContainerStarted","Data":"b3243c863a4fb593b39fc3e3b835f647e9373d8b2dec69c5ff7657ed73c8f78a"} Dec 03 06:45:37 crc kubenswrapper[4475]: I1203 06:45:37.595532 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-g9t4l" event={"ID":"8f42839e-dbc4-445a-a15b-c3aa14813958","Type":"ContainerStarted","Data":"a5090474cca8b8e2ed539ea74377506638d300be7eb750b3f3285477d8c9a375"} Dec 03 06:45:37 crc kubenswrapper[4475]: I1203 06:45:37.595540 4475 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-g9t4l" event={"ID":"8f42839e-dbc4-445a-a15b-c3aa14813958","Type":"ContainerStarted","Data":"53948489397bbbfdf5f766211088d7f12fcd2dfbc8c3da6493e5abc49e3b41f5"} Dec 03 06:45:37 crc kubenswrapper[4475]: I1203 06:45:37.595547 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-g9t4l" event={"ID":"8f42839e-dbc4-445a-a15b-c3aa14813958","Type":"ContainerStarted","Data":"32897756f3658fda95db77180a0553a9d8656ed49c3ae5a017d32f5c5133a5a9"} Dec 03 06:45:37 crc kubenswrapper[4475]: I1203 06:45:37.595555 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-g9t4l" event={"ID":"8f42839e-dbc4-445a-a15b-c3aa14813958","Type":"ContainerStarted","Data":"5e288f95676d5823cd3cb005318489d2f629a8fb74ad17ce6a67978d76006192"} Dec 03 06:45:37 crc kubenswrapper[4475]: I1203 06:45:37.602248 4475 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:33Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:33Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2f651c16a4a98ff0a9b4783e60ece4c410d5fcb7d05ad42bf7842d8bb8a99f27\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:45:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-12-03T06:45:37Z is after 2025-08-24T17:21:41Z" Dec 03 06:45:37 crc kubenswrapper[4475]: I1203 06:45:37.614307 4475 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:33Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:33Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://30d9a05de148a1dbe0fa8f07bbc5f4f2c3cba395d686af03f2da63f8cdfe431c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:45:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"c
ri-o://07cf8d993193bca34b30ea77c473af45652fde6e73d0586efb78c14b9d003e22\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:45:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:45:37Z is after 2025-08-24T17:21:41Z" Dec 03 06:45:37 crc kubenswrapper[4475]: I1203 06:45:37.627715 4475 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:34Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:34Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6444fe7571ebb90d4ff4b30dc1a397023310b50b1816d0197cb545b4f5f7480f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:45:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-12-03T06:45:37Z is after 2025-08-24T17:21:41Z" Dec 03 06:45:37 crc kubenswrapper[4475]: I1203 06:45:37.654471 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:45:37 crc kubenswrapper[4475]: I1203 06:45:37.654498 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:45:37 crc kubenswrapper[4475]: I1203 06:45:37.654506 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:45:37 crc kubenswrapper[4475]: I1203 06:45:37.654517 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:45:37 crc kubenswrapper[4475]: I1203 06:45:37.654526 4475 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:45:37Z","lastTransitionTime":"2025-12-03T06:45:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:45:37 crc kubenswrapper[4475]: I1203 06:45:37.670421 4475 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:32Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:32Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:45:37Z is after 2025-08-24T17:21:41Z" Dec 03 06:45:37 crc kubenswrapper[4475]: I1203 06:45:37.707259 4475 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-pcw7j" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8c1979d0-303c-4cf6-9087-3cb2e1aac73b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://eebaa73cf4e1efd781b258dd26910dc004392716180b14a7e64e89a03f2032a1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:45:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nrf7r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T06:45:34Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-pcw7j\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:45:37Z is after 2025-08-24T17:21:41Z" Dec 03 06:45:37 crc kubenswrapper[4475]: I1203 06:45:37.750134 4475 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-9b2j8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f3a17c67-95e0-4889-8a30-64c08b6720f4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d2d627e2c307a8db9c86e8020f2b1c25c6e061e0c6460be63e231566488beaca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\
\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:45:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6pdk6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\
\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T06:45:34Z\\\"}}\" for pod \"openshift-multus\"/\"multus-9b2j8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:45:37Z is after 2025-08-24T17:21:41Z" Dec 03 06:45:37 crc kubenswrapper[4475]: I1203 06:45:37.756422 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:45:37 crc kubenswrapper[4475]: I1203 06:45:37.756468 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:45:37 crc kubenswrapper[4475]: I1203 06:45:37.756477 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:45:37 crc kubenswrapper[4475]: I1203 06:45:37.756490 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:45:37 crc kubenswrapper[4475]: I1203 06:45:37.756498 4475 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:45:37Z","lastTransitionTime":"2025-12-03T06:45:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:45:37 crc kubenswrapper[4475]: I1203 06:45:37.792212 4475 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-g9t4l" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8f42839e-dbc4-445a-a15b-c3aa14813958\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:35Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:35Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ppdm2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ppdm2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ppdm2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ppdm2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ppdm2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ppdm2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ppdm2\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ppdm2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://400610ebcdc7d47ecc1345287847a1909871411a12cdb3cbf895e05039b81c2b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://400610ebcdc7d47ecc1345287847a1909871411a12cdb3cbf895e05039b81c2b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T06:45:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T06:45:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ppdm2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T06:45:35Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-g9t4l\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:45:37Z is after 2025-08-24T17:21:41Z" Dec 03 06:45:37 crc kubenswrapper[4475]: I1203 06:45:37.829497 4475 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-k9cmc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7168f008-1b03-40cf-94fa-a71d470454bf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:35Z\\\",\\\"message\\\":\\\"containers with incomplete status: [bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:35Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:35Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s6bbq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://31584b054f88aa7f7e4f1096e2b11acf6f106b7f2e4ced19768808e5df1a6acc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://31584b054f88aa7f7e4f1096e2b11acf6f106b7f2e4ced19768808e5df1a6acc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T06:45:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T06:45:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"moun
tPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s6bbq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9a644e827feb786d7298e41022ef3bc0d2483279c106dddea8e2c7a3c62c3c0d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9a644e827feb786d7298e41022ef3bc0d2483279c106dddea8e2c7a3c62c3c0d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T06:45:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T06:45:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s6bbq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha2
56:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s6bbq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s6bbq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"re
ason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s6bbq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s6bbq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T06:45:35Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-k9cmc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:45:37Z is after 2025-08-24T17:21:41Z" Dec 03 06:45:37 crc kubenswrapper[4475]: I1203 06:45:37.858543 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:45:37 crc 
kubenswrapper[4475]: I1203 06:45:37.858567 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:45:37 crc kubenswrapper[4475]: I1203 06:45:37.858576 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:45:37 crc kubenswrapper[4475]: I1203 06:45:37.858587 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:45:37 crc kubenswrapper[4475]: I1203 06:45:37.858594 4475 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:45:37Z","lastTransitionTime":"2025-12-03T06:45:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 06:45:37 crc kubenswrapper[4475]: I1203 06:45:37.866791 4475 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:32Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:32Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:45:37Z is after 2025-08-24T17:21:41Z" Dec 03 06:45:37 crc kubenswrapper[4475]: I1203 06:45:37.907659 4475 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a0af3d80-5aae-4d3b-a974-490687df49f3\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fa848c68a20d5db5c603cafa808518de84e427cbeea4bbc1be31151e6f839b3e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:45:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fe3e0d5fed18fddd7a1174f7a9f12290ce318e9a0de40fe432c79f6f2e24a608\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:45:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://43c05977da6544bc781a279fcddb3279dfee510fdd0a6f4f1a22b8629f17475f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:45:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ef987b2e9a0fa630edf6d5c06d5f47c5debd1b75d4626aefe7d8ef44bb974eb8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-12-03T06:45:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T06:45:15Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:45:37Z is after 2025-08-24T17:21:41Z" Dec 03 06:45:37 crc kubenswrapper[4475]: I1203 06:45:37.948012 4475 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:32Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:32Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:45:37Z is after 2025-08-24T17:21:41Z" Dec 03 06:45:37 crc kubenswrapper[4475]: I1203 06:45:37.960268 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:45:37 crc kubenswrapper[4475]: I1203 06:45:37.960297 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:45:37 crc kubenswrapper[4475]: I1203 06:45:37.960307 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:45:37 crc kubenswrapper[4475]: I1203 
06:45:37.960319 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:45:37 crc kubenswrapper[4475]: I1203 06:45:37.960328 4475 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:45:37Z","lastTransitionTime":"2025-12-03T06:45:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 06:45:37 crc kubenswrapper[4475]: I1203 06:45:37.986924 4475 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-tjbzg" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"91aee7be-4a52-4598-803f-2deebe0674de\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f13f644093fd1214d8fb39853857b4113dd7fde64f1a60ff6848fd4c5350f5b6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4e
f318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:45:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xvqvg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://159d103ae2d5d19ea94c57a59b534773f0e32f4cb379a412b63ca743e221096e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:45:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xvqvg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T06:45:35Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-tjbzg\": Internal error occurred: 
failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:45:37Z is after 2025-08-24T17:21:41Z" Dec 03 06:45:38 crc kubenswrapper[4475]: I1203 06:45:38.028710 4475 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"57d2f580-9528-4200-b0a4-797fed1ae972\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://822cdbfb2e81d80c5de0253daa42f2a5c89e9cd0eb8a5c3cf620780d17f9a6d0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:45:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountP
ath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d66a9136874b2e25c94cd291aa6d7f4694ac409f16766fd69c8aab8068a441fb\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:45:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e40c4f29925f494c0f5f01e2ecbcd2e4db2a5f3911a55a874c6d0006f01982de\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:45:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7fc0ee9e5a408a0a9e701afaf1db7bc3f58fd1830044730e9c680664642b5e4e\\\",\\\"image\\\":\\\"quay.io/
crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:45:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1dd8bd42f01469966b55416fc8af1dd71d341c774263bb3a56190af4cd9e7daa\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:45:17Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5da1155d7b5e933e5db3acc4c1a3fa1b3b90fd79289641f9a3d1290956128628\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated
\\\":{\\\"containerID\\\":\\\"cri-o://5da1155d7b5e933e5db3acc4c1a3fa1b3b90fd79289641f9a3d1290956128628\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T06:45:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T06:45:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T06:45:15Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:45:38Z is after 2025-08-24T17:21:41Z" Dec 03 06:45:38 crc kubenswrapper[4475]: I1203 06:45:38.062099 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:45:38 crc kubenswrapper[4475]: I1203 06:45:38.062194 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:45:38 crc kubenswrapper[4475]: I1203 06:45:38.062251 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:45:38 crc kubenswrapper[4475]: I1203 06:45:38.062303 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:45:38 crc kubenswrapper[4475]: I1203 06:45:38.062371 4475 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:45:38Z","lastTransitionTime":"2025-12-03T06:45:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns 
error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 06:45:38 crc kubenswrapper[4475]: I1203 06:45:38.073017 4475 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"897f1a97-930a-4c3c-8804-d7cd6006ae9c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fbb015d3e05f9f94fc225cce6e24bc4a5df0bfc5aaea15fe120e2cc4b8f02902\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:45:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/sta
tic-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://da747a5ea4f790c71d99693c4bd79a1074f756a20f628fa63e8bad9a713645fc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:45:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6bf56315b6ad05ea9af0319db29b919ed0332d2a671c5ba94ea325bd45ef5703\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:45:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e045b99328661616ea0e44cd50bd394a403836eede05459d117567c191401172\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be
30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:45:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://054b1d2565cc9690152740f71682028595283525344a38ccea66c1f072eae92b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:45:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d5e0ad88c2e55994f952b46c2e806792d8fcbd79a901810aef92e46067cc7b92\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\
",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d5e0ad88c2e55994f952b46c2e806792d8fcbd79a901810aef92e46067cc7b92\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T06:45:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T06:45:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://22796f78d3d551f1ee271ca8581e196f142e70622944154f7d408a88c098f53b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://22796f78d3d551f1ee271ca8581e196f142e70622944154f7d408a88c098f53b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T06:45:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T06:45:16Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://9fb973559072f07252dcf50bda74d422ea2ed50000c02105381f8d21e5ff9888\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9fb973559072f07252dcf50bda74d422ea2ed50000c02105381f8d21e5ff9888\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T06:45:
17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T06:45:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T06:45:15Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:45:38Z is after 2025-08-24T17:21:41Z" Dec 03 06:45:38 crc kubenswrapper[4475]: I1203 06:45:38.116174 4475 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/node-ca-dqbgx"] Dec 03 06:45:38 crc kubenswrapper[4475]: I1203 06:45:38.116495 4475 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/node-ca-dqbgx" Dec 03 06:45:38 crc kubenswrapper[4475]: I1203 06:45:38.119353 4475 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"node-ca-dockercfg-4777p" Dec 03 06:45:38 crc kubenswrapper[4475]: I1203 06:45:38.121109 4475 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"kube-root-ca.crt" Dec 03 06:45:38 crc kubenswrapper[4475]: I1203 06:45:38.121377 4475 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"openshift-service-ca.crt" Dec 03 06:45:38 crc kubenswrapper[4475]: I1203 06:45:38.162732 4475 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"image-registry-certificates" Dec 03 06:45:38 crc kubenswrapper[4475]: I1203 06:45:38.163751 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:45:38 crc kubenswrapper[4475]: I1203 06:45:38.163781 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:45:38 crc kubenswrapper[4475]: I1203 06:45:38.163794 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:45:38 crc kubenswrapper[4475]: I1203 06:45:38.163807 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:45:38 crc kubenswrapper[4475]: I1203 06:45:38.163815 4475 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:45:38Z","lastTransitionTime":"2025-12-03T06:45:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:45:38 crc kubenswrapper[4475]: I1203 06:45:38.192590 4475 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"897f1a97-930a-4c3c-8804-d7cd6006ae9c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fbb015d3e05f9f94fc225cce6e24bc4a5df0bfc5aaea15fe120e2cc4b8f02902\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:45:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\
\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://da747a5ea4f790c71d99693c4bd79a1074f756a20f628fa63e8bad9a713645fc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:45:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6bf56315b6ad05ea9af0319db29b919ed0332d2a671c5ba94ea325bd45ef5703\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:45:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e045b99328661616ea0e44cd50bd394a403836eede05459d117567c191401172\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-
v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:45:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://054b1d2565cc9690152740f71682028595283525344a38ccea66c1f072eae92b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:45:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d5e0ad88c2e55994f952b46c2e806792d8fcbd79a901810aef92e46067cc7b92\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":
true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d5e0ad88c2e55994f952b46c2e806792d8fcbd79a901810aef92e46067cc7b92\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T06:45:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T06:45:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://22796f78d3d551f1ee271ca8581e196f142e70622944154f7d408a88c098f53b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://22796f78d3d551f1ee271ca8581e196f142e70622944154f7d408a88c098f53b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T06:45:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T06:45:16Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://9fb973559072f07252dcf50bda74d422ea2ed50000c02105381f8d21e5ff9888\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9fb973559072f07252dcf50bda74d422ea2ed50000c02105381f8d21e5ff9888\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T06:45:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2
025-12-03T06:45:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T06:45:15Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:45:38Z is after 2025-08-24T17:21:41Z" Dec 03 06:45:38 crc kubenswrapper[4475]: I1203 06:45:38.208301 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/9ef36226-4b8b-4a7b-a87f-daa9dda6e70b-serviceca\") pod \"node-ca-dqbgx\" (UID: \"9ef36226-4b8b-4a7b-a87f-daa9dda6e70b\") " pod="openshift-image-registry/node-ca-dqbgx" Dec 03 06:45:38 crc kubenswrapper[4475]: I1203 06:45:38.208344 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wjjp4\" (UniqueName: \"kubernetes.io/projected/9ef36226-4b8b-4a7b-a87f-daa9dda6e70b-kube-api-access-wjjp4\") pod \"node-ca-dqbgx\" (UID: \"9ef36226-4b8b-4a7b-a87f-daa9dda6e70b\") " pod="openshift-image-registry/node-ca-dqbgx" Dec 03 06:45:38 crc kubenswrapper[4475]: I1203 06:45:38.208370 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/9ef36226-4b8b-4a7b-a87f-daa9dda6e70b-host\") pod \"node-ca-dqbgx\" (UID: \"9ef36226-4b8b-4a7b-a87f-daa9dda6e70b\") " 
pod="openshift-image-registry/node-ca-dqbgx" Dec 03 06:45:38 crc kubenswrapper[4475]: I1203 06:45:38.226527 4475 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-dqbgx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9ef36226-4b8b-4a7b-a87f-daa9dda6e70b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:38Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:38Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:38Z\\\",\\\"message\\\":\\\"containers with unready status: 
[node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wjjp4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T06:45:38Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-dqbgx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:45:38Z is after 2025-08-24T17:21:41Z" Dec 03 06:45:38 crc kubenswrapper[4475]: I1203 06:45:38.265295 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:45:38 crc kubenswrapper[4475]: I1203 06:45:38.265324 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:45:38 crc kubenswrapper[4475]: I1203 06:45:38.265332 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 
06:45:38 crc kubenswrapper[4475]: I1203 06:45:38.265346 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:45:38 crc kubenswrapper[4475]: I1203 06:45:38.265355 4475 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:45:38Z","lastTransitionTime":"2025-12-03T06:45:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 06:45:38 crc kubenswrapper[4475]: I1203 06:45:38.272433 4475 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-g9t4l" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8f42839e-dbc4-445a-a15b-c3aa14813958\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:35Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:35Z\\\",\\\"message\\\":\\\"containers with unready 
status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ppdm2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ppdm2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d20
99482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ppdm2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ppdm2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"re
startCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ppdm2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ppdm2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn
kube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\"
:\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ppdm2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ppdm2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://400610ebcdc7d47ecc1345287847a1909871411a12cdb3cbf895e05039b81c2b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://400610ebcdc7d47ecc1345287847a1909871411a12cdb3cbf895e05039b81c2b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T06:45:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T06:45:35Z\\\"}}
,\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ppdm2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T06:45:35Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-g9t4l\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:45:38Z is after 2025-08-24T17:21:41Z" Dec 03 06:45:38 crc kubenswrapper[4475]: I1203 06:45:38.308352 4475 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:33Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:33Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2f651c16a4a98ff0a9b4783e60ece4c410d5fcb7d05ad42bf7842d8bb8a99f27\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:45:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-12-03T06:45:38Z is after 2025-08-24T17:21:41Z" Dec 03 06:45:38 crc kubenswrapper[4475]: I1203 06:45:38.308735 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/9ef36226-4b8b-4a7b-a87f-daa9dda6e70b-serviceca\") pod \"node-ca-dqbgx\" (UID: \"9ef36226-4b8b-4a7b-a87f-daa9dda6e70b\") " pod="openshift-image-registry/node-ca-dqbgx" Dec 03 06:45:38 crc kubenswrapper[4475]: I1203 06:45:38.308826 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wjjp4\" (UniqueName: \"kubernetes.io/projected/9ef36226-4b8b-4a7b-a87f-daa9dda6e70b-kube-api-access-wjjp4\") pod \"node-ca-dqbgx\" (UID: \"9ef36226-4b8b-4a7b-a87f-daa9dda6e70b\") " pod="openshift-image-registry/node-ca-dqbgx" Dec 03 06:45:38 crc kubenswrapper[4475]: I1203 06:45:38.308904 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/9ef36226-4b8b-4a7b-a87f-daa9dda6e70b-host\") pod \"node-ca-dqbgx\" (UID: \"9ef36226-4b8b-4a7b-a87f-daa9dda6e70b\") " pod="openshift-image-registry/node-ca-dqbgx" Dec 03 06:45:38 crc kubenswrapper[4475]: I1203 06:45:38.308977 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/9ef36226-4b8b-4a7b-a87f-daa9dda6e70b-host\") pod \"node-ca-dqbgx\" (UID: \"9ef36226-4b8b-4a7b-a87f-daa9dda6e70b\") " pod="openshift-image-registry/node-ca-dqbgx" Dec 03 06:45:38 crc kubenswrapper[4475]: I1203 06:45:38.309665 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/9ef36226-4b8b-4a7b-a87f-daa9dda6e70b-serviceca\") pod \"node-ca-dqbgx\" (UID: \"9ef36226-4b8b-4a7b-a87f-daa9dda6e70b\") " pod="openshift-image-registry/node-ca-dqbgx" Dec 03 06:45:38 crc kubenswrapper[4475]: I1203 06:45:38.354771 4475 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wjjp4\" (UniqueName: \"kubernetes.io/projected/9ef36226-4b8b-4a7b-a87f-daa9dda6e70b-kube-api-access-wjjp4\") pod \"node-ca-dqbgx\" (UID: \"9ef36226-4b8b-4a7b-a87f-daa9dda6e70b\") " pod="openshift-image-registry/node-ca-dqbgx" Dec 03 06:45:38 crc kubenswrapper[4475]: I1203 06:45:38.366953 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:45:38 crc kubenswrapper[4475]: I1203 06:45:38.367037 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:45:38 crc kubenswrapper[4475]: I1203 06:45:38.367091 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:45:38 crc kubenswrapper[4475]: I1203 06:45:38.367169 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:45:38 crc kubenswrapper[4475]: I1203 06:45:38.367254 4475 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:45:38Z","lastTransitionTime":"2025-12-03T06:45:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:45:38 crc kubenswrapper[4475]: I1203 06:45:38.368592 4475 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:33Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:33Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://30d9a05de148a1dbe0fa8f07bbc5f4f2c3cba395d686af03f2da63f8cdfe431c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:45:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://07cf8d993193bca34b30ea77c473af45652fde6e73d0586efb78c14b9d003e22\\\",\\\
"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:45:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:45:38Z is after 2025-08-24T17:21:41Z" Dec 03 06:45:38 crc kubenswrapper[4475]: I1203 06:45:38.406778 4475 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:34Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:34Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6444fe7571ebb90d4ff4b30dc1a397023310b50b1816d0197cb545b4f5f7480f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:45:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-12-03T06:45:38Z is after 2025-08-24T17:21:41Z" Dec 03 06:45:38 crc kubenswrapper[4475]: I1203 06:45:38.425463 4475 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-dqbgx" Dec 03 06:45:38 crc kubenswrapper[4475]: W1203 06:45:38.434589 4475 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9ef36226_4b8b_4a7b_a87f_daa9dda6e70b.slice/crio-ad4b847b1c499bbfa872b6c8863172761e462608eef774b0750c240a2e6ad101 WatchSource:0}: Error finding container ad4b847b1c499bbfa872b6c8863172761e462608eef774b0750c240a2e6ad101: Status 404 returned error can't find the container with id ad4b847b1c499bbfa872b6c8863172761e462608eef774b0750c240a2e6ad101 Dec 03 06:45:38 crc kubenswrapper[4475]: I1203 06:45:38.448962 4475 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:32Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:32Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:45:38Z is after 2025-08-24T17:21:41Z" Dec 03 06:45:38 crc kubenswrapper[4475]: I1203 06:45:38.468653 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:45:38 crc kubenswrapper[4475]: I1203 06:45:38.468689 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:45:38 crc kubenswrapper[4475]: I1203 06:45:38.468699 4475 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:45:38 crc kubenswrapper[4475]: I1203 06:45:38.468711 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:45:38 crc kubenswrapper[4475]: I1203 06:45:38.468719 4475 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:45:38Z","lastTransitionTime":"2025-12-03T06:45:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 06:45:38 crc kubenswrapper[4475]: I1203 06:45:38.485844 4475 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-pcw7j" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8c1979d0-303c-4cf6-9087-3cb2e1aac73b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://eebaa73cf4e1efd781b258dd26910dc00439271
6180b14a7e64e89a03f2032a1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:45:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nrf7r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T06:45:34Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-pcw7j\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:45:38Z is after 2025-08-24T17:21:41Z" Dec 03 06:45:38 crc kubenswrapper[4475]: I1203 06:45:38.491026 4475 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 03 06:45:38 crc kubenswrapper[4475]: I1203 06:45:38.491055 4475 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 03 06:45:38 crc kubenswrapper[4475]: I1203 06:45:38.491084 4475 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 03 06:45:38 crc kubenswrapper[4475]: E1203 06:45:38.491108 4475 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 03 06:45:38 crc kubenswrapper[4475]: E1203 06:45:38.491190 4475 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 03 06:45:38 crc kubenswrapper[4475]: E1203 06:45:38.491222 4475 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 03 06:45:38 crc kubenswrapper[4475]: I1203 06:45:38.528236 4475 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-9b2j8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f3a17c67-95e0-4889-8a30-64c08b6720f4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d2d627e2c307a8db9c86e8020f2b1c25c6e061e0c6460be63e231566488beaca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:45:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release
\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6pdk6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T06:45:34Z\\\"}}\" for pod \"openshift-multus\"/\"multus-9b2j8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-12-03T06:45:38Z is after 2025-08-24T17:21:41Z" Dec 03 06:45:38 crc kubenswrapper[4475]: I1203 06:45:38.568307 4475 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:32Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:32Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:45:38Z is after 2025-08-24T17:21:41Z" Dec 03 06:45:38 crc kubenswrapper[4475]: I1203 06:45:38.569841 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:45:38 crc kubenswrapper[4475]: I1203 06:45:38.569860 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:45:38 crc kubenswrapper[4475]: I1203 06:45:38.569868 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:45:38 crc kubenswrapper[4475]: I1203 06:45:38.569879 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:45:38 crc kubenswrapper[4475]: I1203 06:45:38.569886 4475 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:45:38Z","lastTransitionTime":"2025-12-03T06:45:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 06:45:38 crc kubenswrapper[4475]: I1203 06:45:38.598716 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-dqbgx" event={"ID":"9ef36226-4b8b-4a7b-a87f-daa9dda6e70b","Type":"ContainerStarted","Data":"5dc78fa3b07b9a5535f697323e9ed322ceefdc8798157160a05eb71017ac3a29"} Dec 03 06:45:38 crc kubenswrapper[4475]: I1203 06:45:38.598745 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-dqbgx" event={"ID":"9ef36226-4b8b-4a7b-a87f-daa9dda6e70b","Type":"ContainerStarted","Data":"ad4b847b1c499bbfa872b6c8863172761e462608eef774b0750c240a2e6ad101"} Dec 03 06:45:38 crc kubenswrapper[4475]: I1203 06:45:38.600100 4475 generic.go:334] "Generic (PLEG): container finished" podID="7168f008-1b03-40cf-94fa-a71d470454bf" containerID="742f2f4dc23fff3df8e6d67902ef721b3db1823653b11a69faabdaf8d7650667" exitCode=0 Dec 03 06:45:38 crc kubenswrapper[4475]: I1203 06:45:38.600125 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-k9cmc" event={"ID":"7168f008-1b03-40cf-94fa-a71d470454bf","Type":"ContainerDied","Data":"742f2f4dc23fff3df8e6d67902ef721b3db1823653b11a69faabdaf8d7650667"} Dec 03 06:45:38 crc kubenswrapper[4475]: I1203 06:45:38.608226 4475 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-k9cmc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7168f008-1b03-40cf-94fa-a71d470454bf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:35Z\\\",\\\"message\\\":\\\"containers with incomplete status: [bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:35Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:35Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s6bbq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://31584b054f88aa7f7e4f1096e2b11acf6f106b7f2e4ced19768808e5df1a6acc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://31584b054f88aa7f7e4f1096e2b11acf6f106b7f2e4ced19768808e5df1a6acc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T06:45:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T06:45:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s6bbq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9a644e827feb786d7298e41022ef3bc0d2483279c106dddea8e2c7a3c62c3c0d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9a644e827feb786d7298e41022ef3bc0d2483279c106dddea8e2c7a3c62c3c0d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T06:45:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T06:45:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s6bbq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodIn
itializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s6bbq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s6bbq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-
release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s6bbq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s6bbq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T06:45:35Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-k9cmc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:45:38Z is after 2025-08-24T17:21:41Z" Dec 03 06:45:38 crc kubenswrapper[4475]: I1203 06:45:38.648718 4475 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"57d2f580-9528-4200-b0a4-797fed1ae972\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://822cdbfb2e81d80c5de0253daa42f2a5c89e9cd0eb8a5c3cf620780d17f9a6d0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:45:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d66a9136874b2e25c94cd291aa6d7f4694ac409f16766fd69c8aab8068a441fb\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:45:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e40c4f29925f494c0f5f01e2ecbcd2e4db2a5f3911a55a874c6d0006f01982de\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:45:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7fc0ee9e5a408a0a9e701afaf1db7bc3f58fd1830044730e9c680664642b5e4e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:45:1
7Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1dd8bd42f01469966b55416fc8af1dd71d341c774263bb3a56190af4cd9e7daa\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:45:17Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5da1155d7b5e933e5db3acc4c1a3fa1b3b90fd79289641f9a3d1290956128628\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5da1155d7b5e933e5db3acc4c1a3fa1b3b90fd79289641f9a3d1290956128628\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T06:45:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T06:45:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startT
ime\\\":\\\"2025-12-03T06:45:15Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:45:38Z is after 2025-08-24T17:21:41Z" Dec 03 06:45:38 crc kubenswrapper[4475]: I1203 06:45:38.671530 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:45:38 crc kubenswrapper[4475]: I1203 06:45:38.671561 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:45:38 crc kubenswrapper[4475]: I1203 06:45:38.671570 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:45:38 crc kubenswrapper[4475]: I1203 06:45:38.671581 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:45:38 crc kubenswrapper[4475]: I1203 06:45:38.671591 4475 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:45:38Z","lastTransitionTime":"2025-12-03T06:45:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:45:38 crc kubenswrapper[4475]: I1203 06:45:38.687983 4475 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a0af3d80-5aae-4d3b-a974-490687df49f3\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fa848c68a20d5db5c603cafa808518de84e427cbeea4bbc1be31151e6f839b3e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:45:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fe3e0d5fed1
8fddd7a1174f7a9f12290ce318e9a0de40fe432c79f6f2e24a608\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:45:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://43c05977da6544bc781a279fcddb3279dfee510fdd0a6f4f1a22b8629f17475f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:45:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ef987b2e9a0fa630edf6d5c06d5f47c5debd1b75d4626aefe7d8ef44bb974eb8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:850
6ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:45:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T06:45:15Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:45:38Z is after 2025-08-24T17:21:41Z" Dec 03 06:45:38 crc kubenswrapper[4475]: I1203 06:45:38.733441 4475 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:32Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:32Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:45:38Z is after 2025-08-24T17:21:41Z" Dec 03 06:45:38 crc kubenswrapper[4475]: I1203 06:45:38.768890 4475 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-tjbzg" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"91aee7be-4a52-4598-803f-2deebe0674de\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f13f644093fd1214d8fb39853857b4113dd7fde64f1a60ff6848fd4c5350f5b6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:45:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xvqvg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://159d103ae2d5d19ea94c57a59b534773f0e32f4c
b379a412b63ca743e221096e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:45:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xvqvg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T06:45:35Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-tjbzg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:45:38Z is after 2025-08-24T17:21:41Z" Dec 03 06:45:38 crc kubenswrapper[4475]: I1203 06:45:38.773054 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:45:38 crc kubenswrapper[4475]: I1203 06:45:38.773074 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:45:38 crc kubenswrapper[4475]: I1203 06:45:38.773082 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:45:38 crc 
kubenswrapper[4475]: I1203 06:45:38.773094 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:45:38 crc kubenswrapper[4475]: I1203 06:45:38.773102 4475 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:45:38Z","lastTransitionTime":"2025-12-03T06:45:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 06:45:38 crc kubenswrapper[4475]: I1203 06:45:38.812262 4475 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"897f1a97-930a-4c3c-8804-d7cd6006ae9c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fbb015d3e05f9f94fc225cce6e24bc4a5df0bfc5aaea15fe120e2cc4b8f02902\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-rel
ease-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:45:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://da747a5ea4f790c71d99693c4bd79a1074f756a20f628fa63e8bad9a713645fc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:45:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6bf56315b6ad05ea9af0319db29b919ed0332d2a671c5ba94ea325bd45ef5703\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true
,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:45:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e045b99328661616ea0e44cd50bd394a403836eede05459d117567c191401172\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:45:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://054b1d2565cc9690152740f71682028595283525344a38ccea66c1f072eae92b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:45:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/
\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d5e0ad88c2e55994f952b46c2e806792d8fcbd79a901810aef92e46067cc7b92\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d5e0ad88c2e55994f952b46c2e806792d8fcbd79a901810aef92e46067cc7b92\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T06:45:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T06:45:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://22796f78d3d551f1ee271ca8581e196f142e70622944154f7d408a88c098f53b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://22796f78d3d551f1ee271ca8581e196f142e70622944154f7d408a88c098f53b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T06:45:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T06:45:16Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://9fb973559072f07252dcf50bda74d422ea2ed50000c02105381f8d21e5ff9888\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9fb973559072f07252dcf50bda74d422ea2ed50000c02105381f8d21e5ff9888\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T06:45:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T06:45:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T06:45:15Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:45:38Z is after 2025-08-24T17:21:41Z" Dec 03 06:45:38 crc kubenswrapper[4475]: I1203 06:45:38.846208 4475 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-dqbgx" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9ef36226-4b8b-4a7b-a87f-daa9dda6e70b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5dc78fa3b07b9a5535f697323e9ed322ceefdc8798157160a05eb71017ac3a29\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:45:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wjjp4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T06:45:38Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-dqbgx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:45:38Z is after 2025-08-24T17:21:41Z" Dec 03 06:45:38 crc kubenswrapper[4475]: I1203 06:45:38.874846 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:45:38 crc kubenswrapper[4475]: I1203 06:45:38.874877 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:45:38 crc kubenswrapper[4475]: I1203 06:45:38.874886 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:45:38 crc kubenswrapper[4475]: I1203 06:45:38.874900 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:45:38 crc kubenswrapper[4475]: I1203 06:45:38.874908 4475 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:45:38Z","lastTransitionTime":"2025-12-03T06:45:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:45:38 crc kubenswrapper[4475]: I1203 06:45:38.887483 4475 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:33Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:33Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2f651c16a4a98ff0a9b4783e60ece4c410d5fcb7d05ad42bf7842d8bb8a99f27\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:45:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod 
\"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:45:38Z is after 2025-08-24T17:21:41Z" Dec 03 06:45:38 crc kubenswrapper[4475]: I1203 06:45:38.927897 4475 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:33Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:33Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://30d9a05de148a1dbe0fa8f07bbc5f4f2c3cba395d686af03f2da63f8cdfe431c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:45:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-id
entity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://07cf8d993193bca34b30ea77c473af45652fde6e73d0586efb78c14b9d003e22\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:45:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:45:38Z is after 2025-08-24T17:21:41Z" Dec 03 06:45:38 crc kubenswrapper[4475]: I1203 06:45:38.968042 4475 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:34Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:34Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6444fe7571ebb90d4ff4b30dc1a397023310b50b1816d0197cb545b4f5f7480f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:45:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-12-03T06:45:38Z is after 2025-08-24T17:21:41Z" Dec 03 06:45:38 crc kubenswrapper[4475]: I1203 06:45:38.976487 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:45:38 crc kubenswrapper[4475]: I1203 06:45:38.976511 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:45:38 crc kubenswrapper[4475]: I1203 06:45:38.976519 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:45:38 crc kubenswrapper[4475]: I1203 06:45:38.976532 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:45:38 crc kubenswrapper[4475]: I1203 06:45:38.976539 4475 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:45:38Z","lastTransitionTime":"2025-12-03T06:45:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:45:39 crc kubenswrapper[4475]: I1203 06:45:39.007820 4475 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:32Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:32Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:45:39Z is after 2025-08-24T17:21:41Z" Dec 03 06:45:39 crc kubenswrapper[4475]: I1203 06:45:39.046839 4475 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-pcw7j" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8c1979d0-303c-4cf6-9087-3cb2e1aac73b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://eebaa73cf4e1efd781b258dd26910dc004392716180b14a7e64e89a03f2032a1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:45:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nrf7r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T06:45:34Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-pcw7j\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:45:39Z is after 2025-08-24T17:21:41Z" Dec 03 06:45:39 crc kubenswrapper[4475]: I1203 06:45:39.078484 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:45:39 crc kubenswrapper[4475]: I1203 06:45:39.078507 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:45:39 crc kubenswrapper[4475]: I1203 06:45:39.078514 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:45:39 crc kubenswrapper[4475]: I1203 06:45:39.078525 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:45:39 crc kubenswrapper[4475]: I1203 06:45:39.078533 4475 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:45:39Z","lastTransitionTime":"2025-12-03T06:45:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:45:39 crc kubenswrapper[4475]: I1203 06:45:39.087201 4475 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-9b2j8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f3a17c67-95e0-4889-8a30-64c08b6720f4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d2d627e2c307a8db9c86e8020f2b1c25c6e061e0c6460be63e231566488beaca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:45:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\
",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6pdk6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T06:45:34Z\\\"}}\" for pod \"openshift-multus\"/\"multus-9b2j8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:45:39Z 
is after 2025-08-24T17:21:41Z" Dec 03 06:45:39 crc kubenswrapper[4475]: I1203 06:45:39.131754 4475 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-g9t4l" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8f42839e-dbc4-445a-a15b-c3aa14813958\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:35Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:35Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ppdm2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ppdm2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ppdm2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ppdm2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ppdm2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ppdm2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ppdm2\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ppdm2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://400610ebcdc7d47ecc1345287847a1909871411a12cdb3cbf895e05039b81c2b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://400610ebcdc7d47ecc1345287847a1909871411a12cdb3cbf895e05039b81c2b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T06:45:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T06:45:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ppdm2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T06:45:35Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-g9t4l\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:45:39Z is after 2025-08-24T17:21:41Z" Dec 03 06:45:39 crc kubenswrapper[4475]: I1203 06:45:39.169290 4475 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-k9cmc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7168f008-1b03-40cf-94fa-a71d470454bf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:35Z\\\",\\\"message\\\":\\\"containers with incomplete status: [routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:35Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:35Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s6bbq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://31584b054f88aa7f7e4f1096e2b11acf6f106b7f2e4ced19768808e5df1a6acc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://31584b054f88aa7f7e4f1096e2b11acf6f106b7f2e4ced19768808e5df1a6acc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T06:45:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T06:45:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"moun
tPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s6bbq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9a644e827feb786d7298e41022ef3bc0d2483279c106dddea8e2c7a3c62c3c0d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9a644e827feb786d7298e41022ef3bc0d2483279c106dddea8e2c7a3c62c3c0d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T06:45:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T06:45:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s6bbq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://742f2f4dc23fff3df8e6d67902ef721b3db18
23653b11a69faabdaf8d7650667\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://742f2f4dc23fff3df8e6d67902ef721b3db1823653b11a69faabdaf8d7650667\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T06:45:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T06:45:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s6bbq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"
kube-api-access-s6bbq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s6bbq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s6bbq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T06:45:35Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-k9cmc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": 
failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:45:39Z is after 2025-08-24T17:21:41Z" Dec 03 06:45:39 crc kubenswrapper[4475]: I1203 06:45:39.180545 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:45:39 crc kubenswrapper[4475]: I1203 06:45:39.180569 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:45:39 crc kubenswrapper[4475]: I1203 06:45:39.180578 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:45:39 crc kubenswrapper[4475]: I1203 06:45:39.180589 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:45:39 crc kubenswrapper[4475]: I1203 06:45:39.180597 4475 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:45:39Z","lastTransitionTime":"2025-12-03T06:45:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:45:39 crc kubenswrapper[4475]: I1203 06:45:39.207584 4475 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:32Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:32Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:45:39Z is after 2025-08-24T17:21:41Z" Dec 03 06:45:39 crc kubenswrapper[4475]: I1203 06:45:39.247803 4475 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a0af3d80-5aae-4d3b-a974-490687df49f3\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fa848c68a20d5db5c603cafa808518de84e427cbeea4bbc1be31151e6f839b3e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:45:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fe3e0d5fed18fddd7a1174f7a9f12290ce318e9a0de40fe432c79f6f2e24a608\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:45:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://43c05977da6544bc781a279fcddb3279dfee510fdd0a6f4f1a22b8629f17475f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:45:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ef987b2e9a0fa630edf6d5c06d5f47c5debd1b75d4626aefe7d8ef44bb974eb8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-12-03T06:45:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T06:45:15Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:45:39Z is after 2025-08-24T17:21:41Z" Dec 03 06:45:39 crc kubenswrapper[4475]: I1203 06:45:39.282016 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:45:39 crc kubenswrapper[4475]: I1203 06:45:39.282039 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:45:39 crc kubenswrapper[4475]: I1203 06:45:39.282047 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:45:39 crc kubenswrapper[4475]: I1203 06:45:39.282059 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:45:39 crc kubenswrapper[4475]: I1203 06:45:39.282066 4475 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:45:39Z","lastTransitionTime":"2025-12-03T06:45:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns 
error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 06:45:39 crc kubenswrapper[4475]: I1203 06:45:39.287361 4475 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:32Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:32Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:45:39Z is after 2025-08-24T17:21:41Z" Dec 03 06:45:39 crc kubenswrapper[4475]: I1203 06:45:39.326551 4475 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-tjbzg" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"91aee7be-4a52-4598-803f-2deebe0674de\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f13f644093fd1214d8fb39853857b4113dd7fde64f1a60ff6848fd4c5350f5b6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:45:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xvqvg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://159d103ae2d5d19ea94c57a59b534773f0e32f4c
b379a412b63ca743e221096e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:45:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xvqvg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T06:45:35Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-tjbzg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:45:39Z is after 2025-08-24T17:21:41Z" Dec 03 06:45:39 crc kubenswrapper[4475]: I1203 06:45:39.368216 4475 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"57d2f580-9528-4200-b0a4-797fed1ae972\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://822cdbfb2e81d80c5de0253daa42f2a5c89e9cd0eb8a5c3cf620780d17f9a6d0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:45:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d66a9136874b2e25c94cd291aa6d7f4694ac409f16766fd69c8aab8068a441fb\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:45:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e40c4f29925f494c0f5f01e2ecbcd2e4db2a5f3911a55a874c6d0006f01982de\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:45:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7fc0ee9e5a408a0a9e701afaf1db7bc3f58fd1830044730e9c680664642b5e4e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:45:1
7Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1dd8bd42f01469966b55416fc8af1dd71d341c774263bb3a56190af4cd9e7daa\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:45:17Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5da1155d7b5e933e5db3acc4c1a3fa1b3b90fd79289641f9a3d1290956128628\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5da1155d7b5e933e5db3acc4c1a3fa1b3b90fd79289641f9a3d1290956128628\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T06:45:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T06:45:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startT
ime\\\":\\\"2025-12-03T06:45:15Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:45:39Z is after 2025-08-24T17:21:41Z" Dec 03 06:45:39 crc kubenswrapper[4475]: I1203 06:45:39.383192 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:45:39 crc kubenswrapper[4475]: I1203 06:45:39.383215 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:45:39 crc kubenswrapper[4475]: I1203 06:45:39.383223 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:45:39 crc kubenswrapper[4475]: I1203 06:45:39.383235 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:45:39 crc kubenswrapper[4475]: I1203 06:45:39.383243 4475 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:45:39Z","lastTransitionTime":"2025-12-03T06:45:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:45:39 crc kubenswrapper[4475]: I1203 06:45:39.484861 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:45:39 crc kubenswrapper[4475]: I1203 06:45:39.484884 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:45:39 crc kubenswrapper[4475]: I1203 06:45:39.484894 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:45:39 crc kubenswrapper[4475]: I1203 06:45:39.484905 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:45:39 crc kubenswrapper[4475]: I1203 06:45:39.484912 4475 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:45:39Z","lastTransitionTime":"2025-12-03T06:45:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:45:39 crc kubenswrapper[4475]: I1203 06:45:39.586413 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:45:39 crc kubenswrapper[4475]: I1203 06:45:39.586445 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:45:39 crc kubenswrapper[4475]: I1203 06:45:39.586470 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:45:39 crc kubenswrapper[4475]: I1203 06:45:39.586484 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:45:39 crc kubenswrapper[4475]: I1203 06:45:39.586492 4475 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:45:39Z","lastTransitionTime":"2025-12-03T06:45:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:45:39 crc kubenswrapper[4475]: I1203 06:45:39.605607 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-g9t4l" event={"ID":"8f42839e-dbc4-445a-a15b-c3aa14813958","Type":"ContainerStarted","Data":"66a9c7568957099255bc910496da695e2af0122f2c853c3e221c666d7c2dee78"} Dec 03 06:45:39 crc kubenswrapper[4475]: I1203 06:45:39.607568 4475 generic.go:334] "Generic (PLEG): container finished" podID="7168f008-1b03-40cf-94fa-a71d470454bf" containerID="2e5e874d26bf8bc806d74d55a8b9306cc30cca122d2ae0731b0a76ae7ac30450" exitCode=0 Dec 03 06:45:39 crc kubenswrapper[4475]: I1203 06:45:39.607601 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-k9cmc" event={"ID":"7168f008-1b03-40cf-94fa-a71d470454bf","Type":"ContainerDied","Data":"2e5e874d26bf8bc806d74d55a8b9306cc30cca122d2ae0731b0a76ae7ac30450"} Dec 03 06:45:39 crc kubenswrapper[4475]: I1203 06:45:39.617437 4475 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:32Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:32Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:45:39Z is after 2025-08-24T17:21:41Z" Dec 03 06:45:39 crc kubenswrapper[4475]: I1203 06:45:39.631929 4475 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-k9cmc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7168f008-1b03-40cf-94fa-a71d470454bf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:35Z\\\",\\\"message\\\":\\\"containers with incomplete status: [whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:35Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:35Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s6bbq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://31584b054f88aa7f7e4f1096e2b11acf6f106b7f2e4ced19768808e5df1a6acc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://31584b054f88aa7f7e4f1096e2b11acf6f106b7f2e4ced19768808e5df1a6acc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T06:45:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T06:45:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s6bbq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9a644e827feb786d7298e41022ef3bc0d2483279c106dddea8e2c7a3c62c3c0d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9a644e827feb786d7298e41022ef3bc0d2483279c106dddea8e2c7a3c62c3c0d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T06:45:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T06:45:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s6bbq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://742f2f4dc23fff3df8e6d67902ef721b3db1823653b11a69faabdaf8d7650667\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367
c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://742f2f4dc23fff3df8e6d67902ef721b3db1823653b11a69faabdaf8d7650667\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T06:45:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T06:45:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s6bbq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2e5e874d26bf8bc806d74d55a8b9306cc30cca122d2ae0731b0a76ae7ac30450\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2e5e874d26bf8bc806d74d55a8b9306cc30cca122d2ae0731b0a76ae7ac30450\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T06:45:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T06:45:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\
\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s6bbq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s6bbq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s6bbq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T06:45:35Z\\\"}}\" for pod 
\"openshift-multus\"/\"multus-additional-cni-plugins-k9cmc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:45:39Z is after 2025-08-24T17:21:41Z" Dec 03 06:45:39 crc kubenswrapper[4475]: I1203 06:45:39.640913 4475 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"57d2f580-9528-4200-b0a4-797fed1ae972\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://822cdbfb2e81d80c5de0253daa42f2a5c89e9cd0eb8a5c3cf620780d17f9a6d0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\
\\":{\\\"startedAt\\\":\\\"2025-12-03T06:45:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d66a9136874b2e25c94cd291aa6d7f4694ac409f16766fd69c8aab8068a441fb\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:45:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e40c4f29925f494c0f5f01e2ecbcd2e4db2a5f3911a55a874c6d0006f01982de\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:45:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7fc0ee9e
5a408a0a9e701afaf1db7bc3f58fd1830044730e9c680664642b5e4e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:45:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1dd8bd42f01469966b55416fc8af1dd71d341c774263bb3a56190af4cd9e7daa\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:45:17Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5da1155d7b5e933e5db3acc4c1a3fa1b3b90fd79289641f9a3d1290956128628\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"re
ady\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5da1155d7b5e933e5db3acc4c1a3fa1b3b90fd79289641f9a3d1290956128628\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T06:45:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T06:45:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T06:45:15Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:45:39Z is after 2025-08-24T17:21:41Z" Dec 03 06:45:39 crc kubenswrapper[4475]: I1203 06:45:39.650016 4475 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a0af3d80-5aae-4d3b-a974-490687df49f3\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fa848c68a20d5db5c603cafa808518de84e427cbeea4bbc1be31151e6f839b3e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:45:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fe3e0d5fed18fddd7a1174f7a9f12290ce318e9a0de40fe432c79f6f2e24a608\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:45:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://43c05977da6544bc781a279fcddb3279dfee510fdd0a6f4f1a22b8629f17475f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:45:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ef987b2e9a0fa630edf6d5c06d5f47c5debd1b75d4626aefe7d8ef44bb974eb8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-12-03T06:45:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T06:45:15Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:45:39Z is after 2025-08-24T17:21:41Z" Dec 03 06:45:39 crc kubenswrapper[4475]: I1203 06:45:39.658991 4475 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:32Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:32Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:45:39Z is after 2025-08-24T17:21:41Z" Dec 03 06:45:39 crc kubenswrapper[4475]: I1203 06:45:39.666754 4475 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-tjbzg" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"91aee7be-4a52-4598-803f-2deebe0674de\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f13f644093fd1214d8fb39853857b4113dd7fde64f1a60ff6848fd4c5350f5b6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:45:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xvqvg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://159d103ae2d5d19ea94c57a59b534773f0e32f4c
b379a412b63ca743e221096e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:45:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xvqvg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T06:45:35Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-tjbzg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:45:39Z is after 2025-08-24T17:21:41Z" Dec 03 06:45:39 crc kubenswrapper[4475]: I1203 06:45:39.681505 4475 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"897f1a97-930a-4c3c-8804-d7cd6006ae9c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fbb015d3e05f9f94fc225cce6e24bc4a5df0bfc5aaea15fe120e2cc4b8f02902\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:45:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://da747a5ea4f790c71d99693c4bd79a1074f756a20f628fa63e8bad9a713645fc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:45:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6bf56315b6ad05ea9af0319db29b919ed0332d2a671c5ba94ea325bd45ef5703\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:45:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e045b99328661616ea0e44cd50bd394a403836eede05459d117567c191401172\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:45:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://054b1d2565cc9690152740f71682028595283525344a38ccea66c1f072eae92b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:45:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d5e0ad88c2e55994f952b46c2e806792d8fcbd79a901810aef92e46067cc7b92\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d5e0ad88c2e55994f952b46c2e806792d8fcbd79a901810aef92e46067cc7b92\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2025-12-03T06:45:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T06:45:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://22796f78d3d551f1ee271ca8581e196f142e70622944154f7d408a88c098f53b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://22796f78d3d551f1ee271ca8581e196f142e70622944154f7d408a88c098f53b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T06:45:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T06:45:16Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://9fb973559072f07252dcf50bda74d422ea2ed50000c02105381f8d21e5ff9888\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9fb973559072f07252dcf50bda74d422ea2ed50000c02105381f8d21e5ff9888\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T06:45:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T06:45:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T06:45:15Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:45:39Z is after 2025-08-24T17:21:41Z" Dec 03 06:45:39 crc kubenswrapper[4475]: I1203 06:45:39.688258 4475 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-dqbgx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9ef36226-4b8b-4a7b-a87f-daa9dda6e70b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5dc78fa3b07b9a5535f697323e9ed322ceefdc8798157160a05eb71017ac3a29\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235
da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:45:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wjjp4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T06:45:38Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-dqbgx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:45:39Z is after 2025-08-24T17:21:41Z" Dec 03 06:45:39 crc kubenswrapper[4475]: I1203 06:45:39.689053 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:45:39 crc kubenswrapper[4475]: I1203 06:45:39.689080 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:45:39 crc kubenswrapper[4475]: I1203 06:45:39.689088 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:45:39 crc kubenswrapper[4475]: I1203 06:45:39.689099 4475 kubelet_node_status.go:724] "Recording event message 
for node" node="crc" event="NodeNotReady" Dec 03 06:45:39 crc kubenswrapper[4475]: I1203 06:45:39.689109 4475 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:45:39Z","lastTransitionTime":"2025-12-03T06:45:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 06:45:39 crc kubenswrapper[4475]: I1203 06:45:39.728486 4475 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:33Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:33Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2f651c16a4a98ff0a9b4783e60ece4c410d5fcb7d05ad42bf7842d8bb8a99f27\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedA
t\\\":\\\"2025-12-03T06:45:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:45:39Z is after 2025-08-24T17:21:41Z" Dec 03 06:45:39 crc kubenswrapper[4475]: I1203 06:45:39.767905 4475 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:33Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:33Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://30d9a05de148a1dbe0fa8f07bbc5f4f2c3cba395d686af03f2da63f8cdfe431c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:45:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://07cf8d993193bca34b30ea77c473af45652fde6e73d0586efb78c14b9d003e22\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:45:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:45:39Z is after 2025-08-24T17:21:41Z" Dec 03 06:45:39 crc kubenswrapper[4475]: I1203 06:45:39.790970 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:45:39 crc kubenswrapper[4475]: I1203 06:45:39.790996 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:45:39 crc kubenswrapper[4475]: I1203 06:45:39.791004 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:45:39 crc kubenswrapper[4475]: I1203 06:45:39.791016 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:45:39 crc kubenswrapper[4475]: I1203 06:45:39.791025 4475 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:45:39Z","lastTransitionTime":"2025-12-03T06:45:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 06:45:39 crc kubenswrapper[4475]: I1203 06:45:39.808004 4475 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:34Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:34Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6444fe7571ebb90d4ff4b30dc1a397023310b50b1816d0197cb545b4f5f7480f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:45:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\
"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:45:39Z is after 2025-08-24T17:21:41Z" Dec 03 06:45:39 crc kubenswrapper[4475]: I1203 06:45:39.849259 4475 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:32Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:32Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:45:39Z is after 2025-08-24T17:21:41Z" Dec 03 06:45:39 crc kubenswrapper[4475]: I1203 06:45:39.889282 4475 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-pcw7j" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8c1979d0-303c-4cf6-9087-3cb2e1aac73b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://eebaa73cf4e1efd781b258dd26910dc004392716180b14a7e64e89a03f2032a1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:45:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nrf7r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T06:45:34Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-pcw7j\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:45:39Z is after 2025-08-24T17:21:41Z" Dec 03 06:45:39 crc kubenswrapper[4475]: I1203 06:45:39.892787 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:45:39 crc kubenswrapper[4475]: I1203 06:45:39.892808 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:45:39 crc kubenswrapper[4475]: I1203 06:45:39.892816 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:45:39 crc kubenswrapper[4475]: I1203 06:45:39.892829 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:45:39 crc kubenswrapper[4475]: I1203 06:45:39.892839 4475 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:45:39Z","lastTransitionTime":"2025-12-03T06:45:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:45:39 crc kubenswrapper[4475]: I1203 06:45:39.927900 4475 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-9b2j8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f3a17c67-95e0-4889-8a30-64c08b6720f4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d2d627e2c307a8db9c86e8020f2b1c25c6e061e0c6460be63e231566488beaca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:45:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\
",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6pdk6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T06:45:34Z\\\"}}\" for pod \"openshift-multus\"/\"multus-9b2j8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:45:39Z 
is after 2025-08-24T17:21:41Z" Dec 03 06:45:39 crc kubenswrapper[4475]: I1203 06:45:39.970363 4475 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-g9t4l" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8f42839e-dbc4-445a-a15b-c3aa14813958\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:35Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:35Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ppdm2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ppdm2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ppdm2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ppdm2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ppdm2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ppdm2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ppdm2\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ppdm2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://400610ebcdc7d47ecc1345287847a1909871411a12cdb3cbf895e05039b81c2b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://400610ebcdc7d47ecc1345287847a1909871411a12cdb3cbf895e05039b81c2b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T06:45:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T06:45:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ppdm2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T06:45:35Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-g9t4l\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:45:39Z is after 2025-08-24T17:21:41Z" Dec 03 06:45:39 crc kubenswrapper[4475]: I1203 06:45:39.994496 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:45:39 crc kubenswrapper[4475]: I1203 06:45:39.994518 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:45:39 crc kubenswrapper[4475]: I1203 06:45:39.994526 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:45:39 crc kubenswrapper[4475]: I1203 06:45:39.994538 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:45:39 crc kubenswrapper[4475]: I1203 06:45:39.994551 4475 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:45:39Z","lastTransitionTime":"2025-12-03T06:45:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:45:40 crc kubenswrapper[4475]: I1203 06:45:40.096513 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:45:40 crc kubenswrapper[4475]: I1203 06:45:40.096535 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:45:40 crc kubenswrapper[4475]: I1203 06:45:40.096544 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:45:40 crc kubenswrapper[4475]: I1203 06:45:40.096554 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:45:40 crc kubenswrapper[4475]: I1203 06:45:40.096562 4475 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:45:40Z","lastTransitionTime":"2025-12-03T06:45:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:45:40 crc kubenswrapper[4475]: I1203 06:45:40.123278 4475 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 03 06:45:40 crc kubenswrapper[4475]: I1203 06:45:40.123322 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 03 06:45:40 crc kubenswrapper[4475]: I1203 06:45:40.123346 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 03 06:45:40 crc kubenswrapper[4475]: I1203 06:45:40.123365 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 03 06:45:40 crc kubenswrapper[4475]: I1203 06:45:40.123381 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: 
\"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 03 06:45:40 crc kubenswrapper[4475]: E1203 06:45:40.123470 4475 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Dec 03 06:45:40 crc kubenswrapper[4475]: E1203 06:45:40.123507 4475 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-12-03 06:45:48.123497304 +0000 UTC m=+32.928395639 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Dec 03 06:45:40 crc kubenswrapper[4475]: E1203 06:45:40.123730 4475 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-03 06:45:48.123720927 +0000 UTC m=+32.928619261 (durationBeforeRetry 8s). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 03 06:45:40 crc kubenswrapper[4475]: E1203 06:45:40.123767 4475 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Dec 03 06:45:40 crc kubenswrapper[4475]: E1203 06:45:40.123788 4475 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-12-03 06:45:48.123782222 +0000 UTC m=+32.928680557 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Dec 03 06:45:40 crc kubenswrapper[4475]: E1203 06:45:40.123831 4475 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Dec 03 06:45:40 crc kubenswrapper[4475]: E1203 06:45:40.123841 4475 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Dec 03 06:45:40 crc kubenswrapper[4475]: E1203 06:45:40.123849 4475 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 03 06:45:40 crc kubenswrapper[4475]: E1203 06:45:40.123867 4475 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-12-03 06:45:48.123861642 +0000 UTC m=+32.928759977 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 03 06:45:40 crc kubenswrapper[4475]: E1203 06:45:40.123900 4475 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Dec 03 06:45:40 crc kubenswrapper[4475]: E1203 06:45:40.123907 4475 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Dec 03 06:45:40 crc kubenswrapper[4475]: E1203 06:45:40.123915 4475 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 03 06:45:40 crc kubenswrapper[4475]: E1203 06:45:40.123930 4475 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-12-03 06:45:48.123925573 +0000 UTC m=+32.928823907 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 03 06:45:40 crc kubenswrapper[4475]: I1203 06:45:40.197639 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:45:40 crc kubenswrapper[4475]: I1203 06:45:40.197659 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:45:40 crc kubenswrapper[4475]: I1203 06:45:40.197667 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:45:40 crc kubenswrapper[4475]: I1203 06:45:40.197677 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:45:40 crc kubenswrapper[4475]: I1203 06:45:40.197686 4475 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:45:40Z","lastTransitionTime":"2025-12-03T06:45:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:45:40 crc kubenswrapper[4475]: I1203 06:45:40.298891 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:45:40 crc kubenswrapper[4475]: I1203 06:45:40.299033 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:45:40 crc kubenswrapper[4475]: I1203 06:45:40.299041 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:45:40 crc kubenswrapper[4475]: I1203 06:45:40.299052 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:45:40 crc kubenswrapper[4475]: I1203 06:45:40.299059 4475 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:45:40Z","lastTransitionTime":"2025-12-03T06:45:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:45:40 crc kubenswrapper[4475]: I1203 06:45:40.401309 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:45:40 crc kubenswrapper[4475]: I1203 06:45:40.401346 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:45:40 crc kubenswrapper[4475]: I1203 06:45:40.401355 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:45:40 crc kubenswrapper[4475]: I1203 06:45:40.401367 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:45:40 crc kubenswrapper[4475]: I1203 06:45:40.401376 4475 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:45:40Z","lastTransitionTime":"2025-12-03T06:45:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 06:45:40 crc kubenswrapper[4475]: I1203 06:45:40.490321 4475 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 03 06:45:40 crc kubenswrapper[4475]: I1203 06:45:40.490345 4475 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 03 06:45:40 crc kubenswrapper[4475]: I1203 06:45:40.490350 4475 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 03 06:45:40 crc kubenswrapper[4475]: E1203 06:45:40.490412 4475 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 03 06:45:40 crc kubenswrapper[4475]: E1203 06:45:40.490551 4475 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 03 06:45:40 crc kubenswrapper[4475]: E1203 06:45:40.490606 4475 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 03 06:45:40 crc kubenswrapper[4475]: I1203 06:45:40.502877 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:45:40 crc kubenswrapper[4475]: I1203 06:45:40.502910 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:45:40 crc kubenswrapper[4475]: I1203 06:45:40.502919 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:45:40 crc kubenswrapper[4475]: I1203 06:45:40.502933 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:45:40 crc kubenswrapper[4475]: I1203 06:45:40.502943 4475 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:45:40Z","lastTransitionTime":"2025-12-03T06:45:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:45:40 crc kubenswrapper[4475]: I1203 06:45:40.604130 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:45:40 crc kubenswrapper[4475]: I1203 06:45:40.604172 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:45:40 crc kubenswrapper[4475]: I1203 06:45:40.604181 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:45:40 crc kubenswrapper[4475]: I1203 06:45:40.604194 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:45:40 crc kubenswrapper[4475]: I1203 06:45:40.604202 4475 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:45:40Z","lastTransitionTime":"2025-12-03T06:45:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:45:40 crc kubenswrapper[4475]: I1203 06:45:40.612528 4475 generic.go:334] "Generic (PLEG): container finished" podID="7168f008-1b03-40cf-94fa-a71d470454bf" containerID="d644ab44eabce045c9f9b23fab29e574e2f9f49c0cc14b830560996a0ec98880" exitCode=0 Dec 03 06:45:40 crc kubenswrapper[4475]: I1203 06:45:40.612560 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-k9cmc" event={"ID":"7168f008-1b03-40cf-94fa-a71d470454bf","Type":"ContainerDied","Data":"d644ab44eabce045c9f9b23fab29e574e2f9f49c0cc14b830560996a0ec98880"} Dec 03 06:45:40 crc kubenswrapper[4475]: I1203 06:45:40.625679 4475 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"57d2f580-9528-4200-b0a4-797fed1ae972\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://822cdbfb2e81d80c5de0253daa42f2a5c89e9cd0eb8a5c3cf620780d17f9a6d0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev
/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:45:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d66a9136874b2e25c94cd291aa6d7f4694ac409f16766fd69c8aab8068a441fb\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:45:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e40c4f29925f494c0f5f01e2ecbcd2e4db2a5f3911a55a874c6d0006f01982de\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:45:16Z\\\"}},\\\"volumeMoun
ts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7fc0ee9e5a408a0a9e701afaf1db7bc3f58fd1830044730e9c680664642b5e4e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:45:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1dd8bd42f01469966b55416fc8af1dd71d341c774263bb3a56190af4cd9e7daa\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:45:17Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5da1155d7b5e933e5db3acc4c1a3fa1b3b90fd79289641f9a3d1290956128628\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720
243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5da1155d7b5e933e5db3acc4c1a3fa1b3b90fd79289641f9a3d1290956128628\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T06:45:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T06:45:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T06:45:15Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:45:40Z is after 2025-08-24T17:21:41Z" Dec 03 06:45:40 crc kubenswrapper[4475]: I1203 06:45:40.636516 4475 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a0af3d80-5aae-4d3b-a974-490687df49f3\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fa848c68a20d5db5c603cafa808518de84e427cbeea4bbc1be31151e6f839b3e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:45:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fe3e0d5fed18fddd7a1174f7a9f12290ce318e9a0de40fe432c79f6f2e24a608\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:45:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://43c05977da6544bc781a279fcddb3279dfee510fdd0a6f4f1a22b8629f17475f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:45:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ef987b2e9a0fa630edf6d5c06d5f47c5debd1b75d4626aefe7d8ef44bb974eb8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-12-03T06:45:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T06:45:15Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:45:40Z is after 2025-08-24T17:21:41Z" Dec 03 06:45:40 crc kubenswrapper[4475]: I1203 06:45:40.646517 4475 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:32Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:32Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:45:40Z is after 2025-08-24T17:21:41Z" Dec 03 06:45:40 crc kubenswrapper[4475]: I1203 06:45:40.655077 4475 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-tjbzg" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"91aee7be-4a52-4598-803f-2deebe0674de\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f13f644093fd1214d8fb39853857b4113dd7fde64f1a60ff6848fd4c5350f5b6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:45:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xvqvg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://159d103ae2d5d19ea94c57a59b534773f0e32f4c
b379a412b63ca743e221096e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:45:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xvqvg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T06:45:35Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-tjbzg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:45:40Z is after 2025-08-24T17:21:41Z" Dec 03 06:45:40 crc kubenswrapper[4475]: I1203 06:45:40.669847 4475 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"897f1a97-930a-4c3c-8804-d7cd6006ae9c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fbb015d3e05f9f94fc225cce6e24bc4a5df0bfc5aaea15fe120e2cc4b8f02902\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:45:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://da747a5ea4f790c71d99693c4bd79a1074f756a20f628fa63e8bad9a713645fc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:45:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6bf56315b6ad05ea9af0319db29b919ed0332d2a671c5ba94ea325bd45ef5703\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:45:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e045b99328661616ea0e44cd50bd394a403836eede05459d117567c191401172\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:45:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://054b1d2565cc9690152740f71682028595283525344a38ccea66c1f072eae92b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:45:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d5e0ad88c2e55994f952b46c2e806792d8fcbd79a901810aef92e46067cc7b92\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d5e0ad88c2e55994f952b46c2e806792d8fcbd79a901810aef92e46067cc7b92\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2025-12-03T06:45:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T06:45:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://22796f78d3d551f1ee271ca8581e196f142e70622944154f7d408a88c098f53b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://22796f78d3d551f1ee271ca8581e196f142e70622944154f7d408a88c098f53b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T06:45:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T06:45:16Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://9fb973559072f07252dcf50bda74d422ea2ed50000c02105381f8d21e5ff9888\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9fb973559072f07252dcf50bda74d422ea2ed50000c02105381f8d21e5ff9888\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T06:45:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T06:45:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T06:45:15Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:45:40Z is after 2025-08-24T17:21:41Z" Dec 03 06:45:40 crc kubenswrapper[4475]: I1203 06:45:40.677558 4475 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-dqbgx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9ef36226-4b8b-4a7b-a87f-daa9dda6e70b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5dc78fa3b07b9a5535f697323e9ed322ceefdc8798157160a05eb71017ac3a29\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235
da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:45:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wjjp4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T06:45:38Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-dqbgx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:45:40Z is after 2025-08-24T17:21:41Z" Dec 03 06:45:40 crc kubenswrapper[4475]: I1203 06:45:40.685775 4475 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:32Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:32Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:45:40Z is after 2025-08-24T17:21:41Z" Dec 03 06:45:40 crc kubenswrapper[4475]: I1203 06:45:40.692379 4475 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-pcw7j" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8c1979d0-303c-4cf6-9087-3cb2e1aac73b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://eebaa73cf4e1efd781b258dd26910dc004392716180b14a7e64e89a03f2032a1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:45:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nrf7r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T06:45:34Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-pcw7j\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:45:40Z is after 2025-08-24T17:21:41Z" Dec 03 06:45:40 crc kubenswrapper[4475]: I1203 06:45:40.704466 4475 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-9b2j8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f3a17c67-95e0-4889-8a30-64c08b6720f4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d2d627e2c307a8db9c86e8020f2b1c25c6e061e0c6460be63e231566488beaca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\
\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:45:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6pdk6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\
\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T06:45:34Z\\\"}}\" for pod \"openshift-multus\"/\"multus-9b2j8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:45:40Z is after 2025-08-24T17:21:41Z" Dec 03 06:45:40 crc kubenswrapper[4475]: I1203 06:45:40.706602 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:45:40 crc kubenswrapper[4475]: I1203 06:45:40.706627 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:45:40 crc kubenswrapper[4475]: I1203 06:45:40.706635 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:45:40 crc kubenswrapper[4475]: I1203 06:45:40.706646 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:45:40 crc kubenswrapper[4475]: I1203 06:45:40.706654 4475 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:45:40Z","lastTransitionTime":"2025-12-03T06:45:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:45:40 crc kubenswrapper[4475]: I1203 06:45:40.718763 4475 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-g9t4l" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8f42839e-dbc4-445a-a15b-c3aa14813958\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:35Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:35Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ppdm2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ppdm2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ppdm2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ppdm2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ppdm2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ppdm2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ppdm2\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ppdm2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://400610ebcdc7d47ecc1345287847a1909871411a12cdb3cbf895e05039b81c2b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://400610ebcdc7d47ecc1345287847a1909871411a12cdb3cbf895e05039b81c2b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T06:45:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T06:45:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ppdm2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T06:45:35Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-g9t4l\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:45:40Z is after 2025-08-24T17:21:41Z" Dec 03 06:45:40 crc kubenswrapper[4475]: I1203 06:45:40.728309 4475 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:33Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:33Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2f651c16a4a98ff0a9b4783e60ece4c410d5fcb7d05ad42bf7842d8bb8a99f27\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\"
:{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:45:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:45:40Z is after 2025-08-24T17:21:41Z" Dec 03 06:45:40 crc kubenswrapper[4475]: I1203 06:45:40.737490 4475 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:33Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:33Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://30d9a05de148a1dbe0fa8f07bbc5f4f2c3cba395d686af03f2da63f8cdfe431c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:45:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://07cf8d993193bca34b30ea77c473af45652fde6e73d0586efb78c14b9d003e22\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:45:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:45:40Z is after 2025-08-24T17:21:41Z" Dec 03 06:45:40 crc kubenswrapper[4475]: I1203 06:45:40.745641 4475 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:34Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:34Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6444fe7571ebb90d4ff4b30dc1a397023310b50b1816d0197cb545b4f5f7480f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:45:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-12-03T06:45:40Z is after 2025-08-24T17:21:41Z" Dec 03 06:45:40 crc kubenswrapper[4475]: I1203 06:45:40.753504 4475 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:32Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:32Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:45:40Z is after 2025-08-24T17:21:41Z" Dec 03 06:45:40 crc kubenswrapper[4475]: I1203 06:45:40.762863 4475 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-k9cmc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7168f008-1b03-40cf-94fa-a71d470454bf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:35Z\\\",\\\"message\\\":\\\"containers with incomplete status: 
[whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:35Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:35Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s6bbq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://31584b054f88aa7f7e4f1096e2b11acf6f106b7f2e4ced19768808e5df1a6acc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containe
rID\\\":\\\"cri-o://31584b054f88aa7f7e4f1096e2b11acf6f106b7f2e4ced19768808e5df1a6acc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T06:45:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T06:45:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s6bbq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9a644e827feb786d7298e41022ef3bc0d2483279c106dddea8e2c7a3c62c3c0d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9a644e827feb786d7298e41022ef3bc0d2483279c106dddea8e2c7a3c62c3c0d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T06:45:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T06:45:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-
allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s6bbq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://742f2f4dc23fff3df8e6d67902ef721b3db1823653b11a69faabdaf8d7650667\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://742f2f4dc23fff3df8e6d67902ef721b3db1823653b11a69faabdaf8d7650667\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T06:45:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T06:45:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s6bbq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2e5e874d26bf8bc806d74d55a8b9306cc30cca122d2ae0731b0a76ae7ac30450\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":f
alse,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2e5e874d26bf8bc806d74d55a8b9306cc30cca122d2ae0731b0a76ae7ac30450\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T06:45:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T06:45:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s6bbq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d644ab44eabce045c9f9b23fab29e574e2f9f49c0cc14b830560996a0ec98880\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d644ab44eabce045c9f9b23fab29e574e2f9f49c0cc14b830560996a0ec98880\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T06:45:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T06:45:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\
\\"kube-api-access-s6bbq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s6bbq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T06:45:35Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-k9cmc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:45:40Z is after 2025-08-24T17:21:41Z" Dec 03 06:45:40 crc kubenswrapper[4475]: I1203 06:45:40.808045 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:45:40 crc kubenswrapper[4475]: I1203 06:45:40.808070 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:45:40 crc kubenswrapper[4475]: I1203 06:45:40.808078 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:45:40 crc kubenswrapper[4475]: I1203 06:45:40.808089 4475 kubelet_node_status.go:724] "Recording event message for node" 
node="crc" event="NodeNotReady" Dec 03 06:45:40 crc kubenswrapper[4475]: I1203 06:45:40.808097 4475 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:45:40Z","lastTransitionTime":"2025-12-03T06:45:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 06:45:40 crc kubenswrapper[4475]: I1203 06:45:40.909738 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:45:40 crc kubenswrapper[4475]: I1203 06:45:40.909865 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:45:40 crc kubenswrapper[4475]: I1203 06:45:40.909948 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:45:40 crc kubenswrapper[4475]: I1203 06:45:40.910022 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:45:40 crc kubenswrapper[4475]: I1203 06:45:40.910095 4475 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:45:40Z","lastTransitionTime":"2025-12-03T06:45:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:45:41 crc kubenswrapper[4475]: I1203 06:45:41.011129 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:45:41 crc kubenswrapper[4475]: I1203 06:45:41.011171 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:45:41 crc kubenswrapper[4475]: I1203 06:45:41.011181 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:45:41 crc kubenswrapper[4475]: I1203 06:45:41.011194 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:45:41 crc kubenswrapper[4475]: I1203 06:45:41.011202 4475 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:45:41Z","lastTransitionTime":"2025-12-03T06:45:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:45:41 crc kubenswrapper[4475]: I1203 06:45:41.113486 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:45:41 crc kubenswrapper[4475]: I1203 06:45:41.113678 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:45:41 crc kubenswrapper[4475]: I1203 06:45:41.113687 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:45:41 crc kubenswrapper[4475]: I1203 06:45:41.113702 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:45:41 crc kubenswrapper[4475]: I1203 06:45:41.113711 4475 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:45:41Z","lastTransitionTime":"2025-12-03T06:45:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:45:41 crc kubenswrapper[4475]: I1203 06:45:41.215022 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:45:41 crc kubenswrapper[4475]: I1203 06:45:41.215057 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:45:41 crc kubenswrapper[4475]: I1203 06:45:41.215066 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:45:41 crc kubenswrapper[4475]: I1203 06:45:41.215078 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:45:41 crc kubenswrapper[4475]: I1203 06:45:41.215092 4475 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:45:41Z","lastTransitionTime":"2025-12-03T06:45:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:45:41 crc kubenswrapper[4475]: I1203 06:45:41.317390 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:45:41 crc kubenswrapper[4475]: I1203 06:45:41.317417 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:45:41 crc kubenswrapper[4475]: I1203 06:45:41.317425 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:45:41 crc kubenswrapper[4475]: I1203 06:45:41.317437 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:45:41 crc kubenswrapper[4475]: I1203 06:45:41.317447 4475 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:45:41Z","lastTransitionTime":"2025-12-03T06:45:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:45:41 crc kubenswrapper[4475]: I1203 06:45:41.419172 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:45:41 crc kubenswrapper[4475]: I1203 06:45:41.419215 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:45:41 crc kubenswrapper[4475]: I1203 06:45:41.419227 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:45:41 crc kubenswrapper[4475]: I1203 06:45:41.419239 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:45:41 crc kubenswrapper[4475]: I1203 06:45:41.419246 4475 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:45:41Z","lastTransitionTime":"2025-12-03T06:45:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:45:41 crc kubenswrapper[4475]: I1203 06:45:41.521356 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:45:41 crc kubenswrapper[4475]: I1203 06:45:41.521384 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:45:41 crc kubenswrapper[4475]: I1203 06:45:41.521393 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:45:41 crc kubenswrapper[4475]: I1203 06:45:41.521403 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:45:41 crc kubenswrapper[4475]: I1203 06:45:41.521412 4475 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:45:41Z","lastTransitionTime":"2025-12-03T06:45:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:45:41 crc kubenswrapper[4475]: I1203 06:45:41.616945 4475 generic.go:334] "Generic (PLEG): container finished" podID="7168f008-1b03-40cf-94fa-a71d470454bf" containerID="ae9935e33badff0b016f8b5a02cb59d8b64451364581023ca3ec8e87fba0aa3a" exitCode=0 Dec 03 06:45:41 crc kubenswrapper[4475]: I1203 06:45:41.616996 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-k9cmc" event={"ID":"7168f008-1b03-40cf-94fa-a71d470454bf","Type":"ContainerDied","Data":"ae9935e33badff0b016f8b5a02cb59d8b64451364581023ca3ec8e87fba0aa3a"} Dec 03 06:45:41 crc kubenswrapper[4475]: I1203 06:45:41.621070 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-g9t4l" event={"ID":"8f42839e-dbc4-445a-a15b-c3aa14813958","Type":"ContainerStarted","Data":"7010a29447a39602bc7aa1a509917e3b938306ca72c783d4dbd7f1b2b1388934"} Dec 03 06:45:41 crc kubenswrapper[4475]: I1203 06:45:41.621314 4475 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-g9t4l" Dec 03 06:45:41 crc kubenswrapper[4475]: I1203 06:45:41.621355 4475 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-g9t4l" Dec 03 06:45:41 crc kubenswrapper[4475]: I1203 06:45:41.623519 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:45:41 crc kubenswrapper[4475]: I1203 06:45:41.623546 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:45:41 crc kubenswrapper[4475]: I1203 06:45:41.623555 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:45:41 crc kubenswrapper[4475]: I1203 06:45:41.623568 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:45:41 
crc kubenswrapper[4475]: I1203 06:45:41.623576 4475 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:45:41Z","lastTransitionTime":"2025-12-03T06:45:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 06:45:41 crc kubenswrapper[4475]: I1203 06:45:41.628870 4475 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:32Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:32Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:45:41Z is after 2025-08-24T17:21:41Z" Dec 03 06:45:41 crc kubenswrapper[4475]: I1203 06:45:41.639836 4475 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-g9t4l" Dec 03 06:45:41 crc kubenswrapper[4475]: I1203 06:45:41.639882 4475 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-g9t4l" Dec 03 06:45:41 crc kubenswrapper[4475]: I1203 06:45:41.641610 4475 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-k9cmc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7168f008-1b03-40cf-94fa-a71d470454bf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:35Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:35Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s6bbq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://31584b054f88aa7f7e4f1096e2b11acf6f106b7f2e4ced19768808e5df1a6acc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://31584b054f88aa7f7e4f1096e2b11acf6f106b7f2e4ced19768808e5df1a6acc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T06:45:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T06:45:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s6bbq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9a644e827feb786d7298e41022ef3bc0d2483279c106dddea8e2c7a3c62c3c0d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9a644e827feb786d7298e41022ef3bc0d2483279c106dddea8e2c7a3c62c3c0d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T06:45:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T06:45:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s6bbq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://742f2f4dc23fff3df8e6d67902ef721b3db1823653b11a69faabdaf8d7650667\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367
c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://742f2f4dc23fff3df8e6d67902ef721b3db1823653b11a69faabdaf8d7650667\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T06:45:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T06:45:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s6bbq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2e5e874d26bf8bc806d74d55a8b9306cc30cca122d2ae0731b0a76ae7ac30450\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2e5e874d26bf8bc806d74d55a8b9306cc30cca122d2ae0731b0a76ae7ac30450\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T06:45:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T06:45:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\
\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s6bbq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d644ab44eabce045c9f9b23fab29e574e2f9f49c0cc14b830560996a0ec98880\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d644ab44eabce045c9f9b23fab29e574e2f9f49c0cc14b830560996a0ec98880\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T06:45:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T06:45:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s6bbq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ae9935e33badff0b016f8b5a02cb59d8b64451364581023ca3ec8e87fba0aa3a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"
ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ae9935e33badff0b016f8b5a02cb59d8b64451364581023ca3ec8e87fba0aa3a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T06:45:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T06:45:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s6bbq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T06:45:35Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-k9cmc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:45:41Z is after 2025-08-24T17:21:41Z" Dec 03 06:45:41 crc kubenswrapper[4475]: I1203 06:45:41.653325 4475 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"57d2f580-9528-4200-b0a4-797fed1ae972\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://822cdbfb2e81d80c5de0253daa42f2a5c89e9cd0eb8a5c3cf620780d17f9a6d0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:45:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d66a9136874b2e25c94cd291aa6d7f4694ac409f16766fd69c8aab8068a441fb\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:45:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e40c4f29925f494c0f5f01e2ecbcd2e4db2a5f3911a55a874c6d0006f01982de\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:45:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7fc0ee9e5a408a0a9e701afaf1db7bc3f58fd1830044730e9c680664642b5e4e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:45:1
7Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1dd8bd42f01469966b55416fc8af1dd71d341c774263bb3a56190af4cd9e7daa\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:45:17Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5da1155d7b5e933e5db3acc4c1a3fa1b3b90fd79289641f9a3d1290956128628\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5da1155d7b5e933e5db3acc4c1a3fa1b3b90fd79289641f9a3d1290956128628\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T06:45:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T06:45:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startT
ime\\\":\\\"2025-12-03T06:45:15Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:45:41Z is after 2025-08-24T17:21:41Z" Dec 03 06:45:41 crc kubenswrapper[4475]: I1203 06:45:41.661705 4475 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a0af3d80-5aae-4d3b-a974-490687df49f3\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fa848c68a20d5db5c603cafa808518de84e427cbeea4bbc1be31151e6f839b3e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\
\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:45:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fe3e0d5fed18fddd7a1174f7a9f12290ce318e9a0de40fe432c79f6f2e24a608\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:45:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://43c05977da6544bc781a279fcddb3279dfee510fdd0a6f4f1a22b8629f17475f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:45:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"c
ert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ef987b2e9a0fa630edf6d5c06d5f47c5debd1b75d4626aefe7d8ef44bb974eb8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:45:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T06:45:15Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:45:41Z is after 2025-08-24T17:21:41Z" Dec 03 06:45:41 crc kubenswrapper[4475]: I1203 06:45:41.672331 4475 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:32Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:32Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:45:41Z is after 2025-08-24T17:21:41Z" Dec 03 06:45:41 crc kubenswrapper[4475]: I1203 06:45:41.679353 4475 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-tjbzg" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"91aee7be-4a52-4598-803f-2deebe0674de\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f13f644093fd1214d8fb39853857b4113dd7fde64f1a60ff6848fd4c5350f5b6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:45:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xvqvg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://159d103ae2d5d19ea94c57a59b534773f0e32f4c
b379a412b63ca743e221096e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:45:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xvqvg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T06:45:35Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-tjbzg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:45:41Z is after 2025-08-24T17:21:41Z" Dec 03 06:45:41 crc kubenswrapper[4475]: I1203 06:45:41.692303 4475 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"897f1a97-930a-4c3c-8804-d7cd6006ae9c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fbb015d3e05f9f94fc225cce6e24bc4a5df0bfc5aaea15fe120e2cc4b8f02902\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:45:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://da747a5ea4f790c71d99693c4bd79a1074f756a20f628fa63e8bad9a713645fc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:45:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6bf56315b6ad05ea9af0319db29b919ed0332d2a671c5ba94ea325bd45ef5703\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:45:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e045b99328661616ea0e44cd50bd394a403836eede05459d117567c191401172\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:45:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://054b1d2565cc9690152740f71682028595283525344a38ccea66c1f072eae92b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:45:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d5e0ad88c2e55994f952b46c2e806792d8fcbd79a901810aef92e46067cc7b92\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d5e0ad88c2e55994f952b46c2e806792d8fcbd79a901810aef92e46067cc7b92\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2025-12-03T06:45:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T06:45:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://22796f78d3d551f1ee271ca8581e196f142e70622944154f7d408a88c098f53b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://22796f78d3d551f1ee271ca8581e196f142e70622944154f7d408a88c098f53b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T06:45:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T06:45:16Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://9fb973559072f07252dcf50bda74d422ea2ed50000c02105381f8d21e5ff9888\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9fb973559072f07252dcf50bda74d422ea2ed50000c02105381f8d21e5ff9888\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T06:45:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T06:45:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T06:45:15Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:45:41Z is after 2025-08-24T17:21:41Z" Dec 03 06:45:41 crc kubenswrapper[4475]: I1203 06:45:41.699035 4475 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-dqbgx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9ef36226-4b8b-4a7b-a87f-daa9dda6e70b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5dc78fa3b07b9a5535f697323e9ed322ceefdc8798157160a05eb71017ac3a29\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235
da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:45:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wjjp4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T06:45:38Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-dqbgx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:45:41Z is after 2025-08-24T17:21:41Z" Dec 03 06:45:41 crc kubenswrapper[4475]: I1203 06:45:41.706755 4475 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:33Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:33Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2f651c16a4a98ff0a9b4783e60ece4c410d5fcb7d05ad42bf7842d8bb8a99f27\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:45:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-12-03T06:45:41Z is after 2025-08-24T17:21:41Z" Dec 03 06:45:41 crc kubenswrapper[4475]: I1203 06:45:41.714554 4475 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:33Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:33Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://30d9a05de148a1dbe0fa8f07bbc5f4f2c3cba395d686af03f2da63f8cdfe431c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:45:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"c
ri-o://07cf8d993193bca34b30ea77c473af45652fde6e73d0586efb78c14b9d003e22\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:45:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:45:41Z is after 2025-08-24T17:21:41Z" Dec 03 06:45:41 crc kubenswrapper[4475]: I1203 06:45:41.722838 4475 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:34Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:34Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6444fe7571ebb90d4ff4b30dc1a397023310b50b1816d0197cb545b4f5f7480f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:45:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-12-03T06:45:41Z is after 2025-08-24T17:21:41Z" Dec 03 06:45:41 crc kubenswrapper[4475]: I1203 06:45:41.726324 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:45:41 crc kubenswrapper[4475]: I1203 06:45:41.726347 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:45:41 crc kubenswrapper[4475]: I1203 06:45:41.726355 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:45:41 crc kubenswrapper[4475]: I1203 06:45:41.726379 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:45:41 crc kubenswrapper[4475]: I1203 06:45:41.726387 4475 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:45:41Z","lastTransitionTime":"2025-12-03T06:45:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:45:41 crc kubenswrapper[4475]: I1203 06:45:41.734534 4475 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:32Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:32Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:45:41Z is after 2025-08-24T17:21:41Z" Dec 03 06:45:41 crc kubenswrapper[4475]: I1203 06:45:41.741528 4475 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-pcw7j" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8c1979d0-303c-4cf6-9087-3cb2e1aac73b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://eebaa73cf4e1efd781b258dd26910dc004392716180b14a7e64e89a03f2032a1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:45:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nrf7r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T06:45:34Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-pcw7j\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:45:41Z is after 2025-08-24T17:21:41Z" Dec 03 06:45:41 crc kubenswrapper[4475]: I1203 06:45:41.749084 4475 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-9b2j8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f3a17c67-95e0-4889-8a30-64c08b6720f4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d2d627e2c307a8db9c86e8020f2b1c25c6e061e0c6460be63e231566488beaca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\
\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:45:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6pdk6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\
\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T06:45:34Z\\\"}}\" for pod \"openshift-multus\"/\"multus-9b2j8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:45:41Z is after 2025-08-24T17:21:41Z" Dec 03 06:45:41 crc kubenswrapper[4475]: I1203 06:45:41.761729 4475 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-g9t4l" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8f42839e-dbc4-445a-a15b-c3aa14813958\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:35Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:35Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ppdm2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ppdm2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ppdm2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ppdm2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ppdm2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ppdm2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ppdm2\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ppdm2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://400610ebcdc7d47ecc1345287847a1909871411a12cdb3cbf895e05039b81c2b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://400610ebcdc7d47ecc1345287847a1909871411a12cdb3cbf895e05039b81c2b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T06:45:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T06:45:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ppdm2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T06:45:35Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-g9t4l\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:45:41Z is after 2025-08-24T17:21:41Z" Dec 03 06:45:41 crc kubenswrapper[4475]: I1203 06:45:41.769899 4475 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"57d2f580-9528-4200-b0a4-797fed1ae972\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://822cdbfb2e81d80c5de0253daa42f2a5c89e9cd0eb8a5c3cf620780d17f9a6d0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335
e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:45:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d66a9136874b2e25c94cd291aa6d7f4694ac409f16766fd69c8aab8068a441fb\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:45:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e40c4f29925f494c0f5f01e2ecbcd2e4db2a5f3911a55a874c6d0006f01982de\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"
running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:45:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7fc0ee9e5a408a0a9e701afaf1db7bc3f58fd1830044730e9c680664642b5e4e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:45:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1dd8bd42f01469966b55416fc8af1dd71d341c774263bb3a56190af4cd9e7daa\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:45:17Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5da1155d7b5e933e5db3acc4c1a3fa1b3b90fd79289641f9a3d1290956128628\\\",\\\"image\\\":\\\"qua
y.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5da1155d7b5e933e5db3acc4c1a3fa1b3b90fd79289641f9a3d1290956128628\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T06:45:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T06:45:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T06:45:15Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:45:41Z is after 2025-08-24T17:21:41Z" Dec 03 06:45:41 crc kubenswrapper[4475]: I1203 06:45:41.777665 4475 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a0af3d80-5aae-4d3b-a974-490687df49f3\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fa848c68a20d5db5c603cafa808518de84e427cbeea4bbc1be31151e6f839b3e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:45:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fe3e0d5fed18fddd7a1174f7a9f12290ce318e9a0de40fe432c79f6f2e24a608\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:45:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://43c05977da6544bc781a279fcddb3279dfee510fdd0a6f4f1a22b8629f17475f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:45:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ef987b2e9a0fa630edf6d5c06d5f47c5debd1b75d4626aefe7d8ef44bb974eb8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-12-03T06:45:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T06:45:15Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:45:41Z is after 2025-08-24T17:21:41Z" Dec 03 06:45:41 crc kubenswrapper[4475]: I1203 06:45:41.785765 4475 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:32Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:32Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:45:41Z is after 2025-08-24T17:21:41Z" Dec 03 06:45:41 crc kubenswrapper[4475]: I1203 06:45:41.793037 4475 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-tjbzg" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"91aee7be-4a52-4598-803f-2deebe0674de\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f13f644093fd1214d8fb39853857b4113dd7fde64f1a60ff6848fd4c5350f5b6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:45:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xvqvg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://159d103ae2d5d19ea94c57a59b534773f0e32f4c
b379a412b63ca743e221096e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:45:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xvqvg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T06:45:35Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-tjbzg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:45:41Z is after 2025-08-24T17:21:41Z" Dec 03 06:45:41 crc kubenswrapper[4475]: I1203 06:45:41.808951 4475 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"897f1a97-930a-4c3c-8804-d7cd6006ae9c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fbb015d3e05f9f94fc225cce6e24bc4a5df0bfc5aaea15fe120e2cc4b8f02902\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:45:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://da747a5ea4f790c71d99693c4bd79a1074f756a20f628fa63e8bad9a713645fc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:45:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6bf56315b6ad05ea9af0319db29b919ed0332d2a671c5ba94ea325bd45ef5703\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:45:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e045b99328661616ea0e44cd50bd394a403836eede05459d117567c191401172\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:45:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://054b1d2565cc9690152740f71682028595283525344a38ccea66c1f072eae92b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:45:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d5e0ad88c2e55994f952b46c2e806792d8fcbd79a901810aef92e46067cc7b92\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d5e0ad88c2e55994f952b46c2e806792d8fcbd79a901810aef92e46067cc7b92\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2025-12-03T06:45:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T06:45:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://22796f78d3d551f1ee271ca8581e196f142e70622944154f7d408a88c098f53b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://22796f78d3d551f1ee271ca8581e196f142e70622944154f7d408a88c098f53b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T06:45:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T06:45:16Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://9fb973559072f07252dcf50bda74d422ea2ed50000c02105381f8d21e5ff9888\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9fb973559072f07252dcf50bda74d422ea2ed50000c02105381f8d21e5ff9888\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T06:45:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T06:45:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T06:45:15Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:45:41Z is after 2025-08-24T17:21:41Z" Dec 03 06:45:41 crc kubenswrapper[4475]: I1203 06:45:41.815139 4475 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-dqbgx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9ef36226-4b8b-4a7b-a87f-daa9dda6e70b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5dc78fa3b07b9a5535f697323e9ed322ceefdc8798157160a05eb71017ac3a29\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235
da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:45:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wjjp4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T06:45:38Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-dqbgx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:45:41Z is after 2025-08-24T17:21:41Z" Dec 03 06:45:41 crc kubenswrapper[4475]: I1203 06:45:41.823565 4475 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-9b2j8" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f3a17c67-95e0-4889-8a30-64c08b6720f4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d2d627e2c307a8db9c86e8020f2b1c25c6e061e0c6460be63e231566488beaca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:45:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6pdk6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T06:45:34Z\\\"}}\" for pod \"openshift-multus\"/\"multus-9b2j8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:45:41Z is after 2025-08-24T17:21:41Z" Dec 03 06:45:41 crc kubenswrapper[4475]: I1203 06:45:41.828626 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:45:41 crc 
kubenswrapper[4475]: I1203 06:45:41.828656 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:45:41 crc kubenswrapper[4475]: I1203 06:45:41.828664 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:45:41 crc kubenswrapper[4475]: I1203 06:45:41.828678 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:45:41 crc kubenswrapper[4475]: I1203 06:45:41.828686 4475 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:45:41Z","lastTransitionTime":"2025-12-03T06:45:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 06:45:41 crc kubenswrapper[4475]: I1203 06:45:41.835248 4475 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-g9t4l" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8f42839e-dbc4-445a-a15b-c3aa14813958\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:35Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:35Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://53948489397bbbfdf5f766211088d7f12fcd2dfbc8c3da6493e5abc49e3b41f5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:45:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ppdm2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a5090474cca8b8e2ed539ea74377506638d300be7eb750b3f3285477d8c9a375\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:45:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ppdm2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://60d3ec7cab1f249e81ae1db9ab97fa02e8b3c9d8376af4c6682dc3fc6f9d6d92\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:45:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ppdm2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b3243c863a4fb593b39fc3e3b835f647e9373d8b2dec69c5ff7657ed73c8f78a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d20994829
19d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:45:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ppdm2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://32897756f3658fda95db77180a0553a9d8656ed49c3ae5a017d32f5c5133a5a9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:45:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ppdm2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5e288f95676d5823cd3cb005318489d2f629a8fb74ad17ce6a67978d76006192\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cd
d47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:45:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ppdm2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7010a29447a39602bc7aa1a509917e3b938306ca72c783d4dbd7f1b2b1388934\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:45:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mou
ntPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ppdm2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://66a9c7568957099255bc910496da695e2af0122f2c853c3e221c666d7c2dee78\\\",\
\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:45:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ppdm2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://400610ebcdc7d47ecc1345287847a1909871411a12cdb3cbf895e05039b81c2b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://400610ebcdc7d47ecc1345287847a1909871411a12cdb3cbf895e05039b81c2b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T06:45:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T06:45:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\
\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ppdm2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T06:45:35Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-g9t4l\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:45:41Z is after 2025-08-24T17:21:41Z" Dec 03 06:45:41 crc kubenswrapper[4475]: I1203 06:45:41.843600 4475 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:33Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:33Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2f651c16a4a98ff0a9b4783e60ece4c410d5fcb7d05ad42bf7842d8bb8a99f27\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:45:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-12-03T06:45:41Z is after 2025-08-24T17:21:41Z" Dec 03 06:45:41 crc kubenswrapper[4475]: I1203 06:45:41.851178 4475 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:33Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:33Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://30d9a05de148a1dbe0fa8f07bbc5f4f2c3cba395d686af03f2da63f8cdfe431c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:45:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"c
ri-o://07cf8d993193bca34b30ea77c473af45652fde6e73d0586efb78c14b9d003e22\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:45:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:45:41Z is after 2025-08-24T17:21:41Z" Dec 03 06:45:41 crc kubenswrapper[4475]: I1203 06:45:41.858532 4475 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:34Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:34Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6444fe7571ebb90d4ff4b30dc1a397023310b50b1816d0197cb545b4f5f7480f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:45:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-12-03T06:45:41Z is after 2025-08-24T17:21:41Z" Dec 03 06:45:41 crc kubenswrapper[4475]: I1203 06:45:41.865712 4475 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:32Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:32Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:45:41Z is after 2025-08-24T17:21:41Z" Dec 03 06:45:41 crc kubenswrapper[4475]: I1203 06:45:41.872082 4475 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-pcw7j" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8c1979d0-303c-4cf6-9087-3cb2e1aac73b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://eebaa73cf4e1efd781b258dd26910dc004392716180b14a7e64e89a03f2032a1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:45:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nrf7r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T06:45:34Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-pcw7j\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:45:41Z is after 2025-08-24T17:21:41Z" Dec 03 06:45:41 crc kubenswrapper[4475]: I1203 06:45:41.880239 4475 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:32Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:32Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:45:41Z is after 2025-08-24T17:21:41Z" Dec 03 06:45:41 crc kubenswrapper[4475]: I1203 06:45:41.888789 4475 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-k9cmc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7168f008-1b03-40cf-94fa-a71d470454bf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:35Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:35Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s6bbq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://31584b054f88aa7f7e4f1096e2b11acf6f106b7f2e4ced19768808e5df1a6acc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://31584b054f88aa7f7e4f1096e2b11acf6f106b7f2e4ced19768808e5df1a6acc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T06:45:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T06:45:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"moun
tPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s6bbq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9a644e827feb786d7298e41022ef3bc0d2483279c106dddea8e2c7a3c62c3c0d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9a644e827feb786d7298e41022ef3bc0d2483279c106dddea8e2c7a3c62c3c0d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T06:45:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T06:45:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s6bbq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://742f2f4dc23fff3df8e6d67902ef721b3db18
23653b11a69faabdaf8d7650667\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://742f2f4dc23fff3df8e6d67902ef721b3db1823653b11a69faabdaf8d7650667\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T06:45:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T06:45:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s6bbq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2e5e874d26bf8bc806d74d55a8b9306cc30cca122d2ae0731b0a76ae7ac30450\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2e5e874d26bf8bc806d74d55a8b9306cc30cca122d2ae0731b0a76ae7ac30450\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T06:45:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-1
2-03T06:45:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s6bbq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d644ab44eabce045c9f9b23fab29e574e2f9f49c0cc14b830560996a0ec98880\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d644ab44eabce045c9f9b23fab29e574e2f9f49c0cc14b830560996a0ec98880\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T06:45:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T06:45:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s6bbq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ae9935e33badff0b016f8b5a02cb59d8b64451364581023ca3ec8e87fba0aa3a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha25
6:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ae9935e33badff0b016f8b5a02cb59d8b64451364581023ca3ec8e87fba0aa3a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T06:45:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T06:45:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s6bbq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T06:45:35Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-k9cmc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:45:41Z is after 2025-08-24T17:21:41Z" Dec 03 06:45:41 crc kubenswrapper[4475]: I1203 06:45:41.930194 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:45:41 crc kubenswrapper[4475]: I1203 06:45:41.930220 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:45:41 crc kubenswrapper[4475]: I1203 06:45:41.930228 4475 kubelet_node_status.go:724] "Recording event message for node" 
node="crc" event="NodeHasSufficientPID" Dec 03 06:45:41 crc kubenswrapper[4475]: I1203 06:45:41.930239 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:45:41 crc kubenswrapper[4475]: I1203 06:45:41.930247 4475 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:45:41Z","lastTransitionTime":"2025-12-03T06:45:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 06:45:42 crc kubenswrapper[4475]: I1203 06:45:42.032293 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:45:42 crc kubenswrapper[4475]: I1203 06:45:42.032326 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:45:42 crc kubenswrapper[4475]: I1203 06:45:42.032334 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:45:42 crc kubenswrapper[4475]: I1203 06:45:42.032347 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:45:42 crc kubenswrapper[4475]: I1203 06:45:42.032356 4475 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:45:42Z","lastTransitionTime":"2025-12-03T06:45:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:45:42 crc kubenswrapper[4475]: I1203 06:45:42.133930 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:45:42 crc kubenswrapper[4475]: I1203 06:45:42.133960 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:45:42 crc kubenswrapper[4475]: I1203 06:45:42.133969 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:45:42 crc kubenswrapper[4475]: I1203 06:45:42.133981 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:45:42 crc kubenswrapper[4475]: I1203 06:45:42.133990 4475 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:45:42Z","lastTransitionTime":"2025-12-03T06:45:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:45:42 crc kubenswrapper[4475]: I1203 06:45:42.235593 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:45:42 crc kubenswrapper[4475]: I1203 06:45:42.235626 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:45:42 crc kubenswrapper[4475]: I1203 06:45:42.235634 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:45:42 crc kubenswrapper[4475]: I1203 06:45:42.235648 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:45:42 crc kubenswrapper[4475]: I1203 06:45:42.235658 4475 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:45:42Z","lastTransitionTime":"2025-12-03T06:45:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:45:42 crc kubenswrapper[4475]: I1203 06:45:42.337560 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:45:42 crc kubenswrapper[4475]: I1203 06:45:42.337662 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:45:42 crc kubenswrapper[4475]: I1203 06:45:42.337741 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:45:42 crc kubenswrapper[4475]: I1203 06:45:42.337807 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:45:42 crc kubenswrapper[4475]: I1203 06:45:42.337863 4475 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:45:42Z","lastTransitionTime":"2025-12-03T06:45:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:45:42 crc kubenswrapper[4475]: I1203 06:45:42.439498 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:45:42 crc kubenswrapper[4475]: I1203 06:45:42.439531 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:45:42 crc kubenswrapper[4475]: I1203 06:45:42.439539 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:45:42 crc kubenswrapper[4475]: I1203 06:45:42.439553 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:45:42 crc kubenswrapper[4475]: I1203 06:45:42.439561 4475 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:45:42Z","lastTransitionTime":"2025-12-03T06:45:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 06:45:42 crc kubenswrapper[4475]: I1203 06:45:42.490527 4475 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 03 06:45:42 crc kubenswrapper[4475]: I1203 06:45:42.490539 4475 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 03 06:45:42 crc kubenswrapper[4475]: E1203 06:45:42.490629 4475 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 03 06:45:42 crc kubenswrapper[4475]: E1203 06:45:42.490687 4475 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 03 06:45:42 crc kubenswrapper[4475]: I1203 06:45:42.490541 4475 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 03 06:45:42 crc kubenswrapper[4475]: E1203 06:45:42.490936 4475 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 03 06:45:42 crc kubenswrapper[4475]: I1203 06:45:42.541478 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:45:42 crc kubenswrapper[4475]: I1203 06:45:42.541512 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:45:42 crc kubenswrapper[4475]: I1203 06:45:42.541523 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:45:42 crc kubenswrapper[4475]: I1203 06:45:42.541536 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:45:42 crc kubenswrapper[4475]: I1203 06:45:42.541543 4475 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:45:42Z","lastTransitionTime":"2025-12-03T06:45:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:45:42 crc kubenswrapper[4475]: I1203 06:45:42.626141 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-k9cmc" event={"ID":"7168f008-1b03-40cf-94fa-a71d470454bf","Type":"ContainerStarted","Data":"625db083ebf24244e0b28ac937bfa2554497ca35b8f7a1fee0ac739d647c70de"} Dec 03 06:45:42 crc kubenswrapper[4475]: I1203 06:45:42.626204 4475 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Dec 03 06:45:42 crc kubenswrapper[4475]: I1203 06:45:42.636779 4475 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"57d2f580-9528-4200-b0a4-797fed1ae972\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://822cdbfb2e81d80c5de0253daa42f2a5c89e9cd0eb8a5c3cf620780d17f9a6d0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\"
:{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:45:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d66a9136874b2e25c94cd291aa6d7f4694ac409f16766fd69c8aab8068a441fb\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:45:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e40c4f29925f494c0f5f01e2ecbcd2e4db2a5f3911a55a874c6d0006f01982de\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:45:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\
"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7fc0ee9e5a408a0a9e701afaf1db7bc3f58fd1830044730e9c680664642b5e4e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:45:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1dd8bd42f01469966b55416fc8af1dd71d341c774263bb3a56190af4cd9e7daa\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:45:17Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5da1155d7b5e933e5db3acc4c1a3fa1b3b90fd79289641f9a3d1290956128628\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@
sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5da1155d7b5e933e5db3acc4c1a3fa1b3b90fd79289641f9a3d1290956128628\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T06:45:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T06:45:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T06:45:15Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:45:42Z is after 2025-08-24T17:21:41Z" Dec 03 06:45:42 crc kubenswrapper[4475]: I1203 06:45:42.643573 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:45:42 crc kubenswrapper[4475]: I1203 06:45:42.643599 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:45:42 crc kubenswrapper[4475]: I1203 06:45:42.643608 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:45:42 crc kubenswrapper[4475]: I1203 06:45:42.643620 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:45:42 crc kubenswrapper[4475]: I1203 06:45:42.643628 4475 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:45:42Z","lastTransitionTime":"2025-12-03T06:45:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 06:45:42 crc kubenswrapper[4475]: I1203 06:45:42.646638 4475 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a0af3d80-5aae-4d3b-a974-490687df49f3\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fa848c68a20d5db5c603cafa808518de84e427cbeea4bbc1be31151e6f839b3e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\
\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:45:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fe3e0d5fed18fddd7a1174f7a9f12290ce318e9a0de40fe432c79f6f2e24a608\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:45:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://43c05977da6544bc781a279fcddb3279dfee510fdd0a6f4f1a22b8629f17475f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:45:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\
\":\\\"cri-o://ef987b2e9a0fa630edf6d5c06d5f47c5debd1b75d4626aefe7d8ef44bb974eb8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:45:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T06:45:15Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:45:42Z is after 2025-08-24T17:21:41Z" Dec 03 06:45:42 crc kubenswrapper[4475]: I1203 06:45:42.660920 4475 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:32Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:32Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:45:42Z is after 2025-08-24T17:21:41Z" Dec 03 06:45:42 crc kubenswrapper[4475]: I1203 06:45:42.670109 4475 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-tjbzg" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"91aee7be-4a52-4598-803f-2deebe0674de\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f13f644093fd1214d8fb39853857b4113dd7fde64f1a60ff6848fd4c5350f5b6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:45:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xvqvg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://159d103ae2d5d19ea94c57a59b534773f0e32f4c
b379a412b63ca743e221096e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:45:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xvqvg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T06:45:35Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-tjbzg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:45:42Z is after 2025-08-24T17:21:41Z" Dec 03 06:45:42 crc kubenswrapper[4475]: I1203 06:45:42.683539 4475 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"897f1a97-930a-4c3c-8804-d7cd6006ae9c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fbb015d3e05f9f94fc225cce6e24bc4a5df0bfc5aaea15fe120e2cc4b8f02902\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:45:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://da747a5ea4f790c71d99693c4bd79a1074f756a20f628fa63e8bad9a713645fc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:45:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6bf56315b6ad05ea9af0319db29b919ed0332d2a671c5ba94ea325bd45ef5703\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:45:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e045b99328661616ea0e44cd50bd394a403836eede05459d117567c191401172\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:45:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://054b1d2565cc9690152740f71682028595283525344a38ccea66c1f072eae92b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:45:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d5e0ad88c2e55994f952b46c2e806792d8fcbd79a901810aef92e46067cc7b92\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d5e0ad88c2e55994f952b46c2e806792d8fcbd79a901810aef92e46067cc7b92\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2025-12-03T06:45:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T06:45:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://22796f78d3d551f1ee271ca8581e196f142e70622944154f7d408a88c098f53b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://22796f78d3d551f1ee271ca8581e196f142e70622944154f7d408a88c098f53b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T06:45:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T06:45:16Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://9fb973559072f07252dcf50bda74d422ea2ed50000c02105381f8d21e5ff9888\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9fb973559072f07252dcf50bda74d422ea2ed50000c02105381f8d21e5ff9888\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T06:45:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T06:45:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T06:45:15Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:45:42Z is after 2025-08-24T17:21:41Z" Dec 03 06:45:42 crc kubenswrapper[4475]: I1203 06:45:42.689915 4475 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-dqbgx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9ef36226-4b8b-4a7b-a87f-daa9dda6e70b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5dc78fa3b07b9a5535f697323e9ed322ceefdc8798157160a05eb71017ac3a29\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235
da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:45:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wjjp4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T06:45:38Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-dqbgx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:45:42Z is after 2025-08-24T17:21:41Z" Dec 03 06:45:42 crc kubenswrapper[4475]: I1203 06:45:42.698318 4475 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-9b2j8" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f3a17c67-95e0-4889-8a30-64c08b6720f4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d2d627e2c307a8db9c86e8020f2b1c25c6e061e0c6460be63e231566488beaca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:45:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6pdk6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T06:45:34Z\\\"}}\" for pod \"openshift-multus\"/\"multus-9b2j8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:45:42Z is after 2025-08-24T17:21:41Z" Dec 03 06:45:42 crc kubenswrapper[4475]: I1203 06:45:42.710339 4475 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-g9t4l" err="failed to patch 
status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8f42839e-dbc4-445a-a15b-c3aa14813958\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:35Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:35Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://53948489397bbbfdf5f766211088d7f12fcd2dfbc8c3da6493e5abc49e3b41f5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:45:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ppdm2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a5090474cca8b8e2ed539ea74377506638d300be7eb750b3f3285477d8c9a375\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:45:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ppdm2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://60d3ec7cab1f249e81ae1db9ab97fa02e8b3c9d8376af4c6682dc3fc6f9d6d92\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:45:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ppdm2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b3243c863a4fb593b39fc3e3b835f647e9373d8b2dec69c5ff7657ed73c8f78a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:45:36Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ppdm2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://32897756f3658fda95db77180a0553a9d8656ed49c3ae5a017d32f5c5133a5a9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:45:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ppdm2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5e288f95676d5823cd3cb005318489d2f629a8fb74ad17ce6a67978d76006192\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:45:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ppdm2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7010a29447a39602bc7aa1a509917e3b938306ca72c783d4dbd7f1b2b1388934\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:45:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnl
y\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ppdm2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://66a9c7568957099255bc910496da695e2af0122f2c853c3e221c666d7c2dee78\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b
17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:45:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ppdm2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://400610ebcdc7d47ecc1345287847a1909871411a12cdb3cbf895e05039b81c2b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://400610ebcdc7d47ecc1345287847a1909871411a12cdb3cbf895e05039b81c2b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T06:45:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T06:45:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ppdm2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Run
ning\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T06:45:35Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-g9t4l\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:45:42Z is after 2025-08-24T17:21:41Z" Dec 03 06:45:42 crc kubenswrapper[4475]: I1203 06:45:42.718970 4475 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:33Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:33Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2f651c16a4a98ff0a9b4783e60ece4c410d5fcb7d05ad42bf7842d8bb8a99f27\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-0
3T06:45:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:45:42Z is after 2025-08-24T17:21:41Z" Dec 03 06:45:42 crc kubenswrapper[4475]: I1203 06:45:42.726832 4475 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:33Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:33Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://30d9a05de148a1dbe0fa8f07bbc5f4f2c3cba395d686af03f2da63f8cdfe431c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:45:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://07cf8d993193bca34b30ea77c473af45652fde6e73d0586efb78c14b9d003e22\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:45:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:45:42Z is after 2025-08-24T17:21:41Z" Dec 03 06:45:42 crc kubenswrapper[4475]: I1203 06:45:42.734789 4475 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:34Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:34Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6444fe7571ebb90d4ff4b30dc1a397023310b50b1816d0197cb545b4f5f7480f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:45:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-12-03T06:45:42Z is after 2025-08-24T17:21:41Z" Dec 03 06:45:42 crc kubenswrapper[4475]: I1203 06:45:42.742531 4475 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:32Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:32Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:45:42Z is after 2025-08-24T17:21:41Z" Dec 03 06:45:42 crc kubenswrapper[4475]: I1203 06:45:42.744946 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:45:42 crc kubenswrapper[4475]: I1203 06:45:42.744978 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:45:42 crc kubenswrapper[4475]: I1203 06:45:42.744987 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:45:42 crc kubenswrapper[4475]: I1203 06:45:42.744999 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:45:42 crc kubenswrapper[4475]: I1203 06:45:42.745011 4475 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:45:42Z","lastTransitionTime":"2025-12-03T06:45:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 06:45:42 crc kubenswrapper[4475]: I1203 06:45:42.750210 4475 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-pcw7j" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8c1979d0-303c-4cf6-9087-3cb2e1aac73b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://eebaa73cf4e1efd781b258dd26910dc004392716180b14a7e64e89a03f2032a1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\
\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:45:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nrf7r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T06:45:34Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-pcw7j\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:45:42Z is after 2025-08-24T17:21:41Z" Dec 03 06:45:42 crc kubenswrapper[4475]: I1203 06:45:42.758258 4475 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:32Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:32Z\\\",\\\"message\\\":\\\"containers with unready 
status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:45:42Z is after 2025-08-24T17:21:41Z" Dec 03 06:45:42 crc kubenswrapper[4475]: I1203 06:45:42.767730 4475 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-k9cmc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7168f008-1b03-40cf-94fa-a71d470454bf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://625db083ebf24244e0b28ac937bfa2554497ca35b8f7a1fee0ac739d647c70de\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:45:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s6bbq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://31584b054f88aa7f7e4f1096e2b11acf6f106b7f2e4ced19768808e5df1a6acc\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://31584b054f88aa7f7e4f1096e2b11acf6f106b7f2e4ced19768808e5df1a6acc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T06:45:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T06:45:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s6bbq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9a644e827feb786d7298e41022ef3bc0d2483279c106dddea8e2c7a3c62c3c0d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9a644e827feb786d7298e41022ef3bc0d2483279c106dddea8e2c7a3c62c3c0d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T06:45:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T06:45:36Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s6bbq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://742f2f4dc23fff3df8e6d67902ef721b3db1823653b11a69faabdaf8d7650667\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://742f2f4dc23fff3df8e6d67902ef721b3db1823653b11a69faabdaf8d7650667\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T06:45:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T06:45:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s6bbq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2e5e8
74d26bf8bc806d74d55a8b9306cc30cca122d2ae0731b0a76ae7ac30450\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2e5e874d26bf8bc806d74d55a8b9306cc30cca122d2ae0731b0a76ae7ac30450\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T06:45:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T06:45:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s6bbq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d644ab44eabce045c9f9b23fab29e574e2f9f49c0cc14b830560996a0ec98880\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d644ab44eabce045c9f9b23fab29e574e2f9f49c0cc14b830560996a0ec98880\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T06:45:39Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2025-12-03T06:45:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s6bbq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ae9935e33badff0b016f8b5a02cb59d8b64451364581023ca3ec8e87fba0aa3a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ae9935e33badff0b016f8b5a02cb59d8b64451364581023ca3ec8e87fba0aa3a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T06:45:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T06:45:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s6bbq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T06:45:35Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-k9cmc\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:45:42Z is after 2025-08-24T17:21:41Z" Dec 03 06:45:42 crc kubenswrapper[4475]: I1203 06:45:42.846337 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:45:42 crc kubenswrapper[4475]: I1203 06:45:42.846359 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:45:42 crc kubenswrapper[4475]: I1203 06:45:42.846369 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:45:42 crc kubenswrapper[4475]: I1203 06:45:42.846381 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:45:42 crc kubenswrapper[4475]: I1203 06:45:42.846389 4475 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:45:42Z","lastTransitionTime":"2025-12-03T06:45:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:45:42 crc kubenswrapper[4475]: I1203 06:45:42.948425 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:45:42 crc kubenswrapper[4475]: I1203 06:45:42.948445 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:45:42 crc kubenswrapper[4475]: I1203 06:45:42.948476 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:45:42 crc kubenswrapper[4475]: I1203 06:45:42.948488 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:45:42 crc kubenswrapper[4475]: I1203 06:45:42.948496 4475 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:45:42Z","lastTransitionTime":"2025-12-03T06:45:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:45:43 crc kubenswrapper[4475]: I1203 06:45:43.050199 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:45:43 crc kubenswrapper[4475]: I1203 06:45:43.050235 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:45:43 crc kubenswrapper[4475]: I1203 06:45:43.050246 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:45:43 crc kubenswrapper[4475]: I1203 06:45:43.050259 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:45:43 crc kubenswrapper[4475]: I1203 06:45:43.050547 4475 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:45:43Z","lastTransitionTime":"2025-12-03T06:45:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:45:43 crc kubenswrapper[4475]: I1203 06:45:43.152632 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:45:43 crc kubenswrapper[4475]: I1203 06:45:43.152659 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:45:43 crc kubenswrapper[4475]: I1203 06:45:43.152667 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:45:43 crc kubenswrapper[4475]: I1203 06:45:43.152679 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:45:43 crc kubenswrapper[4475]: I1203 06:45:43.152687 4475 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:45:43Z","lastTransitionTime":"2025-12-03T06:45:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:45:43 crc kubenswrapper[4475]: I1203 06:45:43.254505 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:45:43 crc kubenswrapper[4475]: I1203 06:45:43.254527 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:45:43 crc kubenswrapper[4475]: I1203 06:45:43.254536 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:45:43 crc kubenswrapper[4475]: I1203 06:45:43.254546 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:45:43 crc kubenswrapper[4475]: I1203 06:45:43.254554 4475 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:45:43Z","lastTransitionTime":"2025-12-03T06:45:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:45:43 crc kubenswrapper[4475]: I1203 06:45:43.356494 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:45:43 crc kubenswrapper[4475]: I1203 06:45:43.356515 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:45:43 crc kubenswrapper[4475]: I1203 06:45:43.356523 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:45:43 crc kubenswrapper[4475]: I1203 06:45:43.356534 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:45:43 crc kubenswrapper[4475]: I1203 06:45:43.356542 4475 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:45:43Z","lastTransitionTime":"2025-12-03T06:45:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:45:43 crc kubenswrapper[4475]: I1203 06:45:43.457751 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:45:43 crc kubenswrapper[4475]: I1203 06:45:43.457784 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:45:43 crc kubenswrapper[4475]: I1203 06:45:43.457795 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:45:43 crc kubenswrapper[4475]: I1203 06:45:43.457807 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:45:43 crc kubenswrapper[4475]: I1203 06:45:43.457815 4475 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:45:43Z","lastTransitionTime":"2025-12-03T06:45:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:45:43 crc kubenswrapper[4475]: I1203 06:45:43.559504 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:45:43 crc kubenswrapper[4475]: I1203 06:45:43.559699 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:45:43 crc kubenswrapper[4475]: I1203 06:45:43.559709 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:45:43 crc kubenswrapper[4475]: I1203 06:45:43.559723 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:45:43 crc kubenswrapper[4475]: I1203 06:45:43.559733 4475 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:45:43Z","lastTransitionTime":"2025-12-03T06:45:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:45:43 crc kubenswrapper[4475]: I1203 06:45:43.629569 4475 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-g9t4l_8f42839e-dbc4-445a-a15b-c3aa14813958/ovnkube-controller/0.log" Dec 03 06:45:43 crc kubenswrapper[4475]: I1203 06:45:43.631435 4475 generic.go:334] "Generic (PLEG): container finished" podID="8f42839e-dbc4-445a-a15b-c3aa14813958" containerID="7010a29447a39602bc7aa1a509917e3b938306ca72c783d4dbd7f1b2b1388934" exitCode=1 Dec 03 06:45:43 crc kubenswrapper[4475]: I1203 06:45:43.631479 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-g9t4l" event={"ID":"8f42839e-dbc4-445a-a15b-c3aa14813958","Type":"ContainerDied","Data":"7010a29447a39602bc7aa1a509917e3b938306ca72c783d4dbd7f1b2b1388934"} Dec 03 06:45:43 crc kubenswrapper[4475]: I1203 06:45:43.631977 4475 scope.go:117] "RemoveContainer" containerID="7010a29447a39602bc7aa1a509917e3b938306ca72c783d4dbd7f1b2b1388934" Dec 03 06:45:43 crc kubenswrapper[4475]: I1203 06:45:43.642385 4475 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:32Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:32Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:45:43Z is after 2025-08-24T17:21:41Z" Dec 03 06:45:43 crc kubenswrapper[4475]: I1203 06:45:43.654808 4475 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-k9cmc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7168f008-1b03-40cf-94fa-a71d470454bf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://625db083ebf24244e0b28ac937bfa2554497ca35b8f7a1fee0ac739d647c70de\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:45:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s6bbq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://31584b054f88aa7f7e4f1096e2b11acf6f106b7f2e4ced19768808e5df1a6acc\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://31584b054f88aa7f7e4f1096e2b11acf6f106b7f2e4ced19768808e5df1a6acc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T06:45:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T06:45:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s6bbq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9a644e827feb786d7298e41022ef3bc0d2483279c106dddea8e2c7a3c62c3c0d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9a644e827feb786d7298e41022ef3bc0d2483279c106dddea8e2c7a3c62c3c0d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T06:45:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T06:45:36Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s6bbq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://742f2f4dc23fff3df8e6d67902ef721b3db1823653b11a69faabdaf8d7650667\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://742f2f4dc23fff3df8e6d67902ef721b3db1823653b11a69faabdaf8d7650667\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T06:45:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T06:45:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s6bbq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2e5e8
74d26bf8bc806d74d55a8b9306cc30cca122d2ae0731b0a76ae7ac30450\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2e5e874d26bf8bc806d74d55a8b9306cc30cca122d2ae0731b0a76ae7ac30450\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T06:45:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T06:45:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s6bbq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d644ab44eabce045c9f9b23fab29e574e2f9f49c0cc14b830560996a0ec98880\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d644ab44eabce045c9f9b23fab29e574e2f9f49c0cc14b830560996a0ec98880\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T06:45:39Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2025-12-03T06:45:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s6bbq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ae9935e33badff0b016f8b5a02cb59d8b64451364581023ca3ec8e87fba0aa3a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ae9935e33badff0b016f8b5a02cb59d8b64451364581023ca3ec8e87fba0aa3a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T06:45:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T06:45:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s6bbq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T06:45:35Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-k9cmc\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:45:43Z is after 2025-08-24T17:21:41Z" Dec 03 06:45:43 crc kubenswrapper[4475]: I1203 06:45:43.661082 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:45:43 crc kubenswrapper[4475]: I1203 06:45:43.661216 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:45:43 crc kubenswrapper[4475]: I1203 06:45:43.661275 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:45:43 crc kubenswrapper[4475]: I1203 06:45:43.661340 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:45:43 crc kubenswrapper[4475]: I1203 06:45:43.661391 4475 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:45:43Z","lastTransitionTime":"2025-12-03T06:45:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:45:43 crc kubenswrapper[4475]: I1203 06:45:43.663400 4475 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:32Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:32Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:45:43Z is after 2025-08-24T17:21:41Z" Dec 03 06:45:43 crc kubenswrapper[4475]: I1203 06:45:43.673129 4475 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-tjbzg" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"91aee7be-4a52-4598-803f-2deebe0674de\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f13f644093fd1214d8fb39853857b4113dd7fde64f1a60ff6848fd4c5350f5b6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:45:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xvqvg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://159d103ae2d5d19ea94c57a59b534773f0e32f4c
b379a412b63ca743e221096e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:45:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xvqvg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T06:45:35Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-tjbzg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:45:43Z is after 2025-08-24T17:21:41Z" Dec 03 06:45:43 crc kubenswrapper[4475]: I1203 06:45:43.682552 4475 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"57d2f580-9528-4200-b0a4-797fed1ae972\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://822cdbfb2e81d80c5de0253daa42f2a5c89e9cd0eb8a5c3cf620780d17f9a6d0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:45:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d66a9136874b2e25c94cd291aa6d7f4694ac409f16766fd69c8aab8068a441fb\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:45:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e40c4f29925f494c0f5f01e2ecbcd2e4db2a5f3911a55a874c6d0006f01982de\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:45:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7fc0ee9e5a408a0a9e701afaf1db7bc3f58fd1830044730e9c680664642b5e4e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:45:1
7Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1dd8bd42f01469966b55416fc8af1dd71d341c774263bb3a56190af4cd9e7daa\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:45:17Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5da1155d7b5e933e5db3acc4c1a3fa1b3b90fd79289641f9a3d1290956128628\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5da1155d7b5e933e5db3acc4c1a3fa1b3b90fd79289641f9a3d1290956128628\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T06:45:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T06:45:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startT
ime\\\":\\\"2025-12-03T06:45:15Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:45:43Z is after 2025-08-24T17:21:41Z" Dec 03 06:45:43 crc kubenswrapper[4475]: I1203 06:45:43.691262 4475 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a0af3d80-5aae-4d3b-a974-490687df49f3\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fa848c68a20d5db5c603cafa808518de84e427cbeea4bbc1be31151e6f839b3e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\
\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:45:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fe3e0d5fed18fddd7a1174f7a9f12290ce318e9a0de40fe432c79f6f2e24a608\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:45:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://43c05977da6544bc781a279fcddb3279dfee510fdd0a6f4f1a22b8629f17475f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:45:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"c
ert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ef987b2e9a0fa630edf6d5c06d5f47c5debd1b75d4626aefe7d8ef44bb974eb8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:45:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T06:45:15Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:45:43Z is after 2025-08-24T17:21:41Z" Dec 03 06:45:43 crc kubenswrapper[4475]: I1203 06:45:43.697954 4475 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-dqbgx" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9ef36226-4b8b-4a7b-a87f-daa9dda6e70b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5dc78fa3b07b9a5535f697323e9ed322ceefdc8798157160a05eb71017ac3a29\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:45:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wjjp4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T06:45:38Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-dqbgx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:45:43Z is after 2025-08-24T17:21:41Z" Dec 03 06:45:43 crc kubenswrapper[4475]: I1203 06:45:43.711214 4475 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"897f1a97-930a-4c3c-8804-d7cd6006ae9c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fbb015d3e05f9f94fc225cce6e24bc4a5df0bfc5aaea15fe120e2cc4b8f02902\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastSt
ate\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:45:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://da747a5ea4f790c71d99693c4bd79a1074f756a20f628fa63e8bad9a713645fc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:45:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6bf56315b6ad05ea9af0319db29b919ed0332d2a671c5ba94ea325bd45ef5703\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06
:45:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e045b99328661616ea0e44cd50bd394a403836eede05459d117567c191401172\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:45:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://054b1d2565cc9690152740f71682028595283525344a38ccea66c1f072eae92b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:45:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"19
2.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d5e0ad88c2e55994f952b46c2e806792d8fcbd79a901810aef92e46067cc7b92\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d5e0ad88c2e55994f952b46c2e806792d8fcbd79a901810aef92e46067cc7b92\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T06:45:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T06:45:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://22796f78d3d551f1ee271ca8581e196f142e70622944154f7d408a88c098f53b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://22796f78d3d551f1ee271ca8581e196f142e70622944154f7d408a88c098f53b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T06:45:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T06:45:16Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://9fb973559072f07252dcf50bda74d422ea2ed50000c02105381f8d21e5ff9888\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/op
enshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9fb973559072f07252dcf50bda74d422ea2ed50000c02105381f8d21e5ff9888\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T06:45:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T06:45:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T06:45:15Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:45:43Z is after 2025-08-24T17:21:41Z" Dec 03 06:45:43 crc kubenswrapper[4475]: I1203 06:45:43.720035 4475 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:33Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:33Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://30d9a05de148a1dbe0fa8f07bbc5f4f2c3cba395d686af03f2da63f8cdfe431c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:45:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://07cf8d993193bca34b30ea77c473af45652fde6e73d0586efb78c14b9d003e22\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:45:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:45:43Z is after 2025-08-24T17:21:41Z" Dec 03 06:45:43 crc kubenswrapper[4475]: I1203 06:45:43.727595 4475 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:34Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:34Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6444fe7571ebb90d4ff4b30dc1a397023310b50b1816d0197cb545b4f5f7480f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:45:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-12-03T06:45:43Z is after 2025-08-24T17:21:41Z" Dec 03 06:45:43 crc kubenswrapper[4475]: I1203 06:45:43.736975 4475 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:32Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:32Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:45:43Z is after 2025-08-24T17:21:41Z" Dec 03 06:45:43 crc kubenswrapper[4475]: I1203 06:45:43.743497 4475 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-pcw7j" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8c1979d0-303c-4cf6-9087-3cb2e1aac73b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://eebaa73cf4e1efd781b258dd26910dc004392716180b14a7e64e89a03f2032a1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:45:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nrf7r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T06:45:34Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-pcw7j\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:45:43Z is after 2025-08-24T17:21:41Z" Dec 03 06:45:43 crc kubenswrapper[4475]: I1203 06:45:43.751464 4475 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-9b2j8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f3a17c67-95e0-4889-8a30-64c08b6720f4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d2d627e2c307a8db9c86e8020f2b1c25c6e061e0c6460be63e231566488beaca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\
\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:45:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6pdk6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\
\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T06:45:34Z\\\"}}\" for pod \"openshift-multus\"/\"multus-9b2j8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:45:43Z is after 2025-08-24T17:21:41Z" Dec 03 06:45:43 crc kubenswrapper[4475]: I1203 06:45:43.763560 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:45:43 crc kubenswrapper[4475]: I1203 06:45:43.763607 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:45:43 crc kubenswrapper[4475]: I1203 06:45:43.763616 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:45:43 crc kubenswrapper[4475]: I1203 06:45:43.763627 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:45:43 crc kubenswrapper[4475]: I1203 06:45:43.763634 4475 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:45:43Z","lastTransitionTime":"2025-12-03T06:45:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:45:43 crc kubenswrapper[4475]: I1203 06:45:43.765106 4475 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-g9t4l" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8f42839e-dbc4-445a-a15b-c3aa14813958\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:35Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:35Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://53948489397bbbfdf5f766211088d7f12fcd2dfbc8c3da6493e5abc49e3b41f5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:45:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ppdm2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a5090474cca8b8e2ed539ea74377506638d300be7eb750b3f3285477d8c9a375\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:45:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ppdm2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://60d3ec7cab1f249e81ae1db9ab97fa02e8b3c9d8376af4c6682dc3fc6f9d6d92\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:45:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ppdm2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b3243c863a4fb593b39fc3e3b835f647e9373d8b2dec69c5ff7657ed73c8f78a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:45:36Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ppdm2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://32897756f3658fda95db77180a0553a9d8656ed49c3ae5a017d32f5c5133a5a9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:45:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ppdm2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5e288f95676d5823cd3cb005318489d2f629a8fb74ad17ce6a67978d76006192\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:45:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ppdm2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7010a29447a39602bc7aa1a509917e3b938306ca72c783d4dbd7f1b2b1388934\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7010a29447a39602bc7aa1a509917e3b938306ca72c783d4dbd7f1b2b1388934\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-03T06:45:42Z\\\",\\\"message\\\":\\\"6:45:42.943208 5677 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI1203 06:45:42.943215 5677 handler.go:208] Removed *v1.Pod event handler 3\\\\nI1203 06:45:42.943221 5677 handler.go:190] Sending *v1.Namespace event 
handler 1 for removal\\\\nI1203 06:45:42.943227 5677 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI1203 06:45:42.943237 5677 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI1203 06:45:42.943249 5677 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI1203 06:45:42.943278 5677 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI1203 06:45:42.943285 5677 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI1203 06:45:42.943284 5677 handler.go:208] Removed *v1.Pod event handler 6\\\\nI1203 06:45:42.943289 5677 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI1203 06:45:42.943259 5677 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI1203 06:45:42.943305 5677 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI1203 06:45:42.943329 5677 factory.go:656] Stopping watch factory\\\\nI1203 06:45:42.943341 5677 ovnkube.go:599] Stopped ovnkube\\\\nI1203 06:45:42.943360 5677 handler.go:208] Removed *v1.Node event handler 2\\\\nI1203 06:45:42.943361 5677 handler.go:208] Removed *v1.NetworkPolicy event handler 
4\\\\nI12\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-03T06:45:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\
":\\\"kube-api-access-ppdm2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://66a9c7568957099255bc910496da695e2af0122f2c853c3e221c666d7c2dee78\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:45:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ppdm2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://400610ebcdc7d47ecc1345287847a1909871411a12cdb3cbf895e05039b81c2b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://400610ebcdc7d47ecc1345287847a1909871411a12cdb3cbf895e05039
b81c2b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T06:45:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T06:45:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ppdm2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T06:45:35Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-g9t4l\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:45:43Z is after 2025-08-24T17:21:41Z" Dec 03 06:45:43 crc kubenswrapper[4475]: I1203 06:45:43.774190 4475 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:33Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:33Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2f651c16a4a98ff0a9b4783e60ece4c410d5fcb7d05ad42bf7842d8bb8a99f27\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:45:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-12-03T06:45:43Z is after 2025-08-24T17:21:41Z" Dec 03 06:45:43 crc kubenswrapper[4475]: I1203 06:45:43.865633 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:45:43 crc kubenswrapper[4475]: I1203 06:45:43.865660 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:45:43 crc kubenswrapper[4475]: I1203 06:45:43.865668 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:45:43 crc kubenswrapper[4475]: I1203 06:45:43.865680 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:45:43 crc kubenswrapper[4475]: I1203 06:45:43.865688 4475 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:45:43Z","lastTransitionTime":"2025-12-03T06:45:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:45:43 crc kubenswrapper[4475]: I1203 06:45:43.967417 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:45:43 crc kubenswrapper[4475]: I1203 06:45:43.967477 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:45:43 crc kubenswrapper[4475]: I1203 06:45:43.967487 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:45:43 crc kubenswrapper[4475]: I1203 06:45:43.967502 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:45:43 crc kubenswrapper[4475]: I1203 06:45:43.967511 4475 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:45:43Z","lastTransitionTime":"2025-12-03T06:45:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:45:44 crc kubenswrapper[4475]: I1203 06:45:44.069630 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:45:44 crc kubenswrapper[4475]: I1203 06:45:44.069659 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:45:44 crc kubenswrapper[4475]: I1203 06:45:44.069667 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:45:44 crc kubenswrapper[4475]: I1203 06:45:44.069679 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:45:44 crc kubenswrapper[4475]: I1203 06:45:44.069688 4475 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:45:44Z","lastTransitionTime":"2025-12-03T06:45:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:45:44 crc kubenswrapper[4475]: I1203 06:45:44.171888 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:45:44 crc kubenswrapper[4475]: I1203 06:45:44.171921 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:45:44 crc kubenswrapper[4475]: I1203 06:45:44.171931 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:45:44 crc kubenswrapper[4475]: I1203 06:45:44.171944 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:45:44 crc kubenswrapper[4475]: I1203 06:45:44.171955 4475 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:45:44Z","lastTransitionTime":"2025-12-03T06:45:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:45:44 crc kubenswrapper[4475]: I1203 06:45:44.273620 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:45:44 crc kubenswrapper[4475]: I1203 06:45:44.273653 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:45:44 crc kubenswrapper[4475]: I1203 06:45:44.273662 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:45:44 crc kubenswrapper[4475]: I1203 06:45:44.273676 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:45:44 crc kubenswrapper[4475]: I1203 06:45:44.273685 4475 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:45:44Z","lastTransitionTime":"2025-12-03T06:45:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:45:44 crc kubenswrapper[4475]: I1203 06:45:44.375425 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:45:44 crc kubenswrapper[4475]: I1203 06:45:44.375488 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:45:44 crc kubenswrapper[4475]: I1203 06:45:44.375498 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:45:44 crc kubenswrapper[4475]: I1203 06:45:44.375510 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:45:44 crc kubenswrapper[4475]: I1203 06:45:44.375518 4475 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:45:44Z","lastTransitionTime":"2025-12-03T06:45:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:45:44 crc kubenswrapper[4475]: I1203 06:45:44.477316 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:45:44 crc kubenswrapper[4475]: I1203 06:45:44.477348 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:45:44 crc kubenswrapper[4475]: I1203 06:45:44.477357 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:45:44 crc kubenswrapper[4475]: I1203 06:45:44.477369 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:45:44 crc kubenswrapper[4475]: I1203 06:45:44.477377 4475 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:45:44Z","lastTransitionTime":"2025-12-03T06:45:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 06:45:44 crc kubenswrapper[4475]: I1203 06:45:44.490560 4475 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 03 06:45:44 crc kubenswrapper[4475]: I1203 06:45:44.490579 4475 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 03 06:45:44 crc kubenswrapper[4475]: I1203 06:45:44.490601 4475 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 03 06:45:44 crc kubenswrapper[4475]: E1203 06:45:44.490652 4475 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 03 06:45:44 crc kubenswrapper[4475]: E1203 06:45:44.490717 4475 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 03 06:45:44 crc kubenswrapper[4475]: E1203 06:45:44.490785 4475 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 03 06:45:44 crc kubenswrapper[4475]: I1203 06:45:44.578849 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:45:44 crc kubenswrapper[4475]: I1203 06:45:44.578885 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:45:44 crc kubenswrapper[4475]: I1203 06:45:44.578893 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:45:44 crc kubenswrapper[4475]: I1203 06:45:44.578905 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:45:44 crc kubenswrapper[4475]: I1203 06:45:44.578913 4475 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:45:44Z","lastTransitionTime":"2025-12-03T06:45:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:45:44 crc kubenswrapper[4475]: I1203 06:45:44.635331 4475 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-g9t4l_8f42839e-dbc4-445a-a15b-c3aa14813958/ovnkube-controller/1.log" Dec 03 06:45:44 crc kubenswrapper[4475]: I1203 06:45:44.635846 4475 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-g9t4l_8f42839e-dbc4-445a-a15b-c3aa14813958/ovnkube-controller/0.log" Dec 03 06:45:44 crc kubenswrapper[4475]: I1203 06:45:44.638141 4475 generic.go:334] "Generic (PLEG): container finished" podID="8f42839e-dbc4-445a-a15b-c3aa14813958" containerID="6ab275be2e84c1b20f69d740d454b4916d2fb2af864c685198786088b835b49c" exitCode=1 Dec 03 06:45:44 crc kubenswrapper[4475]: I1203 06:45:44.638177 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-g9t4l" event={"ID":"8f42839e-dbc4-445a-a15b-c3aa14813958","Type":"ContainerDied","Data":"6ab275be2e84c1b20f69d740d454b4916d2fb2af864c685198786088b835b49c"} Dec 03 06:45:44 crc kubenswrapper[4475]: I1203 06:45:44.638235 4475 scope.go:117] "RemoveContainer" containerID="7010a29447a39602bc7aa1a509917e3b938306ca72c783d4dbd7f1b2b1388934" Dec 03 06:45:44 crc kubenswrapper[4475]: I1203 06:45:44.638624 4475 scope.go:117] "RemoveContainer" containerID="6ab275be2e84c1b20f69d740d454b4916d2fb2af864c685198786088b835b49c" Dec 03 06:45:44 crc kubenswrapper[4475]: E1203 06:45:44.638744 4475 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 10s restarting failed container=ovnkube-controller pod=ovnkube-node-g9t4l_openshift-ovn-kubernetes(8f42839e-dbc4-445a-a15b-c3aa14813958)\"" pod="openshift-ovn-kubernetes/ovnkube-node-g9t4l" podUID="8f42839e-dbc4-445a-a15b-c3aa14813958" Dec 03 06:45:44 crc kubenswrapper[4475]: I1203 06:45:44.647257 4475 status_manager.go:875] "Failed to update 
status for pod" pod="openshift-dns/node-resolver-pcw7j" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8c1979d0-303c-4cf6-9087-3cb2e1aac73b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://eebaa73cf4e1efd781b258dd26910dc004392716180b14a7e64e89a03f2032a1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:45:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nrf7r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.
11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T06:45:34Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-pcw7j\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:45:44Z is after 2025-08-24T17:21:41Z" Dec 03 06:45:44 crc kubenswrapper[4475]: I1203 06:45:44.656298 4475 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-9b2j8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f3a17c67-95e0-4889-8a30-64c08b6720f4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d2d627e2c307a8db9c86e8020f2b1c25c6e061e0c6460be63e231566488beaca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4
.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:45:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6pdk6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\
\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T06:45:34Z\\\"}}\" for pod \"openshift-multus\"/\"multus-9b2j8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:45:44Z is after 2025-08-24T17:21:41Z" Dec 03 06:45:44 crc kubenswrapper[4475]: I1203 06:45:44.668969 4475 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-g9t4l" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8f42839e-dbc4-445a-a15b-c3aa14813958\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:35Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:35Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://53948489397bbbfdf5f766211088d7f12fcd2dfbc8c3da6493e5abc49e3b41f5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:45:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ppdm2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a5090474cca8b8e2ed539ea74377506638d300be7eb750b3f3285477d8c9a375\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:45:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ppdm2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://60d3ec7cab1f249e81ae1db9ab97fa02e8b3c9d8376af4c6682dc3fc6f9d6d92\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:45:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ppdm2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b3243c863a4fb593b39fc3e3b835f647e9373d8b2dec69c5ff7657ed73c8f78a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:45:36Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ppdm2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://32897756f3658fda95db77180a0553a9d8656ed49c3ae5a017d32f5c5133a5a9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:45:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ppdm2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5e288f95676d5823cd3cb005318489d2f629a8fb74ad17ce6a67978d76006192\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:45:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ppdm2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6ab275be2e84c1b20f69d740d454b4916d2fb2af864c685198786088b835b49c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7010a29447a39602bc7aa1a509917e3b938306ca72c783d4dbd7f1b2b1388934\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-03T06:45:42Z\\\",\\\"message\\\":\\\"6:45:42.943208 5677 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI1203 06:45:42.943215 5677 handler.go:208] Removed *v1.Pod event handler 3\\\\nI1203 06:45:42.943221 5677 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI1203 06:45:42.943227 5677 handler.go:190] Sending *v1.Namespace event handler 5 for 
removal\\\\nI1203 06:45:42.943237 5677 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI1203 06:45:42.943249 5677 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI1203 06:45:42.943278 5677 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI1203 06:45:42.943285 5677 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI1203 06:45:42.943284 5677 handler.go:208] Removed *v1.Pod event handler 6\\\\nI1203 06:45:42.943289 5677 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI1203 06:45:42.943259 5677 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI1203 06:45:42.943305 5677 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI1203 06:45:42.943329 5677 factory.go:656] Stopping watch factory\\\\nI1203 06:45:42.943341 5677 ovnkube.go:599] Stopped ovnkube\\\\nI1203 06:45:42.943360 5677 handler.go:208] Removed *v1.Node event handler 2\\\\nI1203 06:45:42.943361 5677 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI12\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-03T06:45:41Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6ab275be2e84c1b20f69d740d454b4916d2fb2af864c685198786088b835b49c\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-03T06:45:44Z\\\",\\\"message\\\":\\\"ault: []services.LB{services.LB{Name:\\\\\\\"Service_openshift-apiserver/check-endpoints_TCP_cluster\\\\\\\", UUID:\\\\\\\"\\\\\\\", Protocol:\\\\\\\"TCP\\\\\\\", ExternalIDs:map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-apiserver/check-endpoints\\\\\\\"}, Opts:services.LBOpts{Reject:true, EmptyLBEvents:false, AffinityTimeOut:0, SkipSNAT:false, Template:false, AddressFamily:\\\\\\\"\\\\\\\"}, 
Rules:[]services.LBRule{services.LBRule{Source:services.Addr{IP:\\\\\\\"10.217.5.139\\\\\\\", Port:17698, Template:(*services.Template)(nil)}, Targets:[]services.Addr{}}}, Templates:services.TemplateMap(nil), Switches:[]string{}, Routers:[]string{}, Groups:[]string{\\\\\\\"clusterLBGroup\\\\\\\"}}}\\\\nF1203 06:45:44.234950 5806 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify 
certifi\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-03T06:45:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":
\\\"kube-api-access-ppdm2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://66a9c7568957099255bc910496da695e2af0122f2c853c3e221c666d7c2dee78\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:45:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ppdm2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://400610ebcdc7d47ecc1345287847a1909871411a12cdb3cbf895e05039b81c2b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://400610ebcdc7d47ecc1345287847a1909871411a12cdb3cbf895e05039b8
1c2b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T06:45:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T06:45:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ppdm2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T06:45:35Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-g9t4l\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:45:44Z is after 2025-08-24T17:21:41Z" Dec 03 06:45:44 crc kubenswrapper[4475]: I1203 06:45:44.677608 4475 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:33Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:33Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2f651c16a4a98ff0a9b4783e60ece4c410d5fcb7d05ad42bf7842d8bb8a99f27\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:45:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-12-03T06:45:44Z is after 2025-08-24T17:21:41Z" Dec 03 06:45:44 crc kubenswrapper[4475]: I1203 06:45:44.680153 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:45:44 crc kubenswrapper[4475]: I1203 06:45:44.680193 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:45:44 crc kubenswrapper[4475]: I1203 06:45:44.680204 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:45:44 crc kubenswrapper[4475]: I1203 06:45:44.680216 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:45:44 crc kubenswrapper[4475]: I1203 06:45:44.680224 4475 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:45:44Z","lastTransitionTime":"2025-12-03T06:45:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:45:44 crc kubenswrapper[4475]: I1203 06:45:44.685868 4475 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:33Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:33Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://30d9a05de148a1dbe0fa8f07bbc5f4f2c3cba395d686af03f2da63f8cdfe431c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:45:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://07cf8d993193bca34b30ea77c473af45652fde6e73d0586efb78c14b9d003e22\\\",\\\
"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:45:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:45:44Z is after 2025-08-24T17:21:41Z" Dec 03 06:45:44 crc kubenswrapper[4475]: I1203 06:45:44.693932 4475 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:34Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:34Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6444fe7571ebb90d4ff4b30dc1a397023310b50b1816d0197cb545b4f5f7480f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:45:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-12-03T06:45:44Z is after 2025-08-24T17:21:41Z" Dec 03 06:45:44 crc kubenswrapper[4475]: I1203 06:45:44.701969 4475 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:32Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:32Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:45:44Z is after 2025-08-24T17:21:41Z" Dec 03 06:45:44 crc kubenswrapper[4475]: I1203 06:45:44.709753 4475 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:32Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:32Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:45:44Z is after 2025-08-24T17:21:41Z" Dec 03 06:45:44 crc kubenswrapper[4475]: I1203 06:45:44.719343 4475 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-k9cmc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7168f008-1b03-40cf-94fa-a71d470454bf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://625db083ebf24244e0b28ac937bfa2554497ca35b8f7a1fee0ac739d647c70de\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:45:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s6bbq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://31584b054f88aa7f7e4f1096e2b11acf6f106b7f2e4ced19768808e5df1a6acc\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://31584b054f88aa7f7e4f1096e2b11acf6f106b7f2e4ced19768808e5df1a6acc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T06:45:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T06:45:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s6bbq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9a644e827feb786d7298e41022ef3bc0d2483279c106dddea8e2c7a3c62c3c0d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9a644e827feb786d7298e41022ef3bc0d2483279c106dddea8e2c7a3c62c3c0d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T06:45:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T06:45:36Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s6bbq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://742f2f4dc23fff3df8e6d67902ef721b3db1823653b11a69faabdaf8d7650667\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://742f2f4dc23fff3df8e6d67902ef721b3db1823653b11a69faabdaf8d7650667\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T06:45:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T06:45:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s6bbq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2e5e8
74d26bf8bc806d74d55a8b9306cc30cca122d2ae0731b0a76ae7ac30450\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2e5e874d26bf8bc806d74d55a8b9306cc30cca122d2ae0731b0a76ae7ac30450\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T06:45:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T06:45:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s6bbq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d644ab44eabce045c9f9b23fab29e574e2f9f49c0cc14b830560996a0ec98880\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d644ab44eabce045c9f9b23fab29e574e2f9f49c0cc14b830560996a0ec98880\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T06:45:39Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2025-12-03T06:45:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s6bbq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ae9935e33badff0b016f8b5a02cb59d8b64451364581023ca3ec8e87fba0aa3a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ae9935e33badff0b016f8b5a02cb59d8b64451364581023ca3ec8e87fba0aa3a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T06:45:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T06:45:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s6bbq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T06:45:35Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-k9cmc\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:45:44Z is after 2025-08-24T17:21:41Z" Dec 03 06:45:44 crc kubenswrapper[4475]: I1203 06:45:44.727831 4475 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"57d2f580-9528-4200-b0a4-797fed1ae972\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://822cdbfb2e81d80c5de0253daa42f2a5c89e9cd0eb8a5c3cf620780d17f9a6d0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:45:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\"
:\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d66a9136874b2e25c94cd291aa6d7f4694ac409f16766fd69c8aab8068a441fb\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:45:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e40c4f29925f494c0f5f01e2ecbcd2e4db2a5f3911a55a874c6d0006f01982de\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:45:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7fc0ee9e5a408a0a9e701afaf1db7bc3f58fd1830044730e9c680664642b5e4e\\\",\\\"image\\\":\\\"quay.io/crcont/
openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:45:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1dd8bd42f01469966b55416fc8af1dd71d341c774263bb3a56190af4cd9e7daa\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:45:17Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5da1155d7b5e933e5db3acc4c1a3fa1b3b90fd79289641f9a3d1290956128628\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\
\\"containerID\\\":\\\"cri-o://5da1155d7b5e933e5db3acc4c1a3fa1b3b90fd79289641f9a3d1290956128628\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T06:45:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T06:45:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T06:45:15Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:45:44Z is after 2025-08-24T17:21:41Z" Dec 03 06:45:44 crc kubenswrapper[4475]: I1203 06:45:44.736213 4475 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a0af3d80-5aae-4d3b-a974-490687df49f3\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fa848c68a20d5db5c603cafa808518de84e427cbeea4bbc1be31151e6f839b3e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:45:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fe3e0d5fed18fddd7a1174f7a9f12290ce318e9a0de40fe432c79f6f2e24a608\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:45:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://43c05977da6544bc781a279fcddb3279dfee510fdd0a6f4f1a22b8629f17475f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:45:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ef987b2e9a0fa630edf6d5c06d5f47c5debd1b75d4626aefe7d8ef44bb974eb8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-12-03T06:45:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T06:45:15Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:45:44Z is after 2025-08-24T17:21:41Z" Dec 03 06:45:44 crc kubenswrapper[4475]: I1203 06:45:44.743937 4475 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:32Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:32Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:45:44Z is after 2025-08-24T17:21:41Z" Dec 03 06:45:44 crc kubenswrapper[4475]: I1203 06:45:44.751304 4475 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-tjbzg" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"91aee7be-4a52-4598-803f-2deebe0674de\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f13f644093fd1214d8fb39853857b4113dd7fde64f1a60ff6848fd4c5350f5b6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:45:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xvqvg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://159d103ae2d5d19ea94c57a59b534773f0e32f4c
b379a412b63ca743e221096e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:45:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xvqvg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T06:45:35Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-tjbzg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:45:44Z is after 2025-08-24T17:21:41Z" Dec 03 06:45:44 crc kubenswrapper[4475]: I1203 06:45:44.764129 4475 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"897f1a97-930a-4c3c-8804-d7cd6006ae9c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fbb015d3e05f9f94fc225cce6e24bc4a5df0bfc5aaea15fe120e2cc4b8f02902\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:45:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://da747a5ea4f790c71d99693c4bd79a1074f756a20f628fa63e8bad9a713645fc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:45:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6bf56315b6ad05ea9af0319db29b919ed0332d2a671c5ba94ea325bd45ef5703\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:45:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e045b99328661616ea0e44cd50bd394a403836eede05459d117567c191401172\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:45:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://054b1d2565cc9690152740f71682028595283525344a38ccea66c1f072eae92b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:45:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d5e0ad88c2e55994f952b46c2e806792d8fcbd79a901810aef92e46067cc7b92\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d5e0ad88c2e55994f952b46c2e806792d8fcbd79a901810aef92e46067cc7b92\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2025-12-03T06:45:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T06:45:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://22796f78d3d551f1ee271ca8581e196f142e70622944154f7d408a88c098f53b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://22796f78d3d551f1ee271ca8581e196f142e70622944154f7d408a88c098f53b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T06:45:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T06:45:16Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://9fb973559072f07252dcf50bda74d422ea2ed50000c02105381f8d21e5ff9888\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9fb973559072f07252dcf50bda74d422ea2ed50000c02105381f8d21e5ff9888\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T06:45:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T06:45:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T06:45:15Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:45:44Z is after 2025-08-24T17:21:41Z" Dec 03 06:45:44 crc kubenswrapper[4475]: I1203 06:45:44.770951 4475 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-dqbgx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9ef36226-4b8b-4a7b-a87f-daa9dda6e70b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5dc78fa3b07b9a5535f697323e9ed322ceefdc8798157160a05eb71017ac3a29\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235
da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:45:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wjjp4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T06:45:38Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-dqbgx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:45:44Z is after 2025-08-24T17:21:41Z" Dec 03 06:45:44 crc kubenswrapper[4475]: I1203 06:45:44.782153 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:45:44 crc kubenswrapper[4475]: I1203 06:45:44.782178 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:45:44 crc kubenswrapper[4475]: I1203 06:45:44.782197 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:45:44 crc kubenswrapper[4475]: I1203 06:45:44.782209 4475 kubelet_node_status.go:724] "Recording event message 
for node" node="crc" event="NodeNotReady" Dec 03 06:45:44 crc kubenswrapper[4475]: I1203 06:45:44.782216 4475 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:45:44Z","lastTransitionTime":"2025-12-03T06:45:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 06:45:44 crc kubenswrapper[4475]: I1203 06:45:44.883721 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:45:44 crc kubenswrapper[4475]: I1203 06:45:44.883743 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:45:44 crc kubenswrapper[4475]: I1203 06:45:44.883752 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:45:44 crc kubenswrapper[4475]: I1203 06:45:44.883762 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:45:44 crc kubenswrapper[4475]: I1203 06:45:44.883770 4475 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:45:44Z","lastTransitionTime":"2025-12-03T06:45:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:45:44 crc kubenswrapper[4475]: I1203 06:45:44.986277 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:45:44 crc kubenswrapper[4475]: I1203 06:45:44.986309 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:45:44 crc kubenswrapper[4475]: I1203 06:45:44.986319 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:45:44 crc kubenswrapper[4475]: I1203 06:45:44.986332 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:45:44 crc kubenswrapper[4475]: I1203 06:45:44.986346 4475 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:45:44Z","lastTransitionTime":"2025-12-03T06:45:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:45:45 crc kubenswrapper[4475]: I1203 06:45:45.088430 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:45:45 crc kubenswrapper[4475]: I1203 06:45:45.088470 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:45:45 crc kubenswrapper[4475]: I1203 06:45:45.088479 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:45:45 crc kubenswrapper[4475]: I1203 06:45:45.088491 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:45:45 crc kubenswrapper[4475]: I1203 06:45:45.088499 4475 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:45:45Z","lastTransitionTime":"2025-12-03T06:45:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:45:45 crc kubenswrapper[4475]: I1203 06:45:45.189987 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:45:45 crc kubenswrapper[4475]: I1203 06:45:45.190017 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:45:45 crc kubenswrapper[4475]: I1203 06:45:45.190026 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:45:45 crc kubenswrapper[4475]: I1203 06:45:45.190040 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:45:45 crc kubenswrapper[4475]: I1203 06:45:45.190048 4475 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:45:45Z","lastTransitionTime":"2025-12-03T06:45:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:45:45 crc kubenswrapper[4475]: I1203 06:45:45.291550 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:45:45 crc kubenswrapper[4475]: I1203 06:45:45.291601 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:45:45 crc kubenswrapper[4475]: I1203 06:45:45.291610 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:45:45 crc kubenswrapper[4475]: I1203 06:45:45.291624 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:45:45 crc kubenswrapper[4475]: I1203 06:45:45.291631 4475 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:45:45Z","lastTransitionTime":"2025-12-03T06:45:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:45:45 crc kubenswrapper[4475]: I1203 06:45:45.393896 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:45:45 crc kubenswrapper[4475]: I1203 06:45:45.393925 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:45:45 crc kubenswrapper[4475]: I1203 06:45:45.393933 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:45:45 crc kubenswrapper[4475]: I1203 06:45:45.393944 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:45:45 crc kubenswrapper[4475]: I1203 06:45:45.393954 4475 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:45:45Z","lastTransitionTime":"2025-12-03T06:45:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:45:45 crc kubenswrapper[4475]: I1203 06:45:45.495222 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:45:45 crc kubenswrapper[4475]: I1203 06:45:45.495255 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:45:45 crc kubenswrapper[4475]: I1203 06:45:45.495263 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:45:45 crc kubenswrapper[4475]: I1203 06:45:45.495282 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:45:45 crc kubenswrapper[4475]: I1203 06:45:45.495290 4475 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:45:45Z","lastTransitionTime":"2025-12-03T06:45:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:45:45 crc kubenswrapper[4475]: I1203 06:45:45.501662 4475 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-k9cmc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7168f008-1b03-40cf-94fa-a71d470454bf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://625db083ebf24244e0b28ac937bfa2554497ca35b8f7a1fee0ac739d647c70de\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:45:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s6bbq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOn
ly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://31584b054f88aa7f7e4f1096e2b11acf6f106b7f2e4ced19768808e5df1a6acc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://31584b054f88aa7f7e4f1096e2b11acf6f106b7f2e4ced19768808e5df1a6acc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T06:45:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T06:45:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s6bbq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9a644e827feb786d7298e41022ef3bc0d2483279c106dddea8e2c7a3c62c3c0d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"
containerID\\\":\\\"cri-o://9a644e827feb786d7298e41022ef3bc0d2483279c106dddea8e2c7a3c62c3c0d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T06:45:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T06:45:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s6bbq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://742f2f4dc23fff3df8e6d67902ef721b3db1823653b11a69faabdaf8d7650667\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://742f2f4dc23fff3df8e6d67902ef721b3db1823653b11a69faabdaf8d7650667\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T06:45:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T06:45:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveRead
Only\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s6bbq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2e5e874d26bf8bc806d74d55a8b9306cc30cca122d2ae0731b0a76ae7ac30450\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2e5e874d26bf8bc806d74d55a8b9306cc30cca122d2ae0731b0a76ae7ac30450\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T06:45:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T06:45:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s6bbq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d644ab44eabce045c9f9b23fab29e574e2f9f49c0cc14b830560996a0ec98880\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\"
:0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d644ab44eabce045c9f9b23fab29e574e2f9f49c0cc14b830560996a0ec98880\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T06:45:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T06:45:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s6bbq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ae9935e33badff0b016f8b5a02cb59d8b64451364581023ca3ec8e87fba0aa3a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ae9935e33badff0b016f8b5a02cb59d8b64451364581023ca3ec8e87fba0aa3a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T06:45:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T06:45:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s6bbq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\"
,\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T06:45:35Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-k9cmc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:45:45Z is after 2025-08-24T17:21:41Z" Dec 03 06:45:45 crc kubenswrapper[4475]: I1203 06:45:45.510967 4475 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:32Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:32Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:45:45Z is after 2025-08-24T17:21:41Z" Dec 03 06:45:45 crc kubenswrapper[4475]: I1203 06:45:45.520792 4475 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a0af3d80-5aae-4d3b-a974-490687df49f3\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fa848c68a20d5db5c603cafa808518de84e427cbeea4bbc1be31151e6f839b3e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:45:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fe3e0d5fed18fddd7a1174f7a9f12290ce318e9a0de40fe432c79f6f2e24a608\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:45:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://43c05977da6544bc781a279fcddb3279dfee510fdd0a6f4f1a22b8629f17475f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:45:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ef987b2e9a0fa630edf6d5c06d5f47c5debd1b75d4626aefe7d8ef44bb974eb8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-12-03T06:45:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T06:45:15Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:45:45Z is after 2025-08-24T17:21:41Z" Dec 03 06:45:45 crc kubenswrapper[4475]: I1203 06:45:45.528821 4475 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:32Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:32Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:45:45Z is after 2025-08-24T17:21:41Z" Dec 03 06:45:45 crc kubenswrapper[4475]: I1203 06:45:45.536740 4475 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-tjbzg" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"91aee7be-4a52-4598-803f-2deebe0674de\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f13f644093fd1214d8fb39853857b4113dd7fde64f1a60ff6848fd4c5350f5b6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:45:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xvqvg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://159d103ae2d5d19ea94c57a59b534773f0e32f4c
b379a412b63ca743e221096e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:45:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xvqvg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T06:45:35Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-tjbzg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:45:45Z is after 2025-08-24T17:21:41Z" Dec 03 06:45:45 crc kubenswrapper[4475]: I1203 06:45:45.552007 4475 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"57d2f580-9528-4200-b0a4-797fed1ae972\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://822cdbfb2e81d80c5de0253daa42f2a5c89e9cd0eb8a5c3cf620780d17f9a6d0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:45:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d66a9136874b2e25c94cd291aa6d7f4694ac409f16766fd69c8aab8068a441fb\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:45:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e40c4f29925f494c0f5f01e2ecbcd2e4db2a5f3911a55a874c6d0006f01982de\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:45:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7fc0ee9e5a408a0a9e701afaf1db7bc3f58fd1830044730e9c680664642b5e4e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:45:1
7Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1dd8bd42f01469966b55416fc8af1dd71d341c774263bb3a56190af4cd9e7daa\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:45:17Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5da1155d7b5e933e5db3acc4c1a3fa1b3b90fd79289641f9a3d1290956128628\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5da1155d7b5e933e5db3acc4c1a3fa1b3b90fd79289641f9a3d1290956128628\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T06:45:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T06:45:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startT
ime\\\":\\\"2025-12-03T06:45:15Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:45:45Z is after 2025-08-24T17:21:41Z" Dec 03 06:45:45 crc kubenswrapper[4475]: I1203 06:45:45.567224 4475 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"897f1a97-930a-4c3c-8804-d7cd6006ae9c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fbb015d3e05f9f94fc225cce6e24bc4a5df0bfc5aaea15fe120e2cc4b8f02902\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:45:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://da747a5ea4f790c71d99693c4bd79a1074f756a20f628fa63e8bad9a713645fc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:45:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6bf56315b6ad05ea9af0319db29b919ed0332d2a671c5ba94ea325bd45ef5703\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:45:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"moun
tPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e045b99328661616ea0e44cd50bd394a403836eede05459d117567c191401172\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:45:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://054b1d2565cc9690152740f71682028595283525344a38ccea66c1f072eae92b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:45:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d5e0ad88c2e55994f952b46c2e806792d8
fcbd79a901810aef92e46067cc7b92\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d5e0ad88c2e55994f952b46c2e806792d8fcbd79a901810aef92e46067cc7b92\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T06:45:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T06:45:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://22796f78d3d551f1ee271ca8581e196f142e70622944154f7d408a88c098f53b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://22796f78d3d551f1ee271ca8581e196f142e70622944154f7d408a88c098f53b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T06:45:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T06:45:16Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://9fb973559072f07252dcf50bda74d422ea2ed50000c02105381f8d21e5ff9888\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"
lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9fb973559072f07252dcf50bda74d422ea2ed50000c02105381f8d21e5ff9888\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T06:45:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T06:45:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T06:45:15Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:45:45Z is after 2025-08-24T17:21:41Z" Dec 03 06:45:45 crc kubenswrapper[4475]: I1203 06:45:45.575521 4475 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-dqbgx" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9ef36226-4b8b-4a7b-a87f-daa9dda6e70b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5dc78fa3b07b9a5535f697323e9ed322ceefdc8798157160a05eb71017ac3a29\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:45:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wjjp4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T06:45:38Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-dqbgx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:45:45Z is after 2025-08-24T17:21:41Z" Dec 03 06:45:45 crc kubenswrapper[4475]: I1203 06:45:45.583691 4475 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:33Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:33Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2f651c16a4a98ff0a9b4783e60ece4c410d5fcb7d05ad42bf7842d8bb8a99f27\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2
025-12-03T06:45:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:45:45Z is after 2025-08-24T17:21:41Z" Dec 03 06:45:45 crc kubenswrapper[4475]: I1203 06:45:45.592834 4475 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:33Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:33Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://30d9a05de148a1dbe0fa8f07bbc5f4f2c3cba395d686af03f2da63f8cdfe431c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:45:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://07cf8d993193bca34b30ea77c473af45652fde6e73d0586efb78c14b9d003e22\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:45:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:45:45Z is after 2025-08-24T17:21:41Z" Dec 03 06:45:45 crc kubenswrapper[4475]: I1203 06:45:45.597239 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:45:45 crc kubenswrapper[4475]: I1203 06:45:45.597268 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:45:45 crc kubenswrapper[4475]: I1203 06:45:45.597276 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:45:45 crc kubenswrapper[4475]: I1203 06:45:45.597287 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:45:45 crc kubenswrapper[4475]: I1203 06:45:45.597296 4475 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:45:45Z","lastTransitionTime":"2025-12-03T06:45:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 06:45:45 crc kubenswrapper[4475]: I1203 06:45:45.600705 4475 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:34Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:34Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6444fe7571ebb90d4ff4b30dc1a397023310b50b1816d0197cb545b4f5f7480f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:45:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\
"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:45:45Z is after 2025-08-24T17:21:41Z" Dec 03 06:45:45 crc kubenswrapper[4475]: I1203 06:45:45.608513 4475 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:32Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:32Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:45:45Z is after 2025-08-24T17:21:41Z" Dec 03 06:45:45 crc kubenswrapper[4475]: I1203 06:45:45.615642 4475 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-pcw7j" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8c1979d0-303c-4cf6-9087-3cb2e1aac73b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://eebaa73cf4e1efd781b258dd26910dc004392716180b14a7e64e89a03f2032a1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:45:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nrf7r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T06:45:34Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-pcw7j\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:45:45Z is after 2025-08-24T17:21:41Z" Dec 03 06:45:45 crc kubenswrapper[4475]: I1203 06:45:45.624214 4475 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-9b2j8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f3a17c67-95e0-4889-8a30-64c08b6720f4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d2d627e2c307a8db9c86e8020f2b1c25c6e061e0c6460be63e231566488beaca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\
\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:45:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6pdk6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\
\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T06:45:34Z\\\"}}\" for pod \"openshift-multus\"/\"multus-9b2j8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:45:45Z is after 2025-08-24T17:21:41Z" Dec 03 06:45:45 crc kubenswrapper[4475]: I1203 06:45:45.636598 4475 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-g9t4l" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8f42839e-dbc4-445a-a15b-c3aa14813958\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:35Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:35Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://53948489397bbbfdf5f766211088d7f12fcd2dfbc8c3da6493e5abc49e3b41f5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:45:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ppdm2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a5090474cca8b8e2ed539ea74377506638d300be7eb750b3f3285477d8c9a375\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:45:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ppdm2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://60d3ec7cab1f249e81ae1db9ab97fa02e8b3c9d8376af4c6682dc3fc6f9d6d92\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:45:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ppdm2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b3243c863a4fb593b39fc3e3b835f647e9373d8b2dec69c5ff7657ed73c8f78a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:45:36Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ppdm2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://32897756f3658fda95db77180a0553a9d8656ed49c3ae5a017d32f5c5133a5a9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:45:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ppdm2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5e288f95676d5823cd3cb005318489d2f629a8fb74ad17ce6a67978d76006192\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:45:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ppdm2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6ab275be2e84c1b20f69d740d454b4916d2fb2af864c685198786088b835b49c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7010a29447a39602bc7aa1a509917e3b938306ca72c783d4dbd7f1b2b1388934\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-03T06:45:42Z\\\",\\\"message\\\":\\\"6:45:42.943208 5677 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI1203 06:45:42.943215 5677 handler.go:208] Removed *v1.Pod event handler 3\\\\nI1203 06:45:42.943221 5677 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI1203 06:45:42.943227 5677 handler.go:190] Sending *v1.Namespace event handler 5 for 
removal\\\\nI1203 06:45:42.943237 5677 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI1203 06:45:42.943249 5677 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI1203 06:45:42.943278 5677 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI1203 06:45:42.943285 5677 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI1203 06:45:42.943284 5677 handler.go:208] Removed *v1.Pod event handler 6\\\\nI1203 06:45:42.943289 5677 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI1203 06:45:42.943259 5677 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI1203 06:45:42.943305 5677 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI1203 06:45:42.943329 5677 factory.go:656] Stopping watch factory\\\\nI1203 06:45:42.943341 5677 ovnkube.go:599] Stopped ovnkube\\\\nI1203 06:45:42.943360 5677 handler.go:208] Removed *v1.Node event handler 2\\\\nI1203 06:45:42.943361 5677 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI12\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-03T06:45:41Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6ab275be2e84c1b20f69d740d454b4916d2fb2af864c685198786088b835b49c\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-03T06:45:44Z\\\",\\\"message\\\":\\\"ault: []services.LB{services.LB{Name:\\\\\\\"Service_openshift-apiserver/check-endpoints_TCP_cluster\\\\\\\", UUID:\\\\\\\"\\\\\\\", Protocol:\\\\\\\"TCP\\\\\\\", ExternalIDs:map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-apiserver/check-endpoints\\\\\\\"}, Opts:services.LBOpts{Reject:true, EmptyLBEvents:false, AffinityTimeOut:0, SkipSNAT:false, Template:false, AddressFamily:\\\\\\\"\\\\\\\"}, 
Rules:[]services.LBRule{services.LBRule{Source:services.Addr{IP:\\\\\\\"10.217.5.139\\\\\\\", Port:17698, Template:(*services.Template)(nil)}, Targets:[]services.Addr{}}}, Templates:services.TemplateMap(nil), Switches:[]string{}, Routers:[]string{}, Groups:[]string{\\\\\\\"clusterLBGroup\\\\\\\"}}}\\\\nF1203 06:45:44.234950 5806 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify 
certifi\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-03T06:45:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":
\\\"kube-api-access-ppdm2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://66a9c7568957099255bc910496da695e2af0122f2c853c3e221c666d7c2dee78\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:45:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ppdm2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://400610ebcdc7d47ecc1345287847a1909871411a12cdb3cbf895e05039b81c2b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://400610ebcdc7d47ecc1345287847a1909871411a12cdb3cbf895e05039b8
1c2b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T06:45:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T06:45:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ppdm2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T06:45:35Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-g9t4l\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:45:45Z is after 2025-08-24T17:21:41Z" Dec 03 06:45:45 crc kubenswrapper[4475]: I1203 06:45:45.642124 4475 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-g9t4l_8f42839e-dbc4-445a-a15b-c3aa14813958/ovnkube-controller/1.log" Dec 03 06:45:45 crc kubenswrapper[4475]: I1203 06:45:45.644908 4475 scope.go:117] "RemoveContainer" containerID="6ab275be2e84c1b20f69d740d454b4916d2fb2af864c685198786088b835b49c" Dec 03 06:45:45 crc kubenswrapper[4475]: E1203 06:45:45.645080 4475 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 10s restarting failed container=ovnkube-controller pod=ovnkube-node-g9t4l_openshift-ovn-kubernetes(8f42839e-dbc4-445a-a15b-c3aa14813958)\"" pod="openshift-ovn-kubernetes/ovnkube-node-g9t4l" podUID="8f42839e-dbc4-445a-a15b-c3aa14813958" Dec 03 06:45:45 crc kubenswrapper[4475]: I1203 06:45:45.653412 4475 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"57d2f580-9528-4200-b0a4-797fed1ae972\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://822cdbfb2e81d80c5de0253daa42f2a5c89e9cd0eb8a5c3cf620780d17f9a6d0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:45:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d66a9136874b2e25c94cd291aa6d7f4694ac409f16766fd69c8aab8068a441fb\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-clu
ster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:45:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e40c4f29925f494c0f5f01e2ecbcd2e4db2a5f3911a55a874c6d0006f01982de\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:45:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7fc0ee9e5a408a0a9e701afaf1db7bc3f58fd1830044730e9c680664642b5e4e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\"
:true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:45:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1dd8bd42f01469966b55416fc8af1dd71d341c774263bb3a56190af4cd9e7daa\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:45:17Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5da1155d7b5e933e5db3acc4c1a3fa1b3b90fd79289641f9a3d1290956128628\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5da1155d7b5e933e5db3acc4c1a3fa1b3b90fd79289641f9a3d1290956128628\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T06:45:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T06:45:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.
168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T06:45:15Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:45:45Z is after 2025-08-24T17:21:41Z" Dec 03 06:45:45 crc kubenswrapper[4475]: I1203 06:45:45.661616 4475 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a0af3d80-5aae-4d3b-a974-490687df49f3\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fa848c68a20d5db5c603cafa808518de84e427cbeea4bbc1be31151e6f839b3e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastSta
te\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:45:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fe3e0d5fed18fddd7a1174f7a9f12290ce318e9a0de40fe432c79f6f2e24a608\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:45:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://43c05977da6544bc781a279fcddb3279dfee510fdd0a6f4f1a22b8629f17475f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:45:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},
{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ef987b2e9a0fa630edf6d5c06d5f47c5debd1b75d4626aefe7d8ef44bb974eb8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:45:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T06:45:15Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:45:45Z is after 2025-08-24T17:21:41Z" Dec 03 06:45:45 crc kubenswrapper[4475]: I1203 06:45:45.669263 4475 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:32Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:32Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:45:45Z is after 2025-08-24T17:21:41Z" Dec 03 06:45:45 crc kubenswrapper[4475]: I1203 06:45:45.677026 4475 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-tjbzg" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"91aee7be-4a52-4598-803f-2deebe0674de\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f13f644093fd1214d8fb39853857b4113dd7fde64f1a60ff6848fd4c5350f5b6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:45:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xvqvg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://159d103ae2d5d19ea94c57a59b534773f0e32f4c
b379a412b63ca743e221096e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:45:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xvqvg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T06:45:35Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-tjbzg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:45:45Z is after 2025-08-24T17:21:41Z" Dec 03 06:45:45 crc kubenswrapper[4475]: I1203 06:45:45.689376 4475 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"897f1a97-930a-4c3c-8804-d7cd6006ae9c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fbb015d3e05f9f94fc225cce6e24bc4a5df0bfc5aaea15fe120e2cc4b8f02902\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:45:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://da747a5ea4f790c71d99693c4bd79a1074f756a20f628fa63e8bad9a713645fc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:45:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6bf56315b6ad05ea9af0319db29b919ed0332d2a671c5ba94ea325bd45ef5703\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:45:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e045b99328661616ea0e44cd50bd394a403836eede05459d117567c191401172\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:45:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://054b1d2565cc9690152740f71682028595283525344a38ccea66c1f072eae92b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:45:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d5e0ad88c2e55994f952b46c2e806792d8fcbd79a901810aef92e46067cc7b92\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d5e0ad88c2e55994f952b46c2e806792d8fcbd79a901810aef92e46067cc7b92\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2025-12-03T06:45:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T06:45:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://22796f78d3d551f1ee271ca8581e196f142e70622944154f7d408a88c098f53b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://22796f78d3d551f1ee271ca8581e196f142e70622944154f7d408a88c098f53b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T06:45:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T06:45:16Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://9fb973559072f07252dcf50bda74d422ea2ed50000c02105381f8d21e5ff9888\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9fb973559072f07252dcf50bda74d422ea2ed50000c02105381f8d21e5ff9888\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T06:45:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T06:45:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T06:45:15Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:45:45Z is after 2025-08-24T17:21:41Z" Dec 03 06:45:45 crc kubenswrapper[4475]: I1203 06:45:45.695951 4475 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-dqbgx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9ef36226-4b8b-4a7b-a87f-daa9dda6e70b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5dc78fa3b07b9a5535f697323e9ed322ceefdc8798157160a05eb71017ac3a29\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235
da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:45:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wjjp4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T06:45:38Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-dqbgx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:45:45Z is after 2025-08-24T17:21:41Z" Dec 03 06:45:45 crc kubenswrapper[4475]: I1203 06:45:45.698801 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:45:45 crc kubenswrapper[4475]: I1203 06:45:45.698831 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:45:45 crc kubenswrapper[4475]: I1203 06:45:45.698841 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:45:45 crc kubenswrapper[4475]: I1203 06:45:45.698854 4475 kubelet_node_status.go:724] "Recording event message 
for node" node="crc" event="NodeNotReady" Dec 03 06:45:45 crc kubenswrapper[4475]: I1203 06:45:45.698863 4475 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:45:45Z","lastTransitionTime":"2025-12-03T06:45:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 06:45:45 crc kubenswrapper[4475]: I1203 06:45:45.704727 4475 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-9b2j8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f3a17c67-95e0-4889-8a30-64c08b6720f4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d2d627e2c307a8db9c86e8020f2b1c25c6e061e0c6460be63e231566488beaca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.
0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:45:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6pdk6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\
\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T06:45:34Z\\\"}}\" for pod \"openshift-multus\"/\"multus-9b2j8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:45:45Z is after 2025-08-24T17:21:41Z" Dec 03 06:45:45 crc kubenswrapper[4475]: I1203 06:45:45.716531 4475 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-g9t4l" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8f42839e-dbc4-445a-a15b-c3aa14813958\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:35Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:35Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://53948489397bbbfdf5f766211088d7f12fcd2dfbc8c3da6493e5abc49e3b41f5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:45:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ppdm2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a5090474cca8b8e2ed539ea74377506638d300be7eb750b3f3285477d8c9a375\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:45:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ppdm2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://60d3ec7cab1f249e81ae1db9ab97fa02e8b3c9d8376af4c6682dc3fc6f9d6d92\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:45:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ppdm2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b3243c863a4fb593b39fc3e3b835f647e9373d8b2dec69c5ff7657ed73c8f78a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:45:36Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ppdm2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://32897756f3658fda95db77180a0553a9d8656ed49c3ae5a017d32f5c5133a5a9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:45:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ppdm2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5e288f95676d5823cd3cb005318489d2f629a8fb74ad17ce6a67978d76006192\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:45:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ppdm2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6ab275be2e84c1b20f69d740d454b4916d2fb2af864c685198786088b835b49c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6ab275be2e84c1b20f69d740d454b4916d2fb2af864c685198786088b835b49c\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-03T06:45:44Z\\\",\\\"message\\\":\\\"ault: []services.LB{services.LB{Name:\\\\\\\"Service_openshift-apiserver/check-endpoints_TCP_cluster\\\\\\\", UUID:\\\\\\\"\\\\\\\", Protocol:\\\\\\\"TCP\\\\\\\", ExternalIDs:map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-apiserver/check-endpoints\\\\\\\"}, 
Opts:services.LBOpts{Reject:true, EmptyLBEvents:false, AffinityTimeOut:0, SkipSNAT:false, Template:false, AddressFamily:\\\\\\\"\\\\\\\"}, Rules:[]services.LBRule{services.LBRule{Source:services.Addr{IP:\\\\\\\"10.217.5.139\\\\\\\", Port:17698, Template:(*services.Template)(nil)}, Targets:[]services.Addr{}}}, Templates:services.TemplateMap(nil), Switches:[]string{}, Routers:[]string{}, Groups:[]string{\\\\\\\"clusterLBGroup\\\\\\\"}}}\\\\nF1203 06:45:44.234950 5806 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certifi\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-03T06:45:43Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=ovnkube-controller 
pod=ovnkube-node-g9t4l_openshift-ovn-kubernetes(8f42839e-dbc4-445a-a15b-c3aa14813958)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ppdm2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://66a9c7568957099255bc910496da695e2af0122f2c853c3e221c666d7c2dee78\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:45:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ppdm2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://400610ebcdc7d47ecc1345287847a1909871411a12cdb3cbf895e05039b81c2b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://400610ebcdc7d47ecc
1345287847a1909871411a12cdb3cbf895e05039b81c2b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T06:45:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T06:45:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ppdm2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T06:45:35Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-g9t4l\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:45:45Z is after 2025-08-24T17:21:41Z" Dec 03 06:45:45 crc kubenswrapper[4475]: I1203 06:45:45.725415 4475 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:33Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:33Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2f651c16a4a98ff0a9b4783e60ece4c410d5fcb7d05ad42bf7842d8bb8a99f27\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:45:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-12-03T06:45:45Z is after 2025-08-24T17:21:41Z" Dec 03 06:45:45 crc kubenswrapper[4475]: I1203 06:45:45.732852 4475 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:33Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:33Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://30d9a05de148a1dbe0fa8f07bbc5f4f2c3cba395d686af03f2da63f8cdfe431c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:45:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"c
ri-o://07cf8d993193bca34b30ea77c473af45652fde6e73d0586efb78c14b9d003e22\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:45:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:45:45Z is after 2025-08-24T17:21:41Z" Dec 03 06:45:45 crc kubenswrapper[4475]: I1203 06:45:45.739930 4475 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:34Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:34Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6444fe7571ebb90d4ff4b30dc1a397023310b50b1816d0197cb545b4f5f7480f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:45:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-12-03T06:45:45Z is after 2025-08-24T17:21:41Z" Dec 03 06:45:45 crc kubenswrapper[4475]: I1203 06:45:45.747230 4475 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:32Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:32Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:45:45Z is after 2025-08-24T17:21:41Z" Dec 03 06:45:45 crc kubenswrapper[4475]: I1203 06:45:45.753107 4475 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-pcw7j" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8c1979d0-303c-4cf6-9087-3cb2e1aac73b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://eebaa73cf4e1efd781b258dd26910dc004392716180b14a7e64e89a03f2032a1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:45:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nrf7r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T06:45:34Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-pcw7j\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:45:45Z is after 2025-08-24T17:21:41Z" Dec 03 06:45:45 crc kubenswrapper[4475]: I1203 06:45:45.760926 4475 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:32Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:32Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:45:45Z is after 2025-08-24T17:21:41Z" Dec 03 06:45:45 crc kubenswrapper[4475]: I1203 06:45:45.770378 4475 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-k9cmc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7168f008-1b03-40cf-94fa-a71d470454bf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://625db083ebf24244e0b28ac937bfa2554497ca35b8f7a1fee0ac739d647c70de\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:45:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s6bbq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://31584b054f88aa7f7e4f1096e2b11acf6f106b7f2e4ced19768808e5df1a6acc\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://31584b054f88aa7f7e4f1096e2b11acf6f106b7f2e4ced19768808e5df1a6acc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T06:45:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T06:45:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s6bbq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9a644e827feb786d7298e41022ef3bc0d2483279c106dddea8e2c7a3c62c3c0d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9a644e827feb786d7298e41022ef3bc0d2483279c106dddea8e2c7a3c62c3c0d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T06:45:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T06:45:36Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s6bbq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://742f2f4dc23fff3df8e6d67902ef721b3db1823653b11a69faabdaf8d7650667\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://742f2f4dc23fff3df8e6d67902ef721b3db1823653b11a69faabdaf8d7650667\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T06:45:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T06:45:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s6bbq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2e5e8
74d26bf8bc806d74d55a8b9306cc30cca122d2ae0731b0a76ae7ac30450\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2e5e874d26bf8bc806d74d55a8b9306cc30cca122d2ae0731b0a76ae7ac30450\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T06:45:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T06:45:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s6bbq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d644ab44eabce045c9f9b23fab29e574e2f9f49c0cc14b830560996a0ec98880\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d644ab44eabce045c9f9b23fab29e574e2f9f49c0cc14b830560996a0ec98880\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T06:45:39Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2025-12-03T06:45:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s6bbq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ae9935e33badff0b016f8b5a02cb59d8b64451364581023ca3ec8e87fba0aa3a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ae9935e33badff0b016f8b5a02cb59d8b64451364581023ca3ec8e87fba0aa3a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T06:45:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T06:45:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s6bbq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T06:45:35Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-k9cmc\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:45:45Z is after 2025-08-24T17:21:41Z" Dec 03 06:45:45 crc kubenswrapper[4475]: I1203 06:45:45.801027 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:45:45 crc kubenswrapper[4475]: I1203 06:45:45.801055 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:45:45 crc kubenswrapper[4475]: I1203 06:45:45.801065 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:45:45 crc kubenswrapper[4475]: I1203 06:45:45.801076 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:45:45 crc kubenswrapper[4475]: I1203 06:45:45.801088 4475 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:45:45Z","lastTransitionTime":"2025-12-03T06:45:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:45:45 crc kubenswrapper[4475]: I1203 06:45:45.902545 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:45:45 crc kubenswrapper[4475]: I1203 06:45:45.902584 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:45:45 crc kubenswrapper[4475]: I1203 06:45:45.902592 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:45:45 crc kubenswrapper[4475]: I1203 06:45:45.902607 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:45:45 crc kubenswrapper[4475]: I1203 06:45:45.902616 4475 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:45:45Z","lastTransitionTime":"2025-12-03T06:45:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:45:45 crc kubenswrapper[4475]: I1203 06:45:45.985298 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:45:45 crc kubenswrapper[4475]: I1203 06:45:45.985352 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:45:45 crc kubenswrapper[4475]: I1203 06:45:45.985361 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:45:45 crc kubenswrapper[4475]: I1203 06:45:45.985376 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:45:45 crc kubenswrapper[4475]: I1203 06:45:45.985384 4475 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:45:45Z","lastTransitionTime":"2025-12-03T06:45:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:45:45 crc kubenswrapper[4475]: E1203 06:45:45.993543 4475 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"7800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"24148068Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"8\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"24608868Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T06:45:45Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:45Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T06:45:45Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:45Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T06:45:45Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:45Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T06:45:45Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:45Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"b860fac6-8533-4b4b-bdad-0cb0561d1495\\\",\\\"systemUUID\\\":\\\"6c3f70a9-a9d8-4b80-a825-7a6426aa17aa\\\"},\\\"runtimeHan
dlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:45:45Z is after 2025-08-24T17:21:41Z" Dec 03 06:45:45 crc kubenswrapper[4475]: I1203 06:45:45.996494 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:45:45 crc kubenswrapper[4475]: I1203 06:45:45.996520 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:45:45 crc kubenswrapper[4475]: I1203 06:45:45.996527 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:45:45 crc kubenswrapper[4475]: I1203 06:45:45.996540 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:45:45 crc kubenswrapper[4475]: I1203 06:45:45.996548 4475 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:45:45Z","lastTransitionTime":"2025-12-03T06:45:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:45:46 crc kubenswrapper[4475]: E1203 06:45:46.004409 4475 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"7800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"24148068Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"8\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"24608868Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T06:45:45Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:45Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T06:45:45Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:45Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T06:45:45Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:45Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T06:45:45Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:45Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"b860fac6-8533-4b4b-bdad-0cb0561d1495\\\",\\\"systemUUID\\\":\\\"6c3f70a9-a9d8-4b80-a825-7a6426aa17aa\\\"},\\\"runtimeHan
dlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:45:46Z is after 2025-08-24T17:21:41Z" Dec 03 06:45:46 crc kubenswrapper[4475]: I1203 06:45:46.006462 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:45:46 crc kubenswrapper[4475]: I1203 06:45:46.006484 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:45:46 crc kubenswrapper[4475]: I1203 06:45:46.006491 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:45:46 crc kubenswrapper[4475]: I1203 06:45:46.006500 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:45:46 crc kubenswrapper[4475]: I1203 06:45:46.006507 4475 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:45:46Z","lastTransitionTime":"2025-12-03T06:45:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:45:46 crc kubenswrapper[4475]: E1203 06:45:46.034633 4475 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"7800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"24148068Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"8\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"24608868Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T06:45:46Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:46Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T06:45:46Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:46Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T06:45:46Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:46Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T06:45:46Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:46Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"b860fac6-8533-4b4b-bdad-0cb0561d1495\\\",\\\"systemUUID\\\":\\\"6c3f70a9-a9d8-4b80-a825-7a6426aa17aa\\\"},\\\"runtimeHan
dlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:45:46Z is after 2025-08-24T17:21:41Z" Dec 03 06:45:46 crc kubenswrapper[4475]: E1203 06:45:46.034769 4475 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Dec 03 06:45:46 crc kubenswrapper[4475]: I1203 06:45:46.035752 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:45:46 crc kubenswrapper[4475]: I1203 06:45:46.035781 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:45:46 crc kubenswrapper[4475]: I1203 06:45:46.035790 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:45:46 crc kubenswrapper[4475]: I1203 06:45:46.035803 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:45:46 crc kubenswrapper[4475]: I1203 06:45:46.035812 4475 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:45:46Z","lastTransitionTime":"2025-12-03T06:45:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in 
/etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 06:45:46 crc kubenswrapper[4475]: I1203 06:45:46.137253 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:45:46 crc kubenswrapper[4475]: I1203 06:45:46.137285 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:45:46 crc kubenswrapper[4475]: I1203 06:45:46.137295 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:45:46 crc kubenswrapper[4475]: I1203 06:45:46.137310 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:45:46 crc kubenswrapper[4475]: I1203 06:45:46.137319 4475 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:45:46Z","lastTransitionTime":"2025-12-03T06:45:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 06:45:46 crc kubenswrapper[4475]: I1203 06:45:46.216892 4475 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-sbkp5"] Dec 03 06:45:46 crc kubenswrapper[4475]: I1203 06:45:46.217225 4475 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-sbkp5" Dec 03 06:45:46 crc kubenswrapper[4475]: I1203 06:45:46.218510 4475 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-control-plane-metrics-cert" Dec 03 06:45:46 crc kubenswrapper[4475]: I1203 06:45:46.220855 4475 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-control-plane-dockercfg-gs7dd" Dec 03 06:45:46 crc kubenswrapper[4475]: I1203 06:45:46.227167 4475 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-9b2j8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f3a17c67-95e0-4889-8a30-64c08b6720f4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d2d627e2c307a8db9c86e8020f2b1c25c6e061e0c6460be63e231566488beaca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0d
d97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:45:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6pdk6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.
168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T06:45:34Z\\\"}}\" for pod \"openshift-multus\"/\"multus-9b2j8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:45:46Z is after 2025-08-24T17:21:41Z" Dec 03 06:45:46 crc kubenswrapper[4475]: I1203 06:45:46.238721 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:45:46 crc kubenswrapper[4475]: I1203 06:45:46.238743 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:45:46 crc kubenswrapper[4475]: I1203 06:45:46.238752 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:45:46 crc kubenswrapper[4475]: I1203 06:45:46.238761 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:45:46 crc kubenswrapper[4475]: I1203 06:45:46.238769 4475 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:45:46Z","lastTransitionTime":"2025-12-03T06:45:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:45:46 crc kubenswrapper[4475]: I1203 06:45:46.239220 4475 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-g9t4l" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8f42839e-dbc4-445a-a15b-c3aa14813958\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:35Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:35Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://53948489397bbbfdf5f766211088d7f12fcd2dfbc8c3da6493e5abc49e3b41f5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:45:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ppdm2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a5090474cca8b8e2ed539ea74377506638d300be7eb750b3f3285477d8c9a375\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:45:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ppdm2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://60d3ec7cab1f249e81ae1db9ab97fa02e8b3c9d8376af4c6682dc3fc6f9d6d92\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:45:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ppdm2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b3243c863a4fb593b39fc3e3b835f647e9373d8b2dec69c5ff7657ed73c8f78a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:45:36Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ppdm2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://32897756f3658fda95db77180a0553a9d8656ed49c3ae5a017d32f5c5133a5a9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:45:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ppdm2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5e288f95676d5823cd3cb005318489d2f629a8fb74ad17ce6a67978d76006192\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:45:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ppdm2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6ab275be2e84c1b20f69d740d454b4916d2fb2af864c685198786088b835b49c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6ab275be2e84c1b20f69d740d454b4916d2fb2af864c685198786088b835b49c\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-03T06:45:44Z\\\",\\\"message\\\":\\\"ault: []services.LB{services.LB{Name:\\\\\\\"Service_openshift-apiserver/check-endpoints_TCP_cluster\\\\\\\", UUID:\\\\\\\"\\\\\\\", Protocol:\\\\\\\"TCP\\\\\\\", ExternalIDs:map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-apiserver/check-endpoints\\\\\\\"}, 
Opts:services.LBOpts{Reject:true, EmptyLBEvents:false, AffinityTimeOut:0, SkipSNAT:false, Template:false, AddressFamily:\\\\\\\"\\\\\\\"}, Rules:[]services.LBRule{services.LBRule{Source:services.Addr{IP:\\\\\\\"10.217.5.139\\\\\\\", Port:17698, Template:(*services.Template)(nil)}, Targets:[]services.Addr{}}}, Templates:services.TemplateMap(nil), Switches:[]string{}, Routers:[]string{}, Groups:[]string{\\\\\\\"clusterLBGroup\\\\\\\"}}}\\\\nF1203 06:45:44.234950 5806 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certifi\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-03T06:45:43Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=ovnkube-controller 
pod=ovnkube-node-g9t4l_openshift-ovn-kubernetes(8f42839e-dbc4-445a-a15b-c3aa14813958)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ppdm2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://66a9c7568957099255bc910496da695e2af0122f2c853c3e221c666d7c2dee78\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:45:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ppdm2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://400610ebcdc7d47ecc1345287847a1909871411a12cdb3cbf895e05039b81c2b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://400610ebcdc7d47ecc
1345287847a1909871411a12cdb3cbf895e05039b81c2b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T06:45:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T06:45:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ppdm2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T06:45:35Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-g9t4l\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:45:46Z is after 2025-08-24T17:21:41Z" Dec 03 06:45:46 crc kubenswrapper[4475]: I1203 06:45:46.247249 4475 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:33Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:33Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2f651c16a4a98ff0a9b4783e60ece4c410d5fcb7d05ad42bf7842d8bb8a99f27\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:45:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-12-03T06:45:46Z is after 2025-08-24T17:21:41Z" Dec 03 06:45:46 crc kubenswrapper[4475]: I1203 06:45:46.254866 4475 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:33Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:33Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://30d9a05de148a1dbe0fa8f07bbc5f4f2c3cba395d686af03f2da63f8cdfe431c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:45:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"c
ri-o://07cf8d993193bca34b30ea77c473af45652fde6e73d0586efb78c14b9d003e22\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:45:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:45:46Z is after 2025-08-24T17:21:41Z" Dec 03 06:45:46 crc kubenswrapper[4475]: I1203 06:45:46.262317 4475 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:34Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:34Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6444fe7571ebb90d4ff4b30dc1a397023310b50b1816d0197cb545b4f5f7480f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:45:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-12-03T06:45:46Z is after 2025-08-24T17:21:41Z" Dec 03 06:45:46 crc kubenswrapper[4475]: I1203 06:45:46.269965 4475 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:32Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:32Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:45:46Z is after 2025-08-24T17:21:41Z" Dec 03 06:45:46 crc kubenswrapper[4475]: I1203 06:45:46.272779 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/b1df0a77-f3cc-49ab-9fbb-8a4c7608291b-ovnkube-config\") pod \"ovnkube-control-plane-749d76644c-sbkp5\" (UID: \"b1df0a77-f3cc-49ab-9fbb-8a4c7608291b\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-sbkp5" Dec 03 06:45:46 crc kubenswrapper[4475]: I1203 06:45:46.272878 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-65wzb\" (UniqueName: \"kubernetes.io/projected/b1df0a77-f3cc-49ab-9fbb-8a4c7608291b-kube-api-access-65wzb\") pod \"ovnkube-control-plane-749d76644c-sbkp5\" (UID: \"b1df0a77-f3cc-49ab-9fbb-8a4c7608291b\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-sbkp5" Dec 03 06:45:46 crc 
kubenswrapper[4475]: I1203 06:45:46.272961 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/b1df0a77-f3cc-49ab-9fbb-8a4c7608291b-env-overrides\") pod \"ovnkube-control-plane-749d76644c-sbkp5\" (UID: \"b1df0a77-f3cc-49ab-9fbb-8a4c7608291b\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-sbkp5" Dec 03 06:45:46 crc kubenswrapper[4475]: I1203 06:45:46.273051 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/b1df0a77-f3cc-49ab-9fbb-8a4c7608291b-ovn-control-plane-metrics-cert\") pod \"ovnkube-control-plane-749d76644c-sbkp5\" (UID: \"b1df0a77-f3cc-49ab-9fbb-8a4c7608291b\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-sbkp5" Dec 03 06:45:46 crc kubenswrapper[4475]: I1203 06:45:46.276407 4475 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-pcw7j" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8c1979d0-303c-4cf6-9087-3cb2e1aac73b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://eebaa73cf4e1efd781b258dd26910dc004392716180b14a7e64e89a03f2032a1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:45:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nrf7r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T06:45:34Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-pcw7j\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:45:46Z is after 2025-08-24T17:21:41Z" Dec 03 06:45:46 crc kubenswrapper[4475]: I1203 06:45:46.283974 4475 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:32Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:32Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:45:46Z is after 2025-08-24T17:21:41Z" Dec 03 06:45:46 crc kubenswrapper[4475]: I1203 06:45:46.293102 4475 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-k9cmc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7168f008-1b03-40cf-94fa-a71d470454bf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://625db083ebf24244e0b28ac937bfa2554497ca35b8f7a1fee0ac739d647c70de\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:45:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s6bbq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://31584b054f88aa7f7e4f1096e2b11acf6f106b7f2e4ced19768808e5df1a6acc\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://31584b054f88aa7f7e4f1096e2b11acf6f106b7f2e4ced19768808e5df1a6acc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T06:45:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T06:45:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s6bbq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9a644e827feb786d7298e41022ef3bc0d2483279c106dddea8e2c7a3c62c3c0d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9a644e827feb786d7298e41022ef3bc0d2483279c106dddea8e2c7a3c62c3c0d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T06:45:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T06:45:36Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s6bbq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://742f2f4dc23fff3df8e6d67902ef721b3db1823653b11a69faabdaf8d7650667\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://742f2f4dc23fff3df8e6d67902ef721b3db1823653b11a69faabdaf8d7650667\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T06:45:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T06:45:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s6bbq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2e5e8
74d26bf8bc806d74d55a8b9306cc30cca122d2ae0731b0a76ae7ac30450\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2e5e874d26bf8bc806d74d55a8b9306cc30cca122d2ae0731b0a76ae7ac30450\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T06:45:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T06:45:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s6bbq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d644ab44eabce045c9f9b23fab29e574e2f9f49c0cc14b830560996a0ec98880\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d644ab44eabce045c9f9b23fab29e574e2f9f49c0cc14b830560996a0ec98880\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T06:45:39Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2025-12-03T06:45:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s6bbq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ae9935e33badff0b016f8b5a02cb59d8b64451364581023ca3ec8e87fba0aa3a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ae9935e33badff0b016f8b5a02cb59d8b64451364581023ca3ec8e87fba0aa3a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T06:45:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T06:45:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s6bbq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T06:45:35Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-k9cmc\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:45:46Z is after 2025-08-24T17:21:41Z" Dec 03 06:45:46 crc kubenswrapper[4475]: I1203 06:45:46.301563 4475 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"57d2f580-9528-4200-b0a4-797fed1ae972\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://822cdbfb2e81d80c5de0253daa42f2a5c89e9cd0eb8a5c3cf620780d17f9a6d0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:45:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\"
:\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d66a9136874b2e25c94cd291aa6d7f4694ac409f16766fd69c8aab8068a441fb\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:45:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e40c4f29925f494c0f5f01e2ecbcd2e4db2a5f3911a55a874c6d0006f01982de\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:45:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7fc0ee9e5a408a0a9e701afaf1db7bc3f58fd1830044730e9c680664642b5e4e\\\",\\\"image\\\":\\\"quay.io/crcont/
openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:45:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1dd8bd42f01469966b55416fc8af1dd71d341c774263bb3a56190af4cd9e7daa\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:45:17Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5da1155d7b5e933e5db3acc4c1a3fa1b3b90fd79289641f9a3d1290956128628\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\
\\"containerID\\\":\\\"cri-o://5da1155d7b5e933e5db3acc4c1a3fa1b3b90fd79289641f9a3d1290956128628\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T06:45:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T06:45:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T06:45:15Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:45:46Z is after 2025-08-24T17:21:41Z" Dec 03 06:45:46 crc kubenswrapper[4475]: I1203 06:45:46.309299 4475 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a0af3d80-5aae-4d3b-a974-490687df49f3\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fa848c68a20d5db5c603cafa808518de84e427cbeea4bbc1be31151e6f839b3e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:45:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fe3e0d5fed18fddd7a1174f7a9f12290ce318e9a0de40fe432c79f6f2e24a608\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:45:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://43c05977da6544bc781a279fcddb3279dfee510fdd0a6f4f1a22b8629f17475f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:45:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ef987b2e9a0fa630edf6d5c06d5f47c5debd1b75d4626aefe7d8ef44bb974eb8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-12-03T06:45:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T06:45:15Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:45:46Z is after 2025-08-24T17:21:41Z" Dec 03 06:45:46 crc kubenswrapper[4475]: I1203 06:45:46.316651 4475 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:32Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:32Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:45:46Z is after 2025-08-24T17:21:41Z" Dec 03 06:45:46 crc kubenswrapper[4475]: I1203 06:45:46.324495 4475 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-tjbzg" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"91aee7be-4a52-4598-803f-2deebe0674de\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f13f644093fd1214d8fb39853857b4113dd7fde64f1a60ff6848fd4c5350f5b6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:45:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xvqvg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://159d103ae2d5d19ea94c57a59b534773f0e32f4c
b379a412b63ca743e221096e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:45:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xvqvg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T06:45:35Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-tjbzg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:45:46Z is after 2025-08-24T17:21:41Z" Dec 03 06:45:46 crc kubenswrapper[4475]: I1203 06:45:46.331973 4475 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-sbkp5" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b1df0a77-f3cc-49ab-9fbb-8a4c7608291b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:46Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:46Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:46Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy 
ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-65wzb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-65wzb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T06:45:46Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-sbkp5\": Internal error occurred: failed calling 
webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:45:46Z is after 2025-08-24T17:21:41Z" Dec 03 06:45:46 crc kubenswrapper[4475]: I1203 06:45:46.340490 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:45:46 crc kubenswrapper[4475]: I1203 06:45:46.340516 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:45:46 crc kubenswrapper[4475]: I1203 06:45:46.340524 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:45:46 crc kubenswrapper[4475]: I1203 06:45:46.340535 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:45:46 crc kubenswrapper[4475]: I1203 06:45:46.340543 4475 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:45:46Z","lastTransitionTime":"2025-12-03T06:45:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:45:46 crc kubenswrapper[4475]: I1203 06:45:46.343838 4475 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"897f1a97-930a-4c3c-8804-d7cd6006ae9c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fbb015d3e05f9f94fc225cce6e24bc4a5df0bfc5aaea15fe120e2cc4b8f02902\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:45:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\
\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://da747a5ea4f790c71d99693c4bd79a1074f756a20f628fa63e8bad9a713645fc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:45:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6bf56315b6ad05ea9af0319db29b919ed0332d2a671c5ba94ea325bd45ef5703\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:45:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e045b99328661616ea0e44cd50bd394a403836eede05459d117567c191401172\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-
v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:45:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://054b1d2565cc9690152740f71682028595283525344a38ccea66c1f072eae92b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:45:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d5e0ad88c2e55994f952b46c2e806792d8fcbd79a901810aef92e46067cc7b92\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":
true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d5e0ad88c2e55994f952b46c2e806792d8fcbd79a901810aef92e46067cc7b92\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T06:45:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T06:45:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://22796f78d3d551f1ee271ca8581e196f142e70622944154f7d408a88c098f53b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://22796f78d3d551f1ee271ca8581e196f142e70622944154f7d408a88c098f53b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T06:45:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T06:45:16Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://9fb973559072f07252dcf50bda74d422ea2ed50000c02105381f8d21e5ff9888\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9fb973559072f07252dcf50bda74d422ea2ed50000c02105381f8d21e5ff9888\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T06:45:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2
025-12-03T06:45:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T06:45:15Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:45:46Z is after 2025-08-24T17:21:41Z" Dec 03 06:45:46 crc kubenswrapper[4475]: I1203 06:45:46.350046 4475 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-dqbgx" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9ef36226-4b8b-4a7b-a87f-daa9dda6e70b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5dc78fa3b07b9a5535f697323e9ed322ceefdc8798157160a05eb71017ac3a29\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:45:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wjjp4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T06:45:38Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-dqbgx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:45:46Z is after 2025-08-24T17:21:41Z" Dec 03 06:45:46 crc kubenswrapper[4475]: I1203 06:45:46.373601 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-65wzb\" (UniqueName: \"kubernetes.io/projected/b1df0a77-f3cc-49ab-9fbb-8a4c7608291b-kube-api-access-65wzb\") pod \"ovnkube-control-plane-749d76644c-sbkp5\" (UID: \"b1df0a77-f3cc-49ab-9fbb-8a4c7608291b\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-sbkp5" Dec 03 06:45:46 crc kubenswrapper[4475]: I1203 06:45:46.373638 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/b1df0a77-f3cc-49ab-9fbb-8a4c7608291b-env-overrides\") pod \"ovnkube-control-plane-749d76644c-sbkp5\" (UID: \"b1df0a77-f3cc-49ab-9fbb-8a4c7608291b\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-sbkp5" Dec 03 06:45:46 crc kubenswrapper[4475]: I1203 06:45:46.373664 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/b1df0a77-f3cc-49ab-9fbb-8a4c7608291b-ovn-control-plane-metrics-cert\") pod \"ovnkube-control-plane-749d76644c-sbkp5\" (UID: \"b1df0a77-f3cc-49ab-9fbb-8a4c7608291b\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-sbkp5" Dec 03 06:45:46 crc kubenswrapper[4475]: I1203 06:45:46.373701 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/b1df0a77-f3cc-49ab-9fbb-8a4c7608291b-ovnkube-config\") pod \"ovnkube-control-plane-749d76644c-sbkp5\" (UID: \"b1df0a77-f3cc-49ab-9fbb-8a4c7608291b\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-sbkp5" Dec 03 06:45:46 crc kubenswrapper[4475]: I1203 06:45:46.374221 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/b1df0a77-f3cc-49ab-9fbb-8a4c7608291b-env-overrides\") pod \"ovnkube-control-plane-749d76644c-sbkp5\" (UID: \"b1df0a77-f3cc-49ab-9fbb-8a4c7608291b\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-sbkp5" Dec 03 06:45:46 crc kubenswrapper[4475]: I1203 06:45:46.374277 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/b1df0a77-f3cc-49ab-9fbb-8a4c7608291b-ovnkube-config\") pod \"ovnkube-control-plane-749d76644c-sbkp5\" (UID: \"b1df0a77-f3cc-49ab-9fbb-8a4c7608291b\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-sbkp5" Dec 03 06:45:46 crc kubenswrapper[4475]: I1203 06:45:46.377600 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/b1df0a77-f3cc-49ab-9fbb-8a4c7608291b-ovn-control-plane-metrics-cert\") pod \"ovnkube-control-plane-749d76644c-sbkp5\" (UID: \"b1df0a77-f3cc-49ab-9fbb-8a4c7608291b\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-sbkp5" Dec 03 06:45:46 crc kubenswrapper[4475]: I1203 06:45:46.385396 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-65wzb\" (UniqueName: \"kubernetes.io/projected/b1df0a77-f3cc-49ab-9fbb-8a4c7608291b-kube-api-access-65wzb\") pod \"ovnkube-control-plane-749d76644c-sbkp5\" (UID: \"b1df0a77-f3cc-49ab-9fbb-8a4c7608291b\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-sbkp5" Dec 03 
06:45:46 crc kubenswrapper[4475]: I1203 06:45:46.442540 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:45:46 crc kubenswrapper[4475]: I1203 06:45:46.442601 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:45:46 crc kubenswrapper[4475]: I1203 06:45:46.442611 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:45:46 crc kubenswrapper[4475]: I1203 06:45:46.442623 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:45:46 crc kubenswrapper[4475]: I1203 06:45:46.442631 4475 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:45:46Z","lastTransitionTime":"2025-12-03T06:45:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 06:45:46 crc kubenswrapper[4475]: I1203 06:45:46.490585 4475 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 03 06:45:46 crc kubenswrapper[4475]: I1203 06:45:46.490677 4475 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 03 06:45:46 crc kubenswrapper[4475]: I1203 06:45:46.490655 4475 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 03 06:45:46 crc kubenswrapper[4475]: E1203 06:45:46.490861 4475 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 03 06:45:46 crc kubenswrapper[4475]: E1203 06:45:46.490924 4475 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 03 06:45:46 crc kubenswrapper[4475]: E1203 06:45:46.491074 4475 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 03 06:45:46 crc kubenswrapper[4475]: I1203 06:45:46.526495 4475 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-sbkp5" Dec 03 06:45:46 crc kubenswrapper[4475]: W1203 06:45:46.539644 4475 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb1df0a77_f3cc_49ab_9fbb_8a4c7608291b.slice/crio-43d51f53db46cd4c2369d71b11a391317a14aac1292bbf541a3b85b9a2b7ba60 WatchSource:0}: Error finding container 43d51f53db46cd4c2369d71b11a391317a14aac1292bbf541a3b85b9a2b7ba60: Status 404 returned error can't find the container with id 43d51f53db46cd4c2369d71b11a391317a14aac1292bbf541a3b85b9a2b7ba60 Dec 03 06:45:46 crc kubenswrapper[4475]: I1203 06:45:46.546152 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:45:46 crc kubenswrapper[4475]: I1203 06:45:46.546179 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:45:46 crc kubenswrapper[4475]: I1203 06:45:46.546188 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:45:46 crc kubenswrapper[4475]: I1203 06:45:46.546209 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:45:46 crc kubenswrapper[4475]: I1203 06:45:46.546218 4475 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:45:46Z","lastTransitionTime":"2025-12-03T06:45:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:45:46 crc kubenswrapper[4475]: I1203 06:45:46.647536 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:45:46 crc kubenswrapper[4475]: I1203 06:45:46.647556 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:45:46 crc kubenswrapper[4475]: I1203 06:45:46.647565 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:45:46 crc kubenswrapper[4475]: I1203 06:45:46.647576 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:45:46 crc kubenswrapper[4475]: I1203 06:45:46.647775 4475 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:45:46Z","lastTransitionTime":"2025-12-03T06:45:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:45:46 crc kubenswrapper[4475]: I1203 06:45:46.649611 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-sbkp5" event={"ID":"b1df0a77-f3cc-49ab-9fbb-8a4c7608291b","Type":"ContainerStarted","Data":"43d51f53db46cd4c2369d71b11a391317a14aac1292bbf541a3b85b9a2b7ba60"} Dec 03 06:45:46 crc kubenswrapper[4475]: I1203 06:45:46.750482 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:45:46 crc kubenswrapper[4475]: I1203 06:45:46.750509 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:45:46 crc kubenswrapper[4475]: I1203 06:45:46.750522 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:45:46 crc kubenswrapper[4475]: I1203 06:45:46.750535 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:45:46 crc kubenswrapper[4475]: I1203 06:45:46.750543 4475 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:45:46Z","lastTransitionTime":"2025-12-03T06:45:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:45:46 crc kubenswrapper[4475]: I1203 06:45:46.852254 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:45:46 crc kubenswrapper[4475]: I1203 06:45:46.852282 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:45:46 crc kubenswrapper[4475]: I1203 06:45:46.852291 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:45:46 crc kubenswrapper[4475]: I1203 06:45:46.852303 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:45:46 crc kubenswrapper[4475]: I1203 06:45:46.852311 4475 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:45:46Z","lastTransitionTime":"2025-12-03T06:45:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:45:46 crc kubenswrapper[4475]: I1203 06:45:46.954314 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:45:46 crc kubenswrapper[4475]: I1203 06:45:46.954339 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:45:46 crc kubenswrapper[4475]: I1203 06:45:46.954347 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:45:46 crc kubenswrapper[4475]: I1203 06:45:46.954358 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:45:46 crc kubenswrapper[4475]: I1203 06:45:46.954367 4475 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:45:46Z","lastTransitionTime":"2025-12-03T06:45:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:45:47 crc kubenswrapper[4475]: I1203 06:45:47.056269 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:45:47 crc kubenswrapper[4475]: I1203 06:45:47.056302 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:45:47 crc kubenswrapper[4475]: I1203 06:45:47.056311 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:45:47 crc kubenswrapper[4475]: I1203 06:45:47.056324 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:45:47 crc kubenswrapper[4475]: I1203 06:45:47.056332 4475 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:45:47Z","lastTransitionTime":"2025-12-03T06:45:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:45:47 crc kubenswrapper[4475]: I1203 06:45:47.158184 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:45:47 crc kubenswrapper[4475]: I1203 06:45:47.158225 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:45:47 crc kubenswrapper[4475]: I1203 06:45:47.158233 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:45:47 crc kubenswrapper[4475]: I1203 06:45:47.158245 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:45:47 crc kubenswrapper[4475]: I1203 06:45:47.158252 4475 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:45:47Z","lastTransitionTime":"2025-12-03T06:45:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:45:47 crc kubenswrapper[4475]: I1203 06:45:47.259821 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:45:47 crc kubenswrapper[4475]: I1203 06:45:47.259845 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:45:47 crc kubenswrapper[4475]: I1203 06:45:47.259852 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:45:47 crc kubenswrapper[4475]: I1203 06:45:47.259863 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:45:47 crc kubenswrapper[4475]: I1203 06:45:47.259869 4475 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:45:47Z","lastTransitionTime":"2025-12-03T06:45:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:45:47 crc kubenswrapper[4475]: I1203 06:45:47.361276 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:45:47 crc kubenswrapper[4475]: I1203 06:45:47.361306 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:45:47 crc kubenswrapper[4475]: I1203 06:45:47.361314 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:45:47 crc kubenswrapper[4475]: I1203 06:45:47.361327 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:45:47 crc kubenswrapper[4475]: I1203 06:45:47.361335 4475 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:45:47Z","lastTransitionTime":"2025-12-03T06:45:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:45:47 crc kubenswrapper[4475]: I1203 06:45:47.462808 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:45:47 crc kubenswrapper[4475]: I1203 06:45:47.462831 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:45:47 crc kubenswrapper[4475]: I1203 06:45:47.462838 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:45:47 crc kubenswrapper[4475]: I1203 06:45:47.462849 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:45:47 crc kubenswrapper[4475]: I1203 06:45:47.462857 4475 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:45:47Z","lastTransitionTime":"2025-12-03T06:45:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:45:47 crc kubenswrapper[4475]: I1203 06:45:47.564151 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:45:47 crc kubenswrapper[4475]: I1203 06:45:47.564172 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:45:47 crc kubenswrapper[4475]: I1203 06:45:47.564179 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:45:47 crc kubenswrapper[4475]: I1203 06:45:47.564189 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:45:47 crc kubenswrapper[4475]: I1203 06:45:47.564196 4475 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:45:47Z","lastTransitionTime":"2025-12-03T06:45:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:45:47 crc kubenswrapper[4475]: I1203 06:45:47.652983 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-sbkp5" event={"ID":"b1df0a77-f3cc-49ab-9fbb-8a4c7608291b","Type":"ContainerStarted","Data":"4408ad7b7f122c0364b95e0e9761bc28dfb02e7ea00537a70fc031c16b38be6b"} Dec 03 06:45:47 crc kubenswrapper[4475]: I1203 06:45:47.653008 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-sbkp5" event={"ID":"b1df0a77-f3cc-49ab-9fbb-8a4c7608291b","Type":"ContainerStarted","Data":"5938dd3c72bee55a3a07312d31a0eaf2df226bb931b07300d71b6e7ff69c905b"} Dec 03 06:45:47 crc kubenswrapper[4475]: I1203 06:45:47.661282 4475 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:32Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:32Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:45:47Z is after 2025-08-24T17:21:41Z" Dec 03 06:45:47 crc kubenswrapper[4475]: I1203 06:45:47.665933 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:45:47 crc kubenswrapper[4475]: I1203 06:45:47.665965 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:45:47 crc kubenswrapper[4475]: I1203 06:45:47.665973 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:45:47 crc 
kubenswrapper[4475]: I1203 06:45:47.665986 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:45:47 crc kubenswrapper[4475]: I1203 06:45:47.665994 4475 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:45:47Z","lastTransitionTime":"2025-12-03T06:45:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 06:45:47 crc kubenswrapper[4475]: I1203 06:45:47.672116 4475 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-k9cmc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7168f008-1b03-40cf-94fa-a71d470454bf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://625db083ebf24244e0b28ac937bfa2554497ca35b8f7a1fee0ac739d647c70de\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e
eaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:45:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s6bbq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://31584b054f88aa7f7e4f1096e2b11acf6f106b7f2e4ced19768808e5df1a6acc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://31584b054f88aa7f7e4f1096e2b11acf6f106b7f2e4ced19768808e5df1a6acc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T06:45:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T06:45:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s6bbq\\\",\
\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9a644e827feb786d7298e41022ef3bc0d2483279c106dddea8e2c7a3c62c3c0d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9a644e827feb786d7298e41022ef3bc0d2483279c106dddea8e2c7a3c62c3c0d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T06:45:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T06:45:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s6bbq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://742f2f4dc23fff3df8e6d67902ef721b3db1823653b11a69faabdaf8d7650667\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartC
ount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://742f2f4dc23fff3df8e6d67902ef721b3db1823653b11a69faabdaf8d7650667\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T06:45:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T06:45:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s6bbq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2e5e874d26bf8bc806d74d55a8b9306cc30cca122d2ae0731b0a76ae7ac30450\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2e5e874d26bf8bc806d74d55a8b9306cc30cca122d2ae0731b0a76ae7ac30450\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T06:45:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T06:45:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceacco
unt\\\",\\\"name\\\":\\\"kube-api-access-s6bbq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d644ab44eabce045c9f9b23fab29e574e2f9f49c0cc14b830560996a0ec98880\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d644ab44eabce045c9f9b23fab29e574e2f9f49c0cc14b830560996a0ec98880\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T06:45:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T06:45:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s6bbq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ae9935e33badff0b016f8b5a02cb59d8b64451364581023ca3ec8e87fba0aa3a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ae
9935e33badff0b016f8b5a02cb59d8b64451364581023ca3ec8e87fba0aa3a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T06:45:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T06:45:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s6bbq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T06:45:35Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-k9cmc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:45:47Z is after 2025-08-24T17:21:41Z" Dec 03 06:45:47 crc kubenswrapper[4475]: I1203 06:45:47.680432 4475 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"57d2f580-9528-4200-b0a4-797fed1ae972\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://822cdbfb2e81d80c5de0253daa42f2a5c89e9cd0eb8a5c3cf620780d17f9a6d0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:45:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d66a9136874b2e25c94cd291aa6d7f4694ac409f16766fd69c8aab8068a441fb\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:45:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e40c4f29925f494c0f5f01e2ecbcd2e4db2a5f3911a55a874c6d0006f01982de\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:45:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7fc0ee9e5a408a0a9e701afaf1db7bc3f58fd1830044730e9c680664642b5e4e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:45:1
7Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1dd8bd42f01469966b55416fc8af1dd71d341c774263bb3a56190af4cd9e7daa\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:45:17Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5da1155d7b5e933e5db3acc4c1a3fa1b3b90fd79289641f9a3d1290956128628\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5da1155d7b5e933e5db3acc4c1a3fa1b3b90fd79289641f9a3d1290956128628\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T06:45:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T06:45:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startT
ime\\\":\\\"2025-12-03T06:45:15Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:45:47Z is after 2025-08-24T17:21:41Z" Dec 03 06:45:47 crc kubenswrapper[4475]: I1203 06:45:47.688101 4475 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a0af3d80-5aae-4d3b-a974-490687df49f3\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fa848c68a20d5db5c603cafa808518de84e427cbeea4bbc1be31151e6f839b3e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\
\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:45:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fe3e0d5fed18fddd7a1174f7a9f12290ce318e9a0de40fe432c79f6f2e24a608\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:45:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://43c05977da6544bc781a279fcddb3279dfee510fdd0a6f4f1a22b8629f17475f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:45:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"c
ert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ef987b2e9a0fa630edf6d5c06d5f47c5debd1b75d4626aefe7d8ef44bb974eb8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:45:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T06:45:15Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:45:47Z is after 2025-08-24T17:21:41Z" Dec 03 06:45:47 crc kubenswrapper[4475]: I1203 06:45:47.696063 4475 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:32Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:32Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:45:47Z is after 2025-08-24T17:21:41Z" Dec 03 06:45:47 crc kubenswrapper[4475]: I1203 06:45:47.702905 4475 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-tjbzg" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"91aee7be-4a52-4598-803f-2deebe0674de\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f13f644093fd1214d8fb39853857b4113dd7fde64f1a60ff6848fd4c5350f5b6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:45:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xvqvg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://159d103ae2d5d19ea94c57a59b534773f0e32f4c
b379a412b63ca743e221096e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:45:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xvqvg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T06:45:35Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-tjbzg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:45:47Z is after 2025-08-24T17:21:41Z" Dec 03 06:45:47 crc kubenswrapper[4475]: I1203 06:45:47.709866 4475 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-sbkp5" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b1df0a77-f3cc-49ab-9fbb-8a4c7608291b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5938dd3c72bee55a3a07312d31a0eaf2df226bb931b07300d71b6e7ff69c905b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:45:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-65wzb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4408ad7b7f122c0364b95e0e9761bc28dfb02
e7ea00537a70fc031c16b38be6b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:45:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-65wzb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T06:45:46Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-sbkp5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:45:47Z is after 2025-08-24T17:21:41Z" Dec 03 06:45:47 crc kubenswrapper[4475]: I1203 06:45:47.721905 4475 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"897f1a97-930a-4c3c-8804-d7cd6006ae9c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fbb015d3e05f9f94fc225cce6e24bc4a5df0bfc5aaea15fe120e2cc4b8f02902\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:45:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://da747a5ea4f790c71d99693c4bd79a1074f756a20f628fa63e8bad9a713645fc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:45:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6bf56315b6ad05ea9af0319db29b919ed0332d2a671c5ba94ea325bd45ef5703\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:45:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e045b99328661616ea0e44cd50bd394a403836eede05459d117567c191401172\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:45:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://054b1d2565cc9690152740f71682028595283525344a38ccea66c1f072eae92b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:45:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d5e0ad88c2e55994f952b46c2e806792d8fcbd79a901810aef92e46067cc7b92\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d5e0ad88c2e55994f952b46c2e806792d8fcbd79a901810aef92e46067cc7b92\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2025-12-03T06:45:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T06:45:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://22796f78d3d551f1ee271ca8581e196f142e70622944154f7d408a88c098f53b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://22796f78d3d551f1ee271ca8581e196f142e70622944154f7d408a88c098f53b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T06:45:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T06:45:16Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://9fb973559072f07252dcf50bda74d422ea2ed50000c02105381f8d21e5ff9888\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9fb973559072f07252dcf50bda74d422ea2ed50000c02105381f8d21e5ff9888\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T06:45:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T06:45:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T06:45:15Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:45:47Z is after 2025-08-24T17:21:41Z" Dec 03 06:45:47 crc kubenswrapper[4475]: I1203 06:45:47.729410 4475 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-dqbgx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9ef36226-4b8b-4a7b-a87f-daa9dda6e70b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5dc78fa3b07b9a5535f697323e9ed322ceefdc8798157160a05eb71017ac3a29\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235
da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:45:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wjjp4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T06:45:38Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-dqbgx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:45:47Z is after 2025-08-24T17:21:41Z" Dec 03 06:45:47 crc kubenswrapper[4475]: I1203 06:45:47.737203 4475 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:33Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:33Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2f651c16a4a98ff0a9b4783e60ece4c410d5fcb7d05ad42bf7842d8bb8a99f27\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:45:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-12-03T06:45:47Z is after 2025-08-24T17:21:41Z" Dec 03 06:45:47 crc kubenswrapper[4475]: I1203 06:45:47.744769 4475 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:33Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:33Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://30d9a05de148a1dbe0fa8f07bbc5f4f2c3cba395d686af03f2da63f8cdfe431c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:45:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"c
ri-o://07cf8d993193bca34b30ea77c473af45652fde6e73d0586efb78c14b9d003e22\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:45:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:45:47Z is after 2025-08-24T17:21:41Z" Dec 03 06:45:47 crc kubenswrapper[4475]: I1203 06:45:47.752393 4475 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:34Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:34Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6444fe7571ebb90d4ff4b30dc1a397023310b50b1816d0197cb545b4f5f7480f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:45:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-12-03T06:45:47Z is after 2025-08-24T17:21:41Z" Dec 03 06:45:47 crc kubenswrapper[4475]: I1203 06:45:47.759856 4475 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:32Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:32Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:45:47Z is after 2025-08-24T17:21:41Z" Dec 03 06:45:47 crc kubenswrapper[4475]: I1203 06:45:47.766346 4475 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-pcw7j" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8c1979d0-303c-4cf6-9087-3cb2e1aac73b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://eebaa73cf4e1efd781b258dd26910dc004392716180b14a7e64e89a03f2032a1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:45:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nrf7r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T06:45:34Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-pcw7j\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:45:47Z is after 2025-08-24T17:21:41Z" Dec 03 06:45:47 crc kubenswrapper[4475]: I1203 06:45:47.767521 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:45:47 crc kubenswrapper[4475]: I1203 06:45:47.767544 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:45:47 crc kubenswrapper[4475]: I1203 06:45:47.767553 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:45:47 crc kubenswrapper[4475]: I1203 06:45:47.767564 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:45:47 crc kubenswrapper[4475]: I1203 06:45:47.767572 4475 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:45:47Z","lastTransitionTime":"2025-12-03T06:45:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:45:47 crc kubenswrapper[4475]: I1203 06:45:47.774636 4475 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-9b2j8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f3a17c67-95e0-4889-8a30-64c08b6720f4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d2d627e2c307a8db9c86e8020f2b1c25c6e061e0c6460be63e231566488beaca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:45:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\
",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6pdk6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T06:45:34Z\\\"}}\" for pod \"openshift-multus\"/\"multus-9b2j8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:45:47Z 
is after 2025-08-24T17:21:41Z" Dec 03 06:45:47 crc kubenswrapper[4475]: I1203 06:45:47.788766 4475 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-g9t4l" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8f42839e-dbc4-445a-a15b-c3aa14813958\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:35Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:35Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://53948489397bbbfdf5f766211088d7f12fcd2dfbc8c3da6493e5abc49e3b41f5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:45:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ppdm2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a5090474cca8b8e2ed539ea74377506638d300be7eb750b3f3285477d8c9a375\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:45:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ppdm2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://60d3ec7cab1f249e81ae1db9ab97fa02e8b3c9d8376af4c6682dc3fc6f9d6d92\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:45:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ppdm2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b3243c863a4fb593b39fc3e3b835f647e9373d8b2dec69c5ff7657ed73c8f78a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:45:36Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ppdm2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://32897756f3658fda95db77180a0553a9d8656ed49c3ae5a017d32f5c5133a5a9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:45:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ppdm2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5e288f95676d5823cd3cb005318489d2f629a8fb74ad17ce6a67978d76006192\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:45:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ppdm2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6ab275be2e84c1b20f69d740d454b4916d2fb2af864c685198786088b835b49c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6ab275be2e84c1b20f69d740d454b4916d2fb2af864c685198786088b835b49c\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-03T06:45:44Z\\\",\\\"message\\\":\\\"ault: []services.LB{services.LB{Name:\\\\\\\"Service_openshift-apiserver/check-endpoints_TCP_cluster\\\\\\\", UUID:\\\\\\\"\\\\\\\", Protocol:\\\\\\\"TCP\\\\\\\", ExternalIDs:map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-apiserver/check-endpoints\\\\\\\"}, 
Opts:services.LBOpts{Reject:true, EmptyLBEvents:false, AffinityTimeOut:0, SkipSNAT:false, Template:false, AddressFamily:\\\\\\\"\\\\\\\"}, Rules:[]services.LBRule{services.LBRule{Source:services.Addr{IP:\\\\\\\"10.217.5.139\\\\\\\", Port:17698, Template:(*services.Template)(nil)}, Targets:[]services.Addr{}}}, Templates:services.TemplateMap(nil), Switches:[]string{}, Routers:[]string{}, Groups:[]string{\\\\\\\"clusterLBGroup\\\\\\\"}}}\\\\nF1203 06:45:44.234950 5806 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certifi\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-03T06:45:43Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=ovnkube-controller 
pod=ovnkube-node-g9t4l_openshift-ovn-kubernetes(8f42839e-dbc4-445a-a15b-c3aa14813958)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ppdm2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://66a9c7568957099255bc910496da695e2af0122f2c853c3e221c666d7c2dee78\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:45:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ppdm2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://400610ebcdc7d47ecc1345287847a1909871411a12cdb3cbf895e05039b81c2b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://400610ebcdc7d47ecc
1345287847a1909871411a12cdb3cbf895e05039b81c2b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T06:45:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T06:45:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ppdm2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T06:45:35Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-g9t4l\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:45:47Z is after 2025-08-24T17:21:41Z" Dec 03 06:45:47 crc kubenswrapper[4475]: I1203 06:45:47.869186 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:45:47 crc kubenswrapper[4475]: I1203 06:45:47.869230 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:45:47 crc kubenswrapper[4475]: I1203 06:45:47.869239 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:45:47 crc kubenswrapper[4475]: I1203 06:45:47.869253 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:45:47 crc kubenswrapper[4475]: I1203 06:45:47.869262 4475 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:45:47Z","lastTransitionTime":"2025-12-03T06:45:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 06:45:47 crc kubenswrapper[4475]: I1203 06:45:47.971530 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:45:47 crc kubenswrapper[4475]: I1203 06:45:47.971563 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:45:47 crc kubenswrapper[4475]: I1203 06:45:47.971572 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:45:47 crc kubenswrapper[4475]: I1203 06:45:47.971588 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:45:47 crc kubenswrapper[4475]: I1203 06:45:47.971598 4475 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:45:47Z","lastTransitionTime":"2025-12-03T06:45:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 06:45:47 crc kubenswrapper[4475]: I1203 06:45:47.990023 4475 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/network-metrics-daemon-hq2rn"] Dec 03 06:45:47 crc kubenswrapper[4475]: I1203 06:45:47.990397 4475 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-hq2rn" Dec 03 06:45:47 crc kubenswrapper[4475]: E1203 06:45:47.990576 4475 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-hq2rn" podUID="7e9dd470-572a-4396-9be7-1a37e3c48977" Dec 03 06:45:47 crc kubenswrapper[4475]: I1203 06:45:47.997871 4475 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-sbkp5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b1df0a77-f3cc-49ab-9fbb-8a4c7608291b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5938dd3c72bee55a3a07312d31a0eaf2df226bb931b07300d71b6e7ff69c905b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/open
shift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:45:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-65wzb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4408ad7b7f122c0364b95e0e9761bc28dfb02e7ea00537a70fc031c16b38be6b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:45:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-65wzb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T06:45:46Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-sbkp5\": Internal 
error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:45:47Z is after 2025-08-24T17:21:41Z" Dec 03 06:45:48 crc kubenswrapper[4475]: I1203 06:45:48.006668 4475 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"57d2f580-9528-4200-b0a4-797fed1ae972\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://822cdbfb2e81d80c5de0253daa42f2a5c89e9cd0eb8a5c3cf620780d17f9a6d0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:45:16Z\\\"}},\\\"volumeMounts\
\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d66a9136874b2e25c94cd291aa6d7f4694ac409f16766fd69c8aab8068a441fb\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:45:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e40c4f29925f494c0f5f01e2ecbcd2e4db2a5f3911a55a874c6d0006f01982de\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:45:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7fc0ee9e5a408a0a9e701afaf1db7bc3f58fd1830044730e9c680664642b5e4e\\\",\\\"image\
\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:45:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1dd8bd42f01469966b55416fc8af1dd71d341c774263bb3a56190af4cd9e7daa\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:45:17Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5da1155d7b5e933e5db3acc4c1a3fa1b3b90fd79289641f9a3d1290956128628\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\"
:{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5da1155d7b5e933e5db3acc4c1a3fa1b3b90fd79289641f9a3d1290956128628\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T06:45:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T06:45:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T06:45:15Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:45:48Z is after 2025-08-24T17:21:41Z" Dec 03 06:45:48 crc kubenswrapper[4475]: I1203 06:45:48.014688 4475 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a0af3d80-5aae-4d3b-a974-490687df49f3\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fa848c68a20d5db5c603cafa808518de84e427cbeea4bbc1be31151e6f839b3e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:45:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fe3e0d5fed18fddd7a1174f7a9f12290ce318e9a0de40fe432c79f6f2e24a608\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:45:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://43c05977da6544bc781a279fcddb3279dfee510fdd0a6f4f1a22b8629f17475f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:45:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ef987b2e9a0fa630edf6d5c06d5f47c5debd1b75d4626aefe7d8ef44bb974eb8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-12-03T06:45:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T06:45:15Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:45:48Z is after 2025-08-24T17:21:41Z" Dec 03 06:45:48 crc kubenswrapper[4475]: I1203 06:45:48.022358 4475 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:32Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:32Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:45:48Z is after 2025-08-24T17:21:41Z" Dec 03 06:45:48 crc kubenswrapper[4475]: I1203 06:45:48.028961 4475 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-tjbzg" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"91aee7be-4a52-4598-803f-2deebe0674de\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f13f644093fd1214d8fb39853857b4113dd7fde64f1a60ff6848fd4c5350f5b6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:45:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xvqvg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://159d103ae2d5d19ea94c57a59b534773f0e32f4c
b379a412b63ca743e221096e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:45:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xvqvg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T06:45:35Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-tjbzg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:45:48Z is after 2025-08-24T17:21:41Z" Dec 03 06:45:48 crc kubenswrapper[4475]: I1203 06:45:48.041154 4475 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"897f1a97-930a-4c3c-8804-d7cd6006ae9c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fbb015d3e05f9f94fc225cce6e24bc4a5df0bfc5aaea15fe120e2cc4b8f02902\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:45:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://da747a5ea4f790c71d99693c4bd79a1074f756a20f628fa63e8bad9a713645fc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:45:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6bf56315b6ad05ea9af0319db29b919ed0332d2a671c5ba94ea325bd45ef5703\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:45:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e045b99328661616ea0e44cd50bd394a403836eede05459d117567c191401172\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:45:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://054b1d2565cc9690152740f71682028595283525344a38ccea66c1f072eae92b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:45:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d5e0ad88c2e55994f952b46c2e806792d8fcbd79a901810aef92e46067cc7b92\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d5e0ad88c2e55994f952b46c2e806792d8fcbd79a901810aef92e46067cc7b92\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2025-12-03T06:45:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T06:45:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://22796f78d3d551f1ee271ca8581e196f142e70622944154f7d408a88c098f53b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://22796f78d3d551f1ee271ca8581e196f142e70622944154f7d408a88c098f53b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T06:45:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T06:45:16Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://9fb973559072f07252dcf50bda74d422ea2ed50000c02105381f8d21e5ff9888\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9fb973559072f07252dcf50bda74d422ea2ed50000c02105381f8d21e5ff9888\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T06:45:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T06:45:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T06:45:15Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:45:48Z is after 2025-08-24T17:21:41Z" Dec 03 06:45:48 crc kubenswrapper[4475]: I1203 06:45:48.047601 4475 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-dqbgx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9ef36226-4b8b-4a7b-a87f-daa9dda6e70b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5dc78fa3b07b9a5535f697323e9ed322ceefdc8798157160a05eb71017ac3a29\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235
da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:45:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wjjp4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T06:45:38Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-dqbgx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:45:48Z is after 2025-08-24T17:21:41Z" Dec 03 06:45:48 crc kubenswrapper[4475]: I1203 06:45:48.054710 4475 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:32Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:32Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:45:48Z is after 2025-08-24T17:21:41Z" Dec 03 06:45:48 crc kubenswrapper[4475]: I1203 06:45:48.060921 4475 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-pcw7j" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8c1979d0-303c-4cf6-9087-3cb2e1aac73b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://eebaa73cf4e1efd781b258dd26910dc004392716180b14a7e64e89a03f2032a1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:45:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nrf7r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T06:45:34Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-pcw7j\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:45:48Z is after 2025-08-24T17:21:41Z" Dec 03 06:45:48 crc kubenswrapper[4475]: I1203 06:45:48.069233 4475 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-9b2j8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f3a17c67-95e0-4889-8a30-64c08b6720f4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d2d627e2c307a8db9c86e8020f2b1c25c6e061e0c6460be63e231566488beaca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\
\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:45:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6pdk6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\
\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T06:45:34Z\\\"}}\" for pod \"openshift-multus\"/\"multus-9b2j8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:45:48Z is after 2025-08-24T17:21:41Z" Dec 03 06:45:48 crc kubenswrapper[4475]: I1203 06:45:48.072873 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:45:48 crc kubenswrapper[4475]: I1203 06:45:48.072932 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:45:48 crc kubenswrapper[4475]: I1203 06:45:48.072942 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:45:48 crc kubenswrapper[4475]: I1203 06:45:48.072956 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:45:48 crc kubenswrapper[4475]: I1203 06:45:48.072965 4475 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:45:48Z","lastTransitionTime":"2025-12-03T06:45:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:45:48 crc kubenswrapper[4475]: I1203 06:45:48.081179 4475 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-g9t4l" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8f42839e-dbc4-445a-a15b-c3aa14813958\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:35Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:35Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://53948489397bbbfdf5f766211088d7f12fcd2dfbc8c3da6493e5abc49e3b41f5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:45:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ppdm2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a5090474cca8b8e2ed539ea74377506638d300be7eb750b3f3285477d8c9a375\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:45:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ppdm2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://60d3ec7cab1f249e81ae1db9ab97fa02e8b3c9d8376af4c6682dc3fc6f9d6d92\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:45:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ppdm2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b3243c863a4fb593b39fc3e3b835f647e9373d8b2dec69c5ff7657ed73c8f78a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:45:36Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ppdm2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://32897756f3658fda95db77180a0553a9d8656ed49c3ae5a017d32f5c5133a5a9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:45:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ppdm2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5e288f95676d5823cd3cb005318489d2f629a8fb74ad17ce6a67978d76006192\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:45:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ppdm2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6ab275be2e84c1b20f69d740d454b4916d2fb2af864c685198786088b835b49c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6ab275be2e84c1b20f69d740d454b4916d2fb2af864c685198786088b835b49c\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-03T06:45:44Z\\\",\\\"message\\\":\\\"ault: []services.LB{services.LB{Name:\\\\\\\"Service_openshift-apiserver/check-endpoints_TCP_cluster\\\\\\\", UUID:\\\\\\\"\\\\\\\", Protocol:\\\\\\\"TCP\\\\\\\", ExternalIDs:map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-apiserver/check-endpoints\\\\\\\"}, 
Opts:services.LBOpts{Reject:true, EmptyLBEvents:false, AffinityTimeOut:0, SkipSNAT:false, Template:false, AddressFamily:\\\\\\\"\\\\\\\"}, Rules:[]services.LBRule{services.LBRule{Source:services.Addr{IP:\\\\\\\"10.217.5.139\\\\\\\", Port:17698, Template:(*services.Template)(nil)}, Targets:[]services.Addr{}}}, Templates:services.TemplateMap(nil), Switches:[]string{}, Routers:[]string{}, Groups:[]string{\\\\\\\"clusterLBGroup\\\\\\\"}}}\\\\nF1203 06:45:44.234950 5806 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certifi\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-03T06:45:43Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=ovnkube-controller 
pod=ovnkube-node-g9t4l_openshift-ovn-kubernetes(8f42839e-dbc4-445a-a15b-c3aa14813958)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ppdm2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://66a9c7568957099255bc910496da695e2af0122f2c853c3e221c666d7c2dee78\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:45:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ppdm2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://400610ebcdc7d47ecc1345287847a1909871411a12cdb3cbf895e05039b81c2b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://400610ebcdc7d47ecc
1345287847a1909871411a12cdb3cbf895e05039b81c2b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T06:45:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T06:45:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ppdm2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T06:45:35Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-g9t4l\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:45:48Z is after 2025-08-24T17:21:41Z" Dec 03 06:45:48 crc kubenswrapper[4475]: I1203 06:45:48.088017 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cg4hm\" (UniqueName: \"kubernetes.io/projected/7e9dd470-572a-4396-9be7-1a37e3c48977-kube-api-access-cg4hm\") pod \"network-metrics-daemon-hq2rn\" (UID: \"7e9dd470-572a-4396-9be7-1a37e3c48977\") " pod="openshift-multus/network-metrics-daemon-hq2rn" Dec 03 06:45:48 crc kubenswrapper[4475]: I1203 06:45:48.088042 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/7e9dd470-572a-4396-9be7-1a37e3c48977-metrics-certs\") pod \"network-metrics-daemon-hq2rn\" (UID: \"7e9dd470-572a-4396-9be7-1a37e3c48977\") " pod="openshift-multus/network-metrics-daemon-hq2rn" Dec 03 06:45:48 crc kubenswrapper[4475]: I1203 06:45:48.089474 4475 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:33Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:33Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2f651c16a4a98ff0a9b4783e60ece4c410d5fcb7d05ad42bf7842d8bb8a99f27\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:45:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call 
webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:45:48Z is after 2025-08-24T17:21:41Z" Dec 03 06:45:48 crc kubenswrapper[4475]: I1203 06:45:48.097031 4475 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:33Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:33Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://30d9a05de148a1dbe0fa8f07bbc5f4f2c3cba395d686af03f2da63f8cdfe431c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:45:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\
\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://07cf8d993193bca34b30ea77c473af45652fde6e73d0586efb78c14b9d003e22\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:45:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:45:48Z is after 2025-08-24T17:21:41Z" Dec 03 06:45:48 crc kubenswrapper[4475]: I1203 06:45:48.104136 4475 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:34Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:34Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6444fe7571ebb90d4ff4b30dc1a397023310b50b1816d0197cb545b4f5f7480f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:45:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-12-03T06:45:48Z is after 2025-08-24T17:21:41Z" Dec 03 06:45:48 crc kubenswrapper[4475]: I1203 06:45:48.110330 4475 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-hq2rn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7e9dd470-572a-4396-9be7-1a37e3c48977\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:47Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:47Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cg4hm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cg4hm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T06:45:47Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-hq2rn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:45:48Z is after 2025-08-24T17:21:41Z" Dec 03 06:45:48 crc 
kubenswrapper[4475]: I1203 06:45:48.117594 4475 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:32Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:32Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:45:48Z is after 2025-08-24T17:21:41Z" Dec 03 06:45:48 crc kubenswrapper[4475]: I1203 06:45:48.126469 4475 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-k9cmc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7168f008-1b03-40cf-94fa-a71d470454bf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://625db083ebf24244e0b28ac937bfa2554497ca35b8f7a1fee0ac739d647c70de\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:45:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s6bbq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://31584b054f88aa7f7e4f1096e2b11acf6f106b7f2e4ced19768808e5df1a6acc\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://31584b054f88aa7f7e4f1096e2b11acf6f106b7f2e4ced19768808e5df1a6acc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T06:45:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T06:45:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s6bbq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9a644e827feb786d7298e41022ef3bc0d2483279c106dddea8e2c7a3c62c3c0d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9a644e827feb786d7298e41022ef3bc0d2483279c106dddea8e2c7a3c62c3c0d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T06:45:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T06:45:36Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s6bbq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://742f2f4dc23fff3df8e6d67902ef721b3db1823653b11a69faabdaf8d7650667\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://742f2f4dc23fff3df8e6d67902ef721b3db1823653b11a69faabdaf8d7650667\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T06:45:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T06:45:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s6bbq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2e5e8
74d26bf8bc806d74d55a8b9306cc30cca122d2ae0731b0a76ae7ac30450\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2e5e874d26bf8bc806d74d55a8b9306cc30cca122d2ae0731b0a76ae7ac30450\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T06:45:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T06:45:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s6bbq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d644ab44eabce045c9f9b23fab29e574e2f9f49c0cc14b830560996a0ec98880\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d644ab44eabce045c9f9b23fab29e574e2f9f49c0cc14b830560996a0ec98880\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T06:45:39Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2025-12-03T06:45:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s6bbq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ae9935e33badff0b016f8b5a02cb59d8b64451364581023ca3ec8e87fba0aa3a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ae9935e33badff0b016f8b5a02cb59d8b64451364581023ca3ec8e87fba0aa3a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T06:45:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T06:45:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s6bbq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T06:45:35Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-k9cmc\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:45:48Z is after 2025-08-24T17:21:41Z" Dec 03 06:45:48 crc kubenswrapper[4475]: I1203 06:45:48.174888 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:45:48 crc kubenswrapper[4475]: I1203 06:45:48.174915 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:45:48 crc kubenswrapper[4475]: I1203 06:45:48.174925 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:45:48 crc kubenswrapper[4475]: I1203 06:45:48.174937 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:45:48 crc kubenswrapper[4475]: I1203 06:45:48.174946 4475 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:45:48Z","lastTransitionTime":"2025-12-03T06:45:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:45:48 crc kubenswrapper[4475]: I1203 06:45:48.189272 4475 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 03 06:45:48 crc kubenswrapper[4475]: E1203 06:45:48.189398 4475 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-03 06:46:04.189383549 +0000 UTC m=+48.994281893 (durationBeforeRetry 16s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 03 06:45:48 crc kubenswrapper[4475]: I1203 06:45:48.189489 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 03 06:45:48 crc kubenswrapper[4475]: E1203 06:45:48.189577 4475 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Dec 03 06:45:48 crc kubenswrapper[4475]: E1203 
06:45:48.189620 4475 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-12-03 06:46:04.189611739 +0000 UTC m=+48.994510074 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Dec 03 06:45:48 crc kubenswrapper[4475]: I1203 06:45:48.189645 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 03 06:45:48 crc kubenswrapper[4475]: I1203 06:45:48.189674 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 03 06:45:48 crc kubenswrapper[4475]: I1203 06:45:48.189694 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 03 06:45:48 crc 
kubenswrapper[4475]: I1203 06:45:48.189716 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cg4hm\" (UniqueName: \"kubernetes.io/projected/7e9dd470-572a-4396-9be7-1a37e3c48977-kube-api-access-cg4hm\") pod \"network-metrics-daemon-hq2rn\" (UID: \"7e9dd470-572a-4396-9be7-1a37e3c48977\") " pod="openshift-multus/network-metrics-daemon-hq2rn" Dec 03 06:45:48 crc kubenswrapper[4475]: E1203 06:45:48.189809 4475 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Dec 03 06:45:48 crc kubenswrapper[4475]: E1203 06:45:48.189823 4475 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Dec 03 06:45:48 crc kubenswrapper[4475]: I1203 06:45:48.189840 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/7e9dd470-572a-4396-9be7-1a37e3c48977-metrics-certs\") pod \"network-metrics-daemon-hq2rn\" (UID: \"7e9dd470-572a-4396-9be7-1a37e3c48977\") " pod="openshift-multus/network-metrics-daemon-hq2rn" Dec 03 06:45:48 crc kubenswrapper[4475]: E1203 06:45:48.189843 4475 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Dec 03 06:45:48 crc kubenswrapper[4475]: E1203 06:45:48.189859 4475 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Dec 03 06:45:48 crc kubenswrapper[4475]: E1203 06:45:48.189886 4475 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Dec 03 06:45:48 crc kubenswrapper[4475]: E1203 
06:45:48.189825 4475 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Dec 03 06:45:48 crc kubenswrapper[4475]: E1203 06:45:48.189904 4475 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-12-03 06:46:04.189891579 +0000 UTC m=+48.994789913 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Dec 03 06:45:48 crc kubenswrapper[4475]: E1203 06:45:48.189868 4475 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 03 06:45:48 crc kubenswrapper[4475]: E1203 06:45:48.189921 4475 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7e9dd470-572a-4396-9be7-1a37e3c48977-metrics-certs podName:7e9dd470-572a-4396-9be7-1a37e3c48977 nodeName:}" failed. No retries permitted until 2025-12-03 06:45:48.689914602 +0000 UTC m=+33.494812936 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/7e9dd470-572a-4396-9be7-1a37e3c48977-metrics-certs") pod "network-metrics-daemon-hq2rn" (UID: "7e9dd470-572a-4396-9be7-1a37e3c48977") : object "openshift-multus"/"metrics-daemon-secret" not registered Dec 03 06:45:48 crc kubenswrapper[4475]: E1203 06:45:48.189939 4475 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-12-03 06:46:04.189932315 +0000 UTC m=+48.994830649 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 03 06:45:48 crc kubenswrapper[4475]: E1203 06:45:48.189905 4475 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 03 06:45:48 crc kubenswrapper[4475]: E1203 06:45:48.190005 4475 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-12-03 06:46:04.189991026 +0000 UTC m=+48.994889360 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 03 06:45:48 crc kubenswrapper[4475]: I1203 06:45:48.202893 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cg4hm\" (UniqueName: \"kubernetes.io/projected/7e9dd470-572a-4396-9be7-1a37e3c48977-kube-api-access-cg4hm\") pod \"network-metrics-daemon-hq2rn\" (UID: \"7e9dd470-572a-4396-9be7-1a37e3c48977\") " pod="openshift-multus/network-metrics-daemon-hq2rn" Dec 03 06:45:48 crc kubenswrapper[4475]: I1203 06:45:48.277141 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:45:48 crc kubenswrapper[4475]: I1203 06:45:48.277165 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:45:48 crc kubenswrapper[4475]: I1203 06:45:48.277174 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:45:48 crc kubenswrapper[4475]: I1203 06:45:48.277186 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:45:48 crc kubenswrapper[4475]: I1203 06:45:48.277195 4475 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:45:48Z","lastTransitionTime":"2025-12-03T06:45:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:45:48 crc kubenswrapper[4475]: I1203 06:45:48.378696 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:45:48 crc kubenswrapper[4475]: I1203 06:45:48.378731 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:45:48 crc kubenswrapper[4475]: I1203 06:45:48.378741 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:45:48 crc kubenswrapper[4475]: I1203 06:45:48.378754 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:45:48 crc kubenswrapper[4475]: I1203 06:45:48.378762 4475 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:45:48Z","lastTransitionTime":"2025-12-03T06:45:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:45:48 crc kubenswrapper[4475]: I1203 06:45:48.480631 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:45:48 crc kubenswrapper[4475]: I1203 06:45:48.480660 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:45:48 crc kubenswrapper[4475]: I1203 06:45:48.480669 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:45:48 crc kubenswrapper[4475]: I1203 06:45:48.480682 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:45:48 crc kubenswrapper[4475]: I1203 06:45:48.480699 4475 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:45:48Z","lastTransitionTime":"2025-12-03T06:45:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 06:45:48 crc kubenswrapper[4475]: I1203 06:45:48.490876 4475 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 03 06:45:48 crc kubenswrapper[4475]: I1203 06:45:48.490877 4475 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 03 06:45:48 crc kubenswrapper[4475]: E1203 06:45:48.491097 4475 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 03 06:45:48 crc kubenswrapper[4475]: E1203 06:45:48.490966 4475 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 03 06:45:48 crc kubenswrapper[4475]: I1203 06:45:48.491241 4475 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 03 06:45:48 crc kubenswrapper[4475]: E1203 06:45:48.491307 4475 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 03 06:45:48 crc kubenswrapper[4475]: I1203 06:45:48.582763 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:45:48 crc kubenswrapper[4475]: I1203 06:45:48.582858 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:45:48 crc kubenswrapper[4475]: I1203 06:45:48.582915 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:45:48 crc kubenswrapper[4475]: I1203 06:45:48.582977 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:45:48 crc kubenswrapper[4475]: I1203 06:45:48.583030 4475 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:45:48Z","lastTransitionTime":"2025-12-03T06:45:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:45:48 crc kubenswrapper[4475]: I1203 06:45:48.684505 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:45:48 crc kubenswrapper[4475]: I1203 06:45:48.684624 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:45:48 crc kubenswrapper[4475]: I1203 06:45:48.684688 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:45:48 crc kubenswrapper[4475]: I1203 06:45:48.684741 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:45:48 crc kubenswrapper[4475]: I1203 06:45:48.684789 4475 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:45:48Z","lastTransitionTime":"2025-12-03T06:45:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:45:48 crc kubenswrapper[4475]: I1203 06:45:48.692964 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/7e9dd470-572a-4396-9be7-1a37e3c48977-metrics-certs\") pod \"network-metrics-daemon-hq2rn\" (UID: \"7e9dd470-572a-4396-9be7-1a37e3c48977\") " pod="openshift-multus/network-metrics-daemon-hq2rn" Dec 03 06:45:48 crc kubenswrapper[4475]: E1203 06:45:48.693433 4475 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Dec 03 06:45:48 crc kubenswrapper[4475]: E1203 06:45:48.693559 4475 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7e9dd470-572a-4396-9be7-1a37e3c48977-metrics-certs podName:7e9dd470-572a-4396-9be7-1a37e3c48977 nodeName:}" failed. No retries permitted until 2025-12-03 06:45:49.69354728 +0000 UTC m=+34.498445614 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/7e9dd470-572a-4396-9be7-1a37e3c48977-metrics-certs") pod "network-metrics-daemon-hq2rn" (UID: "7e9dd470-572a-4396-9be7-1a37e3c48977") : object "openshift-multus"/"metrics-daemon-secret" not registered Dec 03 06:45:48 crc kubenswrapper[4475]: I1203 06:45:48.786409 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:45:48 crc kubenswrapper[4475]: I1203 06:45:48.786441 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:45:48 crc kubenswrapper[4475]: I1203 06:45:48.786468 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:45:48 crc kubenswrapper[4475]: I1203 06:45:48.786481 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:45:48 crc kubenswrapper[4475]: I1203 06:45:48.786490 4475 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:45:48Z","lastTransitionTime":"2025-12-03T06:45:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:45:48 crc kubenswrapper[4475]: I1203 06:45:48.888179 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:45:48 crc kubenswrapper[4475]: I1203 06:45:48.888285 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:45:48 crc kubenswrapper[4475]: I1203 06:45:48.888360 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:45:48 crc kubenswrapper[4475]: I1203 06:45:48.888426 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:45:48 crc kubenswrapper[4475]: I1203 06:45:48.888506 4475 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:45:48Z","lastTransitionTime":"2025-12-03T06:45:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:45:48 crc kubenswrapper[4475]: I1203 06:45:48.990363 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:45:48 crc kubenswrapper[4475]: I1203 06:45:48.990614 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:45:48 crc kubenswrapper[4475]: I1203 06:45:48.990675 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:45:48 crc kubenswrapper[4475]: I1203 06:45:48.990736 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:45:48 crc kubenswrapper[4475]: I1203 06:45:48.990788 4475 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:45:48Z","lastTransitionTime":"2025-12-03T06:45:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:45:49 crc kubenswrapper[4475]: I1203 06:45:49.093265 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:45:49 crc kubenswrapper[4475]: I1203 06:45:49.093293 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:45:49 crc kubenswrapper[4475]: I1203 06:45:49.093301 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:45:49 crc kubenswrapper[4475]: I1203 06:45:49.093314 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:45:49 crc kubenswrapper[4475]: I1203 06:45:49.093323 4475 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:45:49Z","lastTransitionTime":"2025-12-03T06:45:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:45:49 crc kubenswrapper[4475]: I1203 06:45:49.194933 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:45:49 crc kubenswrapper[4475]: I1203 06:45:49.194966 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:45:49 crc kubenswrapper[4475]: I1203 06:45:49.194975 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:45:49 crc kubenswrapper[4475]: I1203 06:45:49.194989 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:45:49 crc kubenswrapper[4475]: I1203 06:45:49.195002 4475 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:45:49Z","lastTransitionTime":"2025-12-03T06:45:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:45:49 crc kubenswrapper[4475]: I1203 06:45:49.297229 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:45:49 crc kubenswrapper[4475]: I1203 06:45:49.297270 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:45:49 crc kubenswrapper[4475]: I1203 06:45:49.297281 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:45:49 crc kubenswrapper[4475]: I1203 06:45:49.297297 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:45:49 crc kubenswrapper[4475]: I1203 06:45:49.297309 4475 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:45:49Z","lastTransitionTime":"2025-12-03T06:45:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:45:49 crc kubenswrapper[4475]: I1203 06:45:49.399045 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:45:49 crc kubenswrapper[4475]: I1203 06:45:49.399065 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:45:49 crc kubenswrapper[4475]: I1203 06:45:49.399072 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:45:49 crc kubenswrapper[4475]: I1203 06:45:49.399082 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:45:49 crc kubenswrapper[4475]: I1203 06:45:49.399090 4475 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:45:49Z","lastTransitionTime":"2025-12-03T06:45:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 06:45:49 crc kubenswrapper[4475]: I1203 06:45:49.490881 4475 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-hq2rn" Dec 03 06:45:49 crc kubenswrapper[4475]: E1203 06:45:49.490989 4475 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-hq2rn" podUID="7e9dd470-572a-4396-9be7-1a37e3c48977" Dec 03 06:45:49 crc kubenswrapper[4475]: I1203 06:45:49.500274 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:45:49 crc kubenswrapper[4475]: I1203 06:45:49.500347 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:45:49 crc kubenswrapper[4475]: I1203 06:45:49.500357 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:45:49 crc kubenswrapper[4475]: I1203 06:45:49.500387 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:45:49 crc kubenswrapper[4475]: I1203 06:45:49.500397 4475 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:45:49Z","lastTransitionTime":"2025-12-03T06:45:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:45:49 crc kubenswrapper[4475]: I1203 06:45:49.601642 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:45:49 crc kubenswrapper[4475]: I1203 06:45:49.601674 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:45:49 crc kubenswrapper[4475]: I1203 06:45:49.601682 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:45:49 crc kubenswrapper[4475]: I1203 06:45:49.601692 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:45:49 crc kubenswrapper[4475]: I1203 06:45:49.601700 4475 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:45:49Z","lastTransitionTime":"2025-12-03T06:45:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:45:49 crc kubenswrapper[4475]: I1203 06:45:49.701808 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/7e9dd470-572a-4396-9be7-1a37e3c48977-metrics-certs\") pod \"network-metrics-daemon-hq2rn\" (UID: \"7e9dd470-572a-4396-9be7-1a37e3c48977\") " pod="openshift-multus/network-metrics-daemon-hq2rn" Dec 03 06:45:49 crc kubenswrapper[4475]: E1203 06:45:49.701918 4475 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Dec 03 06:45:49 crc kubenswrapper[4475]: E1203 06:45:49.701970 4475 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7e9dd470-572a-4396-9be7-1a37e3c48977-metrics-certs podName:7e9dd470-572a-4396-9be7-1a37e3c48977 nodeName:}" failed. No retries permitted until 2025-12-03 06:45:51.70195908 +0000 UTC m=+36.506857415 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/7e9dd470-572a-4396-9be7-1a37e3c48977-metrics-certs") pod "network-metrics-daemon-hq2rn" (UID: "7e9dd470-572a-4396-9be7-1a37e3c48977") : object "openshift-multus"/"metrics-daemon-secret" not registered Dec 03 06:45:49 crc kubenswrapper[4475]: I1203 06:45:49.702869 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:45:49 crc kubenswrapper[4475]: I1203 06:45:49.702903 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:45:49 crc kubenswrapper[4475]: I1203 06:45:49.702912 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:45:49 crc kubenswrapper[4475]: I1203 06:45:49.702923 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:45:49 crc kubenswrapper[4475]: I1203 06:45:49.702931 4475 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:45:49Z","lastTransitionTime":"2025-12-03T06:45:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:45:49 crc kubenswrapper[4475]: I1203 06:45:49.804242 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:45:49 crc kubenswrapper[4475]: I1203 06:45:49.804274 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:45:49 crc kubenswrapper[4475]: I1203 06:45:49.804283 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:45:49 crc kubenswrapper[4475]: I1203 06:45:49.804296 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:45:49 crc kubenswrapper[4475]: I1203 06:45:49.804305 4475 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:45:49Z","lastTransitionTime":"2025-12-03T06:45:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:45:49 crc kubenswrapper[4475]: I1203 06:45:49.906250 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:45:49 crc kubenswrapper[4475]: I1203 06:45:49.906283 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:45:49 crc kubenswrapper[4475]: I1203 06:45:49.906291 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:45:49 crc kubenswrapper[4475]: I1203 06:45:49.906307 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:45:49 crc kubenswrapper[4475]: I1203 06:45:49.906316 4475 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:45:49Z","lastTransitionTime":"2025-12-03T06:45:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:45:50 crc kubenswrapper[4475]: I1203 06:45:50.007590 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:45:50 crc kubenswrapper[4475]: I1203 06:45:50.007622 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:45:50 crc kubenswrapper[4475]: I1203 06:45:50.007631 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:45:50 crc kubenswrapper[4475]: I1203 06:45:50.007643 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:45:50 crc kubenswrapper[4475]: I1203 06:45:50.007652 4475 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:45:50Z","lastTransitionTime":"2025-12-03T06:45:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:45:50 crc kubenswrapper[4475]: I1203 06:45:50.108671 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:45:50 crc kubenswrapper[4475]: I1203 06:45:50.108701 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:45:50 crc kubenswrapper[4475]: I1203 06:45:50.108709 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:45:50 crc kubenswrapper[4475]: I1203 06:45:50.108720 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:45:50 crc kubenswrapper[4475]: I1203 06:45:50.108728 4475 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:45:50Z","lastTransitionTime":"2025-12-03T06:45:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:45:50 crc kubenswrapper[4475]: I1203 06:45:50.210472 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:45:50 crc kubenswrapper[4475]: I1203 06:45:50.210521 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:45:50 crc kubenswrapper[4475]: I1203 06:45:50.210534 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:45:50 crc kubenswrapper[4475]: I1203 06:45:50.210551 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:45:50 crc kubenswrapper[4475]: I1203 06:45:50.210760 4475 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:45:50Z","lastTransitionTime":"2025-12-03T06:45:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:45:50 crc kubenswrapper[4475]: I1203 06:45:50.312140 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:45:50 crc kubenswrapper[4475]: I1203 06:45:50.312175 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:45:50 crc kubenswrapper[4475]: I1203 06:45:50.312185 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:45:50 crc kubenswrapper[4475]: I1203 06:45:50.312200 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:45:50 crc kubenswrapper[4475]: I1203 06:45:50.312209 4475 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:45:50Z","lastTransitionTime":"2025-12-03T06:45:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:45:50 crc kubenswrapper[4475]: I1203 06:45:50.413987 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:45:50 crc kubenswrapper[4475]: I1203 06:45:50.414017 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:45:50 crc kubenswrapper[4475]: I1203 06:45:50.414026 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:45:50 crc kubenswrapper[4475]: I1203 06:45:50.414036 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:45:50 crc kubenswrapper[4475]: I1203 06:45:50.414045 4475 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:45:50Z","lastTransitionTime":"2025-12-03T06:45:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 06:45:50 crc kubenswrapper[4475]: I1203 06:45:50.490746 4475 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 03 06:45:50 crc kubenswrapper[4475]: E1203 06:45:50.490826 4475 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 03 06:45:50 crc kubenswrapper[4475]: I1203 06:45:50.490858 4475 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 03 06:45:50 crc kubenswrapper[4475]: E1203 06:45:50.490949 4475 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 03 06:45:50 crc kubenswrapper[4475]: I1203 06:45:50.491045 4475 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 03 06:45:50 crc kubenswrapper[4475]: E1203 06:45:50.491136 4475 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 03 06:45:50 crc kubenswrapper[4475]: I1203 06:45:50.515709 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:45:50 crc kubenswrapper[4475]: I1203 06:45:50.515815 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:45:50 crc kubenswrapper[4475]: I1203 06:45:50.515885 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:45:50 crc kubenswrapper[4475]: I1203 06:45:50.515951 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:45:50 crc kubenswrapper[4475]: I1203 06:45:50.516005 4475 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:45:50Z","lastTransitionTime":"2025-12-03T06:45:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:45:50 crc kubenswrapper[4475]: I1203 06:45:50.617227 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:45:50 crc kubenswrapper[4475]: I1203 06:45:50.617327 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:45:50 crc kubenswrapper[4475]: I1203 06:45:50.617383 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:45:50 crc kubenswrapper[4475]: I1203 06:45:50.617435 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:45:50 crc kubenswrapper[4475]: I1203 06:45:50.617507 4475 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:45:50Z","lastTransitionTime":"2025-12-03T06:45:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:45:50 crc kubenswrapper[4475]: I1203 06:45:50.719430 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:45:50 crc kubenswrapper[4475]: I1203 06:45:50.719473 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:45:50 crc kubenswrapper[4475]: I1203 06:45:50.719482 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:45:50 crc kubenswrapper[4475]: I1203 06:45:50.719492 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:45:50 crc kubenswrapper[4475]: I1203 06:45:50.719502 4475 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:45:50Z","lastTransitionTime":"2025-12-03T06:45:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:45:50 crc kubenswrapper[4475]: I1203 06:45:50.821215 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:45:50 crc kubenswrapper[4475]: I1203 06:45:50.821360 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:45:50 crc kubenswrapper[4475]: I1203 06:45:50.821416 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:45:50 crc kubenswrapper[4475]: I1203 06:45:50.821500 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:45:50 crc kubenswrapper[4475]: I1203 06:45:50.821555 4475 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:45:50Z","lastTransitionTime":"2025-12-03T06:45:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:45:50 crc kubenswrapper[4475]: I1203 06:45:50.923093 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:45:50 crc kubenswrapper[4475]: I1203 06:45:50.923115 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:45:50 crc kubenswrapper[4475]: I1203 06:45:50.923123 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:45:50 crc kubenswrapper[4475]: I1203 06:45:50.923135 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:45:50 crc kubenswrapper[4475]: I1203 06:45:50.923142 4475 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:45:50Z","lastTransitionTime":"2025-12-03T06:45:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:45:51 crc kubenswrapper[4475]: I1203 06:45:51.024645 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:45:51 crc kubenswrapper[4475]: I1203 06:45:51.024684 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:45:51 crc kubenswrapper[4475]: I1203 06:45:51.024698 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:45:51 crc kubenswrapper[4475]: I1203 06:45:51.024716 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:45:51 crc kubenswrapper[4475]: I1203 06:45:51.024724 4475 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:45:51Z","lastTransitionTime":"2025-12-03T06:45:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:45:51 crc kubenswrapper[4475]: I1203 06:45:51.126512 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:45:51 crc kubenswrapper[4475]: I1203 06:45:51.126626 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:45:51 crc kubenswrapper[4475]: I1203 06:45:51.126698 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:45:51 crc kubenswrapper[4475]: I1203 06:45:51.126770 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:45:51 crc kubenswrapper[4475]: I1203 06:45:51.126838 4475 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:45:51Z","lastTransitionTime":"2025-12-03T06:45:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:45:51 crc kubenswrapper[4475]: I1203 06:45:51.228465 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:45:51 crc kubenswrapper[4475]: I1203 06:45:51.228523 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:45:51 crc kubenswrapper[4475]: I1203 06:45:51.228532 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:45:51 crc kubenswrapper[4475]: I1203 06:45:51.228545 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:45:51 crc kubenswrapper[4475]: I1203 06:45:51.228556 4475 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:45:51Z","lastTransitionTime":"2025-12-03T06:45:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:45:51 crc kubenswrapper[4475]: I1203 06:45:51.330008 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:45:51 crc kubenswrapper[4475]: I1203 06:45:51.330030 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:45:51 crc kubenswrapper[4475]: I1203 06:45:51.330038 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:45:51 crc kubenswrapper[4475]: I1203 06:45:51.330046 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:45:51 crc kubenswrapper[4475]: I1203 06:45:51.330052 4475 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:45:51Z","lastTransitionTime":"2025-12-03T06:45:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:45:51 crc kubenswrapper[4475]: I1203 06:45:51.431879 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:45:51 crc kubenswrapper[4475]: I1203 06:45:51.431907 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:45:51 crc kubenswrapper[4475]: I1203 06:45:51.431914 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:45:51 crc kubenswrapper[4475]: I1203 06:45:51.431923 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:45:51 crc kubenswrapper[4475]: I1203 06:45:51.431930 4475 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:45:51Z","lastTransitionTime":"2025-12-03T06:45:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 06:45:51 crc kubenswrapper[4475]: I1203 06:45:51.491006 4475 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-hq2rn" Dec 03 06:45:51 crc kubenswrapper[4475]: E1203 06:45:51.491088 4475 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-hq2rn" podUID="7e9dd470-572a-4396-9be7-1a37e3c48977" Dec 03 06:45:51 crc kubenswrapper[4475]: I1203 06:45:51.535162 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:45:51 crc kubenswrapper[4475]: I1203 06:45:51.535196 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:45:51 crc kubenswrapper[4475]: I1203 06:45:51.535207 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:45:51 crc kubenswrapper[4475]: I1203 06:45:51.535219 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:45:51 crc kubenswrapper[4475]: I1203 06:45:51.535232 4475 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:45:51Z","lastTransitionTime":"2025-12-03T06:45:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:45:51 crc kubenswrapper[4475]: I1203 06:45:51.636739 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:45:51 crc kubenswrapper[4475]: I1203 06:45:51.636783 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:45:51 crc kubenswrapper[4475]: I1203 06:45:51.636791 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:45:51 crc kubenswrapper[4475]: I1203 06:45:51.636805 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:45:51 crc kubenswrapper[4475]: I1203 06:45:51.636815 4475 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:45:51Z","lastTransitionTime":"2025-12-03T06:45:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:45:51 crc kubenswrapper[4475]: I1203 06:45:51.717781 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/7e9dd470-572a-4396-9be7-1a37e3c48977-metrics-certs\") pod \"network-metrics-daemon-hq2rn\" (UID: \"7e9dd470-572a-4396-9be7-1a37e3c48977\") " pod="openshift-multus/network-metrics-daemon-hq2rn" Dec 03 06:45:51 crc kubenswrapper[4475]: E1203 06:45:51.717873 4475 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Dec 03 06:45:51 crc kubenswrapper[4475]: E1203 06:45:51.717931 4475 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7e9dd470-572a-4396-9be7-1a37e3c48977-metrics-certs podName:7e9dd470-572a-4396-9be7-1a37e3c48977 nodeName:}" failed. No retries permitted until 2025-12-03 06:45:55.717916536 +0000 UTC m=+40.522814880 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/7e9dd470-572a-4396-9be7-1a37e3c48977-metrics-certs") pod "network-metrics-daemon-hq2rn" (UID: "7e9dd470-572a-4396-9be7-1a37e3c48977") : object "openshift-multus"/"metrics-daemon-secret" not registered Dec 03 06:45:51 crc kubenswrapper[4475]: I1203 06:45:51.738929 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:45:51 crc kubenswrapper[4475]: I1203 06:45:51.738966 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:45:51 crc kubenswrapper[4475]: I1203 06:45:51.738978 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:45:51 crc kubenswrapper[4475]: I1203 06:45:51.738992 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:45:51 crc kubenswrapper[4475]: I1203 06:45:51.739000 4475 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:45:51Z","lastTransitionTime":"2025-12-03T06:45:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:45:51 crc kubenswrapper[4475]: I1203 06:45:51.840821 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:45:51 crc kubenswrapper[4475]: I1203 06:45:51.840850 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:45:51 crc kubenswrapper[4475]: I1203 06:45:51.840858 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:45:51 crc kubenswrapper[4475]: I1203 06:45:51.840867 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:45:51 crc kubenswrapper[4475]: I1203 06:45:51.840874 4475 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:45:51Z","lastTransitionTime":"2025-12-03T06:45:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:45:51 crc kubenswrapper[4475]: I1203 06:45:51.942744 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:45:51 crc kubenswrapper[4475]: I1203 06:45:51.942778 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:45:51 crc kubenswrapper[4475]: I1203 06:45:51.942786 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:45:51 crc kubenswrapper[4475]: I1203 06:45:51.942799 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:45:51 crc kubenswrapper[4475]: I1203 06:45:51.942807 4475 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:45:51Z","lastTransitionTime":"2025-12-03T06:45:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:45:52 crc kubenswrapper[4475]: I1203 06:45:52.044433 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:45:52 crc kubenswrapper[4475]: I1203 06:45:52.044482 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:45:52 crc kubenswrapper[4475]: I1203 06:45:52.044493 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:45:52 crc kubenswrapper[4475]: I1203 06:45:52.044504 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:45:52 crc kubenswrapper[4475]: I1203 06:45:52.044513 4475 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:45:52Z","lastTransitionTime":"2025-12-03T06:45:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:45:52 crc kubenswrapper[4475]: I1203 06:45:52.146393 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:45:52 crc kubenswrapper[4475]: I1203 06:45:52.146429 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:45:52 crc kubenswrapper[4475]: I1203 06:45:52.146439 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:45:52 crc kubenswrapper[4475]: I1203 06:45:52.146472 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:45:52 crc kubenswrapper[4475]: I1203 06:45:52.146483 4475 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:45:52Z","lastTransitionTime":"2025-12-03T06:45:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:45:52 crc kubenswrapper[4475]: I1203 06:45:52.247758 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:45:52 crc kubenswrapper[4475]: I1203 06:45:52.247785 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:45:52 crc kubenswrapper[4475]: I1203 06:45:52.247795 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:45:52 crc kubenswrapper[4475]: I1203 06:45:52.247805 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:45:52 crc kubenswrapper[4475]: I1203 06:45:52.247827 4475 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:45:52Z","lastTransitionTime":"2025-12-03T06:45:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:45:52 crc kubenswrapper[4475]: I1203 06:45:52.349483 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:45:52 crc kubenswrapper[4475]: I1203 06:45:52.349511 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:45:52 crc kubenswrapper[4475]: I1203 06:45:52.349519 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:45:52 crc kubenswrapper[4475]: I1203 06:45:52.349529 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:45:52 crc kubenswrapper[4475]: I1203 06:45:52.349539 4475 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:45:52Z","lastTransitionTime":"2025-12-03T06:45:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:45:52 crc kubenswrapper[4475]: I1203 06:45:52.451224 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:45:52 crc kubenswrapper[4475]: I1203 06:45:52.451265 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:45:52 crc kubenswrapper[4475]: I1203 06:45:52.451274 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:45:52 crc kubenswrapper[4475]: I1203 06:45:52.451286 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:45:52 crc kubenswrapper[4475]: I1203 06:45:52.451294 4475 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:45:52Z","lastTransitionTime":"2025-12-03T06:45:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 06:45:52 crc kubenswrapper[4475]: I1203 06:45:52.490640 4475 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 03 06:45:52 crc kubenswrapper[4475]: I1203 06:45:52.490666 4475 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 03 06:45:52 crc kubenswrapper[4475]: I1203 06:45:52.490667 4475 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 03 06:45:52 crc kubenswrapper[4475]: E1203 06:45:52.490722 4475 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 03 06:45:52 crc kubenswrapper[4475]: E1203 06:45:52.490761 4475 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 03 06:45:52 crc kubenswrapper[4475]: E1203 06:45:52.490802 4475 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 03 06:45:52 crc kubenswrapper[4475]: I1203 06:45:52.552503 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:45:52 crc kubenswrapper[4475]: I1203 06:45:52.552540 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:45:52 crc kubenswrapper[4475]: I1203 06:45:52.552549 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:45:52 crc kubenswrapper[4475]: I1203 06:45:52.552562 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:45:52 crc kubenswrapper[4475]: I1203 06:45:52.552571 4475 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:45:52Z","lastTransitionTime":"2025-12-03T06:45:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:45:52 crc kubenswrapper[4475]: I1203 06:45:52.654555 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:45:52 crc kubenswrapper[4475]: I1203 06:45:52.654600 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:45:52 crc kubenswrapper[4475]: I1203 06:45:52.654610 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:45:52 crc kubenswrapper[4475]: I1203 06:45:52.654622 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:45:52 crc kubenswrapper[4475]: I1203 06:45:52.654630 4475 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:45:52Z","lastTransitionTime":"2025-12-03T06:45:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:45:52 crc kubenswrapper[4475]: I1203 06:45:52.755987 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:45:52 crc kubenswrapper[4475]: I1203 06:45:52.756016 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:45:52 crc kubenswrapper[4475]: I1203 06:45:52.756026 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:45:52 crc kubenswrapper[4475]: I1203 06:45:52.756037 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:45:52 crc kubenswrapper[4475]: I1203 06:45:52.756046 4475 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:45:52Z","lastTransitionTime":"2025-12-03T06:45:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:45:52 crc kubenswrapper[4475]: I1203 06:45:52.857631 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:45:52 crc kubenswrapper[4475]: I1203 06:45:52.857682 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:45:52 crc kubenswrapper[4475]: I1203 06:45:52.857704 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:45:52 crc kubenswrapper[4475]: I1203 06:45:52.857716 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:45:52 crc kubenswrapper[4475]: I1203 06:45:52.857724 4475 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:45:52Z","lastTransitionTime":"2025-12-03T06:45:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:45:52 crc kubenswrapper[4475]: I1203 06:45:52.959415 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:45:52 crc kubenswrapper[4475]: I1203 06:45:52.959473 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:45:52 crc kubenswrapper[4475]: I1203 06:45:52.959485 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:45:52 crc kubenswrapper[4475]: I1203 06:45:52.959497 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:45:52 crc kubenswrapper[4475]: I1203 06:45:52.959505 4475 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:45:52Z","lastTransitionTime":"2025-12-03T06:45:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:45:53 crc kubenswrapper[4475]: I1203 06:45:53.060981 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:45:53 crc kubenswrapper[4475]: I1203 06:45:53.061068 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:45:53 crc kubenswrapper[4475]: I1203 06:45:53.061077 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:45:53 crc kubenswrapper[4475]: I1203 06:45:53.061087 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:45:53 crc kubenswrapper[4475]: I1203 06:45:53.061095 4475 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:45:53Z","lastTransitionTime":"2025-12-03T06:45:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:45:53 crc kubenswrapper[4475]: I1203 06:45:53.162713 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:45:53 crc kubenswrapper[4475]: I1203 06:45:53.162743 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:45:53 crc kubenswrapper[4475]: I1203 06:45:53.162751 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:45:53 crc kubenswrapper[4475]: I1203 06:45:53.162764 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:45:53 crc kubenswrapper[4475]: I1203 06:45:53.162774 4475 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:45:53Z","lastTransitionTime":"2025-12-03T06:45:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:45:53 crc kubenswrapper[4475]: I1203 06:45:53.264290 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:45:53 crc kubenswrapper[4475]: I1203 06:45:53.264340 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:45:53 crc kubenswrapper[4475]: I1203 06:45:53.264349 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:45:53 crc kubenswrapper[4475]: I1203 06:45:53.264363 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:45:53 crc kubenswrapper[4475]: I1203 06:45:53.264371 4475 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:45:53Z","lastTransitionTime":"2025-12-03T06:45:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:45:53 crc kubenswrapper[4475]: I1203 06:45:53.365975 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:45:53 crc kubenswrapper[4475]: I1203 06:45:53.365997 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:45:53 crc kubenswrapper[4475]: I1203 06:45:53.366007 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:45:53 crc kubenswrapper[4475]: I1203 06:45:53.366017 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:45:53 crc kubenswrapper[4475]: I1203 06:45:53.366025 4475 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:45:53Z","lastTransitionTime":"2025-12-03T06:45:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:45:53 crc kubenswrapper[4475]: I1203 06:45:53.467746 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:45:53 crc kubenswrapper[4475]: I1203 06:45:53.467777 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:45:53 crc kubenswrapper[4475]: I1203 06:45:53.467786 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:45:53 crc kubenswrapper[4475]: I1203 06:45:53.467795 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:45:53 crc kubenswrapper[4475]: I1203 06:45:53.467804 4475 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:45:53Z","lastTransitionTime":"2025-12-03T06:45:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 06:45:53 crc kubenswrapper[4475]: I1203 06:45:53.491118 4475 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-hq2rn" Dec 03 06:45:53 crc kubenswrapper[4475]: E1203 06:45:53.491216 4475 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-hq2rn" podUID="7e9dd470-572a-4396-9be7-1a37e3c48977" Dec 03 06:45:53 crc kubenswrapper[4475]: I1203 06:45:53.569033 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:45:53 crc kubenswrapper[4475]: I1203 06:45:53.569077 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:45:53 crc kubenswrapper[4475]: I1203 06:45:53.569089 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:45:53 crc kubenswrapper[4475]: I1203 06:45:53.569098 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:45:53 crc kubenswrapper[4475]: I1203 06:45:53.569105 4475 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:45:53Z","lastTransitionTime":"2025-12-03T06:45:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:45:53 crc kubenswrapper[4475]: I1203 06:45:53.670561 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:45:53 crc kubenswrapper[4475]: I1203 06:45:53.670591 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:45:53 crc kubenswrapper[4475]: I1203 06:45:53.670604 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:45:53 crc kubenswrapper[4475]: I1203 06:45:53.670618 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:45:53 crc kubenswrapper[4475]: I1203 06:45:53.670627 4475 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:45:53Z","lastTransitionTime":"2025-12-03T06:45:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:45:53 crc kubenswrapper[4475]: I1203 06:45:53.771870 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:45:53 crc kubenswrapper[4475]: I1203 06:45:53.771900 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:45:53 crc kubenswrapper[4475]: I1203 06:45:53.771908 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:45:53 crc kubenswrapper[4475]: I1203 06:45:53.771921 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:45:53 crc kubenswrapper[4475]: I1203 06:45:53.771928 4475 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:45:53Z","lastTransitionTime":"2025-12-03T06:45:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:45:53 crc kubenswrapper[4475]: I1203 06:45:53.873210 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:45:53 crc kubenswrapper[4475]: I1203 06:45:53.873250 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:45:53 crc kubenswrapper[4475]: I1203 06:45:53.873268 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:45:53 crc kubenswrapper[4475]: I1203 06:45:53.873279 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:45:53 crc kubenswrapper[4475]: I1203 06:45:53.873287 4475 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:45:53Z","lastTransitionTime":"2025-12-03T06:45:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:45:53 crc kubenswrapper[4475]: I1203 06:45:53.975080 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:45:53 crc kubenswrapper[4475]: I1203 06:45:53.975110 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:45:53 crc kubenswrapper[4475]: I1203 06:45:53.975118 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:45:53 crc kubenswrapper[4475]: I1203 06:45:53.975130 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:45:53 crc kubenswrapper[4475]: I1203 06:45:53.975139 4475 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:45:53Z","lastTransitionTime":"2025-12-03T06:45:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:45:54 crc kubenswrapper[4475]: I1203 06:45:54.076881 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:45:54 crc kubenswrapper[4475]: I1203 06:45:54.076913 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:45:54 crc kubenswrapper[4475]: I1203 06:45:54.076922 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:45:54 crc kubenswrapper[4475]: I1203 06:45:54.076939 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:45:54 crc kubenswrapper[4475]: I1203 06:45:54.076948 4475 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:45:54Z","lastTransitionTime":"2025-12-03T06:45:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:45:54 crc kubenswrapper[4475]: I1203 06:45:54.178642 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:45:54 crc kubenswrapper[4475]: I1203 06:45:54.178679 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:45:54 crc kubenswrapper[4475]: I1203 06:45:54.178691 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:45:54 crc kubenswrapper[4475]: I1203 06:45:54.178705 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:45:54 crc kubenswrapper[4475]: I1203 06:45:54.178715 4475 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:45:54Z","lastTransitionTime":"2025-12-03T06:45:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:45:54 crc kubenswrapper[4475]: I1203 06:45:54.213941 4475 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-g9t4l" Dec 03 06:45:54 crc kubenswrapper[4475]: I1203 06:45:54.214489 4475 scope.go:117] "RemoveContainer" containerID="6ab275be2e84c1b20f69d740d454b4916d2fb2af864c685198786088b835b49c" Dec 03 06:45:54 crc kubenswrapper[4475]: E1203 06:45:54.214606 4475 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 10s restarting failed container=ovnkube-controller pod=ovnkube-node-g9t4l_openshift-ovn-kubernetes(8f42839e-dbc4-445a-a15b-c3aa14813958)\"" pod="openshift-ovn-kubernetes/ovnkube-node-g9t4l" podUID="8f42839e-dbc4-445a-a15b-c3aa14813958" Dec 03 06:45:54 crc kubenswrapper[4475]: I1203 06:45:54.280169 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:45:54 crc kubenswrapper[4475]: I1203 06:45:54.280187 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:45:54 crc kubenswrapper[4475]: I1203 06:45:54.280195 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:45:54 crc kubenswrapper[4475]: I1203 06:45:54.280205 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:45:54 crc kubenswrapper[4475]: I1203 06:45:54.280212 4475 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:45:54Z","lastTransitionTime":"2025-12-03T06:45:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:45:54 crc kubenswrapper[4475]: I1203 06:45:54.381434 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:45:54 crc kubenswrapper[4475]: I1203 06:45:54.381501 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:45:54 crc kubenswrapper[4475]: I1203 06:45:54.381510 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:45:54 crc kubenswrapper[4475]: I1203 06:45:54.381523 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:45:54 crc kubenswrapper[4475]: I1203 06:45:54.381531 4475 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:45:54Z","lastTransitionTime":"2025-12-03T06:45:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:45:54 crc kubenswrapper[4475]: I1203 06:45:54.483491 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:45:54 crc kubenswrapper[4475]: I1203 06:45:54.483524 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:45:54 crc kubenswrapper[4475]: I1203 06:45:54.483534 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:45:54 crc kubenswrapper[4475]: I1203 06:45:54.483545 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:45:54 crc kubenswrapper[4475]: I1203 06:45:54.483554 4475 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:45:54Z","lastTransitionTime":"2025-12-03T06:45:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 06:45:54 crc kubenswrapper[4475]: I1203 06:45:54.490708 4475 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 03 06:45:54 crc kubenswrapper[4475]: I1203 06:45:54.490817 4475 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 03 06:45:54 crc kubenswrapper[4475]: I1203 06:45:54.490876 4475 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 03 06:45:54 crc kubenswrapper[4475]: E1203 06:45:54.490939 4475 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 03 06:45:54 crc kubenswrapper[4475]: E1203 06:45:54.491017 4475 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 03 06:45:54 crc kubenswrapper[4475]: E1203 06:45:54.491082 4475 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 03 06:45:54 crc kubenswrapper[4475]: I1203 06:45:54.585484 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:45:54 crc kubenswrapper[4475]: I1203 06:45:54.585596 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:45:54 crc kubenswrapper[4475]: I1203 06:45:54.585659 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:45:54 crc kubenswrapper[4475]: I1203 06:45:54.585717 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:45:54 crc kubenswrapper[4475]: I1203 06:45:54.585771 4475 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:45:54Z","lastTransitionTime":"2025-12-03T06:45:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:45:54 crc kubenswrapper[4475]: I1203 06:45:54.687488 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:45:54 crc kubenswrapper[4475]: I1203 06:45:54.687532 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:45:54 crc kubenswrapper[4475]: I1203 06:45:54.687542 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:45:54 crc kubenswrapper[4475]: I1203 06:45:54.687552 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:45:54 crc kubenswrapper[4475]: I1203 06:45:54.687559 4475 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:45:54Z","lastTransitionTime":"2025-12-03T06:45:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:45:54 crc kubenswrapper[4475]: I1203 06:45:54.788754 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:45:54 crc kubenswrapper[4475]: I1203 06:45:54.788778 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:45:54 crc kubenswrapper[4475]: I1203 06:45:54.788786 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:45:54 crc kubenswrapper[4475]: I1203 06:45:54.788795 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:45:54 crc kubenswrapper[4475]: I1203 06:45:54.788803 4475 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:45:54Z","lastTransitionTime":"2025-12-03T06:45:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:45:54 crc kubenswrapper[4475]: I1203 06:45:54.890382 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:45:54 crc kubenswrapper[4475]: I1203 06:45:54.890413 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:45:54 crc kubenswrapper[4475]: I1203 06:45:54.890430 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:45:54 crc kubenswrapper[4475]: I1203 06:45:54.890474 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:45:54 crc kubenswrapper[4475]: I1203 06:45:54.890484 4475 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:45:54Z","lastTransitionTime":"2025-12-03T06:45:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:45:54 crc kubenswrapper[4475]: I1203 06:45:54.992391 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:45:54 crc kubenswrapper[4475]: I1203 06:45:54.992609 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:45:54 crc kubenswrapper[4475]: I1203 06:45:54.992687 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:45:54 crc kubenswrapper[4475]: I1203 06:45:54.992751 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:45:54 crc kubenswrapper[4475]: I1203 06:45:54.992816 4475 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:45:54Z","lastTransitionTime":"2025-12-03T06:45:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:45:55 crc kubenswrapper[4475]: I1203 06:45:55.094635 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:45:55 crc kubenswrapper[4475]: I1203 06:45:55.094736 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:45:55 crc kubenswrapper[4475]: I1203 06:45:55.094805 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:45:55 crc kubenswrapper[4475]: I1203 06:45:55.094870 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:45:55 crc kubenswrapper[4475]: I1203 06:45:55.094936 4475 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:45:55Z","lastTransitionTime":"2025-12-03T06:45:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:45:55 crc kubenswrapper[4475]: I1203 06:45:55.196052 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:45:55 crc kubenswrapper[4475]: I1203 06:45:55.196084 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:45:55 crc kubenswrapper[4475]: I1203 06:45:55.196092 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:45:55 crc kubenswrapper[4475]: I1203 06:45:55.196105 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:45:55 crc kubenswrapper[4475]: I1203 06:45:55.196113 4475 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:45:55Z","lastTransitionTime":"2025-12-03T06:45:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:45:55 crc kubenswrapper[4475]: I1203 06:45:55.297785 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:45:55 crc kubenswrapper[4475]: I1203 06:45:55.297817 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:45:55 crc kubenswrapper[4475]: I1203 06:45:55.297825 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:45:55 crc kubenswrapper[4475]: I1203 06:45:55.297853 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:45:55 crc kubenswrapper[4475]: I1203 06:45:55.297862 4475 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:45:55Z","lastTransitionTime":"2025-12-03T06:45:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:45:55 crc kubenswrapper[4475]: I1203 06:45:55.399202 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:45:55 crc kubenswrapper[4475]: I1203 06:45:55.399236 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:45:55 crc kubenswrapper[4475]: I1203 06:45:55.399244 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:45:55 crc kubenswrapper[4475]: I1203 06:45:55.399257 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:45:55 crc kubenswrapper[4475]: I1203 06:45:55.399266 4475 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:45:55Z","lastTransitionTime":"2025-12-03T06:45:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 06:45:55 crc kubenswrapper[4475]: I1203 06:45:55.491286 4475 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-hq2rn" Dec 03 06:45:55 crc kubenswrapper[4475]: E1203 06:45:55.491393 4475 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-hq2rn" podUID="7e9dd470-572a-4396-9be7-1a37e3c48977" Dec 03 06:45:55 crc kubenswrapper[4475]: I1203 06:45:55.499377 4475 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-sbkp5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b1df0a77-f3cc-49ab-9fbb-8a4c7608291b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5938dd3c72bee55a3a07312d31a0eaf2df226bb931b07300d71b6e7ff69c905b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:45:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\"
,\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-65wzb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4408ad7b7f122c0364b95e0e9761bc28dfb02e7ea00537a70fc031c16b38be6b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:45:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-65wzb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T06:45:46Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-sbkp5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:45:55Z is after 2025-08-24T17:21:41Z" Dec 03 06:45:55 crc kubenswrapper[4475]: I1203 06:45:55.500474 4475 kubelet_node_status.go:724] "Recording event message for node" 
node="crc" event="NodeHasSufficientMemory" Dec 03 06:45:55 crc kubenswrapper[4475]: I1203 06:45:55.500498 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:45:55 crc kubenswrapper[4475]: I1203 06:45:55.500507 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:45:55 crc kubenswrapper[4475]: I1203 06:45:55.500517 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:45:55 crc kubenswrapper[4475]: I1203 06:45:55.500525 4475 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:45:55Z","lastTransitionTime":"2025-12-03T06:45:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:45:55 crc kubenswrapper[4475]: I1203 06:45:55.509119 4475 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"57d2f580-9528-4200-b0a4-797fed1ae972\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://822cdbfb2e81d80c5de0253daa42f2a5c89e9cd0eb8a5c3cf620780d17f9a6d0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:45:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-d
ir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d66a9136874b2e25c94cd291aa6d7f4694ac409f16766fd69c8aab8068a441fb\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:45:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e40c4f29925f494c0f5f01e2ecbcd2e4db2a5f3911a55a874c6d0006f01982de\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:45:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7fc0ee9e5a408a0a9e701afaf1db7bc3f58fd1830044730e9c680664642b5e4e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945
c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:45:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1dd8bd42f01469966b55416fc8af1dd71d341c774263bb3a56190af4cd9e7daa\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:45:17Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5da1155d7b5e933e5db3acc4c1a3fa1b3b90fd79289641f9a3d1290956128628\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5da1155d7b5e933e5db3acc4c1a3fa1b3b90fd79289641f9a3d1290956128628\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T06:45:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T06:45:16Z\\\"}
},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T06:45:15Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:45:55Z is after 2025-08-24T17:21:41Z" Dec 03 06:45:55 crc kubenswrapper[4475]: I1203 06:45:55.517465 4475 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a0af3d80-5aae-4d3b-a974-490687df49f3\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fa848c68a20d5db5c603cafa808518de84e427cbeea4bbc1be31151e6f839b3e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85ae
d21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:45:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fe3e0d5fed18fddd7a1174f7a9f12290ce318e9a0de40fe432c79f6f2e24a608\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:45:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://43c05977da6544bc781a279fcddb3279dfee510fdd0a6f4f1a22b8629f17475f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"star
tedAt\\\":\\\"2025-12-03T06:45:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ef987b2e9a0fa630edf6d5c06d5f47c5debd1b75d4626aefe7d8ef44bb974eb8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:45:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T06:45:15Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:45:55Z is after 2025-08-24T17:21:41Z" Dec 03 06:45:55 crc kubenswrapper[4475]: I1203 06:45:55.524961 4475 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" 
err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:32Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:32Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:45:55Z is after 2025-08-24T17:21:41Z" Dec 03 06:45:55 crc kubenswrapper[4475]: I1203 06:45:55.531904 4475 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-tjbzg" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"91aee7be-4a52-4598-803f-2deebe0674de\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f13f644093fd1214d8fb39853857b4113dd7fde64f1a60ff6848fd4c5350f5b6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:45:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xvqvg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://159d103ae2d5d19ea94c57a59b534773f0e32f4c
b379a412b63ca743e221096e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:45:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xvqvg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T06:45:35Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-tjbzg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:45:55Z is after 2025-08-24T17:21:41Z" Dec 03 06:45:55 crc kubenswrapper[4475]: I1203 06:45:55.544589 4475 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"897f1a97-930a-4c3c-8804-d7cd6006ae9c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fbb015d3e05f9f94fc225cce6e24bc4a5df0bfc5aaea15fe120e2cc4b8f02902\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:45:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://da747a5ea4f790c71d99693c4bd79a1074f756a20f628fa63e8bad9a713645fc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:45:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6bf56315b6ad05ea9af0319db29b919ed0332d2a671c5ba94ea325bd45ef5703\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:45:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e045b99328661616ea0e44cd50bd394a403836eede05459d117567c191401172\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:45:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://054b1d2565cc9690152740f71682028595283525344a38ccea66c1f072eae92b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:45:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d5e0ad88c2e55994f952b46c2e806792d8fcbd79a901810aef92e46067cc7b92\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d5e0ad88c2e55994f952b46c2e806792d8fcbd79a901810aef92e46067cc7b92\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2025-12-03T06:45:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T06:45:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://22796f78d3d551f1ee271ca8581e196f142e70622944154f7d408a88c098f53b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://22796f78d3d551f1ee271ca8581e196f142e70622944154f7d408a88c098f53b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T06:45:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T06:45:16Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://9fb973559072f07252dcf50bda74d422ea2ed50000c02105381f8d21e5ff9888\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9fb973559072f07252dcf50bda74d422ea2ed50000c02105381f8d21e5ff9888\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T06:45:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T06:45:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T06:45:15Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:45:55Z is after 2025-08-24T17:21:41Z" Dec 03 06:45:55 crc kubenswrapper[4475]: I1203 06:45:55.552236 4475 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-dqbgx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9ef36226-4b8b-4a7b-a87f-daa9dda6e70b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5dc78fa3b07b9a5535f697323e9ed322ceefdc8798157160a05eb71017ac3a29\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235
da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:45:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wjjp4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T06:45:38Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-dqbgx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:45:55Z is after 2025-08-24T17:21:41Z" Dec 03 06:45:55 crc kubenswrapper[4475]: I1203 06:45:55.559553 4475 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:32Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:32Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:45:55Z is after 2025-08-24T17:21:41Z" Dec 03 06:45:55 crc kubenswrapper[4475]: I1203 06:45:55.567176 4475 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-pcw7j" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8c1979d0-303c-4cf6-9087-3cb2e1aac73b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://eebaa73cf4e1efd781b258dd26910dc004392716180b14a7e64e89a03f2032a1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:45:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nrf7r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T06:45:34Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-pcw7j\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:45:55Z is after 2025-08-24T17:21:41Z" Dec 03 06:45:55 crc kubenswrapper[4475]: I1203 06:45:55.576240 4475 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-9b2j8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f3a17c67-95e0-4889-8a30-64c08b6720f4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d2d627e2c307a8db9c86e8020f2b1c25c6e061e0c6460be63e231566488beaca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\
\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:45:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6pdk6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\
\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T06:45:34Z\\\"}}\" for pod \"openshift-multus\"/\"multus-9b2j8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:45:55Z is after 2025-08-24T17:21:41Z" Dec 03 06:45:55 crc kubenswrapper[4475]: I1203 06:45:55.588307 4475 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-g9t4l" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8f42839e-dbc4-445a-a15b-c3aa14813958\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:35Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:35Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://53948489397bbbfdf5f766211088d7f12fcd2dfbc8c3da6493e5abc49e3b41f5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:45:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ppdm2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a5090474cca8b8e2ed539ea74377506638d300be7eb750b3f3285477d8c9a375\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:45:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ppdm2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://60d3ec7cab1f249e81ae1db9ab97fa02e8b3c9d8376af4c6682dc3fc6f9d6d92\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:45:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ppdm2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b3243c863a4fb593b39fc3e3b835f647e9373d8b2dec69c5ff7657ed73c8f78a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:45:36Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ppdm2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://32897756f3658fda95db77180a0553a9d8656ed49c3ae5a017d32f5c5133a5a9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:45:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ppdm2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5e288f95676d5823cd3cb005318489d2f629a8fb74ad17ce6a67978d76006192\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:45:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ppdm2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6ab275be2e84c1b20f69d740d454b4916d2fb2af864c685198786088b835b49c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6ab275be2e84c1b20f69d740d454b4916d2fb2af864c685198786088b835b49c\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-03T06:45:44Z\\\",\\\"message\\\":\\\"ault: []services.LB{services.LB{Name:\\\\\\\"Service_openshift-apiserver/check-endpoints_TCP_cluster\\\\\\\", UUID:\\\\\\\"\\\\\\\", Protocol:\\\\\\\"TCP\\\\\\\", ExternalIDs:map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-apiserver/check-endpoints\\\\\\\"}, 
Opts:services.LBOpts{Reject:true, EmptyLBEvents:false, AffinityTimeOut:0, SkipSNAT:false, Template:false, AddressFamily:\\\\\\\"\\\\\\\"}, Rules:[]services.LBRule{services.LBRule{Source:services.Addr{IP:\\\\\\\"10.217.5.139\\\\\\\", Port:17698, Template:(*services.Template)(nil)}, Targets:[]services.Addr{}}}, Templates:services.TemplateMap(nil), Switches:[]string{}, Routers:[]string{}, Groups:[]string{\\\\\\\"clusterLBGroup\\\\\\\"}}}\\\\nF1203 06:45:44.234950 5806 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certifi\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-03T06:45:43Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=ovnkube-controller 
pod=ovnkube-node-g9t4l_openshift-ovn-kubernetes(8f42839e-dbc4-445a-a15b-c3aa14813958)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ppdm2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://66a9c7568957099255bc910496da695e2af0122f2c853c3e221c666d7c2dee78\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:45:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ppdm2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://400610ebcdc7d47ecc1345287847a1909871411a12cdb3cbf895e05039b81c2b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://400610ebcdc7d47ecc
1345287847a1909871411a12cdb3cbf895e05039b81c2b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T06:45:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T06:45:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ppdm2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T06:45:35Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-g9t4l\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:45:55Z is after 2025-08-24T17:21:41Z" Dec 03 06:45:55 crc kubenswrapper[4475]: I1203 06:45:55.596478 4475 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:33Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:33Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2f651c16a4a98ff0a9b4783e60ece4c410d5fcb7d05ad42bf7842d8bb8a99f27\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:45:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-12-03T06:45:55Z is after 2025-08-24T17:21:41Z" Dec 03 06:45:55 crc kubenswrapper[4475]: I1203 06:45:55.602942 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:45:55 crc kubenswrapper[4475]: I1203 06:45:55.602967 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:45:55 crc kubenswrapper[4475]: I1203 06:45:55.602976 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:45:55 crc kubenswrapper[4475]: I1203 06:45:55.602989 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:45:55 crc kubenswrapper[4475]: I1203 06:45:55.602997 4475 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:45:55Z","lastTransitionTime":"2025-12-03T06:45:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:45:55 crc kubenswrapper[4475]: I1203 06:45:55.604653 4475 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:33Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:33Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://30d9a05de148a1dbe0fa8f07bbc5f4f2c3cba395d686af03f2da63f8cdfe431c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:45:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://07cf8d993193bca34b30ea77c473af45652fde6e73d0586efb78c14b9d003e22\\\",\\\
"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:45:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:45:55Z is after 2025-08-24T17:21:41Z" Dec 03 06:45:55 crc kubenswrapper[4475]: I1203 06:45:55.611845 4475 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:34Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:34Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6444fe7571ebb90d4ff4b30dc1a397023310b50b1816d0197cb545b4f5f7480f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:45:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-12-03T06:45:55Z is after 2025-08-24T17:21:41Z" Dec 03 06:45:55 crc kubenswrapper[4475]: I1203 06:45:55.618791 4475 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-hq2rn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7e9dd470-572a-4396-9be7-1a37e3c48977\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:47Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:47Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cg4hm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cg4hm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T06:45:47Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-hq2rn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:45:55Z is after 2025-08-24T17:21:41Z" Dec 03 06:45:55 crc 
kubenswrapper[4475]: I1203 06:45:55.626680 4475 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:32Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:32Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:45:55Z is after 2025-08-24T17:21:41Z" Dec 03 06:45:55 crc kubenswrapper[4475]: I1203 06:45:55.635470 4475 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-k9cmc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7168f008-1b03-40cf-94fa-a71d470454bf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://625db083ebf24244e0b28ac937bfa2554497ca35b8f7a1fee0ac739d647c70de\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:45:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s6bbq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://31584b054f88aa7f7e4f1096e2b11acf6f106b7f2e4ced19768808e5df1a6acc\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://31584b054f88aa7f7e4f1096e2b11acf6f106b7f2e4ced19768808e5df1a6acc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T06:45:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T06:45:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s6bbq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9a644e827feb786d7298e41022ef3bc0d2483279c106dddea8e2c7a3c62c3c0d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9a644e827feb786d7298e41022ef3bc0d2483279c106dddea8e2c7a3c62c3c0d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T06:45:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T06:45:36Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s6bbq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://742f2f4dc23fff3df8e6d67902ef721b3db1823653b11a69faabdaf8d7650667\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://742f2f4dc23fff3df8e6d67902ef721b3db1823653b11a69faabdaf8d7650667\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T06:45:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T06:45:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s6bbq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2e5e8
74d26bf8bc806d74d55a8b9306cc30cca122d2ae0731b0a76ae7ac30450\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2e5e874d26bf8bc806d74d55a8b9306cc30cca122d2ae0731b0a76ae7ac30450\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T06:45:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T06:45:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s6bbq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d644ab44eabce045c9f9b23fab29e574e2f9f49c0cc14b830560996a0ec98880\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d644ab44eabce045c9f9b23fab29e574e2f9f49c0cc14b830560996a0ec98880\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T06:45:39Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2025-12-03T06:45:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s6bbq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ae9935e33badff0b016f8b5a02cb59d8b64451364581023ca3ec8e87fba0aa3a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ae9935e33badff0b016f8b5a02cb59d8b64451364581023ca3ec8e87fba0aa3a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T06:45:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T06:45:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s6bbq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T06:45:35Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-k9cmc\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:45:55Z is after 2025-08-24T17:21:41Z" Dec 03 06:45:55 crc kubenswrapper[4475]: I1203 06:45:55.704530 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:45:55 crc kubenswrapper[4475]: I1203 06:45:55.704555 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:45:55 crc kubenswrapper[4475]: I1203 06:45:55.704563 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:45:55 crc kubenswrapper[4475]: I1203 06:45:55.704574 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:45:55 crc kubenswrapper[4475]: I1203 06:45:55.704582 4475 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:45:55Z","lastTransitionTime":"2025-12-03T06:45:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:45:55 crc kubenswrapper[4475]: I1203 06:45:55.745343 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/7e9dd470-572a-4396-9be7-1a37e3c48977-metrics-certs\") pod \"network-metrics-daemon-hq2rn\" (UID: \"7e9dd470-572a-4396-9be7-1a37e3c48977\") " pod="openshift-multus/network-metrics-daemon-hq2rn" Dec 03 06:45:55 crc kubenswrapper[4475]: E1203 06:45:55.745430 4475 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Dec 03 06:45:55 crc kubenswrapper[4475]: E1203 06:45:55.745487 4475 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7e9dd470-572a-4396-9be7-1a37e3c48977-metrics-certs podName:7e9dd470-572a-4396-9be7-1a37e3c48977 nodeName:}" failed. No retries permitted until 2025-12-03 06:46:03.7454749 +0000 UTC m=+48.550373234 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/7e9dd470-572a-4396-9be7-1a37e3c48977-metrics-certs") pod "network-metrics-daemon-hq2rn" (UID: "7e9dd470-572a-4396-9be7-1a37e3c48977") : object "openshift-multus"/"metrics-daemon-secret" not registered Dec 03 06:45:55 crc kubenswrapper[4475]: I1203 06:45:55.806489 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:45:55 crc kubenswrapper[4475]: I1203 06:45:55.806511 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:45:55 crc kubenswrapper[4475]: I1203 06:45:55.806518 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:45:55 crc kubenswrapper[4475]: I1203 06:45:55.806529 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:45:55 crc kubenswrapper[4475]: I1203 06:45:55.806536 4475 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:45:55Z","lastTransitionTime":"2025-12-03T06:45:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:45:55 crc kubenswrapper[4475]: I1203 06:45:55.908062 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:45:55 crc kubenswrapper[4475]: I1203 06:45:55.908102 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:45:55 crc kubenswrapper[4475]: I1203 06:45:55.908110 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:45:55 crc kubenswrapper[4475]: I1203 06:45:55.908120 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:45:55 crc kubenswrapper[4475]: I1203 06:45:55.908128 4475 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:45:55Z","lastTransitionTime":"2025-12-03T06:45:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:45:56 crc kubenswrapper[4475]: I1203 06:45:56.009395 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:45:56 crc kubenswrapper[4475]: I1203 06:45:56.009423 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:45:56 crc kubenswrapper[4475]: I1203 06:45:56.009432 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:45:56 crc kubenswrapper[4475]: I1203 06:45:56.009444 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:45:56 crc kubenswrapper[4475]: I1203 06:45:56.009471 4475 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:45:56Z","lastTransitionTime":"2025-12-03T06:45:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:45:56 crc kubenswrapper[4475]: I1203 06:45:56.110780 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:45:56 crc kubenswrapper[4475]: I1203 06:45:56.110808 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:45:56 crc kubenswrapper[4475]: I1203 06:45:56.110816 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:45:56 crc kubenswrapper[4475]: I1203 06:45:56.110827 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:45:56 crc kubenswrapper[4475]: I1203 06:45:56.110834 4475 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:45:56Z","lastTransitionTime":"2025-12-03T06:45:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:45:56 crc kubenswrapper[4475]: I1203 06:45:56.212780 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:45:56 crc kubenswrapper[4475]: I1203 06:45:56.212806 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:45:56 crc kubenswrapper[4475]: I1203 06:45:56.212814 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:45:56 crc kubenswrapper[4475]: I1203 06:45:56.212825 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:45:56 crc kubenswrapper[4475]: I1203 06:45:56.212832 4475 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:45:56Z","lastTransitionTime":"2025-12-03T06:45:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:45:56 crc kubenswrapper[4475]: I1203 06:45:56.248850 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:45:56 crc kubenswrapper[4475]: I1203 06:45:56.248984 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:45:56 crc kubenswrapper[4475]: I1203 06:45:56.249055 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:45:56 crc kubenswrapper[4475]: I1203 06:45:56.249121 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:45:56 crc kubenswrapper[4475]: I1203 06:45:56.249230 4475 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:45:56Z","lastTransitionTime":"2025-12-03T06:45:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:45:56 crc kubenswrapper[4475]: E1203 06:45:56.257789 4475 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"7800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"24148068Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"8\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"24608868Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T06:45:56Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:56Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T06:45:56Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:56Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T06:45:56Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:56Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T06:45:56Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:56Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"b860fac6-8533-4b4b-bdad-0cb0561d1495\\\",\\\"systemUUID\\\":\\\"6c3f70a9-a9d8-4b80-a825-7a6426aa17aa\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:45:56Z is after 2025-08-24T17:21:41Z" Dec 03 06:45:56 crc kubenswrapper[4475]: I1203 06:45:56.260338 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:45:56 crc kubenswrapper[4475]: I1203 06:45:56.260435 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:45:56 crc kubenswrapper[4475]: I1203 06:45:56.260521 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:45:56 crc kubenswrapper[4475]: I1203 06:45:56.260582 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:45:56 crc kubenswrapper[4475]: I1203 06:45:56.260642 4475 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:45:56Z","lastTransitionTime":"2025-12-03T06:45:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:45:56 crc kubenswrapper[4475]: E1203 06:45:56.269911 4475 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"7800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"24148068Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"8\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"24608868Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T06:45:56Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:56Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T06:45:56Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:56Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T06:45:56Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:56Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T06:45:56Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:56Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"b860fac6-8533-4b4b-bdad-0cb0561d1495\\\",\\\"systemUUID\\\":\\\"6c3f70a9-a9d8-4b80-a825-7a6426aa17aa\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:45:56Z is after 2025-08-24T17:21:41Z" Dec 03 06:45:56 crc kubenswrapper[4475]: I1203 06:45:56.272236 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:45:56 crc kubenswrapper[4475]: I1203 06:45:56.272338 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:45:56 crc kubenswrapper[4475]: I1203 06:45:56.272397 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:45:56 crc kubenswrapper[4475]: I1203 06:45:56.272475 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:45:56 crc kubenswrapper[4475]: I1203 06:45:56.272540 4475 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:45:56Z","lastTransitionTime":"2025-12-03T06:45:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:45:56 crc kubenswrapper[4475]: E1203 06:45:56.280420 4475 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"7800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"24148068Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"8\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"24608868Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T06:45:56Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:56Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T06:45:56Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:56Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T06:45:56Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:56Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T06:45:56Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:56Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"b860fac6-8533-4b4b-bdad-0cb0561d1495\\\",\\\"systemUUID\\\":\\\"6c3f70a9-a9d8-4b80-a825-7a6426aa17aa\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:45:56Z is after 2025-08-24T17:21:41Z" Dec 03 06:45:56 crc kubenswrapper[4475]: I1203 06:45:56.282560 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:45:56 crc kubenswrapper[4475]: I1203 06:45:56.282630 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:45:56 crc kubenswrapper[4475]: I1203 06:45:56.282694 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:45:56 crc kubenswrapper[4475]: I1203 06:45:56.282755 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:45:56 crc kubenswrapper[4475]: I1203 06:45:56.282804 4475 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:45:56Z","lastTransitionTime":"2025-12-03T06:45:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:45:56 crc kubenswrapper[4475]: E1203 06:45:56.290596 4475 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{...}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:45:56Z is after 2025-08-24T17:21:41Z" Dec 03 06:45:56 crc kubenswrapper[4475]: I1203 06:45:56.292635 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:45:56 crc kubenswrapper[4475]: I1203 06:45:56.292657 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:45:56 crc kubenswrapper[4475]: I1203 06:45:56.292665 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:45:56 crc kubenswrapper[4475]: I1203 06:45:56.292674 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:45:56 crc kubenswrapper[4475]: I1203 06:45:56.292681 4475 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:45:56Z","lastTransitionTime":"2025-12-03T06:45:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:45:56 crc kubenswrapper[4475]: E1203 06:45:56.303342 4475 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{...}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:45:56Z is after 2025-08-24T17:21:41Z" Dec 03 06:45:56 crc kubenswrapper[4475]: E1203 06:45:56.303474 4475 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Dec 03 06:45:56 crc kubenswrapper[4475]: I1203 06:45:56.314498 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:45:56 crc kubenswrapper[4475]: I1203 06:45:56.314522 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:45:56 crc kubenswrapper[4475]: I1203 06:45:56.314530 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:45:56 crc kubenswrapper[4475]: I1203 06:45:56.314538 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:45:56 crc kubenswrapper[4475]: I1203 06:45:56.314546 4475 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:45:56Z","lastTransitionTime":"2025-12-03T06:45:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:45:56 crc kubenswrapper[4475]: I1203 06:45:56.416585 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:45:56 crc kubenswrapper[4475]: I1203 06:45:56.416609 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:45:56 crc kubenswrapper[4475]: I1203 06:45:56.416617 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:45:56 crc kubenswrapper[4475]: I1203 06:45:56.416627 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:45:56 crc kubenswrapper[4475]: I1203 06:45:56.416635 4475 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:45:56Z","lastTransitionTime":"2025-12-03T06:45:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 06:45:56 crc kubenswrapper[4475]: I1203 06:45:56.490691 4475 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 03 06:45:56 crc kubenswrapper[4475]: I1203 06:45:56.490690 4475 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 03 06:45:56 crc kubenswrapper[4475]: E1203 06:45:56.490860 4475 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 03 06:45:56 crc kubenswrapper[4475]: E1203 06:45:56.490794 4475 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 03 06:45:56 crc kubenswrapper[4475]: I1203 06:45:56.490697 4475 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 03 06:45:56 crc kubenswrapper[4475]: E1203 06:45:56.490916 4475 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 03 06:45:56 crc kubenswrapper[4475]: I1203 06:45:56.518401 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:45:56 crc kubenswrapper[4475]: I1203 06:45:56.518428 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:45:56 crc kubenswrapper[4475]: I1203 06:45:56.518438 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:45:56 crc kubenswrapper[4475]: I1203 06:45:56.518463 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:45:56 crc kubenswrapper[4475]: I1203 06:45:56.518473 4475 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:45:56Z","lastTransitionTime":"2025-12-03T06:45:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:45:56 crc kubenswrapper[4475]: I1203 06:45:56.620426 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:45:56 crc kubenswrapper[4475]: I1203 06:45:56.620476 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:45:56 crc kubenswrapper[4475]: I1203 06:45:56.620491 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:45:56 crc kubenswrapper[4475]: I1203 06:45:56.620502 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:45:56 crc kubenswrapper[4475]: I1203 06:45:56.620510 4475 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:45:56Z","lastTransitionTime":"2025-12-03T06:45:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:45:56 crc kubenswrapper[4475]: I1203 06:45:56.722488 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:45:56 crc kubenswrapper[4475]: I1203 06:45:56.722512 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:45:56 crc kubenswrapper[4475]: I1203 06:45:56.722519 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:45:56 crc kubenswrapper[4475]: I1203 06:45:56.722529 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:45:56 crc kubenswrapper[4475]: I1203 06:45:56.722538 4475 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:45:56Z","lastTransitionTime":"2025-12-03T06:45:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:45:56 crc kubenswrapper[4475]: I1203 06:45:56.824270 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:45:56 crc kubenswrapper[4475]: I1203 06:45:56.824333 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:45:56 crc kubenswrapper[4475]: I1203 06:45:56.824342 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:45:56 crc kubenswrapper[4475]: I1203 06:45:56.824352 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:45:56 crc kubenswrapper[4475]: I1203 06:45:56.824360 4475 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:45:56Z","lastTransitionTime":"2025-12-03T06:45:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:45:56 crc kubenswrapper[4475]: I1203 06:45:56.925745 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:45:56 crc kubenswrapper[4475]: I1203 06:45:56.925769 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:45:56 crc kubenswrapper[4475]: I1203 06:45:56.925777 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:45:56 crc kubenswrapper[4475]: I1203 06:45:56.925786 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:45:56 crc kubenswrapper[4475]: I1203 06:45:56.925794 4475 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:45:56Z","lastTransitionTime":"2025-12-03T06:45:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:45:57 crc kubenswrapper[4475]: I1203 06:45:57.027737 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:45:57 crc kubenswrapper[4475]: I1203 06:45:57.027761 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:45:57 crc kubenswrapper[4475]: I1203 06:45:57.027769 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:45:57 crc kubenswrapper[4475]: I1203 06:45:57.027778 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:45:57 crc kubenswrapper[4475]: I1203 06:45:57.027785 4475 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:45:57Z","lastTransitionTime":"2025-12-03T06:45:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:45:57 crc kubenswrapper[4475]: I1203 06:45:57.129846 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:45:57 crc kubenswrapper[4475]: I1203 06:45:57.129872 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:45:57 crc kubenswrapper[4475]: I1203 06:45:57.129879 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:45:57 crc kubenswrapper[4475]: I1203 06:45:57.129891 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:45:57 crc kubenswrapper[4475]: I1203 06:45:57.129898 4475 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:45:57Z","lastTransitionTime":"2025-12-03T06:45:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:45:57 crc kubenswrapper[4475]: I1203 06:45:57.231653 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:45:57 crc kubenswrapper[4475]: I1203 06:45:57.231810 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:45:57 crc kubenswrapper[4475]: I1203 06:45:57.231891 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:45:57 crc kubenswrapper[4475]: I1203 06:45:57.231986 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:45:57 crc kubenswrapper[4475]: I1203 06:45:57.232069 4475 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:45:57Z","lastTransitionTime":"2025-12-03T06:45:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:45:57 crc kubenswrapper[4475]: I1203 06:45:57.333793 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:45:57 crc kubenswrapper[4475]: I1203 06:45:57.333959 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:45:57 crc kubenswrapper[4475]: I1203 06:45:57.334048 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:45:57 crc kubenswrapper[4475]: I1203 06:45:57.334125 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:45:57 crc kubenswrapper[4475]: I1203 06:45:57.334201 4475 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:45:57Z","lastTransitionTime":"2025-12-03T06:45:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:45:57 crc kubenswrapper[4475]: I1203 06:45:57.436420 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:45:57 crc kubenswrapper[4475]: I1203 06:45:57.436469 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:45:57 crc kubenswrapper[4475]: I1203 06:45:57.436479 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:45:57 crc kubenswrapper[4475]: I1203 06:45:57.436494 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:45:57 crc kubenswrapper[4475]: I1203 06:45:57.436503 4475 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:45:57Z","lastTransitionTime":"2025-12-03T06:45:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 06:45:57 crc kubenswrapper[4475]: I1203 06:45:57.490341 4475 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-hq2rn" Dec 03 06:45:57 crc kubenswrapper[4475]: E1203 06:45:57.490474 4475 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-hq2rn" podUID="7e9dd470-572a-4396-9be7-1a37e3c48977" Dec 03 06:45:57 crc kubenswrapper[4475]: I1203 06:45:57.537898 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:45:57 crc kubenswrapper[4475]: I1203 06:45:57.537923 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:45:57 crc kubenswrapper[4475]: I1203 06:45:57.537932 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:45:57 crc kubenswrapper[4475]: I1203 06:45:57.537941 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:45:57 crc kubenswrapper[4475]: I1203 06:45:57.537949 4475 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:45:57Z","lastTransitionTime":"2025-12-03T06:45:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:45:57 crc kubenswrapper[4475]: I1203 06:45:57.639140 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:45:57 crc kubenswrapper[4475]: I1203 06:45:57.639324 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:45:57 crc kubenswrapper[4475]: I1203 06:45:57.639490 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:45:57 crc kubenswrapper[4475]: I1203 06:45:57.639635 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:45:57 crc kubenswrapper[4475]: I1203 06:45:57.639701 4475 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:45:57Z","lastTransitionTime":"2025-12-03T06:45:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:45:57 crc kubenswrapper[4475]: I1203 06:45:57.741738 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:45:57 crc kubenswrapper[4475]: I1203 06:45:57.741765 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:45:57 crc kubenswrapper[4475]: I1203 06:45:57.741773 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:45:57 crc kubenswrapper[4475]: I1203 06:45:57.741783 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:45:57 crc kubenswrapper[4475]: I1203 06:45:57.741790 4475 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:45:57Z","lastTransitionTime":"2025-12-03T06:45:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:45:57 crc kubenswrapper[4475]: I1203 06:45:57.843504 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:45:57 crc kubenswrapper[4475]: I1203 06:45:57.843532 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:45:57 crc kubenswrapper[4475]: I1203 06:45:57.843539 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:45:57 crc kubenswrapper[4475]: I1203 06:45:57.843550 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:45:57 crc kubenswrapper[4475]: I1203 06:45:57.843557 4475 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:45:57Z","lastTransitionTime":"2025-12-03T06:45:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:45:57 crc kubenswrapper[4475]: I1203 06:45:57.945373 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:45:57 crc kubenswrapper[4475]: I1203 06:45:57.945396 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:45:57 crc kubenswrapper[4475]: I1203 06:45:57.945403 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:45:57 crc kubenswrapper[4475]: I1203 06:45:57.945413 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:45:57 crc kubenswrapper[4475]: I1203 06:45:57.945420 4475 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:45:57Z","lastTransitionTime":"2025-12-03T06:45:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:45:58 crc kubenswrapper[4475]: I1203 06:45:58.047270 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:45:58 crc kubenswrapper[4475]: I1203 06:45:58.047305 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:45:58 crc kubenswrapper[4475]: I1203 06:45:58.047313 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:45:58 crc kubenswrapper[4475]: I1203 06:45:58.047323 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:45:58 crc kubenswrapper[4475]: I1203 06:45:58.047330 4475 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:45:58Z","lastTransitionTime":"2025-12-03T06:45:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:45:58 crc kubenswrapper[4475]: I1203 06:45:58.148958 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:45:58 crc kubenswrapper[4475]: I1203 06:45:58.148985 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:45:58 crc kubenswrapper[4475]: I1203 06:45:58.148995 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:45:58 crc kubenswrapper[4475]: I1203 06:45:58.149006 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:45:58 crc kubenswrapper[4475]: I1203 06:45:58.149014 4475 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:45:58Z","lastTransitionTime":"2025-12-03T06:45:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:45:58 crc kubenswrapper[4475]: I1203 06:45:58.250781 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:45:58 crc kubenswrapper[4475]: I1203 06:45:58.250823 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:45:58 crc kubenswrapper[4475]: I1203 06:45:58.250833 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:45:58 crc kubenswrapper[4475]: I1203 06:45:58.250847 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:45:58 crc kubenswrapper[4475]: I1203 06:45:58.250855 4475 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:45:58Z","lastTransitionTime":"2025-12-03T06:45:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:45:58 crc kubenswrapper[4475]: I1203 06:45:58.352624 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:45:58 crc kubenswrapper[4475]: I1203 06:45:58.352726 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:45:58 crc kubenswrapper[4475]: I1203 06:45:58.352789 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:45:58 crc kubenswrapper[4475]: I1203 06:45:58.352865 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:45:58 crc kubenswrapper[4475]: I1203 06:45:58.352926 4475 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:45:58Z","lastTransitionTime":"2025-12-03T06:45:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:45:58 crc kubenswrapper[4475]: I1203 06:45:58.454044 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:45:58 crc kubenswrapper[4475]: I1203 06:45:58.454065 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:45:58 crc kubenswrapper[4475]: I1203 06:45:58.454074 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:45:58 crc kubenswrapper[4475]: I1203 06:45:58.454084 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:45:58 crc kubenswrapper[4475]: I1203 06:45:58.454092 4475 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:45:58Z","lastTransitionTime":"2025-12-03T06:45:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 06:45:58 crc kubenswrapper[4475]: I1203 06:45:58.491142 4475 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 03 06:45:58 crc kubenswrapper[4475]: I1203 06:45:58.491167 4475 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 03 06:45:58 crc kubenswrapper[4475]: E1203 06:45:58.491229 4475 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 03 06:45:58 crc kubenswrapper[4475]: I1203 06:45:58.491319 4475 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 03 06:45:58 crc kubenswrapper[4475]: E1203 06:45:58.491399 4475 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 03 06:45:58 crc kubenswrapper[4475]: E1203 06:45:58.491494 4475 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 03 06:45:58 crc kubenswrapper[4475]: I1203 06:45:58.555948 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:45:58 crc kubenswrapper[4475]: I1203 06:45:58.555976 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:45:58 crc kubenswrapper[4475]: I1203 06:45:58.555985 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:45:58 crc kubenswrapper[4475]: I1203 06:45:58.555996 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:45:58 crc kubenswrapper[4475]: I1203 06:45:58.556006 4475 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:45:58Z","lastTransitionTime":"2025-12-03T06:45:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:45:58 crc kubenswrapper[4475]: I1203 06:45:58.658062 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:45:58 crc kubenswrapper[4475]: I1203 06:45:58.658088 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:45:58 crc kubenswrapper[4475]: I1203 06:45:58.658095 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:45:58 crc kubenswrapper[4475]: I1203 06:45:58.658107 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:45:58 crc kubenswrapper[4475]: I1203 06:45:58.658115 4475 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:45:58Z","lastTransitionTime":"2025-12-03T06:45:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:45:58 crc kubenswrapper[4475]: I1203 06:45:58.760141 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:45:58 crc kubenswrapper[4475]: I1203 06:45:58.760273 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:45:58 crc kubenswrapper[4475]: I1203 06:45:58.760357 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:45:58 crc kubenswrapper[4475]: I1203 06:45:58.760419 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:45:58 crc kubenswrapper[4475]: I1203 06:45:58.760496 4475 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:45:58Z","lastTransitionTime":"2025-12-03T06:45:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:45:58 crc kubenswrapper[4475]: I1203 06:45:58.862864 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:45:58 crc kubenswrapper[4475]: I1203 06:45:58.862897 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:45:58 crc kubenswrapper[4475]: I1203 06:45:58.862906 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:45:58 crc kubenswrapper[4475]: I1203 06:45:58.862917 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:45:58 crc kubenswrapper[4475]: I1203 06:45:58.862925 4475 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:45:58Z","lastTransitionTime":"2025-12-03T06:45:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:45:58 crc kubenswrapper[4475]: I1203 06:45:58.964231 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:45:58 crc kubenswrapper[4475]: I1203 06:45:58.964264 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:45:58 crc kubenswrapper[4475]: I1203 06:45:58.964273 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:45:58 crc kubenswrapper[4475]: I1203 06:45:58.964286 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:45:58 crc kubenswrapper[4475]: I1203 06:45:58.964294 4475 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:45:58Z","lastTransitionTime":"2025-12-03T06:45:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:45:59 crc kubenswrapper[4475]: I1203 06:45:59.065960 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:45:59 crc kubenswrapper[4475]: I1203 06:45:59.065991 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:45:59 crc kubenswrapper[4475]: I1203 06:45:59.065999 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:45:59 crc kubenswrapper[4475]: I1203 06:45:59.066011 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:45:59 crc kubenswrapper[4475]: I1203 06:45:59.066020 4475 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:45:59Z","lastTransitionTime":"2025-12-03T06:45:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:45:59 crc kubenswrapper[4475]: I1203 06:45:59.167969 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:45:59 crc kubenswrapper[4475]: I1203 06:45:59.168027 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:45:59 crc kubenswrapper[4475]: I1203 06:45:59.168038 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:45:59 crc kubenswrapper[4475]: I1203 06:45:59.168051 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:45:59 crc kubenswrapper[4475]: I1203 06:45:59.168061 4475 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:45:59Z","lastTransitionTime":"2025-12-03T06:45:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:45:59 crc kubenswrapper[4475]: I1203 06:45:59.269720 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:45:59 crc kubenswrapper[4475]: I1203 06:45:59.269814 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:45:59 crc kubenswrapper[4475]: I1203 06:45:59.269840 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:45:59 crc kubenswrapper[4475]: I1203 06:45:59.269852 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:45:59 crc kubenswrapper[4475]: I1203 06:45:59.269860 4475 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:45:59Z","lastTransitionTime":"2025-12-03T06:45:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:45:59 crc kubenswrapper[4475]: I1203 06:45:59.371611 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:45:59 crc kubenswrapper[4475]: I1203 06:45:59.371644 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:45:59 crc kubenswrapper[4475]: I1203 06:45:59.371653 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:45:59 crc kubenswrapper[4475]: I1203 06:45:59.371685 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:45:59 crc kubenswrapper[4475]: I1203 06:45:59.371695 4475 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:45:59Z","lastTransitionTime":"2025-12-03T06:45:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:45:59 crc kubenswrapper[4475]: I1203 06:45:59.473955 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:45:59 crc kubenswrapper[4475]: I1203 06:45:59.474001 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:45:59 crc kubenswrapper[4475]: I1203 06:45:59.474011 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:45:59 crc kubenswrapper[4475]: I1203 06:45:59.474023 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:45:59 crc kubenswrapper[4475]: I1203 06:45:59.474030 4475 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:45:59Z","lastTransitionTime":"2025-12-03T06:45:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 06:45:59 crc kubenswrapper[4475]: I1203 06:45:59.490232 4475 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-hq2rn" Dec 03 06:45:59 crc kubenswrapper[4475]: E1203 06:45:59.490426 4475 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-hq2rn" podUID="7e9dd470-572a-4396-9be7-1a37e3c48977" Dec 03 06:45:59 crc kubenswrapper[4475]: I1203 06:45:59.575670 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:45:59 crc kubenswrapper[4475]: I1203 06:45:59.575698 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:45:59 crc kubenswrapper[4475]: I1203 06:45:59.575706 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:45:59 crc kubenswrapper[4475]: I1203 06:45:59.575716 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:45:59 crc kubenswrapper[4475]: I1203 06:45:59.575724 4475 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:45:59Z","lastTransitionTime":"2025-12-03T06:45:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:45:59 crc kubenswrapper[4475]: I1203 06:45:59.676896 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:45:59 crc kubenswrapper[4475]: I1203 06:45:59.676926 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:45:59 crc kubenswrapper[4475]: I1203 06:45:59.676934 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:45:59 crc kubenswrapper[4475]: I1203 06:45:59.676947 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:45:59 crc kubenswrapper[4475]: I1203 06:45:59.676955 4475 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:45:59Z","lastTransitionTime":"2025-12-03T06:45:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:45:59 crc kubenswrapper[4475]: I1203 06:45:59.778347 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:45:59 crc kubenswrapper[4475]: I1203 06:45:59.778375 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:45:59 crc kubenswrapper[4475]: I1203 06:45:59.778383 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:45:59 crc kubenswrapper[4475]: I1203 06:45:59.778395 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:45:59 crc kubenswrapper[4475]: I1203 06:45:59.778402 4475 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:45:59Z","lastTransitionTime":"2025-12-03T06:45:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:45:59 crc kubenswrapper[4475]: I1203 06:45:59.880406 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:45:59 crc kubenswrapper[4475]: I1203 06:45:59.880432 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:45:59 crc kubenswrapper[4475]: I1203 06:45:59.880440 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:45:59 crc kubenswrapper[4475]: I1203 06:45:59.880466 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:45:59 crc kubenswrapper[4475]: I1203 06:45:59.880474 4475 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:45:59Z","lastTransitionTime":"2025-12-03T06:45:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:45:59 crc kubenswrapper[4475]: I1203 06:45:59.982061 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:45:59 crc kubenswrapper[4475]: I1203 06:45:59.982095 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:45:59 crc kubenswrapper[4475]: I1203 06:45:59.982104 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:45:59 crc kubenswrapper[4475]: I1203 06:45:59.982118 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:45:59 crc kubenswrapper[4475]: I1203 06:45:59.982127 4475 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:45:59Z","lastTransitionTime":"2025-12-03T06:45:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:46:00 crc kubenswrapper[4475]: I1203 06:46:00.083500 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:46:00 crc kubenswrapper[4475]: I1203 06:46:00.083527 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:46:00 crc kubenswrapper[4475]: I1203 06:46:00.083535 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:46:00 crc kubenswrapper[4475]: I1203 06:46:00.083545 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:46:00 crc kubenswrapper[4475]: I1203 06:46:00.083552 4475 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:46:00Z","lastTransitionTime":"2025-12-03T06:46:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:46:00 crc kubenswrapper[4475]: I1203 06:46:00.185328 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:46:00 crc kubenswrapper[4475]: I1203 06:46:00.185366 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:46:00 crc kubenswrapper[4475]: I1203 06:46:00.185377 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:46:00 crc kubenswrapper[4475]: I1203 06:46:00.185392 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:46:00 crc kubenswrapper[4475]: I1203 06:46:00.185404 4475 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:46:00Z","lastTransitionTime":"2025-12-03T06:46:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:46:00 crc kubenswrapper[4475]: I1203 06:46:00.286948 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:46:00 crc kubenswrapper[4475]: I1203 06:46:00.286979 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:46:00 crc kubenswrapper[4475]: I1203 06:46:00.286987 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:46:00 crc kubenswrapper[4475]: I1203 06:46:00.286999 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:46:00 crc kubenswrapper[4475]: I1203 06:46:00.287008 4475 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:46:00Z","lastTransitionTime":"2025-12-03T06:46:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:46:00 crc kubenswrapper[4475]: I1203 06:46:00.388951 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:46:00 crc kubenswrapper[4475]: I1203 06:46:00.388985 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:46:00 crc kubenswrapper[4475]: I1203 06:46:00.388993 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:46:00 crc kubenswrapper[4475]: I1203 06:46:00.389005 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:46:00 crc kubenswrapper[4475]: I1203 06:46:00.389014 4475 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:46:00Z","lastTransitionTime":"2025-12-03T06:46:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 06:46:00 crc kubenswrapper[4475]: I1203 06:46:00.490114 4475 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 03 06:46:00 crc kubenswrapper[4475]: I1203 06:46:00.490140 4475 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 03 06:46:00 crc kubenswrapper[4475]: I1203 06:46:00.490182 4475 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 03 06:46:00 crc kubenswrapper[4475]: E1203 06:46:00.490281 4475 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 03 06:46:00 crc kubenswrapper[4475]: I1203 06:46:00.490348 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:46:00 crc kubenswrapper[4475]: I1203 06:46:00.490365 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:46:00 crc kubenswrapper[4475]: I1203 06:46:00.490374 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:46:00 crc kubenswrapper[4475]: I1203 06:46:00.490385 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:46:00 crc kubenswrapper[4475]: I1203 06:46:00.490394 4475 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:46:00Z","lastTransitionTime":"2025-12-03T06:46:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:46:00 crc kubenswrapper[4475]: E1203 06:46:00.490435 4475 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 03 06:46:00 crc kubenswrapper[4475]: E1203 06:46:00.490378 4475 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 03 06:46:00 crc kubenswrapper[4475]: I1203 06:46:00.592156 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:46:00 crc kubenswrapper[4475]: I1203 06:46:00.592202 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:46:00 crc kubenswrapper[4475]: I1203 06:46:00.592212 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:46:00 crc kubenswrapper[4475]: I1203 06:46:00.592226 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:46:00 crc kubenswrapper[4475]: I1203 06:46:00.592234 4475 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:46:00Z","lastTransitionTime":"2025-12-03T06:46:00Z","reason":"KubeletNotReady","message":"container runtime network not 
ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 06:46:00 crc kubenswrapper[4475]: I1203 06:46:00.694575 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:46:00 crc kubenswrapper[4475]: I1203 06:46:00.694622 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:46:00 crc kubenswrapper[4475]: I1203 06:46:00.694631 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:46:00 crc kubenswrapper[4475]: I1203 06:46:00.694644 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:46:00 crc kubenswrapper[4475]: I1203 06:46:00.694653 4475 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:46:00Z","lastTransitionTime":"2025-12-03T06:46:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:46:00 crc kubenswrapper[4475]: I1203 06:46:00.796178 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:46:00 crc kubenswrapper[4475]: I1203 06:46:00.796209 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:46:00 crc kubenswrapper[4475]: I1203 06:46:00.796218 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:46:00 crc kubenswrapper[4475]: I1203 06:46:00.796232 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:46:00 crc kubenswrapper[4475]: I1203 06:46:00.796241 4475 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:46:00Z","lastTransitionTime":"2025-12-03T06:46:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:46:00 crc kubenswrapper[4475]: I1203 06:46:00.898061 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:46:00 crc kubenswrapper[4475]: I1203 06:46:00.898102 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:46:00 crc kubenswrapper[4475]: I1203 06:46:00.898115 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:46:00 crc kubenswrapper[4475]: I1203 06:46:00.898131 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:46:00 crc kubenswrapper[4475]: I1203 06:46:00.898143 4475 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:46:00Z","lastTransitionTime":"2025-12-03T06:46:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:46:01 crc kubenswrapper[4475]: I1203 06:46:01.000044 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:46:01 crc kubenswrapper[4475]: I1203 06:46:01.000071 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:46:01 crc kubenswrapper[4475]: I1203 06:46:01.000079 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:46:01 crc kubenswrapper[4475]: I1203 06:46:01.000091 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:46:01 crc kubenswrapper[4475]: I1203 06:46:01.000100 4475 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:46:01Z","lastTransitionTime":"2025-12-03T06:46:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:46:01 crc kubenswrapper[4475]: I1203 06:46:01.101421 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:46:01 crc kubenswrapper[4475]: I1203 06:46:01.101444 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:46:01 crc kubenswrapper[4475]: I1203 06:46:01.101468 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:46:01 crc kubenswrapper[4475]: I1203 06:46:01.101478 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:46:01 crc kubenswrapper[4475]: I1203 06:46:01.101485 4475 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:46:01Z","lastTransitionTime":"2025-12-03T06:46:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:46:01 crc kubenswrapper[4475]: I1203 06:46:01.203529 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:46:01 crc kubenswrapper[4475]: I1203 06:46:01.203557 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:46:01 crc kubenswrapper[4475]: I1203 06:46:01.203572 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:46:01 crc kubenswrapper[4475]: I1203 06:46:01.203585 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:46:01 crc kubenswrapper[4475]: I1203 06:46:01.203634 4475 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:46:01Z","lastTransitionTime":"2025-12-03T06:46:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:46:01 crc kubenswrapper[4475]: I1203 06:46:01.305814 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:46:01 crc kubenswrapper[4475]: I1203 06:46:01.305846 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:46:01 crc kubenswrapper[4475]: I1203 06:46:01.305854 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:46:01 crc kubenswrapper[4475]: I1203 06:46:01.305865 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:46:01 crc kubenswrapper[4475]: I1203 06:46:01.305873 4475 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:46:01Z","lastTransitionTime":"2025-12-03T06:46:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:46:01 crc kubenswrapper[4475]: I1203 06:46:01.407737 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:46:01 crc kubenswrapper[4475]: I1203 06:46:01.407769 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:46:01 crc kubenswrapper[4475]: I1203 06:46:01.407777 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:46:01 crc kubenswrapper[4475]: I1203 06:46:01.407789 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:46:01 crc kubenswrapper[4475]: I1203 06:46:01.407797 4475 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:46:01Z","lastTransitionTime":"2025-12-03T06:46:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 06:46:01 crc kubenswrapper[4475]: I1203 06:46:01.490789 4475 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-hq2rn" Dec 03 06:46:01 crc kubenswrapper[4475]: E1203 06:46:01.490902 4475 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-hq2rn" podUID="7e9dd470-572a-4396-9be7-1a37e3c48977" Dec 03 06:46:01 crc kubenswrapper[4475]: I1203 06:46:01.510082 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:46:01 crc kubenswrapper[4475]: I1203 06:46:01.510114 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:46:01 crc kubenswrapper[4475]: I1203 06:46:01.510124 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:46:01 crc kubenswrapper[4475]: I1203 06:46:01.510135 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:46:01 crc kubenswrapper[4475]: I1203 06:46:01.510144 4475 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:46:01Z","lastTransitionTime":"2025-12-03T06:46:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:46:01 crc kubenswrapper[4475]: I1203 06:46:01.612228 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:46:01 crc kubenswrapper[4475]: I1203 06:46:01.612252 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:46:01 crc kubenswrapper[4475]: I1203 06:46:01.612260 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:46:01 crc kubenswrapper[4475]: I1203 06:46:01.612278 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:46:01 crc kubenswrapper[4475]: I1203 06:46:01.612297 4475 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:46:01Z","lastTransitionTime":"2025-12-03T06:46:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:46:01 crc kubenswrapper[4475]: I1203 06:46:01.713428 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:46:01 crc kubenswrapper[4475]: I1203 06:46:01.713476 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:46:01 crc kubenswrapper[4475]: I1203 06:46:01.713485 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:46:01 crc kubenswrapper[4475]: I1203 06:46:01.713495 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:46:01 crc kubenswrapper[4475]: I1203 06:46:01.713502 4475 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:46:01Z","lastTransitionTime":"2025-12-03T06:46:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:46:01 crc kubenswrapper[4475]: I1203 06:46:01.815438 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:46:01 crc kubenswrapper[4475]: I1203 06:46:01.815598 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:46:01 crc kubenswrapper[4475]: I1203 06:46:01.815667 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:46:01 crc kubenswrapper[4475]: I1203 06:46:01.815732 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:46:01 crc kubenswrapper[4475]: I1203 06:46:01.815785 4475 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:46:01Z","lastTransitionTime":"2025-12-03T06:46:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:46:01 crc kubenswrapper[4475]: I1203 06:46:01.917529 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:46:01 crc kubenswrapper[4475]: I1203 06:46:01.917555 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:46:01 crc kubenswrapper[4475]: I1203 06:46:01.917563 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:46:01 crc kubenswrapper[4475]: I1203 06:46:01.917574 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:46:01 crc kubenswrapper[4475]: I1203 06:46:01.917581 4475 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:46:01Z","lastTransitionTime":"2025-12-03T06:46:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:46:02 crc kubenswrapper[4475]: I1203 06:46:02.019529 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:46:02 crc kubenswrapper[4475]: I1203 06:46:02.019555 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:46:02 crc kubenswrapper[4475]: I1203 06:46:02.019581 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:46:02 crc kubenswrapper[4475]: I1203 06:46:02.019591 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:46:02 crc kubenswrapper[4475]: I1203 06:46:02.019613 4475 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:46:02Z","lastTransitionTime":"2025-12-03T06:46:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:46:02 crc kubenswrapper[4475]: I1203 06:46:02.121506 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:46:02 crc kubenswrapper[4475]: I1203 06:46:02.121539 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:46:02 crc kubenswrapper[4475]: I1203 06:46:02.121547 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:46:02 crc kubenswrapper[4475]: I1203 06:46:02.121558 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:46:02 crc kubenswrapper[4475]: I1203 06:46:02.121566 4475 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:46:02Z","lastTransitionTime":"2025-12-03T06:46:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:46:02 crc kubenswrapper[4475]: I1203 06:46:02.223185 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:46:02 crc kubenswrapper[4475]: I1203 06:46:02.223207 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:46:02 crc kubenswrapper[4475]: I1203 06:46:02.223215 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:46:02 crc kubenswrapper[4475]: I1203 06:46:02.223225 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:46:02 crc kubenswrapper[4475]: I1203 06:46:02.223235 4475 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:46:02Z","lastTransitionTime":"2025-12-03T06:46:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:46:02 crc kubenswrapper[4475]: I1203 06:46:02.325079 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:46:02 crc kubenswrapper[4475]: I1203 06:46:02.325201 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:46:02 crc kubenswrapper[4475]: I1203 06:46:02.325265 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:46:02 crc kubenswrapper[4475]: I1203 06:46:02.325338 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:46:02 crc kubenswrapper[4475]: I1203 06:46:02.325402 4475 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:46:02Z","lastTransitionTime":"2025-12-03T06:46:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:46:02 crc kubenswrapper[4475]: I1203 06:46:02.426585 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:46:02 crc kubenswrapper[4475]: I1203 06:46:02.426612 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:46:02 crc kubenswrapper[4475]: I1203 06:46:02.426622 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:46:02 crc kubenswrapper[4475]: I1203 06:46:02.426632 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:46:02 crc kubenswrapper[4475]: I1203 06:46:02.426641 4475 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:46:02Z","lastTransitionTime":"2025-12-03T06:46:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 06:46:02 crc kubenswrapper[4475]: I1203 06:46:02.490595 4475 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 03 06:46:02 crc kubenswrapper[4475]: I1203 06:46:02.490623 4475 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 03 06:46:02 crc kubenswrapper[4475]: I1203 06:46:02.490640 4475 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 03 06:46:02 crc kubenswrapper[4475]: E1203 06:46:02.490704 4475 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 03 06:46:02 crc kubenswrapper[4475]: E1203 06:46:02.490755 4475 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 03 06:46:02 crc kubenswrapper[4475]: E1203 06:46:02.490809 4475 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 03 06:46:02 crc kubenswrapper[4475]: I1203 06:46:02.527861 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:46:02 crc kubenswrapper[4475]: I1203 06:46:02.527887 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:46:02 crc kubenswrapper[4475]: I1203 06:46:02.527896 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:46:02 crc kubenswrapper[4475]: I1203 06:46:02.527908 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:46:02 crc kubenswrapper[4475]: I1203 06:46:02.527917 4475 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:46:02Z","lastTransitionTime":"2025-12-03T06:46:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:46:02 crc kubenswrapper[4475]: I1203 06:46:02.606634 4475 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Dec 03 06:46:02 crc kubenswrapper[4475]: I1203 06:46:02.612943 4475 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-scheduler/openshift-kube-scheduler-crc"] Dec 03 06:46:02 crc kubenswrapper[4475]: I1203 06:46:02.620856 4475 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"897f1a97-930a-4c3c-8804-d7cd6006ae9c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fbb015d3e05f9f94fc225cce6e24bc4a5df0bfc5aaea15fe120e2cc4b8f02902\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"runnin
g\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:45:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://da747a5ea4f790c71d99693c4bd79a1074f756a20f628fa63e8bad9a713645fc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:45:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6bf56315b6ad05ea9af0319db29b919ed0332d2a671c5ba94ea325bd45ef5703\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:45:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\
":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e045b99328661616ea0e44cd50bd394a403836eede05459d117567c191401172\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:45:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://054b1d2565cc9690152740f71682028595283525344a38ccea66c1f072eae92b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:45:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d5e0ad88c2e55994f952b46c2e806792d8fcbd79a9
01810aef92e46067cc7b92\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d5e0ad88c2e55994f952b46c2e806792d8fcbd79a901810aef92e46067cc7b92\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T06:45:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T06:45:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://22796f78d3d551f1ee271ca8581e196f142e70622944154f7d408a88c098f53b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://22796f78d3d551f1ee271ca8581e196f142e70622944154f7d408a88c098f53b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T06:45:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T06:45:16Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://9fb973559072f07252dcf50bda74d422ea2ed50000c02105381f8d21e5ff9888\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastStat
e\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9fb973559072f07252dcf50bda74d422ea2ed50000c02105381f8d21e5ff9888\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T06:45:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T06:45:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T06:45:15Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:46:02Z is after 2025-08-24T17:21:41Z" Dec 03 06:46:02 crc kubenswrapper[4475]: I1203 06:46:02.626978 4475 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-dqbgx" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9ef36226-4b8b-4a7b-a87f-daa9dda6e70b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5dc78fa3b07b9a5535f697323e9ed322ceefdc8798157160a05eb71017ac3a29\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:45:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wjjp4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T06:45:38Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-dqbgx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:46:02Z is after 2025-08-24T17:21:41Z" Dec 03 06:46:02 crc kubenswrapper[4475]: I1203 06:46:02.629099 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:46:02 crc kubenswrapper[4475]: I1203 06:46:02.629127 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:46:02 crc kubenswrapper[4475]: I1203 06:46:02.629134 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:46:02 crc kubenswrapper[4475]: I1203 06:46:02.629145 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:46:02 crc kubenswrapper[4475]: I1203 06:46:02.629153 4475 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:46:02Z","lastTransitionTime":"2025-12-03T06:46:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:46:02 crc kubenswrapper[4475]: I1203 06:46:02.633941 4475 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-pcw7j" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8c1979d0-303c-4cf6-9087-3cb2e1aac73b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://eebaa73cf4e1efd781b258dd26910dc004392716180b14a7e64e89a03f2032a1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:45:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nrf7r\\\",\\\"re
adOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T06:45:34Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-pcw7j\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:46:02Z is after 2025-08-24T17:21:41Z" Dec 03 06:46:02 crc kubenswrapper[4475]: I1203 06:46:02.641313 4475 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-9b2j8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f3a17c67-95e0-4889-8a30-64c08b6720f4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d2d627e2c307a8db9c86e8020f2b1c25c6e061e0c6460be63e231566488beaca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp
-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:45:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6pdk6\\\",\\\"readO
nly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T06:45:34Z\\\"}}\" for pod \"openshift-multus\"/\"multus-9b2j8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:46:02Z is after 2025-08-24T17:21:41Z" Dec 03 06:46:02 crc kubenswrapper[4475]: I1203 06:46:02.652491 4475 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-g9t4l" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8f42839e-dbc4-445a-a15b-c3aa14813958\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:35Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:35Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://53948489397bbbfdf5f766211088d7f12fcd2dfbc8c3da6493e5abc49e3b41f5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:45:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ppdm2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a5090474cca8b8e2ed539ea74377506638d300be7eb750b3f3285477d8c9a375\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:45:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ppdm2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://60d3ec7cab1f249e81ae1db9ab97fa02e8b3c9d8376af4c6682dc3fc6f9d6d92\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:45:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ppdm2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b3243c863a4fb593b39fc3e3b835f647e9373d8b2dec69c5ff7657ed73c8f78a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:45:36Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ppdm2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://32897756f3658fda95db77180a0553a9d8656ed49c3ae5a017d32f5c5133a5a9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:45:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ppdm2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5e288f95676d5823cd3cb005318489d2f629a8fb74ad17ce6a67978d76006192\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:45:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ppdm2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6ab275be2e84c1b20f69d740d454b4916d2fb2af864c685198786088b835b49c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6ab275be2e84c1b20f69d740d454b4916d2fb2af864c685198786088b835b49c\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-03T06:45:44Z\\\",\\\"message\\\":\\\"ault: []services.LB{services.LB{Name:\\\\\\\"Service_openshift-apiserver/check-endpoints_TCP_cluster\\\\\\\", UUID:\\\\\\\"\\\\\\\", Protocol:\\\\\\\"TCP\\\\\\\", ExternalIDs:map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-apiserver/check-endpoints\\\\\\\"}, 
Opts:services.LBOpts{Reject:true, EmptyLBEvents:false, AffinityTimeOut:0, SkipSNAT:false, Template:false, AddressFamily:\\\\\\\"\\\\\\\"}, Rules:[]services.LBRule{services.LBRule{Source:services.Addr{IP:\\\\\\\"10.217.5.139\\\\\\\", Port:17698, Template:(*services.Template)(nil)}, Targets:[]services.Addr{}}}, Templates:services.TemplateMap(nil), Switches:[]string{}, Routers:[]string{}, Groups:[]string{\\\\\\\"clusterLBGroup\\\\\\\"}}}\\\\nF1203 06:45:44.234950 5806 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certifi\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-03T06:45:43Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=ovnkube-controller 
pod=ovnkube-node-g9t4l_openshift-ovn-kubernetes(8f42839e-dbc4-445a-a15b-c3aa14813958)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ppdm2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://66a9c7568957099255bc910496da695e2af0122f2c853c3e221c666d7c2dee78\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:45:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ppdm2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://400610ebcdc7d47ecc1345287847a1909871411a12cdb3cbf895e05039b81c2b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://400610ebcdc7d47ecc
1345287847a1909871411a12cdb3cbf895e05039b81c2b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T06:45:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T06:45:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ppdm2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T06:45:35Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-g9t4l\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:46:02Z is after 2025-08-24T17:21:41Z" Dec 03 06:46:02 crc kubenswrapper[4475]: I1203 06:46:02.660418 4475 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:33Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:33Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2f651c16a4a98ff0a9b4783e60ece4c410d5fcb7d05ad42bf7842d8bb8a99f27\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:45:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-12-03T06:46:02Z is after 2025-08-24T17:21:41Z" Dec 03 06:46:02 crc kubenswrapper[4475]: I1203 06:46:02.667788 4475 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:33Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:33Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://30d9a05de148a1dbe0fa8f07bbc5f4f2c3cba395d686af03f2da63f8cdfe431c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:45:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"c
ri-o://07cf8d993193bca34b30ea77c473af45652fde6e73d0586efb78c14b9d003e22\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:45:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:46:02Z is after 2025-08-24T17:21:41Z" Dec 03 06:46:02 crc kubenswrapper[4475]: I1203 06:46:02.674556 4475 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:34Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:34Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6444fe7571ebb90d4ff4b30dc1a397023310b50b1816d0197cb545b4f5f7480f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:45:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-12-03T06:46:02Z is after 2025-08-24T17:21:41Z" Dec 03 06:46:02 crc kubenswrapper[4475]: I1203 06:46:02.681441 4475 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:32Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:32Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:46:02Z is after 2025-08-24T17:21:41Z" Dec 03 06:46:02 crc kubenswrapper[4475]: I1203 06:46:02.689144 4475 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-hq2rn" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7e9dd470-572a-4396-9be7-1a37e3c48977\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:47Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:47Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cg4hm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cg4hm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T06:45:47Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-hq2rn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:46:02Z is after 2025-08-24T17:21:41Z" Dec 03 06:46:02 crc 
kubenswrapper[4475]: I1203 06:46:02.696467 4475 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:32Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:32Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:46:02Z is after 2025-08-24T17:21:41Z" Dec 03 06:46:02 crc kubenswrapper[4475]: I1203 06:46:02.704820 4475 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-k9cmc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7168f008-1b03-40cf-94fa-a71d470454bf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://625db083ebf24244e0b28ac937bfa2554497ca35b8f7a1fee0ac739d647c70de\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:45:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s6bbq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://31584b054f88aa7f7e4f1096e2b11acf6f106b7f2e4ced19768808e5df1a6acc\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://31584b054f88aa7f7e4f1096e2b11acf6f106b7f2e4ced19768808e5df1a6acc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T06:45:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T06:45:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s6bbq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9a644e827feb786d7298e41022ef3bc0d2483279c106dddea8e2c7a3c62c3c0d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9a644e827feb786d7298e41022ef3bc0d2483279c106dddea8e2c7a3c62c3c0d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T06:45:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T06:45:36Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s6bbq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://742f2f4dc23fff3df8e6d67902ef721b3db1823653b11a69faabdaf8d7650667\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://742f2f4dc23fff3df8e6d67902ef721b3db1823653b11a69faabdaf8d7650667\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T06:45:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T06:45:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s6bbq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2e5e8
74d26bf8bc806d74d55a8b9306cc30cca122d2ae0731b0a76ae7ac30450\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2e5e874d26bf8bc806d74d55a8b9306cc30cca122d2ae0731b0a76ae7ac30450\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T06:45:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T06:45:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s6bbq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d644ab44eabce045c9f9b23fab29e574e2f9f49c0cc14b830560996a0ec98880\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d644ab44eabce045c9f9b23fab29e574e2f9f49c0cc14b830560996a0ec98880\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T06:45:39Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2025-12-03T06:45:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s6bbq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ae9935e33badff0b016f8b5a02cb59d8b64451364581023ca3ec8e87fba0aa3a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ae9935e33badff0b016f8b5a02cb59d8b64451364581023ca3ec8e87fba0aa3a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T06:45:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T06:45:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s6bbq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T06:45:35Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-k9cmc\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:46:02Z is after 2025-08-24T17:21:41Z" Dec 03 06:46:02 crc kubenswrapper[4475]: I1203 06:46:02.713716 4475 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"57d2f580-9528-4200-b0a4-797fed1ae972\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://822cdbfb2e81d80c5de0253daa42f2a5c89e9cd0eb8a5c3cf620780d17f9a6d0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:45:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\"
:\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d66a9136874b2e25c94cd291aa6d7f4694ac409f16766fd69c8aab8068a441fb\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:45:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e40c4f29925f494c0f5f01e2ecbcd2e4db2a5f3911a55a874c6d0006f01982de\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:45:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7fc0ee9e5a408a0a9e701afaf1db7bc3f58fd1830044730e9c680664642b5e4e\\\",\\\"image\\\":\\\"quay.io/crcont/
openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:45:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1dd8bd42f01469966b55416fc8af1dd71d341c774263bb3a56190af4cd9e7daa\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:45:17Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5da1155d7b5e933e5db3acc4c1a3fa1b3b90fd79289641f9a3d1290956128628\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\
\\"containerID\\\":\\\"cri-o://5da1155d7b5e933e5db3acc4c1a3fa1b3b90fd79289641f9a3d1290956128628\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T06:45:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T06:45:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T06:45:15Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:46:02Z is after 2025-08-24T17:21:41Z" Dec 03 06:46:02 crc kubenswrapper[4475]: I1203 06:46:02.721872 4475 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a0af3d80-5aae-4d3b-a974-490687df49f3\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fa848c68a20d5db5c603cafa808518de84e427cbeea4bbc1be31151e6f839b3e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:45:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fe3e0d5fed18fddd7a1174f7a9f12290ce318e9a0de40fe432c79f6f2e24a608\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:45:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://43c05977da6544bc781a279fcddb3279dfee510fdd0a6f4f1a22b8629f17475f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:45:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ef987b2e9a0fa630edf6d5c06d5f47c5debd1b75d4626aefe7d8ef44bb974eb8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-12-03T06:45:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T06:45:15Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:46:02Z is after 2025-08-24T17:21:41Z" Dec 03 06:46:02 crc kubenswrapper[4475]: I1203 06:46:02.730619 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:46:02 crc kubenswrapper[4475]: I1203 06:46:02.730645 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:46:02 crc kubenswrapper[4475]: I1203 06:46:02.730653 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:46:02 crc kubenswrapper[4475]: I1203 06:46:02.730666 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:46:02 crc kubenswrapper[4475]: I1203 06:46:02.730674 4475 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:46:02Z","lastTransitionTime":"2025-12-03T06:46:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns 
error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 06:46:02 crc kubenswrapper[4475]: I1203 06:46:02.734064 4475 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:32Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:32Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:46:02Z is after 2025-08-24T17:21:41Z" Dec 03 06:46:02 crc kubenswrapper[4475]: I1203 06:46:02.744709 4475 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-tjbzg" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"91aee7be-4a52-4598-803f-2deebe0674de\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f13f644093fd1214d8fb39853857b4113dd7fde64f1a60ff6848fd4c5350f5b6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:45:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xvqvg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://159d103ae2d5d19ea94c57a59b534773f0e32f4c
b379a412b63ca743e221096e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:45:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xvqvg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T06:45:35Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-tjbzg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:46:02Z is after 2025-08-24T17:21:41Z" Dec 03 06:46:02 crc kubenswrapper[4475]: I1203 06:46:02.751773 4475 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-sbkp5" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b1df0a77-f3cc-49ab-9fbb-8a4c7608291b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5938dd3c72bee55a3a07312d31a0eaf2df226bb931b07300d71b6e7ff69c905b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:45:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-65wzb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4408ad7b7f122c0364b95e0e9761bc28dfb02
e7ea00537a70fc031c16b38be6b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:45:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-65wzb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T06:45:46Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-sbkp5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:46:02Z is after 2025-08-24T17:21:41Z" Dec 03 06:46:02 crc kubenswrapper[4475]: I1203 06:46:02.832367 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:46:02 crc kubenswrapper[4475]: I1203 06:46:02.832394 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:46:02 crc kubenswrapper[4475]: I1203 06:46:02.832403 4475 kubelet_node_status.go:724] "Recording 
event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:46:02 crc kubenswrapper[4475]: I1203 06:46:02.832416 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:46:02 crc kubenswrapper[4475]: I1203 06:46:02.832426 4475 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:46:02Z","lastTransitionTime":"2025-12-03T06:46:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 06:46:02 crc kubenswrapper[4475]: I1203 06:46:02.934365 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:46:02 crc kubenswrapper[4475]: I1203 06:46:02.934394 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:46:02 crc kubenswrapper[4475]: I1203 06:46:02.934403 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:46:02 crc kubenswrapper[4475]: I1203 06:46:02.934415 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:46:02 crc kubenswrapper[4475]: I1203 06:46:02.934423 4475 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:46:02Z","lastTransitionTime":"2025-12-03T06:46:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:46:03 crc kubenswrapper[4475]: I1203 06:46:03.036075 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:46:03 crc kubenswrapper[4475]: I1203 06:46:03.036097 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:46:03 crc kubenswrapper[4475]: I1203 06:46:03.036105 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:46:03 crc kubenswrapper[4475]: I1203 06:46:03.036115 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:46:03 crc kubenswrapper[4475]: I1203 06:46:03.036122 4475 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:46:03Z","lastTransitionTime":"2025-12-03T06:46:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:46:03 crc kubenswrapper[4475]: I1203 06:46:03.137858 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:46:03 crc kubenswrapper[4475]: I1203 06:46:03.137881 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:46:03 crc kubenswrapper[4475]: I1203 06:46:03.137889 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:46:03 crc kubenswrapper[4475]: I1203 06:46:03.137899 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:46:03 crc kubenswrapper[4475]: I1203 06:46:03.137906 4475 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:46:03Z","lastTransitionTime":"2025-12-03T06:46:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:46:03 crc kubenswrapper[4475]: I1203 06:46:03.239261 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:46:03 crc kubenswrapper[4475]: I1203 06:46:03.239294 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:46:03 crc kubenswrapper[4475]: I1203 06:46:03.239304 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:46:03 crc kubenswrapper[4475]: I1203 06:46:03.239315 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:46:03 crc kubenswrapper[4475]: I1203 06:46:03.239322 4475 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:46:03Z","lastTransitionTime":"2025-12-03T06:46:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:46:03 crc kubenswrapper[4475]: I1203 06:46:03.340979 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:46:03 crc kubenswrapper[4475]: I1203 06:46:03.341015 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:46:03 crc kubenswrapper[4475]: I1203 06:46:03.341023 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:46:03 crc kubenswrapper[4475]: I1203 06:46:03.341036 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:46:03 crc kubenswrapper[4475]: I1203 06:46:03.341045 4475 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:46:03Z","lastTransitionTime":"2025-12-03T06:46:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:46:03 crc kubenswrapper[4475]: I1203 06:46:03.442362 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:46:03 crc kubenswrapper[4475]: I1203 06:46:03.442411 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:46:03 crc kubenswrapper[4475]: I1203 06:46:03.442421 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:46:03 crc kubenswrapper[4475]: I1203 06:46:03.442433 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:46:03 crc kubenswrapper[4475]: I1203 06:46:03.442444 4475 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:46:03Z","lastTransitionTime":"2025-12-03T06:46:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 06:46:03 crc kubenswrapper[4475]: I1203 06:46:03.490330 4475 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-hq2rn" Dec 03 06:46:03 crc kubenswrapper[4475]: E1203 06:46:03.490446 4475 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-hq2rn" podUID="7e9dd470-572a-4396-9be7-1a37e3c48977" Dec 03 06:46:03 crc kubenswrapper[4475]: I1203 06:46:03.543710 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:46:03 crc kubenswrapper[4475]: I1203 06:46:03.543745 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:46:03 crc kubenswrapper[4475]: I1203 06:46:03.543755 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:46:03 crc kubenswrapper[4475]: I1203 06:46:03.543766 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:46:03 crc kubenswrapper[4475]: I1203 06:46:03.543774 4475 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:46:03Z","lastTransitionTime":"2025-12-03T06:46:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:46:03 crc kubenswrapper[4475]: I1203 06:46:03.645400 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:46:03 crc kubenswrapper[4475]: I1203 06:46:03.645432 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:46:03 crc kubenswrapper[4475]: I1203 06:46:03.645440 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:46:03 crc kubenswrapper[4475]: I1203 06:46:03.645469 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:46:03 crc kubenswrapper[4475]: I1203 06:46:03.645480 4475 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:46:03Z","lastTransitionTime":"2025-12-03T06:46:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:46:03 crc kubenswrapper[4475]: I1203 06:46:03.746915 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:46:03 crc kubenswrapper[4475]: I1203 06:46:03.746946 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:46:03 crc kubenswrapper[4475]: I1203 06:46:03.746954 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:46:03 crc kubenswrapper[4475]: I1203 06:46:03.746966 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:46:03 crc kubenswrapper[4475]: I1203 06:46:03.746974 4475 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:46:03Z","lastTransitionTime":"2025-12-03T06:46:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:46:03 crc kubenswrapper[4475]: I1203 06:46:03.806605 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/7e9dd470-572a-4396-9be7-1a37e3c48977-metrics-certs\") pod \"network-metrics-daemon-hq2rn\" (UID: \"7e9dd470-572a-4396-9be7-1a37e3c48977\") " pod="openshift-multus/network-metrics-daemon-hq2rn" Dec 03 06:46:03 crc kubenswrapper[4475]: E1203 06:46:03.806688 4475 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Dec 03 06:46:03 crc kubenswrapper[4475]: E1203 06:46:03.806728 4475 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7e9dd470-572a-4396-9be7-1a37e3c48977-metrics-certs podName:7e9dd470-572a-4396-9be7-1a37e3c48977 nodeName:}" failed. No retries permitted until 2025-12-03 06:46:19.806717507 +0000 UTC m=+64.611615840 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/7e9dd470-572a-4396-9be7-1a37e3c48977-metrics-certs") pod "network-metrics-daemon-hq2rn" (UID: "7e9dd470-572a-4396-9be7-1a37e3c48977") : object "openshift-multus"/"metrics-daemon-secret" not registered Dec 03 06:46:03 crc kubenswrapper[4475]: I1203 06:46:03.848735 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:46:03 crc kubenswrapper[4475]: I1203 06:46:03.848772 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:46:03 crc kubenswrapper[4475]: I1203 06:46:03.848780 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:46:03 crc kubenswrapper[4475]: I1203 06:46:03.848792 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:46:03 crc kubenswrapper[4475]: I1203 06:46:03.848800 4475 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:46:03Z","lastTransitionTime":"2025-12-03T06:46:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:46:03 crc kubenswrapper[4475]: I1203 06:46:03.950724 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:46:03 crc kubenswrapper[4475]: I1203 06:46:03.950754 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:46:03 crc kubenswrapper[4475]: I1203 06:46:03.950763 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:46:03 crc kubenswrapper[4475]: I1203 06:46:03.950774 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:46:03 crc kubenswrapper[4475]: I1203 06:46:03.950783 4475 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:46:03Z","lastTransitionTime":"2025-12-03T06:46:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:46:04 crc kubenswrapper[4475]: I1203 06:46:04.052291 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:46:04 crc kubenswrapper[4475]: I1203 06:46:04.052325 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:46:04 crc kubenswrapper[4475]: I1203 06:46:04.052335 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:46:04 crc kubenswrapper[4475]: I1203 06:46:04.052361 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:46:04 crc kubenswrapper[4475]: I1203 06:46:04.052373 4475 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:46:04Z","lastTransitionTime":"2025-12-03T06:46:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:46:04 crc kubenswrapper[4475]: I1203 06:46:04.153917 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:46:04 crc kubenswrapper[4475]: I1203 06:46:04.153949 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:46:04 crc kubenswrapper[4475]: I1203 06:46:04.153957 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:46:04 crc kubenswrapper[4475]: I1203 06:46:04.153969 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:46:04 crc kubenswrapper[4475]: I1203 06:46:04.153978 4475 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:46:04Z","lastTransitionTime":"2025-12-03T06:46:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:46:04 crc kubenswrapper[4475]: I1203 06:46:04.209903 4475 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 03 06:46:04 crc kubenswrapper[4475]: I1203 06:46:04.209952 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 03 06:46:04 crc kubenswrapper[4475]: I1203 06:46:04.209975 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 03 06:46:04 crc kubenswrapper[4475]: I1203 06:46:04.210003 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 03 06:46:04 crc kubenswrapper[4475]: E1203 06:46:04.210078 4475 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Dec 03 06:46:04 crc 
kubenswrapper[4475]: E1203 06:46:04.210099 4475 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Dec 03 06:46:04 crc kubenswrapper[4475]: E1203 06:46:04.210127 4475 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Dec 03 06:46:04 crc kubenswrapper[4475]: E1203 06:46:04.210133 4475 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Dec 03 06:46:04 crc kubenswrapper[4475]: E1203 06:46:04.210139 4475 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 03 06:46:04 crc kubenswrapper[4475]: E1203 06:46:04.210173 4475 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-03 06:46:36.210035567 +0000 UTC m=+81.014933911 (durationBeforeRetry 32s). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 03 06:46:04 crc kubenswrapper[4475]: I1203 06:46:04.210193 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 03 06:46:04 crc kubenswrapper[4475]: E1203 06:46:04.210276 4475 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Dec 03 06:46:04 crc kubenswrapper[4475]: E1203 06:46:04.210287 4475 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Dec 03 06:46:04 crc kubenswrapper[4475]: E1203 06:46:04.210295 4475 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 03 06:46:04 crc kubenswrapper[4475]: E1203 06:46:04.210332 4475 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 
nodeName:}" failed. No retries permitted until 2025-12-03 06:46:36.210214575 +0000 UTC m=+81.015112899 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Dec 03 06:46:04 crc kubenswrapper[4475]: E1203 06:46:04.210352 4475 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-12-03 06:46:36.210338459 +0000 UTC m=+81.015236793 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Dec 03 06:46:04 crc kubenswrapper[4475]: E1203 06:46:04.210363 4475 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-12-03 06:46:36.210358717 +0000 UTC m=+81.015257051 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 03 06:46:04 crc kubenswrapper[4475]: E1203 06:46:04.210371 4475 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-12-03 06:46:36.210367634 +0000 UTC m=+81.015265968 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 03 06:46:04 crc kubenswrapper[4475]: I1203 06:46:04.256076 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:46:04 crc kubenswrapper[4475]: I1203 06:46:04.256106 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:46:04 crc kubenswrapper[4475]: I1203 06:46:04.256114 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:46:04 crc kubenswrapper[4475]: I1203 06:46:04.256127 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:46:04 crc kubenswrapper[4475]: I1203 06:46:04.256135 4475 setters.go:603] "Node became not ready" 
node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:46:04Z","lastTransitionTime":"2025-12-03T06:46:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 06:46:04 crc kubenswrapper[4475]: I1203 06:46:04.357664 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:46:04 crc kubenswrapper[4475]: I1203 06:46:04.357691 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:46:04 crc kubenswrapper[4475]: I1203 06:46:04.357698 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:46:04 crc kubenswrapper[4475]: I1203 06:46:04.357711 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:46:04 crc kubenswrapper[4475]: I1203 06:46:04.357719 4475 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:46:04Z","lastTransitionTime":"2025-12-03T06:46:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:46:04 crc kubenswrapper[4475]: I1203 06:46:04.459648 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:46:04 crc kubenswrapper[4475]: I1203 06:46:04.459672 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:46:04 crc kubenswrapper[4475]: I1203 06:46:04.459679 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:46:04 crc kubenswrapper[4475]: I1203 06:46:04.459690 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:46:04 crc kubenswrapper[4475]: I1203 06:46:04.459698 4475 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:46:04Z","lastTransitionTime":"2025-12-03T06:46:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 06:46:04 crc kubenswrapper[4475]: I1203 06:46:04.490136 4475 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 03 06:46:04 crc kubenswrapper[4475]: E1203 06:46:04.490222 4475 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 03 06:46:04 crc kubenswrapper[4475]: I1203 06:46:04.490136 4475 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 03 06:46:04 crc kubenswrapper[4475]: E1203 06:46:04.490289 4475 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 03 06:46:04 crc kubenswrapper[4475]: I1203 06:46:04.490145 4475 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 03 06:46:04 crc kubenswrapper[4475]: E1203 06:46:04.490337 4475 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 03 06:46:04 crc kubenswrapper[4475]: I1203 06:46:04.561283 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:46:04 crc kubenswrapper[4475]: I1203 06:46:04.561321 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:46:04 crc kubenswrapper[4475]: I1203 06:46:04.561329 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:46:04 crc kubenswrapper[4475]: I1203 06:46:04.561338 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:46:04 crc kubenswrapper[4475]: I1203 06:46:04.561355 4475 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:46:04Z","lastTransitionTime":"2025-12-03T06:46:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:46:04 crc kubenswrapper[4475]: I1203 06:46:04.663025 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:46:04 crc kubenswrapper[4475]: I1203 06:46:04.663066 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:46:04 crc kubenswrapper[4475]: I1203 06:46:04.663074 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:46:04 crc kubenswrapper[4475]: I1203 06:46:04.663090 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:46:04 crc kubenswrapper[4475]: I1203 06:46:04.663098 4475 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:46:04Z","lastTransitionTime":"2025-12-03T06:46:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:46:04 crc kubenswrapper[4475]: I1203 06:46:04.764186 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:46:04 crc kubenswrapper[4475]: I1203 06:46:04.764225 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:46:04 crc kubenswrapper[4475]: I1203 06:46:04.764233 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:46:04 crc kubenswrapper[4475]: I1203 06:46:04.764244 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:46:04 crc kubenswrapper[4475]: I1203 06:46:04.764252 4475 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:46:04Z","lastTransitionTime":"2025-12-03T06:46:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:46:04 crc kubenswrapper[4475]: I1203 06:46:04.866123 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:46:04 crc kubenswrapper[4475]: I1203 06:46:04.866162 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:46:04 crc kubenswrapper[4475]: I1203 06:46:04.866172 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:46:04 crc kubenswrapper[4475]: I1203 06:46:04.866185 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:46:04 crc kubenswrapper[4475]: I1203 06:46:04.866197 4475 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:46:04Z","lastTransitionTime":"2025-12-03T06:46:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:46:04 crc kubenswrapper[4475]: I1203 06:46:04.970772 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:46:04 crc kubenswrapper[4475]: I1203 06:46:04.970797 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:46:04 crc kubenswrapper[4475]: I1203 06:46:04.970806 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:46:04 crc kubenswrapper[4475]: I1203 06:46:04.970817 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:46:04 crc kubenswrapper[4475]: I1203 06:46:04.970825 4475 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:46:04Z","lastTransitionTime":"2025-12-03T06:46:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:46:05 crc kubenswrapper[4475]: I1203 06:46:05.072484 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:46:05 crc kubenswrapper[4475]: I1203 06:46:05.072509 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:46:05 crc kubenswrapper[4475]: I1203 06:46:05.072517 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:46:05 crc kubenswrapper[4475]: I1203 06:46:05.072532 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:46:05 crc kubenswrapper[4475]: I1203 06:46:05.072543 4475 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:46:05Z","lastTransitionTime":"2025-12-03T06:46:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:46:05 crc kubenswrapper[4475]: I1203 06:46:05.174631 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:46:05 crc kubenswrapper[4475]: I1203 06:46:05.174682 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:46:05 crc kubenswrapper[4475]: I1203 06:46:05.174691 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:46:05 crc kubenswrapper[4475]: I1203 06:46:05.174705 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:46:05 crc kubenswrapper[4475]: I1203 06:46:05.174713 4475 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:46:05Z","lastTransitionTime":"2025-12-03T06:46:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:46:05 crc kubenswrapper[4475]: I1203 06:46:05.276301 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:46:05 crc kubenswrapper[4475]: I1203 06:46:05.276333 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:46:05 crc kubenswrapper[4475]: I1203 06:46:05.276343 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:46:05 crc kubenswrapper[4475]: I1203 06:46:05.276364 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:46:05 crc kubenswrapper[4475]: I1203 06:46:05.276372 4475 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:46:05Z","lastTransitionTime":"2025-12-03T06:46:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:46:05 crc kubenswrapper[4475]: I1203 06:46:05.378103 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:46:05 crc kubenswrapper[4475]: I1203 06:46:05.378132 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:46:05 crc kubenswrapper[4475]: I1203 06:46:05.378140 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:46:05 crc kubenswrapper[4475]: I1203 06:46:05.378152 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:46:05 crc kubenswrapper[4475]: I1203 06:46:05.378160 4475 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:46:05Z","lastTransitionTime":"2025-12-03T06:46:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:46:05 crc kubenswrapper[4475]: I1203 06:46:05.479874 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:46:05 crc kubenswrapper[4475]: I1203 06:46:05.479898 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:46:05 crc kubenswrapper[4475]: I1203 06:46:05.479906 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:46:05 crc kubenswrapper[4475]: I1203 06:46:05.479917 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:46:05 crc kubenswrapper[4475]: I1203 06:46:05.479925 4475 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:46:05Z","lastTransitionTime":"2025-12-03T06:46:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 06:46:05 crc kubenswrapper[4475]: I1203 06:46:05.490650 4475 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-hq2rn" Dec 03 06:46:05 crc kubenswrapper[4475]: E1203 06:46:05.490841 4475 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-hq2rn" podUID="7e9dd470-572a-4396-9be7-1a37e3c48977" Dec 03 06:46:05 crc kubenswrapper[4475]: I1203 06:46:05.505068 4475 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"897f1a97-930a-4c3c-8804-d7cd6006ae9c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fbb015d3e05f9f94fc225cce6e24bc4a5df0bfc5aaea15fe120e2cc4b8f02902\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:45:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static
-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://da747a5ea4f790c71d99693c4bd79a1074f756a20f628fa63e8bad9a713645fc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:45:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6bf56315b6ad05ea9af0319db29b919ed0332d2a671c5ba94ea325bd45ef5703\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:45:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e045b99328661616ea0e44cd50bd394a403836eede05459d117567c191401172\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a
93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:45:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://054b1d2565cc9690152740f71682028595283525344a38ccea66c1f072eae92b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:45:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d5e0ad88c2e55994f952b46c2e806792d8fcbd79a901810aef92e46067cc7b92\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\
\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d5e0ad88c2e55994f952b46c2e806792d8fcbd79a901810aef92e46067cc7b92\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T06:45:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T06:45:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://22796f78d3d551f1ee271ca8581e196f142e70622944154f7d408a88c098f53b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://22796f78d3d551f1ee271ca8581e196f142e70622944154f7d408a88c098f53b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T06:45:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T06:45:16Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://9fb973559072f07252dcf50bda74d422ea2ed50000c02105381f8d21e5ff9888\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9fb973559072f07252dcf50bda74d422ea2ed50000c02105381f8d21e5ff9888\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T06:45:17Z
\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T06:45:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T06:45:15Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:46:05Z is after 2025-08-24T17:21:41Z" Dec 03 06:46:05 crc kubenswrapper[4475]: I1203 06:46:05.512216 4475 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-dqbgx" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9ef36226-4b8b-4a7b-a87f-daa9dda6e70b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5dc78fa3b07b9a5535f697323e9ed322ceefdc8798157160a05eb71017ac3a29\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:45:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wjjp4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T06:45:38Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-dqbgx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:46:05Z is after 2025-08-24T17:21:41Z" Dec 03 06:46:05 crc kubenswrapper[4475]: I1203 06:46:05.520856 4475 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:33Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:33Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2f651c16a4a98ff0a9b4783e60ece4c410d5fcb7d05ad42bf7842d8bb8a99f27\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2
025-12-03T06:45:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:46:05Z is after 2025-08-24T17:21:41Z" Dec 03 06:46:05 crc kubenswrapper[4475]: I1203 06:46:05.529557 4475 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:33Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:33Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://30d9a05de148a1dbe0fa8f07bbc5f4f2c3cba395d686af03f2da63f8cdfe431c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:45:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://07cf8d993193bca34b30ea77c473af45652fde6e73d0586efb78c14b9d003e22\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:45:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:46:05Z is after 2025-08-24T17:21:41Z" Dec 03 06:46:05 crc kubenswrapper[4475]: I1203 06:46:05.537697 4475 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:34Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:34Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6444fe7571ebb90d4ff4b30dc1a397023310b50b1816d0197cb545b4f5f7480f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:45:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-12-03T06:46:05Z is after 2025-08-24T17:21:41Z" Dec 03 06:46:05 crc kubenswrapper[4475]: I1203 06:46:05.546531 4475 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:32Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:32Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:46:05Z is after 2025-08-24T17:21:41Z" Dec 03 06:46:05 crc kubenswrapper[4475]: I1203 06:46:05.556216 4475 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-pcw7j" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8c1979d0-303c-4cf6-9087-3cb2e1aac73b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://eebaa73cf4e1efd781b258dd26910dc004392716180b14a7e64e89a03f2032a1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:45:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nrf7r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T06:45:34Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-pcw7j\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:46:05Z is after 2025-08-24T17:21:41Z" Dec 03 06:46:05 crc kubenswrapper[4475]: I1203 06:46:05.564298 4475 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-9b2j8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f3a17c67-95e0-4889-8a30-64c08b6720f4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d2d627e2c307a8db9c86e8020f2b1c25c6e061e0c6460be63e231566488beaca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\
\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:45:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6pdk6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\
\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T06:45:34Z\\\"}}\" for pod \"openshift-multus\"/\"multus-9b2j8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:46:05Z is after 2025-08-24T17:21:41Z" Dec 03 06:46:05 crc kubenswrapper[4475]: I1203 06:46:05.576922 4475 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-g9t4l" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8f42839e-dbc4-445a-a15b-c3aa14813958\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:35Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:35Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://53948489397bbbfdf5f766211088d7f12fcd2dfbc8c3da6493e5abc49e3b41f5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:45:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ppdm2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a5090474cca8b8e2ed539ea74377506638d300be7eb750b3f3285477d8c9a375\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:45:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ppdm2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://60d3ec7cab1f249e81ae1db9ab97fa02e8b3c9d8376af4c6682dc3fc6f9d6d92\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:45:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ppdm2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b3243c863a4fb593b39fc3e3b835f647e9373d8b2dec69c5ff7657ed73c8f78a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:45:36Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ppdm2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://32897756f3658fda95db77180a0553a9d8656ed49c3ae5a017d32f5c5133a5a9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:45:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ppdm2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5e288f95676d5823cd3cb005318489d2f629a8fb74ad17ce6a67978d76006192\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:45:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ppdm2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6ab275be2e84c1b20f69d740d454b4916d2fb2af864c685198786088b835b49c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6ab275be2e84c1b20f69d740d454b4916d2fb2af864c685198786088b835b49c\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-03T06:45:44Z\\\",\\\"message\\\":\\\"ault: []services.LB{services.LB{Name:\\\\\\\"Service_openshift-apiserver/check-endpoints_TCP_cluster\\\\\\\", UUID:\\\\\\\"\\\\\\\", Protocol:\\\\\\\"TCP\\\\\\\", ExternalIDs:map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-apiserver/check-endpoints\\\\\\\"}, 
Opts:services.LBOpts{Reject:true, EmptyLBEvents:false, AffinityTimeOut:0, SkipSNAT:false, Template:false, AddressFamily:\\\\\\\"\\\\\\\"}, Rules:[]services.LBRule{services.LBRule{Source:services.Addr{IP:\\\\\\\"10.217.5.139\\\\\\\", Port:17698, Template:(*services.Template)(nil)}, Targets:[]services.Addr{}}}, Templates:services.TemplateMap(nil), Switches:[]string{}, Routers:[]string{}, Groups:[]string{\\\\\\\"clusterLBGroup\\\\\\\"}}}\\\\nF1203 06:45:44.234950 5806 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certifi\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-03T06:45:43Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=ovnkube-controller 
pod=ovnkube-node-g9t4l_openshift-ovn-kubernetes(8f42839e-dbc4-445a-a15b-c3aa14813958)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ppdm2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://66a9c7568957099255bc910496da695e2af0122f2c853c3e221c666d7c2dee78\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:45:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ppdm2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://400610ebcdc7d47ecc1345287847a1909871411a12cdb3cbf895e05039b81c2b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://400610ebcdc7d47ecc
1345287847a1909871411a12cdb3cbf895e05039b81c2b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T06:45:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T06:45:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ppdm2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T06:45:35Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-g9t4l\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:46:05Z is after 2025-08-24T17:21:41Z" Dec 03 06:46:05 crc kubenswrapper[4475]: I1203 06:46:05.581766 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:46:05 crc kubenswrapper[4475]: I1203 06:46:05.581796 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:46:05 crc kubenswrapper[4475]: I1203 06:46:05.581804 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:46:05 crc kubenswrapper[4475]: I1203 06:46:05.581819 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:46:05 crc kubenswrapper[4475]: I1203 06:46:05.581827 4475 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:46:05Z","lastTransitionTime":"2025-12-03T06:46:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 06:46:05 crc kubenswrapper[4475]: I1203 06:46:05.586010 4475 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6b24b6a4-c126-4d6d-88ae-b270b4743110\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:46:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:46:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://32947be5ce5a85090284dbc3edd8ad437495db9f0b4a7310656e38ecf5c649de\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\
\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:45:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0f26dedc0507a8675c0dc842b67772e84b5276713808e656bcf620ebb7bd3f5c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:45:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d17d8916a080f159b25abbfd9575bdc197c58bf256dbeb6367e74368f5b7f1fc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:45:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.1
26.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b9ea750eb608c854e92aa32dfc7d2085a0c00c3554368c7119487e4a730fdc1c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b9ea750eb608c854e92aa32dfc7d2085a0c00c3554368c7119487e4a730fdc1c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T06:45:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T06:45:16Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T06:45:15Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:46:05Z is after 2025-08-24T17:21:41Z" Dec 03 06:46:05 crc kubenswrapper[4475]: I1203 06:46:05.592849 4475 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-hq2rn" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7e9dd470-572a-4396-9be7-1a37e3c48977\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:47Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:47Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cg4hm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cg4hm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T06:45:47Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-hq2rn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:46:05Z is after 2025-08-24T17:21:41Z" Dec 03 06:46:05 crc 
kubenswrapper[4475]: I1203 06:46:05.601866 4475 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-k9cmc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7168f008-1b03-40cf-94fa-a71d470454bf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://625db083ebf24244e0b28ac937bfa2554497ca35b8f7a1fee0ac739d647c70de\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:45:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s6bbq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.12
6.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://31584b054f88aa7f7e4f1096e2b11acf6f106b7f2e4ced19768808e5df1a6acc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://31584b054f88aa7f7e4f1096e2b11acf6f106b7f2e4ced19768808e5df1a6acc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T06:45:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T06:45:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s6bbq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9a644e827feb786d7298e41022ef3bc0d2483279c106dddea8e2c7a3c62c3c0d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9a644e827feb786d7298e41022ef3
bc0d2483279c106dddea8e2c7a3c62c3c0d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T06:45:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T06:45:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s6bbq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://742f2f4dc23fff3df8e6d67902ef721b3db1823653b11a69faabdaf8d7650667\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://742f2f4dc23fff3df8e6d67902ef721b3db1823653b11a69faabdaf8d7650667\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T06:45:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T06:45:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/ru
n/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s6bbq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2e5e874d26bf8bc806d74d55a8b9306cc30cca122d2ae0731b0a76ae7ac30450\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2e5e874d26bf8bc806d74d55a8b9306cc30cca122d2ae0731b0a76ae7ac30450\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T06:45:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T06:45:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s6bbq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d644ab44eabce045c9f9b23fab29e574e2f9f49c0cc14b830560996a0ec98880\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\
":{\\\"containerID\\\":\\\"cri-o://d644ab44eabce045c9f9b23fab29e574e2f9f49c0cc14b830560996a0ec98880\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T06:45:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T06:45:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s6bbq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ae9935e33badff0b016f8b5a02cb59d8b64451364581023ca3ec8e87fba0aa3a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ae9935e33badff0b016f8b5a02cb59d8b64451364581023ca3ec8e87fba0aa3a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T06:45:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T06:45:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s6bbq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\
\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T06:45:35Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-k9cmc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:46:05Z is after 2025-08-24T17:21:41Z" Dec 03 06:46:05 crc kubenswrapper[4475]: I1203 06:46:05.609415 4475 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:32Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:32Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:46:05Z is after 2025-08-24T17:21:41Z" Dec 03 06:46:05 crc kubenswrapper[4475]: I1203 06:46:05.617674 4475 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a0af3d80-5aae-4d3b-a974-490687df49f3\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fa848c68a20d5db5c603cafa808518de84e427cbeea4bbc1be31151e6f839b3e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:45:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fe3e0d5fed18fddd7a1174f7a9f12290ce318e9a0de40fe432c79f6f2e24a608\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:45:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://43c05977da6544bc781a279fcddb3279dfee510fdd0a6f4f1a22b8629f17475f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:45:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ef987b2e9a0fa630edf6d5c06d5f47c5debd1b75d4626aefe7d8ef44bb974eb8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-12-03T06:45:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T06:45:15Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:46:05Z is after 2025-08-24T17:21:41Z" Dec 03 06:46:05 crc kubenswrapper[4475]: I1203 06:46:05.625199 4475 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:32Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:32Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:46:05Z is after 2025-08-24T17:21:41Z" Dec 03 06:46:05 crc kubenswrapper[4475]: I1203 06:46:05.631857 4475 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-tjbzg" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"91aee7be-4a52-4598-803f-2deebe0674de\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f13f644093fd1214d8fb39853857b4113dd7fde64f1a60ff6848fd4c5350f5b6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:45:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xvqvg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://159d103ae2d5d19ea94c57a59b534773f0e32f4c
b379a412b63ca743e221096e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:45:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xvqvg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T06:45:35Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-tjbzg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:46:05Z is after 2025-08-24T17:21:41Z" Dec 03 06:46:05 crc kubenswrapper[4475]: I1203 06:46:05.640620 4475 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-sbkp5" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b1df0a77-f3cc-49ab-9fbb-8a4c7608291b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5938dd3c72bee55a3a07312d31a0eaf2df226bb931b07300d71b6e7ff69c905b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:45:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-65wzb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4408ad7b7f122c0364b95e0e9761bc28dfb02
e7ea00537a70fc031c16b38be6b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:45:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-65wzb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T06:45:46Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-sbkp5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:46:05Z is after 2025-08-24T17:21:41Z" Dec 03 06:46:05 crc kubenswrapper[4475]: I1203 06:46:05.650909 4475 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"57d2f580-9528-4200-b0a4-797fed1ae972\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://822cdbfb2e81d80c5de0253daa42f2a5c89e9cd0eb8a5c3cf620780d17f9a6d0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:45:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d66a9136874b2e25c94cd291aa6d7f4694ac409f16766fd69c8aab8068a441fb\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:45:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e40c4f29925f494c0f5f01e2ecbcd2e4db2a5f3911a55a874c6d0006f01982de\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:45:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7fc0ee9e5a408a0a9e701afaf1db7bc3f58fd1830044730e9c680664642b5e4e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:45:1
7Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1dd8bd42f01469966b55416fc8af1dd71d341c774263bb3a56190af4cd9e7daa\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:45:17Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5da1155d7b5e933e5db3acc4c1a3fa1b3b90fd79289641f9a3d1290956128628\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5da1155d7b5e933e5db3acc4c1a3fa1b3b90fd79289641f9a3d1290956128628\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T06:45:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T06:45:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startT
ime\\\":\\\"2025-12-03T06:45:15Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:46:05Z is after 2025-08-24T17:21:41Z" Dec 03 06:46:05 crc kubenswrapper[4475]: I1203 06:46:05.684035 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:46:05 crc kubenswrapper[4475]: I1203 06:46:05.684061 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:46:05 crc kubenswrapper[4475]: I1203 06:46:05.684069 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:46:05 crc kubenswrapper[4475]: I1203 06:46:05.684080 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:46:05 crc kubenswrapper[4475]: I1203 06:46:05.684088 4475 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:46:05Z","lastTransitionTime":"2025-12-03T06:46:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:46:05 crc kubenswrapper[4475]: I1203 06:46:05.786079 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:46:05 crc kubenswrapper[4475]: I1203 06:46:05.786113 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:46:05 crc kubenswrapper[4475]: I1203 06:46:05.786122 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:46:05 crc kubenswrapper[4475]: I1203 06:46:05.786134 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:46:05 crc kubenswrapper[4475]: I1203 06:46:05.786143 4475 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:46:05Z","lastTransitionTime":"2025-12-03T06:46:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:46:05 crc kubenswrapper[4475]: I1203 06:46:05.887951 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:46:05 crc kubenswrapper[4475]: I1203 06:46:05.887978 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:46:05 crc kubenswrapper[4475]: I1203 06:46:05.887987 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:46:05 crc kubenswrapper[4475]: I1203 06:46:05.887996 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:46:05 crc kubenswrapper[4475]: I1203 06:46:05.888004 4475 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:46:05Z","lastTransitionTime":"2025-12-03T06:46:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:46:05 crc kubenswrapper[4475]: I1203 06:46:05.990167 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:46:05 crc kubenswrapper[4475]: I1203 06:46:05.990202 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:46:05 crc kubenswrapper[4475]: I1203 06:46:05.990211 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:46:05 crc kubenswrapper[4475]: I1203 06:46:05.990224 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:46:05 crc kubenswrapper[4475]: I1203 06:46:05.990233 4475 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:46:05Z","lastTransitionTime":"2025-12-03T06:46:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:46:06 crc kubenswrapper[4475]: I1203 06:46:06.092312 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:46:06 crc kubenswrapper[4475]: I1203 06:46:06.092432 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:46:06 crc kubenswrapper[4475]: I1203 06:46:06.092530 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:46:06 crc kubenswrapper[4475]: I1203 06:46:06.092603 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:46:06 crc kubenswrapper[4475]: I1203 06:46:06.092656 4475 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:46:06Z","lastTransitionTime":"2025-12-03T06:46:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:46:06 crc kubenswrapper[4475]: I1203 06:46:06.194110 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:46:06 crc kubenswrapper[4475]: I1203 06:46:06.194141 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:46:06 crc kubenswrapper[4475]: I1203 06:46:06.194152 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:46:06 crc kubenswrapper[4475]: I1203 06:46:06.194181 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:46:06 crc kubenswrapper[4475]: I1203 06:46:06.194190 4475 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:46:06Z","lastTransitionTime":"2025-12-03T06:46:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:46:06 crc kubenswrapper[4475]: I1203 06:46:06.295743 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:46:06 crc kubenswrapper[4475]: I1203 06:46:06.295845 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:46:06 crc kubenswrapper[4475]: I1203 06:46:06.295914 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:46:06 crc kubenswrapper[4475]: I1203 06:46:06.295983 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:46:06 crc kubenswrapper[4475]: I1203 06:46:06.296042 4475 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:46:06Z","lastTransitionTime":"2025-12-03T06:46:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:46:06 crc kubenswrapper[4475]: I1203 06:46:06.397993 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:46:06 crc kubenswrapper[4475]: I1203 06:46:06.398032 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:46:06 crc kubenswrapper[4475]: I1203 06:46:06.398041 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:46:06 crc kubenswrapper[4475]: I1203 06:46:06.398055 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:46:06 crc kubenswrapper[4475]: I1203 06:46:06.398065 4475 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:46:06Z","lastTransitionTime":"2025-12-03T06:46:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:46:06 crc kubenswrapper[4475]: I1203 06:46:06.485743 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:46:06 crc kubenswrapper[4475]: I1203 06:46:06.485785 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:46:06 crc kubenswrapper[4475]: I1203 06:46:06.485795 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:46:06 crc kubenswrapper[4475]: I1203 06:46:06.485805 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:46:06 crc kubenswrapper[4475]: I1203 06:46:06.485812 4475 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:46:06Z","lastTransitionTime":"2025-12-03T06:46:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 06:46:06 crc kubenswrapper[4475]: I1203 06:46:06.490654 4475 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 03 06:46:06 crc kubenswrapper[4475]: I1203 06:46:06.490750 4475 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 03 06:46:06 crc kubenswrapper[4475]: E1203 06:46:06.490779 4475 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 03 06:46:06 crc kubenswrapper[4475]: I1203 06:46:06.490661 4475 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 03 06:46:06 crc kubenswrapper[4475]: E1203 06:46:06.490964 4475 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 03 06:46:06 crc kubenswrapper[4475]: E1203 06:46:06.490882 4475 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 03 06:46:06 crc kubenswrapper[4475]: E1203 06:46:06.494755 4475 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"7800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"24148068Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"8\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"24608868Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T06:46:06Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T06:46:06Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T06:46:06Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T06:46:06Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T06:46:06Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T06:46:06Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T06:46:06Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T06:46:06Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"b860fac6-8533-4b4b-bdad-0cb0561d1495\\\",\\\"systemUUID\\\":\\\"6c3f70a9-a9d8-4b80-a825-7a6426aa17aa\\\"},\\\"runtimeHan
dlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:46:06Z is after 2025-08-24T17:21:41Z" Dec 03 06:46:06 crc kubenswrapper[4475]: I1203 06:46:06.497171 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:46:06 crc kubenswrapper[4475]: I1203 06:46:06.497260 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:46:06 crc kubenswrapper[4475]: I1203 06:46:06.497324 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:46:06 crc kubenswrapper[4475]: I1203 06:46:06.497400 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:46:06 crc kubenswrapper[4475]: I1203 06:46:06.497485 4475 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:46:06Z","lastTransitionTime":"2025-12-03T06:46:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:46:06 crc kubenswrapper[4475]: E1203 06:46:06.505868 4475 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"7800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"24148068Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"8\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"24608868Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T06:46:06Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T06:46:06Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T06:46:06Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T06:46:06Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T06:46:06Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T06:46:06Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T06:46:06Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T06:46:06Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"b860fac6-8533-4b4b-bdad-0cb0561d1495\\\",\\\"systemUUID\\\":\\\"6c3f70a9-a9d8-4b80-a825-7a6426aa17aa\\\"},\\\"runtimeHan
dlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:46:06Z is after 2025-08-24T17:21:41Z" Dec 03 06:46:06 crc kubenswrapper[4475]: I1203 06:46:06.508236 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:46:06 crc kubenswrapper[4475]: I1203 06:46:06.508260 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:46:06 crc kubenswrapper[4475]: I1203 06:46:06.508269 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:46:06 crc kubenswrapper[4475]: I1203 06:46:06.508279 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:46:06 crc kubenswrapper[4475]: I1203 06:46:06.508303 4475 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:46:06Z","lastTransitionTime":"2025-12-03T06:46:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:46:06 crc kubenswrapper[4475]: E1203 06:46:06.516561 4475 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"7800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"24148068Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"8\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"24608868Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T06:46:06Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T06:46:06Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T06:46:06Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T06:46:06Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T06:46:06Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T06:46:06Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T06:46:06Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T06:46:06Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"b860fac6-8533-4b4b-bdad-0cb0561d1495\\\",\\\"systemUUID\\\":\\\"6c3f70a9-a9d8-4b80-a825-7a6426aa17aa\\\"},\\\"runtimeHan
dlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:46:06Z is after 2025-08-24T17:21:41Z" Dec 03 06:46:06 crc kubenswrapper[4475]: I1203 06:46:06.518966 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:46:06 crc kubenswrapper[4475]: I1203 06:46:06.518993 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:46:06 crc kubenswrapper[4475]: I1203 06:46:06.519001 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:46:06 crc kubenswrapper[4475]: I1203 06:46:06.519010 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:46:06 crc kubenswrapper[4475]: I1203 06:46:06.519017 4475 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:46:06Z","lastTransitionTime":"2025-12-03T06:46:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:46:06 crc kubenswrapper[4475]: E1203 06:46:06.526626 4475 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"7800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"24148068Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"8\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"24608868Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T06:46:06Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T06:46:06Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T06:46:06Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T06:46:06Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T06:46:06Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T06:46:06Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T06:46:06Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T06:46:06Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"b860fac6-8533-4b4b-bdad-0cb0561d1495\\\",\\\"systemUUID\\\":\\\"6c3f70a9-a9d8-4b80-a825-7a6426aa17aa\\\"},\\\"runtimeHan
dlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:46:06Z is after 2025-08-24T17:21:41Z" Dec 03 06:46:06 crc kubenswrapper[4475]: I1203 06:46:06.528634 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:46:06 crc kubenswrapper[4475]: I1203 06:46:06.528661 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:46:06 crc kubenswrapper[4475]: I1203 06:46:06.528670 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:46:06 crc kubenswrapper[4475]: I1203 06:46:06.528679 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:46:06 crc kubenswrapper[4475]: I1203 06:46:06.528686 4475 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:46:06Z","lastTransitionTime":"2025-12-03T06:46:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:46:06 crc kubenswrapper[4475]: E1203 06:46:06.536414 4475 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"7800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"24148068Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"8\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"24608868Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T06:46:06Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T06:46:06Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T06:46:06Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T06:46:06Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T06:46:06Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T06:46:06Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T06:46:06Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T06:46:06Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"b860fac6-8533-4b4b-bdad-0cb0561d1495\\\",\\\"systemUUID\\\":\\\"6c3f70a9-a9d8-4b80-a825-7a6426aa17aa\\\"},\\\"runtimeHan
dlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:46:06Z is after 2025-08-24T17:21:41Z" Dec 03 06:46:06 crc kubenswrapper[4475]: E1203 06:46:06.536534 4475 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Dec 03 06:46:06 crc kubenswrapper[4475]: I1203 06:46:06.537480 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:46:06 crc kubenswrapper[4475]: I1203 06:46:06.537503 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:46:06 crc kubenswrapper[4475]: I1203 06:46:06.537512 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:46:06 crc kubenswrapper[4475]: I1203 06:46:06.537521 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:46:06 crc kubenswrapper[4475]: I1203 06:46:06.537529 4475 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:46:06Z","lastTransitionTime":"2025-12-03T06:46:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in 
/etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 06:46:06 crc kubenswrapper[4475]: I1203 06:46:06.639588 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:46:06 crc kubenswrapper[4475]: I1203 06:46:06.639617 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:46:06 crc kubenswrapper[4475]: I1203 06:46:06.639626 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:46:06 crc kubenswrapper[4475]: I1203 06:46:06.639637 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:46:06 crc kubenswrapper[4475]: I1203 06:46:06.639645 4475 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:46:06Z","lastTransitionTime":"2025-12-03T06:46:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:46:06 crc kubenswrapper[4475]: I1203 06:46:06.741569 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:46:06 crc kubenswrapper[4475]: I1203 06:46:06.741602 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:46:06 crc kubenswrapper[4475]: I1203 06:46:06.741611 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:46:06 crc kubenswrapper[4475]: I1203 06:46:06.741623 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:46:06 crc kubenswrapper[4475]: I1203 06:46:06.741642 4475 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:46:06Z","lastTransitionTime":"2025-12-03T06:46:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:46:06 crc kubenswrapper[4475]: I1203 06:46:06.842944 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:46:06 crc kubenswrapper[4475]: I1203 06:46:06.842974 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:46:06 crc kubenswrapper[4475]: I1203 06:46:06.842982 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:46:06 crc kubenswrapper[4475]: I1203 06:46:06.842996 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:46:06 crc kubenswrapper[4475]: I1203 06:46:06.843004 4475 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:46:06Z","lastTransitionTime":"2025-12-03T06:46:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:46:06 crc kubenswrapper[4475]: I1203 06:46:06.944949 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:46:06 crc kubenswrapper[4475]: I1203 06:46:06.944978 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:46:06 crc kubenswrapper[4475]: I1203 06:46:06.944988 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:46:06 crc kubenswrapper[4475]: I1203 06:46:06.944999 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:46:06 crc kubenswrapper[4475]: I1203 06:46:06.945008 4475 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:46:06Z","lastTransitionTime":"2025-12-03T06:46:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:46:07 crc kubenswrapper[4475]: I1203 06:46:07.046410 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:46:07 crc kubenswrapper[4475]: I1203 06:46:07.046437 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:46:07 crc kubenswrapper[4475]: I1203 06:46:07.046445 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:46:07 crc kubenswrapper[4475]: I1203 06:46:07.046482 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:46:07 crc kubenswrapper[4475]: I1203 06:46:07.046490 4475 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:46:07Z","lastTransitionTime":"2025-12-03T06:46:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:46:07 crc kubenswrapper[4475]: I1203 06:46:07.148297 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:46:07 crc kubenswrapper[4475]: I1203 06:46:07.148339 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:46:07 crc kubenswrapper[4475]: I1203 06:46:07.148349 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:46:07 crc kubenswrapper[4475]: I1203 06:46:07.148362 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:46:07 crc kubenswrapper[4475]: I1203 06:46:07.148381 4475 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:46:07Z","lastTransitionTime":"2025-12-03T06:46:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:46:07 crc kubenswrapper[4475]: I1203 06:46:07.250092 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:46:07 crc kubenswrapper[4475]: I1203 06:46:07.250123 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:46:07 crc kubenswrapper[4475]: I1203 06:46:07.250133 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:46:07 crc kubenswrapper[4475]: I1203 06:46:07.250145 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:46:07 crc kubenswrapper[4475]: I1203 06:46:07.250153 4475 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:46:07Z","lastTransitionTime":"2025-12-03T06:46:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:46:07 crc kubenswrapper[4475]: I1203 06:46:07.351856 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:46:07 crc kubenswrapper[4475]: I1203 06:46:07.352061 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:46:07 crc kubenswrapper[4475]: I1203 06:46:07.352119 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:46:07 crc kubenswrapper[4475]: I1203 06:46:07.352192 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:46:07 crc kubenswrapper[4475]: I1203 06:46:07.352250 4475 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:46:07Z","lastTransitionTime":"2025-12-03T06:46:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:46:07 crc kubenswrapper[4475]: I1203 06:46:07.456841 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:46:07 crc kubenswrapper[4475]: I1203 06:46:07.456870 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:46:07 crc kubenswrapper[4475]: I1203 06:46:07.456879 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:46:07 crc kubenswrapper[4475]: I1203 06:46:07.456891 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:46:07 crc kubenswrapper[4475]: I1203 06:46:07.456899 4475 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:46:07Z","lastTransitionTime":"2025-12-03T06:46:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 06:46:07 crc kubenswrapper[4475]: I1203 06:46:07.490255 4475 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-hq2rn" Dec 03 06:46:07 crc kubenswrapper[4475]: E1203 06:46:07.490571 4475 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-hq2rn" podUID="7e9dd470-572a-4396-9be7-1a37e3c48977" Dec 03 06:46:07 crc kubenswrapper[4475]: I1203 06:46:07.490708 4475 scope.go:117] "RemoveContainer" containerID="6ab275be2e84c1b20f69d740d454b4916d2fb2af864c685198786088b835b49c" Dec 03 06:46:07 crc kubenswrapper[4475]: I1203 06:46:07.558667 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:46:07 crc kubenswrapper[4475]: I1203 06:46:07.558839 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:46:07 crc kubenswrapper[4475]: I1203 06:46:07.558849 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:46:07 crc kubenswrapper[4475]: I1203 06:46:07.558864 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:46:07 crc kubenswrapper[4475]: I1203 06:46:07.558873 4475 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:46:07Z","lastTransitionTime":"2025-12-03T06:46:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:46:07 crc kubenswrapper[4475]: I1203 06:46:07.660761 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:46:07 crc kubenswrapper[4475]: I1203 06:46:07.660793 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:46:07 crc kubenswrapper[4475]: I1203 06:46:07.660801 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:46:07 crc kubenswrapper[4475]: I1203 06:46:07.660815 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:46:07 crc kubenswrapper[4475]: I1203 06:46:07.660823 4475 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:46:07Z","lastTransitionTime":"2025-12-03T06:46:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:46:07 crc kubenswrapper[4475]: I1203 06:46:07.697925 4475 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-g9t4l_8f42839e-dbc4-445a-a15b-c3aa14813958/ovnkube-controller/1.log" Dec 03 06:46:07 crc kubenswrapper[4475]: I1203 06:46:07.703814 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-g9t4l" event={"ID":"8f42839e-dbc4-445a-a15b-c3aa14813958","Type":"ContainerStarted","Data":"dbbaea5df5db7406137d9fe054e2abd7fbb765809c6aa804a531d4d0f7c8328e"} Dec 03 06:46:07 crc kubenswrapper[4475]: I1203 06:46:07.704151 4475 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-g9t4l" Dec 03 06:46:07 crc kubenswrapper[4475]: I1203 06:46:07.725559 4475 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"897f1a97-930a-4c3c-8804-d7cd6006ae9c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fbb015d3e05f9f94fc225cce6e24bc4a5df0bfc5aaea15fe120e2cc4b8f02902\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0
-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:45:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://da747a5ea4f790c71d99693c4bd79a1074f756a20f628fa63e8bad9a713645fc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:45:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6bf56315b6ad05ea9af0319db29b919ed0332d2a671c5ba94ea325bd45ef5703\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019b
ee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:45:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e045b99328661616ea0e44cd50bd394a403836eede05459d117567c191401172\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:45:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://054b1d2565cc9690152740f71682028595283525344a38ccea66c1f072eae92b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:45:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\
\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d5e0ad88c2e55994f952b46c2e806792d8fcbd79a901810aef92e46067cc7b92\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d5e0ad88c2e55994f952b46c2e806792d8fcbd79a901810aef92e46067cc7b92\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T06:45:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T06:45:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://22796f78d3d551f1ee271ca8581e196f142e70622944154f7d408a88c098f53b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://22796f78d3d551f1ee271ca8581e196f142e70622944154f7d408a88c098f53b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T06:45:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T06:45:16Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://9f
b973559072f07252dcf50bda74d422ea2ed50000c02105381f8d21e5ff9888\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9fb973559072f07252dcf50bda74d422ea2ed50000c02105381f8d21e5ff9888\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T06:45:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T06:45:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T06:45:15Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:46:07Z is after 2025-08-24T17:21:41Z" Dec 03 06:46:07 crc kubenswrapper[4475]: I1203 06:46:07.737064 4475 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-dqbgx" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9ef36226-4b8b-4a7b-a87f-daa9dda6e70b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5dc78fa3b07b9a5535f697323e9ed322ceefdc8798157160a05eb71017ac3a29\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:45:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wjjp4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T06:45:38Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-dqbgx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:46:07Z is after 2025-08-24T17:21:41Z" Dec 03 06:46:07 crc kubenswrapper[4475]: I1203 06:46:07.745602 4475 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-pcw7j" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8c1979d0-303c-4cf6-9087-3cb2e1aac73b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://eebaa73cf4e1efd781b258dd26910dc004392716180b14a7e64e89a03f2032a1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-d
ev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:45:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nrf7r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T06:45:34Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-pcw7j\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:46:07Z is after 2025-08-24T17:21:41Z" Dec 03 06:46:07 crc kubenswrapper[4475]: I1203 06:46:07.759915 4475 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-9b2j8" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f3a17c67-95e0-4889-8a30-64c08b6720f4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d2d627e2c307a8db9c86e8020f2b1c25c6e061e0c6460be63e231566488beaca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:45:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6pdk6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T06:45:34Z\\\"}}\" for pod \"openshift-multus\"/\"multus-9b2j8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:46:07Z is after 2025-08-24T17:21:41Z" Dec 03 06:46:07 crc kubenswrapper[4475]: I1203 06:46:07.762428 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:46:07 crc 
kubenswrapper[4475]: I1203 06:46:07.762478 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:46:07 crc kubenswrapper[4475]: I1203 06:46:07.762487 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:46:07 crc kubenswrapper[4475]: I1203 06:46:07.762499 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:46:07 crc kubenswrapper[4475]: I1203 06:46:07.762508 4475 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:46:07Z","lastTransitionTime":"2025-12-03T06:46:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 06:46:07 crc kubenswrapper[4475]: I1203 06:46:07.775668 4475 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-g9t4l" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8f42839e-dbc4-445a-a15b-c3aa14813958\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:35Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:35Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://53948489397bbbfdf5f766211088d7f12fcd2dfbc8c3da6493e5abc49e3b41f5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:45:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ppdm2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a5090474cca8b8e2ed539ea74377506638d300be7eb750b3f3285477d8c9a375\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:45:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ppdm2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://60d3ec7cab1f249e81ae1db9ab97fa02e8b3c9d8376af4c6682dc3fc6f9d6d92\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:45:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ppdm2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b3243c863a4fb593b39fc3e3b835f647e9373d8b2dec69c5ff7657ed73c8f78a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d20994829
19d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:45:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ppdm2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://32897756f3658fda95db77180a0553a9d8656ed49c3ae5a017d32f5c5133a5a9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:45:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ppdm2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5e288f95676d5823cd3cb005318489d2f629a8fb74ad17ce6a67978d76006192\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cd
d47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:45:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ppdm2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dbbaea5df5db7406137d9fe054e2abd7fbb765809c6aa804a531d4d0f7c8328e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6ab275be2e84c1b20f69d740d454b4916d2fb2af864c685198786088b835b49c\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-03T06:45:44Z\\\",\\\"message\\\":\\\"ault: 
[]services.LB{services.LB{Name:\\\\\\\"Service_openshift-apiserver/check-endpoints_TCP_cluster\\\\\\\", UUID:\\\\\\\"\\\\\\\", Protocol:\\\\\\\"TCP\\\\\\\", ExternalIDs:map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-apiserver/check-endpoints\\\\\\\"}, Opts:services.LBOpts{Reject:true, EmptyLBEvents:false, AffinityTimeOut:0, SkipSNAT:false, Template:false, AddressFamily:\\\\\\\"\\\\\\\"}, Rules:[]services.LBRule{services.LBRule{Source:services.Addr{IP:\\\\\\\"10.217.5.139\\\\\\\", Port:17698, Template:(*services.Template)(nil)}, Targets:[]services.Addr{}}}, Templates:services.TemplateMap(nil), Switches:[]string{}, Routers:[]string{}, Groups:[]string{\\\\\\\"clusterLBGroup\\\\\\\"}}}\\\\nF1203 06:45:44.234950 5806 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify 
certifi\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-03T06:45:43Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:46:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"na
me\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ppdm2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://66a9c7568957099255bc910496da695e2af0122f2c853c3e221c666d7c2dee78\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:45:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ppdm2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://400610ebcdc7d47ecc1345287847a1909871411a12cdb3cbf895e05039b81c2b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"re
ady\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://400610ebcdc7d47ecc1345287847a1909871411a12cdb3cbf895e05039b81c2b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T06:45:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T06:45:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ppdm2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T06:45:35Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-g9t4l\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:46:07Z is after 2025-08-24T17:21:41Z" Dec 03 06:46:07 crc kubenswrapper[4475]: I1203 06:46:07.783008 4475 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6b24b6a4-c126-4d6d-88ae-b270b4743110\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:46:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:46:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://32947be5ce5a85090284dbc3edd8ad437495db9f0b4a7310656e38ecf5c649de\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:45:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0f26dedc0507a8675c0dc842b67772e84b5276713808e656bcf620ebb7bd3f5c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:45:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d17d8916a080f159b25abbfd9575bdc197c58bf256dbeb6367e74368f5b7f1fc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:45:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b9ea750eb608c854e92aa32dfc7d2085a0c00c3554368c7119487e4a730fdc1c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://b9ea750eb608c854e92aa32dfc7d2085a0c00c3554368c7119487e4a730fdc1c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T06:45:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T06:45:16Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T06:45:15Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:46:07Z is after 2025-08-24T17:21:41Z" Dec 03 06:46:07 crc kubenswrapper[4475]: I1203 06:46:07.791736 4475 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:33Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:33Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2f651c16a4a98ff0a9b4783e60ece4c410d5fcb7d05ad42bf7842d8bb8a99f27\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:45:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-12-03T06:46:07Z is after 2025-08-24T17:21:41Z" Dec 03 06:46:07 crc kubenswrapper[4475]: I1203 06:46:07.799438 4475 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:33Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:33Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://30d9a05de148a1dbe0fa8f07bbc5f4f2c3cba395d686af03f2da63f8cdfe431c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:45:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"c
ri-o://07cf8d993193bca34b30ea77c473af45652fde6e73d0586efb78c14b9d003e22\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:45:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:46:07Z is after 2025-08-24T17:21:41Z" Dec 03 06:46:07 crc kubenswrapper[4475]: I1203 06:46:07.807287 4475 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:34Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:34Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6444fe7571ebb90d4ff4b30dc1a397023310b50b1816d0197cb545b4f5f7480f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:45:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-12-03T06:46:07Z is after 2025-08-24T17:21:41Z" Dec 03 06:46:07 crc kubenswrapper[4475]: I1203 06:46:07.815136 4475 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:32Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:32Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:46:07Z is after 2025-08-24T17:21:41Z" Dec 03 06:46:07 crc kubenswrapper[4475]: I1203 06:46:07.823043 4475 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-hq2rn" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7e9dd470-572a-4396-9be7-1a37e3c48977\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:47Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:47Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cg4hm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cg4hm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T06:45:47Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-hq2rn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:46:07Z is after 2025-08-24T17:21:41Z" Dec 03 06:46:07 crc 
kubenswrapper[4475]: I1203 06:46:07.831009 4475 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:32Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:32Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:46:07Z is after 2025-08-24T17:21:41Z" Dec 03 06:46:07 crc kubenswrapper[4475]: I1203 06:46:07.841262 4475 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-k9cmc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7168f008-1b03-40cf-94fa-a71d470454bf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://625db083ebf24244e0b28ac937bfa2554497ca35b8f7a1fee0ac739d647c70de\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:45:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s6bbq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://31584b054f88aa7f7e4f1096e2b11acf6f106b7f2e4ced19768808e5df1a6acc\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://31584b054f88aa7f7e4f1096e2b11acf6f106b7f2e4ced19768808e5df1a6acc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T06:45:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T06:45:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s6bbq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9a644e827feb786d7298e41022ef3bc0d2483279c106dddea8e2c7a3c62c3c0d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9a644e827feb786d7298e41022ef3bc0d2483279c106dddea8e2c7a3c62c3c0d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T06:45:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T06:45:36Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s6bbq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://742f2f4dc23fff3df8e6d67902ef721b3db1823653b11a69faabdaf8d7650667\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://742f2f4dc23fff3df8e6d67902ef721b3db1823653b11a69faabdaf8d7650667\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T06:45:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T06:45:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s6bbq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2e5e8
74d26bf8bc806d74d55a8b9306cc30cca122d2ae0731b0a76ae7ac30450\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2e5e874d26bf8bc806d74d55a8b9306cc30cca122d2ae0731b0a76ae7ac30450\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T06:45:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T06:45:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s6bbq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d644ab44eabce045c9f9b23fab29e574e2f9f49c0cc14b830560996a0ec98880\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d644ab44eabce045c9f9b23fab29e574e2f9f49c0cc14b830560996a0ec98880\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T06:45:39Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2025-12-03T06:45:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s6bbq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ae9935e33badff0b016f8b5a02cb59d8b64451364581023ca3ec8e87fba0aa3a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ae9935e33badff0b016f8b5a02cb59d8b64451364581023ca3ec8e87fba0aa3a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T06:45:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T06:45:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s6bbq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T06:45:35Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-k9cmc\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:46:07Z is after 2025-08-24T17:21:41Z" Dec 03 06:46:07 crc kubenswrapper[4475]: I1203 06:46:07.849782 4475 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"57d2f580-9528-4200-b0a4-797fed1ae972\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://822cdbfb2e81d80c5de0253daa42f2a5c89e9cd0eb8a5c3cf620780d17f9a6d0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:45:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\"
:\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d66a9136874b2e25c94cd291aa6d7f4694ac409f16766fd69c8aab8068a441fb\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:45:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e40c4f29925f494c0f5f01e2ecbcd2e4db2a5f3911a55a874c6d0006f01982de\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:45:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7fc0ee9e5a408a0a9e701afaf1db7bc3f58fd1830044730e9c680664642b5e4e\\\",\\\"image\\\":\\\"quay.io/crcont/
openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:45:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1dd8bd42f01469966b55416fc8af1dd71d341c774263bb3a56190af4cd9e7daa\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:45:17Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5da1155d7b5e933e5db3acc4c1a3fa1b3b90fd79289641f9a3d1290956128628\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\
\\"containerID\\\":\\\"cri-o://5da1155d7b5e933e5db3acc4c1a3fa1b3b90fd79289641f9a3d1290956128628\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T06:45:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T06:45:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T06:45:15Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:46:07Z is after 2025-08-24T17:21:41Z" Dec 03 06:46:07 crc kubenswrapper[4475]: I1203 06:46:07.858746 4475 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a0af3d80-5aae-4d3b-a974-490687df49f3\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fa848c68a20d5db5c603cafa808518de84e427cbeea4bbc1be31151e6f839b3e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:45:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fe3e0d5fed18fddd7a1174f7a9f12290ce318e9a0de40fe432c79f6f2e24a608\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:45:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://43c05977da6544bc781a279fcddb3279dfee510fdd0a6f4f1a22b8629f17475f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:45:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ef987b2e9a0fa630edf6d5c06d5f47c5debd1b75d4626aefe7d8ef44bb974eb8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-12-03T06:45:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T06:45:15Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:46:07Z is after 2025-08-24T17:21:41Z" Dec 03 06:46:07 crc kubenswrapper[4475]: I1203 06:46:07.864080 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:46:07 crc kubenswrapper[4475]: I1203 06:46:07.864105 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:46:07 crc kubenswrapper[4475]: I1203 06:46:07.864112 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:46:07 crc kubenswrapper[4475]: I1203 06:46:07.864125 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:46:07 crc kubenswrapper[4475]: I1203 06:46:07.864133 4475 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:46:07Z","lastTransitionTime":"2025-12-03T06:46:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns 
error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 06:46:07 crc kubenswrapper[4475]: I1203 06:46:07.866831 4475 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:32Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:32Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:46:07Z is after 2025-08-24T17:21:41Z" Dec 03 06:46:07 crc kubenswrapper[4475]: I1203 06:46:07.874150 4475 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-tjbzg" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"91aee7be-4a52-4598-803f-2deebe0674de\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f13f644093fd1214d8fb39853857b4113dd7fde64f1a60ff6848fd4c5350f5b6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:45:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xvqvg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://159d103ae2d5d19ea94c57a59b534773f0e32f4c
b379a412b63ca743e221096e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:45:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xvqvg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T06:45:35Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-tjbzg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:46:07Z is after 2025-08-24T17:21:41Z" Dec 03 06:46:07 crc kubenswrapper[4475]: I1203 06:46:07.881115 4475 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-sbkp5" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b1df0a77-f3cc-49ab-9fbb-8a4c7608291b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5938dd3c72bee55a3a07312d31a0eaf2df226bb931b07300d71b6e7ff69c905b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:45:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-65wzb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4408ad7b7f122c0364b95e0e9761bc28dfb02
e7ea00537a70fc031c16b38be6b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:45:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-65wzb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T06:45:46Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-sbkp5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:46:07Z is after 2025-08-24T17:21:41Z" Dec 03 06:46:07 crc kubenswrapper[4475]: I1203 06:46:07.966174 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:46:07 crc kubenswrapper[4475]: I1203 06:46:07.966207 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:46:07 crc kubenswrapper[4475]: I1203 06:46:07.966216 4475 kubelet_node_status.go:724] "Recording 
event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:46:07 crc kubenswrapper[4475]: I1203 06:46:07.966228 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:46:07 crc kubenswrapper[4475]: I1203 06:46:07.966238 4475 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:46:07Z","lastTransitionTime":"2025-12-03T06:46:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 06:46:08 crc kubenswrapper[4475]: I1203 06:46:08.068011 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:46:08 crc kubenswrapper[4475]: I1203 06:46:08.068045 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:46:08 crc kubenswrapper[4475]: I1203 06:46:08.068054 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:46:08 crc kubenswrapper[4475]: I1203 06:46:08.068067 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:46:08 crc kubenswrapper[4475]: I1203 06:46:08.068075 4475 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:46:08Z","lastTransitionTime":"2025-12-03T06:46:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:46:08 crc kubenswrapper[4475]: I1203 06:46:08.169726 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:46:08 crc kubenswrapper[4475]: I1203 06:46:08.169756 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:46:08 crc kubenswrapper[4475]: I1203 06:46:08.169764 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:46:08 crc kubenswrapper[4475]: I1203 06:46:08.169777 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:46:08 crc kubenswrapper[4475]: I1203 06:46:08.169785 4475 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:46:08Z","lastTransitionTime":"2025-12-03T06:46:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:46:08 crc kubenswrapper[4475]: I1203 06:46:08.270876 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:46:08 crc kubenswrapper[4475]: I1203 06:46:08.270900 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:46:08 crc kubenswrapper[4475]: I1203 06:46:08.270908 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:46:08 crc kubenswrapper[4475]: I1203 06:46:08.270925 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:46:08 crc kubenswrapper[4475]: I1203 06:46:08.270933 4475 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:46:08Z","lastTransitionTime":"2025-12-03T06:46:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:46:08 crc kubenswrapper[4475]: I1203 06:46:08.373132 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:46:08 crc kubenswrapper[4475]: I1203 06:46:08.373179 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:46:08 crc kubenswrapper[4475]: I1203 06:46:08.373188 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:46:08 crc kubenswrapper[4475]: I1203 06:46:08.373200 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:46:08 crc kubenswrapper[4475]: I1203 06:46:08.373208 4475 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:46:08Z","lastTransitionTime":"2025-12-03T06:46:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:46:08 crc kubenswrapper[4475]: I1203 06:46:08.474955 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:46:08 crc kubenswrapper[4475]: I1203 06:46:08.474982 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:46:08 crc kubenswrapper[4475]: I1203 06:46:08.474990 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:46:08 crc kubenswrapper[4475]: I1203 06:46:08.475002 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:46:08 crc kubenswrapper[4475]: I1203 06:46:08.475009 4475 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:46:08Z","lastTransitionTime":"2025-12-03T06:46:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 06:46:08 crc kubenswrapper[4475]: I1203 06:46:08.490189 4475 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 03 06:46:08 crc kubenswrapper[4475]: I1203 06:46:08.490327 4475 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 03 06:46:08 crc kubenswrapper[4475]: E1203 06:46:08.490463 4475 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 03 06:46:08 crc kubenswrapper[4475]: I1203 06:46:08.490487 4475 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 03 06:46:08 crc kubenswrapper[4475]: E1203 06:46:08.490641 4475 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 03 06:46:08 crc kubenswrapper[4475]: E1203 06:46:08.490763 4475 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 03 06:46:08 crc kubenswrapper[4475]: I1203 06:46:08.576992 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:46:08 crc kubenswrapper[4475]: I1203 06:46:08.577018 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:46:08 crc kubenswrapper[4475]: I1203 06:46:08.577026 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:46:08 crc kubenswrapper[4475]: I1203 06:46:08.577036 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:46:08 crc kubenswrapper[4475]: I1203 06:46:08.577043 4475 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:46:08Z","lastTransitionTime":"2025-12-03T06:46:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:46:08 crc kubenswrapper[4475]: I1203 06:46:08.679187 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:46:08 crc kubenswrapper[4475]: I1203 06:46:08.679211 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:46:08 crc kubenswrapper[4475]: I1203 06:46:08.679219 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:46:08 crc kubenswrapper[4475]: I1203 06:46:08.679233 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:46:08 crc kubenswrapper[4475]: I1203 06:46:08.679241 4475 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:46:08Z","lastTransitionTime":"2025-12-03T06:46:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:46:08 crc kubenswrapper[4475]: I1203 06:46:08.707370 4475 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-g9t4l_8f42839e-dbc4-445a-a15b-c3aa14813958/ovnkube-controller/2.log" Dec 03 06:46:08 crc kubenswrapper[4475]: I1203 06:46:08.707873 4475 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-g9t4l_8f42839e-dbc4-445a-a15b-c3aa14813958/ovnkube-controller/1.log" Dec 03 06:46:08 crc kubenswrapper[4475]: I1203 06:46:08.709751 4475 generic.go:334] "Generic (PLEG): container finished" podID="8f42839e-dbc4-445a-a15b-c3aa14813958" containerID="dbbaea5df5db7406137d9fe054e2abd7fbb765809c6aa804a531d4d0f7c8328e" exitCode=1 Dec 03 06:46:08 crc kubenswrapper[4475]: I1203 06:46:08.709767 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-g9t4l" event={"ID":"8f42839e-dbc4-445a-a15b-c3aa14813958","Type":"ContainerDied","Data":"dbbaea5df5db7406137d9fe054e2abd7fbb765809c6aa804a531d4d0f7c8328e"} Dec 03 06:46:08 crc kubenswrapper[4475]: I1203 06:46:08.709798 4475 scope.go:117] "RemoveContainer" containerID="6ab275be2e84c1b20f69d740d454b4916d2fb2af864c685198786088b835b49c" Dec 03 06:46:08 crc kubenswrapper[4475]: I1203 06:46:08.710364 4475 scope.go:117] "RemoveContainer" containerID="dbbaea5df5db7406137d9fe054e2abd7fbb765809c6aa804a531d4d0f7c8328e" Dec 03 06:46:08 crc kubenswrapper[4475]: E1203 06:46:08.710856 4475 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 20s restarting failed container=ovnkube-controller pod=ovnkube-node-g9t4l_openshift-ovn-kubernetes(8f42839e-dbc4-445a-a15b-c3aa14813958)\"" pod="openshift-ovn-kubernetes/ovnkube-node-g9t4l" podUID="8f42839e-dbc4-445a-a15b-c3aa14813958" Dec 03 06:46:08 crc kubenswrapper[4475]: I1203 06:46:08.724201 4475 status_manager.go:875] "Failed to update 
status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"897f1a97-930a-4c3c-8804-d7cd6006ae9c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fbb015d3e05f9f94fc225cce6e24bc4a5df0bfc5aaea15fe120e2cc4b8f02902\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:45:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://da747a5ea4f790c71d99693c4bd79a1074f756a20f628fa63e8ba
d9a713645fc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:45:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6bf56315b6ad05ea9af0319db29b919ed0332d2a671c5ba94ea325bd45ef5703\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:45:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e045b99328661616ea0e44cd50bd394a403836eede05459d117567c191401172\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"re
ady\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:45:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://054b1d2565cc9690152740f71682028595283525344a38ccea66c1f072eae92b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:45:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d5e0ad88c2e55994f952b46c2e806792d8fcbd79a901810aef92e46067cc7b92\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d5e0ad88c2e55994f952b46c2e806
792d8fcbd79a901810aef92e46067cc7b92\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T06:45:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T06:45:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://22796f78d3d551f1ee271ca8581e196f142e70622944154f7d408a88c098f53b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://22796f78d3d551f1ee271ca8581e196f142e70622944154f7d408a88c098f53b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T06:45:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T06:45:16Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://9fb973559072f07252dcf50bda74d422ea2ed50000c02105381f8d21e5ff9888\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9fb973559072f07252dcf50bda74d422ea2ed50000c02105381f8d21e5ff9888\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T06:45:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T06:45:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}
,{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T06:45:15Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:46:08Z is after 2025-08-24T17:21:41Z" Dec 03 06:46:08 crc kubenswrapper[4475]: I1203 06:46:08.731007 4475 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-dqbgx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9ef36226-4b8b-4a7b-a87f-daa9dda6e70b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5dc78fa3b07b9a5535f697323e9ed322ceefdc8798157160a05eb71017ac3a29\\\",\\\"image\\\":\\\"quay.io/openshif
t-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:45:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wjjp4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T06:45:38Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-dqbgx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:46:08Z is after 2025-08-24T17:21:41Z" Dec 03 06:46:08 crc kubenswrapper[4475]: I1203 06:46:08.738535 4475 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:32Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:32Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:46:08Z is after 2025-08-24T17:21:41Z" Dec 03 06:46:08 crc kubenswrapper[4475]: I1203 06:46:08.745049 4475 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-pcw7j" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8c1979d0-303c-4cf6-9087-3cb2e1aac73b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://eebaa73cf4e1efd781b258dd26910dc004392716180b14a7e64e89a03f2032a1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:45:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nrf7r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T06:45:34Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-pcw7j\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:46:08Z is after 2025-08-24T17:21:41Z" Dec 03 06:46:08 crc kubenswrapper[4475]: I1203 06:46:08.753275 4475 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-9b2j8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f3a17c67-95e0-4889-8a30-64c08b6720f4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d2d627e2c307a8db9c86e8020f2b1c25c6e061e0c6460be63e231566488beaca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\
\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:45:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6pdk6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\
\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T06:45:34Z\\\"}}\" for pod \"openshift-multus\"/\"multus-9b2j8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:46:08Z is after 2025-08-24T17:21:41Z" Dec 03 06:46:08 crc kubenswrapper[4475]: I1203 06:46:08.764864 4475 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-g9t4l" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8f42839e-dbc4-445a-a15b-c3aa14813958\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:35Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:35Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://53948489397bbbfdf5f766211088d7f12fcd2dfbc8c3da6493e5abc49e3b41f5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:45:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ppdm2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a5090474cca8b8e2ed539ea74377506638d300be7eb750b3f3285477d8c9a375\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:45:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ppdm2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://60d3ec7cab1f249e81ae1db9ab97fa02e8b3c9d8376af4c6682dc3fc6f9d6d92\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:45:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ppdm2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b3243c863a4fb593b39fc3e3b835f647e9373d8b2dec69c5ff7657ed73c8f78a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:45:36Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ppdm2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://32897756f3658fda95db77180a0553a9d8656ed49c3ae5a017d32f5c5133a5a9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:45:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ppdm2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5e288f95676d5823cd3cb005318489d2f629a8fb74ad17ce6a67978d76006192\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:45:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ppdm2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dbbaea5df5db7406137d9fe054e2abd7fbb765809c6aa804a531d4d0f7c8328e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6ab275be2e84c1b20f69d740d454b4916d2fb2af864c685198786088b835b49c\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-03T06:45:44Z\\\",\\\"message\\\":\\\"ault: []services.LB{services.LB{Name:\\\\\\\"Service_openshift-apiserver/check-endpoints_TCP_cluster\\\\\\\", UUID:\\\\\\\"\\\\\\\", Protocol:\\\\\\\"TCP\\\\\\\", ExternalIDs:map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-apiserver/check-endpoints\\\\\\\"}, 
Opts:services.LBOpts{Reject:true, EmptyLBEvents:false, AffinityTimeOut:0, SkipSNAT:false, Template:false, AddressFamily:\\\\\\\"\\\\\\\"}, Rules:[]services.LBRule{services.LBRule{Source:services.Addr{IP:\\\\\\\"10.217.5.139\\\\\\\", Port:17698, Template:(*services.Template)(nil)}, Targets:[]services.Addr{}}}, Templates:services.TemplateMap(nil), Switches:[]string{}, Routers:[]string{}, Groups:[]string{\\\\\\\"clusterLBGroup\\\\\\\"}}}\\\\nF1203 06:45:44.234950 5806 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certifi\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-03T06:45:43Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dbbaea5df5db7406137d9fe054e2abd7fbb765809c6aa804a531d4d0f7c8328e\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-03T06:46:08Z\\\",\\\"message\\\":\\\"gins-k9cmc\\\\nI1203 06:46:08.075193 6119 obj_retry.go:365] Adding new object: *v1.Pod openshift-multus/multus-additional-cni-plugins-k9cmc\\\\nI1203 06:46:08.075199 6119 ovn.go:134] Ensuring zone local for Pod openshift-multus/multus-additional-cni-plugins-k9cmc in node crc\\\\nI1203 06:46:08.075203 6119 obj_retry.go:386] Retry successful for *v1.Pod 
openshift-multus/multus-additional-cni-plugins-k9cmc after 0 failed attempt(s)\\\\nI1203 06:46:08.075207 6119 default_network_controller.go:776] Recording success event on pod openshift-multus/multus-additional-cni-plugins-k9cmc\\\\nI1203 06:46:08.075218 6119 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nI1203 06:46:08.075238 6119 loadbalancer.go:304] Deleted 0 stale LBs for map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-marketplace/redhat-marketplace\\\\\\\"}\\\\nI1203 06:46:08.075253 6119 services_controller.go:360] Finished syncing service redhat-marketplace on namespace openshift-marketplace for network=default : 1.255581ms\\\\nF1203 06:46:08.075260 6119 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 
0x1fcc3\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-03T06:46:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":
\\\"kube-api-access-ppdm2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://66a9c7568957099255bc910496da695e2af0122f2c853c3e221c666d7c2dee78\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:45:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ppdm2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://400610ebcdc7d47ecc1345287847a1909871411a12cdb3cbf895e05039b81c2b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://400610ebcdc7d47ecc1345287847a1909871411a12cdb3cbf895e05039b8
1c2b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T06:45:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T06:45:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ppdm2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T06:45:35Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-g9t4l\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:46:08Z is after 2025-08-24T17:21:41Z" Dec 03 06:46:08 crc kubenswrapper[4475]: I1203 06:46:08.771622 4475 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6b24b6a4-c126-4d6d-88ae-b270b4743110\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:46:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:46:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://32947be5ce5a85090284dbc3edd8ad437495db9f0b4a7310656e38ecf5c649de\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:45:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0f26dedc0507a8675c0dc842b67772e84b5276713808e656bcf620ebb7bd3f5c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:45:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d17d8916a080f159b25abbfd9575bdc197c58bf256dbeb6367e74368f5b7f1fc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:45:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b9ea750eb608c854e92aa32dfc7d2085a0c00c3554368c7119487e4a730fdc1c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://b9ea750eb608c854e92aa32dfc7d2085a0c00c3554368c7119487e4a730fdc1c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T06:45:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T06:45:16Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T06:45:15Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:46:08Z is after 2025-08-24T17:21:41Z" Dec 03 06:46:08 crc kubenswrapper[4475]: I1203 06:46:08.780827 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:46:08 crc kubenswrapper[4475]: I1203 06:46:08.780874 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:46:08 crc kubenswrapper[4475]: I1203 06:46:08.780884 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:46:08 crc kubenswrapper[4475]: I1203 06:46:08.780897 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:46:08 crc kubenswrapper[4475]: I1203 06:46:08.780906 4475 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:46:08Z","lastTransitionTime":"2025-12-03T06:46:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:46:08 crc kubenswrapper[4475]: I1203 06:46:08.781735 4475 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:33Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:33Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2f651c16a4a98ff0a9b4783e60ece4c410d5fcb7d05ad42bf7842d8bb8a99f27\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:45:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod 
\"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:46:08Z is after 2025-08-24T17:21:41Z" Dec 03 06:46:08 crc kubenswrapper[4475]: I1203 06:46:08.790125 4475 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:33Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:33Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://30d9a05de148a1dbe0fa8f07bbc5f4f2c3cba395d686af03f2da63f8cdfe431c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:45:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-id
entity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://07cf8d993193bca34b30ea77c473af45652fde6e73d0586efb78c14b9d003e22\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:45:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:46:08Z is after 2025-08-24T17:21:41Z" Dec 03 06:46:08 crc kubenswrapper[4475]: I1203 06:46:08.798071 4475 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:34Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:34Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6444fe7571ebb90d4ff4b30dc1a397023310b50b1816d0197cb545b4f5f7480f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:45:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-12-03T06:46:08Z is after 2025-08-24T17:21:41Z" Dec 03 06:46:08 crc kubenswrapper[4475]: I1203 06:46:08.805276 4475 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-hq2rn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7e9dd470-572a-4396-9be7-1a37e3c48977\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:47Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:47Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cg4hm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cg4hm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T06:45:47Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-hq2rn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:46:08Z is after 2025-08-24T17:21:41Z" Dec 03 06:46:08 crc 
kubenswrapper[4475]: I1203 06:46:08.812985 4475 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:32Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:32Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:46:08Z is after 2025-08-24T17:21:41Z" Dec 03 06:46:08 crc kubenswrapper[4475]: I1203 06:46:08.821930 4475 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-k9cmc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7168f008-1b03-40cf-94fa-a71d470454bf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://625db083ebf24244e0b28ac937bfa2554497ca35b8f7a1fee0ac739d647c70de\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:45:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s6bbq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://31584b054f88aa7f7e4f1096e2b11acf6f106b7f2e4ced19768808e5df1a6acc\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://31584b054f88aa7f7e4f1096e2b11acf6f106b7f2e4ced19768808e5df1a6acc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T06:45:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T06:45:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s6bbq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9a644e827feb786d7298e41022ef3bc0d2483279c106dddea8e2c7a3c62c3c0d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9a644e827feb786d7298e41022ef3bc0d2483279c106dddea8e2c7a3c62c3c0d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T06:45:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T06:45:36Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s6bbq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://742f2f4dc23fff3df8e6d67902ef721b3db1823653b11a69faabdaf8d7650667\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://742f2f4dc23fff3df8e6d67902ef721b3db1823653b11a69faabdaf8d7650667\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T06:45:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T06:45:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s6bbq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2e5e8
74d26bf8bc806d74d55a8b9306cc30cca122d2ae0731b0a76ae7ac30450\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2e5e874d26bf8bc806d74d55a8b9306cc30cca122d2ae0731b0a76ae7ac30450\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T06:45:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T06:45:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s6bbq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d644ab44eabce045c9f9b23fab29e574e2f9f49c0cc14b830560996a0ec98880\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d644ab44eabce045c9f9b23fab29e574e2f9f49c0cc14b830560996a0ec98880\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T06:45:39Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2025-12-03T06:45:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s6bbq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ae9935e33badff0b016f8b5a02cb59d8b64451364581023ca3ec8e87fba0aa3a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ae9935e33badff0b016f8b5a02cb59d8b64451364581023ca3ec8e87fba0aa3a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T06:45:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T06:45:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s6bbq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T06:45:35Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-k9cmc\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:46:08Z is after 2025-08-24T17:21:41Z" Dec 03 06:46:08 crc kubenswrapper[4475]: I1203 06:46:08.828967 4475 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-sbkp5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b1df0a77-f3cc-49ab-9fbb-8a4c7608291b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5938dd3c72bee55a3a07312d31a0eaf2df226bb931b07300d71b6e7ff69c905b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"ru
nning\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:45:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-65wzb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4408ad7b7f122c0364b95e0e9761bc28dfb02e7ea00537a70fc031c16b38be6b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:45:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-65wzb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T06:45:46Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-sbkp5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current 
time 2025-12-03T06:46:08Z is after 2025-08-24T17:21:41Z" Dec 03 06:46:08 crc kubenswrapper[4475]: I1203 06:46:08.837316 4475 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"57d2f580-9528-4200-b0a4-797fed1ae972\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://822cdbfb2e81d80c5de0253daa42f2a5c89e9cd0eb8a5c3cf620780d17f9a6d0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:45:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"
name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d66a9136874b2e25c94cd291aa6d7f4694ac409f16766fd69c8aab8068a441fb\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:45:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e40c4f29925f494c0f5f01e2ecbcd2e4db2a5f3911a55a874c6d0006f01982de\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:45:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7fc0ee9e5a408a0a9e701afaf1db7bc3f58fd1830044730e9c680664642b5e4e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc
478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:45:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1dd8bd42f01469966b55416fc8af1dd71d341c774263bb3a56190af4cd9e7daa\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:45:17Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5da1155d7b5e933e5db3acc4c1a3fa1b3b90fd79289641f9a3d1290956128628\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5da1155d7b5e933e5db3acc4c1a3fa1b3b90fd79289641f9a3d1290956128628\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T06:45:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-
12-03T06:45:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T06:45:15Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:46:08Z is after 2025-08-24T17:21:41Z" Dec 03 06:46:08 crc kubenswrapper[4475]: I1203 06:46:08.846143 4475 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a0af3d80-5aae-4d3b-a974-490687df49f3\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fa848c68a20d5db5c603cafa808518de84e427cbeea4bbc1be31151e6f839b3e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee8
8051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:45:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fe3e0d5fed18fddd7a1174f7a9f12290ce318e9a0de40fe432c79f6f2e24a608\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:45:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://43c05977da6544bc781a279fcddb3279dfee510fdd0a6f4f1a22b8629f17475f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"r
unning\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:45:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ef987b2e9a0fa630edf6d5c06d5f47c5debd1b75d4626aefe7d8ef44bb974eb8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:45:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T06:45:15Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:46:08Z is after 2025-08-24T17:21:41Z" Dec 03 06:46:08 crc kubenswrapper[4475]: I1203 06:46:08.853682 4475 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:32Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:32Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:46:08Z is after 2025-08-24T17:21:41Z" Dec 03 06:46:08 crc kubenswrapper[4475]: I1203 06:46:08.860141 4475 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-tjbzg" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"91aee7be-4a52-4598-803f-2deebe0674de\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f13f644093fd1214d8fb39853857b4113dd7fde64f1a60ff6848fd4c5350f5b6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:45:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xvqvg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://159d103ae2d5d19ea94c57a59b534773f0e32f4c
b379a412b63ca743e221096e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:45:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xvqvg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T06:45:35Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-tjbzg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:46:08Z is after 2025-08-24T17:21:41Z" Dec 03 06:46:08 crc kubenswrapper[4475]: I1203 06:46:08.882257 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:46:08 crc kubenswrapper[4475]: I1203 06:46:08.882287 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:46:08 crc kubenswrapper[4475]: I1203 06:46:08.882296 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:46:08 crc 
kubenswrapper[4475]: I1203 06:46:08.882309 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:46:08 crc kubenswrapper[4475]: I1203 06:46:08.882318 4475 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:46:08Z","lastTransitionTime":"2025-12-03T06:46:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 06:46:08 crc kubenswrapper[4475]: I1203 06:46:08.983550 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:46:08 crc kubenswrapper[4475]: I1203 06:46:08.983577 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:46:08 crc kubenswrapper[4475]: I1203 06:46:08.983586 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:46:08 crc kubenswrapper[4475]: I1203 06:46:08.983596 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:46:08 crc kubenswrapper[4475]: I1203 06:46:08.983604 4475 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:46:08Z","lastTransitionTime":"2025-12-03T06:46:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:46:09 crc kubenswrapper[4475]: I1203 06:46:09.085778 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:46:09 crc kubenswrapper[4475]: I1203 06:46:09.085820 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:46:09 crc kubenswrapper[4475]: I1203 06:46:09.085829 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:46:09 crc kubenswrapper[4475]: I1203 06:46:09.085841 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:46:09 crc kubenswrapper[4475]: I1203 06:46:09.085850 4475 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:46:09Z","lastTransitionTime":"2025-12-03T06:46:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:46:09 crc kubenswrapper[4475]: I1203 06:46:09.187851 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:46:09 crc kubenswrapper[4475]: I1203 06:46:09.187890 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:46:09 crc kubenswrapper[4475]: I1203 06:46:09.187898 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:46:09 crc kubenswrapper[4475]: I1203 06:46:09.187911 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:46:09 crc kubenswrapper[4475]: I1203 06:46:09.187920 4475 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:46:09Z","lastTransitionTime":"2025-12-03T06:46:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:46:09 crc kubenswrapper[4475]: I1203 06:46:09.290083 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:46:09 crc kubenswrapper[4475]: I1203 06:46:09.290109 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:46:09 crc kubenswrapper[4475]: I1203 06:46:09.290117 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:46:09 crc kubenswrapper[4475]: I1203 06:46:09.290129 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:46:09 crc kubenswrapper[4475]: I1203 06:46:09.290137 4475 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:46:09Z","lastTransitionTime":"2025-12-03T06:46:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:46:09 crc kubenswrapper[4475]: I1203 06:46:09.391821 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:46:09 crc kubenswrapper[4475]: I1203 06:46:09.391854 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:46:09 crc kubenswrapper[4475]: I1203 06:46:09.391864 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:46:09 crc kubenswrapper[4475]: I1203 06:46:09.391877 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:46:09 crc kubenswrapper[4475]: I1203 06:46:09.391885 4475 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:46:09Z","lastTransitionTime":"2025-12-03T06:46:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 06:46:09 crc kubenswrapper[4475]: I1203 06:46:09.491112 4475 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-hq2rn" Dec 03 06:46:09 crc kubenswrapper[4475]: E1203 06:46:09.491208 4475 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-hq2rn" podUID="7e9dd470-572a-4396-9be7-1a37e3c48977" Dec 03 06:46:09 crc kubenswrapper[4475]: I1203 06:46:09.493077 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:46:09 crc kubenswrapper[4475]: I1203 06:46:09.493098 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:46:09 crc kubenswrapper[4475]: I1203 06:46:09.493106 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:46:09 crc kubenswrapper[4475]: I1203 06:46:09.493115 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:46:09 crc kubenswrapper[4475]: I1203 06:46:09.493123 4475 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:46:09Z","lastTransitionTime":"2025-12-03T06:46:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:46:09 crc kubenswrapper[4475]: I1203 06:46:09.594922 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:46:09 crc kubenswrapper[4475]: I1203 06:46:09.594960 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:46:09 crc kubenswrapper[4475]: I1203 06:46:09.594971 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:46:09 crc kubenswrapper[4475]: I1203 06:46:09.594984 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:46:09 crc kubenswrapper[4475]: I1203 06:46:09.594994 4475 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:46:09Z","lastTransitionTime":"2025-12-03T06:46:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:46:09 crc kubenswrapper[4475]: I1203 06:46:09.696791 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:46:09 crc kubenswrapper[4475]: I1203 06:46:09.696835 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:46:09 crc kubenswrapper[4475]: I1203 06:46:09.696844 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:46:09 crc kubenswrapper[4475]: I1203 06:46:09.696856 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:46:09 crc kubenswrapper[4475]: I1203 06:46:09.696865 4475 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:46:09Z","lastTransitionTime":"2025-12-03T06:46:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:46:09 crc kubenswrapper[4475]: I1203 06:46:09.712586 4475 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-g9t4l_8f42839e-dbc4-445a-a15b-c3aa14813958/ovnkube-controller/2.log" Dec 03 06:46:09 crc kubenswrapper[4475]: I1203 06:46:09.714677 4475 scope.go:117] "RemoveContainer" containerID="dbbaea5df5db7406137d9fe054e2abd7fbb765809c6aa804a531d4d0f7c8328e" Dec 03 06:46:09 crc kubenswrapper[4475]: E1203 06:46:09.714789 4475 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 20s restarting failed container=ovnkube-controller pod=ovnkube-node-g9t4l_openshift-ovn-kubernetes(8f42839e-dbc4-445a-a15b-c3aa14813958)\"" pod="openshift-ovn-kubernetes/ovnkube-node-g9t4l" podUID="8f42839e-dbc4-445a-a15b-c3aa14813958" Dec 03 06:46:09 crc kubenswrapper[4475]: I1203 06:46:09.727990 4475 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"897f1a97-930a-4c3c-8804-d7cd6006ae9c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fbb015d3e05f9f94fc225cce6e24bc4a5df0bfc5aaea15fe120e2cc4b8f02902\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:45:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://da747a5ea4f790c71d99693c4bd79a1074f756a20f628fa63e8bad9a713645fc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:45:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6bf56315b6ad05ea9af0319db29b919ed0332d2a671c5ba94ea325bd45ef5703\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:45:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e045b99328661616ea0e44cd50bd394a403836eede05459d117567c191401172\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:45:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://054b1d2565cc9690152740f71682028595283525344a38ccea66c1f072eae92b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:45:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d5e0ad88c2e55994f952b46c2e806792d8fcbd79a901810aef92e46067cc7b92\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d5e0ad88c2e55994f952b46c2e806792d8fcbd79a901810aef92e46067cc7b92\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2025-12-03T06:45:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T06:45:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://22796f78d3d551f1ee271ca8581e196f142e70622944154f7d408a88c098f53b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://22796f78d3d551f1ee271ca8581e196f142e70622944154f7d408a88c098f53b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T06:45:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T06:45:16Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://9fb973559072f07252dcf50bda74d422ea2ed50000c02105381f8d21e5ff9888\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9fb973559072f07252dcf50bda74d422ea2ed50000c02105381f8d21e5ff9888\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T06:45:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T06:45:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T06:45:15Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:46:09Z is after 2025-08-24T17:21:41Z" Dec 03 06:46:09 crc kubenswrapper[4475]: I1203 06:46:09.734526 4475 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-dqbgx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9ef36226-4b8b-4a7b-a87f-daa9dda6e70b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5dc78fa3b07b9a5535f697323e9ed322ceefdc8798157160a05eb71017ac3a29\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235
da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:45:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wjjp4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T06:45:38Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-dqbgx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:46:09Z is after 2025-08-24T17:21:41Z" Dec 03 06:46:09 crc kubenswrapper[4475]: I1203 06:46:09.742849 4475 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:33Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:33Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2f651c16a4a98ff0a9b4783e60ece4c410d5fcb7d05ad42bf7842d8bb8a99f27\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:45:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-12-03T06:46:09Z is after 2025-08-24T17:21:41Z" Dec 03 06:46:09 crc kubenswrapper[4475]: I1203 06:46:09.750400 4475 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:33Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:33Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://30d9a05de148a1dbe0fa8f07bbc5f4f2c3cba395d686af03f2da63f8cdfe431c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:45:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"c
ri-o://07cf8d993193bca34b30ea77c473af45652fde6e73d0586efb78c14b9d003e22\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:45:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:46:09Z is after 2025-08-24T17:21:41Z" Dec 03 06:46:09 crc kubenswrapper[4475]: I1203 06:46:09.757424 4475 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:34Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:34Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6444fe7571ebb90d4ff4b30dc1a397023310b50b1816d0197cb545b4f5f7480f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:45:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-12-03T06:46:09Z is after 2025-08-24T17:21:41Z" Dec 03 06:46:09 crc kubenswrapper[4475]: I1203 06:46:09.764971 4475 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:32Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:32Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:46:09Z is after 2025-08-24T17:21:41Z" Dec 03 06:46:09 crc kubenswrapper[4475]: I1203 06:46:09.771107 4475 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-pcw7j" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8c1979d0-303c-4cf6-9087-3cb2e1aac73b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://eebaa73cf4e1efd781b258dd26910dc004392716180b14a7e64e89a03f2032a1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:45:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nrf7r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T06:45:34Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-pcw7j\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:46:09Z is after 2025-08-24T17:21:41Z" Dec 03 06:46:09 crc kubenswrapper[4475]: I1203 06:46:09.780426 4475 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-9b2j8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f3a17c67-95e0-4889-8a30-64c08b6720f4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d2d627e2c307a8db9c86e8020f2b1c25c6e061e0c6460be63e231566488beaca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\
\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:45:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6pdk6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\
\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T06:45:34Z\\\"}}\" for pod \"openshift-multus\"/\"multus-9b2j8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:46:09Z is after 2025-08-24T17:21:41Z" Dec 03 06:46:09 crc kubenswrapper[4475]: I1203 06:46:09.792668 4475 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-g9t4l" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8f42839e-dbc4-445a-a15b-c3aa14813958\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:35Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:35Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://53948489397bbbfdf5f766211088d7f12fcd2dfbc8c3da6493e5abc49e3b41f5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:45:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ppdm2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a5090474cca8b8e2ed539ea74377506638d300be7eb750b3f3285477d8c9a375\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:45:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ppdm2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://60d3ec7cab1f249e81ae1db9ab97fa02e8b3c9d8376af4c6682dc3fc6f9d6d92\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:45:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ppdm2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b3243c863a4fb593b39fc3e3b835f647e9373d8b2dec69c5ff7657ed73c8f78a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:45:36Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ppdm2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://32897756f3658fda95db77180a0553a9d8656ed49c3ae5a017d32f5c5133a5a9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:45:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ppdm2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5e288f95676d5823cd3cb005318489d2f629a8fb74ad17ce6a67978d76006192\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:45:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ppdm2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dbbaea5df5db7406137d9fe054e2abd7fbb765809c6aa804a531d4d0f7c8328e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dbbaea5df5db7406137d9fe054e2abd7fbb765809c6aa804a531d4d0f7c8328e\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-03T06:46:08Z\\\",\\\"message\\\":\\\"gins-k9cmc\\\\nI1203 06:46:08.075193 6119 obj_retry.go:365] Adding new object: *v1.Pod openshift-multus/multus-additional-cni-plugins-k9cmc\\\\nI1203 06:46:08.075199 6119 ovn.go:134] Ensuring zone local for Pod openshift-multus/multus-additional-cni-plugins-k9cmc in node crc\\\\nI1203 06:46:08.075203 6119 obj_retry.go:386] Retry successful for 
*v1.Pod openshift-multus/multus-additional-cni-plugins-k9cmc after 0 failed attempt(s)\\\\nI1203 06:46:08.075207 6119 default_network_controller.go:776] Recording success event on pod openshift-multus/multus-additional-cni-plugins-k9cmc\\\\nI1203 06:46:08.075218 6119 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nI1203 06:46:08.075238 6119 loadbalancer.go:304] Deleted 0 stale LBs for map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-marketplace/redhat-marketplace\\\\\\\"}\\\\nI1203 06:46:08.075253 6119 services_controller.go:360] Finished syncing service redhat-marketplace on namespace openshift-marketplace for network=default : 1.255581ms\\\\nF1203 06:46:08.075260 6119 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-03T06:46:07Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-g9t4l_openshift-ovn-kubernetes(8f42839e-dbc4-445a-a15b-c3aa14813958)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ppdm2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://66a9c7568957099255bc910496da695e2af0122f2c853c3e221c666d7c2dee78\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:45:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ppdm2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://400610ebcdc7d47ecc1345287847a1909871411a12cdb3cbf895e05039b81c2b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://400610ebcdc7d47ecc
1345287847a1909871411a12cdb3cbf895e05039b81c2b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T06:45:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T06:45:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ppdm2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T06:45:35Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-g9t4l\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:46:09Z is after 2025-08-24T17:21:41Z" Dec 03 06:46:09 crc kubenswrapper[4475]: I1203 06:46:09.799125 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:46:09 crc kubenswrapper[4475]: I1203 06:46:09.799149 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:46:09 crc kubenswrapper[4475]: I1203 06:46:09.799158 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:46:09 crc kubenswrapper[4475]: I1203 06:46:09.799169 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:46:09 crc kubenswrapper[4475]: I1203 06:46:09.799177 4475 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:46:09Z","lastTransitionTime":"2025-12-03T06:46:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 06:46:09 crc kubenswrapper[4475]: I1203 06:46:09.800044 4475 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6b24b6a4-c126-4d6d-88ae-b270b4743110\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:46:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:46:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://32947be5ce5a85090284dbc3edd8ad437495db9f0b4a7310656e38ecf5c649de\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\
\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:45:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0f26dedc0507a8675c0dc842b67772e84b5276713808e656bcf620ebb7bd3f5c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:45:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d17d8916a080f159b25abbfd9575bdc197c58bf256dbeb6367e74368f5b7f1fc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:45:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.1
26.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b9ea750eb608c854e92aa32dfc7d2085a0c00c3554368c7119487e4a730fdc1c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b9ea750eb608c854e92aa32dfc7d2085a0c00c3554368c7119487e4a730fdc1c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T06:45:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T06:45:16Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T06:45:15Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:46:09Z is after 2025-08-24T17:21:41Z" Dec 03 06:46:09 crc kubenswrapper[4475]: I1203 06:46:09.806143 4475 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-hq2rn" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7e9dd470-572a-4396-9be7-1a37e3c48977\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:47Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:47Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cg4hm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cg4hm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T06:45:47Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-hq2rn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:46:09Z is after 2025-08-24T17:21:41Z" Dec 03 06:46:09 crc 
kubenswrapper[4475]: I1203 06:46:09.814895 4475 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-k9cmc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7168f008-1b03-40cf-94fa-a71d470454bf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://625db083ebf24244e0b28ac937bfa2554497ca35b8f7a1fee0ac739d647c70de\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:45:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s6bbq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.12
6.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://31584b054f88aa7f7e4f1096e2b11acf6f106b7f2e4ced19768808e5df1a6acc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://31584b054f88aa7f7e4f1096e2b11acf6f106b7f2e4ced19768808e5df1a6acc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T06:45:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T06:45:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s6bbq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9a644e827feb786d7298e41022ef3bc0d2483279c106dddea8e2c7a3c62c3c0d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9a644e827feb786d7298e41022ef3
bc0d2483279c106dddea8e2c7a3c62c3c0d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T06:45:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T06:45:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s6bbq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://742f2f4dc23fff3df8e6d67902ef721b3db1823653b11a69faabdaf8d7650667\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://742f2f4dc23fff3df8e6d67902ef721b3db1823653b11a69faabdaf8d7650667\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T06:45:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T06:45:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/ru
n/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s6bbq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2e5e874d26bf8bc806d74d55a8b9306cc30cca122d2ae0731b0a76ae7ac30450\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2e5e874d26bf8bc806d74d55a8b9306cc30cca122d2ae0731b0a76ae7ac30450\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T06:45:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T06:45:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s6bbq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d644ab44eabce045c9f9b23fab29e574e2f9f49c0cc14b830560996a0ec98880\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\
":{\\\"containerID\\\":\\\"cri-o://d644ab44eabce045c9f9b23fab29e574e2f9f49c0cc14b830560996a0ec98880\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T06:45:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T06:45:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s6bbq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ae9935e33badff0b016f8b5a02cb59d8b64451364581023ca3ec8e87fba0aa3a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ae9935e33badff0b016f8b5a02cb59d8b64451364581023ca3ec8e87fba0aa3a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T06:45:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T06:45:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s6bbq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\
\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T06:45:35Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-k9cmc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:46:09Z is after 2025-08-24T17:21:41Z" Dec 03 06:46:09 crc kubenswrapper[4475]: I1203 06:46:09.822097 4475 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:32Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:32Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:46:09Z is after 2025-08-24T17:21:41Z" Dec 03 06:46:09 crc kubenswrapper[4475]: I1203 06:46:09.829881 4475 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a0af3d80-5aae-4d3b-a974-490687df49f3\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fa848c68a20d5db5c603cafa808518de84e427cbeea4bbc1be31151e6f839b3e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:45:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fe3e0d5fed18fddd7a1174f7a9f12290ce318e9a0de40fe432c79f6f2e24a608\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:45:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://43c05977da6544bc781a279fcddb3279dfee510fdd0a6f4f1a22b8629f17475f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:45:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ef987b2e9a0fa630edf6d5c06d5f47c5debd1b75d4626aefe7d8ef44bb974eb8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-12-03T06:45:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T06:45:15Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:46:09Z is after 2025-08-24T17:21:41Z" Dec 03 06:46:09 crc kubenswrapper[4475]: I1203 06:46:09.837239 4475 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:32Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:32Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:46:09Z is after 2025-08-24T17:21:41Z" Dec 03 06:46:09 crc kubenswrapper[4475]: I1203 06:46:09.844046 4475 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-tjbzg" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"91aee7be-4a52-4598-803f-2deebe0674de\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f13f644093fd1214d8fb39853857b4113dd7fde64f1a60ff6848fd4c5350f5b6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:45:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xvqvg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://159d103ae2d5d19ea94c57a59b534773f0e32f4c
b379a412b63ca743e221096e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:45:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xvqvg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T06:45:35Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-tjbzg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:46:09Z is after 2025-08-24T17:21:41Z" Dec 03 06:46:09 crc kubenswrapper[4475]: I1203 06:46:09.850977 4475 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-sbkp5" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b1df0a77-f3cc-49ab-9fbb-8a4c7608291b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5938dd3c72bee55a3a07312d31a0eaf2df226bb931b07300d71b6e7ff69c905b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:45:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-65wzb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4408ad7b7f122c0364b95e0e9761bc28dfb02
e7ea00537a70fc031c16b38be6b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:45:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-65wzb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T06:45:46Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-sbkp5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:46:09Z is after 2025-08-24T17:21:41Z" Dec 03 06:46:09 crc kubenswrapper[4475]: I1203 06:46:09.859353 4475 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"57d2f580-9528-4200-b0a4-797fed1ae972\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://822cdbfb2e81d80c5de0253daa42f2a5c89e9cd0eb8a5c3cf620780d17f9a6d0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:45:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d66a9136874b2e25c94cd291aa6d7f4694ac409f16766fd69c8aab8068a441fb\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:45:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e40c4f29925f494c0f5f01e2ecbcd2e4db2a5f3911a55a874c6d0006f01982de\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:45:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7fc0ee9e5a408a0a9e701afaf1db7bc3f58fd1830044730e9c680664642b5e4e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:45:1
7Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1dd8bd42f01469966b55416fc8af1dd71d341c774263bb3a56190af4cd9e7daa\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:45:17Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5da1155d7b5e933e5db3acc4c1a3fa1b3b90fd79289641f9a3d1290956128628\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5da1155d7b5e933e5db3acc4c1a3fa1b3b90fd79289641f9a3d1290956128628\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T06:45:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T06:45:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startT
ime\\\":\\\"2025-12-03T06:45:15Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:46:09Z is after 2025-08-24T17:21:41Z" Dec 03 06:46:09 crc kubenswrapper[4475]: I1203 06:46:09.900879 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:46:09 crc kubenswrapper[4475]: I1203 06:46:09.900908 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:46:09 crc kubenswrapper[4475]: I1203 06:46:09.900916 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:46:09 crc kubenswrapper[4475]: I1203 06:46:09.900929 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:46:09 crc kubenswrapper[4475]: I1203 06:46:09.900937 4475 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:46:09Z","lastTransitionTime":"2025-12-03T06:46:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:46:10 crc kubenswrapper[4475]: I1203 06:46:10.002699 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:46:10 crc kubenswrapper[4475]: I1203 06:46:10.002722 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:46:10 crc kubenswrapper[4475]: I1203 06:46:10.002731 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:46:10 crc kubenswrapper[4475]: I1203 06:46:10.002745 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:46:10 crc kubenswrapper[4475]: I1203 06:46:10.002770 4475 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:46:10Z","lastTransitionTime":"2025-12-03T06:46:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:46:10 crc kubenswrapper[4475]: I1203 06:46:10.105040 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:46:10 crc kubenswrapper[4475]: I1203 06:46:10.105070 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:46:10 crc kubenswrapper[4475]: I1203 06:46:10.105078 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:46:10 crc kubenswrapper[4475]: I1203 06:46:10.105090 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:46:10 crc kubenswrapper[4475]: I1203 06:46:10.105098 4475 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:46:10Z","lastTransitionTime":"2025-12-03T06:46:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:46:10 crc kubenswrapper[4475]: I1203 06:46:10.206824 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:46:10 crc kubenswrapper[4475]: I1203 06:46:10.206880 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:46:10 crc kubenswrapper[4475]: I1203 06:46:10.206891 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:46:10 crc kubenswrapper[4475]: I1203 06:46:10.206904 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:46:10 crc kubenswrapper[4475]: I1203 06:46:10.206913 4475 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:46:10Z","lastTransitionTime":"2025-12-03T06:46:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:46:10 crc kubenswrapper[4475]: I1203 06:46:10.308754 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:46:10 crc kubenswrapper[4475]: I1203 06:46:10.308782 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:46:10 crc kubenswrapper[4475]: I1203 06:46:10.308791 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:46:10 crc kubenswrapper[4475]: I1203 06:46:10.308802 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:46:10 crc kubenswrapper[4475]: I1203 06:46:10.308810 4475 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:46:10Z","lastTransitionTime":"2025-12-03T06:46:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:46:10 crc kubenswrapper[4475]: I1203 06:46:10.410418 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:46:10 crc kubenswrapper[4475]: I1203 06:46:10.410476 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:46:10 crc kubenswrapper[4475]: I1203 06:46:10.410486 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:46:10 crc kubenswrapper[4475]: I1203 06:46:10.410501 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:46:10 crc kubenswrapper[4475]: I1203 06:46:10.410511 4475 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:46:10Z","lastTransitionTime":"2025-12-03T06:46:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 06:46:10 crc kubenswrapper[4475]: I1203 06:46:10.490683 4475 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 03 06:46:10 crc kubenswrapper[4475]: I1203 06:46:10.490717 4475 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 03 06:46:10 crc kubenswrapper[4475]: I1203 06:46:10.490750 4475 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 03 06:46:10 crc kubenswrapper[4475]: E1203 06:46:10.490848 4475 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 03 06:46:10 crc kubenswrapper[4475]: E1203 06:46:10.490903 4475 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 03 06:46:10 crc kubenswrapper[4475]: E1203 06:46:10.490965 4475 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 03 06:46:10 crc kubenswrapper[4475]: I1203 06:46:10.511885 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:46:10 crc kubenswrapper[4475]: I1203 06:46:10.511914 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:46:10 crc kubenswrapper[4475]: I1203 06:46:10.511924 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:46:10 crc kubenswrapper[4475]: I1203 06:46:10.511935 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:46:10 crc kubenswrapper[4475]: I1203 06:46:10.511943 4475 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:46:10Z","lastTransitionTime":"2025-12-03T06:46:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:46:10 crc kubenswrapper[4475]: I1203 06:46:10.613408 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:46:10 crc kubenswrapper[4475]: I1203 06:46:10.613431 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:46:10 crc kubenswrapper[4475]: I1203 06:46:10.613439 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:46:10 crc kubenswrapper[4475]: I1203 06:46:10.613488 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:46:10 crc kubenswrapper[4475]: I1203 06:46:10.613496 4475 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:46:10Z","lastTransitionTime":"2025-12-03T06:46:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:46:10 crc kubenswrapper[4475]: I1203 06:46:10.716438 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:46:10 crc kubenswrapper[4475]: I1203 06:46:10.716473 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:46:10 crc kubenswrapper[4475]: I1203 06:46:10.716484 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:46:10 crc kubenswrapper[4475]: I1203 06:46:10.716494 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:46:10 crc kubenswrapper[4475]: I1203 06:46:10.716503 4475 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:46:10Z","lastTransitionTime":"2025-12-03T06:46:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:46:10 crc kubenswrapper[4475]: I1203 06:46:10.818215 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:46:10 crc kubenswrapper[4475]: I1203 06:46:10.818245 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:46:10 crc kubenswrapper[4475]: I1203 06:46:10.818253 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:46:10 crc kubenswrapper[4475]: I1203 06:46:10.818264 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:46:10 crc kubenswrapper[4475]: I1203 06:46:10.818272 4475 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:46:10Z","lastTransitionTime":"2025-12-03T06:46:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:46:10 crc kubenswrapper[4475]: I1203 06:46:10.920260 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:46:10 crc kubenswrapper[4475]: I1203 06:46:10.920312 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:46:10 crc kubenswrapper[4475]: I1203 06:46:10.920320 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:46:10 crc kubenswrapper[4475]: I1203 06:46:10.920334 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:46:10 crc kubenswrapper[4475]: I1203 06:46:10.920344 4475 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:46:10Z","lastTransitionTime":"2025-12-03T06:46:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:46:11 crc kubenswrapper[4475]: I1203 06:46:11.022484 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:46:11 crc kubenswrapper[4475]: I1203 06:46:11.022515 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:46:11 crc kubenswrapper[4475]: I1203 06:46:11.022523 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:46:11 crc kubenswrapper[4475]: I1203 06:46:11.022534 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:46:11 crc kubenswrapper[4475]: I1203 06:46:11.022542 4475 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:46:11Z","lastTransitionTime":"2025-12-03T06:46:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:46:11 crc kubenswrapper[4475]: I1203 06:46:11.124625 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:46:11 crc kubenswrapper[4475]: I1203 06:46:11.124662 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:46:11 crc kubenswrapper[4475]: I1203 06:46:11.124672 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:46:11 crc kubenswrapper[4475]: I1203 06:46:11.124684 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:46:11 crc kubenswrapper[4475]: I1203 06:46:11.124692 4475 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:46:11Z","lastTransitionTime":"2025-12-03T06:46:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:46:11 crc kubenswrapper[4475]: I1203 06:46:11.226200 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:46:11 crc kubenswrapper[4475]: I1203 06:46:11.226232 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:46:11 crc kubenswrapper[4475]: I1203 06:46:11.226240 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:46:11 crc kubenswrapper[4475]: I1203 06:46:11.226253 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:46:11 crc kubenswrapper[4475]: I1203 06:46:11.226264 4475 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:46:11Z","lastTransitionTime":"2025-12-03T06:46:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:46:11 crc kubenswrapper[4475]: I1203 06:46:11.328149 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:46:11 crc kubenswrapper[4475]: I1203 06:46:11.328173 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:46:11 crc kubenswrapper[4475]: I1203 06:46:11.328181 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:46:11 crc kubenswrapper[4475]: I1203 06:46:11.328190 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:46:11 crc kubenswrapper[4475]: I1203 06:46:11.328197 4475 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:46:11Z","lastTransitionTime":"2025-12-03T06:46:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:46:11 crc kubenswrapper[4475]: I1203 06:46:11.430003 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:46:11 crc kubenswrapper[4475]: I1203 06:46:11.430046 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:46:11 crc kubenswrapper[4475]: I1203 06:46:11.430056 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:46:11 crc kubenswrapper[4475]: I1203 06:46:11.430067 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:46:11 crc kubenswrapper[4475]: I1203 06:46:11.430074 4475 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:46:11Z","lastTransitionTime":"2025-12-03T06:46:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 06:46:11 crc kubenswrapper[4475]: I1203 06:46:11.491098 4475 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-hq2rn" Dec 03 06:46:11 crc kubenswrapper[4475]: E1203 06:46:11.491190 4475 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-hq2rn" podUID="7e9dd470-572a-4396-9be7-1a37e3c48977" Dec 03 06:46:11 crc kubenswrapper[4475]: I1203 06:46:11.531807 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:46:11 crc kubenswrapper[4475]: I1203 06:46:11.531843 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:46:11 crc kubenswrapper[4475]: I1203 06:46:11.531854 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:46:11 crc kubenswrapper[4475]: I1203 06:46:11.531867 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:46:11 crc kubenswrapper[4475]: I1203 06:46:11.531876 4475 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:46:11Z","lastTransitionTime":"2025-12-03T06:46:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:46:11 crc kubenswrapper[4475]: I1203 06:46:11.633820 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:46:11 crc kubenswrapper[4475]: I1203 06:46:11.633878 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:46:11 crc kubenswrapper[4475]: I1203 06:46:11.633887 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:46:11 crc kubenswrapper[4475]: I1203 06:46:11.633900 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:46:11 crc kubenswrapper[4475]: I1203 06:46:11.633909 4475 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:46:11Z","lastTransitionTime":"2025-12-03T06:46:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:46:11 crc kubenswrapper[4475]: I1203 06:46:11.735882 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:46:11 crc kubenswrapper[4475]: I1203 06:46:11.735939 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:46:11 crc kubenswrapper[4475]: I1203 06:46:11.735950 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:46:11 crc kubenswrapper[4475]: I1203 06:46:11.735963 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:46:11 crc kubenswrapper[4475]: I1203 06:46:11.735972 4475 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:46:11Z","lastTransitionTime":"2025-12-03T06:46:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:46:11 crc kubenswrapper[4475]: I1203 06:46:11.837435 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:46:11 crc kubenswrapper[4475]: I1203 06:46:11.837488 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:46:11 crc kubenswrapper[4475]: I1203 06:46:11.837497 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:46:11 crc kubenswrapper[4475]: I1203 06:46:11.837509 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:46:11 crc kubenswrapper[4475]: I1203 06:46:11.837516 4475 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:46:11Z","lastTransitionTime":"2025-12-03T06:46:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:46:11 crc kubenswrapper[4475]: I1203 06:46:11.938867 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:46:11 crc kubenswrapper[4475]: I1203 06:46:11.938898 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:46:11 crc kubenswrapper[4475]: I1203 06:46:11.938906 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:46:11 crc kubenswrapper[4475]: I1203 06:46:11.938918 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:46:11 crc kubenswrapper[4475]: I1203 06:46:11.938928 4475 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:46:11Z","lastTransitionTime":"2025-12-03T06:46:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:46:12 crc kubenswrapper[4475]: I1203 06:46:12.040183 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:46:12 crc kubenswrapper[4475]: I1203 06:46:12.040220 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:46:12 crc kubenswrapper[4475]: I1203 06:46:12.040228 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:46:12 crc kubenswrapper[4475]: I1203 06:46:12.040241 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:46:12 crc kubenswrapper[4475]: I1203 06:46:12.040248 4475 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:46:12Z","lastTransitionTime":"2025-12-03T06:46:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:46:12 crc kubenswrapper[4475]: I1203 06:46:12.141999 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:46:12 crc kubenswrapper[4475]: I1203 06:46:12.142035 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:46:12 crc kubenswrapper[4475]: I1203 06:46:12.142045 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:46:12 crc kubenswrapper[4475]: I1203 06:46:12.142064 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:46:12 crc kubenswrapper[4475]: I1203 06:46:12.142074 4475 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:46:12Z","lastTransitionTime":"2025-12-03T06:46:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:46:12 crc kubenswrapper[4475]: I1203 06:46:12.244572 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:46:12 crc kubenswrapper[4475]: I1203 06:46:12.244598 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:46:12 crc kubenswrapper[4475]: I1203 06:46:12.244607 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:46:12 crc kubenswrapper[4475]: I1203 06:46:12.244617 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:46:12 crc kubenswrapper[4475]: I1203 06:46:12.244624 4475 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:46:12Z","lastTransitionTime":"2025-12-03T06:46:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:46:12 crc kubenswrapper[4475]: I1203 06:46:12.346475 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:46:12 crc kubenswrapper[4475]: I1203 06:46:12.346495 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:46:12 crc kubenswrapper[4475]: I1203 06:46:12.346502 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:46:12 crc kubenswrapper[4475]: I1203 06:46:12.346511 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:46:12 crc kubenswrapper[4475]: I1203 06:46:12.346517 4475 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:46:12Z","lastTransitionTime":"2025-12-03T06:46:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:46:12 crc kubenswrapper[4475]: I1203 06:46:12.448687 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:46:12 crc kubenswrapper[4475]: I1203 06:46:12.448762 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:46:12 crc kubenswrapper[4475]: I1203 06:46:12.448775 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:46:12 crc kubenswrapper[4475]: I1203 06:46:12.448788 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:46:12 crc kubenswrapper[4475]: I1203 06:46:12.448796 4475 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:46:12Z","lastTransitionTime":"2025-12-03T06:46:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 06:46:12 crc kubenswrapper[4475]: I1203 06:46:12.490353 4475 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 03 06:46:12 crc kubenswrapper[4475]: E1203 06:46:12.490465 4475 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 03 06:46:12 crc kubenswrapper[4475]: I1203 06:46:12.490599 4475 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 03 06:46:12 crc kubenswrapper[4475]: I1203 06:46:12.490628 4475 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 03 06:46:12 crc kubenswrapper[4475]: E1203 06:46:12.490687 4475 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 03 06:46:12 crc kubenswrapper[4475]: E1203 06:46:12.490755 4475 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 03 06:46:12 crc kubenswrapper[4475]: I1203 06:46:12.550427 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:46:12 crc kubenswrapper[4475]: I1203 06:46:12.550462 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:46:12 crc kubenswrapper[4475]: I1203 06:46:12.550471 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:46:12 crc kubenswrapper[4475]: I1203 06:46:12.550480 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:46:12 crc kubenswrapper[4475]: I1203 06:46:12.550493 4475 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:46:12Z","lastTransitionTime":"2025-12-03T06:46:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:46:12 crc kubenswrapper[4475]: I1203 06:46:12.652567 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:46:12 crc kubenswrapper[4475]: I1203 06:46:12.652605 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:46:12 crc kubenswrapper[4475]: I1203 06:46:12.652613 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:46:12 crc kubenswrapper[4475]: I1203 06:46:12.652627 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:46:12 crc kubenswrapper[4475]: I1203 06:46:12.652636 4475 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:46:12Z","lastTransitionTime":"2025-12-03T06:46:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:46:12 crc kubenswrapper[4475]: I1203 06:46:12.754374 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:46:12 crc kubenswrapper[4475]: I1203 06:46:12.754402 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:46:12 crc kubenswrapper[4475]: I1203 06:46:12.754422 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:46:12 crc kubenswrapper[4475]: I1203 06:46:12.754434 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:46:12 crc kubenswrapper[4475]: I1203 06:46:12.754443 4475 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:46:12Z","lastTransitionTime":"2025-12-03T06:46:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:46:12 crc kubenswrapper[4475]: I1203 06:46:12.856808 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:46:12 crc kubenswrapper[4475]: I1203 06:46:12.856841 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:46:12 crc kubenswrapper[4475]: I1203 06:46:12.856849 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:46:12 crc kubenswrapper[4475]: I1203 06:46:12.856861 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:46:12 crc kubenswrapper[4475]: I1203 06:46:12.856870 4475 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:46:12Z","lastTransitionTime":"2025-12-03T06:46:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:46:12 crc kubenswrapper[4475]: I1203 06:46:12.959130 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:46:12 crc kubenswrapper[4475]: I1203 06:46:12.959167 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:46:12 crc kubenswrapper[4475]: I1203 06:46:12.959176 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:46:12 crc kubenswrapper[4475]: I1203 06:46:12.959190 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:46:12 crc kubenswrapper[4475]: I1203 06:46:12.959200 4475 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:46:12Z","lastTransitionTime":"2025-12-03T06:46:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:46:13 crc kubenswrapper[4475]: I1203 06:46:13.061346 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:46:13 crc kubenswrapper[4475]: I1203 06:46:13.061374 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:46:13 crc kubenswrapper[4475]: I1203 06:46:13.061384 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:46:13 crc kubenswrapper[4475]: I1203 06:46:13.061395 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:46:13 crc kubenswrapper[4475]: I1203 06:46:13.061403 4475 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:46:13Z","lastTransitionTime":"2025-12-03T06:46:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:46:13 crc kubenswrapper[4475]: I1203 06:46:13.162790 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:46:13 crc kubenswrapper[4475]: I1203 06:46:13.162825 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:46:13 crc kubenswrapper[4475]: I1203 06:46:13.162834 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:46:13 crc kubenswrapper[4475]: I1203 06:46:13.162846 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:46:13 crc kubenswrapper[4475]: I1203 06:46:13.162854 4475 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:46:13Z","lastTransitionTime":"2025-12-03T06:46:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:46:13 crc kubenswrapper[4475]: I1203 06:46:13.264502 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:46:13 crc kubenswrapper[4475]: I1203 06:46:13.264526 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:46:13 crc kubenswrapper[4475]: I1203 06:46:13.264534 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:46:13 crc kubenswrapper[4475]: I1203 06:46:13.264543 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:46:13 crc kubenswrapper[4475]: I1203 06:46:13.264550 4475 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:46:13Z","lastTransitionTime":"2025-12-03T06:46:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:46:13 crc kubenswrapper[4475]: I1203 06:46:13.365858 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:46:13 crc kubenswrapper[4475]: I1203 06:46:13.365899 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:46:13 crc kubenswrapper[4475]: I1203 06:46:13.365907 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:46:13 crc kubenswrapper[4475]: I1203 06:46:13.365926 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:46:13 crc kubenswrapper[4475]: I1203 06:46:13.365936 4475 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:46:13Z","lastTransitionTime":"2025-12-03T06:46:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:46:13 crc kubenswrapper[4475]: I1203 06:46:13.467761 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:46:13 crc kubenswrapper[4475]: I1203 06:46:13.467813 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:46:13 crc kubenswrapper[4475]: I1203 06:46:13.467821 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:46:13 crc kubenswrapper[4475]: I1203 06:46:13.467835 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:46:13 crc kubenswrapper[4475]: I1203 06:46:13.467843 4475 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:46:13Z","lastTransitionTime":"2025-12-03T06:46:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 06:46:13 crc kubenswrapper[4475]: I1203 06:46:13.491851 4475 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-hq2rn" Dec 03 06:46:13 crc kubenswrapper[4475]: E1203 06:46:13.492034 4475 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-hq2rn" podUID="7e9dd470-572a-4396-9be7-1a37e3c48977" Dec 03 06:46:13 crc kubenswrapper[4475]: I1203 06:46:13.569397 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:46:13 crc kubenswrapper[4475]: I1203 06:46:13.569472 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:46:13 crc kubenswrapper[4475]: I1203 06:46:13.569483 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:46:13 crc kubenswrapper[4475]: I1203 06:46:13.569493 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:46:13 crc kubenswrapper[4475]: I1203 06:46:13.569501 4475 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:46:13Z","lastTransitionTime":"2025-12-03T06:46:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:46:13 crc kubenswrapper[4475]: I1203 06:46:13.671092 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:46:13 crc kubenswrapper[4475]: I1203 06:46:13.671121 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:46:13 crc kubenswrapper[4475]: I1203 06:46:13.671132 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:46:13 crc kubenswrapper[4475]: I1203 06:46:13.671143 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:46:13 crc kubenswrapper[4475]: I1203 06:46:13.671151 4475 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:46:13Z","lastTransitionTime":"2025-12-03T06:46:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:46:13 crc kubenswrapper[4475]: I1203 06:46:13.772758 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:46:13 crc kubenswrapper[4475]: I1203 06:46:13.772824 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:46:13 crc kubenswrapper[4475]: I1203 06:46:13.772834 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:46:13 crc kubenswrapper[4475]: I1203 06:46:13.772848 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:46:13 crc kubenswrapper[4475]: I1203 06:46:13.772857 4475 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:46:13Z","lastTransitionTime":"2025-12-03T06:46:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:46:13 crc kubenswrapper[4475]: I1203 06:46:13.874522 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:46:13 crc kubenswrapper[4475]: I1203 06:46:13.874544 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:46:13 crc kubenswrapper[4475]: I1203 06:46:13.874552 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:46:13 crc kubenswrapper[4475]: I1203 06:46:13.874562 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:46:13 crc kubenswrapper[4475]: I1203 06:46:13.874568 4475 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:46:13Z","lastTransitionTime":"2025-12-03T06:46:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:46:13 crc kubenswrapper[4475]: I1203 06:46:13.975812 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:46:13 crc kubenswrapper[4475]: I1203 06:46:13.975837 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:46:13 crc kubenswrapper[4475]: I1203 06:46:13.975846 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:46:13 crc kubenswrapper[4475]: I1203 06:46:13.975872 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:46:13 crc kubenswrapper[4475]: I1203 06:46:13.975879 4475 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:46:13Z","lastTransitionTime":"2025-12-03T06:46:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:46:14 crc kubenswrapper[4475]: I1203 06:46:14.077241 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:46:14 crc kubenswrapper[4475]: I1203 06:46:14.077268 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:46:14 crc kubenswrapper[4475]: I1203 06:46:14.077275 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:46:14 crc kubenswrapper[4475]: I1203 06:46:14.077284 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:46:14 crc kubenswrapper[4475]: I1203 06:46:14.077307 4475 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:46:14Z","lastTransitionTime":"2025-12-03T06:46:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:46:14 crc kubenswrapper[4475]: I1203 06:46:14.178707 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:46:14 crc kubenswrapper[4475]: I1203 06:46:14.178740 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:46:14 crc kubenswrapper[4475]: I1203 06:46:14.178750 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:46:14 crc kubenswrapper[4475]: I1203 06:46:14.178763 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:46:14 crc kubenswrapper[4475]: I1203 06:46:14.178771 4475 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:46:14Z","lastTransitionTime":"2025-12-03T06:46:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:46:14 crc kubenswrapper[4475]: I1203 06:46:14.280628 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:46:14 crc kubenswrapper[4475]: I1203 06:46:14.280672 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:46:14 crc kubenswrapper[4475]: I1203 06:46:14.280681 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:46:14 crc kubenswrapper[4475]: I1203 06:46:14.280693 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:46:14 crc kubenswrapper[4475]: I1203 06:46:14.280701 4475 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:46:14Z","lastTransitionTime":"2025-12-03T06:46:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:46:14 crc kubenswrapper[4475]: I1203 06:46:14.382284 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:46:14 crc kubenswrapper[4475]: I1203 06:46:14.382312 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:46:14 crc kubenswrapper[4475]: I1203 06:46:14.382325 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:46:14 crc kubenswrapper[4475]: I1203 06:46:14.382338 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:46:14 crc kubenswrapper[4475]: I1203 06:46:14.382348 4475 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:46:14Z","lastTransitionTime":"2025-12-03T06:46:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:46:14 crc kubenswrapper[4475]: I1203 06:46:14.485075 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:46:14 crc kubenswrapper[4475]: I1203 06:46:14.485126 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:46:14 crc kubenswrapper[4475]: I1203 06:46:14.485135 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:46:14 crc kubenswrapper[4475]: I1203 06:46:14.485148 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:46:14 crc kubenswrapper[4475]: I1203 06:46:14.485156 4475 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:46:14Z","lastTransitionTime":"2025-12-03T06:46:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 06:46:14 crc kubenswrapper[4475]: I1203 06:46:14.490295 4475 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 03 06:46:14 crc kubenswrapper[4475]: I1203 06:46:14.490313 4475 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 03 06:46:14 crc kubenswrapper[4475]: I1203 06:46:14.490313 4475 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 03 06:46:14 crc kubenswrapper[4475]: E1203 06:46:14.490382 4475 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 03 06:46:14 crc kubenswrapper[4475]: E1203 06:46:14.490496 4475 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 03 06:46:14 crc kubenswrapper[4475]: E1203 06:46:14.490542 4475 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 03 06:46:14 crc kubenswrapper[4475]: I1203 06:46:14.587349 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:46:14 crc kubenswrapper[4475]: I1203 06:46:14.587378 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:46:14 crc kubenswrapper[4475]: I1203 06:46:14.587387 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:46:14 crc kubenswrapper[4475]: I1203 06:46:14.587397 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:46:14 crc kubenswrapper[4475]: I1203 06:46:14.587405 4475 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:46:14Z","lastTransitionTime":"2025-12-03T06:46:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:46:14 crc kubenswrapper[4475]: I1203 06:46:14.688871 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:46:14 crc kubenswrapper[4475]: I1203 06:46:14.688923 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:46:14 crc kubenswrapper[4475]: I1203 06:46:14.688932 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:46:14 crc kubenswrapper[4475]: I1203 06:46:14.688945 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:46:14 crc kubenswrapper[4475]: I1203 06:46:14.688954 4475 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:46:14Z","lastTransitionTime":"2025-12-03T06:46:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:46:14 crc kubenswrapper[4475]: I1203 06:46:14.790813 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:46:14 crc kubenswrapper[4475]: I1203 06:46:14.790846 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:46:14 crc kubenswrapper[4475]: I1203 06:46:14.790854 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:46:14 crc kubenswrapper[4475]: I1203 06:46:14.790866 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:46:14 crc kubenswrapper[4475]: I1203 06:46:14.790875 4475 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:46:14Z","lastTransitionTime":"2025-12-03T06:46:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:46:14 crc kubenswrapper[4475]: I1203 06:46:14.892300 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:46:14 crc kubenswrapper[4475]: I1203 06:46:14.892338 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:46:14 crc kubenswrapper[4475]: I1203 06:46:14.892349 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:46:14 crc kubenswrapper[4475]: I1203 06:46:14.892362 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:46:14 crc kubenswrapper[4475]: I1203 06:46:14.892371 4475 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:46:14Z","lastTransitionTime":"2025-12-03T06:46:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:46:14 crc kubenswrapper[4475]: I1203 06:46:14.994789 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:46:14 crc kubenswrapper[4475]: I1203 06:46:14.994820 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:46:14 crc kubenswrapper[4475]: I1203 06:46:14.994831 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:46:14 crc kubenswrapper[4475]: I1203 06:46:14.994842 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:46:14 crc kubenswrapper[4475]: I1203 06:46:14.994851 4475 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:46:14Z","lastTransitionTime":"2025-12-03T06:46:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:46:15 crc kubenswrapper[4475]: I1203 06:46:15.096488 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:46:15 crc kubenswrapper[4475]: I1203 06:46:15.096521 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:46:15 crc kubenswrapper[4475]: I1203 06:46:15.096530 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:46:15 crc kubenswrapper[4475]: I1203 06:46:15.096539 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:46:15 crc kubenswrapper[4475]: I1203 06:46:15.096546 4475 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:46:15Z","lastTransitionTime":"2025-12-03T06:46:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:46:15 crc kubenswrapper[4475]: I1203 06:46:15.198715 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:46:15 crc kubenswrapper[4475]: I1203 06:46:15.198749 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:46:15 crc kubenswrapper[4475]: I1203 06:46:15.198758 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:46:15 crc kubenswrapper[4475]: I1203 06:46:15.198770 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:46:15 crc kubenswrapper[4475]: I1203 06:46:15.198778 4475 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:46:15Z","lastTransitionTime":"2025-12-03T06:46:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:46:15 crc kubenswrapper[4475]: I1203 06:46:15.300712 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:46:15 crc kubenswrapper[4475]: I1203 06:46:15.300737 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:46:15 crc kubenswrapper[4475]: I1203 06:46:15.300745 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:46:15 crc kubenswrapper[4475]: I1203 06:46:15.300773 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:46:15 crc kubenswrapper[4475]: I1203 06:46:15.300782 4475 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:46:15Z","lastTransitionTime":"2025-12-03T06:46:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:46:15 crc kubenswrapper[4475]: I1203 06:46:15.402777 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:46:15 crc kubenswrapper[4475]: I1203 06:46:15.402800 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:46:15 crc kubenswrapper[4475]: I1203 06:46:15.402807 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:46:15 crc kubenswrapper[4475]: I1203 06:46:15.402817 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:46:15 crc kubenswrapper[4475]: I1203 06:46:15.402825 4475 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:46:15Z","lastTransitionTime":"2025-12-03T06:46:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 06:46:15 crc kubenswrapper[4475]: I1203 06:46:15.490367 4475 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-hq2rn" Dec 03 06:46:15 crc kubenswrapper[4475]: E1203 06:46:15.490506 4475 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-hq2rn" podUID="7e9dd470-572a-4396-9be7-1a37e3c48977" Dec 03 06:46:15 crc kubenswrapper[4475]: I1203 06:46:15.503766 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:46:15 crc kubenswrapper[4475]: I1203 06:46:15.503860 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:46:15 crc kubenswrapper[4475]: I1203 06:46:15.503930 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:46:15 crc kubenswrapper[4475]: I1203 06:46:15.503989 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:46:15 crc kubenswrapper[4475]: I1203 06:46:15.504039 4475 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:46:15Z","lastTransitionTime":"2025-12-03T06:46:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:46:15 crc kubenswrapper[4475]: I1203 06:46:15.505678 4475 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"897f1a97-930a-4c3c-8804-d7cd6006ae9c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fbb015d3e05f9f94fc225cce6e24bc4a5df0bfc5aaea15fe120e2cc4b8f02902\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:45:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\
\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://da747a5ea4f790c71d99693c4bd79a1074f756a20f628fa63e8bad9a713645fc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:45:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6bf56315b6ad05ea9af0319db29b919ed0332d2a671c5ba94ea325bd45ef5703\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:45:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e045b99328661616ea0e44cd50bd394a403836eede05459d117567c191401172\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-
v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:45:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://054b1d2565cc9690152740f71682028595283525344a38ccea66c1f072eae92b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:45:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d5e0ad88c2e55994f952b46c2e806792d8fcbd79a901810aef92e46067cc7b92\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":
true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d5e0ad88c2e55994f952b46c2e806792d8fcbd79a901810aef92e46067cc7b92\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T06:45:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T06:45:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://22796f78d3d551f1ee271ca8581e196f142e70622944154f7d408a88c098f53b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://22796f78d3d551f1ee271ca8581e196f142e70622944154f7d408a88c098f53b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T06:45:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T06:45:16Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://9fb973559072f07252dcf50bda74d422ea2ed50000c02105381f8d21e5ff9888\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9fb973559072f07252dcf50bda74d422ea2ed50000c02105381f8d21e5ff9888\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T06:45:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2
025-12-03T06:45:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T06:45:15Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:46:15Z is after 2025-08-24T17:21:41Z" Dec 03 06:46:15 crc kubenswrapper[4475]: I1203 06:46:15.512739 4475 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-dqbgx" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9ef36226-4b8b-4a7b-a87f-daa9dda6e70b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5dc78fa3b07b9a5535f697323e9ed322ceefdc8798157160a05eb71017ac3a29\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:45:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wjjp4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T06:45:38Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-dqbgx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:46:15Z is after 2025-08-24T17:21:41Z" Dec 03 06:46:15 crc kubenswrapper[4475]: I1203 06:46:15.519297 4475 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-pcw7j" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8c1979d0-303c-4cf6-9087-3cb2e1aac73b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://eebaa73cf4e1efd781b258dd26910dc004392716180b14a7e64e89a03f2032a1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-d
ev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:45:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nrf7r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T06:45:34Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-pcw7j\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:46:15Z is after 2025-08-24T17:21:41Z" Dec 03 06:46:15 crc kubenswrapper[4475]: I1203 06:46:15.528999 4475 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-9b2j8" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f3a17c67-95e0-4889-8a30-64c08b6720f4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d2d627e2c307a8db9c86e8020f2b1c25c6e061e0c6460be63e231566488beaca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:45:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6pdk6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T06:45:34Z\\\"}}\" for pod \"openshift-multus\"/\"multus-9b2j8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:46:15Z is after 2025-08-24T17:21:41Z" Dec 03 06:46:15 crc kubenswrapper[4475]: I1203 06:46:15.541281 4475 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-g9t4l" err="failed to patch 
status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8f42839e-dbc4-445a-a15b-c3aa14813958\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:35Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:35Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://53948489397bbbfdf5f766211088d7f12fcd2dfbc8c3da6493e5abc49e3b41f5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:45:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ppdm2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a5090474cca8b8e2ed539ea74377506638d300be7eb750b3f3285477d8c9a375\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:45:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ppdm2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://60d3ec7cab1f249e81ae1db9ab97fa02e8b3c9d8376af4c6682dc3fc6f9d6d92\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:45:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ppdm2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b3243c863a4fb593b39fc3e3b835f647e9373d8b2dec69c5ff7657ed73c8f78a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:45:36Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ppdm2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://32897756f3658fda95db77180a0553a9d8656ed49c3ae5a017d32f5c5133a5a9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:45:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ppdm2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5e288f95676d5823cd3cb005318489d2f629a8fb74ad17ce6a67978d76006192\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:45:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ppdm2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dbbaea5df5db7406137d9fe054e2abd7fbb765809c6aa804a531d4d0f7c8328e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dbbaea5df5db7406137d9fe054e2abd7fbb765809c6aa804a531d4d0f7c8328e\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-03T06:46:08Z\\\",\\\"message\\\":\\\"gins-k9cmc\\\\nI1203 06:46:08.075193 6119 obj_retry.go:365] Adding new object: *v1.Pod openshift-multus/multus-additional-cni-plugins-k9cmc\\\\nI1203 06:46:08.075199 6119 ovn.go:134] Ensuring zone local for Pod openshift-multus/multus-additional-cni-plugins-k9cmc in node crc\\\\nI1203 06:46:08.075203 6119 obj_retry.go:386] Retry successful for 
*v1.Pod openshift-multus/multus-additional-cni-plugins-k9cmc after 0 failed attempt(s)\\\\nI1203 06:46:08.075207 6119 default_network_controller.go:776] Recording success event on pod openshift-multus/multus-additional-cni-plugins-k9cmc\\\\nI1203 06:46:08.075218 6119 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nI1203 06:46:08.075238 6119 loadbalancer.go:304] Deleted 0 stale LBs for map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-marketplace/redhat-marketplace\\\\\\\"}\\\\nI1203 06:46:08.075253 6119 services_controller.go:360] Finished syncing service redhat-marketplace on namespace openshift-marketplace for network=default : 1.255581ms\\\\nF1203 06:46:08.075260 6119 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-03T06:46:07Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-g9t4l_openshift-ovn-kubernetes(8f42839e-dbc4-445a-a15b-c3aa14813958)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ppdm2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://66a9c7568957099255bc910496da695e2af0122f2c853c3e221c666d7c2dee78\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:45:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ppdm2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://400610ebcdc7d47ecc1345287847a1909871411a12cdb3cbf895e05039b81c2b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://400610ebcdc7d47ecc
1345287847a1909871411a12cdb3cbf895e05039b81c2b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T06:45:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T06:45:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ppdm2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T06:45:35Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-g9t4l\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:46:15Z is after 2025-08-24T17:21:41Z" Dec 03 06:46:15 crc kubenswrapper[4475]: I1203 06:46:15.549007 4475 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6b24b6a4-c126-4d6d-88ae-b270b4743110\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:46:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:46:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://32947be5ce5a85090284dbc3edd8ad437495db9f0b4a7310656e38ecf5c649de\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:45:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0f26dedc0507a8675c0dc842b67772e84b5276713808e656bcf620ebb7bd3f5c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:45:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d17d8916a080f159b25abbfd9575bdc197c58bf256dbeb6367e74368f5b7f1fc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:45:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b9ea750eb608c854e92aa32dfc7d2085a0c00c3554368c7119487e4a730fdc1c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://b9ea750eb608c854e92aa32dfc7d2085a0c00c3554368c7119487e4a730fdc1c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T06:45:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T06:45:16Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T06:45:15Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:46:15Z is after 2025-08-24T17:21:41Z" Dec 03 06:46:15 crc kubenswrapper[4475]: I1203 06:46:15.557020 4475 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:33Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:33Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2f651c16a4a98ff0a9b4783e60ece4c410d5fcb7d05ad42bf7842d8bb8a99f27\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:45:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-12-03T06:46:15Z is after 2025-08-24T17:21:41Z" Dec 03 06:46:15 crc kubenswrapper[4475]: I1203 06:46:15.564614 4475 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:33Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:33Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://30d9a05de148a1dbe0fa8f07bbc5f4f2c3cba395d686af03f2da63f8cdfe431c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:45:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"c
ri-o://07cf8d993193bca34b30ea77c473af45652fde6e73d0586efb78c14b9d003e22\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:45:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:46:15Z is after 2025-08-24T17:21:41Z" Dec 03 06:46:15 crc kubenswrapper[4475]: I1203 06:46:15.572071 4475 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:34Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:34Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6444fe7571ebb90d4ff4b30dc1a397023310b50b1816d0197cb545b4f5f7480f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:45:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-12-03T06:46:15Z is after 2025-08-24T17:21:41Z" Dec 03 06:46:15 crc kubenswrapper[4475]: I1203 06:46:15.579983 4475 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:32Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:32Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:46:15Z is after 2025-08-24T17:21:41Z" Dec 03 06:46:15 crc kubenswrapper[4475]: I1203 06:46:15.586758 4475 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-hq2rn" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7e9dd470-572a-4396-9be7-1a37e3c48977\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:47Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:47Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cg4hm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cg4hm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T06:45:47Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-hq2rn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:46:15Z is after 2025-08-24T17:21:41Z" Dec 03 06:46:15 crc 
kubenswrapper[4475]: I1203 06:46:15.594528 4475 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:32Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:32Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:46:15Z is after 2025-08-24T17:21:41Z" Dec 03 06:46:15 crc kubenswrapper[4475]: I1203 06:46:15.603581 4475 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-k9cmc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7168f008-1b03-40cf-94fa-a71d470454bf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://625db083ebf24244e0b28ac937bfa2554497ca35b8f7a1fee0ac739d647c70de\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:45:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s6bbq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://31584b054f88aa7f7e4f1096e2b11acf6f106b7f2e4ced19768808e5df1a6acc\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://31584b054f88aa7f7e4f1096e2b11acf6f106b7f2e4ced19768808e5df1a6acc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T06:45:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T06:45:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s6bbq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9a644e827feb786d7298e41022ef3bc0d2483279c106dddea8e2c7a3c62c3c0d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9a644e827feb786d7298e41022ef3bc0d2483279c106dddea8e2c7a3c62c3c0d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T06:45:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T06:45:36Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s6bbq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://742f2f4dc23fff3df8e6d67902ef721b3db1823653b11a69faabdaf8d7650667\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://742f2f4dc23fff3df8e6d67902ef721b3db1823653b11a69faabdaf8d7650667\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T06:45:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T06:45:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s6bbq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2e5e8
74d26bf8bc806d74d55a8b9306cc30cca122d2ae0731b0a76ae7ac30450\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2e5e874d26bf8bc806d74d55a8b9306cc30cca122d2ae0731b0a76ae7ac30450\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T06:45:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T06:45:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s6bbq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d644ab44eabce045c9f9b23fab29e574e2f9f49c0cc14b830560996a0ec98880\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d644ab44eabce045c9f9b23fab29e574e2f9f49c0cc14b830560996a0ec98880\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T06:45:39Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2025-12-03T06:45:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s6bbq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ae9935e33badff0b016f8b5a02cb59d8b64451364581023ca3ec8e87fba0aa3a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ae9935e33badff0b016f8b5a02cb59d8b64451364581023ca3ec8e87fba0aa3a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T06:45:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T06:45:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s6bbq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T06:45:35Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-k9cmc\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:46:15Z is after 2025-08-24T17:21:41Z" Dec 03 06:46:15 crc kubenswrapper[4475]: I1203 06:46:15.605666 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:46:15 crc kubenswrapper[4475]: I1203 06:46:15.605692 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:46:15 crc kubenswrapper[4475]: I1203 06:46:15.605701 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:46:15 crc kubenswrapper[4475]: I1203 06:46:15.605712 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:46:15 crc kubenswrapper[4475]: I1203 06:46:15.605719 4475 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:46:15Z","lastTransitionTime":"2025-12-03T06:46:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:46:15 crc kubenswrapper[4475]: I1203 06:46:15.611631 4475 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"57d2f580-9528-4200-b0a4-797fed1ae972\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://822cdbfb2e81d80c5de0253daa42f2a5c89e9cd0eb8a5c3cf620780d17f9a6d0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:45:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-d
ir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d66a9136874b2e25c94cd291aa6d7f4694ac409f16766fd69c8aab8068a441fb\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:45:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e40c4f29925f494c0f5f01e2ecbcd2e4db2a5f3911a55a874c6d0006f01982de\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:45:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7fc0ee9e5a408a0a9e701afaf1db7bc3f58fd1830044730e9c680664642b5e4e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945
c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:45:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1dd8bd42f01469966b55416fc8af1dd71d341c774263bb3a56190af4cd9e7daa\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:45:17Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5da1155d7b5e933e5db3acc4c1a3fa1b3b90fd79289641f9a3d1290956128628\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5da1155d7b5e933e5db3acc4c1a3fa1b3b90fd79289641f9a3d1290956128628\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T06:45:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T06:45:16Z\\\"}
},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T06:45:15Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:46:15Z is after 2025-08-24T17:21:41Z" Dec 03 06:46:15 crc kubenswrapper[4475]: I1203 06:46:15.619422 4475 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a0af3d80-5aae-4d3b-a974-490687df49f3\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fa848c68a20d5db5c603cafa808518de84e427cbeea4bbc1be31151e6f839b3e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85ae
d21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:45:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fe3e0d5fed18fddd7a1174f7a9f12290ce318e9a0de40fe432c79f6f2e24a608\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:45:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://43c05977da6544bc781a279fcddb3279dfee510fdd0a6f4f1a22b8629f17475f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"star
tedAt\\\":\\\"2025-12-03T06:45:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ef987b2e9a0fa630edf6d5c06d5f47c5debd1b75d4626aefe7d8ef44bb974eb8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:45:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T06:45:15Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:46:15Z is after 2025-08-24T17:21:41Z" Dec 03 06:46:15 crc kubenswrapper[4475]: I1203 06:46:15.626944 4475 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" 
err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:32Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:32Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:46:15Z is after 2025-08-24T17:21:41Z" Dec 03 06:46:15 crc kubenswrapper[4475]: I1203 06:46:15.633901 4475 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-tjbzg" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"91aee7be-4a52-4598-803f-2deebe0674de\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f13f644093fd1214d8fb39853857b4113dd7fde64f1a60ff6848fd4c5350f5b6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:45:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xvqvg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://159d103ae2d5d19ea94c57a59b534773f0e32f4c
b379a412b63ca743e221096e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:45:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xvqvg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T06:45:35Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-tjbzg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:46:15Z is after 2025-08-24T17:21:41Z" Dec 03 06:46:15 crc kubenswrapper[4475]: I1203 06:46:15.640695 4475 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-sbkp5" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b1df0a77-f3cc-49ab-9fbb-8a4c7608291b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5938dd3c72bee55a3a07312d31a0eaf2df226bb931b07300d71b6e7ff69c905b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:45:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-65wzb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4408ad7b7f122c0364b95e0e9761bc28dfb02
e7ea00537a70fc031c16b38be6b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:45:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-65wzb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T06:45:46Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-sbkp5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:46:15Z is after 2025-08-24T17:21:41Z" Dec 03 06:46:15 crc kubenswrapper[4475]: I1203 06:46:15.708188 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:46:15 crc kubenswrapper[4475]: I1203 06:46:15.708216 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:46:15 crc kubenswrapper[4475]: I1203 06:46:15.708226 4475 kubelet_node_status.go:724] "Recording 
event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:46:15 crc kubenswrapper[4475]: I1203 06:46:15.708238 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:46:15 crc kubenswrapper[4475]: I1203 06:46:15.708246 4475 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:46:15Z","lastTransitionTime":"2025-12-03T06:46:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 06:46:15 crc kubenswrapper[4475]: I1203 06:46:15.810061 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:46:15 crc kubenswrapper[4475]: I1203 06:46:15.810204 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:46:15 crc kubenswrapper[4475]: I1203 06:46:15.810269 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:46:15 crc kubenswrapper[4475]: I1203 06:46:15.810332 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:46:15 crc kubenswrapper[4475]: I1203 06:46:15.810384 4475 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:46:15Z","lastTransitionTime":"2025-12-03T06:46:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:46:15 crc kubenswrapper[4475]: I1203 06:46:15.911558 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:46:15 crc kubenswrapper[4475]: I1203 06:46:15.911590 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:46:15 crc kubenswrapper[4475]: I1203 06:46:15.911600 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:46:15 crc kubenswrapper[4475]: I1203 06:46:15.911613 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:46:15 crc kubenswrapper[4475]: I1203 06:46:15.911621 4475 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:46:15Z","lastTransitionTime":"2025-12-03T06:46:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:46:16 crc kubenswrapper[4475]: I1203 06:46:16.013713 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:46:16 crc kubenswrapper[4475]: I1203 06:46:16.013908 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:46:16 crc kubenswrapper[4475]: I1203 06:46:16.013918 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:46:16 crc kubenswrapper[4475]: I1203 06:46:16.013929 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:46:16 crc kubenswrapper[4475]: I1203 06:46:16.013937 4475 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:46:16Z","lastTransitionTime":"2025-12-03T06:46:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:46:16 crc kubenswrapper[4475]: I1203 06:46:16.115928 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:46:16 crc kubenswrapper[4475]: I1203 06:46:16.115960 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:46:16 crc kubenswrapper[4475]: I1203 06:46:16.115969 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:46:16 crc kubenswrapper[4475]: I1203 06:46:16.115983 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:46:16 crc kubenswrapper[4475]: I1203 06:46:16.115994 4475 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:46:16Z","lastTransitionTime":"2025-12-03T06:46:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:46:16 crc kubenswrapper[4475]: I1203 06:46:16.217735 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:46:16 crc kubenswrapper[4475]: I1203 06:46:16.217774 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:46:16 crc kubenswrapper[4475]: I1203 06:46:16.217786 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:46:16 crc kubenswrapper[4475]: I1203 06:46:16.217803 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:46:16 crc kubenswrapper[4475]: I1203 06:46:16.217812 4475 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:46:16Z","lastTransitionTime":"2025-12-03T06:46:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:46:16 crc kubenswrapper[4475]: I1203 06:46:16.319838 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:46:16 crc kubenswrapper[4475]: I1203 06:46:16.319883 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:46:16 crc kubenswrapper[4475]: I1203 06:46:16.319893 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:46:16 crc kubenswrapper[4475]: I1203 06:46:16.319905 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:46:16 crc kubenswrapper[4475]: I1203 06:46:16.319914 4475 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:46:16Z","lastTransitionTime":"2025-12-03T06:46:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:46:16 crc kubenswrapper[4475]: I1203 06:46:16.421775 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:46:16 crc kubenswrapper[4475]: I1203 06:46:16.421805 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:46:16 crc kubenswrapper[4475]: I1203 06:46:16.421813 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:46:16 crc kubenswrapper[4475]: I1203 06:46:16.421850 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:46:16 crc kubenswrapper[4475]: I1203 06:46:16.421858 4475 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:46:16Z","lastTransitionTime":"2025-12-03T06:46:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 06:46:16 crc kubenswrapper[4475]: I1203 06:46:16.490588 4475 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 03 06:46:16 crc kubenswrapper[4475]: I1203 06:46:16.490588 4475 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 03 06:46:16 crc kubenswrapper[4475]: E1203 06:46:16.490687 4475 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 03 06:46:16 crc kubenswrapper[4475]: E1203 06:46:16.490745 4475 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 03 06:46:16 crc kubenswrapper[4475]: I1203 06:46:16.490600 4475 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 03 06:46:16 crc kubenswrapper[4475]: E1203 06:46:16.490794 4475 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 03 06:46:16 crc kubenswrapper[4475]: I1203 06:46:16.523466 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:46:16 crc kubenswrapper[4475]: I1203 06:46:16.523492 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:46:16 crc kubenswrapper[4475]: I1203 06:46:16.523500 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:46:16 crc kubenswrapper[4475]: I1203 06:46:16.523510 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:46:16 crc kubenswrapper[4475]: I1203 06:46:16.523519 4475 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:46:16Z","lastTransitionTime":"2025-12-03T06:46:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:46:16 crc kubenswrapper[4475]: I1203 06:46:16.590891 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:46:16 crc kubenswrapper[4475]: I1203 06:46:16.590916 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:46:16 crc kubenswrapper[4475]: I1203 06:46:16.590926 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:46:16 crc kubenswrapper[4475]: I1203 06:46:16.590936 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:46:16 crc kubenswrapper[4475]: I1203 06:46:16.590944 4475 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:46:16Z","lastTransitionTime":"2025-12-03T06:46:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:46:16 crc kubenswrapper[4475]: E1203 06:46:16.600144 4475 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"7800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"24148068Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"8\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"24608868Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T06:46:16Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T06:46:16Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T06:46:16Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T06:46:16Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T06:46:16Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T06:46:16Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T06:46:16Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T06:46:16Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"b860fac6-8533-4b4b-bdad-0cb0561d1495\\\",\\\"systemUUID\\\":\\\"6c3f70a9-a9d8-4b80-a825-7a6426aa17aa\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:46:16Z is after 2025-08-24T17:21:41Z" Dec 03 06:46:16 crc kubenswrapper[4475]: I1203 06:46:16.602513 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:46:16 crc kubenswrapper[4475]: I1203 06:46:16.602534 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:46:16 crc kubenswrapper[4475]: I1203 06:46:16.602563 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:46:16 crc kubenswrapper[4475]: I1203 06:46:16.602574 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:46:16 crc kubenswrapper[4475]: I1203 06:46:16.602581 4475 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:46:16Z","lastTransitionTime":"2025-12-03T06:46:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:46:16 crc kubenswrapper[4475]: E1203 06:46:16.613083 4475 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"7800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"24148068Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"8\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"24608868Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T06:46:16Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T06:46:16Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T06:46:16Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T06:46:16Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T06:46:16Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T06:46:16Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T06:46:16Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T06:46:16Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"b860fac6-8533-4b4b-bdad-0cb0561d1495\\\",\\\"systemUUID\\\":\\\"6c3f70a9-a9d8-4b80-a825-7a6426aa17aa\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:46:16Z is after 2025-08-24T17:21:41Z" Dec 03 06:46:16 crc kubenswrapper[4475]: I1203 06:46:16.615926 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:46:16 crc kubenswrapper[4475]: I1203 06:46:16.615953 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:46:16 crc kubenswrapper[4475]: I1203 06:46:16.615961 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:46:16 crc kubenswrapper[4475]: I1203 06:46:16.615970 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:46:16 crc kubenswrapper[4475]: I1203 06:46:16.615978 4475 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:46:16Z","lastTransitionTime":"2025-12-03T06:46:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:46:16 crc kubenswrapper[4475]: E1203 06:46:16.623289 4475 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"7800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"24148068Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"8\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"24608868Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T06:46:16Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T06:46:16Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T06:46:16Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T06:46:16Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T06:46:16Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T06:46:16Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T06:46:16Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T06:46:16Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"b860fac6-8533-4b4b-bdad-0cb0561d1495\\\",\\\"systemUUID\\\":\\\"6c3f70a9-a9d8-4b80-a825-7a6426aa17aa\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:46:16Z is after 2025-08-24T17:21:41Z" Dec 03 06:46:16 crc kubenswrapper[4475]: I1203 06:46:16.625371 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:46:16 crc kubenswrapper[4475]: I1203 06:46:16.625495 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:46:16 crc kubenswrapper[4475]: I1203 06:46:16.625571 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:46:16 crc kubenswrapper[4475]: I1203 06:46:16.625655 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:46:16 crc kubenswrapper[4475]: I1203 06:46:16.625728 4475 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:46:16Z","lastTransitionTime":"2025-12-03T06:46:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:46:16 crc kubenswrapper[4475]: E1203 06:46:16.633490 4475 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"7800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"24148068Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"8\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"24608868Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T06:46:16Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T06:46:16Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T06:46:16Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T06:46:16Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T06:46:16Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T06:46:16Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T06:46:16Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T06:46:16Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"b860fac6-8533-4b4b-bdad-0cb0561d1495\\\",\\\"systemUUID\\\":\\\"6c3f70a9-a9d8-4b80-a825-7a6426aa17aa\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:46:16Z is after 2025-08-24T17:21:41Z" Dec 03 06:46:16 crc kubenswrapper[4475]: I1203 06:46:16.635745 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:46:16 crc kubenswrapper[4475]: I1203 06:46:16.635770 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:46:16 crc kubenswrapper[4475]: I1203 06:46:16.635779 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:46:16 crc kubenswrapper[4475]: I1203 06:46:16.635790 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:46:16 crc kubenswrapper[4475]: I1203 06:46:16.635799 4475 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:46:16Z","lastTransitionTime":"2025-12-03T06:46:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:46:16 crc kubenswrapper[4475]: E1203 06:46:16.643773 4475 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"7800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"24148068Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"8\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"24608868Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T06:46:16Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T06:46:16Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T06:46:16Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T06:46:16Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T06:46:16Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T06:46:16Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T06:46:16Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T06:46:16Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"b860fac6-8533-4b4b-bdad-0cb0561d1495\\\",\\\"systemUUID\\\":\\\"6c3f70a9-a9d8-4b80-a825-7a6426aa17aa\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:46:16Z is after 2025-08-24T17:21:41Z" Dec 03 06:46:16 crc kubenswrapper[4475]: E1203 06:46:16.643883 4475 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Dec 03 06:46:16 crc kubenswrapper[4475]: I1203 06:46:16.644876 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:46:16 crc kubenswrapper[4475]: I1203 06:46:16.644900 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:46:16 crc kubenswrapper[4475]: I1203 06:46:16.644910 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:46:16 crc kubenswrapper[4475]: I1203 06:46:16.644920 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:46:16 crc kubenswrapper[4475]: I1203 06:46:16.644928 4475 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:46:16Z","lastTransitionTime":"2025-12-03T06:46:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:46:16 crc kubenswrapper[4475]: I1203 06:46:16.747013 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:46:16 crc kubenswrapper[4475]: I1203 06:46:16.747040 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:46:16 crc kubenswrapper[4475]: I1203 06:46:16.747048 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:46:16 crc kubenswrapper[4475]: I1203 06:46:16.747060 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:46:16 crc kubenswrapper[4475]: I1203 06:46:16.747068 4475 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:46:16Z","lastTransitionTime":"2025-12-03T06:46:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:46:16 crc kubenswrapper[4475]: I1203 06:46:16.848530 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:46:16 crc kubenswrapper[4475]: I1203 06:46:16.848564 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:46:16 crc kubenswrapper[4475]: I1203 06:46:16.848573 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:46:16 crc kubenswrapper[4475]: I1203 06:46:16.848587 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:46:16 crc kubenswrapper[4475]: I1203 06:46:16.848598 4475 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:46:16Z","lastTransitionTime":"2025-12-03T06:46:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:46:16 crc kubenswrapper[4475]: I1203 06:46:16.950530 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:46:16 crc kubenswrapper[4475]: I1203 06:46:16.950864 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:46:16 crc kubenswrapper[4475]: I1203 06:46:16.950930 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:46:16 crc kubenswrapper[4475]: I1203 06:46:16.950999 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:46:16 crc kubenswrapper[4475]: I1203 06:46:16.951050 4475 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:46:16Z","lastTransitionTime":"2025-12-03T06:46:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:46:17 crc kubenswrapper[4475]: I1203 06:46:17.053025 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:46:17 crc kubenswrapper[4475]: I1203 06:46:17.053051 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:46:17 crc kubenswrapper[4475]: I1203 06:46:17.053059 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:46:17 crc kubenswrapper[4475]: I1203 06:46:17.053070 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:46:17 crc kubenswrapper[4475]: I1203 06:46:17.053078 4475 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:46:17Z","lastTransitionTime":"2025-12-03T06:46:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:46:17 crc kubenswrapper[4475]: I1203 06:46:17.154086 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:46:17 crc kubenswrapper[4475]: I1203 06:46:17.154110 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:46:17 crc kubenswrapper[4475]: I1203 06:46:17.154118 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:46:17 crc kubenswrapper[4475]: I1203 06:46:17.154129 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:46:17 crc kubenswrapper[4475]: I1203 06:46:17.154137 4475 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:46:17Z","lastTransitionTime":"2025-12-03T06:46:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:46:17 crc kubenswrapper[4475]: I1203 06:46:17.256057 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:46:17 crc kubenswrapper[4475]: I1203 06:46:17.256078 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:46:17 crc kubenswrapper[4475]: I1203 06:46:17.256087 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:46:17 crc kubenswrapper[4475]: I1203 06:46:17.256096 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:46:17 crc kubenswrapper[4475]: I1203 06:46:17.256104 4475 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:46:17Z","lastTransitionTime":"2025-12-03T06:46:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:46:17 crc kubenswrapper[4475]: I1203 06:46:17.357989 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:46:17 crc kubenswrapper[4475]: I1203 06:46:17.358017 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:46:17 crc kubenswrapper[4475]: I1203 06:46:17.358025 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:46:17 crc kubenswrapper[4475]: I1203 06:46:17.358037 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:46:17 crc kubenswrapper[4475]: I1203 06:46:17.358044 4475 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:46:17Z","lastTransitionTime":"2025-12-03T06:46:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:46:17 crc kubenswrapper[4475]: I1203 06:46:17.459365 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:46:17 crc kubenswrapper[4475]: I1203 06:46:17.459391 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:46:17 crc kubenswrapper[4475]: I1203 06:46:17.459400 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:46:17 crc kubenswrapper[4475]: I1203 06:46:17.459412 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:46:17 crc kubenswrapper[4475]: I1203 06:46:17.459420 4475 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:46:17Z","lastTransitionTime":"2025-12-03T06:46:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 06:46:17 crc kubenswrapper[4475]: I1203 06:46:17.492047 4475 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-hq2rn" Dec 03 06:46:17 crc kubenswrapper[4475]: E1203 06:46:17.492127 4475 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-hq2rn" podUID="7e9dd470-572a-4396-9be7-1a37e3c48977" Dec 03 06:46:17 crc kubenswrapper[4475]: I1203 06:46:17.561720 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:46:17 crc kubenswrapper[4475]: I1203 06:46:17.561743 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:46:17 crc kubenswrapper[4475]: I1203 06:46:17.561750 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:46:17 crc kubenswrapper[4475]: I1203 06:46:17.561760 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:46:17 crc kubenswrapper[4475]: I1203 06:46:17.561769 4475 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:46:17Z","lastTransitionTime":"2025-12-03T06:46:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:46:17 crc kubenswrapper[4475]: I1203 06:46:17.663021 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:46:17 crc kubenswrapper[4475]: I1203 06:46:17.663042 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:46:17 crc kubenswrapper[4475]: I1203 06:46:17.663050 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:46:17 crc kubenswrapper[4475]: I1203 06:46:17.663059 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:46:17 crc kubenswrapper[4475]: I1203 06:46:17.663067 4475 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:46:17Z","lastTransitionTime":"2025-12-03T06:46:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:46:17 crc kubenswrapper[4475]: I1203 06:46:17.764290 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:46:17 crc kubenswrapper[4475]: I1203 06:46:17.764312 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:46:17 crc kubenswrapper[4475]: I1203 06:46:17.764320 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:46:17 crc kubenswrapper[4475]: I1203 06:46:17.764328 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:46:17 crc kubenswrapper[4475]: I1203 06:46:17.764335 4475 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:46:17Z","lastTransitionTime":"2025-12-03T06:46:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:46:17 crc kubenswrapper[4475]: I1203 06:46:17.865438 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:46:17 crc kubenswrapper[4475]: I1203 06:46:17.865564 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:46:17 crc kubenswrapper[4475]: I1203 06:46:17.865711 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:46:17 crc kubenswrapper[4475]: I1203 06:46:17.865836 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:46:17 crc kubenswrapper[4475]: I1203 06:46:17.865951 4475 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:46:17Z","lastTransitionTime":"2025-12-03T06:46:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:46:17 crc kubenswrapper[4475]: I1203 06:46:17.966967 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:46:17 crc kubenswrapper[4475]: I1203 06:46:17.966995 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:46:17 crc kubenswrapper[4475]: I1203 06:46:17.967004 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:46:17 crc kubenswrapper[4475]: I1203 06:46:17.967017 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:46:17 crc kubenswrapper[4475]: I1203 06:46:17.967025 4475 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:46:17Z","lastTransitionTime":"2025-12-03T06:46:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:46:18 crc kubenswrapper[4475]: I1203 06:46:18.068762 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:46:18 crc kubenswrapper[4475]: I1203 06:46:18.068787 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:46:18 crc kubenswrapper[4475]: I1203 06:46:18.068796 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:46:18 crc kubenswrapper[4475]: I1203 06:46:18.068805 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:46:18 crc kubenswrapper[4475]: I1203 06:46:18.068813 4475 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:46:18Z","lastTransitionTime":"2025-12-03T06:46:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:46:18 crc kubenswrapper[4475]: I1203 06:46:18.170602 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:46:18 crc kubenswrapper[4475]: I1203 06:46:18.170701 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:46:18 crc kubenswrapper[4475]: I1203 06:46:18.170764 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:46:18 crc kubenswrapper[4475]: I1203 06:46:18.170826 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:46:18 crc kubenswrapper[4475]: I1203 06:46:18.170877 4475 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:46:18Z","lastTransitionTime":"2025-12-03T06:46:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:46:18 crc kubenswrapper[4475]: I1203 06:46:18.272029 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:46:18 crc kubenswrapper[4475]: I1203 06:46:18.272143 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:46:18 crc kubenswrapper[4475]: I1203 06:46:18.272281 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:46:18 crc kubenswrapper[4475]: I1203 06:46:18.272408 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:46:18 crc kubenswrapper[4475]: I1203 06:46:18.272558 4475 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:46:18Z","lastTransitionTime":"2025-12-03T06:46:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:46:18 crc kubenswrapper[4475]: I1203 06:46:18.374335 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:46:18 crc kubenswrapper[4475]: I1203 06:46:18.374374 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:46:18 crc kubenswrapper[4475]: I1203 06:46:18.374382 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:46:18 crc kubenswrapper[4475]: I1203 06:46:18.374391 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:46:18 crc kubenswrapper[4475]: I1203 06:46:18.374398 4475 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:46:18Z","lastTransitionTime":"2025-12-03T06:46:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:46:18 crc kubenswrapper[4475]: I1203 06:46:18.475481 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:46:18 crc kubenswrapper[4475]: I1203 06:46:18.475643 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:46:18 crc kubenswrapper[4475]: I1203 06:46:18.475715 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:46:18 crc kubenswrapper[4475]: I1203 06:46:18.475771 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:46:18 crc kubenswrapper[4475]: I1203 06:46:18.475858 4475 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:46:18Z","lastTransitionTime":"2025-12-03T06:46:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 06:46:18 crc kubenswrapper[4475]: I1203 06:46:18.490637 4475 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 03 06:46:18 crc kubenswrapper[4475]: I1203 06:46:18.490830 4475 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 03 06:46:18 crc kubenswrapper[4475]: E1203 06:46:18.491031 4475 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 03 06:46:18 crc kubenswrapper[4475]: E1203 06:46:18.490776 4475 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 03 06:46:18 crc kubenswrapper[4475]: I1203 06:46:18.490900 4475 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 03 06:46:18 crc kubenswrapper[4475]: E1203 06:46:18.491398 4475 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 03 06:46:18 crc kubenswrapper[4475]: I1203 06:46:18.577814 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:46:18 crc kubenswrapper[4475]: I1203 06:46:18.578002 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:46:18 crc kubenswrapper[4475]: I1203 06:46:18.578059 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:46:18 crc kubenswrapper[4475]: I1203 06:46:18.578113 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:46:18 crc kubenswrapper[4475]: I1203 06:46:18.578187 4475 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:46:18Z","lastTransitionTime":"2025-12-03T06:46:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:46:18 crc kubenswrapper[4475]: I1203 06:46:18.679504 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:46:18 crc kubenswrapper[4475]: I1203 06:46:18.679541 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:46:18 crc kubenswrapper[4475]: I1203 06:46:18.679551 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:46:18 crc kubenswrapper[4475]: I1203 06:46:18.679564 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:46:18 crc kubenswrapper[4475]: I1203 06:46:18.679573 4475 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:46:18Z","lastTransitionTime":"2025-12-03T06:46:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:46:18 crc kubenswrapper[4475]: I1203 06:46:18.781283 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:46:18 crc kubenswrapper[4475]: I1203 06:46:18.781311 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:46:18 crc kubenswrapper[4475]: I1203 06:46:18.781321 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:46:18 crc kubenswrapper[4475]: I1203 06:46:18.781333 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:46:18 crc kubenswrapper[4475]: I1203 06:46:18.781341 4475 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:46:18Z","lastTransitionTime":"2025-12-03T06:46:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:46:18 crc kubenswrapper[4475]: I1203 06:46:18.882931 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:46:18 crc kubenswrapper[4475]: I1203 06:46:18.882954 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:46:18 crc kubenswrapper[4475]: I1203 06:46:18.882963 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:46:18 crc kubenswrapper[4475]: I1203 06:46:18.882973 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:46:18 crc kubenswrapper[4475]: I1203 06:46:18.882980 4475 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:46:18Z","lastTransitionTime":"2025-12-03T06:46:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:46:18 crc kubenswrapper[4475]: I1203 06:46:18.984065 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:46:18 crc kubenswrapper[4475]: I1203 06:46:18.984091 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:46:18 crc kubenswrapper[4475]: I1203 06:46:18.984100 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:46:18 crc kubenswrapper[4475]: I1203 06:46:18.984112 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:46:18 crc kubenswrapper[4475]: I1203 06:46:18.984120 4475 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:46:18Z","lastTransitionTime":"2025-12-03T06:46:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:46:19 crc kubenswrapper[4475]: I1203 06:46:19.085559 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:46:19 crc kubenswrapper[4475]: I1203 06:46:19.085587 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:46:19 crc kubenswrapper[4475]: I1203 06:46:19.085595 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:46:19 crc kubenswrapper[4475]: I1203 06:46:19.085609 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:46:19 crc kubenswrapper[4475]: I1203 06:46:19.085617 4475 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:46:19Z","lastTransitionTime":"2025-12-03T06:46:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:46:19 crc kubenswrapper[4475]: I1203 06:46:19.187016 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:46:19 crc kubenswrapper[4475]: I1203 06:46:19.187045 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:46:19 crc kubenswrapper[4475]: I1203 06:46:19.187054 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:46:19 crc kubenswrapper[4475]: I1203 06:46:19.187065 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:46:19 crc kubenswrapper[4475]: I1203 06:46:19.187073 4475 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:46:19Z","lastTransitionTime":"2025-12-03T06:46:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:46:19 crc kubenswrapper[4475]: I1203 06:46:19.288579 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:46:19 crc kubenswrapper[4475]: I1203 06:46:19.288603 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:46:19 crc kubenswrapper[4475]: I1203 06:46:19.288611 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:46:19 crc kubenswrapper[4475]: I1203 06:46:19.288623 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:46:19 crc kubenswrapper[4475]: I1203 06:46:19.288631 4475 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:46:19Z","lastTransitionTime":"2025-12-03T06:46:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:46:19 crc kubenswrapper[4475]: I1203 06:46:19.390056 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:46:19 crc kubenswrapper[4475]: I1203 06:46:19.390089 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:46:19 crc kubenswrapper[4475]: I1203 06:46:19.390098 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:46:19 crc kubenswrapper[4475]: I1203 06:46:19.390110 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:46:19 crc kubenswrapper[4475]: I1203 06:46:19.390119 4475 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:46:19Z","lastTransitionTime":"2025-12-03T06:46:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 06:46:19 crc kubenswrapper[4475]: I1203 06:46:19.491151 4475 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-hq2rn" Dec 03 06:46:19 crc kubenswrapper[4475]: E1203 06:46:19.491232 4475 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-hq2rn" podUID="7e9dd470-572a-4396-9be7-1a37e3c48977" Dec 03 06:46:19 crc kubenswrapper[4475]: I1203 06:46:19.492807 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:46:19 crc kubenswrapper[4475]: I1203 06:46:19.492829 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:46:19 crc kubenswrapper[4475]: I1203 06:46:19.492836 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:46:19 crc kubenswrapper[4475]: I1203 06:46:19.492845 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:46:19 crc kubenswrapper[4475]: I1203 06:46:19.492858 4475 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:46:19Z","lastTransitionTime":"2025-12-03T06:46:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:46:19 crc kubenswrapper[4475]: I1203 06:46:19.594241 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:46:19 crc kubenswrapper[4475]: I1203 06:46:19.594268 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:46:19 crc kubenswrapper[4475]: I1203 06:46:19.594277 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:46:19 crc kubenswrapper[4475]: I1203 06:46:19.594288 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:46:19 crc kubenswrapper[4475]: I1203 06:46:19.594295 4475 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:46:19Z","lastTransitionTime":"2025-12-03T06:46:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:46:19 crc kubenswrapper[4475]: I1203 06:46:19.695629 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:46:19 crc kubenswrapper[4475]: I1203 06:46:19.695655 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:46:19 crc kubenswrapper[4475]: I1203 06:46:19.695663 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:46:19 crc kubenswrapper[4475]: I1203 06:46:19.695673 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:46:19 crc kubenswrapper[4475]: I1203 06:46:19.695680 4475 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:46:19Z","lastTransitionTime":"2025-12-03T06:46:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:46:19 crc kubenswrapper[4475]: I1203 06:46:19.796970 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:46:19 crc kubenswrapper[4475]: I1203 06:46:19.796995 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:46:19 crc kubenswrapper[4475]: I1203 06:46:19.797003 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:46:19 crc kubenswrapper[4475]: I1203 06:46:19.797013 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:46:19 crc kubenswrapper[4475]: I1203 06:46:19.797019 4475 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:46:19Z","lastTransitionTime":"2025-12-03T06:46:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:46:19 crc kubenswrapper[4475]: I1203 06:46:19.844400 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/7e9dd470-572a-4396-9be7-1a37e3c48977-metrics-certs\") pod \"network-metrics-daemon-hq2rn\" (UID: \"7e9dd470-572a-4396-9be7-1a37e3c48977\") " pod="openshift-multus/network-metrics-daemon-hq2rn" Dec 03 06:46:19 crc kubenswrapper[4475]: E1203 06:46:19.844508 4475 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Dec 03 06:46:19 crc kubenswrapper[4475]: E1203 06:46:19.844549 4475 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7e9dd470-572a-4396-9be7-1a37e3c48977-metrics-certs podName:7e9dd470-572a-4396-9be7-1a37e3c48977 nodeName:}" failed. No retries permitted until 2025-12-03 06:46:51.844537152 +0000 UTC m=+96.649435485 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/7e9dd470-572a-4396-9be7-1a37e3c48977-metrics-certs") pod "network-metrics-daemon-hq2rn" (UID: "7e9dd470-572a-4396-9be7-1a37e3c48977") : object "openshift-multus"/"metrics-daemon-secret" not registered Dec 03 06:46:19 crc kubenswrapper[4475]: I1203 06:46:19.898260 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:46:19 crc kubenswrapper[4475]: I1203 06:46:19.898290 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:46:19 crc kubenswrapper[4475]: I1203 06:46:19.898301 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:46:19 crc kubenswrapper[4475]: I1203 06:46:19.898313 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:46:19 crc kubenswrapper[4475]: I1203 06:46:19.898323 4475 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:46:19Z","lastTransitionTime":"2025-12-03T06:46:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:46:19 crc kubenswrapper[4475]: I1203 06:46:19.999900 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:46:20 crc kubenswrapper[4475]: I1203 06:46:19.999922 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:46:20 crc kubenswrapper[4475]: I1203 06:46:19.999931 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:46:20 crc kubenswrapper[4475]: I1203 06:46:19.999940 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:46:20 crc kubenswrapper[4475]: I1203 06:46:19.999974 4475 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:46:19Z","lastTransitionTime":"2025-12-03T06:46:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:46:20 crc kubenswrapper[4475]: I1203 06:46:20.101645 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:46:20 crc kubenswrapper[4475]: I1203 06:46:20.101675 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:46:20 crc kubenswrapper[4475]: I1203 06:46:20.101685 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:46:20 crc kubenswrapper[4475]: I1203 06:46:20.101698 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:46:20 crc kubenswrapper[4475]: I1203 06:46:20.101706 4475 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:46:20Z","lastTransitionTime":"2025-12-03T06:46:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:46:20 crc kubenswrapper[4475]: I1203 06:46:20.203505 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:46:20 crc kubenswrapper[4475]: I1203 06:46:20.203536 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:46:20 crc kubenswrapper[4475]: I1203 06:46:20.203545 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:46:20 crc kubenswrapper[4475]: I1203 06:46:20.203555 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:46:20 crc kubenswrapper[4475]: I1203 06:46:20.203564 4475 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:46:20Z","lastTransitionTime":"2025-12-03T06:46:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:46:20 crc kubenswrapper[4475]: I1203 06:46:20.305205 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:46:20 crc kubenswrapper[4475]: I1203 06:46:20.305231 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:46:20 crc kubenswrapper[4475]: I1203 06:46:20.305251 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:46:20 crc kubenswrapper[4475]: I1203 06:46:20.305262 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:46:20 crc kubenswrapper[4475]: I1203 06:46:20.305270 4475 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:46:20Z","lastTransitionTime":"2025-12-03T06:46:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:46:20 crc kubenswrapper[4475]: I1203 06:46:20.407120 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:46:20 crc kubenswrapper[4475]: I1203 06:46:20.407147 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:46:20 crc kubenswrapper[4475]: I1203 06:46:20.407156 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:46:20 crc kubenswrapper[4475]: I1203 06:46:20.407168 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:46:20 crc kubenswrapper[4475]: I1203 06:46:20.407177 4475 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:46:20Z","lastTransitionTime":"2025-12-03T06:46:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 06:46:20 crc kubenswrapper[4475]: I1203 06:46:20.490513 4475 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 03 06:46:20 crc kubenswrapper[4475]: I1203 06:46:20.490531 4475 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 03 06:46:20 crc kubenswrapper[4475]: I1203 06:46:20.490526 4475 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 03 06:46:20 crc kubenswrapper[4475]: E1203 06:46:20.490616 4475 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 03 06:46:20 crc kubenswrapper[4475]: E1203 06:46:20.490701 4475 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 03 06:46:20 crc kubenswrapper[4475]: E1203 06:46:20.490749 4475 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 03 06:46:20 crc kubenswrapper[4475]: I1203 06:46:20.508443 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:46:20 crc kubenswrapper[4475]: I1203 06:46:20.508902 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:46:20 crc kubenswrapper[4475]: I1203 06:46:20.508914 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:46:20 crc kubenswrapper[4475]: I1203 06:46:20.508926 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:46:20 crc kubenswrapper[4475]: I1203 06:46:20.508935 4475 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:46:20Z","lastTransitionTime":"2025-12-03T06:46:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:46:20 crc kubenswrapper[4475]: I1203 06:46:20.610538 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:46:20 crc kubenswrapper[4475]: I1203 06:46:20.610565 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:46:20 crc kubenswrapper[4475]: I1203 06:46:20.610574 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:46:20 crc kubenswrapper[4475]: I1203 06:46:20.610586 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:46:20 crc kubenswrapper[4475]: I1203 06:46:20.610594 4475 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:46:20Z","lastTransitionTime":"2025-12-03T06:46:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:46:20 crc kubenswrapper[4475]: I1203 06:46:20.712621 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:46:20 crc kubenswrapper[4475]: I1203 06:46:20.712647 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:46:20 crc kubenswrapper[4475]: I1203 06:46:20.712655 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:46:20 crc kubenswrapper[4475]: I1203 06:46:20.712714 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:46:20 crc kubenswrapper[4475]: I1203 06:46:20.712724 4475 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:46:20Z","lastTransitionTime":"2025-12-03T06:46:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:46:20 crc kubenswrapper[4475]: I1203 06:46:20.814796 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:46:20 crc kubenswrapper[4475]: I1203 06:46:20.814823 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:46:20 crc kubenswrapper[4475]: I1203 06:46:20.814831 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:46:20 crc kubenswrapper[4475]: I1203 06:46:20.814840 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:46:20 crc kubenswrapper[4475]: I1203 06:46:20.814848 4475 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:46:20Z","lastTransitionTime":"2025-12-03T06:46:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:46:20 crc kubenswrapper[4475]: I1203 06:46:20.916364 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:46:20 crc kubenswrapper[4475]: I1203 06:46:20.916391 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:46:20 crc kubenswrapper[4475]: I1203 06:46:20.916399 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:46:20 crc kubenswrapper[4475]: I1203 06:46:20.916411 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:46:20 crc kubenswrapper[4475]: I1203 06:46:20.916419 4475 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:46:20Z","lastTransitionTime":"2025-12-03T06:46:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:46:21 crc kubenswrapper[4475]: I1203 06:46:21.018660 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:46:21 crc kubenswrapper[4475]: I1203 06:46:21.018687 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:46:21 crc kubenswrapper[4475]: I1203 06:46:21.018696 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:46:21 crc kubenswrapper[4475]: I1203 06:46:21.018706 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:46:21 crc kubenswrapper[4475]: I1203 06:46:21.018713 4475 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:46:21Z","lastTransitionTime":"2025-12-03T06:46:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:46:21 crc kubenswrapper[4475]: I1203 06:46:21.120576 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:46:21 crc kubenswrapper[4475]: I1203 06:46:21.120604 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:46:21 crc kubenswrapper[4475]: I1203 06:46:21.120629 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:46:21 crc kubenswrapper[4475]: I1203 06:46:21.120642 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:46:21 crc kubenswrapper[4475]: I1203 06:46:21.120650 4475 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:46:21Z","lastTransitionTime":"2025-12-03T06:46:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:46:21 crc kubenswrapper[4475]: I1203 06:46:21.222154 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:46:21 crc kubenswrapper[4475]: I1203 06:46:21.222186 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:46:21 crc kubenswrapper[4475]: I1203 06:46:21.222194 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:46:21 crc kubenswrapper[4475]: I1203 06:46:21.222206 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:46:21 crc kubenswrapper[4475]: I1203 06:46:21.222215 4475 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:46:21Z","lastTransitionTime":"2025-12-03T06:46:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:46:21 crc kubenswrapper[4475]: I1203 06:46:21.323246 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:46:21 crc kubenswrapper[4475]: I1203 06:46:21.323431 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:46:21 crc kubenswrapper[4475]: I1203 06:46:21.323551 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:46:21 crc kubenswrapper[4475]: I1203 06:46:21.323637 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:46:21 crc kubenswrapper[4475]: I1203 06:46:21.323718 4475 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:46:21Z","lastTransitionTime":"2025-12-03T06:46:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:46:21 crc kubenswrapper[4475]: I1203 06:46:21.425897 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:46:21 crc kubenswrapper[4475]: I1203 06:46:21.425934 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:46:21 crc kubenswrapper[4475]: I1203 06:46:21.425944 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:46:21 crc kubenswrapper[4475]: I1203 06:46:21.425957 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:46:21 crc kubenswrapper[4475]: I1203 06:46:21.425966 4475 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:46:21Z","lastTransitionTime":"2025-12-03T06:46:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 06:46:21 crc kubenswrapper[4475]: I1203 06:46:21.490574 4475 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-hq2rn" Dec 03 06:46:21 crc kubenswrapper[4475]: E1203 06:46:21.490658 4475 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-hq2rn" podUID="7e9dd470-572a-4396-9be7-1a37e3c48977" Dec 03 06:46:21 crc kubenswrapper[4475]: I1203 06:46:21.528044 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:46:21 crc kubenswrapper[4475]: I1203 06:46:21.528072 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:46:21 crc kubenswrapper[4475]: I1203 06:46:21.528082 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:46:21 crc kubenswrapper[4475]: I1203 06:46:21.528092 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:46:21 crc kubenswrapper[4475]: I1203 06:46:21.528099 4475 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:46:21Z","lastTransitionTime":"2025-12-03T06:46:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:46:21 crc kubenswrapper[4475]: I1203 06:46:21.629687 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:46:21 crc kubenswrapper[4475]: I1203 06:46:21.629715 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:46:21 crc kubenswrapper[4475]: I1203 06:46:21.629723 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:46:21 crc kubenswrapper[4475]: I1203 06:46:21.629735 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:46:21 crc kubenswrapper[4475]: I1203 06:46:21.629744 4475 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:46:21Z","lastTransitionTime":"2025-12-03T06:46:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:46:21 crc kubenswrapper[4475]: I1203 06:46:21.731038 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:46:21 crc kubenswrapper[4475]: I1203 06:46:21.731064 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:46:21 crc kubenswrapper[4475]: I1203 06:46:21.731072 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:46:21 crc kubenswrapper[4475]: I1203 06:46:21.731082 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:46:21 crc kubenswrapper[4475]: I1203 06:46:21.731090 4475 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:46:21Z","lastTransitionTime":"2025-12-03T06:46:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:46:21 crc kubenswrapper[4475]: I1203 06:46:21.738674 4475 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-9b2j8_f3a17c67-95e0-4889-8a30-64c08b6720f4/kube-multus/0.log" Dec 03 06:46:21 crc kubenswrapper[4475]: I1203 06:46:21.738781 4475 generic.go:334] "Generic (PLEG): container finished" podID="f3a17c67-95e0-4889-8a30-64c08b6720f4" containerID="d2d627e2c307a8db9c86e8020f2b1c25c6e061e0c6460be63e231566488beaca" exitCode=1 Dec 03 06:46:21 crc kubenswrapper[4475]: I1203 06:46:21.738805 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-9b2j8" event={"ID":"f3a17c67-95e0-4889-8a30-64c08b6720f4","Type":"ContainerDied","Data":"d2d627e2c307a8db9c86e8020f2b1c25c6e061e0c6460be63e231566488beaca"} Dec 03 06:46:21 crc kubenswrapper[4475]: I1203 06:46:21.739282 4475 scope.go:117] "RemoveContainer" containerID="d2d627e2c307a8db9c86e8020f2b1c25c6e061e0c6460be63e231566488beaca" Dec 03 06:46:21 crc kubenswrapper[4475]: I1203 06:46:21.752235 4475 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"897f1a97-930a-4c3c-8804-d7cd6006ae9c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fbb015d3e05f9f94fc225cce6e24bc4a5df0bfc5aaea15fe120e2cc4b8f02902\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:45:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://da747a5ea4f790c71d99693c4bd79a1074f756a20f628fa63e8bad9a713645fc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:45:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6bf56315b6ad05ea9af0319db29b919ed0332d2a671c5ba94ea325bd45ef5703\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:45:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e045b99328661616ea0e44cd50bd394a403836eede05459d117567c191401172\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:45:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://054b1d2565cc9690152740f71682028595283525344a38ccea66c1f072eae92b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:45:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d5e0ad88c2e55994f952b46c2e806792d8fcbd79a901810aef92e46067cc7b92\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d5e0ad88c2e55994f952b46c2e806792d8fcbd79a901810aef92e46067cc7b92\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2025-12-03T06:45:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T06:45:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://22796f78d3d551f1ee271ca8581e196f142e70622944154f7d408a88c098f53b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://22796f78d3d551f1ee271ca8581e196f142e70622944154f7d408a88c098f53b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T06:45:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T06:45:16Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://9fb973559072f07252dcf50bda74d422ea2ed50000c02105381f8d21e5ff9888\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9fb973559072f07252dcf50bda74d422ea2ed50000c02105381f8d21e5ff9888\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T06:45:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T06:45:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T06:45:15Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:46:21Z is after 2025-08-24T17:21:41Z" Dec 03 06:46:21 crc kubenswrapper[4475]: I1203 06:46:21.759936 4475 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-dqbgx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9ef36226-4b8b-4a7b-a87f-daa9dda6e70b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5dc78fa3b07b9a5535f697323e9ed322ceefdc8798157160a05eb71017ac3a29\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235
da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:45:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wjjp4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T06:45:38Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-dqbgx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:46:21Z is after 2025-08-24T17:21:41Z" Dec 03 06:46:21 crc kubenswrapper[4475]: I1203 06:46:21.767819 4475 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:34Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:34Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6444fe7571ebb90d4ff4b30dc1a397023310b50b1816d0197cb545b4f5f7480f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:45:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-12-03T06:46:21Z is after 2025-08-24T17:21:41Z" Dec 03 06:46:21 crc kubenswrapper[4475]: I1203 06:46:21.775466 4475 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:32Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:32Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:46:21Z is after 2025-08-24T17:21:41Z" Dec 03 06:46:21 crc kubenswrapper[4475]: I1203 06:46:21.782084 4475 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-pcw7j" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8c1979d0-303c-4cf6-9087-3cb2e1aac73b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://eebaa73cf4e1efd781b258dd26910dc004392716180b14a7e64e89a03f2032a1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:45:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nrf7r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T06:45:34Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-pcw7j\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:46:21Z is after 2025-08-24T17:21:41Z" Dec 03 06:46:21 crc kubenswrapper[4475]: I1203 06:46:21.792006 4475 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-9b2j8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f3a17c67-95e0-4889-8a30-64c08b6720f4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:46:21Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:46:21Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d2d627e2c307a8db9c86e8020f2b1c25c6e061e0c6460be63e231566488beaca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d2d627e2c307a8db9c86e8020f2b1c25c6e061e0c6460be63e231566488beaca\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-03T06:46:20Z\\\",\\\"message\\\":\\\"2025-12-03T06:45:35+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_e060c49a-fe61-4a85-9c90-496b6bf089f9\\\\n2025-12-03T06:45:35+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_e060c49a-fe61-4a85-9c90-496b6bf089f9 to /host/opt/cni/bin/\\\\n2025-12-03T06:45:35Z [verbose] multus-daemon started\\\\n2025-12-03T06:45:35Z [verbose] Readiness Indicator file check\\\\n2025-12-03T06:46:20Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. 
pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-03T06:45:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6pdk6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\
\":\\\"2025-12-03T06:45:34Z\\\"}}\" for pod \"openshift-multus\"/\"multus-9b2j8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:46:21Z is after 2025-08-24T17:21:41Z" Dec 03 06:46:21 crc kubenswrapper[4475]: I1203 06:46:21.803277 4475 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-g9t4l" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8f42839e-dbc4-445a-a15b-c3aa14813958\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:35Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:35Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://53948489397bbbfdf5f766211088d7f12fcd2dfbc8c3da6493e5abc49e3b41f5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:45:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ppdm2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a5090474cca8b8e2ed539ea74377506638d300be7eb750b3f3285477d8c9a375\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:45:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ppdm2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://60d3ec7cab1f249e81ae1db9ab97fa02e8b3c9d8376af4c6682dc3fc6f9d6d92\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:45:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ppdm2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b3243c863a4fb593b39fc3e3b835f647e9373d8b2dec69c5ff7657ed73c8f78a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:45:36Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ppdm2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://32897756f3658fda95db77180a0553a9d8656ed49c3ae5a017d32f5c5133a5a9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:45:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ppdm2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5e288f95676d5823cd3cb005318489d2f629a8fb74ad17ce6a67978d76006192\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:45:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ppdm2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dbbaea5df5db7406137d9fe054e2abd7fbb765809c6aa804a531d4d0f7c8328e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dbbaea5df5db7406137d9fe054e2abd7fbb765809c6aa804a531d4d0f7c8328e\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-03T06:46:08Z\\\",\\\"message\\\":\\\"gins-k9cmc\\\\nI1203 06:46:08.075193 6119 obj_retry.go:365] Adding new object: *v1.Pod openshift-multus/multus-additional-cni-plugins-k9cmc\\\\nI1203 06:46:08.075199 6119 ovn.go:134] Ensuring zone local for Pod openshift-multus/multus-additional-cni-plugins-k9cmc in node crc\\\\nI1203 06:46:08.075203 6119 obj_retry.go:386] Retry successful for 
*v1.Pod openshift-multus/multus-additional-cni-plugins-k9cmc after 0 failed attempt(s)\\\\nI1203 06:46:08.075207 6119 default_network_controller.go:776] Recording success event on pod openshift-multus/multus-additional-cni-plugins-k9cmc\\\\nI1203 06:46:08.075218 6119 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nI1203 06:46:08.075238 6119 loadbalancer.go:304] Deleted 0 stale LBs for map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-marketplace/redhat-marketplace\\\\\\\"}\\\\nI1203 06:46:08.075253 6119 services_controller.go:360] Finished syncing service redhat-marketplace on namespace openshift-marketplace for network=default : 1.255581ms\\\\nF1203 06:46:08.075260 6119 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-03T06:46:07Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-g9t4l_openshift-ovn-kubernetes(8f42839e-dbc4-445a-a15b-c3aa14813958)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ppdm2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://66a9c7568957099255bc910496da695e2af0122f2c853c3e221c666d7c2dee78\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:45:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ppdm2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://400610ebcdc7d47ecc1345287847a1909871411a12cdb3cbf895e05039b81c2b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://400610ebcdc7d47ecc
1345287847a1909871411a12cdb3cbf895e05039b81c2b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T06:45:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T06:45:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ppdm2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T06:45:35Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-g9t4l\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:46:21Z is after 2025-08-24T17:21:41Z" Dec 03 06:46:21 crc kubenswrapper[4475]: I1203 06:46:21.810500 4475 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6b24b6a4-c126-4d6d-88ae-b270b4743110\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:46:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:46:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://32947be5ce5a85090284dbc3edd8ad437495db9f0b4a7310656e38ecf5c649de\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:45:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0f26dedc0507a8675c0dc842b67772e84b5276713808e656bcf620ebb7bd3f5c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:45:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d17d8916a080f159b25abbfd9575bdc197c58bf256dbeb6367e74368f5b7f1fc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:45:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b9ea750eb608c854e92aa32dfc7d2085a0c00c3554368c7119487e4a730fdc1c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://b9ea750eb608c854e92aa32dfc7d2085a0c00c3554368c7119487e4a730fdc1c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T06:45:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T06:45:16Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T06:45:15Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:46:21Z is after 2025-08-24T17:21:41Z" Dec 03 06:46:21 crc kubenswrapper[4475]: I1203 06:46:21.819542 4475 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:33Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:33Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2f651c16a4a98ff0a9b4783e60ece4c410d5fcb7d05ad42bf7842d8bb8a99f27\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:45:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-12-03T06:46:21Z is after 2025-08-24T17:21:41Z" Dec 03 06:46:21 crc kubenswrapper[4475]: I1203 06:46:21.827349 4475 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:33Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:33Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://30d9a05de148a1dbe0fa8f07bbc5f4f2c3cba395d686af03f2da63f8cdfe431c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:45:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"c
ri-o://07cf8d993193bca34b30ea77c473af45652fde6e73d0586efb78c14b9d003e22\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:45:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:46:21Z is after 2025-08-24T17:21:41Z" Dec 03 06:46:21 crc kubenswrapper[4475]: I1203 06:46:21.832704 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:46:21 crc kubenswrapper[4475]: I1203 06:46:21.832732 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:46:21 crc kubenswrapper[4475]: I1203 06:46:21.832742 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:46:21 crc kubenswrapper[4475]: I1203 06:46:21.832753 4475 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:46:21 crc kubenswrapper[4475]: I1203 06:46:21.832762 4475 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:46:21Z","lastTransitionTime":"2025-12-03T06:46:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 06:46:21 crc kubenswrapper[4475]: I1203 06:46:21.834607 4475 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-hq2rn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7e9dd470-572a-4396-9be7-1a37e3c48977\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:47Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:47Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cg4hm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cg4hm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T06:45:47Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-hq2rn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:46:21Z is after 2025-08-24T17:21:41Z" Dec 03 06:46:21 crc 
kubenswrapper[4475]: I1203 06:46:21.841992 4475 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:32Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:32Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:46:21Z is after 2025-08-24T17:21:41Z" Dec 03 06:46:21 crc kubenswrapper[4475]: I1203 06:46:21.854080 4475 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-k9cmc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7168f008-1b03-40cf-94fa-a71d470454bf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://625db083ebf24244e0b28ac937bfa2554497ca35b8f7a1fee0ac739d647c70de\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:45:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s6bbq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://31584b054f88aa7f7e4f1096e2b11acf6f106b7f2e4ced19768808e5df1a6acc\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://31584b054f88aa7f7e4f1096e2b11acf6f106b7f2e4ced19768808e5df1a6acc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T06:45:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T06:45:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s6bbq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9a644e827feb786d7298e41022ef3bc0d2483279c106dddea8e2c7a3c62c3c0d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9a644e827feb786d7298e41022ef3bc0d2483279c106dddea8e2c7a3c62c3c0d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T06:45:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T06:45:36Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s6bbq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://742f2f4dc23fff3df8e6d67902ef721b3db1823653b11a69faabdaf8d7650667\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://742f2f4dc23fff3df8e6d67902ef721b3db1823653b11a69faabdaf8d7650667\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T06:45:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T06:45:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s6bbq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2e5e8
74d26bf8bc806d74d55a8b9306cc30cca122d2ae0731b0a76ae7ac30450\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2e5e874d26bf8bc806d74d55a8b9306cc30cca122d2ae0731b0a76ae7ac30450\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T06:45:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T06:45:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s6bbq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d644ab44eabce045c9f9b23fab29e574e2f9f49c0cc14b830560996a0ec98880\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d644ab44eabce045c9f9b23fab29e574e2f9f49c0cc14b830560996a0ec98880\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T06:45:39Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2025-12-03T06:45:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s6bbq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ae9935e33badff0b016f8b5a02cb59d8b64451364581023ca3ec8e87fba0aa3a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ae9935e33badff0b016f8b5a02cb59d8b64451364581023ca3ec8e87fba0aa3a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T06:45:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T06:45:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s6bbq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T06:45:35Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-k9cmc\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:46:21Z is after 2025-08-24T17:21:41Z" Dec 03 06:46:21 crc kubenswrapper[4475]: I1203 06:46:21.861893 4475 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-tjbzg" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"91aee7be-4a52-4598-803f-2deebe0674de\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f13f644093fd1214d8fb39853857b4113dd7fde64f1a60ff6848fd4c5350f5b6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"runn
ing\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:45:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xvqvg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://159d103ae2d5d19ea94c57a59b534773f0e32f4cb379a412b63ca743e221096e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:45:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xvqvg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T06:45:35Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-tjbzg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:46:21Z is after 2025-08-24T17:21:41Z" Dec 03 06:46:21 crc kubenswrapper[4475]: 
I1203 06:46:21.870022 4475 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-sbkp5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b1df0a77-f3cc-49ab-9fbb-8a4c7608291b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5938dd3c72bee55a3a07312d31a0eaf2df226bb931b07300d71b6e7ff69c905b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:45:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\"
,\\\"name\\\":\\\"kube-api-access-65wzb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4408ad7b7f122c0364b95e0e9761bc28dfb02e7ea00537a70fc031c16b38be6b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:45:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-65wzb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T06:45:46Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-sbkp5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:46:21Z is after 2025-08-24T17:21:41Z" Dec 03 06:46:21 crc kubenswrapper[4475]: I1203 06:46:21.879137 4475 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"57d2f580-9528-4200-b0a4-797fed1ae972\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://822cdbfb2e81d80c5de0253daa42f2a5c89e9cd0eb8a5c3cf620780d17f9a6d0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:45:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d66a9136874b2e25c94cd291aa6d7f4694ac409f16766fd69c8aab8068a441fb\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:45:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e40c4f29925f494c0f5f01e2ecbcd2e4db2a5f3911a55a874c6d0006f01982de\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:45:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7fc0ee9e5a408a0a9e701afaf1db7bc3f58fd1830044730e9c680664642b5e4e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:45:1
7Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1dd8bd42f01469966b55416fc8af1dd71d341c774263bb3a56190af4cd9e7daa\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:45:17Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5da1155d7b5e933e5db3acc4c1a3fa1b3b90fd79289641f9a3d1290956128628\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5da1155d7b5e933e5db3acc4c1a3fa1b3b90fd79289641f9a3d1290956128628\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T06:45:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T06:45:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startT
ime\\\":\\\"2025-12-03T06:45:15Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:46:21Z is after 2025-08-24T17:21:41Z" Dec 03 06:46:21 crc kubenswrapper[4475]: I1203 06:46:21.886835 4475 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a0af3d80-5aae-4d3b-a974-490687df49f3\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fa848c68a20d5db5c603cafa808518de84e427cbeea4bbc1be31151e6f839b3e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\
\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:45:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fe3e0d5fed18fddd7a1174f7a9f12290ce318e9a0de40fe432c79f6f2e24a608\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:45:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://43c05977da6544bc781a279fcddb3279dfee510fdd0a6f4f1a22b8629f17475f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:45:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"c
ert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ef987b2e9a0fa630edf6d5c06d5f47c5debd1b75d4626aefe7d8ef44bb974eb8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:45:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T06:45:15Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:46:21Z is after 2025-08-24T17:21:41Z" Dec 03 06:46:21 crc kubenswrapper[4475]: I1203 06:46:21.894251 4475 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:32Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:32Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:46:21Z is after 2025-08-24T17:21:41Z" Dec 03 06:46:21 crc kubenswrapper[4475]: I1203 06:46:21.934834 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:46:21 crc kubenswrapper[4475]: I1203 06:46:21.934860 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:46:21 crc kubenswrapper[4475]: I1203 06:46:21.934868 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:46:21 crc kubenswrapper[4475]: I1203 06:46:21.934880 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:46:21 crc kubenswrapper[4475]: I1203 06:46:21.934888 4475 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:46:21Z","lastTransitionTime":"2025-12-03T06:46:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 06:46:22 crc kubenswrapper[4475]: I1203 06:46:22.035931 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:46:22 crc kubenswrapper[4475]: I1203 06:46:22.035963 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:46:22 crc kubenswrapper[4475]: I1203 06:46:22.035972 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:46:22 crc kubenswrapper[4475]: I1203 06:46:22.035983 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:46:22 crc kubenswrapper[4475]: I1203 06:46:22.035990 4475 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:46:22Z","lastTransitionTime":"2025-12-03T06:46:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:46:22 crc kubenswrapper[4475]: I1203 06:46:22.137443 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:46:22 crc kubenswrapper[4475]: I1203 06:46:22.137500 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:46:22 crc kubenswrapper[4475]: I1203 06:46:22.137509 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:46:22 crc kubenswrapper[4475]: I1203 06:46:22.137521 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:46:22 crc kubenswrapper[4475]: I1203 06:46:22.137530 4475 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:46:22Z","lastTransitionTime":"2025-12-03T06:46:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:46:22 crc kubenswrapper[4475]: I1203 06:46:22.239085 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:46:22 crc kubenswrapper[4475]: I1203 06:46:22.239133 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:46:22 crc kubenswrapper[4475]: I1203 06:46:22.239143 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:46:22 crc kubenswrapper[4475]: I1203 06:46:22.239157 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:46:22 crc kubenswrapper[4475]: I1203 06:46:22.239165 4475 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:46:22Z","lastTransitionTime":"2025-12-03T06:46:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:46:22 crc kubenswrapper[4475]: I1203 06:46:22.340573 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:46:22 crc kubenswrapper[4475]: I1203 06:46:22.340606 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:46:22 crc kubenswrapper[4475]: I1203 06:46:22.340615 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:46:22 crc kubenswrapper[4475]: I1203 06:46:22.340626 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:46:22 crc kubenswrapper[4475]: I1203 06:46:22.340634 4475 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:46:22Z","lastTransitionTime":"2025-12-03T06:46:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:46:22 crc kubenswrapper[4475]: I1203 06:46:22.442315 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:46:22 crc kubenswrapper[4475]: I1203 06:46:22.442341 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:46:22 crc kubenswrapper[4475]: I1203 06:46:22.442352 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:46:22 crc kubenswrapper[4475]: I1203 06:46:22.442363 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:46:22 crc kubenswrapper[4475]: I1203 06:46:22.442372 4475 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:46:22Z","lastTransitionTime":"2025-12-03T06:46:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 06:46:22 crc kubenswrapper[4475]: I1203 06:46:22.491060 4475 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 03 06:46:22 crc kubenswrapper[4475]: I1203 06:46:22.491148 4475 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 03 06:46:22 crc kubenswrapper[4475]: E1203 06:46:22.491257 4475 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 03 06:46:22 crc kubenswrapper[4475]: I1203 06:46:22.491289 4475 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 03 06:46:22 crc kubenswrapper[4475]: E1203 06:46:22.491366 4475 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 03 06:46:22 crc kubenswrapper[4475]: E1203 06:46:22.491419 4475 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 03 06:46:22 crc kubenswrapper[4475]: I1203 06:46:22.491893 4475 scope.go:117] "RemoveContainer" containerID="dbbaea5df5db7406137d9fe054e2abd7fbb765809c6aa804a531d4d0f7c8328e" Dec 03 06:46:22 crc kubenswrapper[4475]: E1203 06:46:22.492018 4475 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 20s restarting failed container=ovnkube-controller pod=ovnkube-node-g9t4l_openshift-ovn-kubernetes(8f42839e-dbc4-445a-a15b-c3aa14813958)\"" pod="openshift-ovn-kubernetes/ovnkube-node-g9t4l" podUID="8f42839e-dbc4-445a-a15b-c3aa14813958" Dec 03 06:46:22 crc kubenswrapper[4475]: I1203 06:46:22.543614 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:46:22 crc kubenswrapper[4475]: I1203 06:46:22.543644 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:46:22 crc kubenswrapper[4475]: I1203 06:46:22.543654 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:46:22 crc kubenswrapper[4475]: I1203 06:46:22.543665 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:46:22 crc kubenswrapper[4475]: I1203 06:46:22.543674 4475 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:46:22Z","lastTransitionTime":"2025-12-03T06:46:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:46:22 crc kubenswrapper[4475]: I1203 06:46:22.645574 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:46:22 crc kubenswrapper[4475]: I1203 06:46:22.645615 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:46:22 crc kubenswrapper[4475]: I1203 06:46:22.645624 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:46:22 crc kubenswrapper[4475]: I1203 06:46:22.645636 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:46:22 crc kubenswrapper[4475]: I1203 06:46:22.645644 4475 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:46:22Z","lastTransitionTime":"2025-12-03T06:46:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:46:22 crc kubenswrapper[4475]: I1203 06:46:22.742003 4475 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-9b2j8_f3a17c67-95e0-4889-8a30-64c08b6720f4/kube-multus/0.log" Dec 03 06:46:22 crc kubenswrapper[4475]: I1203 06:46:22.742040 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-9b2j8" event={"ID":"f3a17c67-95e0-4889-8a30-64c08b6720f4","Type":"ContainerStarted","Data":"4124e8c8426150d1057ec040dd3bfd12c7def09c85144927fd48515e9e9e9685"} Dec 03 06:46:22 crc kubenswrapper[4475]: I1203 06:46:22.746636 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:46:22 crc kubenswrapper[4475]: I1203 06:46:22.746663 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:46:22 crc kubenswrapper[4475]: I1203 06:46:22.746672 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:46:22 crc kubenswrapper[4475]: I1203 06:46:22.746682 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:46:22 crc kubenswrapper[4475]: I1203 06:46:22.746689 4475 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:46:22Z","lastTransitionTime":"2025-12-03T06:46:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:46:22 crc kubenswrapper[4475]: I1203 06:46:22.751718 4475 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"57d2f580-9528-4200-b0a4-797fed1ae972\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://822cdbfb2e81d80c5de0253daa42f2a5c89e9cd0eb8a5c3cf620780d17f9a6d0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:45:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-d
ir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d66a9136874b2e25c94cd291aa6d7f4694ac409f16766fd69c8aab8068a441fb\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:45:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e40c4f29925f494c0f5f01e2ecbcd2e4db2a5f3911a55a874c6d0006f01982de\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:45:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7fc0ee9e5a408a0a9e701afaf1db7bc3f58fd1830044730e9c680664642b5e4e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945
c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:45:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1dd8bd42f01469966b55416fc8af1dd71d341c774263bb3a56190af4cd9e7daa\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:45:17Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5da1155d7b5e933e5db3acc4c1a3fa1b3b90fd79289641f9a3d1290956128628\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5da1155d7b5e933e5db3acc4c1a3fa1b3b90fd79289641f9a3d1290956128628\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T06:45:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T06:45:16Z\\\"}
},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T06:45:15Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:46:22Z is after 2025-08-24T17:21:41Z" Dec 03 06:46:22 crc kubenswrapper[4475]: I1203 06:46:22.760394 4475 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a0af3d80-5aae-4d3b-a974-490687df49f3\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fa848c68a20d5db5c603cafa808518de84e427cbeea4bbc1be31151e6f839b3e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85ae
d21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:45:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fe3e0d5fed18fddd7a1174f7a9f12290ce318e9a0de40fe432c79f6f2e24a608\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:45:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://43c05977da6544bc781a279fcddb3279dfee510fdd0a6f4f1a22b8629f17475f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"star
tedAt\\\":\\\"2025-12-03T06:45:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ef987b2e9a0fa630edf6d5c06d5f47c5debd1b75d4626aefe7d8ef44bb974eb8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:45:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T06:45:15Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:46:22Z is after 2025-08-24T17:21:41Z" Dec 03 06:46:22 crc kubenswrapper[4475]: I1203 06:46:22.768967 4475 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" 
err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:32Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:32Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:46:22Z is after 2025-08-24T17:21:41Z" Dec 03 06:46:22 crc kubenswrapper[4475]: I1203 06:46:22.776543 4475 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-tjbzg" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"91aee7be-4a52-4598-803f-2deebe0674de\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f13f644093fd1214d8fb39853857b4113dd7fde64f1a60ff6848fd4c5350f5b6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:45:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xvqvg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://159d103ae2d5d19ea94c57a59b534773f0e32f4c
b379a412b63ca743e221096e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:45:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xvqvg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T06:45:35Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-tjbzg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:46:22Z is after 2025-08-24T17:21:41Z" Dec 03 06:46:22 crc kubenswrapper[4475]: I1203 06:46:22.783719 4475 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-sbkp5" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b1df0a77-f3cc-49ab-9fbb-8a4c7608291b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5938dd3c72bee55a3a07312d31a0eaf2df226bb931b07300d71b6e7ff69c905b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:45:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-65wzb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4408ad7b7f122c0364b95e0e9761bc28dfb02
e7ea00537a70fc031c16b38be6b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:45:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-65wzb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T06:45:46Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-sbkp5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:46:22Z is after 2025-08-24T17:21:41Z" Dec 03 06:46:22 crc kubenswrapper[4475]: I1203 06:46:22.797006 4475 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"897f1a97-930a-4c3c-8804-d7cd6006ae9c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fbb015d3e05f9f94fc225cce6e24bc4a5df0bfc5aaea15fe120e2cc4b8f02902\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:45:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://da747a5ea4f790c71d99693c4bd79a1074f756a20f628fa63e8bad9a713645fc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:45:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6bf56315b6ad05ea9af0319db29b919ed0332d2a671c5ba94ea325bd45ef5703\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:45:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e045b99328661616ea0e44cd50bd394a403836eede05459d117567c191401172\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:45:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://054b1d2565cc9690152740f71682028595283525344a38ccea66c1f072eae92b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:45:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d5e0ad88c2e55994f952b46c2e806792d8fcbd79a901810aef92e46067cc7b92\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d5e0ad88c2e55994f952b46c2e806792d8fcbd79a901810aef92e46067cc7b92\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2025-12-03T06:45:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T06:45:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://22796f78d3d551f1ee271ca8581e196f142e70622944154f7d408a88c098f53b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://22796f78d3d551f1ee271ca8581e196f142e70622944154f7d408a88c098f53b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T06:45:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T06:45:16Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://9fb973559072f07252dcf50bda74d422ea2ed50000c02105381f8d21e5ff9888\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9fb973559072f07252dcf50bda74d422ea2ed50000c02105381f8d21e5ff9888\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T06:45:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T06:45:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T06:45:15Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:46:22Z is after 2025-08-24T17:21:41Z" Dec 03 06:46:22 crc kubenswrapper[4475]: I1203 06:46:22.803613 4475 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-dqbgx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9ef36226-4b8b-4a7b-a87f-daa9dda6e70b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5dc78fa3b07b9a5535f697323e9ed322ceefdc8798157160a05eb71017ac3a29\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235
da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:45:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wjjp4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T06:45:38Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-dqbgx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:46:22Z is after 2025-08-24T17:21:41Z" Dec 03 06:46:22 crc kubenswrapper[4475]: I1203 06:46:22.813048 4475 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6b24b6a4-c126-4d6d-88ae-b270b4743110\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:46:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:46:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://32947be5ce5a85090284dbc3edd8ad437495db9f0b4a7310656e38ecf5c649de\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:45:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0f26dedc0507a8675c0dc842b67772e84b5276713808e656bcf620ebb7bd3f5c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:45:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d17d8916a080f159b25abbfd9575bdc197c58bf256dbeb6367e74368f5b7f1fc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:45:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b9ea750eb608c854e92aa32dfc7d2085a0c00c3554368c7119487e4a730fdc1c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://b9ea750eb608c854e92aa32dfc7d2085a0c00c3554368c7119487e4a730fdc1c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T06:45:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T06:45:16Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T06:45:15Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:46:22Z is after 2025-08-24T17:21:41Z" Dec 03 06:46:22 crc kubenswrapper[4475]: I1203 06:46:22.821376 4475 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:33Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:33Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2f651c16a4a98ff0a9b4783e60ece4c410d5fcb7d05ad42bf7842d8bb8a99f27\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:45:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-12-03T06:46:22Z is after 2025-08-24T17:21:41Z" Dec 03 06:46:22 crc kubenswrapper[4475]: I1203 06:46:22.829960 4475 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:33Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:33Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://30d9a05de148a1dbe0fa8f07bbc5f4f2c3cba395d686af03f2da63f8cdfe431c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:45:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"c
ri-o://07cf8d993193bca34b30ea77c473af45652fde6e73d0586efb78c14b9d003e22\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:45:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:46:22Z is after 2025-08-24T17:21:41Z" Dec 03 06:46:22 crc kubenswrapper[4475]: I1203 06:46:22.837207 4475 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:34Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:34Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6444fe7571ebb90d4ff4b30dc1a397023310b50b1816d0197cb545b4f5f7480f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:45:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-12-03T06:46:22Z is after 2025-08-24T17:21:41Z" Dec 03 06:46:22 crc kubenswrapper[4475]: I1203 06:46:22.844724 4475 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:32Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:32Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:46:22Z is after 2025-08-24T17:21:41Z" Dec 03 06:46:22 crc kubenswrapper[4475]: I1203 06:46:22.847961 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:46:22 crc kubenswrapper[4475]: I1203 06:46:22.847990 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:46:22 crc kubenswrapper[4475]: I1203 06:46:22.848000 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:46:22 crc kubenswrapper[4475]: I1203 06:46:22.848013 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:46:22 crc kubenswrapper[4475]: I1203 06:46:22.848021 4475 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:46:22Z","lastTransitionTime":"2025-12-03T06:46:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 06:46:22 crc kubenswrapper[4475]: I1203 06:46:22.851358 4475 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-pcw7j" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8c1979d0-303c-4cf6-9087-3cb2e1aac73b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://eebaa73cf4e1efd781b258dd26910dc004392716180b14a7e64e89a03f2032a1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\
\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:45:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nrf7r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T06:45:34Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-pcw7j\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:46:22Z is after 2025-08-24T17:21:41Z" Dec 03 06:46:22 crc kubenswrapper[4475]: I1203 06:46:22.860123 4475 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-9b2j8" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f3a17c67-95e0-4889-8a30-64c08b6720f4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:46:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:46:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4124e8c8426150d1057ec040dd3bfd12c7def09c85144927fd48515e9e9e9685\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d2d627e2c307a8db9c86e8020f2b1c25c6e061e0c6460be63e231566488beaca\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-03T06:46:20Z\\\",\\\"message\\\":\\\"2025-12-03T06:45:35+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_e060c49a-fe61-4a85-9c90-496b6bf089f9\\\\n2025-12-03T06:45:35+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_e060c49a-fe61-4a85-9c90-496b6bf089f9 to /host/opt/cni/bin/\\\\n2025-12-03T06:45:35Z [verbose] multus-daemon started\\\\n2025-12-03T06:45:35Z [verbose] 
Readiness Indicator file check\\\\n2025-12-03T06:46:20Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-03T06:45:35Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:46:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/
var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6pdk6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T06:45:34Z\\\"}}\" for pod \"openshift-multus\"/\"multus-9b2j8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:46:22Z is after 2025-08-24T17:21:41Z" Dec 03 06:46:22 crc kubenswrapper[4475]: I1203 06:46:22.872422 4475 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-g9t4l" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8f42839e-dbc4-445a-a15b-c3aa14813958\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:35Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:35Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://53948489397bbbfdf5f766211088d7f12fcd2dfbc8c3da6493e5abc49e3b41f5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:45:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ppdm2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a5090474cca8b8e2ed539ea74377506638d300be7eb750b3f3285477d8c9a375\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:45:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ppdm2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://60d3ec7cab1f249e81ae1db9ab97fa02e8b3c9d8376af4c6682dc3fc6f9d6d92\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:45:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ppdm2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b3243c863a4fb593b39fc3e3b835f647e9373d8b2dec69c5ff7657ed73c8f78a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d20994829
19d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:45:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ppdm2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://32897756f3658fda95db77180a0553a9d8656ed49c3ae5a017d32f5c5133a5a9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:45:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ppdm2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5e288f95676d5823cd3cb005318489d2f629a8fb74ad17ce6a67978d76006192\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cd
d47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:45:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ppdm2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dbbaea5df5db7406137d9fe054e2abd7fbb765809c6aa804a531d4d0f7c8328e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dbbaea5df5db7406137d9fe054e2abd7fbb765809c6aa804a531d4d0f7c8328e\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-03T06:46:08Z\\\",\\\"message\\\":\\\"gins-k9cmc\\\\nI1203 06:46:08.075193 6119 obj_retry.go:365] Adding new object: *v1.Pod 
openshift-multus/multus-additional-cni-plugins-k9cmc\\\\nI1203 06:46:08.075199 6119 ovn.go:134] Ensuring zone local for Pod openshift-multus/multus-additional-cni-plugins-k9cmc in node crc\\\\nI1203 06:46:08.075203 6119 obj_retry.go:386] Retry successful for *v1.Pod openshift-multus/multus-additional-cni-plugins-k9cmc after 0 failed attempt(s)\\\\nI1203 06:46:08.075207 6119 default_network_controller.go:776] Recording success event on pod openshift-multus/multus-additional-cni-plugins-k9cmc\\\\nI1203 06:46:08.075218 6119 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nI1203 06:46:08.075238 6119 loadbalancer.go:304] Deleted 0 stale LBs for map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-marketplace/redhat-marketplace\\\\\\\"}\\\\nI1203 06:46:08.075253 6119 services_controller.go:360] Finished syncing service redhat-marketplace on namespace openshift-marketplace for network=default : 1.255581ms\\\\nF1203 06:46:08.075260 6119 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-03T06:46:07Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-g9t4l_openshift-ovn-kubernetes(8f42839e-dbc4-445a-a15b-c3aa14813958)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ppdm2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://66a9c7568957099255bc910496da695e2af0122f2c853c3e221c666d7c2dee78\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:45:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ppdm2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://400610ebcdc7d47ecc1345287847a1909871411a12cdb3cbf895e05039b81c2b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://400610ebcdc7d47ecc
1345287847a1909871411a12cdb3cbf895e05039b81c2b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T06:45:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T06:45:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ppdm2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T06:45:35Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-g9t4l\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:46:22Z is after 2025-08-24T17:21:41Z" Dec 03 06:46:22 crc kubenswrapper[4475]: I1203 06:46:22.879264 4475 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-hq2rn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7e9dd470-572a-4396-9be7-1a37e3c48977\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:47Z\\\",\\\"message\\\":\\\"containers 
with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:47Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cg4hm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cg4hm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T06:45:47Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-hq2rn\": Internal error 
occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:46:22Z is after 2025-08-24T17:21:41Z" Dec 03 06:46:22 crc kubenswrapper[4475]: I1203 06:46:22.886600 4475 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:32Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:32Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:46:22Z is after 2025-08-24T17:21:41Z" Dec 03 06:46:22 crc kubenswrapper[4475]: I1203 06:46:22.895385 4475 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-k9cmc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7168f008-1b03-40cf-94fa-a71d470454bf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://625db083ebf24244e0b28ac937bfa2554497ca35b8f7a1fee0ac739d647c70de\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:45:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s6bbq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://31584b054f88aa7f7e4f1096e2b11acf6f106b7f2e4ced19768808e5df1a6acc\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://31584b054f88aa7f7e4f1096e2b11acf6f106b7f2e4ced19768808e5df1a6acc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T06:45:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T06:45:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s6bbq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9a644e827feb786d7298e41022ef3bc0d2483279c106dddea8e2c7a3c62c3c0d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9a644e827feb786d7298e41022ef3bc0d2483279c106dddea8e2c7a3c62c3c0d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T06:45:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T06:45:36Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s6bbq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://742f2f4dc23fff3df8e6d67902ef721b3db1823653b11a69faabdaf8d7650667\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://742f2f4dc23fff3df8e6d67902ef721b3db1823653b11a69faabdaf8d7650667\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T06:45:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T06:45:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s6bbq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2e5e8
74d26bf8bc806d74d55a8b9306cc30cca122d2ae0731b0a76ae7ac30450\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2e5e874d26bf8bc806d74d55a8b9306cc30cca122d2ae0731b0a76ae7ac30450\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T06:45:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T06:45:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s6bbq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d644ab44eabce045c9f9b23fab29e574e2f9f49c0cc14b830560996a0ec98880\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d644ab44eabce045c9f9b23fab29e574e2f9f49c0cc14b830560996a0ec98880\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T06:45:39Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2025-12-03T06:45:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s6bbq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ae9935e33badff0b016f8b5a02cb59d8b64451364581023ca3ec8e87fba0aa3a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ae9935e33badff0b016f8b5a02cb59d8b64451364581023ca3ec8e87fba0aa3a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T06:45:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T06:45:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s6bbq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T06:45:35Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-k9cmc\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:46:22Z is after 2025-08-24T17:21:41Z" Dec 03 06:46:22 crc kubenswrapper[4475]: I1203 06:46:22.950013 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:46:22 crc kubenswrapper[4475]: I1203 06:46:22.950038 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:46:22 crc kubenswrapper[4475]: I1203 06:46:22.950077 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:46:22 crc kubenswrapper[4475]: I1203 06:46:22.950089 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:46:22 crc kubenswrapper[4475]: I1203 06:46:22.950097 4475 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:46:22Z","lastTransitionTime":"2025-12-03T06:46:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:46:23 crc kubenswrapper[4475]: I1203 06:46:23.051933 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:46:23 crc kubenswrapper[4475]: I1203 06:46:23.051954 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:46:23 crc kubenswrapper[4475]: I1203 06:46:23.051965 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:46:23 crc kubenswrapper[4475]: I1203 06:46:23.051976 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:46:23 crc kubenswrapper[4475]: I1203 06:46:23.051984 4475 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:46:23Z","lastTransitionTime":"2025-12-03T06:46:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:46:23 crc kubenswrapper[4475]: I1203 06:46:23.153513 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:46:23 crc kubenswrapper[4475]: I1203 06:46:23.153535 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:46:23 crc kubenswrapper[4475]: I1203 06:46:23.153543 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:46:23 crc kubenswrapper[4475]: I1203 06:46:23.153553 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:46:23 crc kubenswrapper[4475]: I1203 06:46:23.153560 4475 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:46:23Z","lastTransitionTime":"2025-12-03T06:46:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:46:23 crc kubenswrapper[4475]: I1203 06:46:23.254842 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:46:23 crc kubenswrapper[4475]: I1203 06:46:23.254864 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:46:23 crc kubenswrapper[4475]: I1203 06:46:23.254872 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:46:23 crc kubenswrapper[4475]: I1203 06:46:23.254883 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:46:23 crc kubenswrapper[4475]: I1203 06:46:23.254892 4475 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:46:23Z","lastTransitionTime":"2025-12-03T06:46:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:46:23 crc kubenswrapper[4475]: I1203 06:46:23.356303 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:46:23 crc kubenswrapper[4475]: I1203 06:46:23.356325 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:46:23 crc kubenswrapper[4475]: I1203 06:46:23.356335 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:46:23 crc kubenswrapper[4475]: I1203 06:46:23.356346 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:46:23 crc kubenswrapper[4475]: I1203 06:46:23.356356 4475 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:46:23Z","lastTransitionTime":"2025-12-03T06:46:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:46:23 crc kubenswrapper[4475]: I1203 06:46:23.457384 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:46:23 crc kubenswrapper[4475]: I1203 06:46:23.457415 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:46:23 crc kubenswrapper[4475]: I1203 06:46:23.457424 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:46:23 crc kubenswrapper[4475]: I1203 06:46:23.457436 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:46:23 crc kubenswrapper[4475]: I1203 06:46:23.457444 4475 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:46:23Z","lastTransitionTime":"2025-12-03T06:46:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 06:46:23 crc kubenswrapper[4475]: I1203 06:46:23.491170 4475 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-hq2rn" Dec 03 06:46:23 crc kubenswrapper[4475]: E1203 06:46:23.491250 4475 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-hq2rn" podUID="7e9dd470-572a-4396-9be7-1a37e3c48977" Dec 03 06:46:23 crc kubenswrapper[4475]: I1203 06:46:23.558727 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:46:23 crc kubenswrapper[4475]: I1203 06:46:23.558759 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:46:23 crc kubenswrapper[4475]: I1203 06:46:23.558768 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:46:23 crc kubenswrapper[4475]: I1203 06:46:23.558778 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:46:23 crc kubenswrapper[4475]: I1203 06:46:23.558785 4475 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:46:23Z","lastTransitionTime":"2025-12-03T06:46:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:46:23 crc kubenswrapper[4475]: I1203 06:46:23.660154 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:46:23 crc kubenswrapper[4475]: I1203 06:46:23.660184 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:46:23 crc kubenswrapper[4475]: I1203 06:46:23.660192 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:46:23 crc kubenswrapper[4475]: I1203 06:46:23.660202 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:46:23 crc kubenswrapper[4475]: I1203 06:46:23.660210 4475 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:46:23Z","lastTransitionTime":"2025-12-03T06:46:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:46:23 crc kubenswrapper[4475]: I1203 06:46:23.762231 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:46:23 crc kubenswrapper[4475]: I1203 06:46:23.762263 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:46:23 crc kubenswrapper[4475]: I1203 06:46:23.762272 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:46:23 crc kubenswrapper[4475]: I1203 06:46:23.762286 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:46:23 crc kubenswrapper[4475]: I1203 06:46:23.762294 4475 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:46:23Z","lastTransitionTime":"2025-12-03T06:46:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:46:23 crc kubenswrapper[4475]: I1203 06:46:23.863909 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:46:23 crc kubenswrapper[4475]: I1203 06:46:23.863940 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:46:23 crc kubenswrapper[4475]: I1203 06:46:23.863949 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:46:23 crc kubenswrapper[4475]: I1203 06:46:23.863961 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:46:23 crc kubenswrapper[4475]: I1203 06:46:23.863969 4475 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:46:23Z","lastTransitionTime":"2025-12-03T06:46:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:46:23 crc kubenswrapper[4475]: I1203 06:46:23.965569 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:46:23 crc kubenswrapper[4475]: I1203 06:46:23.965601 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:46:23 crc kubenswrapper[4475]: I1203 06:46:23.965610 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:46:23 crc kubenswrapper[4475]: I1203 06:46:23.965621 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:46:23 crc kubenswrapper[4475]: I1203 06:46:23.965629 4475 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:46:23Z","lastTransitionTime":"2025-12-03T06:46:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:46:24 crc kubenswrapper[4475]: I1203 06:46:24.066740 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:46:24 crc kubenswrapper[4475]: I1203 06:46:24.066772 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:46:24 crc kubenswrapper[4475]: I1203 06:46:24.066781 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:46:24 crc kubenswrapper[4475]: I1203 06:46:24.066792 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:46:24 crc kubenswrapper[4475]: I1203 06:46:24.066812 4475 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:46:24Z","lastTransitionTime":"2025-12-03T06:46:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:46:24 crc kubenswrapper[4475]: I1203 06:46:24.168323 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:46:24 crc kubenswrapper[4475]: I1203 06:46:24.168360 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:46:24 crc kubenswrapper[4475]: I1203 06:46:24.168369 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:46:24 crc kubenswrapper[4475]: I1203 06:46:24.168382 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:46:24 crc kubenswrapper[4475]: I1203 06:46:24.168390 4475 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:46:24Z","lastTransitionTime":"2025-12-03T06:46:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:46:24 crc kubenswrapper[4475]: I1203 06:46:24.269932 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:46:24 crc kubenswrapper[4475]: I1203 06:46:24.269981 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:46:24 crc kubenswrapper[4475]: I1203 06:46:24.269991 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:46:24 crc kubenswrapper[4475]: I1203 06:46:24.270002 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:46:24 crc kubenswrapper[4475]: I1203 06:46:24.270010 4475 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:46:24Z","lastTransitionTime":"2025-12-03T06:46:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:46:24 crc kubenswrapper[4475]: I1203 06:46:24.371696 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:46:24 crc kubenswrapper[4475]: I1203 06:46:24.371733 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:46:24 crc kubenswrapper[4475]: I1203 06:46:24.371742 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:46:24 crc kubenswrapper[4475]: I1203 06:46:24.371755 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:46:24 crc kubenswrapper[4475]: I1203 06:46:24.371764 4475 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:46:24Z","lastTransitionTime":"2025-12-03T06:46:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:46:24 crc kubenswrapper[4475]: I1203 06:46:24.473572 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:46:24 crc kubenswrapper[4475]: I1203 06:46:24.473608 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:46:24 crc kubenswrapper[4475]: I1203 06:46:24.473617 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:46:24 crc kubenswrapper[4475]: I1203 06:46:24.473630 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:46:24 crc kubenswrapper[4475]: I1203 06:46:24.473639 4475 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:46:24Z","lastTransitionTime":"2025-12-03T06:46:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 06:46:24 crc kubenswrapper[4475]: I1203 06:46:24.491129 4475 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 03 06:46:24 crc kubenswrapper[4475]: I1203 06:46:24.491146 4475 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 03 06:46:24 crc kubenswrapper[4475]: E1203 06:46:24.491307 4475 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 03 06:46:24 crc kubenswrapper[4475]: E1203 06:46:24.491226 4475 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 03 06:46:24 crc kubenswrapper[4475]: I1203 06:46:24.491154 4475 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 03 06:46:24 crc kubenswrapper[4475]: E1203 06:46:24.491383 4475 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 03 06:46:24 crc kubenswrapper[4475]: I1203 06:46:24.575581 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:46:24 crc kubenswrapper[4475]: I1203 06:46:24.575617 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:46:24 crc kubenswrapper[4475]: I1203 06:46:24.575626 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:46:24 crc kubenswrapper[4475]: I1203 06:46:24.575638 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:46:24 crc kubenswrapper[4475]: I1203 06:46:24.575647 4475 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:46:24Z","lastTransitionTime":"2025-12-03T06:46:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:46:24 crc kubenswrapper[4475]: I1203 06:46:24.677494 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:46:24 crc kubenswrapper[4475]: I1203 06:46:24.677533 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:46:24 crc kubenswrapper[4475]: I1203 06:46:24.677542 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:46:24 crc kubenswrapper[4475]: I1203 06:46:24.677554 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:46:24 crc kubenswrapper[4475]: I1203 06:46:24.677563 4475 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:46:24Z","lastTransitionTime":"2025-12-03T06:46:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:46:24 crc kubenswrapper[4475]: I1203 06:46:24.779276 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:46:24 crc kubenswrapper[4475]: I1203 06:46:24.779398 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:46:24 crc kubenswrapper[4475]: I1203 06:46:24.779525 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:46:24 crc kubenswrapper[4475]: I1203 06:46:24.779606 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:46:24 crc kubenswrapper[4475]: I1203 06:46:24.779669 4475 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:46:24Z","lastTransitionTime":"2025-12-03T06:46:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:46:24 crc kubenswrapper[4475]: I1203 06:46:24.881043 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:46:24 crc kubenswrapper[4475]: I1203 06:46:24.881076 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:46:24 crc kubenswrapper[4475]: I1203 06:46:24.881086 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:46:24 crc kubenswrapper[4475]: I1203 06:46:24.881099 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:46:24 crc kubenswrapper[4475]: I1203 06:46:24.881108 4475 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:46:24Z","lastTransitionTime":"2025-12-03T06:46:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:46:24 crc kubenswrapper[4475]: I1203 06:46:24.982553 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:46:24 crc kubenswrapper[4475]: I1203 06:46:24.982577 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:46:24 crc kubenswrapper[4475]: I1203 06:46:24.982586 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:46:24 crc kubenswrapper[4475]: I1203 06:46:24.982595 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:46:24 crc kubenswrapper[4475]: I1203 06:46:24.982602 4475 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:46:24Z","lastTransitionTime":"2025-12-03T06:46:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:46:25 crc kubenswrapper[4475]: I1203 06:46:25.083957 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:46:25 crc kubenswrapper[4475]: I1203 06:46:25.083985 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:46:25 crc kubenswrapper[4475]: I1203 06:46:25.083993 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:46:25 crc kubenswrapper[4475]: I1203 06:46:25.084003 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:46:25 crc kubenswrapper[4475]: I1203 06:46:25.084011 4475 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:46:25Z","lastTransitionTime":"2025-12-03T06:46:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:46:25 crc kubenswrapper[4475]: I1203 06:46:25.185427 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:46:25 crc kubenswrapper[4475]: I1203 06:46:25.185479 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:46:25 crc kubenswrapper[4475]: I1203 06:46:25.185488 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:46:25 crc kubenswrapper[4475]: I1203 06:46:25.185497 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:46:25 crc kubenswrapper[4475]: I1203 06:46:25.185505 4475 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:46:25Z","lastTransitionTime":"2025-12-03T06:46:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:46:25 crc kubenswrapper[4475]: I1203 06:46:25.286750 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:46:25 crc kubenswrapper[4475]: I1203 06:46:25.286774 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:46:25 crc kubenswrapper[4475]: I1203 06:46:25.286782 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:46:25 crc kubenswrapper[4475]: I1203 06:46:25.286791 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:46:25 crc kubenswrapper[4475]: I1203 06:46:25.286798 4475 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:46:25Z","lastTransitionTime":"2025-12-03T06:46:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:46:25 crc kubenswrapper[4475]: I1203 06:46:25.388504 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:46:25 crc kubenswrapper[4475]: I1203 06:46:25.388546 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:46:25 crc kubenswrapper[4475]: I1203 06:46:25.388556 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:46:25 crc kubenswrapper[4475]: I1203 06:46:25.388567 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:46:25 crc kubenswrapper[4475]: I1203 06:46:25.388576 4475 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:46:25Z","lastTransitionTime":"2025-12-03T06:46:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 06:46:25 crc kubenswrapper[4475]: I1203 06:46:25.490478 4475 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-hq2rn" Dec 03 06:46:25 crc kubenswrapper[4475]: E1203 06:46:25.490666 4475 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-hq2rn" podUID="7e9dd470-572a-4396-9be7-1a37e3c48977" Dec 03 06:46:25 crc kubenswrapper[4475]: I1203 06:46:25.490940 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:46:25 crc kubenswrapper[4475]: I1203 06:46:25.491014 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:46:25 crc kubenswrapper[4475]: I1203 06:46:25.491073 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:46:25 crc kubenswrapper[4475]: I1203 06:46:25.491139 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:46:25 crc kubenswrapper[4475]: I1203 06:46:25.491203 4475 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:46:25Z","lastTransitionTime":"2025-12-03T06:46:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:46:25 crc kubenswrapper[4475]: I1203 06:46:25.502899 4475 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:33Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:33Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2f651c16a4a98ff0a9b4783e60ece4c410d5fcb7d05ad42bf7842d8bb8a99f27\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:45:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod 
\"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:46:25Z is after 2025-08-24T17:21:41Z" Dec 03 06:46:25 crc kubenswrapper[4475]: I1203 06:46:25.512241 4475 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:33Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:33Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://30d9a05de148a1dbe0fa8f07bbc5f4f2c3cba395d686af03f2da63f8cdfe431c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:45:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-id
entity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://07cf8d993193bca34b30ea77c473af45652fde6e73d0586efb78c14b9d003e22\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:45:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:46:25Z is after 2025-08-24T17:21:41Z" Dec 03 06:46:25 crc kubenswrapper[4475]: I1203 06:46:25.520585 4475 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:34Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:34Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6444fe7571ebb90d4ff4b30dc1a397023310b50b1816d0197cb545b4f5f7480f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:45:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-12-03T06:46:25Z is after 2025-08-24T17:21:41Z" Dec 03 06:46:25 crc kubenswrapper[4475]: I1203 06:46:25.529002 4475 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:32Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:32Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:46:25Z is after 2025-08-24T17:21:41Z" Dec 03 06:46:25 crc kubenswrapper[4475]: I1203 06:46:25.536600 4475 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-pcw7j" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8c1979d0-303c-4cf6-9087-3cb2e1aac73b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://eebaa73cf4e1efd781b258dd26910dc004392716180b14a7e64e89a03f2032a1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:45:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nrf7r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T06:45:34Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-pcw7j\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:46:25Z is after 2025-08-24T17:21:41Z" Dec 03 06:46:25 crc kubenswrapper[4475]: I1203 06:46:25.544635 4475 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-9b2j8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f3a17c67-95e0-4889-8a30-64c08b6720f4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:46:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:46:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4124e8c8426150d1057ec040dd3bfd12c7def09c85144927fd48515e9e9e9685\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\
\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d2d627e2c307a8db9c86e8020f2b1c25c6e061e0c6460be63e231566488beaca\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-03T06:46:20Z\\\",\\\"message\\\":\\\"2025-12-03T06:45:35+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_e060c49a-fe61-4a85-9c90-496b6bf089f9\\\\n2025-12-03T06:45:35+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_e060c49a-fe61-4a85-9c90-496b6bf089f9 to /host/opt/cni/bin/\\\\n2025-12-03T06:45:35Z [verbose] multus-daemon started\\\\n2025-12-03T06:45:35Z [verbose] Readiness Indicator file check\\\\n2025-12-03T06:46:20Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-03T06:45:35Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:46:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\
\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6pdk6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T06:45:34Z\\\"}}\" for pod \"openshift-multus\"/\"multus-9b2j8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:46:25Z is after 2025-08-24T17:21:41Z" Dec 03 06:46:25 crc kubenswrapper[4475]: I1203 06:46:25.559087 4475 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-g9t4l" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8f42839e-dbc4-445a-a15b-c3aa14813958\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:35Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:35Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://53948489397bbbfdf5f766211088d7f12fcd2dfbc8c3da6493e5abc49e3b41f5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:45:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ppdm2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a5090474cca8b8e2ed539ea74377506638d300be7eb750b3f3285477d8c9a375\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:45:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ppdm2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://60d3ec7cab1f249e81ae1db9ab97fa02e8b3c9d8376af4c6682dc3fc6f9d6d92\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:45:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ppdm2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b3243c863a4fb593b39fc3e3b835f647e9373d8b2dec69c5ff7657ed73c8f78a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:45:36Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ppdm2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://32897756f3658fda95db77180a0553a9d8656ed49c3ae5a017d32f5c5133a5a9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:45:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ppdm2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5e288f95676d5823cd3cb005318489d2f629a8fb74ad17ce6a67978d76006192\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:45:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ppdm2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dbbaea5df5db7406137d9fe054e2abd7fbb765809c6aa804a531d4d0f7c8328e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dbbaea5df5db7406137d9fe054e2abd7fbb765809c6aa804a531d4d0f7c8328e\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-03T06:46:08Z\\\",\\\"message\\\":\\\"gins-k9cmc\\\\nI1203 06:46:08.075193 6119 obj_retry.go:365] Adding new object: *v1.Pod openshift-multus/multus-additional-cni-plugins-k9cmc\\\\nI1203 06:46:08.075199 6119 ovn.go:134] Ensuring zone local for Pod openshift-multus/multus-additional-cni-plugins-k9cmc in node crc\\\\nI1203 06:46:08.075203 6119 obj_retry.go:386] Retry successful for 
*v1.Pod openshift-multus/multus-additional-cni-plugins-k9cmc after 0 failed attempt(s)\\\\nI1203 06:46:08.075207 6119 default_network_controller.go:776] Recording success event on pod openshift-multus/multus-additional-cni-plugins-k9cmc\\\\nI1203 06:46:08.075218 6119 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nI1203 06:46:08.075238 6119 loadbalancer.go:304] Deleted 0 stale LBs for map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-marketplace/redhat-marketplace\\\\\\\"}\\\\nI1203 06:46:08.075253 6119 services_controller.go:360] Finished syncing service redhat-marketplace on namespace openshift-marketplace for network=default : 1.255581ms\\\\nF1203 06:46:08.075260 6119 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-03T06:46:07Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-g9t4l_openshift-ovn-kubernetes(8f42839e-dbc4-445a-a15b-c3aa14813958)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ppdm2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://66a9c7568957099255bc910496da695e2af0122f2c853c3e221c666d7c2dee78\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:45:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ppdm2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://400610ebcdc7d47ecc1345287847a1909871411a12cdb3cbf895e05039b81c2b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://400610ebcdc7d47ecc
1345287847a1909871411a12cdb3cbf895e05039b81c2b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T06:45:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T06:45:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ppdm2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T06:45:35Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-g9t4l\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:46:25Z is after 2025-08-24T17:21:41Z" Dec 03 06:46:25 crc kubenswrapper[4475]: I1203 06:46:25.569765 4475 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6b24b6a4-c126-4d6d-88ae-b270b4743110\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:46:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:46:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://32947be5ce5a85090284dbc3edd8ad437495db9f0b4a7310656e38ecf5c649de\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:45:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0f26dedc0507a8675c0dc842b67772e84b5276713808e656bcf620ebb7bd3f5c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:45:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d17d8916a080f159b25abbfd9575bdc197c58bf256dbeb6367e74368f5b7f1fc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:45:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b9ea750eb608c854e92aa32dfc7d2085a0c00c3554368c7119487e4a730fdc1c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://b9ea750eb608c854e92aa32dfc7d2085a0c00c3554368c7119487e4a730fdc1c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T06:45:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T06:45:16Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T06:45:15Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:46:25Z is after 2025-08-24T17:21:41Z" Dec 03 06:46:25 crc kubenswrapper[4475]: I1203 06:46:25.576709 4475 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-hq2rn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7e9dd470-572a-4396-9be7-1a37e3c48977\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:47Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:47Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cg4hm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cg4hm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T06:45:47Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-hq2rn\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:46:25Z is after 2025-08-24T17:21:41Z" Dec 03 06:46:25 crc kubenswrapper[4475]: I1203 06:46:25.586703 4475 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-k9cmc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7168f008-1b03-40cf-94fa-a71d470454bf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://625db083ebf24244e0b28ac937bfa2554497ca35b8f7a1fee0ac739d647c70de\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\
":{\\\"startedAt\\\":\\\"2025-12-03T06:45:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s6bbq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://31584b054f88aa7f7e4f1096e2b11acf6f106b7f2e4ced19768808e5df1a6acc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://31584b054f88aa7f7e4f1096e2b11acf6f106b7f2e4ced19768808e5df1a6acc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T06:45:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T06:45:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s6bbq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9a644e827feb786d7298e41022ef3bc0d2483279c106dddea8e2c7a3c62c3c0d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sh
a256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9a644e827feb786d7298e41022ef3bc0d2483279c106dddea8e2c7a3c62c3c0d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T06:45:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T06:45:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s6bbq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://742f2f4dc23fff3df8e6d67902ef721b3db1823653b11a69faabdaf8d7650667\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://742f2f4dc23fff3df8e6d67902ef721b3db1823653b11a69faabdaf8d7650667\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T06:45:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T06:45:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",
\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s6bbq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2e5e874d26bf8bc806d74d55a8b9306cc30cca122d2ae0731b0a76ae7ac30450\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2e5e874d26bf8bc806d74d55a8b9306cc30cca122d2ae0731b0a76ae7ac30450\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T06:45:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T06:45:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s6bbq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d644ab44eabce045c9f9b23fab29e574e2f9f49c0cc14b830560996a0ec98880\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\
\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d644ab44eabce045c9f9b23fab29e574e2f9f49c0cc14b830560996a0ec98880\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T06:45:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T06:45:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s6bbq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ae9935e33badff0b016f8b5a02cb59d8b64451364581023ca3ec8e87fba0aa3a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ae9935e33badff0b016f8b5a02cb59d8b64451364581023ca3ec8e87fba0aa3a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T06:45:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T06:45:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"sys
tem-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s6bbq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T06:45:35Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-k9cmc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:46:25Z is after 2025-08-24T17:21:41Z" Dec 03 06:46:25 crc kubenswrapper[4475]: I1203 06:46:25.593322 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:46:25 crc kubenswrapper[4475]: I1203 06:46:25.593348 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:46:25 crc kubenswrapper[4475]: I1203 06:46:25.593356 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:46:25 crc kubenswrapper[4475]: I1203 06:46:25.593369 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:46:25 crc kubenswrapper[4475]: I1203 06:46:25.593377 4475 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:46:25Z","lastTransitionTime":"2025-12-03T06:46:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:46:25 crc kubenswrapper[4475]: I1203 06:46:25.594696 4475 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:32Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:32Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:46:25Z is after 2025-08-24T17:21:41Z" Dec 03 06:46:25 crc kubenswrapper[4475]: I1203 06:46:25.605423 4475 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a0af3d80-5aae-4d3b-a974-490687df49f3\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fa848c68a20d5db5c603cafa808518de84e427cbeea4bbc1be31151e6f839b3e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:45:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fe3e0d5fed18fddd7a1174f7a9f12290ce318e9a0de40fe432c79f6f2e24a608\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:45:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://43c05977da6544bc781a279fcddb3279dfee510fdd0a6f4f1a22b8629f17475f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:45:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ef987b2e9a0fa630edf6d5c06d5f47c5debd1b75d4626aefe7d8ef44bb974eb8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-12-03T06:45:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T06:45:15Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:46:25Z is after 2025-08-24T17:21:41Z" Dec 03 06:46:25 crc kubenswrapper[4475]: I1203 06:46:25.614163 4475 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:32Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:32Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:46:25Z is after 2025-08-24T17:21:41Z" Dec 03 06:46:25 crc kubenswrapper[4475]: I1203 06:46:25.621695 4475 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-tjbzg" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"91aee7be-4a52-4598-803f-2deebe0674de\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f13f644093fd1214d8fb39853857b4113dd7fde64f1a60ff6848fd4c5350f5b6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:45:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xvqvg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://159d103ae2d5d19ea94c57a59b534773f0e32f4c
b379a412b63ca743e221096e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:45:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xvqvg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T06:45:35Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-tjbzg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:46:25Z is after 2025-08-24T17:21:41Z" Dec 03 06:46:25 crc kubenswrapper[4475]: I1203 06:46:25.628593 4475 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-sbkp5" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b1df0a77-f3cc-49ab-9fbb-8a4c7608291b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5938dd3c72bee55a3a07312d31a0eaf2df226bb931b07300d71b6e7ff69c905b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:45:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-65wzb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4408ad7b7f122c0364b95e0e9761bc28dfb02
e7ea00537a70fc031c16b38be6b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:45:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-65wzb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T06:45:46Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-sbkp5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:46:25Z is after 2025-08-24T17:21:41Z" Dec 03 06:46:25 crc kubenswrapper[4475]: I1203 06:46:25.637091 4475 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"57d2f580-9528-4200-b0a4-797fed1ae972\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://822cdbfb2e81d80c5de0253daa42f2a5c89e9cd0eb8a5c3cf620780d17f9a6d0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:45:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d66a9136874b2e25c94cd291aa6d7f4694ac409f16766fd69c8aab8068a441fb\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:45:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e40c4f29925f494c0f5f01e2ecbcd2e4db2a5f3911a55a874c6d0006f01982de\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:45:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7fc0ee9e5a408a0a9e701afaf1db7bc3f58fd1830044730e9c680664642b5e4e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:45:1
7Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1dd8bd42f01469966b55416fc8af1dd71d341c774263bb3a56190af4cd9e7daa\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:45:17Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5da1155d7b5e933e5db3acc4c1a3fa1b3b90fd79289641f9a3d1290956128628\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5da1155d7b5e933e5db3acc4c1a3fa1b3b90fd79289641f9a3d1290956128628\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T06:45:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T06:45:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startT
ime\\\":\\\"2025-12-03T06:45:15Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:46:25Z is after 2025-08-24T17:21:41Z" Dec 03 06:46:25 crc kubenswrapper[4475]: I1203 06:46:25.649218 4475 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"897f1a97-930a-4c3c-8804-d7cd6006ae9c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fbb015d3e05f9f94fc225cce6e24bc4a5df0bfc5aaea15fe120e2cc4b8f02902\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:45:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://da747a5ea4f790c71d99693c4bd79a1074f756a20f628fa63e8bad9a713645fc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:45:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6bf56315b6ad05ea9af0319db29b919ed0332d2a671c5ba94ea325bd45ef5703\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:45:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"moun
tPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e045b99328661616ea0e44cd50bd394a403836eede05459d117567c191401172\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:45:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://054b1d2565cc9690152740f71682028595283525344a38ccea66c1f072eae92b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:45:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d5e0ad88c2e55994f952b46c2e806792d8
fcbd79a901810aef92e46067cc7b92\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d5e0ad88c2e55994f952b46c2e806792d8fcbd79a901810aef92e46067cc7b92\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T06:45:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T06:45:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://22796f78d3d551f1ee271ca8581e196f142e70622944154f7d408a88c098f53b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://22796f78d3d551f1ee271ca8581e196f142e70622944154f7d408a88c098f53b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T06:45:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T06:45:16Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://9fb973559072f07252dcf50bda74d422ea2ed50000c02105381f8d21e5ff9888\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"
lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9fb973559072f07252dcf50bda74d422ea2ed50000c02105381f8d21e5ff9888\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T06:45:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T06:45:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T06:45:15Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:46:25Z is after 2025-08-24T17:21:41Z" Dec 03 06:46:25 crc kubenswrapper[4475]: I1203 06:46:25.655294 4475 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-dqbgx" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9ef36226-4b8b-4a7b-a87f-daa9dda6e70b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5dc78fa3b07b9a5535f697323e9ed322ceefdc8798157160a05eb71017ac3a29\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:45:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wjjp4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T06:45:38Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-dqbgx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:46:25Z is after 2025-08-24T17:21:41Z" Dec 03 06:46:25 crc kubenswrapper[4475]: I1203 06:46:25.694800 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:46:25 crc kubenswrapper[4475]: I1203 06:46:25.694829 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:46:25 crc kubenswrapper[4475]: I1203 06:46:25.694838 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:46:25 crc kubenswrapper[4475]: I1203 06:46:25.694850 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:46:25 crc kubenswrapper[4475]: I1203 06:46:25.694858 4475 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:46:25Z","lastTransitionTime":"2025-12-03T06:46:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:46:25 crc kubenswrapper[4475]: I1203 06:46:25.796168 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:46:25 crc kubenswrapper[4475]: I1203 06:46:25.796201 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:46:25 crc kubenswrapper[4475]: I1203 06:46:25.796209 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:46:25 crc kubenswrapper[4475]: I1203 06:46:25.796225 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:46:25 crc kubenswrapper[4475]: I1203 06:46:25.796234 4475 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:46:25Z","lastTransitionTime":"2025-12-03T06:46:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:46:25 crc kubenswrapper[4475]: I1203 06:46:25.897401 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:46:25 crc kubenswrapper[4475]: I1203 06:46:25.897436 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:46:25 crc kubenswrapper[4475]: I1203 06:46:25.897463 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:46:25 crc kubenswrapper[4475]: I1203 06:46:25.897477 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:46:25 crc kubenswrapper[4475]: I1203 06:46:25.897486 4475 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:46:25Z","lastTransitionTime":"2025-12-03T06:46:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:46:25 crc kubenswrapper[4475]: I1203 06:46:25.998809 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:46:25 crc kubenswrapper[4475]: I1203 06:46:25.998839 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:46:25 crc kubenswrapper[4475]: I1203 06:46:25.998848 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:46:25 crc kubenswrapper[4475]: I1203 06:46:25.998860 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:46:25 crc kubenswrapper[4475]: I1203 06:46:25.998868 4475 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:46:25Z","lastTransitionTime":"2025-12-03T06:46:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:46:26 crc kubenswrapper[4475]: I1203 06:46:26.100164 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:46:26 crc kubenswrapper[4475]: I1203 06:46:26.100191 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:46:26 crc kubenswrapper[4475]: I1203 06:46:26.100199 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:46:26 crc kubenswrapper[4475]: I1203 06:46:26.100211 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:46:26 crc kubenswrapper[4475]: I1203 06:46:26.100219 4475 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:46:26Z","lastTransitionTime":"2025-12-03T06:46:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:46:26 crc kubenswrapper[4475]: I1203 06:46:26.201736 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:46:26 crc kubenswrapper[4475]: I1203 06:46:26.201761 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:46:26 crc kubenswrapper[4475]: I1203 06:46:26.201769 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:46:26 crc kubenswrapper[4475]: I1203 06:46:26.201779 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:46:26 crc kubenswrapper[4475]: I1203 06:46:26.201787 4475 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:46:26Z","lastTransitionTime":"2025-12-03T06:46:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:46:26 crc kubenswrapper[4475]: I1203 06:46:26.302713 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:46:26 crc kubenswrapper[4475]: I1203 06:46:26.302746 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:46:26 crc kubenswrapper[4475]: I1203 06:46:26.302757 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:46:26 crc kubenswrapper[4475]: I1203 06:46:26.302772 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:46:26 crc kubenswrapper[4475]: I1203 06:46:26.302781 4475 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:46:26Z","lastTransitionTime":"2025-12-03T06:46:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:46:26 crc kubenswrapper[4475]: I1203 06:46:26.404855 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:46:26 crc kubenswrapper[4475]: I1203 06:46:26.404882 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:46:26 crc kubenswrapper[4475]: I1203 06:46:26.404891 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:46:26 crc kubenswrapper[4475]: I1203 06:46:26.404902 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:46:26 crc kubenswrapper[4475]: I1203 06:46:26.404910 4475 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:46:26Z","lastTransitionTime":"2025-12-03T06:46:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 06:46:26 crc kubenswrapper[4475]: I1203 06:46:26.490844 4475 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 03 06:46:26 crc kubenswrapper[4475]: E1203 06:46:26.491008 4475 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 03 06:46:26 crc kubenswrapper[4475]: I1203 06:46:26.490873 4475 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 03 06:46:26 crc kubenswrapper[4475]: E1203 06:46:26.491184 4475 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 03 06:46:26 crc kubenswrapper[4475]: I1203 06:46:26.490857 4475 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 03 06:46:26 crc kubenswrapper[4475]: E1203 06:46:26.491333 4475 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 03 06:46:26 crc kubenswrapper[4475]: I1203 06:46:26.506314 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:46:26 crc kubenswrapper[4475]: I1203 06:46:26.506335 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:46:26 crc kubenswrapper[4475]: I1203 06:46:26.506342 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:46:26 crc kubenswrapper[4475]: I1203 06:46:26.506351 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:46:26 crc kubenswrapper[4475]: I1203 06:46:26.506359 4475 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:46:26Z","lastTransitionTime":"2025-12-03T06:46:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:46:26 crc kubenswrapper[4475]: I1203 06:46:26.607472 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:46:26 crc kubenswrapper[4475]: I1203 06:46:26.607616 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:46:26 crc kubenswrapper[4475]: I1203 06:46:26.607678 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:46:26 crc kubenswrapper[4475]: I1203 06:46:26.607746 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:46:26 crc kubenswrapper[4475]: I1203 06:46:26.607802 4475 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:46:26Z","lastTransitionTime":"2025-12-03T06:46:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:46:26 crc kubenswrapper[4475]: I1203 06:46:26.708745 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:46:26 crc kubenswrapper[4475]: I1203 06:46:26.708765 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:46:26 crc kubenswrapper[4475]: I1203 06:46:26.708772 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:46:26 crc kubenswrapper[4475]: I1203 06:46:26.708781 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:46:26 crc kubenswrapper[4475]: I1203 06:46:26.708787 4475 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:46:26Z","lastTransitionTime":"2025-12-03T06:46:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:46:26 crc kubenswrapper[4475]: I1203 06:46:26.809988 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:46:26 crc kubenswrapper[4475]: I1203 06:46:26.810008 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:46:26 crc kubenswrapper[4475]: I1203 06:46:26.810016 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:46:26 crc kubenswrapper[4475]: I1203 06:46:26.810025 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:46:26 crc kubenswrapper[4475]: I1203 06:46:26.810032 4475 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:46:26Z","lastTransitionTime":"2025-12-03T06:46:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:46:26 crc kubenswrapper[4475]: I1203 06:46:26.911578 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:46:26 crc kubenswrapper[4475]: I1203 06:46:26.911597 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:46:26 crc kubenswrapper[4475]: I1203 06:46:26.911605 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:46:26 crc kubenswrapper[4475]: I1203 06:46:26.911613 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:46:26 crc kubenswrapper[4475]: I1203 06:46:26.911621 4475 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:46:26Z","lastTransitionTime":"2025-12-03T06:46:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:46:26 crc kubenswrapper[4475]: I1203 06:46:26.977793 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:46:26 crc kubenswrapper[4475]: I1203 06:46:26.977944 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:46:26 crc kubenswrapper[4475]: I1203 06:46:26.978024 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:46:26 crc kubenswrapper[4475]: I1203 06:46:26.978096 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:46:26 crc kubenswrapper[4475]: I1203 06:46:26.978169 4475 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:46:26Z","lastTransitionTime":"2025-12-03T06:46:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:46:26 crc kubenswrapper[4475]: E1203 06:46:26.986446 4475 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"7800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"24148068Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"8\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"24608868Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T06:46:26Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T06:46:26Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T06:46:26Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T06:46:26Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T06:46:26Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T06:46:26Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T06:46:26Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T06:46:26Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"b860fac6-8533-4b4b-bdad-0cb0561d1495\\\",\\\"systemUUID\\\":\\\"6c3f70a9-a9d8-4b80-a825-7a6426aa17aa\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:46:26Z is after 2025-08-24T17:21:41Z" Dec 03 06:46:26 crc kubenswrapper[4475]: I1203 06:46:26.988847 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:46:26 crc kubenswrapper[4475]: I1203 06:46:26.988981 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:46:26 crc kubenswrapper[4475]: I1203 06:46:26.989047 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:46:26 crc kubenswrapper[4475]: I1203 06:46:26.989106 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:46:26 crc kubenswrapper[4475]: I1203 06:46:26.989167 4475 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:46:26Z","lastTransitionTime":"2025-12-03T06:46:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:46:26 crc kubenswrapper[4475]: E1203 06:46:26.997358 4475 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"7800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"24148068Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"8\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"24608868Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T06:46:26Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T06:46:26Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T06:46:26Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T06:46:26Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T06:46:26Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T06:46:26Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T06:46:26Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T06:46:26Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"b860fac6-8533-4b4b-bdad-0cb0561d1495\\\",\\\"systemUUID\\\":\\\"6c3f70a9-a9d8-4b80-a825-7a6426aa17aa\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:46:26Z is after 2025-08-24T17:21:41Z" Dec 03 06:46:26 crc kubenswrapper[4475]: I1203 06:46:26.999717 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:46:26 crc kubenswrapper[4475]: I1203 06:46:26.999741 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:46:26 crc kubenswrapper[4475]: I1203 06:46:26.999749 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:46:26 crc kubenswrapper[4475]: I1203 06:46:26.999759 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:46:26 crc kubenswrapper[4475]: I1203 06:46:26.999767 4475 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:46:26Z","lastTransitionTime":"2025-12-03T06:46:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:46:27 crc kubenswrapper[4475]: E1203 06:46:27.007314 4475 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"7800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"24148068Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"8\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"24608868Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T06:46:26Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T06:46:26Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T06:46:26Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T06:46:26Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T06:46:26Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T06:46:26Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T06:46:26Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T06:46:26Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"b860fac6-8533-4b4b-bdad-0cb0561d1495\\\",\\\"systemUUID\\\":\\\"6c3f70a9-a9d8-4b80-a825-7a6426aa17aa\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:46:27Z is after 2025-08-24T17:21:41Z" Dec 03 06:46:27 crc kubenswrapper[4475]: I1203 06:46:27.010080 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:46:27 crc kubenswrapper[4475]: I1203 06:46:27.010101 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:46:27 crc kubenswrapper[4475]: I1203 06:46:27.010108 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:46:27 crc kubenswrapper[4475]: I1203 06:46:27.010117 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:46:27 crc kubenswrapper[4475]: I1203 06:46:27.010125 4475 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:46:27Z","lastTransitionTime":"2025-12-03T06:46:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:46:27 crc kubenswrapper[4475]: E1203 06:46:27.017489 4475 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"7800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"24148068Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"8\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"24608868Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T06:46:27Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T06:46:27Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T06:46:27Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T06:46:27Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T06:46:27Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T06:46:27Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T06:46:27Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T06:46:27Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"b860fac6-8533-4b4b-bdad-0cb0561d1495\\\",\\\"systemUUID\\\":\\\"6c3f70a9-a9d8-4b80-a825-7a6426aa17aa\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:46:27Z is after 2025-08-24T17:21:41Z" Dec 03 06:46:27 crc kubenswrapper[4475]: I1203 06:46:27.019363 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:46:27 crc kubenswrapper[4475]: I1203 06:46:27.019396 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:46:27 crc kubenswrapper[4475]: I1203 06:46:27.019405 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:46:27 crc kubenswrapper[4475]: I1203 06:46:27.019419 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:46:27 crc kubenswrapper[4475]: I1203 06:46:27.019428 4475 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:46:27Z","lastTransitionTime":"2025-12-03T06:46:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:46:27 crc kubenswrapper[4475]: E1203 06:46:27.026865 4475 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"7800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"24148068Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"8\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"24608868Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T06:46:27Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T06:46:27Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T06:46:27Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T06:46:27Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T06:46:27Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T06:46:27Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T06:46:27Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T06:46:27Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"b860fac6-8533-4b4b-bdad-0cb0561d1495\\\",\\\"systemUUID\\\":\\\"6c3f70a9-a9d8-4b80-a825-7a6426aa17aa\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:46:27Z is after 2025-08-24T17:21:41Z" Dec 03 06:46:27 crc kubenswrapper[4475]: E1203 06:46:27.027090 4475 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Dec 03 06:46:27 crc kubenswrapper[4475]: I1203 06:46:27.027997 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:46:27 crc kubenswrapper[4475]: I1203 06:46:27.028033 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:46:27 crc kubenswrapper[4475]: I1203 06:46:27.028043 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:46:27 crc kubenswrapper[4475]: I1203 06:46:27.028052 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:46:27 crc kubenswrapper[4475]: I1203 06:46:27.028060 4475 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:46:27Z","lastTransitionTime":"2025-12-03T06:46:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:46:27 crc kubenswrapper[4475]: I1203 06:46:27.132167 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:46:27 crc kubenswrapper[4475]: I1203 06:46:27.132316 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:46:27 crc kubenswrapper[4475]: I1203 06:46:27.132326 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:46:27 crc kubenswrapper[4475]: I1203 06:46:27.132337 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:46:27 crc kubenswrapper[4475]: I1203 06:46:27.132344 4475 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:46:27Z","lastTransitionTime":"2025-12-03T06:46:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:46:27 crc kubenswrapper[4475]: I1203 06:46:27.234216 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:46:27 crc kubenswrapper[4475]: I1203 06:46:27.234364 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:46:27 crc kubenswrapper[4475]: I1203 06:46:27.234429 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:46:27 crc kubenswrapper[4475]: I1203 06:46:27.234631 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:46:27 crc kubenswrapper[4475]: I1203 06:46:27.234711 4475 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:46:27Z","lastTransitionTime":"2025-12-03T06:46:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:46:27 crc kubenswrapper[4475]: I1203 06:46:27.336586 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:46:27 crc kubenswrapper[4475]: I1203 06:46:27.336613 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:46:27 crc kubenswrapper[4475]: I1203 06:46:27.336621 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:46:27 crc kubenswrapper[4475]: I1203 06:46:27.336631 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:46:27 crc kubenswrapper[4475]: I1203 06:46:27.336638 4475 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:46:27Z","lastTransitionTime":"2025-12-03T06:46:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:46:27 crc kubenswrapper[4475]: I1203 06:46:27.438723 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:46:27 crc kubenswrapper[4475]: I1203 06:46:27.438762 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:46:27 crc kubenswrapper[4475]: I1203 06:46:27.438770 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:46:27 crc kubenswrapper[4475]: I1203 06:46:27.438781 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:46:27 crc kubenswrapper[4475]: I1203 06:46:27.438788 4475 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:46:27Z","lastTransitionTime":"2025-12-03T06:46:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 06:46:27 crc kubenswrapper[4475]: I1203 06:46:27.490517 4475 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-hq2rn" Dec 03 06:46:27 crc kubenswrapper[4475]: E1203 06:46:27.490609 4475 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-hq2rn" podUID="7e9dd470-572a-4396-9be7-1a37e3c48977" Dec 03 06:46:27 crc kubenswrapper[4475]: I1203 06:46:27.540472 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:46:27 crc kubenswrapper[4475]: I1203 06:46:27.540493 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:46:27 crc kubenswrapper[4475]: I1203 06:46:27.540502 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:46:27 crc kubenswrapper[4475]: I1203 06:46:27.540531 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:46:27 crc kubenswrapper[4475]: I1203 06:46:27.540539 4475 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:46:27Z","lastTransitionTime":"2025-12-03T06:46:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:46:27 crc kubenswrapper[4475]: I1203 06:46:27.641950 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:46:27 crc kubenswrapper[4475]: I1203 06:46:27.641979 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:46:27 crc kubenswrapper[4475]: I1203 06:46:27.641987 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:46:27 crc kubenswrapper[4475]: I1203 06:46:27.641997 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:46:27 crc kubenswrapper[4475]: I1203 06:46:27.642006 4475 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:46:27Z","lastTransitionTime":"2025-12-03T06:46:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:46:27 crc kubenswrapper[4475]: I1203 06:46:27.743409 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:46:27 crc kubenswrapper[4475]: I1203 06:46:27.743576 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:46:27 crc kubenswrapper[4475]: I1203 06:46:27.743643 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:46:27 crc kubenswrapper[4475]: I1203 06:46:27.743706 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:46:27 crc kubenswrapper[4475]: I1203 06:46:27.743773 4475 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:46:27Z","lastTransitionTime":"2025-12-03T06:46:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:46:27 crc kubenswrapper[4475]: I1203 06:46:27.845031 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:46:27 crc kubenswrapper[4475]: I1203 06:46:27.845055 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:46:27 crc kubenswrapper[4475]: I1203 06:46:27.845065 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:46:27 crc kubenswrapper[4475]: I1203 06:46:27.845075 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:46:27 crc kubenswrapper[4475]: I1203 06:46:27.845082 4475 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:46:27Z","lastTransitionTime":"2025-12-03T06:46:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:46:27 crc kubenswrapper[4475]: I1203 06:46:27.946440 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:46:27 crc kubenswrapper[4475]: I1203 06:46:27.946483 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:46:27 crc kubenswrapper[4475]: I1203 06:46:27.946493 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:46:27 crc kubenswrapper[4475]: I1203 06:46:27.946505 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:46:27 crc kubenswrapper[4475]: I1203 06:46:27.946513 4475 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:46:27Z","lastTransitionTime":"2025-12-03T06:46:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:46:28 crc kubenswrapper[4475]: I1203 06:46:28.048002 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:46:28 crc kubenswrapper[4475]: I1203 06:46:28.048029 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:46:28 crc kubenswrapper[4475]: I1203 06:46:28.048037 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:46:28 crc kubenswrapper[4475]: I1203 06:46:28.048048 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:46:28 crc kubenswrapper[4475]: I1203 06:46:28.048061 4475 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:46:28Z","lastTransitionTime":"2025-12-03T06:46:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:46:28 crc kubenswrapper[4475]: I1203 06:46:28.149757 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:46:28 crc kubenswrapper[4475]: I1203 06:46:28.149784 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:46:28 crc kubenswrapper[4475]: I1203 06:46:28.149793 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:46:28 crc kubenswrapper[4475]: I1203 06:46:28.149802 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:46:28 crc kubenswrapper[4475]: I1203 06:46:28.149810 4475 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:46:28Z","lastTransitionTime":"2025-12-03T06:46:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:46:28 crc kubenswrapper[4475]: I1203 06:46:28.251938 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:46:28 crc kubenswrapper[4475]: I1203 06:46:28.251965 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:46:28 crc kubenswrapper[4475]: I1203 06:46:28.251974 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:46:28 crc kubenswrapper[4475]: I1203 06:46:28.251984 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:46:28 crc kubenswrapper[4475]: I1203 06:46:28.251991 4475 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:46:28Z","lastTransitionTime":"2025-12-03T06:46:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:46:28 crc kubenswrapper[4475]: I1203 06:46:28.353417 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:46:28 crc kubenswrapper[4475]: I1203 06:46:28.353446 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:46:28 crc kubenswrapper[4475]: I1203 06:46:28.353470 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:46:28 crc kubenswrapper[4475]: I1203 06:46:28.353483 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:46:28 crc kubenswrapper[4475]: I1203 06:46:28.353492 4475 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:46:28Z","lastTransitionTime":"2025-12-03T06:46:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:46:28 crc kubenswrapper[4475]: I1203 06:46:28.454738 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:46:28 crc kubenswrapper[4475]: I1203 06:46:28.454760 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:46:28 crc kubenswrapper[4475]: I1203 06:46:28.454767 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:46:28 crc kubenswrapper[4475]: I1203 06:46:28.454777 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:46:28 crc kubenswrapper[4475]: I1203 06:46:28.454783 4475 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:46:28Z","lastTransitionTime":"2025-12-03T06:46:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 06:46:28 crc kubenswrapper[4475]: I1203 06:46:28.490253 4475 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 03 06:46:28 crc kubenswrapper[4475]: E1203 06:46:28.490328 4475 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 03 06:46:28 crc kubenswrapper[4475]: I1203 06:46:28.490343 4475 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 03 06:46:28 crc kubenswrapper[4475]: I1203 06:46:28.490375 4475 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 03 06:46:28 crc kubenswrapper[4475]: E1203 06:46:28.490434 4475 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 03 06:46:28 crc kubenswrapper[4475]: E1203 06:46:28.490577 4475 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 03 06:46:28 crc kubenswrapper[4475]: I1203 06:46:28.556352 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:46:28 crc kubenswrapper[4475]: I1203 06:46:28.556484 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:46:28 crc kubenswrapper[4475]: I1203 06:46:28.556588 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:46:28 crc kubenswrapper[4475]: I1203 06:46:28.556658 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:46:28 crc kubenswrapper[4475]: I1203 06:46:28.556711 4475 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:46:28Z","lastTransitionTime":"2025-12-03T06:46:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:46:28 crc kubenswrapper[4475]: I1203 06:46:28.658180 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:46:28 crc kubenswrapper[4475]: I1203 06:46:28.658376 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:46:28 crc kubenswrapper[4475]: I1203 06:46:28.658507 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:46:28 crc kubenswrapper[4475]: I1203 06:46:28.658592 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:46:28 crc kubenswrapper[4475]: I1203 06:46:28.658656 4475 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:46:28Z","lastTransitionTime":"2025-12-03T06:46:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:46:28 crc kubenswrapper[4475]: I1203 06:46:28.759883 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:46:28 crc kubenswrapper[4475]: I1203 06:46:28.759914 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:46:28 crc kubenswrapper[4475]: I1203 06:46:28.759923 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:46:28 crc kubenswrapper[4475]: I1203 06:46:28.759937 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:46:28 crc kubenswrapper[4475]: I1203 06:46:28.759944 4475 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:46:28Z","lastTransitionTime":"2025-12-03T06:46:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:46:28 crc kubenswrapper[4475]: I1203 06:46:28.862074 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:46:28 crc kubenswrapper[4475]: I1203 06:46:28.862179 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:46:28 crc kubenswrapper[4475]: I1203 06:46:28.862242 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:46:28 crc kubenswrapper[4475]: I1203 06:46:28.862301 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:46:28 crc kubenswrapper[4475]: I1203 06:46:28.862366 4475 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:46:28Z","lastTransitionTime":"2025-12-03T06:46:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:46:28 crc kubenswrapper[4475]: I1203 06:46:28.964297 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:46:28 crc kubenswrapper[4475]: I1203 06:46:28.964511 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:46:28 crc kubenswrapper[4475]: I1203 06:46:28.964590 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:46:28 crc kubenswrapper[4475]: I1203 06:46:28.964678 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:46:28 crc kubenswrapper[4475]: I1203 06:46:28.964737 4475 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:46:28Z","lastTransitionTime":"2025-12-03T06:46:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:46:29 crc kubenswrapper[4475]: I1203 06:46:29.066661 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:46:29 crc kubenswrapper[4475]: I1203 06:46:29.066683 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:46:29 crc kubenswrapper[4475]: I1203 06:46:29.066691 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:46:29 crc kubenswrapper[4475]: I1203 06:46:29.066701 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:46:29 crc kubenswrapper[4475]: I1203 06:46:29.066709 4475 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:46:29Z","lastTransitionTime":"2025-12-03T06:46:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:46:29 crc kubenswrapper[4475]: I1203 06:46:29.168672 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:46:29 crc kubenswrapper[4475]: I1203 06:46:29.168708 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:46:29 crc kubenswrapper[4475]: I1203 06:46:29.168717 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:46:29 crc kubenswrapper[4475]: I1203 06:46:29.168727 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:46:29 crc kubenswrapper[4475]: I1203 06:46:29.168733 4475 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:46:29Z","lastTransitionTime":"2025-12-03T06:46:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:46:29 crc kubenswrapper[4475]: I1203 06:46:29.270077 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:46:29 crc kubenswrapper[4475]: I1203 06:46:29.270138 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:46:29 crc kubenswrapper[4475]: I1203 06:46:29.270149 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:46:29 crc kubenswrapper[4475]: I1203 06:46:29.270164 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:46:29 crc kubenswrapper[4475]: I1203 06:46:29.270178 4475 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:46:29Z","lastTransitionTime":"2025-12-03T06:46:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:46:29 crc kubenswrapper[4475]: I1203 06:46:29.371827 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:46:29 crc kubenswrapper[4475]: I1203 06:46:29.371883 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:46:29 crc kubenswrapper[4475]: I1203 06:46:29.371892 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:46:29 crc kubenswrapper[4475]: I1203 06:46:29.371904 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:46:29 crc kubenswrapper[4475]: I1203 06:46:29.371912 4475 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:46:29Z","lastTransitionTime":"2025-12-03T06:46:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:46:29 crc kubenswrapper[4475]: I1203 06:46:29.473764 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:46:29 crc kubenswrapper[4475]: I1203 06:46:29.473788 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:46:29 crc kubenswrapper[4475]: I1203 06:46:29.473797 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:46:29 crc kubenswrapper[4475]: I1203 06:46:29.473808 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:46:29 crc kubenswrapper[4475]: I1203 06:46:29.473816 4475 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:46:29Z","lastTransitionTime":"2025-12-03T06:46:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 06:46:29 crc kubenswrapper[4475]: I1203 06:46:29.490544 4475 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-hq2rn" Dec 03 06:46:29 crc kubenswrapper[4475]: E1203 06:46:29.490643 4475 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-hq2rn" podUID="7e9dd470-572a-4396-9be7-1a37e3c48977" Dec 03 06:46:29 crc kubenswrapper[4475]: I1203 06:46:29.575327 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:46:29 crc kubenswrapper[4475]: I1203 06:46:29.575354 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:46:29 crc kubenswrapper[4475]: I1203 06:46:29.575364 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:46:29 crc kubenswrapper[4475]: I1203 06:46:29.575373 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:46:29 crc kubenswrapper[4475]: I1203 06:46:29.575380 4475 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:46:29Z","lastTransitionTime":"2025-12-03T06:46:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:46:29 crc kubenswrapper[4475]: I1203 06:46:29.676838 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:46:29 crc kubenswrapper[4475]: I1203 06:46:29.676865 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:46:29 crc kubenswrapper[4475]: I1203 06:46:29.676873 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:46:29 crc kubenswrapper[4475]: I1203 06:46:29.676883 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:46:29 crc kubenswrapper[4475]: I1203 06:46:29.676890 4475 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:46:29Z","lastTransitionTime":"2025-12-03T06:46:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:46:29 crc kubenswrapper[4475]: I1203 06:46:29.778084 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:46:29 crc kubenswrapper[4475]: I1203 06:46:29.778110 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:46:29 crc kubenswrapper[4475]: I1203 06:46:29.778120 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:46:29 crc kubenswrapper[4475]: I1203 06:46:29.778141 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:46:29 crc kubenswrapper[4475]: I1203 06:46:29.778154 4475 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:46:29Z","lastTransitionTime":"2025-12-03T06:46:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:46:29 crc kubenswrapper[4475]: I1203 06:46:29.879839 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:46:29 crc kubenswrapper[4475]: I1203 06:46:29.879886 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:46:29 crc kubenswrapper[4475]: I1203 06:46:29.879895 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:46:29 crc kubenswrapper[4475]: I1203 06:46:29.879904 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:46:29 crc kubenswrapper[4475]: I1203 06:46:29.879911 4475 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:46:29Z","lastTransitionTime":"2025-12-03T06:46:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:46:29 crc kubenswrapper[4475]: I1203 06:46:29.981734 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:46:29 crc kubenswrapper[4475]: I1203 06:46:29.981806 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:46:29 crc kubenswrapper[4475]: I1203 06:46:29.981832 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:46:29 crc kubenswrapper[4475]: I1203 06:46:29.981847 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:46:29 crc kubenswrapper[4475]: I1203 06:46:29.981856 4475 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:46:29Z","lastTransitionTime":"2025-12-03T06:46:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:46:30 crc kubenswrapper[4475]: I1203 06:46:30.083354 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:46:30 crc kubenswrapper[4475]: I1203 06:46:30.083382 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:46:30 crc kubenswrapper[4475]: I1203 06:46:30.083390 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:46:30 crc kubenswrapper[4475]: I1203 06:46:30.083400 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:46:30 crc kubenswrapper[4475]: I1203 06:46:30.083409 4475 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:46:30Z","lastTransitionTime":"2025-12-03T06:46:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:46:30 crc kubenswrapper[4475]: I1203 06:46:30.185299 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:46:30 crc kubenswrapper[4475]: I1203 06:46:30.185325 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:46:30 crc kubenswrapper[4475]: I1203 06:46:30.185334 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:46:30 crc kubenswrapper[4475]: I1203 06:46:30.185344 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:46:30 crc kubenswrapper[4475]: I1203 06:46:30.185351 4475 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:46:30Z","lastTransitionTime":"2025-12-03T06:46:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:46:30 crc kubenswrapper[4475]: I1203 06:46:30.287296 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:46:30 crc kubenswrapper[4475]: I1203 06:46:30.287318 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:46:30 crc kubenswrapper[4475]: I1203 06:46:30.287325 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:46:30 crc kubenswrapper[4475]: I1203 06:46:30.287336 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:46:30 crc kubenswrapper[4475]: I1203 06:46:30.287344 4475 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:46:30Z","lastTransitionTime":"2025-12-03T06:46:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:46:30 crc kubenswrapper[4475]: I1203 06:46:30.388585 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:46:30 crc kubenswrapper[4475]: I1203 06:46:30.388613 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:46:30 crc kubenswrapper[4475]: I1203 06:46:30.388640 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:46:30 crc kubenswrapper[4475]: I1203 06:46:30.388653 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:46:30 crc kubenswrapper[4475]: I1203 06:46:30.388661 4475 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:46:30Z","lastTransitionTime":"2025-12-03T06:46:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 06:46:30 crc kubenswrapper[4475]: I1203 06:46:30.490324 4475 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 03 06:46:30 crc kubenswrapper[4475]: I1203 06:46:30.490341 4475 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 03 06:46:30 crc kubenswrapper[4475]: I1203 06:46:30.490367 4475 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 03 06:46:30 crc kubenswrapper[4475]: E1203 06:46:30.490401 4475 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 03 06:46:30 crc kubenswrapper[4475]: E1203 06:46:30.490478 4475 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 03 06:46:30 crc kubenswrapper[4475]: E1203 06:46:30.490562 4475 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 03 06:46:30 crc kubenswrapper[4475]: I1203 06:46:30.490866 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:46:30 crc kubenswrapper[4475]: I1203 06:46:30.490902 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:46:30 crc kubenswrapper[4475]: I1203 06:46:30.490912 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:46:30 crc kubenswrapper[4475]: I1203 06:46:30.490926 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:46:30 crc kubenswrapper[4475]: I1203 06:46:30.490933 4475 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:46:30Z","lastTransitionTime":"2025-12-03T06:46:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:46:30 crc kubenswrapper[4475]: I1203 06:46:30.592938 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:46:30 crc kubenswrapper[4475]: I1203 06:46:30.592964 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:46:30 crc kubenswrapper[4475]: I1203 06:46:30.592972 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:46:30 crc kubenswrapper[4475]: I1203 06:46:30.592981 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:46:30 crc kubenswrapper[4475]: I1203 06:46:30.593003 4475 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:46:30Z","lastTransitionTime":"2025-12-03T06:46:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:46:30 crc kubenswrapper[4475]: I1203 06:46:30.694870 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:46:30 crc kubenswrapper[4475]: I1203 06:46:30.694892 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:46:30 crc kubenswrapper[4475]: I1203 06:46:30.694899 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:46:30 crc kubenswrapper[4475]: I1203 06:46:30.694909 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:46:30 crc kubenswrapper[4475]: I1203 06:46:30.694917 4475 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:46:30Z","lastTransitionTime":"2025-12-03T06:46:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:46:30 crc kubenswrapper[4475]: I1203 06:46:30.796183 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:46:30 crc kubenswrapper[4475]: I1203 06:46:30.796225 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:46:30 crc kubenswrapper[4475]: I1203 06:46:30.796235 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:46:30 crc kubenswrapper[4475]: I1203 06:46:30.796250 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:46:30 crc kubenswrapper[4475]: I1203 06:46:30.796261 4475 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:46:30Z","lastTransitionTime":"2025-12-03T06:46:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:46:30 crc kubenswrapper[4475]: I1203 06:46:30.897386 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:46:30 crc kubenswrapper[4475]: I1203 06:46:30.897412 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:46:30 crc kubenswrapper[4475]: I1203 06:46:30.897420 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:46:30 crc kubenswrapper[4475]: I1203 06:46:30.897431 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:46:30 crc kubenswrapper[4475]: I1203 06:46:30.897440 4475 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:46:30Z","lastTransitionTime":"2025-12-03T06:46:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:46:30 crc kubenswrapper[4475]: I1203 06:46:30.998882 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:46:30 crc kubenswrapper[4475]: I1203 06:46:30.998902 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:46:30 crc kubenswrapper[4475]: I1203 06:46:30.998910 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:46:30 crc kubenswrapper[4475]: I1203 06:46:30.998919 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:46:30 crc kubenswrapper[4475]: I1203 06:46:30.998926 4475 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:46:30Z","lastTransitionTime":"2025-12-03T06:46:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:46:31 crc kubenswrapper[4475]: I1203 06:46:31.101071 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:46:31 crc kubenswrapper[4475]: I1203 06:46:31.101097 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:46:31 crc kubenswrapper[4475]: I1203 06:46:31.101104 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:46:31 crc kubenswrapper[4475]: I1203 06:46:31.101118 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:46:31 crc kubenswrapper[4475]: I1203 06:46:31.101126 4475 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:46:31Z","lastTransitionTime":"2025-12-03T06:46:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:46:31 crc kubenswrapper[4475]: I1203 06:46:31.203259 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:46:31 crc kubenswrapper[4475]: I1203 06:46:31.203282 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:46:31 crc kubenswrapper[4475]: I1203 06:46:31.203290 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:46:31 crc kubenswrapper[4475]: I1203 06:46:31.203300 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:46:31 crc kubenswrapper[4475]: I1203 06:46:31.203308 4475 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:46:31Z","lastTransitionTime":"2025-12-03T06:46:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:46:31 crc kubenswrapper[4475]: I1203 06:46:31.305373 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:46:31 crc kubenswrapper[4475]: I1203 06:46:31.305490 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:46:31 crc kubenswrapper[4475]: I1203 06:46:31.305575 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:46:31 crc kubenswrapper[4475]: I1203 06:46:31.305668 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:46:31 crc kubenswrapper[4475]: I1203 06:46:31.305761 4475 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:46:31Z","lastTransitionTime":"2025-12-03T06:46:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:46:31 crc kubenswrapper[4475]: I1203 06:46:31.407018 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:46:31 crc kubenswrapper[4475]: I1203 06:46:31.407133 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:46:31 crc kubenswrapper[4475]: I1203 06:46:31.407211 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:46:31 crc kubenswrapper[4475]: I1203 06:46:31.407279 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:46:31 crc kubenswrapper[4475]: I1203 06:46:31.407333 4475 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:46:31Z","lastTransitionTime":"2025-12-03T06:46:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 06:46:31 crc kubenswrapper[4475]: I1203 06:46:31.490936 4475 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-hq2rn" Dec 03 06:46:31 crc kubenswrapper[4475]: E1203 06:46:31.491088 4475 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-hq2rn" podUID="7e9dd470-572a-4396-9be7-1a37e3c48977" Dec 03 06:46:31 crc kubenswrapper[4475]: I1203 06:46:31.509293 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:46:31 crc kubenswrapper[4475]: I1203 06:46:31.509313 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:46:31 crc kubenswrapper[4475]: I1203 06:46:31.509321 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:46:31 crc kubenswrapper[4475]: I1203 06:46:31.509329 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:46:31 crc kubenswrapper[4475]: I1203 06:46:31.509335 4475 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:46:31Z","lastTransitionTime":"2025-12-03T06:46:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:46:31 crc kubenswrapper[4475]: I1203 06:46:31.611151 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:46:31 crc kubenswrapper[4475]: I1203 06:46:31.611249 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:46:31 crc kubenswrapper[4475]: I1203 06:46:31.611322 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:46:31 crc kubenswrapper[4475]: I1203 06:46:31.611391 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:46:31 crc kubenswrapper[4475]: I1203 06:46:31.611473 4475 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:46:31Z","lastTransitionTime":"2025-12-03T06:46:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:46:31 crc kubenswrapper[4475]: I1203 06:46:31.712788 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:46:31 crc kubenswrapper[4475]: I1203 06:46:31.712923 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:46:31 crc kubenswrapper[4475]: I1203 06:46:31.713002 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:46:31 crc kubenswrapper[4475]: I1203 06:46:31.713071 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:46:31 crc kubenswrapper[4475]: I1203 06:46:31.713146 4475 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:46:31Z","lastTransitionTime":"2025-12-03T06:46:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:46:31 crc kubenswrapper[4475]: I1203 06:46:31.814488 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:46:31 crc kubenswrapper[4475]: I1203 06:46:31.814529 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:46:31 crc kubenswrapper[4475]: I1203 06:46:31.814549 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:46:31 crc kubenswrapper[4475]: I1203 06:46:31.814558 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:46:31 crc kubenswrapper[4475]: I1203 06:46:31.814565 4475 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:46:31Z","lastTransitionTime":"2025-12-03T06:46:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:46:31 crc kubenswrapper[4475]: I1203 06:46:31.916425 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:46:31 crc kubenswrapper[4475]: I1203 06:46:31.916478 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:46:31 crc kubenswrapper[4475]: I1203 06:46:31.916488 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:46:31 crc kubenswrapper[4475]: I1203 06:46:31.916499 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:46:31 crc kubenswrapper[4475]: I1203 06:46:31.916507 4475 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:46:31Z","lastTransitionTime":"2025-12-03T06:46:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:46:32 crc kubenswrapper[4475]: I1203 06:46:32.017689 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:46:32 crc kubenswrapper[4475]: I1203 06:46:32.017713 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:46:32 crc kubenswrapper[4475]: I1203 06:46:32.017721 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:46:32 crc kubenswrapper[4475]: I1203 06:46:32.017730 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:46:32 crc kubenswrapper[4475]: I1203 06:46:32.017736 4475 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:46:32Z","lastTransitionTime":"2025-12-03T06:46:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:46:32 crc kubenswrapper[4475]: I1203 06:46:32.119356 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:46:32 crc kubenswrapper[4475]: I1203 06:46:32.119376 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:46:32 crc kubenswrapper[4475]: I1203 06:46:32.119384 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:46:32 crc kubenswrapper[4475]: I1203 06:46:32.119393 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:46:32 crc kubenswrapper[4475]: I1203 06:46:32.119401 4475 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:46:32Z","lastTransitionTime":"2025-12-03T06:46:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:46:32 crc kubenswrapper[4475]: I1203 06:46:32.221210 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:46:32 crc kubenswrapper[4475]: I1203 06:46:32.221228 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:46:32 crc kubenswrapper[4475]: I1203 06:46:32.221235 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:46:32 crc kubenswrapper[4475]: I1203 06:46:32.221243 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:46:32 crc kubenswrapper[4475]: I1203 06:46:32.221251 4475 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:46:32Z","lastTransitionTime":"2025-12-03T06:46:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:46:32 crc kubenswrapper[4475]: I1203 06:46:32.322833 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:46:32 crc kubenswrapper[4475]: I1203 06:46:32.322869 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:46:32 crc kubenswrapper[4475]: I1203 06:46:32.322878 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:46:32 crc kubenswrapper[4475]: I1203 06:46:32.322898 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:46:32 crc kubenswrapper[4475]: I1203 06:46:32.322907 4475 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:46:32Z","lastTransitionTime":"2025-12-03T06:46:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:46:32 crc kubenswrapper[4475]: I1203 06:46:32.424917 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:46:32 crc kubenswrapper[4475]: I1203 06:46:32.424962 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:46:32 crc kubenswrapper[4475]: I1203 06:46:32.424971 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:46:32 crc kubenswrapper[4475]: I1203 06:46:32.424980 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:46:32 crc kubenswrapper[4475]: I1203 06:46:32.424987 4475 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:46:32Z","lastTransitionTime":"2025-12-03T06:46:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 06:46:32 crc kubenswrapper[4475]: I1203 06:46:32.490748 4475 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 03 06:46:32 crc kubenswrapper[4475]: I1203 06:46:32.490768 4475 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 03 06:46:32 crc kubenswrapper[4475]: I1203 06:46:32.490770 4475 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 03 06:46:32 crc kubenswrapper[4475]: E1203 06:46:32.490835 4475 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 03 06:46:32 crc kubenswrapper[4475]: E1203 06:46:32.490944 4475 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 03 06:46:32 crc kubenswrapper[4475]: E1203 06:46:32.490975 4475 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 03 06:46:32 crc kubenswrapper[4475]: I1203 06:46:32.526595 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:46:32 crc kubenswrapper[4475]: I1203 06:46:32.526619 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:46:32 crc kubenswrapper[4475]: I1203 06:46:32.526629 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:46:32 crc kubenswrapper[4475]: I1203 06:46:32.526640 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:46:32 crc kubenswrapper[4475]: I1203 06:46:32.526648 4475 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:46:32Z","lastTransitionTime":"2025-12-03T06:46:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:46:32 crc kubenswrapper[4475]: I1203 06:46:32.628842 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:46:32 crc kubenswrapper[4475]: I1203 06:46:32.628877 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:46:32 crc kubenswrapper[4475]: I1203 06:46:32.628886 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:46:32 crc kubenswrapper[4475]: I1203 06:46:32.628904 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:46:32 crc kubenswrapper[4475]: I1203 06:46:32.628913 4475 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:46:32Z","lastTransitionTime":"2025-12-03T06:46:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:46:32 crc kubenswrapper[4475]: I1203 06:46:32.732957 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:46:32 crc kubenswrapper[4475]: I1203 06:46:32.733022 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:46:32 crc kubenswrapper[4475]: I1203 06:46:32.733032 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:46:32 crc kubenswrapper[4475]: I1203 06:46:32.733045 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:46:32 crc kubenswrapper[4475]: I1203 06:46:32.733056 4475 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:46:32Z","lastTransitionTime":"2025-12-03T06:46:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:46:32 crc kubenswrapper[4475]: I1203 06:46:32.834681 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:46:32 crc kubenswrapper[4475]: I1203 06:46:32.834705 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:46:32 crc kubenswrapper[4475]: I1203 06:46:32.834731 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:46:32 crc kubenswrapper[4475]: I1203 06:46:32.834741 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:46:32 crc kubenswrapper[4475]: I1203 06:46:32.834753 4475 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:46:32Z","lastTransitionTime":"2025-12-03T06:46:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:46:32 crc kubenswrapper[4475]: I1203 06:46:32.935915 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:46:32 crc kubenswrapper[4475]: I1203 06:46:32.935939 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:46:32 crc kubenswrapper[4475]: I1203 06:46:32.935947 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:46:32 crc kubenswrapper[4475]: I1203 06:46:32.935956 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:46:32 crc kubenswrapper[4475]: I1203 06:46:32.935980 4475 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:46:32Z","lastTransitionTime":"2025-12-03T06:46:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:46:33 crc kubenswrapper[4475]: I1203 06:46:33.037819 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:46:33 crc kubenswrapper[4475]: I1203 06:46:33.037846 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:46:33 crc kubenswrapper[4475]: I1203 06:46:33.037855 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:46:33 crc kubenswrapper[4475]: I1203 06:46:33.037865 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:46:33 crc kubenswrapper[4475]: I1203 06:46:33.037874 4475 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:46:33Z","lastTransitionTime":"2025-12-03T06:46:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:46:33 crc kubenswrapper[4475]: I1203 06:46:33.139203 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:46:33 crc kubenswrapper[4475]: I1203 06:46:33.139250 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:46:33 crc kubenswrapper[4475]: I1203 06:46:33.139259 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:46:33 crc kubenswrapper[4475]: I1203 06:46:33.139268 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:46:33 crc kubenswrapper[4475]: I1203 06:46:33.139276 4475 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:46:33Z","lastTransitionTime":"2025-12-03T06:46:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:46:33 crc kubenswrapper[4475]: I1203 06:46:33.240729 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:46:33 crc kubenswrapper[4475]: I1203 06:46:33.240754 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:46:33 crc kubenswrapper[4475]: I1203 06:46:33.240762 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:46:33 crc kubenswrapper[4475]: I1203 06:46:33.240771 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:46:33 crc kubenswrapper[4475]: I1203 06:46:33.240778 4475 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:46:33Z","lastTransitionTime":"2025-12-03T06:46:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:46:33 crc kubenswrapper[4475]: I1203 06:46:33.342803 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:46:33 crc kubenswrapper[4475]: I1203 06:46:33.342850 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:46:33 crc kubenswrapper[4475]: I1203 06:46:33.342858 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:46:33 crc kubenswrapper[4475]: I1203 06:46:33.342872 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:46:33 crc kubenswrapper[4475]: I1203 06:46:33.342880 4475 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:46:33Z","lastTransitionTime":"2025-12-03T06:46:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:46:33 crc kubenswrapper[4475]: I1203 06:46:33.444699 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:46:33 crc kubenswrapper[4475]: I1203 06:46:33.444722 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:46:33 crc kubenswrapper[4475]: I1203 06:46:33.444745 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:46:33 crc kubenswrapper[4475]: I1203 06:46:33.444757 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:46:33 crc kubenswrapper[4475]: I1203 06:46:33.444764 4475 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:46:33Z","lastTransitionTime":"2025-12-03T06:46:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 06:46:33 crc kubenswrapper[4475]: I1203 06:46:33.490467 4475 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-hq2rn" Dec 03 06:46:33 crc kubenswrapper[4475]: E1203 06:46:33.490571 4475 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-hq2rn" podUID="7e9dd470-572a-4396-9be7-1a37e3c48977" Dec 03 06:46:33 crc kubenswrapper[4475]: I1203 06:46:33.545919 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:46:33 crc kubenswrapper[4475]: I1203 06:46:33.545943 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:46:33 crc kubenswrapper[4475]: I1203 06:46:33.545950 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:46:33 crc kubenswrapper[4475]: I1203 06:46:33.545975 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:46:33 crc kubenswrapper[4475]: I1203 06:46:33.545984 4475 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:46:33Z","lastTransitionTime":"2025-12-03T06:46:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:46:33 crc kubenswrapper[4475]: I1203 06:46:33.647608 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:46:33 crc kubenswrapper[4475]: I1203 06:46:33.647630 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:46:33 crc kubenswrapper[4475]: I1203 06:46:33.647638 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:46:33 crc kubenswrapper[4475]: I1203 06:46:33.647648 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:46:33 crc kubenswrapper[4475]: I1203 06:46:33.647656 4475 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:46:33Z","lastTransitionTime":"2025-12-03T06:46:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:46:33 crc kubenswrapper[4475]: I1203 06:46:33.749352 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:46:33 crc kubenswrapper[4475]: I1203 06:46:33.749385 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:46:33 crc kubenswrapper[4475]: I1203 06:46:33.749395 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:46:33 crc kubenswrapper[4475]: I1203 06:46:33.749408 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:46:33 crc kubenswrapper[4475]: I1203 06:46:33.749436 4475 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:46:33Z","lastTransitionTime":"2025-12-03T06:46:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:46:33 crc kubenswrapper[4475]: I1203 06:46:33.851262 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:46:33 crc kubenswrapper[4475]: I1203 06:46:33.851292 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:46:33 crc kubenswrapper[4475]: I1203 06:46:33.851301 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:46:33 crc kubenswrapper[4475]: I1203 06:46:33.851312 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:46:33 crc kubenswrapper[4475]: I1203 06:46:33.851320 4475 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:46:33Z","lastTransitionTime":"2025-12-03T06:46:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:46:33 crc kubenswrapper[4475]: I1203 06:46:33.953601 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:46:33 crc kubenswrapper[4475]: I1203 06:46:33.953802 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:46:33 crc kubenswrapper[4475]: I1203 06:46:33.953879 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:46:33 crc kubenswrapper[4475]: I1203 06:46:33.953945 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:46:33 crc kubenswrapper[4475]: I1203 06:46:33.954012 4475 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:46:33Z","lastTransitionTime":"2025-12-03T06:46:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:46:34 crc kubenswrapper[4475]: I1203 06:46:34.056001 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:46:34 crc kubenswrapper[4475]: I1203 06:46:34.056032 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:46:34 crc kubenswrapper[4475]: I1203 06:46:34.056041 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:46:34 crc kubenswrapper[4475]: I1203 06:46:34.056053 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:46:34 crc kubenswrapper[4475]: I1203 06:46:34.056061 4475 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:46:34Z","lastTransitionTime":"2025-12-03T06:46:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:46:34 crc kubenswrapper[4475]: I1203 06:46:34.157888 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:46:34 crc kubenswrapper[4475]: I1203 06:46:34.157908 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:46:34 crc kubenswrapper[4475]: I1203 06:46:34.157916 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:46:34 crc kubenswrapper[4475]: I1203 06:46:34.157925 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:46:34 crc kubenswrapper[4475]: I1203 06:46:34.157932 4475 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:46:34Z","lastTransitionTime":"2025-12-03T06:46:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:46:34 crc kubenswrapper[4475]: I1203 06:46:34.259973 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:46:34 crc kubenswrapper[4475]: I1203 06:46:34.260010 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:46:34 crc kubenswrapper[4475]: I1203 06:46:34.260018 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:46:34 crc kubenswrapper[4475]: I1203 06:46:34.260033 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:46:34 crc kubenswrapper[4475]: I1203 06:46:34.260042 4475 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:46:34Z","lastTransitionTime":"2025-12-03T06:46:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:46:34 crc kubenswrapper[4475]: I1203 06:46:34.362103 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:46:34 crc kubenswrapper[4475]: I1203 06:46:34.362147 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:46:34 crc kubenswrapper[4475]: I1203 06:46:34.362160 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:46:34 crc kubenswrapper[4475]: I1203 06:46:34.362175 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:46:34 crc kubenswrapper[4475]: I1203 06:46:34.362194 4475 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:46:34Z","lastTransitionTime":"2025-12-03T06:46:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:46:34 crc kubenswrapper[4475]: I1203 06:46:34.463837 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:46:34 crc kubenswrapper[4475]: I1203 06:46:34.463863 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:46:34 crc kubenswrapper[4475]: I1203 06:46:34.463873 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:46:34 crc kubenswrapper[4475]: I1203 06:46:34.463884 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:46:34 crc kubenswrapper[4475]: I1203 06:46:34.463891 4475 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:46:34Z","lastTransitionTime":"2025-12-03T06:46:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 06:46:34 crc kubenswrapper[4475]: I1203 06:46:34.490709 4475 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 03 06:46:34 crc kubenswrapper[4475]: E1203 06:46:34.490788 4475 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 03 06:46:34 crc kubenswrapper[4475]: I1203 06:46:34.490828 4475 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 03 06:46:34 crc kubenswrapper[4475]: E1203 06:46:34.490862 4475 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 03 06:46:34 crc kubenswrapper[4475]: I1203 06:46:34.490889 4475 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 03 06:46:34 crc kubenswrapper[4475]: E1203 06:46:34.490921 4475 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 03 06:46:34 crc kubenswrapper[4475]: I1203 06:46:34.565495 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:46:34 crc kubenswrapper[4475]: I1203 06:46:34.565527 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:46:34 crc kubenswrapper[4475]: I1203 06:46:34.565536 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:46:34 crc kubenswrapper[4475]: I1203 06:46:34.565547 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:46:34 crc kubenswrapper[4475]: I1203 06:46:34.565569 4475 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:46:34Z","lastTransitionTime":"2025-12-03T06:46:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:46:34 crc kubenswrapper[4475]: I1203 06:46:34.667073 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:46:34 crc kubenswrapper[4475]: I1203 06:46:34.667101 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:46:34 crc kubenswrapper[4475]: I1203 06:46:34.667109 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:46:34 crc kubenswrapper[4475]: I1203 06:46:34.667117 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:46:34 crc kubenswrapper[4475]: I1203 06:46:34.667127 4475 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:46:34Z","lastTransitionTime":"2025-12-03T06:46:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:46:34 crc kubenswrapper[4475]: I1203 06:46:34.768307 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:46:34 crc kubenswrapper[4475]: I1203 06:46:34.768422 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:46:34 crc kubenswrapper[4475]: I1203 06:46:34.768508 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:46:34 crc kubenswrapper[4475]: I1203 06:46:34.768582 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:46:34 crc kubenswrapper[4475]: I1203 06:46:34.768653 4475 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:46:34Z","lastTransitionTime":"2025-12-03T06:46:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:46:34 crc kubenswrapper[4475]: I1203 06:46:34.869972 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:46:34 crc kubenswrapper[4475]: I1203 06:46:34.870005 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:46:34 crc kubenswrapper[4475]: I1203 06:46:34.870014 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:46:34 crc kubenswrapper[4475]: I1203 06:46:34.870027 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:46:34 crc kubenswrapper[4475]: I1203 06:46:34.870035 4475 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:46:34Z","lastTransitionTime":"2025-12-03T06:46:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:46:34 crc kubenswrapper[4475]: I1203 06:46:34.971172 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:46:34 crc kubenswrapper[4475]: I1203 06:46:34.971197 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:46:34 crc kubenswrapper[4475]: I1203 06:46:34.971204 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:46:34 crc kubenswrapper[4475]: I1203 06:46:34.971214 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:46:34 crc kubenswrapper[4475]: I1203 06:46:34.971220 4475 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:46:34Z","lastTransitionTime":"2025-12-03T06:46:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:46:35 crc kubenswrapper[4475]: I1203 06:46:35.073210 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:46:35 crc kubenswrapper[4475]: I1203 06:46:35.073243 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:46:35 crc kubenswrapper[4475]: I1203 06:46:35.073252 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:46:35 crc kubenswrapper[4475]: I1203 06:46:35.073265 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:46:35 crc kubenswrapper[4475]: I1203 06:46:35.073275 4475 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:46:35Z","lastTransitionTime":"2025-12-03T06:46:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:46:35 crc kubenswrapper[4475]: I1203 06:46:35.174938 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:46:35 crc kubenswrapper[4475]: I1203 06:46:35.174965 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:46:35 crc kubenswrapper[4475]: I1203 06:46:35.174973 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:46:35 crc kubenswrapper[4475]: I1203 06:46:35.174984 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:46:35 crc kubenswrapper[4475]: I1203 06:46:35.175009 4475 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:46:35Z","lastTransitionTime":"2025-12-03T06:46:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:46:35 crc kubenswrapper[4475]: I1203 06:46:35.276254 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:46:35 crc kubenswrapper[4475]: I1203 06:46:35.276283 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:46:35 crc kubenswrapper[4475]: I1203 06:46:35.276292 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:46:35 crc kubenswrapper[4475]: I1203 06:46:35.276304 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:46:35 crc kubenswrapper[4475]: I1203 06:46:35.276312 4475 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:46:35Z","lastTransitionTime":"2025-12-03T06:46:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:46:35 crc kubenswrapper[4475]: I1203 06:46:35.378371 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:46:35 crc kubenswrapper[4475]: I1203 06:46:35.378400 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:46:35 crc kubenswrapper[4475]: I1203 06:46:35.378409 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:46:35 crc kubenswrapper[4475]: I1203 06:46:35.378419 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:46:35 crc kubenswrapper[4475]: I1203 06:46:35.378427 4475 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:46:35Z","lastTransitionTime":"2025-12-03T06:46:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:46:35 crc kubenswrapper[4475]: I1203 06:46:35.479861 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:46:35 crc kubenswrapper[4475]: I1203 06:46:35.479891 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:46:35 crc kubenswrapper[4475]: I1203 06:46:35.479899 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:46:35 crc kubenswrapper[4475]: I1203 06:46:35.479910 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:46:35 crc kubenswrapper[4475]: I1203 06:46:35.479920 4475 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:46:35Z","lastTransitionTime":"2025-12-03T06:46:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 06:46:35 crc kubenswrapper[4475]: I1203 06:46:35.490537 4475 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-hq2rn" Dec 03 06:46:35 crc kubenswrapper[4475]: E1203 06:46:35.490654 4475 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-hq2rn" podUID="7e9dd470-572a-4396-9be7-1a37e3c48977" Dec 03 06:46:35 crc kubenswrapper[4475]: I1203 06:46:35.491137 4475 scope.go:117] "RemoveContainer" containerID="dbbaea5df5db7406137d9fe054e2abd7fbb765809c6aa804a531d4d0f7c8328e" Dec 03 06:46:35 crc kubenswrapper[4475]: I1203 06:46:35.499831 4475 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-pcw7j" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8c1979d0-303c-4cf6-9087-3cb2e1aac73b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://eebaa73cf4e1efd781b258dd26910dc004392716180b14a7e64e89a03f2032a1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"20
25-12-03T06:45:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nrf7r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T06:45:34Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-pcw7j\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:46:35Z is after 2025-08-24T17:21:41Z" Dec 03 06:46:35 crc kubenswrapper[4475]: I1203 06:46:35.509446 4475 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-9b2j8" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f3a17c67-95e0-4889-8a30-64c08b6720f4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:46:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:46:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4124e8c8426150d1057ec040dd3bfd12c7def09c85144927fd48515e9e9e9685\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d2d627e2c307a8db9c86e8020f2b1c25c6e061e0c6460be63e231566488beaca\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-03T06:46:20Z\\\",\\\"message\\\":\\\"2025-12-03T06:45:35+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_e060c49a-fe61-4a85-9c90-496b6bf089f9\\\\n2025-12-03T06:45:35+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_e060c49a-fe61-4a85-9c90-496b6bf089f9 to /host/opt/cni/bin/\\\\n2025-12-03T06:45:35Z [verbose] multus-daemon started\\\\n2025-12-03T06:45:35Z [verbose] 
Readiness Indicator file check\\\\n2025-12-03T06:46:20Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-03T06:45:35Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:46:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/
var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6pdk6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T06:45:34Z\\\"}}\" for pod \"openshift-multus\"/\"multus-9b2j8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:46:35Z is after 2025-08-24T17:21:41Z" Dec 03 06:46:35 crc kubenswrapper[4475]: I1203 06:46:35.521999 4475 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-g9t4l" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8f42839e-dbc4-445a-a15b-c3aa14813958\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:35Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:35Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://53948489397bbbfdf5f766211088d7f12fcd2dfbc8c3da6493e5abc49e3b41f5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:45:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ppdm2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a5090474cca8b8e2ed539ea74377506638d300be7eb750b3f3285477d8c9a375\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:45:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ppdm2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://60d3ec7cab1f249e81ae1db9ab97fa02e8b3c9d8376af4c6682dc3fc6f9d6d92\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:45:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ppdm2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b3243c863a4fb593b39fc3e3b835f647e9373d8b2dec69c5ff7657ed73c8f78a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d20994829
19d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:45:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ppdm2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://32897756f3658fda95db77180a0553a9d8656ed49c3ae5a017d32f5c5133a5a9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:45:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ppdm2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5e288f95676d5823cd3cb005318489d2f629a8fb74ad17ce6a67978d76006192\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cd
d47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:45:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ppdm2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dbbaea5df5db7406137d9fe054e2abd7fbb765809c6aa804a531d4d0f7c8328e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dbbaea5df5db7406137d9fe054e2abd7fbb765809c6aa804a531d4d0f7c8328e\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-03T06:46:08Z\\\",\\\"message\\\":\\\"gins-k9cmc\\\\nI1203 06:46:08.075193 6119 obj_retry.go:365] Adding new object: *v1.Pod 
openshift-multus/multus-additional-cni-plugins-k9cmc\\\\nI1203 06:46:08.075199 6119 ovn.go:134] Ensuring zone local for Pod openshift-multus/multus-additional-cni-plugins-k9cmc in node crc\\\\nI1203 06:46:08.075203 6119 obj_retry.go:386] Retry successful for *v1.Pod openshift-multus/multus-additional-cni-plugins-k9cmc after 0 failed attempt(s)\\\\nI1203 06:46:08.075207 6119 default_network_controller.go:776] Recording success event on pod openshift-multus/multus-additional-cni-plugins-k9cmc\\\\nI1203 06:46:08.075218 6119 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nI1203 06:46:08.075238 6119 loadbalancer.go:304] Deleted 0 stale LBs for map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-marketplace/redhat-marketplace\\\\\\\"}\\\\nI1203 06:46:08.075253 6119 services_controller.go:360] Finished syncing service redhat-marketplace on namespace openshift-marketplace for network=default : 1.255581ms\\\\nF1203 06:46:08.075260 6119 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-03T06:46:07Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-g9t4l_openshift-ovn-kubernetes(8f42839e-dbc4-445a-a15b-c3aa14813958)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ppdm2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://66a9c7568957099255bc910496da695e2af0122f2c853c3e221c666d7c2dee78\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:45:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ppdm2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://400610ebcdc7d47ecc1345287847a1909871411a12cdb3cbf895e05039b81c2b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://400610ebcdc7d47ecc
1345287847a1909871411a12cdb3cbf895e05039b81c2b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T06:45:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T06:45:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ppdm2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T06:45:35Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-g9t4l\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:46:35Z is after 2025-08-24T17:21:41Z" Dec 03 06:46:35 crc kubenswrapper[4475]: I1203 06:46:35.531126 4475 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6b24b6a4-c126-4d6d-88ae-b270b4743110\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:46:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:46:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://32947be5ce5a85090284dbc3edd8ad437495db9f0b4a7310656e38ecf5c649de\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:45:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0f26dedc0507a8675c0dc842b67772e84b5276713808e656bcf620ebb7bd3f5c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:45:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d17d8916a080f159b25abbfd9575bdc197c58bf256dbeb6367e74368f5b7f1fc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:45:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b9ea750eb608c854e92aa32dfc7d2085a0c00c3554368c7119487e4a730fdc1c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://b9ea750eb608c854e92aa32dfc7d2085a0c00c3554368c7119487e4a730fdc1c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T06:45:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T06:45:16Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T06:45:15Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:46:35Z is after 2025-08-24T17:21:41Z" Dec 03 06:46:35 crc kubenswrapper[4475]: I1203 06:46:35.539758 4475 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:33Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:33Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2f651c16a4a98ff0a9b4783e60ece4c410d5fcb7d05ad42bf7842d8bb8a99f27\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:45:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-12-03T06:46:35Z is after 2025-08-24T17:21:41Z" Dec 03 06:46:35 crc kubenswrapper[4475]: I1203 06:46:35.548176 4475 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:33Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:33Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://30d9a05de148a1dbe0fa8f07bbc5f4f2c3cba395d686af03f2da63f8cdfe431c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:45:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"c
ri-o://07cf8d993193bca34b30ea77c473af45652fde6e73d0586efb78c14b9d003e22\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:45:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:46:35Z is after 2025-08-24T17:21:41Z" Dec 03 06:46:35 crc kubenswrapper[4475]: I1203 06:46:35.558590 4475 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:34Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:34Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6444fe7571ebb90d4ff4b30dc1a397023310b50b1816d0197cb545b4f5f7480f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:45:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-12-03T06:46:35Z is after 2025-08-24T17:21:41Z" Dec 03 06:46:35 crc kubenswrapper[4475]: I1203 06:46:35.566676 4475 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:32Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:32Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:46:35Z is after 2025-08-24T17:21:41Z" Dec 03 06:46:35 crc kubenswrapper[4475]: I1203 06:46:35.575048 4475 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-hq2rn" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7e9dd470-572a-4396-9be7-1a37e3c48977\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:47Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:47Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cg4hm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cg4hm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T06:45:47Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-hq2rn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:46:35Z is after 2025-08-24T17:21:41Z" Dec 03 06:46:35 crc 
kubenswrapper[4475]: I1203 06:46:35.581486 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:46:35 crc kubenswrapper[4475]: I1203 06:46:35.581511 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:46:35 crc kubenswrapper[4475]: I1203 06:46:35.581520 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:46:35 crc kubenswrapper[4475]: I1203 06:46:35.581531 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:46:35 crc kubenswrapper[4475]: I1203 06:46:35.581540 4475 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:46:35Z","lastTransitionTime":"2025-12-03T06:46:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:46:35 crc kubenswrapper[4475]: I1203 06:46:35.584008 4475 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:32Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:32Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:46:35Z is after 2025-08-24T17:21:41Z" Dec 03 06:46:35 crc kubenswrapper[4475]: I1203 06:46:35.594306 4475 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-k9cmc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7168f008-1b03-40cf-94fa-a71d470454bf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://625db083ebf24244e0b28ac937bfa2554497ca35b8f7a1fee0ac739d647c70de\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:45:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s6bbq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://31584b054f88aa7f7e4f1096e2b11acf6f106b7f2e4ced19768808e5df1a6acc\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://31584b054f88aa7f7e4f1096e2b11acf6f106b7f2e4ced19768808e5df1a6acc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T06:45:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T06:45:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s6bbq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9a644e827feb786d7298e41022ef3bc0d2483279c106dddea8e2c7a3c62c3c0d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9a644e827feb786d7298e41022ef3bc0d2483279c106dddea8e2c7a3c62c3c0d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T06:45:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T06:45:36Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s6bbq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://742f2f4dc23fff3df8e6d67902ef721b3db1823653b11a69faabdaf8d7650667\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://742f2f4dc23fff3df8e6d67902ef721b3db1823653b11a69faabdaf8d7650667\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T06:45:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T06:45:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s6bbq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2e5e8
74d26bf8bc806d74d55a8b9306cc30cca122d2ae0731b0a76ae7ac30450\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2e5e874d26bf8bc806d74d55a8b9306cc30cca122d2ae0731b0a76ae7ac30450\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T06:45:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T06:45:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s6bbq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d644ab44eabce045c9f9b23fab29e574e2f9f49c0cc14b830560996a0ec98880\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d644ab44eabce045c9f9b23fab29e574e2f9f49c0cc14b830560996a0ec98880\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T06:45:39Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2025-12-03T06:45:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s6bbq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ae9935e33badff0b016f8b5a02cb59d8b64451364581023ca3ec8e87fba0aa3a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ae9935e33badff0b016f8b5a02cb59d8b64451364581023ca3ec8e87fba0aa3a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T06:45:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T06:45:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s6bbq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T06:45:35Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-k9cmc\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:46:35Z is after 2025-08-24T17:21:41Z" Dec 03 06:46:35 crc kubenswrapper[4475]: I1203 06:46:35.602714 4475 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"57d2f580-9528-4200-b0a4-797fed1ae972\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://822cdbfb2e81d80c5de0253daa42f2a5c89e9cd0eb8a5c3cf620780d17f9a6d0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:45:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\"
:\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d66a9136874b2e25c94cd291aa6d7f4694ac409f16766fd69c8aab8068a441fb\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:45:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e40c4f29925f494c0f5f01e2ecbcd2e4db2a5f3911a55a874c6d0006f01982de\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:45:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7fc0ee9e5a408a0a9e701afaf1db7bc3f58fd1830044730e9c680664642b5e4e\\\",\\\"image\\\":\\\"quay.io/crcont/
openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:45:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1dd8bd42f01469966b55416fc8af1dd71d341c774263bb3a56190af4cd9e7daa\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:45:17Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5da1155d7b5e933e5db3acc4c1a3fa1b3b90fd79289641f9a3d1290956128628\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\
\\"containerID\\\":\\\"cri-o://5da1155d7b5e933e5db3acc4c1a3fa1b3b90fd79289641f9a3d1290956128628\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T06:45:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T06:45:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T06:45:15Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:46:35Z is after 2025-08-24T17:21:41Z" Dec 03 06:46:35 crc kubenswrapper[4475]: I1203 06:46:35.615393 4475 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a0af3d80-5aae-4d3b-a974-490687df49f3\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fa848c68a20d5db5c603cafa808518de84e427cbeea4bbc1be31151e6f839b3e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:45:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fe3e0d5fed18fddd7a1174f7a9f12290ce318e9a0de40fe432c79f6f2e24a608\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:45:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://43c05977da6544bc781a279fcddb3279dfee510fdd0a6f4f1a22b8629f17475f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:45:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ef987b2e9a0fa630edf6d5c06d5f47c5debd1b75d4626aefe7d8ef44bb974eb8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-12-03T06:45:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T06:45:15Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:46:35Z is after 2025-08-24T17:21:41Z" Dec 03 06:46:35 crc kubenswrapper[4475]: I1203 06:46:35.623220 4475 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:32Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:32Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:46:35Z is after 2025-08-24T17:21:41Z" Dec 03 06:46:35 crc kubenswrapper[4475]: I1203 06:46:35.630500 4475 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-tjbzg" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"91aee7be-4a52-4598-803f-2deebe0674de\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f13f644093fd1214d8fb39853857b4113dd7fde64f1a60ff6848fd4c5350f5b6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:45:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xvqvg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://159d103ae2d5d19ea94c57a59b534773f0e32f4c
b379a412b63ca743e221096e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:45:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xvqvg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T06:45:35Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-tjbzg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:46:35Z is after 2025-08-24T17:21:41Z" Dec 03 06:46:35 crc kubenswrapper[4475]: I1203 06:46:35.637740 4475 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-sbkp5" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b1df0a77-f3cc-49ab-9fbb-8a4c7608291b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5938dd3c72bee55a3a07312d31a0eaf2df226bb931b07300d71b6e7ff69c905b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:45:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-65wzb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4408ad7b7f122c0364b95e0e9761bc28dfb02
e7ea00537a70fc031c16b38be6b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:45:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-65wzb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T06:45:46Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-sbkp5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:46:35Z is after 2025-08-24T17:21:41Z" Dec 03 06:46:35 crc kubenswrapper[4475]: I1203 06:46:35.650396 4475 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"897f1a97-930a-4c3c-8804-d7cd6006ae9c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fbb015d3e05f9f94fc225cce6e24bc4a5df0bfc5aaea15fe120e2cc4b8f02902\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:45:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://da747a5ea4f790c71d99693c4bd79a1074f756a20f628fa63e8bad9a713645fc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:45:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6bf56315b6ad05ea9af0319db29b919ed0332d2a671c5ba94ea325bd45ef5703\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:45:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e045b99328661616ea0e44cd50bd394a403836eede05459d117567c191401172\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:45:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://054b1d2565cc9690152740f71682028595283525344a38ccea66c1f072eae92b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:45:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d5e0ad88c2e55994f952b46c2e806792d8fcbd79a901810aef92e46067cc7b92\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d5e0ad88c2e55994f952b46c2e806792d8fcbd79a901810aef92e46067cc7b92\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2025-12-03T06:45:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T06:45:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://22796f78d3d551f1ee271ca8581e196f142e70622944154f7d408a88c098f53b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://22796f78d3d551f1ee271ca8581e196f142e70622944154f7d408a88c098f53b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T06:45:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T06:45:16Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://9fb973559072f07252dcf50bda74d422ea2ed50000c02105381f8d21e5ff9888\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9fb973559072f07252dcf50bda74d422ea2ed50000c02105381f8d21e5ff9888\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T06:45:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T06:45:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T06:45:15Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:46:35Z is after 2025-08-24T17:21:41Z" Dec 03 06:46:35 crc kubenswrapper[4475]: I1203 06:46:35.656656 4475 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-dqbgx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9ef36226-4b8b-4a7b-a87f-daa9dda6e70b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5dc78fa3b07b9a5535f697323e9ed322ceefdc8798157160a05eb71017ac3a29\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235
da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:45:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wjjp4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T06:45:38Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-dqbgx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:46:35Z is after 2025-08-24T17:21:41Z" Dec 03 06:46:35 crc kubenswrapper[4475]: I1203 06:46:35.684134 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:46:35 crc kubenswrapper[4475]: I1203 06:46:35.684158 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:46:35 crc kubenswrapper[4475]: I1203 06:46:35.684166 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:46:35 crc kubenswrapper[4475]: I1203 06:46:35.684177 4475 kubelet_node_status.go:724] "Recording event message 
for node" node="crc" event="NodeNotReady" Dec 03 06:46:35 crc kubenswrapper[4475]: I1203 06:46:35.684186 4475 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:46:35Z","lastTransitionTime":"2025-12-03T06:46:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 06:46:35 crc kubenswrapper[4475]: I1203 06:46:35.766872 4475 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-g9t4l_8f42839e-dbc4-445a-a15b-c3aa14813958/ovnkube-controller/2.log" Dec 03 06:46:35 crc kubenswrapper[4475]: I1203 06:46:35.771507 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-g9t4l" event={"ID":"8f42839e-dbc4-445a-a15b-c3aa14813958","Type":"ContainerStarted","Data":"c3d53d023db886d8a8772c0790104577a7a8914b8cf882b251e44407064c3141"} Dec 03 06:46:35 crc kubenswrapper[4475]: I1203 06:46:35.771875 4475 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-g9t4l" Dec 03 06:46:35 crc kubenswrapper[4475]: I1203 06:46:35.783704 4475 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:32Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:32Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:46:35Z is after 2025-08-24T17:21:41Z" Dec 03 06:46:35 crc kubenswrapper[4475]: I1203 06:46:35.785596 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:46:35 crc kubenswrapper[4475]: I1203 06:46:35.785631 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:46:35 crc kubenswrapper[4475]: I1203 06:46:35.785640 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:46:35 crc kubenswrapper[4475]: I1203 06:46:35.785656 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:46:35 crc kubenswrapper[4475]: I1203 06:46:35.785668 4475 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:46:35Z","lastTransitionTime":"2025-12-03T06:46:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 06:46:35 crc kubenswrapper[4475]: I1203 06:46:35.791845 4475 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-tjbzg" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"91aee7be-4a52-4598-803f-2deebe0674de\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f13f644093fd1214d8fb39853857b4113dd7fde64f1a60ff6848fd4c5350f5b6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:45:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\
\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xvqvg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://159d103ae2d5d19ea94c57a59b534773f0e32f4cb379a412b63ca743e221096e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:45:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xvqvg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T06:45:35Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-tjbzg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:46:35Z is after 2025-08-24T17:21:41Z" Dec 03 06:46:35 crc kubenswrapper[4475]: I1203 06:46:35.802438 4475 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-sbkp5" 
err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b1df0a77-f3cc-49ab-9fbb-8a4c7608291b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5938dd3c72bee55a3a07312d31a0eaf2df226bb931b07300d71b6e7ff69c905b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:45:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-65wzb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4408ad7b7
f122c0364b95e0e9761bc28dfb02e7ea00537a70fc031c16b38be6b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:45:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-65wzb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T06:45:46Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-sbkp5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:46:35Z is after 2025-08-24T17:21:41Z" Dec 03 06:46:35 crc kubenswrapper[4475]: I1203 06:46:35.812207 4475 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"57d2f580-9528-4200-b0a4-797fed1ae972\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://822cdbfb2e81d80c5de0253daa42f2a5c89e9cd0eb8a5c3cf620780d17f9a6d0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:45:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d66a9136874b2e25c94cd291aa6d7f4694ac409f16766fd69c8aab8068a441fb\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:45:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e40c4f29925f494c0f5f01e2ecbcd2e4db2a5f3911a55a874c6d0006f01982de\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:45:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7fc0ee9e5a408a0a9e701afaf1db7bc3f58fd1830044730e9c680664642b5e4e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:45:1
7Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1dd8bd42f01469966b55416fc8af1dd71d341c774263bb3a56190af4cd9e7daa\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:45:17Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5da1155d7b5e933e5db3acc4c1a3fa1b3b90fd79289641f9a3d1290956128628\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5da1155d7b5e933e5db3acc4c1a3fa1b3b90fd79289641f9a3d1290956128628\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T06:45:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T06:45:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startT
ime\\\":\\\"2025-12-03T06:45:15Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:46:35Z is after 2025-08-24T17:21:41Z" Dec 03 06:46:35 crc kubenswrapper[4475]: I1203 06:46:35.820916 4475 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a0af3d80-5aae-4d3b-a974-490687df49f3\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fa848c68a20d5db5c603cafa808518de84e427cbeea4bbc1be31151e6f839b3e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\
\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:45:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fe3e0d5fed18fddd7a1174f7a9f12290ce318e9a0de40fe432c79f6f2e24a608\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:45:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://43c05977da6544bc781a279fcddb3279dfee510fdd0a6f4f1a22b8629f17475f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:45:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"c
ert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ef987b2e9a0fa630edf6d5c06d5f47c5debd1b75d4626aefe7d8ef44bb974eb8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:45:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T06:45:15Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:46:35Z is after 2025-08-24T17:21:41Z" Dec 03 06:46:35 crc kubenswrapper[4475]: I1203 06:46:35.827636 4475 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-dqbgx" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9ef36226-4b8b-4a7b-a87f-daa9dda6e70b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5dc78fa3b07b9a5535f697323e9ed322ceefdc8798157160a05eb71017ac3a29\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:45:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wjjp4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T06:45:38Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-dqbgx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:46:35Z is after 2025-08-24T17:21:41Z" Dec 03 06:46:35 crc kubenswrapper[4475]: I1203 06:46:35.840386 4475 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"897f1a97-930a-4c3c-8804-d7cd6006ae9c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fbb015d3e05f9f94fc225cce6e24bc4a5df0bfc5aaea15fe120e2cc4b8f02902\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastSt
ate\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:45:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://da747a5ea4f790c71d99693c4bd79a1074f756a20f628fa63e8bad9a713645fc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:45:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6bf56315b6ad05ea9af0319db29b919ed0332d2a671c5ba94ea325bd45ef5703\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06
:45:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e045b99328661616ea0e44cd50bd394a403836eede05459d117567c191401172\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:45:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://054b1d2565cc9690152740f71682028595283525344a38ccea66c1f072eae92b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:45:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"19
2.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d5e0ad88c2e55994f952b46c2e806792d8fcbd79a901810aef92e46067cc7b92\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d5e0ad88c2e55994f952b46c2e806792d8fcbd79a901810aef92e46067cc7b92\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T06:45:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T06:45:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://22796f78d3d551f1ee271ca8581e196f142e70622944154f7d408a88c098f53b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://22796f78d3d551f1ee271ca8581e196f142e70622944154f7d408a88c098f53b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T06:45:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T06:45:16Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://9fb973559072f07252dcf50bda74d422ea2ed50000c02105381f8d21e5ff9888\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/op
enshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9fb973559072f07252dcf50bda74d422ea2ed50000c02105381f8d21e5ff9888\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T06:45:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T06:45:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T06:45:15Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:46:35Z is after 2025-08-24T17:21:41Z" Dec 03 06:46:35 crc kubenswrapper[4475]: I1203 06:46:35.849408 4475 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:33Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:33Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://30d9a05de148a1dbe0fa8f07bbc5f4f2c3cba395d686af03f2da63f8cdfe431c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:45:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://07cf8d993193bca34b30ea77c473af45652fde6e73d0586efb78c14b9d003e22\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:45:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:46:35Z is after 2025-08-24T17:21:41Z" Dec 03 06:46:35 crc kubenswrapper[4475]: I1203 06:46:35.857359 4475 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:34Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:34Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6444fe7571ebb90d4ff4b30dc1a397023310b50b1816d0197cb545b4f5f7480f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:45:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-12-03T06:46:35Z is after 2025-08-24T17:21:41Z" Dec 03 06:46:35 crc kubenswrapper[4475]: I1203 06:46:35.866025 4475 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:32Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:32Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:46:35Z is after 2025-08-24T17:21:41Z" Dec 03 06:46:35 crc kubenswrapper[4475]: I1203 06:46:35.873329 4475 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-pcw7j" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8c1979d0-303c-4cf6-9087-3cb2e1aac73b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://eebaa73cf4e1efd781b258dd26910dc004392716180b14a7e64e89a03f2032a1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:45:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nrf7r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T06:45:34Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-pcw7j\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:46:35Z is after 2025-08-24T17:21:41Z" Dec 03 06:46:35 crc kubenswrapper[4475]: I1203 06:46:35.882050 4475 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-9b2j8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f3a17c67-95e0-4889-8a30-64c08b6720f4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:46:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:46:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4124e8c8426150d1057ec040dd3bfd12c7def09c85144927fd48515e9e9e9685\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\
\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d2d627e2c307a8db9c86e8020f2b1c25c6e061e0c6460be63e231566488beaca\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-03T06:46:20Z\\\",\\\"message\\\":\\\"2025-12-03T06:45:35+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_e060c49a-fe61-4a85-9c90-496b6bf089f9\\\\n2025-12-03T06:45:35+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_e060c49a-fe61-4a85-9c90-496b6bf089f9 to /host/opt/cni/bin/\\\\n2025-12-03T06:45:35Z [verbose] multus-daemon started\\\\n2025-12-03T06:45:35Z [verbose] Readiness Indicator file check\\\\n2025-12-03T06:46:20Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-03T06:45:35Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:46:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\
\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6pdk6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T06:45:34Z\\\"}}\" for pod \"openshift-multus\"/\"multus-9b2j8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:46:35Z is after 2025-08-24T17:21:41Z" Dec 03 06:46:35 crc kubenswrapper[4475]: I1203 06:46:35.887112 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:46:35 crc kubenswrapper[4475]: I1203 06:46:35.887138 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:46:35 crc kubenswrapper[4475]: I1203 06:46:35.887146 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:46:35 crc kubenswrapper[4475]: I1203 
06:46:35.887159 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:46:35 crc kubenswrapper[4475]: I1203 06:46:35.887166 4475 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:46:35Z","lastTransitionTime":"2025-12-03T06:46:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 06:46:35 crc kubenswrapper[4475]: I1203 06:46:35.894858 4475 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-g9t4l" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8f42839e-dbc4-445a-a15b-c3aa14813958\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:35Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:35Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://53948489397bbbfdf5f766211088d7f12fcd2dfbc8c3da6493e5abc49e3b41f5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:45:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ppdm2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a5090474cca8b8e2ed539ea74377506638d300be7eb750b3f3285477d8c9a375\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:45:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ppdm2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://60d3ec7cab1f249e81ae1db9ab97fa02e8b3c9d8376af4c6682dc3fc6f9d6d92\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:45:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ppdm2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b3243c863a4fb593b39fc3e3b835f647e9373d8b2dec69c5ff7657ed73c8f78a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:45:36Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ppdm2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://32897756f3658fda95db77180a0553a9d8656ed49c3ae5a017d32f5c5133a5a9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:45:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ppdm2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5e288f95676d5823cd3cb005318489d2f629a8fb74ad17ce6a67978d76006192\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:45:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ppdm2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c3d53d023db886d8a8772c0790104577a7a8914b8cf882b251e44407064c3141\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dbbaea5df5db7406137d9fe054e2abd7fbb765809c6aa804a531d4d0f7c8328e\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-03T06:46:08Z\\\",\\\"message\\\":\\\"gins-k9cmc\\\\nI1203 06:46:08.075193 6119 obj_retry.go:365] Adding new object: *v1.Pod openshift-multus/multus-additional-cni-plugins-k9cmc\\\\nI1203 06:46:08.075199 6119 ovn.go:134] Ensuring zone local for Pod openshift-multus/multus-additional-cni-plugins-k9cmc in node crc\\\\nI1203 06:46:08.075203 6119 obj_retry.go:386] Retry successful for 
*v1.Pod openshift-multus/multus-additional-cni-plugins-k9cmc after 0 failed attempt(s)\\\\nI1203 06:46:08.075207 6119 default_network_controller.go:776] Recording success event on pod openshift-multus/multus-additional-cni-plugins-k9cmc\\\\nI1203 06:46:08.075218 6119 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nI1203 06:46:08.075238 6119 loadbalancer.go:304] Deleted 0 stale LBs for map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-marketplace/redhat-marketplace\\\\\\\"}\\\\nI1203 06:46:08.075253 6119 services_controller.go:360] Finished syncing service redhat-marketplace on namespace openshift-marketplace for network=default : 1.255581ms\\\\nF1203 06:46:08.075260 6119 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 
0x1fcc3\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-03T06:46:07Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:46:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"na
me\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ppdm2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://66a9c7568957099255bc910496da695e2af0122f2c853c3e221c666d7c2dee78\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:45:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ppdm2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://400610ebcdc7d47ecc1345287847a1909871411a12cdb3cbf895e05039b81c2b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"re
ady\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://400610ebcdc7d47ecc1345287847a1909871411a12cdb3cbf895e05039b81c2b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T06:45:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T06:45:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ppdm2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T06:45:35Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-g9t4l\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:46:35Z is after 2025-08-24T17:21:41Z" Dec 03 06:46:35 crc kubenswrapper[4475]: I1203 06:46:35.903572 4475 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6b24b6a4-c126-4d6d-88ae-b270b4743110\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:46:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:46:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://32947be5ce5a85090284dbc3edd8ad437495db9f0b4a7310656e38ecf5c649de\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:45:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0f26dedc0507a8675c0dc842b67772e84b5276713808e656bcf620ebb7bd3f5c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:45:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d17d8916a080f159b25abbfd9575bdc197c58bf256dbeb6367e74368f5b7f1fc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:45:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b9ea750eb608c854e92aa32dfc7d2085a0c00c3554368c7119487e4a730fdc1c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://b9ea750eb608c854e92aa32dfc7d2085a0c00c3554368c7119487e4a730fdc1c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T06:45:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T06:45:16Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T06:45:15Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:46:35Z is after 2025-08-24T17:21:41Z" Dec 03 06:46:35 crc kubenswrapper[4475]: I1203 06:46:35.912677 4475 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:33Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:33Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2f651c16a4a98ff0a9b4783e60ece4c410d5fcb7d05ad42bf7842d8bb8a99f27\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:45:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-12-03T06:46:35Z is after 2025-08-24T17:21:41Z" Dec 03 06:46:35 crc kubenswrapper[4475]: I1203 06:46:35.922256 4475 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-hq2rn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7e9dd470-572a-4396-9be7-1a37e3c48977\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:47Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:47Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cg4hm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cg4hm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T06:45:47Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-hq2rn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:46:35Z is after 2025-08-24T17:21:41Z" Dec 03 06:46:35 crc 
kubenswrapper[4475]: I1203 06:46:35.931000 4475 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:32Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:32Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:46:35Z is after 2025-08-24T17:21:41Z" Dec 03 06:46:35 crc kubenswrapper[4475]: I1203 06:46:35.942856 4475 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-k9cmc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7168f008-1b03-40cf-94fa-a71d470454bf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://625db083ebf24244e0b28ac937bfa2554497ca35b8f7a1fee0ac739d647c70de\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:45:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s6bbq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://31584b054f88aa7f7e4f1096e2b11acf6f106b7f2e4ced19768808e5df1a6acc\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://31584b054f88aa7f7e4f1096e2b11acf6f106b7f2e4ced19768808e5df1a6acc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T06:45:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T06:45:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s6bbq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9a644e827feb786d7298e41022ef3bc0d2483279c106dddea8e2c7a3c62c3c0d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9a644e827feb786d7298e41022ef3bc0d2483279c106dddea8e2c7a3c62c3c0d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T06:45:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T06:45:36Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s6bbq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://742f2f4dc23fff3df8e6d67902ef721b3db1823653b11a69faabdaf8d7650667\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://742f2f4dc23fff3df8e6d67902ef721b3db1823653b11a69faabdaf8d7650667\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T06:45:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T06:45:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s6bbq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2e5e8
74d26bf8bc806d74d55a8b9306cc30cca122d2ae0731b0a76ae7ac30450\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2e5e874d26bf8bc806d74d55a8b9306cc30cca122d2ae0731b0a76ae7ac30450\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T06:45:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T06:45:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s6bbq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d644ab44eabce045c9f9b23fab29e574e2f9f49c0cc14b830560996a0ec98880\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d644ab44eabce045c9f9b23fab29e574e2f9f49c0cc14b830560996a0ec98880\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T06:45:39Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2025-12-03T06:45:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s6bbq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ae9935e33badff0b016f8b5a02cb59d8b64451364581023ca3ec8e87fba0aa3a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ae9935e33badff0b016f8b5a02cb59d8b64451364581023ca3ec8e87fba0aa3a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T06:45:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T06:45:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s6bbq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T06:45:35Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-k9cmc\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:46:35Z is after 2025-08-24T17:21:41Z" Dec 03 06:46:35 crc kubenswrapper[4475]: I1203 06:46:35.989037 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:46:35 crc kubenswrapper[4475]: I1203 06:46:35.989067 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:46:35 crc kubenswrapper[4475]: I1203 06:46:35.989076 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:46:35 crc kubenswrapper[4475]: I1203 06:46:35.989088 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:46:35 crc kubenswrapper[4475]: I1203 06:46:35.989114 4475 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:46:35Z","lastTransitionTime":"2025-12-03T06:46:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:46:36 crc kubenswrapper[4475]: I1203 06:46:36.090908 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:46:36 crc kubenswrapper[4475]: I1203 06:46:36.090938 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:46:36 crc kubenswrapper[4475]: I1203 06:46:36.090947 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:46:36 crc kubenswrapper[4475]: I1203 06:46:36.090958 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:46:36 crc kubenswrapper[4475]: I1203 06:46:36.090968 4475 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:46:36Z","lastTransitionTime":"2025-12-03T06:46:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:46:36 crc kubenswrapper[4475]: I1203 06:46:36.192492 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:46:36 crc kubenswrapper[4475]: I1203 06:46:36.192612 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:46:36 crc kubenswrapper[4475]: I1203 06:46:36.192680 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:46:36 crc kubenswrapper[4475]: I1203 06:46:36.192760 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:46:36 crc kubenswrapper[4475]: I1203 06:46:36.192822 4475 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:46:36Z","lastTransitionTime":"2025-12-03T06:46:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:46:36 crc kubenswrapper[4475]: I1203 06:46:36.268510 4475 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 03 06:46:36 crc kubenswrapper[4475]: I1203 06:46:36.268604 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 03 06:46:36 crc kubenswrapper[4475]: I1203 06:46:36.268656 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 03 06:46:36 crc kubenswrapper[4475]: I1203 06:46:36.268678 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 03 06:46:36 crc kubenswrapper[4475]: I1203 06:46:36.268725 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: 
\"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 03 06:46:36 crc kubenswrapper[4475]: E1203 06:46:36.268752 4475 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Dec 03 06:46:36 crc kubenswrapper[4475]: E1203 06:46:36.268806 4475 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-12-03 06:47:40.268790292 +0000 UTC m=+145.073688636 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Dec 03 06:46:36 crc kubenswrapper[4475]: E1203 06:46:36.268813 4475 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Dec 03 06:46:36 crc kubenswrapper[4475]: E1203 06:46:36.268833 4475 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Dec 03 06:46:36 crc kubenswrapper[4475]: E1203 06:46:36.268839 4475 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Dec 03 06:46:36 crc kubenswrapper[4475]: E1203 06:46:36.268853 4475 
projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 03 06:46:36 crc kubenswrapper[4475]: E1203 06:46:36.268892 4475 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-12-03 06:47:40.268879921 +0000 UTC m=+145.073778255 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Dec 03 06:46:36 crc kubenswrapper[4475]: E1203 06:46:36.268907 4475 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-12-03 06:47:40.268902032 +0000 UTC m=+145.073800367 (durationBeforeRetry 1m4s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 03 06:46:36 crc kubenswrapper[4475]: E1203 06:46:36.268924 4475 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Dec 03 06:46:36 crc kubenswrapper[4475]: E1203 06:46:36.268933 4475 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Dec 03 06:46:36 crc kubenswrapper[4475]: E1203 06:46:36.268940 4475 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 03 06:46:36 crc kubenswrapper[4475]: E1203 06:46:36.268968 4475 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-12-03 06:47:40.268954692 +0000 UTC m=+145.073853026 (durationBeforeRetry 1m4s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 03 06:46:36 crc kubenswrapper[4475]: E1203 06:46:36.269202 4475 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-03 06:47:40.269185828 +0000 UTC m=+145.074084162 (durationBeforeRetry 1m4s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 03 06:46:36 crc kubenswrapper[4475]: I1203 06:46:36.294671 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:46:36 crc kubenswrapper[4475]: I1203 06:46:36.295095 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:46:36 crc kubenswrapper[4475]: I1203 06:46:36.295152 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:46:36 crc kubenswrapper[4475]: I1203 06:46:36.295215 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:46:36 crc kubenswrapper[4475]: I1203 06:46:36.295266 4475 
setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:46:36Z","lastTransitionTime":"2025-12-03T06:46:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 06:46:36 crc kubenswrapper[4475]: I1203 06:46:36.397268 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:46:36 crc kubenswrapper[4475]: I1203 06:46:36.397297 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:46:36 crc kubenswrapper[4475]: I1203 06:46:36.397307 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:46:36 crc kubenswrapper[4475]: I1203 06:46:36.397319 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:46:36 crc kubenswrapper[4475]: I1203 06:46:36.397327 4475 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:46:36Z","lastTransitionTime":"2025-12-03T06:46:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 06:46:36 crc kubenswrapper[4475]: I1203 06:46:36.490313 4475 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 03 06:46:36 crc kubenswrapper[4475]: E1203 06:46:36.490398 4475 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 03 06:46:36 crc kubenswrapper[4475]: I1203 06:46:36.490551 4475 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 03 06:46:36 crc kubenswrapper[4475]: E1203 06:46:36.490617 4475 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 03 06:46:36 crc kubenswrapper[4475]: I1203 06:46:36.490666 4475 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 03 06:46:36 crc kubenswrapper[4475]: E1203 06:46:36.490775 4475 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 03 06:46:36 crc kubenswrapper[4475]: I1203 06:46:36.498908 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:46:36 crc kubenswrapper[4475]: I1203 06:46:36.498954 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:46:36 crc kubenswrapper[4475]: I1203 06:46:36.498964 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:46:36 crc kubenswrapper[4475]: I1203 06:46:36.498978 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:46:36 crc kubenswrapper[4475]: I1203 06:46:36.498986 4475 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:46:36Z","lastTransitionTime":"2025-12-03T06:46:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:46:36 crc kubenswrapper[4475]: I1203 06:46:36.600146 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:46:36 crc kubenswrapper[4475]: I1203 06:46:36.600252 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:46:36 crc kubenswrapper[4475]: I1203 06:46:36.600331 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:46:36 crc kubenswrapper[4475]: I1203 06:46:36.600391 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:46:36 crc kubenswrapper[4475]: I1203 06:46:36.600466 4475 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:46:36Z","lastTransitionTime":"2025-12-03T06:46:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:46:36 crc kubenswrapper[4475]: I1203 06:46:36.702098 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:46:36 crc kubenswrapper[4475]: I1203 06:46:36.702212 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:46:36 crc kubenswrapper[4475]: I1203 06:46:36.702297 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:46:36 crc kubenswrapper[4475]: I1203 06:46:36.702360 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:46:36 crc kubenswrapper[4475]: I1203 06:46:36.702425 4475 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:46:36Z","lastTransitionTime":"2025-12-03T06:46:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:46:36 crc kubenswrapper[4475]: I1203 06:46:36.775510 4475 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-g9t4l_8f42839e-dbc4-445a-a15b-c3aa14813958/ovnkube-controller/3.log" Dec 03 06:46:36 crc kubenswrapper[4475]: I1203 06:46:36.776033 4475 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-g9t4l_8f42839e-dbc4-445a-a15b-c3aa14813958/ovnkube-controller/2.log" Dec 03 06:46:36 crc kubenswrapper[4475]: I1203 06:46:36.778039 4475 generic.go:334] "Generic (PLEG): container finished" podID="8f42839e-dbc4-445a-a15b-c3aa14813958" containerID="c3d53d023db886d8a8772c0790104577a7a8914b8cf882b251e44407064c3141" exitCode=1 Dec 03 06:46:36 crc kubenswrapper[4475]: I1203 06:46:36.778069 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-g9t4l" event={"ID":"8f42839e-dbc4-445a-a15b-c3aa14813958","Type":"ContainerDied","Data":"c3d53d023db886d8a8772c0790104577a7a8914b8cf882b251e44407064c3141"} Dec 03 06:46:36 crc kubenswrapper[4475]: I1203 06:46:36.778114 4475 scope.go:117] "RemoveContainer" containerID="dbbaea5df5db7406137d9fe054e2abd7fbb765809c6aa804a531d4d0f7c8328e" Dec 03 06:46:36 crc kubenswrapper[4475]: I1203 06:46:36.778435 4475 scope.go:117] "RemoveContainer" containerID="c3d53d023db886d8a8772c0790104577a7a8914b8cf882b251e44407064c3141" Dec 03 06:46:36 crc kubenswrapper[4475]: E1203 06:46:36.778581 4475 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-g9t4l_openshift-ovn-kubernetes(8f42839e-dbc4-445a-a15b-c3aa14813958)\"" pod="openshift-ovn-kubernetes/ovnkube-node-g9t4l" podUID="8f42839e-dbc4-445a-a15b-c3aa14813958" Dec 03 06:46:36 crc kubenswrapper[4475]: I1203 06:46:36.788358 4475 status_manager.go:875] "Failed to update 
status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:32Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:32Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:46:36Z is after 2025-08-24T17:21:41Z" Dec 03 06:46:36 crc kubenswrapper[4475]: I1203 06:46:36.799191 4475 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-k9cmc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7168f008-1b03-40cf-94fa-a71d470454bf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://625db083ebf24244e0b28ac937bfa2554497ca35b8f7a1fee0ac739d647c70de\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:45:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s6bbq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://31584b054f88aa7f7e4f1096e2b11acf6f106b7f2e4ced19768808e5df1a6acc\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://31584b054f88aa7f7e4f1096e2b11acf6f106b7f2e4ced19768808e5df1a6acc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T06:45:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T06:45:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s6bbq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9a644e827feb786d7298e41022ef3bc0d2483279c106dddea8e2c7a3c62c3c0d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9a644e827feb786d7298e41022ef3bc0d2483279c106dddea8e2c7a3c62c3c0d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T06:45:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T06:45:36Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s6bbq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://742f2f4dc23fff3df8e6d67902ef721b3db1823653b11a69faabdaf8d7650667\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://742f2f4dc23fff3df8e6d67902ef721b3db1823653b11a69faabdaf8d7650667\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T06:45:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T06:45:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s6bbq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2e5e8
74d26bf8bc806d74d55a8b9306cc30cca122d2ae0731b0a76ae7ac30450\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2e5e874d26bf8bc806d74d55a8b9306cc30cca122d2ae0731b0a76ae7ac30450\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T06:45:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T06:45:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s6bbq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d644ab44eabce045c9f9b23fab29e574e2f9f49c0cc14b830560996a0ec98880\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d644ab44eabce045c9f9b23fab29e574e2f9f49c0cc14b830560996a0ec98880\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T06:45:39Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2025-12-03T06:45:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s6bbq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ae9935e33badff0b016f8b5a02cb59d8b64451364581023ca3ec8e87fba0aa3a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ae9935e33badff0b016f8b5a02cb59d8b64451364581023ca3ec8e87fba0aa3a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T06:45:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T06:45:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s6bbq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T06:45:35Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-k9cmc\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:46:36Z is after 2025-08-24T17:21:41Z" Dec 03 06:46:36 crc kubenswrapper[4475]: I1203 06:46:36.803788 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:46:36 crc kubenswrapper[4475]: I1203 06:46:36.803818 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:46:36 crc kubenswrapper[4475]: I1203 06:46:36.803827 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:46:36 crc kubenswrapper[4475]: I1203 06:46:36.803838 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:46:36 crc kubenswrapper[4475]: I1203 06:46:36.803846 4475 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:46:36Z","lastTransitionTime":"2025-12-03T06:46:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:46:36 crc kubenswrapper[4475]: I1203 06:46:36.809149 4475 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"57d2f580-9528-4200-b0a4-797fed1ae972\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://822cdbfb2e81d80c5de0253daa42f2a5c89e9cd0eb8a5c3cf620780d17f9a6d0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:45:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-d
ir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d66a9136874b2e25c94cd291aa6d7f4694ac409f16766fd69c8aab8068a441fb\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:45:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e40c4f29925f494c0f5f01e2ecbcd2e4db2a5f3911a55a874c6d0006f01982de\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:45:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7fc0ee9e5a408a0a9e701afaf1db7bc3f58fd1830044730e9c680664642b5e4e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945
c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:45:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1dd8bd42f01469966b55416fc8af1dd71d341c774263bb3a56190af4cd9e7daa\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:45:17Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5da1155d7b5e933e5db3acc4c1a3fa1b3b90fd79289641f9a3d1290956128628\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5da1155d7b5e933e5db3acc4c1a3fa1b3b90fd79289641f9a3d1290956128628\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T06:45:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T06:45:16Z\\\"}
},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T06:45:15Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:46:36Z is after 2025-08-24T17:21:41Z" Dec 03 06:46:36 crc kubenswrapper[4475]: I1203 06:46:36.817659 4475 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a0af3d80-5aae-4d3b-a974-490687df49f3\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fa848c68a20d5db5c603cafa808518de84e427cbeea4bbc1be31151e6f839b3e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85ae
d21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:45:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fe3e0d5fed18fddd7a1174f7a9f12290ce318e9a0de40fe432c79f6f2e24a608\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:45:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://43c05977da6544bc781a279fcddb3279dfee510fdd0a6f4f1a22b8629f17475f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"star
tedAt\\\":\\\"2025-12-03T06:45:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ef987b2e9a0fa630edf6d5c06d5f47c5debd1b75d4626aefe7d8ef44bb974eb8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:45:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T06:45:15Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:46:36Z is after 2025-08-24T17:21:41Z" Dec 03 06:46:36 crc kubenswrapper[4475]: I1203 06:46:36.826232 4475 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" 
err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:32Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:32Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:46:36Z is after 2025-08-24T17:21:41Z" Dec 03 06:46:36 crc kubenswrapper[4475]: I1203 06:46:36.834189 4475 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-tjbzg" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"91aee7be-4a52-4598-803f-2deebe0674de\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f13f644093fd1214d8fb39853857b4113dd7fde64f1a60ff6848fd4c5350f5b6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:45:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xvqvg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://159d103ae2d5d19ea94c57a59b534773f0e32f4c
b379a412b63ca743e221096e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:45:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xvqvg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T06:45:35Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-tjbzg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:46:36Z is after 2025-08-24T17:21:41Z" Dec 03 06:46:36 crc kubenswrapper[4475]: I1203 06:46:36.842096 4475 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-sbkp5" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b1df0a77-f3cc-49ab-9fbb-8a4c7608291b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5938dd3c72bee55a3a07312d31a0eaf2df226bb931b07300d71b6e7ff69c905b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:45:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-65wzb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4408ad7b7f122c0364b95e0e9761bc28dfb02
e7ea00537a70fc031c16b38be6b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:45:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-65wzb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T06:45:46Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-sbkp5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:46:36Z is after 2025-08-24T17:21:41Z" Dec 03 06:46:36 crc kubenswrapper[4475]: I1203 06:46:36.856359 4475 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"897f1a97-930a-4c3c-8804-d7cd6006ae9c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fbb015d3e05f9f94fc225cce6e24bc4a5df0bfc5aaea15fe120e2cc4b8f02902\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:45:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://da747a5ea4f790c71d99693c4bd79a1074f756a20f628fa63e8bad9a713645fc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:45:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6bf56315b6ad05ea9af0319db29b919ed0332d2a671c5ba94ea325bd45ef5703\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:45:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e045b99328661616ea0e44cd50bd394a403836eede05459d117567c191401172\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:45:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://054b1d2565cc9690152740f71682028595283525344a38ccea66c1f072eae92b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:45:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d5e0ad88c2e55994f952b46c2e806792d8fcbd79a901810aef92e46067cc7b92\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d5e0ad88c2e55994f952b46c2e806792d8fcbd79a901810aef92e46067cc7b92\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2025-12-03T06:45:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T06:45:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://22796f78d3d551f1ee271ca8581e196f142e70622944154f7d408a88c098f53b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://22796f78d3d551f1ee271ca8581e196f142e70622944154f7d408a88c098f53b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T06:45:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T06:45:16Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://9fb973559072f07252dcf50bda74d422ea2ed50000c02105381f8d21e5ff9888\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9fb973559072f07252dcf50bda74d422ea2ed50000c02105381f8d21e5ff9888\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T06:45:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T06:45:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T06:45:15Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:46:36Z is after 2025-08-24T17:21:41Z" Dec 03 06:46:36 crc kubenswrapper[4475]: I1203 06:46:36.863278 4475 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-dqbgx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9ef36226-4b8b-4a7b-a87f-daa9dda6e70b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5dc78fa3b07b9a5535f697323e9ed322ceefdc8798157160a05eb71017ac3a29\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235
da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:45:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wjjp4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T06:45:38Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-dqbgx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:46:36Z is after 2025-08-24T17:21:41Z" Dec 03 06:46:36 crc kubenswrapper[4475]: I1203 06:46:36.869776 4475 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-pcw7j" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8c1979d0-303c-4cf6-9087-3cb2e1aac73b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://eebaa73cf4e1efd781b258dd26910dc004392716180b14a7e64e89a03f2032a1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:45:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nrf7r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T06:45:34Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-pcw7j\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:46:36Z is after 2025-08-24T17:21:41Z" Dec 03 06:46:36 crc kubenswrapper[4475]: I1203 06:46:36.879966 4475 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-9b2j8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f3a17c67-95e0-4889-8a30-64c08b6720f4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:46:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:46:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4124e8c8426150d1057ec040dd3bfd12c7def09c85144927fd48515e9e9e9685\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\
\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d2d627e2c307a8db9c86e8020f2b1c25c6e061e0c6460be63e231566488beaca\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-03T06:46:20Z\\\",\\\"message\\\":\\\"2025-12-03T06:45:35+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_e060c49a-fe61-4a85-9c90-496b6bf089f9\\\\n2025-12-03T06:45:35+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_e060c49a-fe61-4a85-9c90-496b6bf089f9 to /host/opt/cni/bin/\\\\n2025-12-03T06:45:35Z [verbose] multus-daemon started\\\\n2025-12-03T06:45:35Z [verbose] Readiness Indicator file check\\\\n2025-12-03T06:46:20Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-03T06:45:35Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:46:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\
\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6pdk6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T06:45:34Z\\\"}}\" for pod \"openshift-multus\"/\"multus-9b2j8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:46:36Z is after 2025-08-24T17:21:41Z" Dec 03 06:46:36 crc kubenswrapper[4475]: I1203 06:46:36.897140 4475 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-g9t4l" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8f42839e-dbc4-445a-a15b-c3aa14813958\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:35Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:35Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://53948489397bbbfdf5f766211088d7f12fcd2dfbc8c3da6493e5abc49e3b41f5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:45:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ppdm2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a5090474cca8b8e2ed539ea74377506638d300be7eb750b3f3285477d8c9a375\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:45:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ppdm2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://60d3ec7cab1f249e81ae1db9ab97fa02e8b3c9d8376af4c6682dc3fc6f9d6d92\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:45:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ppdm2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b3243c863a4fb593b39fc3e3b835f647e9373d8b2dec69c5ff7657ed73c8f78a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:45:36Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ppdm2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://32897756f3658fda95db77180a0553a9d8656ed49c3ae5a017d32f5c5133a5a9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:45:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ppdm2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5e288f95676d5823cd3cb005318489d2f629a8fb74ad17ce6a67978d76006192\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:45:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ppdm2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c3d53d023db886d8a8772c0790104577a7a8914b8cf882b251e44407064c3141\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dbbaea5df5db7406137d9fe054e2abd7fbb765809c6aa804a531d4d0f7c8328e\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-03T06:46:08Z\\\",\\\"message\\\":\\\"gins-k9cmc\\\\nI1203 06:46:08.075193 6119 obj_retry.go:365] Adding new object: *v1.Pod openshift-multus/multus-additional-cni-plugins-k9cmc\\\\nI1203 06:46:08.075199 6119 ovn.go:134] Ensuring zone local for Pod openshift-multus/multus-additional-cni-plugins-k9cmc in node crc\\\\nI1203 06:46:08.075203 6119 obj_retry.go:386] Retry successful for 
*v1.Pod openshift-multus/multus-additional-cni-plugins-k9cmc after 0 failed attempt(s)\\\\nI1203 06:46:08.075207 6119 default_network_controller.go:776] Recording success event on pod openshift-multus/multus-additional-cni-plugins-k9cmc\\\\nI1203 06:46:08.075218 6119 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nI1203 06:46:08.075238 6119 loadbalancer.go:304] Deleted 0 stale LBs for map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-marketplace/redhat-marketplace\\\\\\\"}\\\\nI1203 06:46:08.075253 6119 services_controller.go:360] Finished syncing service redhat-marketplace on namespace openshift-marketplace for network=default : 1.255581ms\\\\nF1203 06:46:08.075260 6119 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-03T06:46:07Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c3d53d023db886d8a8772c0790104577a7a8914b8cf882b251e44407064c3141\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-03T06:46:36Z\\\",\\\"message\\\":\\\"model_client.go:382] Update operations generated as: [{Op:update Table:Logical_Switch_Port Row:map[addresses:{GoSet:[0a:58:0a:d9:00:3b 10.217.0.59]} options:{GoMap:map[iface-id-ver:9d751cbb-f2e2-430d-9754-c882a5e924a5 requested-chassis:crc]} port_security:{GoSet:[0a:58:0a:d9:00:3b 10.217.0.59]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {960d98b2-dc64-4e93-a4b6-9b19847af71e}] Until: Durable:\\\\u003cnil\\\\u003e 
Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI1203 06:46:36.095528 6471 ovn.go:134] Ensuring zone local for Pod openshift-multus/network-metrics-daemon-hq2rn in node crc\\\\nI1203 06:46:36.095538 6471 base_network_controller_pods.go:477] [default/openshift-multus/network-metrics-daemon-hq2rn] creating logical port openshift-multus_network-metrics-daemon-hq2rn for pod on switch crc\\\\nI1203 06:46:36.095506 6471 base_network_controller_pods.go:477] [default/openshift-network-diagnostics/network-check-target-xd92c] creating logical port openshift-network-diagnostics_network-check-target-xd92c for pod on switch crc\\\\nF1203 06:46:36.095556 6471 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-03T06:46:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\
\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ppdm2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://66a9c7568957099255bc910496da695e2af0122f2c853c3e221c666d7c2dee78\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:45:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\
"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ppdm2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://400610ebcdc7d47ecc1345287847a1909871411a12cdb3cbf895e05039b81c2b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://400610ebcdc7d47ecc1345287847a1909871411a12cdb3cbf895e05039b81c2b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T06:45:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T06:45:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ppdm2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T06:45:35Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-g9t4l\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:46:36Z is after 2025-08-24T17:21:41Z" Dec 03 06:46:36 crc kubenswrapper[4475]: I1203 
06:46:36.905371 4475 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6b24b6a4-c126-4d6d-88ae-b270b4743110\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:46:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:46:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://32947be5ce5a85090284dbc3edd8ad437495db9f0b4a7310656e38ecf5c649de\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:45:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0f26dedc0507a8675c0dc842b67772e84b5276713808e656bcf620ebb7bd3f5c\\\",\\\"image\\\":\\\"quay.io/openshift-release-d
ev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:45:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d17d8916a080f159b25abbfd9575bdc197c58bf256dbeb6367e74368f5b7f1fc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:45:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b9ea750eb608c854e92aa32dfc7d2085a0c00c3554368c7119487e4a730fdc1c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\
"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b9ea750eb608c854e92aa32dfc7d2085a0c00c3554368c7119487e4a730fdc1c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T06:45:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T06:45:16Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T06:45:15Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:46:36Z is after 2025-08-24T17:21:41Z" Dec 03 06:46:36 crc kubenswrapper[4475]: I1203 06:46:36.906010 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:46:36 crc kubenswrapper[4475]: I1203 06:46:36.906031 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:46:36 crc kubenswrapper[4475]: I1203 06:46:36.906039 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:46:36 crc kubenswrapper[4475]: I1203 06:46:36.906050 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:46:36 crc kubenswrapper[4475]: I1203 06:46:36.906059 4475 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:46:36Z","lastTransitionTime":"2025-12-03T06:46:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false 
reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 06:46:36 crc kubenswrapper[4475]: I1203 06:46:36.914645 4475 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:33Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:33Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2f651c16a4a98ff0a9b4783e60ece4c410d5fcb7d05ad42bf7842d8bb8a99f27\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:45:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"k
ube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:46:36Z is after 2025-08-24T17:21:41Z" Dec 03 06:46:36 crc kubenswrapper[4475]: I1203 06:46:36.922932 4475 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:33Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:33Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://30d9a05de148a1dbe0fa8f07bbc5f4f2c3cba395d686af03f2da63f8cdfe431c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:45:32Z\\\"}},\\\"volumeMounts\\\":[{\\\
"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://07cf8d993193bca34b30ea77c473af45652fde6e73d0586efb78c14b9d003e22\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:45:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:46:36Z is after 2025-08-24T17:21:41Z" Dec 03 06:46:36 crc kubenswrapper[4475]: I1203 06:46:36.968171 4475 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:34Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:34Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6444fe7571ebb90d4ff4b30dc1a397023310b50b1816d0197cb545b4f5f7480f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:45:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-12-03T06:46:36Z is after 2025-08-24T17:21:41Z" Dec 03 06:46:36 crc kubenswrapper[4475]: I1203 06:46:36.980024 4475 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:32Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:32Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:46:36Z is after 2025-08-24T17:21:41Z" Dec 03 06:46:36 crc kubenswrapper[4475]: I1203 06:46:36.987984 4475 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-hq2rn" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7e9dd470-572a-4396-9be7-1a37e3c48977\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:47Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:47Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cg4hm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cg4hm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T06:45:47Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-hq2rn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:46:36Z is after 2025-08-24T17:21:41Z" Dec 03 06:46:37 crc 
kubenswrapper[4475]: I1203 06:46:37.007554 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:46:37 crc kubenswrapper[4475]: I1203 06:46:37.007598 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:46:37 crc kubenswrapper[4475]: I1203 06:46:37.007606 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:46:37 crc kubenswrapper[4475]: I1203 06:46:37.007619 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:46:37 crc kubenswrapper[4475]: I1203 06:46:37.007627 4475 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:46:37Z","lastTransitionTime":"2025-12-03T06:46:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:46:37 crc kubenswrapper[4475]: I1203 06:46:37.104341 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:46:37 crc kubenswrapper[4475]: I1203 06:46:37.104379 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:46:37 crc kubenswrapper[4475]: I1203 06:46:37.104388 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:46:37 crc kubenswrapper[4475]: I1203 06:46:37.104401 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:46:37 crc kubenswrapper[4475]: I1203 06:46:37.104409 4475 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:46:37Z","lastTransitionTime":"2025-12-03T06:46:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:46:37 crc kubenswrapper[4475]: E1203 06:46:37.112769 4475 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"7800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"24148068Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"8\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"24608868Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T06:46:37Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T06:46:37Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T06:46:37Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T06:46:37Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T06:46:37Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T06:46:37Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T06:46:37Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T06:46:37Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"b860fac6-8533-4b4b-bdad-0cb0561d1495\\\",\\\"systemUUID\\\":\\\"6c3f70a9-a9d8-4b80-a825-7a6426aa17aa\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:46:37Z is after 2025-08-24T17:21:41Z" Dec 03 06:46:37 crc kubenswrapper[4475]: I1203 06:46:37.114809 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:46:37 crc kubenswrapper[4475]: I1203 06:46:37.114831 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:46:37 crc kubenswrapper[4475]: I1203 06:46:37.114838 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:46:37 crc kubenswrapper[4475]: I1203 06:46:37.114850 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:46:37 crc kubenswrapper[4475]: I1203 06:46:37.114858 4475 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:46:37Z","lastTransitionTime":"2025-12-03T06:46:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:46:37 crc kubenswrapper[4475]: E1203 06:46:37.122746 4475 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"7800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"24148068Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"8\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"24608868Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T06:46:37Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T06:46:37Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T06:46:37Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T06:46:37Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T06:46:37Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T06:46:37Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T06:46:37Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T06:46:37Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"b860fac6-8533-4b4b-bdad-0cb0561d1495\\\",\\\"systemUUID\\\":\\\"6c3f70a9-a9d8-4b80-a825-7a6426aa17aa\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:46:37Z is after 2025-08-24T17:21:41Z" Dec 03 06:46:37 crc kubenswrapper[4475]: I1203 06:46:37.124815 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:46:37 crc kubenswrapper[4475]: I1203 06:46:37.124841 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:46:37 crc kubenswrapper[4475]: I1203 06:46:37.124858 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:46:37 crc kubenswrapper[4475]: I1203 06:46:37.124868 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:46:37 crc kubenswrapper[4475]: I1203 06:46:37.124875 4475 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:46:37Z","lastTransitionTime":"2025-12-03T06:46:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:46:37 crc kubenswrapper[4475]: E1203 06:46:37.132781 4475 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"7800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"24148068Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"8\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"24608868Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T06:46:37Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T06:46:37Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T06:46:37Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T06:46:37Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T06:46:37Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T06:46:37Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T06:46:37Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T06:46:37Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"b860fac6-8533-4b4b-bdad-0cb0561d1495\\\",\\\"systemUUID\\\":\\\"6c3f70a9-a9d8-4b80-a825-7a6426aa17aa\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:46:37Z is after 2025-08-24T17:21:41Z" Dec 03 06:46:37 crc kubenswrapper[4475]: I1203 06:46:37.135241 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:46:37 crc kubenswrapper[4475]: I1203 06:46:37.135276 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:46:37 crc kubenswrapper[4475]: I1203 06:46:37.135286 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:46:37 crc kubenswrapper[4475]: I1203 06:46:37.135298 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:46:37 crc kubenswrapper[4475]: I1203 06:46:37.135306 4475 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:46:37Z","lastTransitionTime":"2025-12-03T06:46:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:46:37 crc kubenswrapper[4475]: E1203 06:46:37.152773 4475 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"7800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"24148068Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"8\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"24608868Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T06:46:37Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T06:46:37Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T06:46:37Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T06:46:37Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T06:46:37Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T06:46:37Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T06:46:37Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T06:46:37Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"b860fac6-8533-4b4b-bdad-0cb0561d1495\\\",\\\"systemUUID\\\":\\\"6c3f70a9-a9d8-4b80-a825-7a6426aa17aa\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:46:37Z is after 2025-08-24T17:21:41Z" Dec 03 06:46:37 crc kubenswrapper[4475]: E1203 06:46:37.152872 4475 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Dec 03 06:46:37 crc kubenswrapper[4475]: I1203 06:46:37.153716 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:46:37 crc kubenswrapper[4475]: I1203 06:46:37.153767 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:46:37 crc kubenswrapper[4475]: I1203 06:46:37.153780 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:46:37 crc kubenswrapper[4475]: I1203 06:46:37.153791 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:46:37 crc kubenswrapper[4475]: I1203 06:46:37.153798 4475 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:46:37Z","lastTransitionTime":"2025-12-03T06:46:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:46:37 crc kubenswrapper[4475]: I1203 06:46:37.255446 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:46:37 crc kubenswrapper[4475]: I1203 06:46:37.255553 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:46:37 crc kubenswrapper[4475]: I1203 06:46:37.255632 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:46:37 crc kubenswrapper[4475]: I1203 06:46:37.255699 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:46:37 crc kubenswrapper[4475]: I1203 06:46:37.255762 4475 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:46:37Z","lastTransitionTime":"2025-12-03T06:46:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:46:37 crc kubenswrapper[4475]: I1203 06:46:37.357311 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:46:37 crc kubenswrapper[4475]: I1203 06:46:37.357349 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:46:37 crc kubenswrapper[4475]: I1203 06:46:37.357358 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:46:37 crc kubenswrapper[4475]: I1203 06:46:37.357381 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:46:37 crc kubenswrapper[4475]: I1203 06:46:37.357401 4475 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:46:37Z","lastTransitionTime":"2025-12-03T06:46:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:46:37 crc kubenswrapper[4475]: I1203 06:46:37.459172 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:46:37 crc kubenswrapper[4475]: I1203 06:46:37.459191 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:46:37 crc kubenswrapper[4475]: I1203 06:46:37.459199 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:46:37 crc kubenswrapper[4475]: I1203 06:46:37.459210 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:46:37 crc kubenswrapper[4475]: I1203 06:46:37.459217 4475 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:46:37Z","lastTransitionTime":"2025-12-03T06:46:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 06:46:37 crc kubenswrapper[4475]: I1203 06:46:37.490962 4475 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-hq2rn" Dec 03 06:46:37 crc kubenswrapper[4475]: E1203 06:46:37.491114 4475 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-hq2rn" podUID="7e9dd470-572a-4396-9be7-1a37e3c48977" Dec 03 06:46:37 crc kubenswrapper[4475]: I1203 06:46:37.560869 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:46:37 crc kubenswrapper[4475]: I1203 06:46:37.560901 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:46:37 crc kubenswrapper[4475]: I1203 06:46:37.560911 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:46:37 crc kubenswrapper[4475]: I1203 06:46:37.560924 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:46:37 crc kubenswrapper[4475]: I1203 06:46:37.560933 4475 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:46:37Z","lastTransitionTime":"2025-12-03T06:46:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:46:37 crc kubenswrapper[4475]: I1203 06:46:37.662586 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:46:37 crc kubenswrapper[4475]: I1203 06:46:37.662614 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:46:37 crc kubenswrapper[4475]: I1203 06:46:37.662623 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:46:37 crc kubenswrapper[4475]: I1203 06:46:37.662647 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:46:37 crc kubenswrapper[4475]: I1203 06:46:37.662656 4475 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:46:37Z","lastTransitionTime":"2025-12-03T06:46:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:46:37 crc kubenswrapper[4475]: I1203 06:46:37.764549 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:46:37 crc kubenswrapper[4475]: I1203 06:46:37.764688 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:46:37 crc kubenswrapper[4475]: I1203 06:46:37.764762 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:46:37 crc kubenswrapper[4475]: I1203 06:46:37.764830 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:46:37 crc kubenswrapper[4475]: I1203 06:46:37.764888 4475 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:46:37Z","lastTransitionTime":"2025-12-03T06:46:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:46:37 crc kubenswrapper[4475]: I1203 06:46:37.781235 4475 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-g9t4l_8f42839e-dbc4-445a-a15b-c3aa14813958/ovnkube-controller/3.log" Dec 03 06:46:37 crc kubenswrapper[4475]: I1203 06:46:37.783795 4475 scope.go:117] "RemoveContainer" containerID="c3d53d023db886d8a8772c0790104577a7a8914b8cf882b251e44407064c3141" Dec 03 06:46:37 crc kubenswrapper[4475]: E1203 06:46:37.783980 4475 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-g9t4l_openshift-ovn-kubernetes(8f42839e-dbc4-445a-a15b-c3aa14813958)\"" pod="openshift-ovn-kubernetes/ovnkube-node-g9t4l" podUID="8f42839e-dbc4-445a-a15b-c3aa14813958" Dec 03 06:46:37 crc kubenswrapper[4475]: I1203 06:46:37.793539 4475 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:32Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:32Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:46:37Z is after 2025-08-24T17:21:41Z" Dec 03 06:46:37 crc kubenswrapper[4475]: I1203 06:46:37.803608 4475 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-k9cmc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7168f008-1b03-40cf-94fa-a71d470454bf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://625db083ebf24244e0b28ac937bfa2554497ca35b8f7a1fee0ac739d647c70de\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:45:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s6bbq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://31584b054f88aa7f7e4f1096e2b11acf6f106b7f2e4ced19768808e5df1a6acc\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://31584b054f88aa7f7e4f1096e2b11acf6f106b7f2e4ced19768808e5df1a6acc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T06:45:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T06:45:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s6bbq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9a644e827feb786d7298e41022ef3bc0d2483279c106dddea8e2c7a3c62c3c0d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9a644e827feb786d7298e41022ef3bc0d2483279c106dddea8e2c7a3c62c3c0d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T06:45:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T06:45:36Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s6bbq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://742f2f4dc23fff3df8e6d67902ef721b3db1823653b11a69faabdaf8d7650667\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://742f2f4dc23fff3df8e6d67902ef721b3db1823653b11a69faabdaf8d7650667\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T06:45:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T06:45:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s6bbq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2e5e8
74d26bf8bc806d74d55a8b9306cc30cca122d2ae0731b0a76ae7ac30450\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2e5e874d26bf8bc806d74d55a8b9306cc30cca122d2ae0731b0a76ae7ac30450\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T06:45:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T06:45:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s6bbq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d644ab44eabce045c9f9b23fab29e574e2f9f49c0cc14b830560996a0ec98880\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d644ab44eabce045c9f9b23fab29e574e2f9f49c0cc14b830560996a0ec98880\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T06:45:39Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2025-12-03T06:45:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s6bbq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ae9935e33badff0b016f8b5a02cb59d8b64451364581023ca3ec8e87fba0aa3a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ae9935e33badff0b016f8b5a02cb59d8b64451364581023ca3ec8e87fba0aa3a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T06:45:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T06:45:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s6bbq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T06:45:35Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-k9cmc\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:46:37Z is after 2025-08-24T17:21:41Z" Dec 03 06:46:37 crc kubenswrapper[4475]: I1203 06:46:37.812063 4475 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"57d2f580-9528-4200-b0a4-797fed1ae972\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://822cdbfb2e81d80c5de0253daa42f2a5c89e9cd0eb8a5c3cf620780d17f9a6d0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:45:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\"
:\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d66a9136874b2e25c94cd291aa6d7f4694ac409f16766fd69c8aab8068a441fb\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:45:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e40c4f29925f494c0f5f01e2ecbcd2e4db2a5f3911a55a874c6d0006f01982de\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:45:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7fc0ee9e5a408a0a9e701afaf1db7bc3f58fd1830044730e9c680664642b5e4e\\\",\\\"image\\\":\\\"quay.io/crcont/
openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:45:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1dd8bd42f01469966b55416fc8af1dd71d341c774263bb3a56190af4cd9e7daa\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:45:17Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5da1155d7b5e933e5db3acc4c1a3fa1b3b90fd79289641f9a3d1290956128628\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\
\\"containerID\\\":\\\"cri-o://5da1155d7b5e933e5db3acc4c1a3fa1b3b90fd79289641f9a3d1290956128628\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T06:45:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T06:45:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T06:45:15Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:46:37Z is after 2025-08-24T17:21:41Z" Dec 03 06:46:37 crc kubenswrapper[4475]: I1203 06:46:37.821212 4475 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a0af3d80-5aae-4d3b-a974-490687df49f3\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fa848c68a20d5db5c603cafa808518de84e427cbeea4bbc1be31151e6f839b3e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:45:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fe3e0d5fed18fddd7a1174f7a9f12290ce318e9a0de40fe432c79f6f2e24a608\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:45:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://43c05977da6544bc781a279fcddb3279dfee510fdd0a6f4f1a22b8629f17475f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:45:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ef987b2e9a0fa630edf6d5c06d5f47c5debd1b75d4626aefe7d8ef44bb974eb8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-12-03T06:45:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T06:45:15Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:46:37Z is after 2025-08-24T17:21:41Z" Dec 03 06:46:37 crc kubenswrapper[4475]: I1203 06:46:37.829259 4475 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:32Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:32Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:46:37Z is after 2025-08-24T17:21:41Z" Dec 03 06:46:37 crc kubenswrapper[4475]: I1203 06:46:37.837068 4475 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-tjbzg" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"91aee7be-4a52-4598-803f-2deebe0674de\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f13f644093fd1214d8fb39853857b4113dd7fde64f1a60ff6848fd4c5350f5b6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:45:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xvqvg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://159d103ae2d5d19ea94c57a59b534773f0e32f4c
b379a412b63ca743e221096e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:45:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xvqvg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T06:45:35Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-tjbzg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:46:37Z is after 2025-08-24T17:21:41Z" Dec 03 06:46:37 crc kubenswrapper[4475]: I1203 06:46:37.844286 4475 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-sbkp5" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b1df0a77-f3cc-49ab-9fbb-8a4c7608291b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5938dd3c72bee55a3a07312d31a0eaf2df226bb931b07300d71b6e7ff69c905b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:45:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-65wzb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4408ad7b7f122c0364b95e0e9761bc28dfb02
e7ea00537a70fc031c16b38be6b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:45:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-65wzb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T06:45:46Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-sbkp5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:46:37Z is after 2025-08-24T17:21:41Z" Dec 03 06:46:37 crc kubenswrapper[4475]: I1203 06:46:37.857486 4475 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"897f1a97-930a-4c3c-8804-d7cd6006ae9c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fbb015d3e05f9f94fc225cce6e24bc4a5df0bfc5aaea15fe120e2cc4b8f02902\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:45:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://da747a5ea4f790c71d99693c4bd79a1074f756a20f628fa63e8bad9a713645fc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:45:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6bf56315b6ad05ea9af0319db29b919ed0332d2a671c5ba94ea325bd45ef5703\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:45:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e045b99328661616ea0e44cd50bd394a403836eede05459d117567c191401172\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:45:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://054b1d2565cc9690152740f71682028595283525344a38ccea66c1f072eae92b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:45:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d5e0ad88c2e55994f952b46c2e806792d8fcbd79a901810aef92e46067cc7b92\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d5e0ad88c2e55994f952b46c2e806792d8fcbd79a901810aef92e46067cc7b92\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2025-12-03T06:45:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T06:45:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://22796f78d3d551f1ee271ca8581e196f142e70622944154f7d408a88c098f53b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://22796f78d3d551f1ee271ca8581e196f142e70622944154f7d408a88c098f53b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T06:45:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T06:45:16Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://9fb973559072f07252dcf50bda74d422ea2ed50000c02105381f8d21e5ff9888\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9fb973559072f07252dcf50bda74d422ea2ed50000c02105381f8d21e5ff9888\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T06:45:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T06:45:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T06:45:15Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:46:37Z is after 2025-08-24T17:21:41Z" Dec 03 06:46:37 crc kubenswrapper[4475]: I1203 06:46:37.864200 4475 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-dqbgx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9ef36226-4b8b-4a7b-a87f-daa9dda6e70b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5dc78fa3b07b9a5535f697323e9ed322ceefdc8798157160a05eb71017ac3a29\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235
da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:45:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wjjp4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T06:45:38Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-dqbgx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:46:37Z is after 2025-08-24T17:21:41Z" Dec 03 06:46:37 crc kubenswrapper[4475]: I1203 06:46:37.866661 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:46:37 crc kubenswrapper[4475]: I1203 06:46:37.866691 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:46:37 crc kubenswrapper[4475]: I1203 06:46:37.866701 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:46:37 crc kubenswrapper[4475]: I1203 06:46:37.866719 4475 kubelet_node_status.go:724] "Recording event message 
for node" node="crc" event="NodeNotReady" Dec 03 06:46:37 crc kubenswrapper[4475]: I1203 06:46:37.866727 4475 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:46:37Z","lastTransitionTime":"2025-12-03T06:46:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 06:46:37 crc kubenswrapper[4475]: I1203 06:46:37.876520 4475 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-9b2j8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f3a17c67-95e0-4889-8a30-64c08b6720f4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:46:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:46:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4124e8c8426150d1057ec040dd3bfd12c7def09c85144927fd48515e9e9e9685\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.
0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d2d627e2c307a8db9c86e8020f2b1c25c6e061e0c6460be63e231566488beaca\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-03T06:46:20Z\\\",\\\"message\\\":\\\"2025-12-03T06:45:35+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_e060c49a-fe61-4a85-9c90-496b6bf089f9\\\\n2025-12-03T06:45:35+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_e060c49a-fe61-4a85-9c90-496b6bf089f9 to /host/opt/cni/bin/\\\\n2025-12-03T06:45:35Z [verbose] multus-daemon started\\\\n2025-12-03T06:45:35Z [verbose] Readiness Indicator file check\\\\n2025-12-03T06:46:20Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-03T06:45:35Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:46:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name
\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6pdk6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T06:45:34Z\\\"}}\" for pod \"openshift-multus\"/\"multus-9b2j8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:46:37Z is after 2025-08-24T17:21:41Z" Dec 03 06:46:37 crc kubenswrapper[4475]: I1203 06:46:37.888426 4475 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-g9t4l" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8f42839e-dbc4-445a-a15b-c3aa14813958\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:35Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:35Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://53948489397bbbfdf5f766211088d7f12fcd2dfbc8c3da6493e5abc49e3b41f5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:45:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ppdm2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a5090474cca8b8e2ed539ea74377506638d300be7eb750b3f3285477d8c9a375\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:45:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ppdm2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://60d3ec7cab1f249e81ae1db9ab97fa02e8b3c9d8376af4c6682dc3fc6f9d6d92\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:45:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ppdm2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b3243c863a4fb593b39fc3e3b835f647e9373d8b2dec69c5ff7657ed73c8f78a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:45:36Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ppdm2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://32897756f3658fda95db77180a0553a9d8656ed49c3ae5a017d32f5c5133a5a9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:45:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ppdm2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5e288f95676d5823cd3cb005318489d2f629a8fb74ad17ce6a67978d76006192\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:45:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ppdm2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c3d53d023db886d8a8772c0790104577a7a8914b8cf882b251e44407064c3141\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c3d53d023db886d8a8772c0790104577a7a8914b8cf882b251e44407064c3141\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-03T06:46:36Z\\\",\\\"message\\\":\\\"model_client.go:382] Update operations generated as: [{Op:update Table:Logical_Switch_Port Row:map[addresses:{GoSet:[0a:58:0a:d9:00:3b 10.217.0.59]} options:{GoMap:map[iface-id-ver:9d751cbb-f2e2-430d-9754-c882a5e924a5 requested-chassis:crc]} port_security:{GoSet:[0a:58:0a:d9:00:3b 10.217.0.59]}] Rows:[] Columns:[] Mutations:[] 
Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {960d98b2-dc64-4e93-a4b6-9b19847af71e}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI1203 06:46:36.095528 6471 ovn.go:134] Ensuring zone local for Pod openshift-multus/network-metrics-daemon-hq2rn in node crc\\\\nI1203 06:46:36.095538 6471 base_network_controller_pods.go:477] [default/openshift-multus/network-metrics-daemon-hq2rn] creating logical port openshift-multus_network-metrics-daemon-hq2rn for pod on switch crc\\\\nI1203 06:46:36.095506 6471 base_network_controller_pods.go:477] [default/openshift-network-diagnostics/network-check-target-xd92c] creating logical port openshift-network-diagnostics_network-check-target-xd92c for pod on switch crc\\\\nF1203 06:46:36.095556 6471 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-03T06:46:35Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=ovnkube-controller 
pod=ovnkube-node-g9t4l_openshift-ovn-kubernetes(8f42839e-dbc4-445a-a15b-c3aa14813958)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ppdm2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://66a9c7568957099255bc910496da695e2af0122f2c853c3e221c666d7c2dee78\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:45:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ppdm2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://400610ebcdc7d47ecc1345287847a1909871411a12cdb3cbf895e05039b81c2b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://400610ebcdc7d47ecc
1345287847a1909871411a12cdb3cbf895e05039b81c2b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T06:45:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T06:45:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ppdm2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T06:45:35Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-g9t4l\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:46:37Z is after 2025-08-24T17:21:41Z" Dec 03 06:46:37 crc kubenswrapper[4475]: I1203 06:46:37.896513 4475 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6b24b6a4-c126-4d6d-88ae-b270b4743110\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:46:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:46:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://32947be5ce5a85090284dbc3edd8ad437495db9f0b4a7310656e38ecf5c649de\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:45:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0f26dedc0507a8675c0dc842b67772e84b5276713808e656bcf620ebb7bd3f5c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:45:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d17d8916a080f159b25abbfd9575bdc197c58bf256dbeb6367e74368f5b7f1fc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:45:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b9ea750eb608c854e92aa32dfc7d2085a0c00c3554368c7119487e4a730fdc1c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://b9ea750eb608c854e92aa32dfc7d2085a0c00c3554368c7119487e4a730fdc1c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T06:45:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T06:45:16Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T06:45:15Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:46:37Z is after 2025-08-24T17:21:41Z" Dec 03 06:46:37 crc kubenswrapper[4475]: I1203 06:46:37.905211 4475 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:33Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:33Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2f651c16a4a98ff0a9b4783e60ece4c410d5fcb7d05ad42bf7842d8bb8a99f27\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:45:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-12-03T06:46:37Z is after 2025-08-24T17:21:41Z" Dec 03 06:46:37 crc kubenswrapper[4475]: I1203 06:46:37.912946 4475 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:33Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:33Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://30d9a05de148a1dbe0fa8f07bbc5f4f2c3cba395d686af03f2da63f8cdfe431c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:45:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"c
ri-o://07cf8d993193bca34b30ea77c473af45652fde6e73d0586efb78c14b9d003e22\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:45:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:46:37Z is after 2025-08-24T17:21:41Z" Dec 03 06:46:37 crc kubenswrapper[4475]: I1203 06:46:37.919878 4475 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:34Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:34Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6444fe7571ebb90d4ff4b30dc1a397023310b50b1816d0197cb545b4f5f7480f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:45:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-12-03T06:46:37Z is after 2025-08-24T17:21:41Z" Dec 03 06:46:37 crc kubenswrapper[4475]: I1203 06:46:37.927661 4475 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:32Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:32Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:46:37Z is after 2025-08-24T17:21:41Z" Dec 03 06:46:37 crc kubenswrapper[4475]: I1203 06:46:37.933852 4475 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-pcw7j" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8c1979d0-303c-4cf6-9087-3cb2e1aac73b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://eebaa73cf4e1efd781b258dd26910dc004392716180b14a7e64e89a03f2032a1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:45:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nrf7r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T06:45:34Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-pcw7j\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:46:37Z is after 2025-08-24T17:21:41Z" Dec 03 06:46:37 crc kubenswrapper[4475]: I1203 06:46:37.940297 4475 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-hq2rn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7e9dd470-572a-4396-9be7-1a37e3c48977\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:47Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:45:47Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cg4hm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cg4hm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T06:45:47Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-hq2rn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:46:37Z is after 2025-08-24T17:21:41Z" Dec 03 06:46:37 crc 
kubenswrapper[4475]: I1203 06:46:37.968705 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:46:37 crc kubenswrapper[4475]: I1203 06:46:37.968728 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:46:37 crc kubenswrapper[4475]: I1203 06:46:37.968737 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:46:37 crc kubenswrapper[4475]: I1203 06:46:37.968748 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:46:37 crc kubenswrapper[4475]: I1203 06:46:37.968758 4475 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:46:37Z","lastTransitionTime":"2025-12-03T06:46:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:46:38 crc kubenswrapper[4475]: I1203 06:46:38.071025 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:46:38 crc kubenswrapper[4475]: I1203 06:46:38.071050 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:46:38 crc kubenswrapper[4475]: I1203 06:46:38.071057 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:46:38 crc kubenswrapper[4475]: I1203 06:46:38.071069 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:46:38 crc kubenswrapper[4475]: I1203 06:46:38.071078 4475 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:46:38Z","lastTransitionTime":"2025-12-03T06:46:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:46:38 crc kubenswrapper[4475]: I1203 06:46:38.172793 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:46:38 crc kubenswrapper[4475]: I1203 06:46:38.172824 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:46:38 crc kubenswrapper[4475]: I1203 06:46:38.172832 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:46:38 crc kubenswrapper[4475]: I1203 06:46:38.172843 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:46:38 crc kubenswrapper[4475]: I1203 06:46:38.172851 4475 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:46:38Z","lastTransitionTime":"2025-12-03T06:46:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:46:38 crc kubenswrapper[4475]: I1203 06:46:38.274714 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:46:38 crc kubenswrapper[4475]: I1203 06:46:38.274766 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:46:38 crc kubenswrapper[4475]: I1203 06:46:38.274775 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:46:38 crc kubenswrapper[4475]: I1203 06:46:38.274790 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:46:38 crc kubenswrapper[4475]: I1203 06:46:38.274800 4475 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:46:38Z","lastTransitionTime":"2025-12-03T06:46:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:46:38 crc kubenswrapper[4475]: I1203 06:46:38.376542 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:46:38 crc kubenswrapper[4475]: I1203 06:46:38.376583 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:46:38 crc kubenswrapper[4475]: I1203 06:46:38.376592 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:46:38 crc kubenswrapper[4475]: I1203 06:46:38.376606 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:46:38 crc kubenswrapper[4475]: I1203 06:46:38.376615 4475 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:46:38Z","lastTransitionTime":"2025-12-03T06:46:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:46:38 crc kubenswrapper[4475]: I1203 06:46:38.478671 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:46:38 crc kubenswrapper[4475]: I1203 06:46:38.478721 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:46:38 crc kubenswrapper[4475]: I1203 06:46:38.478731 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:46:38 crc kubenswrapper[4475]: I1203 06:46:38.478746 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:46:38 crc kubenswrapper[4475]: I1203 06:46:38.478755 4475 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:46:38Z","lastTransitionTime":"2025-12-03T06:46:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 06:46:38 crc kubenswrapper[4475]: I1203 06:46:38.491131 4475 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 03 06:46:38 crc kubenswrapper[4475]: I1203 06:46:38.491173 4475 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 03 06:46:38 crc kubenswrapper[4475]: I1203 06:46:38.491191 4475 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 03 06:46:38 crc kubenswrapper[4475]: E1203 06:46:38.491364 4475 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 03 06:46:38 crc kubenswrapper[4475]: E1203 06:46:38.491583 4475 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 03 06:46:38 crc kubenswrapper[4475]: E1203 06:46:38.491677 4475 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 03 06:46:38 crc kubenswrapper[4475]: I1203 06:46:38.580074 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:46:38 crc kubenswrapper[4475]: I1203 06:46:38.580102 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:46:38 crc kubenswrapper[4475]: I1203 06:46:38.580113 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:46:38 crc kubenswrapper[4475]: I1203 06:46:38.580125 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:46:38 crc kubenswrapper[4475]: I1203 06:46:38.580139 4475 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:46:38Z","lastTransitionTime":"2025-12-03T06:46:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:46:38 crc kubenswrapper[4475]: I1203 06:46:38.681881 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:46:38 crc kubenswrapper[4475]: I1203 06:46:38.682015 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:46:38 crc kubenswrapper[4475]: I1203 06:46:38.682073 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:46:38 crc kubenswrapper[4475]: I1203 06:46:38.682133 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:46:38 crc kubenswrapper[4475]: I1203 06:46:38.682185 4475 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:46:38Z","lastTransitionTime":"2025-12-03T06:46:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:46:38 crc kubenswrapper[4475]: I1203 06:46:38.783550 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:46:38 crc kubenswrapper[4475]: I1203 06:46:38.783590 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:46:38 crc kubenswrapper[4475]: I1203 06:46:38.783600 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:46:38 crc kubenswrapper[4475]: I1203 06:46:38.783612 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:46:38 crc kubenswrapper[4475]: I1203 06:46:38.783622 4475 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:46:38Z","lastTransitionTime":"2025-12-03T06:46:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:46:38 crc kubenswrapper[4475]: I1203 06:46:38.885627 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:46:38 crc kubenswrapper[4475]: I1203 06:46:38.885662 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:46:38 crc kubenswrapper[4475]: I1203 06:46:38.885672 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:46:38 crc kubenswrapper[4475]: I1203 06:46:38.885685 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:46:38 crc kubenswrapper[4475]: I1203 06:46:38.885696 4475 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:46:38Z","lastTransitionTime":"2025-12-03T06:46:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:46:38 crc kubenswrapper[4475]: I1203 06:46:38.987902 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:46:38 crc kubenswrapper[4475]: I1203 06:46:38.987935 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:46:38 crc kubenswrapper[4475]: I1203 06:46:38.987945 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:46:38 crc kubenswrapper[4475]: I1203 06:46:38.987959 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:46:38 crc kubenswrapper[4475]: I1203 06:46:38.987967 4475 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:46:38Z","lastTransitionTime":"2025-12-03T06:46:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:46:39 crc kubenswrapper[4475]: I1203 06:46:39.089960 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:46:39 crc kubenswrapper[4475]: I1203 06:46:39.089999 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:46:39 crc kubenswrapper[4475]: I1203 06:46:39.090010 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:46:39 crc kubenswrapper[4475]: I1203 06:46:39.090023 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:46:39 crc kubenswrapper[4475]: I1203 06:46:39.090032 4475 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:46:39Z","lastTransitionTime":"2025-12-03T06:46:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:46:39 crc kubenswrapper[4475]: I1203 06:46:39.191686 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:46:39 crc kubenswrapper[4475]: I1203 06:46:39.191711 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:46:39 crc kubenswrapper[4475]: I1203 06:46:39.191721 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:46:39 crc kubenswrapper[4475]: I1203 06:46:39.191731 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:46:39 crc kubenswrapper[4475]: I1203 06:46:39.191738 4475 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:46:39Z","lastTransitionTime":"2025-12-03T06:46:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:46:39 crc kubenswrapper[4475]: I1203 06:46:39.293518 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:46:39 crc kubenswrapper[4475]: I1203 06:46:39.293593 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:46:39 crc kubenswrapper[4475]: I1203 06:46:39.293603 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:46:39 crc kubenswrapper[4475]: I1203 06:46:39.293615 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:46:39 crc kubenswrapper[4475]: I1203 06:46:39.293623 4475 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:46:39Z","lastTransitionTime":"2025-12-03T06:46:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:46:39 crc kubenswrapper[4475]: I1203 06:46:39.395604 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:46:39 crc kubenswrapper[4475]: I1203 06:46:39.395632 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:46:39 crc kubenswrapper[4475]: I1203 06:46:39.395640 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:46:39 crc kubenswrapper[4475]: I1203 06:46:39.395650 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:46:39 crc kubenswrapper[4475]: I1203 06:46:39.395657 4475 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:46:39Z","lastTransitionTime":"2025-12-03T06:46:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 06:46:39 crc kubenswrapper[4475]: I1203 06:46:39.490908 4475 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-hq2rn" Dec 03 06:46:39 crc kubenswrapper[4475]: E1203 06:46:39.491031 4475 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-hq2rn" podUID="7e9dd470-572a-4396-9be7-1a37e3c48977" Dec 03 06:46:39 crc kubenswrapper[4475]: I1203 06:46:39.496567 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:46:39 crc kubenswrapper[4475]: I1203 06:46:39.496605 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:46:39 crc kubenswrapper[4475]: I1203 06:46:39.496615 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:46:39 crc kubenswrapper[4475]: I1203 06:46:39.496625 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:46:39 crc kubenswrapper[4475]: I1203 06:46:39.496633 4475 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:46:39Z","lastTransitionTime":"2025-12-03T06:46:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:46:39 crc kubenswrapper[4475]: I1203 06:46:39.499037 4475 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/kube-rbac-proxy-crio-crc"] Dec 03 06:46:39 crc kubenswrapper[4475]: I1203 06:46:39.597865 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:46:39 crc kubenswrapper[4475]: I1203 06:46:39.597885 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:46:39 crc kubenswrapper[4475]: I1203 06:46:39.597892 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:46:39 crc kubenswrapper[4475]: I1203 06:46:39.597901 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:46:39 crc kubenswrapper[4475]: I1203 06:46:39.597909 4475 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:46:39Z","lastTransitionTime":"2025-12-03T06:46:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:46:39 crc kubenswrapper[4475]: I1203 06:46:39.699626 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:46:39 crc kubenswrapper[4475]: I1203 06:46:39.699663 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:46:39 crc kubenswrapper[4475]: I1203 06:46:39.699673 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:46:39 crc kubenswrapper[4475]: I1203 06:46:39.699685 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:46:39 crc kubenswrapper[4475]: I1203 06:46:39.699694 4475 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:46:39Z","lastTransitionTime":"2025-12-03T06:46:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:46:39 crc kubenswrapper[4475]: I1203 06:46:39.801419 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:46:39 crc kubenswrapper[4475]: I1203 06:46:39.801480 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:46:39 crc kubenswrapper[4475]: I1203 06:46:39.801491 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:46:39 crc kubenswrapper[4475]: I1203 06:46:39.801504 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:46:39 crc kubenswrapper[4475]: I1203 06:46:39.801515 4475 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:46:39Z","lastTransitionTime":"2025-12-03T06:46:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:46:39 crc kubenswrapper[4475]: I1203 06:46:39.903556 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:46:39 crc kubenswrapper[4475]: I1203 06:46:39.903596 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:46:39 crc kubenswrapper[4475]: I1203 06:46:39.903605 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:46:39 crc kubenswrapper[4475]: I1203 06:46:39.903616 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:46:39 crc kubenswrapper[4475]: I1203 06:46:39.903627 4475 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:46:39Z","lastTransitionTime":"2025-12-03T06:46:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:46:40 crc kubenswrapper[4475]: I1203 06:46:40.005590 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:46:40 crc kubenswrapper[4475]: I1203 06:46:40.005748 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:46:40 crc kubenswrapper[4475]: I1203 06:46:40.005768 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:46:40 crc kubenswrapper[4475]: I1203 06:46:40.005780 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:46:40 crc kubenswrapper[4475]: I1203 06:46:40.005796 4475 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:46:40Z","lastTransitionTime":"2025-12-03T06:46:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:46:40 crc kubenswrapper[4475]: I1203 06:46:40.108011 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:46:40 crc kubenswrapper[4475]: I1203 06:46:40.108044 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:46:40 crc kubenswrapper[4475]: I1203 06:46:40.108052 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:46:40 crc kubenswrapper[4475]: I1203 06:46:40.108065 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:46:40 crc kubenswrapper[4475]: I1203 06:46:40.108074 4475 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:46:40Z","lastTransitionTime":"2025-12-03T06:46:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:46:40 crc kubenswrapper[4475]: I1203 06:46:40.209825 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:46:40 crc kubenswrapper[4475]: I1203 06:46:40.209860 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:46:40 crc kubenswrapper[4475]: I1203 06:46:40.209868 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:46:40 crc kubenswrapper[4475]: I1203 06:46:40.209882 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:46:40 crc kubenswrapper[4475]: I1203 06:46:40.209891 4475 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:46:40Z","lastTransitionTime":"2025-12-03T06:46:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:46:40 crc kubenswrapper[4475]: I1203 06:46:40.311539 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:46:40 crc kubenswrapper[4475]: I1203 06:46:40.311573 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:46:40 crc kubenswrapper[4475]: I1203 06:46:40.311593 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:46:40 crc kubenswrapper[4475]: I1203 06:46:40.311609 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:46:40 crc kubenswrapper[4475]: I1203 06:46:40.311617 4475 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:46:40Z","lastTransitionTime":"2025-12-03T06:46:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:46:40 crc kubenswrapper[4475]: I1203 06:46:40.413089 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:46:40 crc kubenswrapper[4475]: I1203 06:46:40.413122 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:46:40 crc kubenswrapper[4475]: I1203 06:46:40.413131 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:46:40 crc kubenswrapper[4475]: I1203 06:46:40.413142 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:46:40 crc kubenswrapper[4475]: I1203 06:46:40.413151 4475 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:46:40Z","lastTransitionTime":"2025-12-03T06:46:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 06:46:40 crc kubenswrapper[4475]: I1203 06:46:40.490773 4475 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 03 06:46:40 crc kubenswrapper[4475]: I1203 06:46:40.490858 4475 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 03 06:46:40 crc kubenswrapper[4475]: I1203 06:46:40.491006 4475 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 03 06:46:40 crc kubenswrapper[4475]: E1203 06:46:40.491081 4475 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 03 06:46:40 crc kubenswrapper[4475]: E1203 06:46:40.491259 4475 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 03 06:46:40 crc kubenswrapper[4475]: E1203 06:46:40.491306 4475 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 03 06:46:40 crc kubenswrapper[4475]: I1203 06:46:40.514622 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:46:40 crc kubenswrapper[4475]: I1203 06:46:40.514668 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:46:40 crc kubenswrapper[4475]: I1203 06:46:40.514677 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:46:40 crc kubenswrapper[4475]: I1203 06:46:40.514690 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:46:40 crc kubenswrapper[4475]: I1203 06:46:40.514698 4475 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:46:40Z","lastTransitionTime":"2025-12-03T06:46:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:46:40 crc kubenswrapper[4475]: I1203 06:46:40.616689 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:46:40 crc kubenswrapper[4475]: I1203 06:46:40.616720 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:46:40 crc kubenswrapper[4475]: I1203 06:46:40.616729 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:46:40 crc kubenswrapper[4475]: I1203 06:46:40.616741 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:46:40 crc kubenswrapper[4475]: I1203 06:46:40.616749 4475 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:46:40Z","lastTransitionTime":"2025-12-03T06:46:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:46:40 crc kubenswrapper[4475]: I1203 06:46:40.718519 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:46:40 crc kubenswrapper[4475]: I1203 06:46:40.718544 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:46:40 crc kubenswrapper[4475]: I1203 06:46:40.718551 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:46:40 crc kubenswrapper[4475]: I1203 06:46:40.718563 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:46:40 crc kubenswrapper[4475]: I1203 06:46:40.718570 4475 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:46:40Z","lastTransitionTime":"2025-12-03T06:46:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:46:40 crc kubenswrapper[4475]: I1203 06:46:40.820377 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:46:40 crc kubenswrapper[4475]: I1203 06:46:40.820401 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:46:40 crc kubenswrapper[4475]: I1203 06:46:40.820409 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:46:40 crc kubenswrapper[4475]: I1203 06:46:40.820420 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:46:40 crc kubenswrapper[4475]: I1203 06:46:40.820427 4475 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:46:40Z","lastTransitionTime":"2025-12-03T06:46:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:46:40 crc kubenswrapper[4475]: I1203 06:46:40.922152 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:46:40 crc kubenswrapper[4475]: I1203 06:46:40.922175 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:46:40 crc kubenswrapper[4475]: I1203 06:46:40.922183 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:46:40 crc kubenswrapper[4475]: I1203 06:46:40.922193 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:46:40 crc kubenswrapper[4475]: I1203 06:46:40.922201 4475 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:46:40Z","lastTransitionTime":"2025-12-03T06:46:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:46:41 crc kubenswrapper[4475]: I1203 06:46:41.024098 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:46:41 crc kubenswrapper[4475]: I1203 06:46:41.024127 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:46:41 crc kubenswrapper[4475]: I1203 06:46:41.024135 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:46:41 crc kubenswrapper[4475]: I1203 06:46:41.024145 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:46:41 crc kubenswrapper[4475]: I1203 06:46:41.024153 4475 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:46:41Z","lastTransitionTime":"2025-12-03T06:46:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:46:41 crc kubenswrapper[4475]: I1203 06:46:41.125750 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:46:41 crc kubenswrapper[4475]: I1203 06:46:41.125776 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:46:41 crc kubenswrapper[4475]: I1203 06:46:41.125784 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:46:41 crc kubenswrapper[4475]: I1203 06:46:41.125795 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:46:41 crc kubenswrapper[4475]: I1203 06:46:41.125804 4475 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:46:41Z","lastTransitionTime":"2025-12-03T06:46:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:46:41 crc kubenswrapper[4475]: I1203 06:46:41.227379 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:46:41 crc kubenswrapper[4475]: I1203 06:46:41.227405 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:46:41 crc kubenswrapper[4475]: I1203 06:46:41.227413 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:46:41 crc kubenswrapper[4475]: I1203 06:46:41.227422 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:46:41 crc kubenswrapper[4475]: I1203 06:46:41.227430 4475 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:46:41Z","lastTransitionTime":"2025-12-03T06:46:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:46:41 crc kubenswrapper[4475]: I1203 06:46:41.329198 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:46:41 crc kubenswrapper[4475]: I1203 06:46:41.329219 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:46:41 crc kubenswrapper[4475]: I1203 06:46:41.329226 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:46:41 crc kubenswrapper[4475]: I1203 06:46:41.329235 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:46:41 crc kubenswrapper[4475]: I1203 06:46:41.329242 4475 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:46:41Z","lastTransitionTime":"2025-12-03T06:46:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:46:41 crc kubenswrapper[4475]: I1203 06:46:41.431009 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:46:41 crc kubenswrapper[4475]: I1203 06:46:41.431042 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:46:41 crc kubenswrapper[4475]: I1203 06:46:41.431050 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:46:41 crc kubenswrapper[4475]: I1203 06:46:41.431064 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:46:41 crc kubenswrapper[4475]: I1203 06:46:41.431072 4475 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:46:41Z","lastTransitionTime":"2025-12-03T06:46:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 06:46:41 crc kubenswrapper[4475]: I1203 06:46:41.490615 4475 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-hq2rn" Dec 03 06:46:41 crc kubenswrapper[4475]: E1203 06:46:41.490709 4475 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-hq2rn" podUID="7e9dd470-572a-4396-9be7-1a37e3c48977" Dec 03 06:46:41 crc kubenswrapper[4475]: I1203 06:46:41.532893 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:46:41 crc kubenswrapper[4475]: I1203 06:46:41.532917 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:46:41 crc kubenswrapper[4475]: I1203 06:46:41.532925 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:46:41 crc kubenswrapper[4475]: I1203 06:46:41.532934 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:46:41 crc kubenswrapper[4475]: I1203 06:46:41.532941 4475 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:46:41Z","lastTransitionTime":"2025-12-03T06:46:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:46:41 crc kubenswrapper[4475]: I1203 06:46:41.634333 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:46:41 crc kubenswrapper[4475]: I1203 06:46:41.634365 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:46:41 crc kubenswrapper[4475]: I1203 06:46:41.634374 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:46:41 crc kubenswrapper[4475]: I1203 06:46:41.634388 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:46:41 crc kubenswrapper[4475]: I1203 06:46:41.634396 4475 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:46:41Z","lastTransitionTime":"2025-12-03T06:46:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:46:41 crc kubenswrapper[4475]: I1203 06:46:41.736543 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:46:41 crc kubenswrapper[4475]: I1203 06:46:41.736573 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:46:41 crc kubenswrapper[4475]: I1203 06:46:41.736592 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:46:41 crc kubenswrapper[4475]: I1203 06:46:41.736605 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:46:41 crc kubenswrapper[4475]: I1203 06:46:41.736613 4475 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:46:41Z","lastTransitionTime":"2025-12-03T06:46:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:46:41 crc kubenswrapper[4475]: I1203 06:46:41.838631 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:46:41 crc kubenswrapper[4475]: I1203 06:46:41.838664 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:46:41 crc kubenswrapper[4475]: I1203 06:46:41.838672 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:46:41 crc kubenswrapper[4475]: I1203 06:46:41.838683 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:46:41 crc kubenswrapper[4475]: I1203 06:46:41.838690 4475 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:46:41Z","lastTransitionTime":"2025-12-03T06:46:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:46:41 crc kubenswrapper[4475]: I1203 06:46:41.940370 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:46:41 crc kubenswrapper[4475]: I1203 06:46:41.940406 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:46:41 crc kubenswrapper[4475]: I1203 06:46:41.940415 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:46:41 crc kubenswrapper[4475]: I1203 06:46:41.940428 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:46:41 crc kubenswrapper[4475]: I1203 06:46:41.940437 4475 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:46:41Z","lastTransitionTime":"2025-12-03T06:46:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:46:42 crc kubenswrapper[4475]: I1203 06:46:42.041899 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:46:42 crc kubenswrapper[4475]: I1203 06:46:42.041930 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:46:42 crc kubenswrapper[4475]: I1203 06:46:42.041940 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:46:42 crc kubenswrapper[4475]: I1203 06:46:42.041951 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:46:42 crc kubenswrapper[4475]: I1203 06:46:42.041960 4475 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:46:42Z","lastTransitionTime":"2025-12-03T06:46:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:46:42 crc kubenswrapper[4475]: I1203 06:46:42.144171 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:46:42 crc kubenswrapper[4475]: I1203 06:46:42.144198 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:46:42 crc kubenswrapper[4475]: I1203 06:46:42.144206 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:46:42 crc kubenswrapper[4475]: I1203 06:46:42.144217 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:46:42 crc kubenswrapper[4475]: I1203 06:46:42.144225 4475 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:46:42Z","lastTransitionTime":"2025-12-03T06:46:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:46:42 crc kubenswrapper[4475]: I1203 06:46:42.246360 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:46:42 crc kubenswrapper[4475]: I1203 06:46:42.246386 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:46:42 crc kubenswrapper[4475]: I1203 06:46:42.246394 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:46:42 crc kubenswrapper[4475]: I1203 06:46:42.246405 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:46:42 crc kubenswrapper[4475]: I1203 06:46:42.246413 4475 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:46:42Z","lastTransitionTime":"2025-12-03T06:46:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:46:42 crc kubenswrapper[4475]: I1203 06:46:42.347942 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:46:42 crc kubenswrapper[4475]: I1203 06:46:42.347971 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:46:42 crc kubenswrapper[4475]: I1203 06:46:42.347980 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:46:42 crc kubenswrapper[4475]: I1203 06:46:42.347992 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:46:42 crc kubenswrapper[4475]: I1203 06:46:42.348000 4475 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:46:42Z","lastTransitionTime":"2025-12-03T06:46:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:46:42 crc kubenswrapper[4475]: I1203 06:46:42.450013 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:46:42 crc kubenswrapper[4475]: I1203 06:46:42.450041 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:46:42 crc kubenswrapper[4475]: I1203 06:46:42.450055 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:46:42 crc kubenswrapper[4475]: I1203 06:46:42.450066 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:46:42 crc kubenswrapper[4475]: I1203 06:46:42.450074 4475 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:46:42Z","lastTransitionTime":"2025-12-03T06:46:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 06:46:42 crc kubenswrapper[4475]: I1203 06:46:42.490442 4475 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 03 06:46:42 crc kubenswrapper[4475]: I1203 06:46:42.490505 4475 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 03 06:46:42 crc kubenswrapper[4475]: E1203 06:46:42.490542 4475 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 03 06:46:42 crc kubenswrapper[4475]: I1203 06:46:42.490442 4475 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 03 06:46:42 crc kubenswrapper[4475]: E1203 06:46:42.490577 4475 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 03 06:46:42 crc kubenswrapper[4475]: E1203 06:46:42.490607 4475 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 03 06:46:42 crc kubenswrapper[4475]: I1203 06:46:42.551667 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:46:42 crc kubenswrapper[4475]: I1203 06:46:42.551708 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:46:42 crc kubenswrapper[4475]: I1203 06:46:42.551718 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:46:42 crc kubenswrapper[4475]: I1203 06:46:42.551732 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:46:42 crc kubenswrapper[4475]: I1203 06:46:42.551747 4475 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:46:42Z","lastTransitionTime":"2025-12-03T06:46:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:46:42 crc kubenswrapper[4475]: I1203 06:46:42.653526 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:46:42 crc kubenswrapper[4475]: I1203 06:46:42.653567 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:46:42 crc kubenswrapper[4475]: I1203 06:46:42.653577 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:46:42 crc kubenswrapper[4475]: I1203 06:46:42.653601 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:46:42 crc kubenswrapper[4475]: I1203 06:46:42.653610 4475 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:46:42Z","lastTransitionTime":"2025-12-03T06:46:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:46:42 crc kubenswrapper[4475]: I1203 06:46:42.756238 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:46:42 crc kubenswrapper[4475]: I1203 06:46:42.756279 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:46:42 crc kubenswrapper[4475]: I1203 06:46:42.756289 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:46:42 crc kubenswrapper[4475]: I1203 06:46:42.756304 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:46:42 crc kubenswrapper[4475]: I1203 06:46:42.756311 4475 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:46:42Z","lastTransitionTime":"2025-12-03T06:46:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:46:42 crc kubenswrapper[4475]: I1203 06:46:42.858492 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:46:42 crc kubenswrapper[4475]: I1203 06:46:42.858519 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:46:42 crc kubenswrapper[4475]: I1203 06:46:42.858528 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:46:42 crc kubenswrapper[4475]: I1203 06:46:42.858538 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:46:42 crc kubenswrapper[4475]: I1203 06:46:42.858546 4475 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:46:42Z","lastTransitionTime":"2025-12-03T06:46:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:46:42 crc kubenswrapper[4475]: I1203 06:46:42.960146 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:46:42 crc kubenswrapper[4475]: I1203 06:46:42.960170 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:46:42 crc kubenswrapper[4475]: I1203 06:46:42.960177 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:46:42 crc kubenswrapper[4475]: I1203 06:46:42.960189 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:46:42 crc kubenswrapper[4475]: I1203 06:46:42.960197 4475 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:46:42Z","lastTransitionTime":"2025-12-03T06:46:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:46:43 crc kubenswrapper[4475]: I1203 06:46:43.061872 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:46:43 crc kubenswrapper[4475]: I1203 06:46:43.061893 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:46:43 crc kubenswrapper[4475]: I1203 06:46:43.061901 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:46:43 crc kubenswrapper[4475]: I1203 06:46:43.061913 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:46:43 crc kubenswrapper[4475]: I1203 06:46:43.061920 4475 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:46:43Z","lastTransitionTime":"2025-12-03T06:46:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:46:43 crc kubenswrapper[4475]: I1203 06:46:43.164205 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:46:43 crc kubenswrapper[4475]: I1203 06:46:43.164229 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:46:43 crc kubenswrapper[4475]: I1203 06:46:43.164237 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:46:43 crc kubenswrapper[4475]: I1203 06:46:43.164246 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:46:43 crc kubenswrapper[4475]: I1203 06:46:43.164253 4475 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:46:43Z","lastTransitionTime":"2025-12-03T06:46:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:46:43 crc kubenswrapper[4475]: I1203 06:46:43.265841 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:46:43 crc kubenswrapper[4475]: I1203 06:46:43.265860 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:46:43 crc kubenswrapper[4475]: I1203 06:46:43.265868 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:46:43 crc kubenswrapper[4475]: I1203 06:46:43.265876 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:46:43 crc kubenswrapper[4475]: I1203 06:46:43.265883 4475 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:46:43Z","lastTransitionTime":"2025-12-03T06:46:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:46:43 crc kubenswrapper[4475]: I1203 06:46:43.367073 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:46:43 crc kubenswrapper[4475]: I1203 06:46:43.367096 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:46:43 crc kubenswrapper[4475]: I1203 06:46:43.367104 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:46:43 crc kubenswrapper[4475]: I1203 06:46:43.367114 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:46:43 crc kubenswrapper[4475]: I1203 06:46:43.367121 4475 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:46:43Z","lastTransitionTime":"2025-12-03T06:46:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:46:43 crc kubenswrapper[4475]: I1203 06:46:43.469137 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:46:43 crc kubenswrapper[4475]: I1203 06:46:43.469161 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:46:43 crc kubenswrapper[4475]: I1203 06:46:43.469188 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:46:43 crc kubenswrapper[4475]: I1203 06:46:43.469198 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:46:43 crc kubenswrapper[4475]: I1203 06:46:43.469206 4475 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:46:43Z","lastTransitionTime":"2025-12-03T06:46:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 06:46:43 crc kubenswrapper[4475]: I1203 06:46:43.490554 4475 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-hq2rn" Dec 03 06:46:43 crc kubenswrapper[4475]: E1203 06:46:43.490697 4475 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-hq2rn" podUID="7e9dd470-572a-4396-9be7-1a37e3c48977" Dec 03 06:46:43 crc kubenswrapper[4475]: I1203 06:46:43.570884 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:46:43 crc kubenswrapper[4475]: I1203 06:46:43.570906 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:46:43 crc kubenswrapper[4475]: I1203 06:46:43.570913 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:46:43 crc kubenswrapper[4475]: I1203 06:46:43.570922 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:46:43 crc kubenswrapper[4475]: I1203 06:46:43.570929 4475 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:46:43Z","lastTransitionTime":"2025-12-03T06:46:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:46:43 crc kubenswrapper[4475]: I1203 06:46:43.672389 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:46:43 crc kubenswrapper[4475]: I1203 06:46:43.672576 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:46:43 crc kubenswrapper[4475]: I1203 06:46:43.672584 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:46:43 crc kubenswrapper[4475]: I1203 06:46:43.672603 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:46:43 crc kubenswrapper[4475]: I1203 06:46:43.672611 4475 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:46:43Z","lastTransitionTime":"2025-12-03T06:46:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:46:43 crc kubenswrapper[4475]: I1203 06:46:43.774627 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:46:43 crc kubenswrapper[4475]: I1203 06:46:43.774651 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:46:43 crc kubenswrapper[4475]: I1203 06:46:43.774659 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:46:43 crc kubenswrapper[4475]: I1203 06:46:43.774668 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:46:43 crc kubenswrapper[4475]: I1203 06:46:43.774675 4475 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:46:43Z","lastTransitionTime":"2025-12-03T06:46:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:46:43 crc kubenswrapper[4475]: I1203 06:46:43.876357 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:46:43 crc kubenswrapper[4475]: I1203 06:46:43.876391 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:46:43 crc kubenswrapper[4475]: I1203 06:46:43.876399 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:46:43 crc kubenswrapper[4475]: I1203 06:46:43.876413 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:46:43 crc kubenswrapper[4475]: I1203 06:46:43.876423 4475 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:46:43Z","lastTransitionTime":"2025-12-03T06:46:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:46:43 crc kubenswrapper[4475]: I1203 06:46:43.978150 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:46:43 crc kubenswrapper[4475]: I1203 06:46:43.978177 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:46:43 crc kubenswrapper[4475]: I1203 06:46:43.978185 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:46:43 crc kubenswrapper[4475]: I1203 06:46:43.978212 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:46:43 crc kubenswrapper[4475]: I1203 06:46:43.978219 4475 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:46:43Z","lastTransitionTime":"2025-12-03T06:46:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:46:44 crc kubenswrapper[4475]: I1203 06:46:44.079767 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:46:44 crc kubenswrapper[4475]: I1203 06:46:44.079803 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:46:44 crc kubenswrapper[4475]: I1203 06:46:44.079811 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:46:44 crc kubenswrapper[4475]: I1203 06:46:44.079825 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:46:44 crc kubenswrapper[4475]: I1203 06:46:44.079833 4475 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:46:44Z","lastTransitionTime":"2025-12-03T06:46:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:46:44 crc kubenswrapper[4475]: I1203 06:46:44.181308 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:46:44 crc kubenswrapper[4475]: I1203 06:46:44.181336 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:46:44 crc kubenswrapper[4475]: I1203 06:46:44.181343 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:46:44 crc kubenswrapper[4475]: I1203 06:46:44.181353 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:46:44 crc kubenswrapper[4475]: I1203 06:46:44.181360 4475 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:46:44Z","lastTransitionTime":"2025-12-03T06:46:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:46:44 crc kubenswrapper[4475]: I1203 06:46:44.282677 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:46:44 crc kubenswrapper[4475]: I1203 06:46:44.282717 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:46:44 crc kubenswrapper[4475]: I1203 06:46:44.282728 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:46:44 crc kubenswrapper[4475]: I1203 06:46:44.282742 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:46:44 crc kubenswrapper[4475]: I1203 06:46:44.282753 4475 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:46:44Z","lastTransitionTime":"2025-12-03T06:46:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:46:44 crc kubenswrapper[4475]: I1203 06:46:44.384395 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:46:44 crc kubenswrapper[4475]: I1203 06:46:44.384426 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:46:44 crc kubenswrapper[4475]: I1203 06:46:44.384435 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:46:44 crc kubenswrapper[4475]: I1203 06:46:44.384447 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:46:44 crc kubenswrapper[4475]: I1203 06:46:44.384477 4475 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:46:44Z","lastTransitionTime":"2025-12-03T06:46:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:46:44 crc kubenswrapper[4475]: I1203 06:46:44.486482 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:46:44 crc kubenswrapper[4475]: I1203 06:46:44.486530 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:46:44 crc kubenswrapper[4475]: I1203 06:46:44.486539 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:46:44 crc kubenswrapper[4475]: I1203 06:46:44.486550 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:46:44 crc kubenswrapper[4475]: I1203 06:46:44.486558 4475 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:46:44Z","lastTransitionTime":"2025-12-03T06:46:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 06:46:44 crc kubenswrapper[4475]: I1203 06:46:44.490708 4475 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 03 06:46:44 crc kubenswrapper[4475]: E1203 06:46:44.490797 4475 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 03 06:46:44 crc kubenswrapper[4475]: I1203 06:46:44.490807 4475 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 03 06:46:44 crc kubenswrapper[4475]: I1203 06:46:44.490830 4475 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 03 06:46:44 crc kubenswrapper[4475]: E1203 06:46:44.490872 4475 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 03 06:46:44 crc kubenswrapper[4475]: E1203 06:46:44.491020 4475 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 03 06:46:44 crc kubenswrapper[4475]: I1203 06:46:44.589898 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:46:44 crc kubenswrapper[4475]: I1203 06:46:44.589945 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:46:44 crc kubenswrapper[4475]: I1203 06:46:44.589955 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:46:44 crc kubenswrapper[4475]: I1203 06:46:44.589966 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:46:44 crc kubenswrapper[4475]: I1203 06:46:44.589973 4475 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:46:44Z","lastTransitionTime":"2025-12-03T06:46:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:46:44 crc kubenswrapper[4475]: I1203 06:46:44.692270 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:46:44 crc kubenswrapper[4475]: I1203 06:46:44.692299 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:46:44 crc kubenswrapper[4475]: I1203 06:46:44.692308 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:46:44 crc kubenswrapper[4475]: I1203 06:46:44.692317 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:46:44 crc kubenswrapper[4475]: I1203 06:46:44.692359 4475 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:46:44Z","lastTransitionTime":"2025-12-03T06:46:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:46:44 crc kubenswrapper[4475]: I1203 06:46:44.793990 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:46:44 crc kubenswrapper[4475]: I1203 06:46:44.794021 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:46:44 crc kubenswrapper[4475]: I1203 06:46:44.794031 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:46:44 crc kubenswrapper[4475]: I1203 06:46:44.794061 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:46:44 crc kubenswrapper[4475]: I1203 06:46:44.794072 4475 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:46:44Z","lastTransitionTime":"2025-12-03T06:46:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:46:44 crc kubenswrapper[4475]: I1203 06:46:44.896200 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:46:44 crc kubenswrapper[4475]: I1203 06:46:44.896220 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:46:44 crc kubenswrapper[4475]: I1203 06:46:44.896227 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:46:44 crc kubenswrapper[4475]: I1203 06:46:44.896235 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:46:44 crc kubenswrapper[4475]: I1203 06:46:44.896242 4475 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:46:44Z","lastTransitionTime":"2025-12-03T06:46:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:46:44 crc kubenswrapper[4475]: I1203 06:46:44.997193 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:46:44 crc kubenswrapper[4475]: I1203 06:46:44.997218 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:46:44 crc kubenswrapper[4475]: I1203 06:46:44.997226 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:46:44 crc kubenswrapper[4475]: I1203 06:46:44.997236 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:46:44 crc kubenswrapper[4475]: I1203 06:46:44.997243 4475 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:46:44Z","lastTransitionTime":"2025-12-03T06:46:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:46:45 crc kubenswrapper[4475]: I1203 06:46:45.099122 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:46:45 crc kubenswrapper[4475]: I1203 06:46:45.099192 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:46:45 crc kubenswrapper[4475]: I1203 06:46:45.099203 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:46:45 crc kubenswrapper[4475]: I1203 06:46:45.099216 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:46:45 crc kubenswrapper[4475]: I1203 06:46:45.099224 4475 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:46:45Z","lastTransitionTime":"2025-12-03T06:46:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:46:45 crc kubenswrapper[4475]: I1203 06:46:45.201204 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:46:45 crc kubenswrapper[4475]: I1203 06:46:45.201237 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:46:45 crc kubenswrapper[4475]: I1203 06:46:45.201245 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:46:45 crc kubenswrapper[4475]: I1203 06:46:45.201261 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:46:45 crc kubenswrapper[4475]: I1203 06:46:45.201270 4475 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:46:45Z","lastTransitionTime":"2025-12-03T06:46:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:46:45 crc kubenswrapper[4475]: I1203 06:46:45.302762 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:46:45 crc kubenswrapper[4475]: I1203 06:46:45.302789 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:46:45 crc kubenswrapper[4475]: I1203 06:46:45.302800 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:46:45 crc kubenswrapper[4475]: I1203 06:46:45.302814 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:46:45 crc kubenswrapper[4475]: I1203 06:46:45.302825 4475 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:46:45Z","lastTransitionTime":"2025-12-03T06:46:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:46:45 crc kubenswrapper[4475]: I1203 06:46:45.405349 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:46:45 crc kubenswrapper[4475]: I1203 06:46:45.405378 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:46:45 crc kubenswrapper[4475]: I1203 06:46:45.405386 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:46:45 crc kubenswrapper[4475]: I1203 06:46:45.405396 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:46:45 crc kubenswrapper[4475]: I1203 06:46:45.405403 4475 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:46:45Z","lastTransitionTime":"2025-12-03T06:46:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 06:46:45 crc kubenswrapper[4475]: I1203 06:46:45.490216 4475 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-hq2rn" Dec 03 06:46:45 crc kubenswrapper[4475]: E1203 06:46:45.490935 4475 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-hq2rn" podUID="7e9dd470-572a-4396-9be7-1a37e3c48977" Dec 03 06:46:45 crc kubenswrapper[4475]: I1203 06:46:45.504195 4475 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" podStartSLOduration=6.504182812 podStartE2EDuration="6.504182812s" podCreationTimestamp="2025-12-03 06:46:39 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 06:46:45.503782497 +0000 UTC m=+90.308680832" watchObservedRunningTime="2025-12-03 06:46:45.504182812 +0000 UTC m=+90.309081146" Dec 03 06:46:45 crc kubenswrapper[4475]: I1203 06:46:45.506736 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:46:45 crc kubenswrapper[4475]: I1203 06:46:45.506763 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:46:45 crc kubenswrapper[4475]: I1203 06:46:45.506772 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:46:45 crc kubenswrapper[4475]: I1203 06:46:45.506784 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:46:45 crc kubenswrapper[4475]: I1203 06:46:45.506793 4475 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:46:45Z","lastTransitionTime":"2025-12-03T06:46:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:46:45 crc kubenswrapper[4475]: I1203 06:46:45.528466 4475 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-etcd/etcd-crc" podStartSLOduration=70.528431223 podStartE2EDuration="1m10.528431223s" podCreationTimestamp="2025-12-03 06:45:35 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 06:46:45.521437982 +0000 UTC m=+90.326336316" watchObservedRunningTime="2025-12-03 06:46:45.528431223 +0000 UTC m=+90.333329567" Dec 03 06:46:45 crc kubenswrapper[4475]: I1203 06:46:45.528617 4475 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/node-ca-dqbgx" podStartSLOduration=71.528596804 podStartE2EDuration="1m11.528596804s" podCreationTimestamp="2025-12-03 06:45:34 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 06:46:45.528407688 +0000 UTC m=+90.333306013" watchObservedRunningTime="2025-12-03 06:46:45.528596804 +0000 UTC m=+90.333495138" Dec 03 06:46:45 crc kubenswrapper[4475]: I1203 06:46:45.559243 4475 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-9b2j8" podStartSLOduration=71.559228005 podStartE2EDuration="1m11.559228005s" podCreationTimestamp="2025-12-03 06:45:34 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 06:46:45.538464983 +0000 UTC m=+90.343363317" watchObservedRunningTime="2025-12-03 06:46:45.559228005 +0000 UTC m=+90.364126339" Dec 03 06:46:45 crc kubenswrapper[4475]: I1203 06:46:45.567657 4475 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" podStartSLOduration=43.567644878 podStartE2EDuration="43.567644878s" 
podCreationTimestamp="2025-12-03 06:46:02 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 06:46:45.567524793 +0000 UTC m=+90.372423127" watchObservedRunningTime="2025-12-03 06:46:45.567644878 +0000 UTC m=+90.372543213" Dec 03 06:46:45 crc kubenswrapper[4475]: I1203 06:46:45.608494 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:46:45 crc kubenswrapper[4475]: I1203 06:46:45.608701 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:46:45 crc kubenswrapper[4475]: I1203 06:46:45.608765 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:46:45 crc kubenswrapper[4475]: I1203 06:46:45.608820 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:46:45 crc kubenswrapper[4475]: I1203 06:46:45.608882 4475 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:46:45Z","lastTransitionTime":"2025-12-03T06:46:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:46:45 crc kubenswrapper[4475]: I1203 06:46:45.613239 4475 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/node-resolver-pcw7j" podStartSLOduration=71.613226085 podStartE2EDuration="1m11.613226085s" podCreationTimestamp="2025-12-03 06:45:34 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 06:46:45.613088235 +0000 UTC m=+90.417986570" watchObservedRunningTime="2025-12-03 06:46:45.613226085 +0000 UTC m=+90.418124419" Dec 03 06:46:45 crc kubenswrapper[4475]: I1203 06:46:45.640068 4475 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-additional-cni-plugins-k9cmc" podStartSLOduration=71.640054168 podStartE2EDuration="1m11.640054168s" podCreationTimestamp="2025-12-03 06:45:34 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 06:46:45.639561249 +0000 UTC m=+90.444459583" watchObservedRunningTime="2025-12-03 06:46:45.640054168 +0000 UTC m=+90.444952502" Dec 03 06:46:45 crc kubenswrapper[4475]: I1203 06:46:45.659573 4475 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podStartSLOduration=73.659556344 podStartE2EDuration="1m13.659556344s" podCreationTimestamp="2025-12-03 06:45:32 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 06:46:45.659306123 +0000 UTC m=+90.464204447" watchObservedRunningTime="2025-12-03 06:46:45.659556344 +0000 UTC m=+90.464454678" Dec 03 06:46:45 crc kubenswrapper[4475]: I1203 06:46:45.660176 4475 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/kube-apiserver-crc" podStartSLOduration=73.660168377 
podStartE2EDuration="1m13.660168377s" podCreationTimestamp="2025-12-03 06:45:32 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 06:46:45.649769559 +0000 UTC m=+90.454667893" watchObservedRunningTime="2025-12-03 06:46:45.660168377 +0000 UTC m=+90.465066701" Dec 03 06:46:45 crc kubenswrapper[4475]: I1203 06:46:45.685775 4475 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-daemon-tjbzg" podStartSLOduration=71.685759488 podStartE2EDuration="1m11.685759488s" podCreationTimestamp="2025-12-03 06:45:34 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 06:46:45.685565282 +0000 UTC m=+90.490463617" watchObservedRunningTime="2025-12-03 06:46:45.685759488 +0000 UTC m=+90.490657823" Dec 03 06:46:45 crc kubenswrapper[4475]: I1203 06:46:45.701941 4475 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-sbkp5" podStartSLOduration=71.701926349 podStartE2EDuration="1m11.701926349s" podCreationTimestamp="2025-12-03 06:45:34 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 06:46:45.701165755 +0000 UTC m=+90.506064089" watchObservedRunningTime="2025-12-03 06:46:45.701926349 +0000 UTC m=+90.506824683" Dec 03 06:46:45 crc kubenswrapper[4475]: I1203 06:46:45.710521 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:46:45 crc kubenswrapper[4475]: I1203 06:46:45.710557 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:46:45 crc kubenswrapper[4475]: I1203 06:46:45.710568 4475 kubelet_node_status.go:724] "Recording 
event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:46:45 crc kubenswrapper[4475]: I1203 06:46:45.710579 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:46:45 crc kubenswrapper[4475]: I1203 06:46:45.710587 4475 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:46:45Z","lastTransitionTime":"2025-12-03T06:46:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 06:46:45 crc kubenswrapper[4475]: I1203 06:46:45.812786 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:46:45 crc kubenswrapper[4475]: I1203 06:46:45.812812 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:46:45 crc kubenswrapper[4475]: I1203 06:46:45.812821 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:46:45 crc kubenswrapper[4475]: I1203 06:46:45.812833 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:46:45 crc kubenswrapper[4475]: I1203 06:46:45.812841 4475 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:46:45Z","lastTransitionTime":"2025-12-03T06:46:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:46:45 crc kubenswrapper[4475]: I1203 06:46:45.914039 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:46:45 crc kubenswrapper[4475]: I1203 06:46:45.914244 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:46:45 crc kubenswrapper[4475]: I1203 06:46:45.914312 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:46:45 crc kubenswrapper[4475]: I1203 06:46:45.914373 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:46:45 crc kubenswrapper[4475]: I1203 06:46:45.914435 4475 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:46:45Z","lastTransitionTime":"2025-12-03T06:46:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:46:46 crc kubenswrapper[4475]: I1203 06:46:46.016863 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:46:46 crc kubenswrapper[4475]: I1203 06:46:46.016893 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:46:46 crc kubenswrapper[4475]: I1203 06:46:46.016901 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:46:46 crc kubenswrapper[4475]: I1203 06:46:46.016914 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:46:46 crc kubenswrapper[4475]: I1203 06:46:46.016923 4475 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:46:46Z","lastTransitionTime":"2025-12-03T06:46:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:46:46 crc kubenswrapper[4475]: I1203 06:46:46.119082 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:46:46 crc kubenswrapper[4475]: I1203 06:46:46.119107 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:46:46 crc kubenswrapper[4475]: I1203 06:46:46.119116 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:46:46 crc kubenswrapper[4475]: I1203 06:46:46.119129 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:46:46 crc kubenswrapper[4475]: I1203 06:46:46.119137 4475 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:46:46Z","lastTransitionTime":"2025-12-03T06:46:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:46:46 crc kubenswrapper[4475]: I1203 06:46:46.221053 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:46:46 crc kubenswrapper[4475]: I1203 06:46:46.221084 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:46:46 crc kubenswrapper[4475]: I1203 06:46:46.221091 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:46:46 crc kubenswrapper[4475]: I1203 06:46:46.221104 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:46:46 crc kubenswrapper[4475]: I1203 06:46:46.221112 4475 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:46:46Z","lastTransitionTime":"2025-12-03T06:46:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:46:46 crc kubenswrapper[4475]: I1203 06:46:46.322686 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:46:46 crc kubenswrapper[4475]: I1203 06:46:46.322723 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:46:46 crc kubenswrapper[4475]: I1203 06:46:46.322734 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:46:46 crc kubenswrapper[4475]: I1203 06:46:46.322746 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:46:46 crc kubenswrapper[4475]: I1203 06:46:46.322754 4475 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:46:46Z","lastTransitionTime":"2025-12-03T06:46:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:46:46 crc kubenswrapper[4475]: I1203 06:46:46.424486 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:46:46 crc kubenswrapper[4475]: I1203 06:46:46.424519 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:46:46 crc kubenswrapper[4475]: I1203 06:46:46.424529 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:46:46 crc kubenswrapper[4475]: I1203 06:46:46.424545 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:46:46 crc kubenswrapper[4475]: I1203 06:46:46.424554 4475 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:46:46Z","lastTransitionTime":"2025-12-03T06:46:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 06:46:46 crc kubenswrapper[4475]: I1203 06:46:46.490150 4475 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 03 06:46:46 crc kubenswrapper[4475]: I1203 06:46:46.490182 4475 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 03 06:46:46 crc kubenswrapper[4475]: I1203 06:46:46.490204 4475 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 03 06:46:46 crc kubenswrapper[4475]: E1203 06:46:46.490274 4475 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 03 06:46:46 crc kubenswrapper[4475]: E1203 06:46:46.490283 4475 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 03 06:46:46 crc kubenswrapper[4475]: E1203 06:46:46.490336 4475 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 03 06:46:46 crc kubenswrapper[4475]: I1203 06:46:46.526435 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:46:46 crc kubenswrapper[4475]: I1203 06:46:46.526551 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:46:46 crc kubenswrapper[4475]: I1203 06:46:46.526624 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:46:46 crc kubenswrapper[4475]: I1203 06:46:46.526699 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:46:46 crc kubenswrapper[4475]: I1203 06:46:46.526760 4475 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:46:46Z","lastTransitionTime":"2025-12-03T06:46:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:46:46 crc kubenswrapper[4475]: I1203 06:46:46.628855 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:46:46 crc kubenswrapper[4475]: I1203 06:46:46.628900 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:46:46 crc kubenswrapper[4475]: I1203 06:46:46.628909 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:46:46 crc kubenswrapper[4475]: I1203 06:46:46.628922 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:46:46 crc kubenswrapper[4475]: I1203 06:46:46.628931 4475 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:46:46Z","lastTransitionTime":"2025-12-03T06:46:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:46:46 crc kubenswrapper[4475]: I1203 06:46:46.730385 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:46:46 crc kubenswrapper[4475]: I1203 06:46:46.730418 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:46:46 crc kubenswrapper[4475]: I1203 06:46:46.730427 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:46:46 crc kubenswrapper[4475]: I1203 06:46:46.730440 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:46:46 crc kubenswrapper[4475]: I1203 06:46:46.730460 4475 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:46:46Z","lastTransitionTime":"2025-12-03T06:46:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:46:46 crc kubenswrapper[4475]: I1203 06:46:46.831824 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:46:46 crc kubenswrapper[4475]: I1203 06:46:46.831883 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:46:46 crc kubenswrapper[4475]: I1203 06:46:46.831893 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:46:46 crc kubenswrapper[4475]: I1203 06:46:46.831904 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:46:46 crc kubenswrapper[4475]: I1203 06:46:46.831912 4475 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:46:46Z","lastTransitionTime":"2025-12-03T06:46:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:46:46 crc kubenswrapper[4475]: I1203 06:46:46.933546 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:46:46 crc kubenswrapper[4475]: I1203 06:46:46.933576 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:46:46 crc kubenswrapper[4475]: I1203 06:46:46.933584 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:46:46 crc kubenswrapper[4475]: I1203 06:46:46.933595 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:46:46 crc kubenswrapper[4475]: I1203 06:46:46.933604 4475 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:46:46Z","lastTransitionTime":"2025-12-03T06:46:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:46:47 crc kubenswrapper[4475]: I1203 06:46:47.035209 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:46:47 crc kubenswrapper[4475]: I1203 06:46:47.035241 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:46:47 crc kubenswrapper[4475]: I1203 06:46:47.035249 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:46:47 crc kubenswrapper[4475]: I1203 06:46:47.035260 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:46:47 crc kubenswrapper[4475]: I1203 06:46:47.035268 4475 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:46:47Z","lastTransitionTime":"2025-12-03T06:46:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:46:47 crc kubenswrapper[4475]: I1203 06:46:47.137513 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:46:47 crc kubenswrapper[4475]: I1203 06:46:47.137548 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:46:47 crc kubenswrapper[4475]: I1203 06:46:47.137557 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:46:47 crc kubenswrapper[4475]: I1203 06:46:47.137575 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:46:47 crc kubenswrapper[4475]: I1203 06:46:47.137582 4475 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:46:47Z","lastTransitionTime":"2025-12-03T06:46:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:46:47 crc kubenswrapper[4475]: I1203 06:46:47.238955 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:46:47 crc kubenswrapper[4475]: I1203 06:46:47.238986 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:46:47 crc kubenswrapper[4475]: I1203 06:46:47.238995 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:46:47 crc kubenswrapper[4475]: I1203 06:46:47.239007 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:46:47 crc kubenswrapper[4475]: I1203 06:46:47.239015 4475 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:46:47Z","lastTransitionTime":"2025-12-03T06:46:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:46:47 crc kubenswrapper[4475]: I1203 06:46:47.340569 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:46:47 crc kubenswrapper[4475]: I1203 06:46:47.340595 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:46:47 crc kubenswrapper[4475]: I1203 06:46:47.340606 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:46:47 crc kubenswrapper[4475]: I1203 06:46:47.340632 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:46:47 crc kubenswrapper[4475]: I1203 06:46:47.340641 4475 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:46:47Z","lastTransitionTime":"2025-12-03T06:46:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:46:47 crc kubenswrapper[4475]: I1203 06:46:47.389573 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:46:47 crc kubenswrapper[4475]: I1203 06:46:47.389622 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:46:47 crc kubenswrapper[4475]: I1203 06:46:47.389634 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:46:47 crc kubenswrapper[4475]: I1203 06:46:47.389645 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:46:47 crc kubenswrapper[4475]: I1203 06:46:47.389653 4475 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:46:47Z","lastTransitionTime":"2025-12-03T06:46:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 06:46:47 crc kubenswrapper[4475]: I1203 06:46:47.415774 4475 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-version/cluster-version-operator-5c965bbfc6-ssf8x"] Dec 03 06:46:47 crc kubenswrapper[4475]: I1203 06:46:47.416072 4475 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-ssf8x" Dec 03 06:46:47 crc kubenswrapper[4475]: I1203 06:46:47.419954 4475 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"openshift-service-ca.crt" Dec 03 06:46:47 crc kubenswrapper[4475]: I1203 06:46:47.419987 4475 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"kube-root-ca.crt" Dec 03 06:46:47 crc kubenswrapper[4475]: I1203 06:46:47.420054 4475 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"default-dockercfg-gxtc4" Dec 03 06:46:47 crc kubenswrapper[4475]: I1203 06:46:47.420469 4475 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"cluster-version-operator-serving-cert" Dec 03 06:46:47 crc kubenswrapper[4475]: I1203 06:46:47.491069 4475 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-hq2rn" Dec 03 06:46:47 crc kubenswrapper[4475]: E1203 06:46:47.491169 4475 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-hq2rn" podUID="7e9dd470-572a-4396-9be7-1a37e3c48977" Dec 03 06:46:47 crc kubenswrapper[4475]: I1203 06:46:47.556756 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1d405df1-0f49-4ae5-bacd-a16143fd2725-serving-cert\") pod \"cluster-version-operator-5c965bbfc6-ssf8x\" (UID: \"1d405df1-0f49-4ae5-bacd-a16143fd2725\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-ssf8x" Dec 03 06:46:47 crc kubenswrapper[4475]: I1203 06:46:47.556807 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/1d405df1-0f49-4ae5-bacd-a16143fd2725-etc-cvo-updatepayloads\") pod \"cluster-version-operator-5c965bbfc6-ssf8x\" (UID: \"1d405df1-0f49-4ae5-bacd-a16143fd2725\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-ssf8x" Dec 03 06:46:47 crc kubenswrapper[4475]: I1203 06:46:47.556826 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/1d405df1-0f49-4ae5-bacd-a16143fd2725-kube-api-access\") pod \"cluster-version-operator-5c965bbfc6-ssf8x\" (UID: \"1d405df1-0f49-4ae5-bacd-a16143fd2725\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-ssf8x" Dec 03 06:46:47 crc kubenswrapper[4475]: I1203 06:46:47.556840 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/1d405df1-0f49-4ae5-bacd-a16143fd2725-etc-ssl-certs\") pod \"cluster-version-operator-5c965bbfc6-ssf8x\" (UID: \"1d405df1-0f49-4ae5-bacd-a16143fd2725\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-ssf8x" Dec 03 06:46:47 crc kubenswrapper[4475]: I1203 06:46:47.556889 4475 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/1d405df1-0f49-4ae5-bacd-a16143fd2725-service-ca\") pod \"cluster-version-operator-5c965bbfc6-ssf8x\" (UID: \"1d405df1-0f49-4ae5-bacd-a16143fd2725\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-ssf8x" Dec 03 06:46:47 crc kubenswrapper[4475]: I1203 06:46:47.657652 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/1d405df1-0f49-4ae5-bacd-a16143fd2725-etc-cvo-updatepayloads\") pod \"cluster-version-operator-5c965bbfc6-ssf8x\" (UID: \"1d405df1-0f49-4ae5-bacd-a16143fd2725\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-ssf8x" Dec 03 06:46:47 crc kubenswrapper[4475]: I1203 06:46:47.657678 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/1d405df1-0f49-4ae5-bacd-a16143fd2725-etc-ssl-certs\") pod \"cluster-version-operator-5c965bbfc6-ssf8x\" (UID: \"1d405df1-0f49-4ae5-bacd-a16143fd2725\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-ssf8x" Dec 03 06:46:47 crc kubenswrapper[4475]: I1203 06:46:47.657694 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/1d405df1-0f49-4ae5-bacd-a16143fd2725-kube-api-access\") pod \"cluster-version-operator-5c965bbfc6-ssf8x\" (UID: \"1d405df1-0f49-4ae5-bacd-a16143fd2725\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-ssf8x" Dec 03 06:46:47 crc kubenswrapper[4475]: I1203 06:46:47.657715 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/1d405df1-0f49-4ae5-bacd-a16143fd2725-service-ca\") pod \"cluster-version-operator-5c965bbfc6-ssf8x\" (UID: 
\"1d405df1-0f49-4ae5-bacd-a16143fd2725\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-ssf8x" Dec 03 06:46:47 crc kubenswrapper[4475]: I1203 06:46:47.657743 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1d405df1-0f49-4ae5-bacd-a16143fd2725-serving-cert\") pod \"cluster-version-operator-5c965bbfc6-ssf8x\" (UID: \"1d405df1-0f49-4ae5-bacd-a16143fd2725\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-ssf8x" Dec 03 06:46:47 crc kubenswrapper[4475]: I1203 06:46:47.657786 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/1d405df1-0f49-4ae5-bacd-a16143fd2725-etc-cvo-updatepayloads\") pod \"cluster-version-operator-5c965bbfc6-ssf8x\" (UID: \"1d405df1-0f49-4ae5-bacd-a16143fd2725\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-ssf8x" Dec 03 06:46:47 crc kubenswrapper[4475]: I1203 06:46:47.657791 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/1d405df1-0f49-4ae5-bacd-a16143fd2725-etc-ssl-certs\") pod \"cluster-version-operator-5c965bbfc6-ssf8x\" (UID: \"1d405df1-0f49-4ae5-bacd-a16143fd2725\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-ssf8x" Dec 03 06:46:47 crc kubenswrapper[4475]: I1203 06:46:47.658539 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/1d405df1-0f49-4ae5-bacd-a16143fd2725-service-ca\") pod \"cluster-version-operator-5c965bbfc6-ssf8x\" (UID: \"1d405df1-0f49-4ae5-bacd-a16143fd2725\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-ssf8x" Dec 03 06:46:47 crc kubenswrapper[4475]: I1203 06:46:47.661853 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/1d405df1-0f49-4ae5-bacd-a16143fd2725-serving-cert\") pod \"cluster-version-operator-5c965bbfc6-ssf8x\" (UID: \"1d405df1-0f49-4ae5-bacd-a16143fd2725\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-ssf8x" Dec 03 06:46:47 crc kubenswrapper[4475]: I1203 06:46:47.669770 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/1d405df1-0f49-4ae5-bacd-a16143fd2725-kube-api-access\") pod \"cluster-version-operator-5c965bbfc6-ssf8x\" (UID: \"1d405df1-0f49-4ae5-bacd-a16143fd2725\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-ssf8x" Dec 03 06:46:47 crc kubenswrapper[4475]: I1203 06:46:47.726696 4475 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-ssf8x" Dec 03 06:46:47 crc kubenswrapper[4475]: W1203 06:46:47.736072 4475 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1d405df1_0f49_4ae5_bacd_a16143fd2725.slice/crio-78b7f0df7a409986c9ca682121159ab60de3f666e0c97fbb51a3ec242e90233e WatchSource:0}: Error finding container 78b7f0df7a409986c9ca682121159ab60de3f666e0c97fbb51a3ec242e90233e: Status 404 returned error can't find the container with id 78b7f0df7a409986c9ca682121159ab60de3f666e0c97fbb51a3ec242e90233e Dec 03 06:46:47 crc kubenswrapper[4475]: I1203 06:46:47.803887 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-ssf8x" event={"ID":"1d405df1-0f49-4ae5-bacd-a16143fd2725","Type":"ContainerStarted","Data":"c90da0078f6a988bb0a8f16dc9ff38d6fe3ce79b1f949065eb1732aaf596af36"} Dec 03 06:46:47 crc kubenswrapper[4475]: I1203 06:46:47.803927 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-ssf8x" 
event={"ID":"1d405df1-0f49-4ae5-bacd-a16143fd2725","Type":"ContainerStarted","Data":"78b7f0df7a409986c9ca682121159ab60de3f666e0c97fbb51a3ec242e90233e"} Dec 03 06:46:47 crc kubenswrapper[4475]: I1203 06:46:47.812879 4475 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-ssf8x" podStartSLOduration=73.812867114 podStartE2EDuration="1m13.812867114s" podCreationTimestamp="2025-12-03 06:45:34 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 06:46:47.812524288 +0000 UTC m=+92.617422622" watchObservedRunningTime="2025-12-03 06:46:47.812867114 +0000 UTC m=+92.617765448" Dec 03 06:46:48 crc kubenswrapper[4475]: I1203 06:46:48.490770 4475 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 03 06:46:48 crc kubenswrapper[4475]: E1203 06:46:48.490887 4475 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 03 06:46:48 crc kubenswrapper[4475]: I1203 06:46:48.491025 4475 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 03 06:46:48 crc kubenswrapper[4475]: E1203 06:46:48.491064 4475 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 03 06:46:48 crc kubenswrapper[4475]: I1203 06:46:48.490795 4475 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 03 06:46:48 crc kubenswrapper[4475]: E1203 06:46:48.491596 4475 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 03 06:46:49 crc kubenswrapper[4475]: I1203 06:46:49.490434 4475 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-hq2rn" Dec 03 06:46:49 crc kubenswrapper[4475]: E1203 06:46:49.490682 4475 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-hq2rn" podUID="7e9dd470-572a-4396-9be7-1a37e3c48977" Dec 03 06:46:50 crc kubenswrapper[4475]: I1203 06:46:50.490164 4475 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 03 06:46:50 crc kubenswrapper[4475]: I1203 06:46:50.490236 4475 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 03 06:46:50 crc kubenswrapper[4475]: I1203 06:46:50.490795 4475 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 03 06:46:50 crc kubenswrapper[4475]: E1203 06:46:50.490871 4475 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 03 06:46:50 crc kubenswrapper[4475]: E1203 06:46:50.490969 4475 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 03 06:46:50 crc kubenswrapper[4475]: E1203 06:46:50.491052 4475 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 03 06:46:51 crc kubenswrapper[4475]: I1203 06:46:51.490208 4475 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-hq2rn" Dec 03 06:46:51 crc kubenswrapper[4475]: E1203 06:46:51.490296 4475 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-hq2rn" podUID="7e9dd470-572a-4396-9be7-1a37e3c48977" Dec 03 06:46:51 crc kubenswrapper[4475]: I1203 06:46:51.490893 4475 scope.go:117] "RemoveContainer" containerID="c3d53d023db886d8a8772c0790104577a7a8914b8cf882b251e44407064c3141" Dec 03 06:46:51 crc kubenswrapper[4475]: E1203 06:46:51.491007 4475 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-g9t4l_openshift-ovn-kubernetes(8f42839e-dbc4-445a-a15b-c3aa14813958)\"" pod="openshift-ovn-kubernetes/ovnkube-node-g9t4l" podUID="8f42839e-dbc4-445a-a15b-c3aa14813958" Dec 03 06:46:51 crc kubenswrapper[4475]: I1203 06:46:51.893171 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/7e9dd470-572a-4396-9be7-1a37e3c48977-metrics-certs\") pod \"network-metrics-daemon-hq2rn\" (UID: \"7e9dd470-572a-4396-9be7-1a37e3c48977\") " pod="openshift-multus/network-metrics-daemon-hq2rn" Dec 03 06:46:51 crc kubenswrapper[4475]: E1203 06:46:51.893312 4475 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Dec 03 06:46:51 crc kubenswrapper[4475]: E1203 06:46:51.893647 4475 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7e9dd470-572a-4396-9be7-1a37e3c48977-metrics-certs 
podName:7e9dd470-572a-4396-9be7-1a37e3c48977 nodeName:}" failed. No retries permitted until 2025-12-03 06:47:55.893626802 +0000 UTC m=+160.698525137 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/7e9dd470-572a-4396-9be7-1a37e3c48977-metrics-certs") pod "network-metrics-daemon-hq2rn" (UID: "7e9dd470-572a-4396-9be7-1a37e3c48977") : object "openshift-multus"/"metrics-daemon-secret" not registered Dec 03 06:46:52 crc kubenswrapper[4475]: I1203 06:46:52.491102 4475 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 03 06:46:52 crc kubenswrapper[4475]: I1203 06:46:52.491139 4475 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 03 06:46:52 crc kubenswrapper[4475]: E1203 06:46:52.491192 4475 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 03 06:46:52 crc kubenswrapper[4475]: E1203 06:46:52.491260 4475 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 03 06:46:52 crc kubenswrapper[4475]: I1203 06:46:52.491110 4475 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 03 06:46:52 crc kubenswrapper[4475]: E1203 06:46:52.491316 4475 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 03 06:46:53 crc kubenswrapper[4475]: I1203 06:46:53.490891 4475 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-hq2rn" Dec 03 06:46:53 crc kubenswrapper[4475]: E1203 06:46:53.490996 4475 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-hq2rn" podUID="7e9dd470-572a-4396-9be7-1a37e3c48977" Dec 03 06:46:54 crc kubenswrapper[4475]: I1203 06:46:54.490532 4475 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 03 06:46:54 crc kubenswrapper[4475]: I1203 06:46:54.490565 4475 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 03 06:46:54 crc kubenswrapper[4475]: I1203 06:46:54.490655 4475 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 03 06:46:54 crc kubenswrapper[4475]: E1203 06:46:54.490771 4475 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 03 06:46:54 crc kubenswrapper[4475]: E1203 06:46:54.490870 4475 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 03 06:46:54 crc kubenswrapper[4475]: E1203 06:46:54.490917 4475 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 03 06:46:55 crc kubenswrapper[4475]: I1203 06:46:55.491098 4475 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-hq2rn" Dec 03 06:46:55 crc kubenswrapper[4475]: E1203 06:46:55.492082 4475 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-hq2rn" podUID="7e9dd470-572a-4396-9be7-1a37e3c48977" Dec 03 06:46:56 crc kubenswrapper[4475]: I1203 06:46:56.491182 4475 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 03 06:46:56 crc kubenswrapper[4475]: I1203 06:46:56.491277 4475 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 03 06:46:56 crc kubenswrapper[4475]: I1203 06:46:56.491354 4475 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 03 06:46:56 crc kubenswrapper[4475]: E1203 06:46:56.491351 4475 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 03 06:46:56 crc kubenswrapper[4475]: E1203 06:46:56.491412 4475 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 03 06:46:56 crc kubenswrapper[4475]: E1203 06:46:56.491508 4475 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 03 06:46:57 crc kubenswrapper[4475]: I1203 06:46:57.491012 4475 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-hq2rn" Dec 03 06:46:57 crc kubenswrapper[4475]: E1203 06:46:57.491116 4475 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-hq2rn" podUID="7e9dd470-572a-4396-9be7-1a37e3c48977" Dec 03 06:46:58 crc kubenswrapper[4475]: I1203 06:46:58.491007 4475 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 03 06:46:58 crc kubenswrapper[4475]: E1203 06:46:58.491108 4475 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 03 06:46:58 crc kubenswrapper[4475]: I1203 06:46:58.491313 4475 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 03 06:46:58 crc kubenswrapper[4475]: I1203 06:46:58.491385 4475 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 03 06:46:58 crc kubenswrapper[4475]: E1203 06:46:58.491770 4475 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 03 06:46:58 crc kubenswrapper[4475]: E1203 06:46:58.491823 4475 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 03 06:46:59 crc kubenswrapper[4475]: I1203 06:46:59.490351 4475 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-hq2rn" Dec 03 06:46:59 crc kubenswrapper[4475]: E1203 06:46:59.490469 4475 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-hq2rn" podUID="7e9dd470-572a-4396-9be7-1a37e3c48977" Dec 03 06:47:00 crc kubenswrapper[4475]: I1203 06:47:00.490876 4475 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 03 06:47:00 crc kubenswrapper[4475]: E1203 06:47:00.490968 4475 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 03 06:47:00 crc kubenswrapper[4475]: I1203 06:47:00.491000 4475 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 03 06:47:00 crc kubenswrapper[4475]: I1203 06:47:00.491026 4475 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 03 06:47:00 crc kubenswrapper[4475]: E1203 06:47:00.491088 4475 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 03 06:47:00 crc kubenswrapper[4475]: E1203 06:47:00.491190 4475 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 03 06:47:01 crc kubenswrapper[4475]: I1203 06:47:01.490896 4475 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-hq2rn" Dec 03 06:47:01 crc kubenswrapper[4475]: E1203 06:47:01.490995 4475 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-hq2rn" podUID="7e9dd470-572a-4396-9be7-1a37e3c48977" Dec 03 06:47:02 crc kubenswrapper[4475]: I1203 06:47:02.490584 4475 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 03 06:47:02 crc kubenswrapper[4475]: I1203 06:47:02.490618 4475 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 03 06:47:02 crc kubenswrapper[4475]: E1203 06:47:02.490695 4475 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 03 06:47:02 crc kubenswrapper[4475]: I1203 06:47:02.490592 4475 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 03 06:47:02 crc kubenswrapper[4475]: E1203 06:47:02.490760 4475 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 03 06:47:02 crc kubenswrapper[4475]: E1203 06:47:02.490885 4475 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 03 06:47:03 crc kubenswrapper[4475]: I1203 06:47:03.490803 4475 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-hq2rn" Dec 03 06:47:03 crc kubenswrapper[4475]: E1203 06:47:03.490909 4475 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-hq2rn" podUID="7e9dd470-572a-4396-9be7-1a37e3c48977" Dec 03 06:47:04 crc kubenswrapper[4475]: I1203 06:47:04.490846 4475 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 03 06:47:04 crc kubenswrapper[4475]: E1203 06:47:04.491352 4475 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 03 06:47:04 crc kubenswrapper[4475]: I1203 06:47:04.490910 4475 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 03 06:47:04 crc kubenswrapper[4475]: I1203 06:47:04.490846 4475 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 03 06:47:04 crc kubenswrapper[4475]: E1203 06:47:04.491665 4475 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 03 06:47:04 crc kubenswrapper[4475]: E1203 06:47:04.491565 4475 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 03 06:47:05 crc kubenswrapper[4475]: I1203 06:47:05.490621 4475 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-hq2rn" Dec 03 06:47:05 crc kubenswrapper[4475]: E1203 06:47:05.491459 4475 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-hq2rn" podUID="7e9dd470-572a-4396-9be7-1a37e3c48977" Dec 03 06:47:06 crc kubenswrapper[4475]: I1203 06:47:06.490751 4475 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 03 06:47:06 crc kubenswrapper[4475]: I1203 06:47:06.490751 4475 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 03 06:47:06 crc kubenswrapper[4475]: I1203 06:47:06.490846 4475 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 03 06:47:06 crc kubenswrapper[4475]: E1203 06:47:06.490976 4475 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 03 06:47:06 crc kubenswrapper[4475]: E1203 06:47:06.491089 4475 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 03 06:47:06 crc kubenswrapper[4475]: E1203 06:47:06.491426 4475 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 03 06:47:06 crc kubenswrapper[4475]: I1203 06:47:06.491684 4475 scope.go:117] "RemoveContainer" containerID="c3d53d023db886d8a8772c0790104577a7a8914b8cf882b251e44407064c3141" Dec 03 06:47:06 crc kubenswrapper[4475]: E1203 06:47:06.492090 4475 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-g9t4l_openshift-ovn-kubernetes(8f42839e-dbc4-445a-a15b-c3aa14813958)\"" pod="openshift-ovn-kubernetes/ovnkube-node-g9t4l" podUID="8f42839e-dbc4-445a-a15b-c3aa14813958" Dec 03 06:47:07 crc kubenswrapper[4475]: I1203 06:47:07.490568 4475 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-hq2rn" Dec 03 06:47:07 crc kubenswrapper[4475]: E1203 06:47:07.490687 4475 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-hq2rn" podUID="7e9dd470-572a-4396-9be7-1a37e3c48977" Dec 03 06:47:07 crc kubenswrapper[4475]: I1203 06:47:07.843624 4475 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-9b2j8_f3a17c67-95e0-4889-8a30-64c08b6720f4/kube-multus/1.log" Dec 03 06:47:07 crc kubenswrapper[4475]: I1203 06:47:07.844280 4475 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-9b2j8_f3a17c67-95e0-4889-8a30-64c08b6720f4/kube-multus/0.log" Dec 03 06:47:07 crc kubenswrapper[4475]: I1203 06:47:07.844306 4475 generic.go:334] "Generic (PLEG): container finished" podID="f3a17c67-95e0-4889-8a30-64c08b6720f4" containerID="4124e8c8426150d1057ec040dd3bfd12c7def09c85144927fd48515e9e9e9685" exitCode=1 Dec 03 06:47:07 crc kubenswrapper[4475]: I1203 06:47:07.844328 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-9b2j8" event={"ID":"f3a17c67-95e0-4889-8a30-64c08b6720f4","Type":"ContainerDied","Data":"4124e8c8426150d1057ec040dd3bfd12c7def09c85144927fd48515e9e9e9685"} Dec 03 06:47:07 crc kubenswrapper[4475]: I1203 06:47:07.844352 4475 scope.go:117] "RemoveContainer" containerID="d2d627e2c307a8db9c86e8020f2b1c25c6e061e0c6460be63e231566488beaca" Dec 03 06:47:07 crc kubenswrapper[4475]: I1203 06:47:07.844646 4475 scope.go:117] "RemoveContainer" containerID="4124e8c8426150d1057ec040dd3bfd12c7def09c85144927fd48515e9e9e9685" Dec 03 06:47:07 crc kubenswrapper[4475]: E1203 06:47:07.844799 4475 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-multus\" with CrashLoopBackOff: \"back-off 10s restarting failed container=kube-multus pod=multus-9b2j8_openshift-multus(f3a17c67-95e0-4889-8a30-64c08b6720f4)\"" pod="openshift-multus/multus-9b2j8" podUID="f3a17c67-95e0-4889-8a30-64c08b6720f4" Dec 03 06:47:08 crc kubenswrapper[4475]: I1203 06:47:08.491156 4475 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 03 06:47:08 crc kubenswrapper[4475]: I1203 06:47:08.491190 4475 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 03 06:47:08 crc kubenswrapper[4475]: I1203 06:47:08.491156 4475 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 03 06:47:08 crc kubenswrapper[4475]: E1203 06:47:08.491251 4475 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 03 06:47:08 crc kubenswrapper[4475]: E1203 06:47:08.491316 4475 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 03 06:47:08 crc kubenswrapper[4475]: E1203 06:47:08.491360 4475 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 03 06:47:08 crc kubenswrapper[4475]: I1203 06:47:08.847935 4475 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-9b2j8_f3a17c67-95e0-4889-8a30-64c08b6720f4/kube-multus/1.log" Dec 03 06:47:09 crc kubenswrapper[4475]: I1203 06:47:09.490394 4475 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-hq2rn" Dec 03 06:47:09 crc kubenswrapper[4475]: E1203 06:47:09.490506 4475 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-hq2rn" podUID="7e9dd470-572a-4396-9be7-1a37e3c48977" Dec 03 06:47:10 crc kubenswrapper[4475]: I1203 06:47:10.490593 4475 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 03 06:47:10 crc kubenswrapper[4475]: I1203 06:47:10.490651 4475 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 03 06:47:10 crc kubenswrapper[4475]: I1203 06:47:10.490662 4475 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 03 06:47:10 crc kubenswrapper[4475]: E1203 06:47:10.490741 4475 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 03 06:47:10 crc kubenswrapper[4475]: E1203 06:47:10.490790 4475 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 03 06:47:10 crc kubenswrapper[4475]: E1203 06:47:10.490836 4475 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 03 06:47:11 crc kubenswrapper[4475]: I1203 06:47:11.491095 4475 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-hq2rn" Dec 03 06:47:11 crc kubenswrapper[4475]: E1203 06:47:11.491204 4475 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-hq2rn" podUID="7e9dd470-572a-4396-9be7-1a37e3c48977" Dec 03 06:47:12 crc kubenswrapper[4475]: I1203 06:47:12.490708 4475 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 03 06:47:12 crc kubenswrapper[4475]: E1203 06:47:12.490838 4475 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 03 06:47:12 crc kubenswrapper[4475]: I1203 06:47:12.490876 4475 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 03 06:47:12 crc kubenswrapper[4475]: I1203 06:47:12.490895 4475 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 03 06:47:12 crc kubenswrapper[4475]: E1203 06:47:12.490963 4475 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 03 06:47:12 crc kubenswrapper[4475]: E1203 06:47:12.491017 4475 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 03 06:47:13 crc kubenswrapper[4475]: I1203 06:47:13.490390 4475 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-hq2rn" Dec 03 06:47:13 crc kubenswrapper[4475]: E1203 06:47:13.490503 4475 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-hq2rn" podUID="7e9dd470-572a-4396-9be7-1a37e3c48977" Dec 03 06:47:14 crc kubenswrapper[4475]: I1203 06:47:14.490136 4475 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 03 06:47:14 crc kubenswrapper[4475]: I1203 06:47:14.490161 4475 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 03 06:47:14 crc kubenswrapper[4475]: I1203 06:47:14.490184 4475 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 03 06:47:14 crc kubenswrapper[4475]: E1203 06:47:14.490261 4475 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 03 06:47:14 crc kubenswrapper[4475]: E1203 06:47:14.490321 4475 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 03 06:47:14 crc kubenswrapper[4475]: E1203 06:47:14.490374 4475 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 03 06:47:15 crc kubenswrapper[4475]: I1203 06:47:15.490750 4475 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-hq2rn" Dec 03 06:47:15 crc kubenswrapper[4475]: E1203 06:47:15.491408 4475 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-hq2rn" podUID="7e9dd470-572a-4396-9be7-1a37e3c48977" Dec 03 06:47:15 crc kubenswrapper[4475]: E1203 06:47:15.521188 4475 kubelet_node_status.go:497] "Node not becoming ready in time after startup" Dec 03 06:47:15 crc kubenswrapper[4475]: E1203 06:47:15.556797 4475 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Dec 03 06:47:16 crc kubenswrapper[4475]: I1203 06:47:16.490996 4475 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 03 06:47:16 crc kubenswrapper[4475]: I1203 06:47:16.491083 4475 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 03 06:47:16 crc kubenswrapper[4475]: E1203 06:47:16.491111 4475 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 03 06:47:16 crc kubenswrapper[4475]: E1203 06:47:16.491179 4475 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 03 06:47:16 crc kubenswrapper[4475]: I1203 06:47:16.491226 4475 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 03 06:47:16 crc kubenswrapper[4475]: E1203 06:47:16.491262 4475 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 03 06:47:17 crc kubenswrapper[4475]: I1203 06:47:17.490636 4475 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-hq2rn" Dec 03 06:47:17 crc kubenswrapper[4475]: E1203 06:47:17.490773 4475 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-hq2rn" podUID="7e9dd470-572a-4396-9be7-1a37e3c48977" Dec 03 06:47:18 crc kubenswrapper[4475]: I1203 06:47:18.490472 4475 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 03 06:47:18 crc kubenswrapper[4475]: I1203 06:47:18.490488 4475 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 03 06:47:18 crc kubenswrapper[4475]: E1203 06:47:18.490623 4475 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 03 06:47:18 crc kubenswrapper[4475]: E1203 06:47:18.490674 4475 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 03 06:47:18 crc kubenswrapper[4475]: I1203 06:47:18.490833 4475 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 03 06:47:18 crc kubenswrapper[4475]: I1203 06:47:18.490895 4475 scope.go:117] "RemoveContainer" containerID="4124e8c8426150d1057ec040dd3bfd12c7def09c85144927fd48515e9e9e9685" Dec 03 06:47:18 crc kubenswrapper[4475]: E1203 06:47:18.490895 4475 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 03 06:47:18 crc kubenswrapper[4475]: I1203 06:47:18.491255 4475 scope.go:117] "RemoveContainer" containerID="c3d53d023db886d8a8772c0790104577a7a8914b8cf882b251e44407064c3141" Dec 03 06:47:18 crc kubenswrapper[4475]: I1203 06:47:18.869969 4475 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-g9t4l_8f42839e-dbc4-445a-a15b-c3aa14813958/ovnkube-controller/3.log" Dec 03 06:47:18 crc kubenswrapper[4475]: I1203 06:47:18.872248 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-g9t4l" event={"ID":"8f42839e-dbc4-445a-a15b-c3aa14813958","Type":"ContainerStarted","Data":"a6d3e6ea4e349dee8dc1ae7d0814640bae610b259e41099993afa21cb3b1aa88"} Dec 03 06:47:18 crc kubenswrapper[4475]: I1203 06:47:18.872551 4475 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-g9t4l" Dec 03 06:47:18 crc kubenswrapper[4475]: I1203 06:47:18.873933 4475 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-9b2j8_f3a17c67-95e0-4889-8a30-64c08b6720f4/kube-multus/1.log" Dec 03 06:47:18 crc kubenswrapper[4475]: I1203 06:47:18.873972 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-9b2j8" event={"ID":"f3a17c67-95e0-4889-8a30-64c08b6720f4","Type":"ContainerStarted","Data":"2e2971b82e4f9806c53d67763a76ebe8ebaaf116ff13a887e7d02d3fd665eafe"} Dec 03 06:47:18 crc kubenswrapper[4475]: I1203 06:47:18.892255 4475 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-g9t4l" podStartSLOduration=104.892241777 podStartE2EDuration="1m44.892241777s" podCreationTimestamp="2025-12-03 06:45:34 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" 
observedRunningTime="2025-12-03 06:47:18.890812734 +0000 UTC m=+123.695711068" watchObservedRunningTime="2025-12-03 06:47:18.892241777 +0000 UTC m=+123.697140111" Dec 03 06:47:19 crc kubenswrapper[4475]: I1203 06:47:19.119288 4475 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-hq2rn"] Dec 03 06:47:19 crc kubenswrapper[4475]: I1203 06:47:19.119385 4475 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-hq2rn" Dec 03 06:47:19 crc kubenswrapper[4475]: E1203 06:47:19.119476 4475 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-hq2rn" podUID="7e9dd470-572a-4396-9be7-1a37e3c48977" Dec 03 06:47:20 crc kubenswrapper[4475]: I1203 06:47:20.490560 4475 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 03 06:47:20 crc kubenswrapper[4475]: E1203 06:47:20.490833 4475 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 03 06:47:20 crc kubenswrapper[4475]: I1203 06:47:20.490562 4475 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 03 06:47:20 crc kubenswrapper[4475]: I1203 06:47:20.490562 4475 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 03 06:47:20 crc kubenswrapper[4475]: E1203 06:47:20.491006 4475 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 03 06:47:20 crc kubenswrapper[4475]: E1203 06:47:20.490944 4475 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 03 06:47:20 crc kubenswrapper[4475]: E1203 06:47:20.557534 4475 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Dec 03 06:47:21 crc kubenswrapper[4475]: I1203 06:47:21.490780 4475 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-hq2rn" Dec 03 06:47:21 crc kubenswrapper[4475]: E1203 06:47:21.490893 4475 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-hq2rn" podUID="7e9dd470-572a-4396-9be7-1a37e3c48977" Dec 03 06:47:22 crc kubenswrapper[4475]: I1203 06:47:22.490748 4475 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 03 06:47:22 crc kubenswrapper[4475]: E1203 06:47:22.490854 4475 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 03 06:47:22 crc kubenswrapper[4475]: I1203 06:47:22.491007 4475 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 03 06:47:22 crc kubenswrapper[4475]: E1203 06:47:22.491052 4475 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 03 06:47:22 crc kubenswrapper[4475]: I1203 06:47:22.491359 4475 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 03 06:47:22 crc kubenswrapper[4475]: E1203 06:47:22.491543 4475 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 03 06:47:23 crc kubenswrapper[4475]: I1203 06:47:23.490834 4475 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-hq2rn" Dec 03 06:47:23 crc kubenswrapper[4475]: E1203 06:47:23.490943 4475 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-hq2rn" podUID="7e9dd470-572a-4396-9be7-1a37e3c48977" Dec 03 06:47:24 crc kubenswrapper[4475]: I1203 06:47:24.225702 4475 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-g9t4l" Dec 03 06:47:24 crc kubenswrapper[4475]: I1203 06:47:24.491021 4475 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 03 06:47:24 crc kubenswrapper[4475]: I1203 06:47:24.491070 4475 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 03 06:47:24 crc kubenswrapper[4475]: I1203 06:47:24.491026 4475 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 03 06:47:24 crc kubenswrapper[4475]: E1203 06:47:24.491137 4475 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 03 06:47:24 crc kubenswrapper[4475]: E1203 06:47:24.491236 4475 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 03 06:47:24 crc kubenswrapper[4475]: E1203 06:47:24.491307 4475 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 03 06:47:25 crc kubenswrapper[4475]: I1203 06:47:25.491171 4475 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-hq2rn" Dec 03 06:47:25 crc kubenswrapper[4475]: E1203 06:47:25.491844 4475 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-hq2rn" podUID="7e9dd470-572a-4396-9be7-1a37e3c48977" Dec 03 06:47:26 crc kubenswrapper[4475]: I1203 06:47:26.490654 4475 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 03 06:47:26 crc kubenswrapper[4475]: I1203 06:47:26.490716 4475 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 03 06:47:26 crc kubenswrapper[4475]: I1203 06:47:26.490954 4475 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 03 06:47:26 crc kubenswrapper[4475]: I1203 06:47:26.492212 4475 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-console"/"networking-console-plugin-cert" Dec 03 06:47:26 crc kubenswrapper[4475]: I1203 06:47:26.492314 4475 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"openshift-service-ca.crt" Dec 03 06:47:26 crc kubenswrapper[4475]: I1203 06:47:26.492776 4475 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"kube-root-ca.crt" Dec 03 06:47:26 crc kubenswrapper[4475]: I1203 06:47:26.493764 4475 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-console"/"networking-console-plugin" Dec 03 06:47:27 crc kubenswrapper[4475]: I1203 06:47:27.490284 4475 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-hq2rn" Dec 03 06:47:27 crc kubenswrapper[4475]: I1203 06:47:27.491962 4475 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-secret" Dec 03 06:47:27 crc kubenswrapper[4475]: I1203 06:47:27.491984 4475 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-sa-dockercfg-d427c" Dec 03 06:47:28 crc kubenswrapper[4475]: I1203 06:47:28.165780 4475 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeReady" Dec 03 06:47:28 crc kubenswrapper[4475]: I1203 06:47:28.189176 4475 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console-operator/console-operator-58897d9998-7m86k"] Dec 03 06:47:28 crc kubenswrapper[4475]: I1203 06:47:28.189595 4475 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console-operator/console-operator-58897d9998-7m86k" Dec 03 06:47:28 crc kubenswrapper[4475]: I1203 06:47:28.190056 4475 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console/console-f9d7485db-dbxhk"] Dec 03 06:47:28 crc kubenswrapper[4475]: I1203 06:47:28.190317 4475 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-7kcnv"] Dec 03 06:47:28 crc kubenswrapper[4475]: I1203 06:47:28.190351 4475 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-f9d7485db-dbxhk" Dec 03 06:47:28 crc kubenswrapper[4475]: I1203 06:47:28.190545 4475 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-7kcnv" Dec 03 06:47:28 crc kubenswrapper[4475]: I1203 06:47:28.194284 4475 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-config-operator/openshift-config-operator-7777fb866f-q5cjz"] Dec 03 06:47:28 crc kubenswrapper[4475]: I1203 06:47:28.194580 4475 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-config-operator/openshift-config-operator-7777fb866f-q5cjz" Dec 03 06:47:28 crc kubenswrapper[4475]: I1203 06:47:28.196498 4475 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-dlggp"] Dec 03 06:47:28 crc kubenswrapper[4475]: I1203 06:47:28.196702 4475 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-oauth-apiserver/apiserver-7bbb656c7d-tjg56"] Dec 03 06:47:28 crc kubenswrapper[4475]: I1203 06:47:28.196970 4475 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-tjg56" Dec 03 06:47:28 crc kubenswrapper[4475]: I1203 06:47:28.197064 4475 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-dlggp" Dec 03 06:47:28 crc kubenswrapper[4475]: I1203 06:47:28.197720 4475 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-lw5ml"] Dec 03 06:47:28 crc kubenswrapper[4475]: I1203 06:47:28.197943 4475 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-api/machine-api-operator-5694c8668f-n4bbj"] Dec 03 06:47:28 crc kubenswrapper[4475]: I1203 06:47:28.198131 4475 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-lw5ml" Dec 03 06:47:28 crc kubenswrapper[4475]: I1203 06:47:28.198345 4475 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-api/machine-api-operator-5694c8668f-n4bbj" Dec 03 06:47:28 crc kubenswrapper[4475]: I1203 06:47:28.201377 4475 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"console-operator-config" Dec 03 06:47:28 crc kubenswrapper[4475]: I1203 06:47:28.202045 4475 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"serving-cert" Dec 03 06:47:28 crc kubenswrapper[4475]: I1203 06:47:28.204228 4475 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Dec 03 06:47:28 crc kubenswrapper[4475]: I1203 06:47:28.204893 4475 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"openshift-service-ca.crt" Dec 03 06:47:28 crc kubenswrapper[4475]: I1203 06:47:28.205571 4475 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"openshift-service-ca.crt" Dec 03 06:47:28 crc kubenswrapper[4475]: I1203 06:47:28.205768 4475 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-oauth-apiserver"/"etcd-client" Dec 03 06:47:28 crc kubenswrapper[4475]: I1203 06:47:28.205865 4475 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-6r542"] Dec 03 06:47:28 crc kubenswrapper[4475]: I1203 06:47:28.206105 4475 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"console-operator-dockercfg-4xjcr" Dec 03 06:47:28 crc kubenswrapper[4475]: I1203 06:47:28.206196 4475 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-6r542" Dec 03 06:47:28 crc kubenswrapper[4475]: I1203 06:47:28.206242 4475 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"kube-root-ca.crt" Dec 03 06:47:28 crc kubenswrapper[4475]: I1203 06:47:28.209258 4475 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"audit-1" Dec 03 06:47:28 crc kubenswrapper[4475]: I1203 06:47:28.209403 4475 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Dec 03 06:47:28 crc kubenswrapper[4475]: I1203 06:47:28.209600 4475 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"machine-api-operator-images" Dec 03 06:47:28 crc kubenswrapper[4475]: I1203 06:47:28.221263 4475 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"service-ca" Dec 03 06:47:28 crc kubenswrapper[4475]: I1203 06:47:28.221670 4475 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-dockercfg-f62pw" Dec 03 06:47:28 crc kubenswrapper[4475]: I1203 06:47:28.222125 4475 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-oauth-config" Dec 03 06:47:28 crc kubenswrapper[4475]: I1203 06:47:28.222759 4475 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-oauth-apiserver"/"openshift-service-ca.crt" Dec 03 06:47:28 crc kubenswrapper[4475]: I1203 06:47:28.223371 4475 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"oauth-serving-cert" Dec 03 06:47:28 crc kubenswrapper[4475]: I1203 06:47:28.223906 4475 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"encryption-config-1" Dec 03 06:47:28 crc kubenswrapper[4475]: I1203 06:47:28.225636 4475 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-machine-approver/machine-approver-56656f9798-8pdgn"] Dec 03 06:47:28 crc kubenswrapper[4475]: I1203 06:47:28.235256 4475 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-rbac-proxy" Dec 03 06:47:28 crc kubenswrapper[4475]: I1203 06:47:28.235466 4475 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"etcd-serving-ca" Dec 03 06:47:28 crc kubenswrapper[4475]: I1203 06:47:28.235557 4475 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Dec 03 06:47:28 crc kubenswrapper[4475]: I1203 06:47:28.235688 4475 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"trusted-ca-bundle" Dec 03 06:47:28 crc kubenswrapper[4475]: I1203 06:47:28.235708 4475 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-8pdgn" Dec 03 06:47:28 crc kubenswrapper[4475]: I1203 06:47:28.236298 4475 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2" Dec 03 06:47:28 crc kubenswrapper[4475]: I1203 06:47:28.236382 4475 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca" Dec 03 06:47:28 crc kubenswrapper[4475]: I1203 06:47:28.236504 4475 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"openshift-config-operator-dockercfg-7pc5z" Dec 03 06:47:28 crc kubenswrapper[4475]: I1203 06:47:28.236528 4475 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"serving-cert" Dec 03 06:47:28 crc kubenswrapper[4475]: I1203 06:47:28.236663 4475 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-operator-tls" Dec 03 06:47:28 crc kubenswrapper[4475]: I1203 06:47:28.236688 4475 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"openshift-service-ca.crt" Dec 03 06:47:28 crc kubenswrapper[4475]: I1203 06:47:28.236755 4475 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"config-operator-serving-cert" Dec 03 06:47:28 crc kubenswrapper[4475]: I1203 06:47:28.236769 4475 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"kube-root-ca.crt" Dec 03 06:47:28 crc kubenswrapper[4475]: I1203 06:47:28.236963 4475 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config" Dec 03 06:47:28 crc kubenswrapper[4475]: I1203 06:47:28.237044 4475 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-service-ca" Dec 03 
06:47:28 crc kubenswrapper[4475]: I1203 06:47:28.237137 4475 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"cluster-image-registry-operator-dockercfg-m4qtx" Dec 03 06:47:28 crc kubenswrapper[4475]: I1203 06:47:28.237149 4475 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"audit" Dec 03 06:47:28 crc kubenswrapper[4475]: I1203 06:47:28.237223 4475 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-tls" Dec 03 06:47:28 crc kubenswrapper[4475]: I1203 06:47:28.237296 4475 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-idp-0-file-data" Dec 03 06:47:28 crc kubenswrapper[4475]: I1203 06:47:28.237422 4475 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"kube-root-ca.crt" Dec 03 06:47:28 crc kubenswrapper[4475]: I1203 06:47:28.237574 4475 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-root-ca.crt" Dec 03 06:47:28 crc kubenswrapper[4475]: I1203 06:47:28.237651 4475 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-cliconfig" Dec 03 06:47:28 crc kubenswrapper[4475]: I1203 06:47:28.237823 4475 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Dec 03 06:47:28 crc kubenswrapper[4475]: I1203 06:47:28.237991 4475 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-dockercfg-mfbb7" Dec 03 06:47:28 crc kubenswrapper[4475]: I1203 06:47:28.238094 4475 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-error" Dec 03 06:47:28 crc kubenswrapper[4475]: I1203 06:47:28.239263 4475 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-controller-manager"/"client-ca" Dec 03 06:47:28 crc kubenswrapper[4475]: I1203 06:47:28.239814 4475 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-s9rqq"] Dec 03 06:47:28 crc kubenswrapper[4475]: I1203 06:47:28.240093 4475 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-apiserver/apiserver-76f77b778f-jf25k"] Dec 03 06:47:28 crc kubenswrapper[4475]: I1203 06:47:28.240294 4475 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-router-certs" Dec 03 06:47:28 crc kubenswrapper[4475]: I1203 06:47:28.240526 4475 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-apiserver/apiserver-76f77b778f-jf25k" Dec 03 06:47:28 crc kubenswrapper[4475]: I1203 06:47:28.240602 4475 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console/downloads-7954f5f757-dbjfd"] Dec 03 06:47:28 crc kubenswrapper[4475]: I1203 06:47:28.240880 4475 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-s9rqq" Dec 03 06:47:28 crc kubenswrapper[4475]: I1203 06:47:28.240916 4475 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/downloads-7954f5f757-dbjfd" Dec 03 06:47:28 crc kubenswrapper[4475]: I1203 06:47:28.240534 4475 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"oauth-apiserver-sa-dockercfg-6r2bq" Dec 03 06:47:28 crc kubenswrapper[4475]: I1203 06:47:28.240682 4475 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"kube-root-ca.crt" Dec 03 06:47:28 crc kubenswrapper[4475]: I1203 06:47:28.243588 4475 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-xklvd"] Dec 03 06:47:28 crc kubenswrapper[4475]: I1203 06:47:28.244187 4475 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-xklvd" Dec 03 06:47:28 crc kubenswrapper[4475]: I1203 06:47:28.247490 4475 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-authentication-operator/authentication-operator-69f744f599-6lqs9"] Dec 03 06:47:28 crc kubenswrapper[4475]: I1203 06:47:28.247931 4475 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication-operator/authentication-operator-69f744f599-6lqs9" Dec 03 06:47:28 crc kubenswrapper[4475]: I1203 06:47:28.248099 4475 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-operator-74547568cd-gcnjp"] Dec 03 06:47:28 crc kubenswrapper[4475]: I1203 06:47:28.248536 4475 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-gcnjp" Dec 03 06:47:28 crc kubenswrapper[4475]: I1203 06:47:28.251912 4475 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"trusted-ca" Dec 03 06:47:28 crc kubenswrapper[4475]: I1203 06:47:28.253595 4475 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert" Dec 03 06:47:28 crc kubenswrapper[4475]: I1203 06:47:28.253866 4475 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Dec 03 06:47:28 crc kubenswrapper[4475]: I1203 06:47:28.255212 4475 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"oauth-openshift-dockercfg-znhcc" Dec 03 06:47:28 crc kubenswrapper[4475]: I1203 06:47:28.255694 4475 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"openshift-service-ca.crt" Dec 03 06:47:28 crc kubenswrapper[4475]: I1203 06:47:28.255838 4475 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-provider-selection" Dec 03 06:47:28 crc kubenswrapper[4475]: I1203 06:47:28.255893 4475 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-session" Dec 03 06:47:28 crc kubenswrapper[4475]: I1203 06:47:28.255962 4475 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c" Dec 03 06:47:28 crc kubenswrapper[4475]: I1203 06:47:28.256108 4475 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-serving-cert" Dec 03 06:47:28 crc kubenswrapper[4475]: I1203 06:47:28.256175 4475 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-authentication"/"v4-0-config-system-trusted-ca-bundle" Dec 03 06:47:28 crc kubenswrapper[4475]: I1203 06:47:28.256210 4475 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-login" Dec 03 06:47:28 crc kubenswrapper[4475]: I1203 06:47:28.256183 4475 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-service-ca.crt" Dec 03 06:47:28 crc kubenswrapper[4475]: I1203 06:47:28.256345 4475 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"openshift-service-ca.crt" Dec 03 06:47:28 crc kubenswrapper[4475]: I1203 06:47:28.256405 4475 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"audit-1" Dec 03 06:47:28 crc kubenswrapper[4475]: I1203 06:47:28.256411 4475 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"openshift-service-ca.crt" Dec 03 06:47:28 crc kubenswrapper[4475]: I1203 06:47:28.256497 4475 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"config" Dec 03 06:47:28 crc kubenswrapper[4475]: I1203 06:47:28.256506 4475 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-rbac-proxy" Dec 03 06:47:28 crc kubenswrapper[4475]: I1203 06:47:28.256564 4475 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"machine-approver-config" Dec 03 06:47:28 crc kubenswrapper[4475]: I1203 06:47:28.256600 4475 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"image-import-ca" Dec 03 06:47:28 crc kubenswrapper[4475]: I1203 06:47:28.256628 4475 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"etcd-serving-ca" Dec 03 06:47:28 crc kubenswrapper[4475]: I1203 06:47:28.256661 4475 reflector.go:368] Caches populated for 
*v1.ConfigMap from object-"openshift-authentication"/"openshift-service-ca.crt" Dec 03 06:47:28 crc kubenswrapper[4475]: I1203 06:47:28.256685 4475 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-tls" Dec 03 06:47:28 crc kubenswrapper[4475]: I1203 06:47:28.256687 4475 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"etcd-client" Dec 03 06:47:28 crc kubenswrapper[4475]: I1203 06:47:28.256765 4475 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"kube-root-ca.crt" Dec 03 06:47:28 crc kubenswrapper[4475]: I1203 06:47:28.256781 4475 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"console-config" Dec 03 06:47:28 crc kubenswrapper[4475]: I1203 06:47:28.258480 4475 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-kfvwc"] Dec 03 06:47:28 crc kubenswrapper[4475]: I1203 06:47:28.258790 4475 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-48hz9"] Dec 03 06:47:28 crc kubenswrapper[4475]: I1203 06:47:28.259065 4475 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-48hz9" Dec 03 06:47:28 crc kubenswrapper[4475]: I1203 06:47:28.259218 4475 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-kfvwc" Dec 03 06:47:28 crc kubenswrapper[4475]: I1203 06:47:28.259924 4475 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"machine-config-operator-images" Dec 03 06:47:28 crc kubenswrapper[4475]: I1203 06:47:28.259980 4475 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"kube-root-ca.crt" Dec 03 06:47:28 crc kubenswrapper[4475]: I1203 06:47:28.260162 4475 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"serving-cert" Dec 03 06:47:28 crc kubenswrapper[4475]: I1203 06:47:28.259987 4475 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-operator-dockercfg-98p87" Dec 03 06:47:28 crc kubenswrapper[4475]: I1203 06:47:28.260285 4475 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"authentication-operator-config" Dec 03 06:47:28 crc kubenswrapper[4475]: I1203 06:47:28.260115 4475 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"openshift-service-ca.crt" Dec 03 06:47:28 crc kubenswrapper[4475]: I1203 06:47:28.260258 4475 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"service-ca-bundle" Dec 03 06:47:28 crc kubenswrapper[4475]: I1203 06:47:28.260440 4475 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"authentication-operator-dockercfg-mz9bj" Dec 03 06:47:28 crc kubenswrapper[4475]: I1203 06:47:28.260536 4475 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"samples-operator-tls" Dec 03 06:47:28 crc kubenswrapper[4475]: I1203 06:47:28.260628 4475 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-authentication-operator"/"serving-cert"
Dec 03 06:47:28 crc kubenswrapper[4475]: I1203 06:47:28.260695 4475 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-serving-cert"
Dec 03 06:47:28 crc kubenswrapper[4475]: I1203 06:47:28.266682 4475 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"kube-root-ca.crt"
Dec 03 06:47:28 crc kubenswrapper[4475]: I1203 06:47:28.266908 4475 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-root-ca.crt"
Dec 03 06:47:28 crc kubenswrapper[4475]: I1203 06:47:28.267014 4475 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"default-dockercfg-chnjx"
Dec 03 06:47:28 crc kubenswrapper[4475]: I1203 06:47:28.267115 4475 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"openshift-apiserver-sa-dockercfg-djjff"
Dec 03 06:47:28 crc kubenswrapper[4475]: I1203 06:47:28.267199 4475 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"encryption-config-1"
Dec 03 06:47:28 crc kubenswrapper[4475]: I1203 06:47:28.268720 4475 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-serving-cert"
Dec 03 06:47:28 crc kubenswrapper[4475]: I1203 06:47:28.268847 4475 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt"
Dec 03 06:47:28 crc kubenswrapper[4475]: I1203 06:47:28.268959 4475 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"kube-root-ca.crt"
Dec 03 06:47:28 crc kubenswrapper[4475]: I1203 06:47:28.269066 4475 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"kube-root-ca.crt"
Dec 03 06:47:28 crc kubenswrapper[4475]: I1203 06:47:28.260485 4475 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"cluster-samples-operator-dockercfg-xpp9w"
Dec 03 06:47:28 crc kubenswrapper[4475]: I1203 06:47:28.269725 4475 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-service-ca/service-ca-9c57cc56f-9dd2n"]
Dec 03 06:47:28 crc kubenswrapper[4475]: I1203 06:47:28.271017 4475 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-controller-84d6567774-xv2gh"]
Dec 03 06:47:28 crc kubenswrapper[4475]: I1203 06:47:28.272850 4475 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-xv2gh"
Dec 03 06:47:28 crc kubenswrapper[4475]: I1203 06:47:28.273127 4475 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca/service-ca-9c57cc56f-9dd2n"
Dec 03 06:47:28 crc kubenswrapper[4475]: I1203 06:47:28.289860 4475 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"trusted-ca-bundle"
Dec 03 06:47:28 crc kubenswrapper[4475]: I1203 06:47:28.290105 4475 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"openshift-service-ca.crt"
Dec 03 06:47:28 crc kubenswrapper[4475]: I1203 06:47:28.290523 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h6g4x\" (UniqueName: \"kubernetes.io/projected/30615409-a282-4405-afab-4802d9c27a3a-kube-api-access-h6g4x\") pod \"route-controller-manager-6576b87f9c-dlggp\" (UID: \"30615409-a282-4405-afab-4802d9c27a3a\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-dlggp"
Dec 03 06:47:28 crc kubenswrapper[4475]: I1203 06:47:28.290559 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/fab1c20e-bbf0-442f-ada0-5647d493ad6c-encryption-config\") pod \"apiserver-7bbb656c7d-tjg56\" (UID: \"fab1c20e-bbf0-442f-ada0-5647d493ad6c\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-tjg56"
Dec 03 06:47:28 crc kubenswrapper[4475]: I1203 06:47:28.290577 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/d52a94b2-a290-48af-b060-5f3662029280-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-558db77b4-6r542\" (UID: \"d52a94b2-a290-48af-b060-5f3662029280\") " pod="openshift-authentication/oauth-openshift-558db77b4-6r542"
Dec 03 06:47:28 crc kubenswrapper[4475]: I1203 06:47:28.290594 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/3263d9b9-b7e8-4758-a6a0-85749e84317a-client-ca\") pod \"controller-manager-879f6c89f-7kcnv\" (UID: \"3263d9b9-b7e8-4758-a6a0-85749e84317a\") " pod="openshift-controller-manager/controller-manager-879f6c89f-7kcnv"
Dec 03 06:47:28 crc kubenswrapper[4475]: I1203 06:47:28.290618 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/3263d9b9-b7e8-4758-a6a0-85749e84317a-proxy-ca-bundles\") pod \"controller-manager-879f6c89f-7kcnv\" (UID: \"3263d9b9-b7e8-4758-a6a0-85749e84317a\") " pod="openshift-controller-manager/controller-manager-879f6c89f-7kcnv"
Dec 03 06:47:28 crc kubenswrapper[4475]: I1203 06:47:28.290634 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/64eee81d-9ee2-4f0a-a95d-f32f9159e2a4-machine-api-operator-tls\") pod \"machine-api-operator-5694c8668f-n4bbj\" (UID: \"64eee81d-9ee2-4f0a-a95d-f32f9159e2a4\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-n4bbj"
Dec 03 06:47:28 crc kubenswrapper[4475]: I1203 06:47:28.290648 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/09928a8e-a70b-4916-9ae2-4dbe952aa514-trusted-ca-bundle\") pod \"console-f9d7485db-dbxhk\" (UID: \"09928a8e-a70b-4916-9ae2-4dbe952aa514\") " pod="openshift-console/console-f9d7485db-dbxhk"
Dec 03 06:47:28 crc kubenswrapper[4475]: I1203 06:47:28.290660 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/fab1c20e-bbf0-442f-ada0-5647d493ad6c-audit-dir\") pod \"apiserver-7bbb656c7d-tjg56\" (UID: \"fab1c20e-bbf0-442f-ada0-5647d493ad6c\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-tjg56"
Dec 03 06:47:28 crc kubenswrapper[4475]: I1203 06:47:28.290673 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2bp2w\" (UniqueName: \"kubernetes.io/projected/fab1c20e-bbf0-442f-ada0-5647d493ad6c-kube-api-access-2bp2w\") pod \"apiserver-7bbb656c7d-tjg56\" (UID: \"fab1c20e-bbf0-442f-ada0-5647d493ad6c\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-tjg56"
Dec 03 06:47:28 crc kubenswrapper[4475]: I1203 06:47:28.290686 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/d52a94b2-a290-48af-b060-5f3662029280-audit-dir\") pod \"oauth-openshift-558db77b4-6r542\" (UID: \"d52a94b2-a290-48af-b060-5f3662029280\") " pod="openshift-authentication/oauth-openshift-558db77b4-6r542"
Dec 03 06:47:28 crc kubenswrapper[4475]: I1203 06:47:28.290701 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/dea9abbf-a733-493b-b807-70ee9fa19fd1-config\") pod \"console-operator-58897d9998-7m86k\" (UID: \"dea9abbf-a733-493b-b807-70ee9fa19fd1\") " pod="openshift-console-operator/console-operator-58897d9998-7m86k"
Dec 03 06:47:28 crc kubenswrapper[4475]: I1203 06:47:28.290715 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/09928a8e-a70b-4916-9ae2-4dbe952aa514-console-config\") pod \"console-f9d7485db-dbxhk\" (UID: \"09928a8e-a70b-4916-9ae2-4dbe952aa514\") " pod="openshift-console/console-f9d7485db-dbxhk"
Dec 03 06:47:28 crc kubenswrapper[4475]: I1203 06:47:28.290732 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8smn2\" (UniqueName: \"kubernetes.io/projected/09928a8e-a70b-4916-9ae2-4dbe952aa514-kube-api-access-8smn2\") pod \"console-f9d7485db-dbxhk\" (UID: \"09928a8e-a70b-4916-9ae2-4dbe952aa514\") " pod="openshift-console/console-f9d7485db-dbxhk"
Dec 03 06:47:28 crc kubenswrapper[4475]: I1203 06:47:28.290745 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d52a94b2-a290-48af-b060-5f3662029280-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-558db77b4-6r542\" (UID: \"d52a94b2-a290-48af-b060-5f3662029280\") " pod="openshift-authentication/oauth-openshift-558db77b4-6r542"
Dec 03 06:47:28 crc kubenswrapper[4475]: I1203 06:47:28.290759 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/30615409-a282-4405-afab-4802d9c27a3a-config\") pod \"route-controller-manager-6576b87f9c-dlggp\" (UID: \"30615409-a282-4405-afab-4802d9c27a3a\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-dlggp"
Dec 03 06:47:28 crc kubenswrapper[4475]: I1203 06:47:28.290772 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/065ad72f-f4c2-4d51-a856-a915ad7f555b-trusted-ca\") pod \"cluster-image-registry-operator-dc59b4c8b-lw5ml\" (UID: \"065ad72f-f4c2-4d51-a856-a915ad7f555b\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-lw5ml"
Dec 03 06:47:28 crc kubenswrapper[4475]: I1203 06:47:28.290787 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/d52a94b2-a290-48af-b060-5f3662029280-v4-0-config-system-router-certs\") pod \"oauth-openshift-558db77b4-6r542\" (UID: \"d52a94b2-a290-48af-b060-5f3662029280\") " pod="openshift-authentication/oauth-openshift-558db77b4-6r542"
Dec 03 06:47:28 crc kubenswrapper[4475]: I1203 06:47:28.290814 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5spc7\" (UniqueName: \"kubernetes.io/projected/065ad72f-f4c2-4d51-a856-a915ad7f555b-kube-api-access-5spc7\") pod \"cluster-image-registry-operator-dc59b4c8b-lw5ml\" (UID: \"065ad72f-f4c2-4d51-a856-a915ad7f555b\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-lw5ml"
Dec 03 06:47:28 crc kubenswrapper[4475]: I1203 06:47:28.290830 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/fab1c20e-bbf0-442f-ada0-5647d493ad6c-etcd-serving-ca\") pod \"apiserver-7bbb656c7d-tjg56\" (UID: \"fab1c20e-bbf0-442f-ada0-5647d493ad6c\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-tjg56"
Dec 03 06:47:28 crc kubenswrapper[4475]: I1203 06:47:28.290844 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/65b544d3-889f-4b29-ba88-961ad04782bf-available-featuregates\") pod \"openshift-config-operator-7777fb866f-q5cjz\" (UID: \"65b544d3-889f-4b29-ba88-961ad04782bf\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-q5cjz"
Dec 03 06:47:28 crc kubenswrapper[4475]: I1203 06:47:28.290861 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/d52a94b2-a290-48af-b060-5f3662029280-v4-0-config-system-cliconfig\") pod \"oauth-openshift-558db77b4-6r542\" (UID: \"d52a94b2-a290-48af-b060-5f3662029280\") " pod="openshift-authentication/oauth-openshift-558db77b4-6r542"
Dec 03 06:47:28 crc kubenswrapper[4475]: I1203 06:47:28.290875 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/3263d9b9-b7e8-4758-a6a0-85749e84317a-serving-cert\") pod \"controller-manager-879f6c89f-7kcnv\" (UID: \"3263d9b9-b7e8-4758-a6a0-85749e84317a\") " pod="openshift-controller-manager/controller-manager-879f6c89f-7kcnv"
Dec 03 06:47:28 crc kubenswrapper[4475]: I1203 06:47:28.290892 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/fab1c20e-bbf0-442f-ada0-5647d493ad6c-trusted-ca-bundle\") pod \"apiserver-7bbb656c7d-tjg56\" (UID: \"fab1c20e-bbf0-442f-ada0-5647d493ad6c\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-tjg56"
Dec 03 06:47:28 crc kubenswrapper[4475]: I1203 06:47:28.290907 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/dcef9514-d760-40e8-9054-75b17a2dde9f-auth-proxy-config\") pod \"machine-approver-56656f9798-8pdgn\" (UID: \"dcef9514-d760-40e8-9054-75b17a2dde9f\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-8pdgn"
Dec 03 06:47:28 crc kubenswrapper[4475]: I1203 06:47:28.290922 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/fab1c20e-bbf0-442f-ada0-5647d493ad6c-audit-policies\") pod \"apiserver-7bbb656c7d-tjg56\" (UID: \"fab1c20e-bbf0-442f-ada0-5647d493ad6c\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-tjg56"
Dec 03 06:47:28 crc kubenswrapper[4475]: I1203 06:47:28.290937 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gj6wg\" (UniqueName: \"kubernetes.io/projected/dcef9514-d760-40e8-9054-75b17a2dde9f-kube-api-access-gj6wg\") pod \"machine-approver-56656f9798-8pdgn\" (UID: \"dcef9514-d760-40e8-9054-75b17a2dde9f\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-8pdgn"
Dec 03 06:47:28 crc kubenswrapper[4475]: I1203 06:47:28.290949 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/d52a94b2-a290-48af-b060-5f3662029280-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-558db77b4-6r542\" (UID: \"d52a94b2-a290-48af-b060-5f3662029280\") " pod="openshift-authentication/oauth-openshift-558db77b4-6r542"
Dec 03 06:47:28 crc kubenswrapper[4475]: I1203 06:47:28.290965 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/065ad72f-f4c2-4d51-a856-a915ad7f555b-image-registry-operator-tls\") pod \"cluster-image-registry-operator-dc59b4c8b-lw5ml\" (UID: \"065ad72f-f4c2-4d51-a856-a915ad7f555b\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-lw5ml"
Dec 03 06:47:28 crc kubenswrapper[4475]: I1203 06:47:28.290978 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/fab1c20e-bbf0-442f-ada0-5647d493ad6c-etcd-client\") pod \"apiserver-7bbb656c7d-tjg56\" (UID: \"fab1c20e-bbf0-442f-ada0-5647d493ad6c\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-tjg56"
Dec 03 06:47:28 crc kubenswrapper[4475]: I1203 06:47:28.290991 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/dcef9514-d760-40e8-9054-75b17a2dde9f-config\") pod \"machine-approver-56656f9798-8pdgn\" (UID: \"dcef9514-d760-40e8-9054-75b17a2dde9f\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-8pdgn"
Dec 03 06:47:28 crc kubenswrapper[4475]: I1203 06:47:28.291005 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/d52a94b2-a290-48af-b060-5f3662029280-v4-0-config-system-session\") pod \"oauth-openshift-558db77b4-6r542\" (UID: \"d52a94b2-a290-48af-b060-5f3662029280\") " pod="openshift-authentication/oauth-openshift-558db77b4-6r542"
Dec 03 06:47:28 crc kubenswrapper[4475]: I1203 06:47:28.291020 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/09928a8e-a70b-4916-9ae2-4dbe952aa514-service-ca\") pod \"console-f9d7485db-dbxhk\" (UID: \"09928a8e-a70b-4916-9ae2-4dbe952aa514\") " pod="openshift-console/console-f9d7485db-dbxhk"
Dec 03 06:47:28 crc kubenswrapper[4475]: I1203 06:47:28.291035 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/dea9abbf-a733-493b-b807-70ee9fa19fd1-trusted-ca\") pod \"console-operator-58897d9998-7m86k\" (UID: \"dea9abbf-a733-493b-b807-70ee9fa19fd1\") " pod="openshift-console-operator/console-operator-58897d9998-7m86k"
Dec 03 06:47:28 crc kubenswrapper[4475]: I1203 06:47:28.291048 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/64eee81d-9ee2-4f0a-a95d-f32f9159e2a4-images\") pod \"machine-api-operator-5694c8668f-n4bbj\" (UID: \"64eee81d-9ee2-4f0a-a95d-f32f9159e2a4\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-n4bbj"
Dec 03 06:47:28 crc kubenswrapper[4475]: I1203 06:47:28.291063 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/d52a94b2-a290-48af-b060-5f3662029280-audit-policies\") pod \"oauth-openshift-558db77b4-6r542\" (UID: \"d52a94b2-a290-48af-b060-5f3662029280\") " pod="openshift-authentication/oauth-openshift-558db77b4-6r542"
Dec 03 06:47:28 crc kubenswrapper[4475]: I1203 06:47:28.291077 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3263d9b9-b7e8-4758-a6a0-85749e84317a-config\") pod \"controller-manager-879f6c89f-7kcnv\" (UID: \"3263d9b9-b7e8-4758-a6a0-85749e84317a\") " pod="openshift-controller-manager/controller-manager-879f6c89f-7kcnv"
Dec 03 06:47:28 crc kubenswrapper[4475]: I1203 06:47:28.291097 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/dcef9514-d760-40e8-9054-75b17a2dde9f-machine-approver-tls\") pod \"machine-approver-56656f9798-8pdgn\" (UID: \"dcef9514-d760-40e8-9054-75b17a2dde9f\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-8pdgn"
Dec 03 06:47:28 crc kubenswrapper[4475]: I1203 06:47:28.291109 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/64eee81d-9ee2-4f0a-a95d-f32f9159e2a4-config\") pod \"machine-api-operator-5694c8668f-n4bbj\" (UID: \"64eee81d-9ee2-4f0a-a95d-f32f9159e2a4\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-n4bbj"
Dec 03 06:47:28 crc kubenswrapper[4475]: I1203 06:47:28.291124 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/09928a8e-a70b-4916-9ae2-4dbe952aa514-oauth-serving-cert\") pod \"console-f9d7485db-dbxhk\" (UID: \"09928a8e-a70b-4916-9ae2-4dbe952aa514\") " pod="openshift-console/console-f9d7485db-dbxhk"
Dec 03 06:47:28 crc kubenswrapper[4475]: I1203 06:47:28.291136 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/d52a94b2-a290-48af-b060-5f3662029280-v4-0-config-system-serving-cert\") pod \"oauth-openshift-558db77b4-6r542\" (UID: \"d52a94b2-a290-48af-b060-5f3662029280\") " pod="openshift-authentication/oauth-openshift-558db77b4-6r542"
Dec 03 06:47:28 crc kubenswrapper[4475]: I1203 06:47:28.291150 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/fab1c20e-bbf0-442f-ada0-5647d493ad6c-serving-cert\") pod \"apiserver-7bbb656c7d-tjg56\" (UID: \"fab1c20e-bbf0-442f-ada0-5647d493ad6c\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-tjg56"
Dec 03 06:47:28 crc kubenswrapper[4475]: I1203 06:47:28.291164 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/09928a8e-a70b-4916-9ae2-4dbe952aa514-console-oauth-config\") pod \"console-f9d7485db-dbxhk\" (UID: \"09928a8e-a70b-4916-9ae2-4dbe952aa514\") " pod="openshift-console/console-f9d7485db-dbxhk"
Dec 03 06:47:28 crc kubenswrapper[4475]: I1203 06:47:28.291180 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4q8tv\" (UniqueName: \"kubernetes.io/projected/64eee81d-9ee2-4f0a-a95d-f32f9159e2a4-kube-api-access-4q8tv\") pod \"machine-api-operator-5694c8668f-n4bbj\" (UID: \"64eee81d-9ee2-4f0a-a95d-f32f9159e2a4\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-n4bbj"
Dec 03 06:47:28 crc kubenswrapper[4475]: I1203 06:47:28.291194 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/d52a94b2-a290-48af-b060-5f3662029280-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-558db77b4-6r542\" (UID: \"d52a94b2-a290-48af-b060-5f3662029280\") " pod="openshift-authentication/oauth-openshift-558db77b4-6r542"
Dec 03 06:47:28 crc kubenswrapper[4475]: I1203 06:47:28.291209 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/d52a94b2-a290-48af-b060-5f3662029280-v4-0-config-user-template-login\") pod \"oauth-openshift-558db77b4-6r542\" (UID: \"d52a94b2-a290-48af-b060-5f3662029280\") " pod="openshift-authentication/oauth-openshift-558db77b4-6r542"
Dec 03 06:47:28 crc kubenswrapper[4475]: I1203 06:47:28.291225 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fvgf8\" (UniqueName: \"kubernetes.io/projected/3263d9b9-b7e8-4758-a6a0-85749e84317a-kube-api-access-fvgf8\") pod \"controller-manager-879f6c89f-7kcnv\" (UID: \"3263d9b9-b7e8-4758-a6a0-85749e84317a\") " pod="openshift-controller-manager/controller-manager-879f6c89f-7kcnv"
Dec 03 06:47:28 crc kubenswrapper[4475]: I1203 06:47:28.291238 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/065ad72f-f4c2-4d51-a856-a915ad7f555b-bound-sa-token\") pod \"cluster-image-registry-operator-dc59b4c8b-lw5ml\" (UID: \"065ad72f-f4c2-4d51-a856-a915ad7f555b\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-lw5ml"
Dec 03 06:47:28 crc kubenswrapper[4475]: I1203 06:47:28.291252 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/65b544d3-889f-4b29-ba88-961ad04782bf-serving-cert\") pod \"openshift-config-operator-7777fb866f-q5cjz\" (UID: \"65b544d3-889f-4b29-ba88-961ad04782bf\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-q5cjz"
Dec 03 06:47:28 crc kubenswrapper[4475]: I1203 06:47:28.291273 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8fds5\" (UniqueName: \"kubernetes.io/projected/dea9abbf-a733-493b-b807-70ee9fa19fd1-kube-api-access-8fds5\") pod \"console-operator-58897d9998-7m86k\" (UID: \"dea9abbf-a733-493b-b807-70ee9fa19fd1\") " pod="openshift-console-operator/console-operator-58897d9998-7m86k"
Dec 03 06:47:28 crc kubenswrapper[4475]: I1203 06:47:28.291288 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6xk76\" (UniqueName: \"kubernetes.io/projected/65b544d3-889f-4b29-ba88-961ad04782bf-kube-api-access-6xk76\") pod \"openshift-config-operator-7777fb866f-q5cjz\" (UID: \"65b544d3-889f-4b29-ba88-961ad04782bf\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-q5cjz"
Dec 03 06:47:28 crc kubenswrapper[4475]: I1203 06:47:28.291303 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/d52a94b2-a290-48af-b060-5f3662029280-v4-0-config-system-service-ca\") pod \"oauth-openshift-558db77b4-6r542\" (UID: \"d52a94b2-a290-48af-b060-5f3662029280\") " pod="openshift-authentication/oauth-openshift-558db77b4-6r542"
Dec 03 06:47:28 crc kubenswrapper[4475]: I1203 06:47:28.291318 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/dea9abbf-a733-493b-b807-70ee9fa19fd1-serving-cert\") pod \"console-operator-58897d9998-7m86k\" (UID: \"dea9abbf-a733-493b-b807-70ee9fa19fd1\") " pod="openshift-console-operator/console-operator-58897d9998-7m86k"
Dec 03 06:47:28 crc kubenswrapper[4475]: I1203 06:47:28.291331 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rx6ft\" (UniqueName: \"kubernetes.io/projected/d52a94b2-a290-48af-b060-5f3662029280-kube-api-access-rx6ft\") pod \"oauth-openshift-558db77b4-6r542\" (UID: \"d52a94b2-a290-48af-b060-5f3662029280\") " pod="openshift-authentication/oauth-openshift-558db77b4-6r542"
Dec 03 06:47:28 crc kubenswrapper[4475]: I1203 06:47:28.291347 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/d52a94b2-a290-48af-b060-5f3662029280-v4-0-config-user-template-error\") pod \"oauth-openshift-558db77b4-6r542\" (UID: \"d52a94b2-a290-48af-b060-5f3662029280\") " pod="openshift-authentication/oauth-openshift-558db77b4-6r542"
Dec 03 06:47:28 crc kubenswrapper[4475]: I1203 06:47:28.291370 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/30615409-a282-4405-afab-4802d9c27a3a-client-ca\") pod \"route-controller-manager-6576b87f9c-dlggp\" (UID: \"30615409-a282-4405-afab-4802d9c27a3a\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-dlggp"
Dec 03 06:47:28 crc kubenswrapper[4475]: I1203 06:47:28.291383 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/30615409-a282-4405-afab-4802d9c27a3a-serving-cert\") pod \"route-controller-manager-6576b87f9c-dlggp\" (UID: \"30615409-a282-4405-afab-4802d9c27a3a\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-dlggp"
Dec 03 06:47:28 crc kubenswrapper[4475]: I1203 06:47:28.291396 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/09928a8e-a70b-4916-9ae2-4dbe952aa514-console-serving-cert\") pod \"console-f9d7485db-dbxhk\" (UID: \"09928a8e-a70b-4916-9ae2-4dbe952aa514\") " pod="openshift-console/console-f9d7485db-dbxhk"
Dec 03 06:47:28 crc kubenswrapper[4475]: I1203 06:47:28.291466 4475 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-config"
Dec 03 06:47:28 crc kubenswrapper[4475]: I1203 06:47:28.295115 4475 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ingress/router-default-5444994796-wkrx4"]
Dec 03 06:47:28 crc kubenswrapper[4475]: I1203 06:47:28.295535 4475 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-sa-dockercfg-nl2j4"
Dec 03 06:47:28 crc kubenswrapper[4475]: I1203 06:47:28.296738 4475 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"trusted-ca-bundle"
Dec 03 06:47:28 crc kubenswrapper[4475]: I1203 06:47:28.298198 4475 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/router-default-5444994796-wkrx4"
Dec 03 06:47:28 crc kubenswrapper[4475]: I1203 06:47:28.298577 4475 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-storage-version-migrator/migrator-59844c95c7-4845w"]
Dec 03 06:47:28 crc kubenswrapper[4475]: I1203 06:47:28.299952 4475 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-4845w"
Dec 03 06:47:28 crc kubenswrapper[4475]: I1203 06:47:28.301560 4475 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-dcjv5"]
Dec 03 06:47:28 crc kubenswrapper[4475]: I1203 06:47:28.302103 4475 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-dcjv5"
Dec 03 06:47:28 crc kubenswrapper[4475]: I1203 06:47:28.302514 4475 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-4mfpz"]
Dec 03 06:47:28 crc kubenswrapper[4475]: I1203 06:47:28.302893 4475 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-4mfpz"
Dec 03 06:47:28 crc kubenswrapper[4475]: I1203 06:47:28.303160 4475 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-dockercfg-xtcjv"
Dec 03 06:47:28 crc kubenswrapper[4475]: I1203 06:47:28.306110 4475 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca"
Dec 03 06:47:28 crc kubenswrapper[4475]: I1203 06:47:28.306264 4475 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mco-proxy-tls"
Dec 03 06:47:28 crc kubenswrapper[4475]: I1203 06:47:28.307889 4475 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-ocp-branding-template"
Dec 03 06:47:28 crc kubenswrapper[4475]: I1203 06:47:28.308863 4475 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-tls"
Dec 03 06:47:28 crc kubenswrapper[4475]: I1203 06:47:28.308935 4475 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"trusted-ca-bundle"
Dec 03 06:47:28 crc kubenswrapper[4475]: I1203 06:47:28.309090 4475 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"trusted-ca"
Dec 03 06:47:28 crc kubenswrapper[4475]: I1203 06:47:28.309333 4475 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-6gvwt"]
Dec 03 06:47:28 crc kubenswrapper[4475]: I1203 06:47:28.309706 4475 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-6gvwt"
Dec 03 06:47:28 crc kubenswrapper[4475]: I1203 06:47:28.310211 4475 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-mk54s"]
Dec 03 06:47:28 crc kubenswrapper[4475]: I1203 06:47:28.310533 4475 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-mk54s"
Dec 03 06:47:28 crc kubenswrapper[4475]: I1203 06:47:28.311666 4475 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/catalog-operator-68c6474976-q47j6"]
Dec 03 06:47:28 crc kubenswrapper[4475]: I1203 06:47:28.311960 4475 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-t2thq"]
Dec 03 06:47:28 crc kubenswrapper[4475]: I1203 06:47:28.312228 4475 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-t2thq"
Dec 03 06:47:28 crc kubenswrapper[4475]: I1203 06:47:28.312381 4475 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-q47j6"
Dec 03 06:47:28 crc kubenswrapper[4475]: I1203 06:47:28.312406 4475 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-6bm4d"]
Dec 03 06:47:28 crc kubenswrapper[4475]: I1203 06:47:28.312812 4475 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-6bm4d"
Dec 03 06:47:28 crc kubenswrapper[4475]: I1203 06:47:28.314887 4475 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-x5sj6"]
Dec 03 06:47:28 crc kubenswrapper[4475]: I1203 06:47:28.315138 4475 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-9tsjv"]
Dec 03 06:47:28 crc kubenswrapper[4475]: I1203 06:47:28.315371 4475 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-service-ca-operator/service-ca-operator-777779d784-mw2kv"]
Dec 03 06:47:28 crc kubenswrapper[4475]: I1203 06:47:28.315642 4475 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca-operator/service-ca-operator-777779d784-mw2kv"
Dec 03 06:47:28 crc kubenswrapper[4475]: I1203 06:47:28.315942 4475 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-x5sj6"
Dec 03 06:47:28 crc kubenswrapper[4475]: I1203 06:47:28.316065 4475 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-9tsjv"
Dec 03 06:47:28 crc kubenswrapper[4475]: I1203 06:47:28.318517 4475 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/multus-admission-controller-857f4d67dd-bghqv"]
Dec 03 06:47:28 crc kubenswrapper[4475]: I1203 06:47:28.318925 4475 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-28t76"]
Dec 03 06:47:28 crc kubenswrapper[4475]: I1203 06:47:28.319201 4475 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-28t76"
Dec 03 06:47:28 crc kubenswrapper[4475]: I1203 06:47:28.319351 4475 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-admission-controller-857f4d67dd-bghqv"
Dec 03 06:47:28 crc kubenswrapper[4475]: I1203 06:47:28.321413 4475 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29412405-wwr7n"]
Dec 03 06:47:28 crc kubenswrapper[4475]: I1203 06:47:28.321757 4475 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29412405-wwr7n"
Dec 03 06:47:28 crc kubenswrapper[4475]: I1203 06:47:28.322680 4475 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-dns-operator/dns-operator-744455d44c-vfxhs"]
Dec 03 06:47:28 crc kubenswrapper[4475]: I1203 06:47:28.323640 4475 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ingress-operator/ingress-operator-5b745b69d9-lgjdg"]
Dec 03 06:47:28 crc kubenswrapper[4475]: I1203 06:47:28.323768 4475 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns-operator/dns-operator-744455d44c-vfxhs"
Dec 03 06:47:28 crc kubenswrapper[4475]: I1203 06:47:28.324063 4475 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-lgjdg"
Dec 03 06:47:28 crc kubenswrapper[4475]: I1203 06:47:28.326659 4475 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-etcd-operator/etcd-operator-b45778765-chxcn"]
Dec 03 06:47:28 crc kubenswrapper[4475]: I1203 06:47:28.327199 4475 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-7kcnv"]
Dec 03 06:47:28 crc kubenswrapper[4475]: I1203 06:47:28.327253 4475 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-etcd-operator/etcd-operator-b45778765-chxcn"
Dec 03 06:47:28 crc kubenswrapper[4475]: I1203 06:47:28.327943 4475 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"openshift-service-ca.crt"
Dec 03 06:47:28 crc kubenswrapper[4475]: I1203 06:47:28.331649 4475 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-dockercfg-k9rxt"
Dec 03 06:47:28 crc kubenswrapper[4475]: I1203 06:47:28.331757 4475 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-f9d7485db-dbxhk"]
Dec 03 06:47:28 crc kubenswrapper[4475]: I1203 06:47:28.331780 4475 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console-operator/console-operator-58897d9998-7m86k"]
Dec 03 06:47:28 crc kubenswrapper[4475]: I1203 06:47:28.341726 4475 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-oauth-apiserver/apiserver-7bbb656c7d-tjg56"]
Dec 03 06:47:28 crc kubenswrapper[4475]: I1203 06:47:28.351767 4475 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-config-operator/openshift-config-operator-7777fb866f-q5cjz"]
Dec 03 06:47:28 crc kubenswrapper[4475]: I1203 06:47:28.352245 4475 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/downloads-7954f5f757-dbjfd"]
Dec 03 06:47:28 crc kubenswrapper[4475]: I1203 06:47:28.352787 4475 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-dockercfg-5nsgg"
Dec 03 06:47:28 crc kubenswrapper[4475]: I1203 06:47:28.354056 4475 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-s9rqq"]
Dec 03 06:47:28 crc kubenswrapper[4475]: I1203 06:47:28.355714 4475 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["hostpath-provisioner/csi-hostpathplugin-86l88"]
Dec 03 06:47:28 crc kubenswrapper[4475]: I1203 06:47:28.359607 4475
util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="hostpath-provisioner/csi-hostpathplugin-86l88" Dec 03 06:47:28 crc kubenswrapper[4475]: I1203 06:47:28.361335 4475 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/machine-api-operator-5694c8668f-n4bbj"] Dec 03 06:47:28 crc kubenswrapper[4475]: I1203 06:47:28.364037 4475 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-xklvd"] Dec 03 06:47:28 crc kubenswrapper[4475]: I1203 06:47:28.371701 4475 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-lw5ml"] Dec 03 06:47:28 crc kubenswrapper[4475]: I1203 06:47:28.371731 4475 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-6r542"] Dec 03 06:47:28 crc kubenswrapper[4475]: I1203 06:47:28.373712 4475 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-metrics" Dec 03 06:47:28 crc kubenswrapper[4475]: I1203 06:47:28.374522 4475 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-48hz9"] Dec 03 06:47:28 crc kubenswrapper[4475]: I1203 06:47:28.374786 4475 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-operator-74547568cd-gcnjp"] Dec 03 06:47:28 crc kubenswrapper[4475]: I1203 06:47:28.375948 4475 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-controller-84d6567774-xv2gh"] Dec 03 06:47:28 crc kubenswrapper[4475]: I1203 06:47:28.376780 4475 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication-operator/authentication-operator-69f744f599-6lqs9"] Dec 03 06:47:28 crc kubenswrapper[4475]: I1203 06:47:28.377775 4475 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-dlggp"] Dec 03 06:47:28 crc kubenswrapper[4475]: I1203 06:47:28.378587 4475 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-mk54s"] Dec 03 06:47:28 crc kubenswrapper[4475]: I1203 06:47:28.379165 4475 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29412405-wwr7n"] Dec 03 06:47:28 crc kubenswrapper[4475]: I1203 06:47:28.379974 4475 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/catalog-operator-68c6474976-q47j6"] Dec 03 06:47:28 crc kubenswrapper[4475]: I1203 06:47:28.380741 4475 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca-operator/service-ca-operator-777779d784-mw2kv"] Dec 03 06:47:28 crc kubenswrapper[4475]: I1203 06:47:28.382307 4475 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-kfvwc"] Dec 03 06:47:28 crc kubenswrapper[4475]: I1203 06:47:28.383316 4475 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-28t76"] Dec 03 06:47:28 crc kubenswrapper[4475]: I1203 06:47:28.384120 4475 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca/service-ca-9c57cc56f-9dd2n"] Dec 03 06:47:28 crc kubenswrapper[4475]: I1203 06:47:28.384903 4475 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-6bm4d"] Dec 03 06:47:28 crc kubenswrapper[4475]: I1203 06:47:28.385698 4475 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-6gvwt"] Dec 03 06:47:28 crc kubenswrapper[4475]: I1203 06:47:28.387000 4475 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-t2thq"] Dec 03 06:47:28 crc kubenswrapper[4475]: I1203 06:47:28.387876 4475 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-x5sj6"] Dec 03 06:47:28 crc kubenswrapper[4475]: I1203 06:47:28.388658 4475 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-9tsjv"] Dec 03 06:47:28 crc kubenswrapper[4475]: I1203 06:47:28.389441 4475 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator/migrator-59844c95c7-4845w"] Dec 03 06:47:28 crc kubenswrapper[4475]: I1203 06:47:28.390327 4475 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-dcjv5"] Dec 03 06:47:28 crc kubenswrapper[4475]: I1203 06:47:28.391479 4475 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-etcd-operator/etcd-operator-b45778765-chxcn"] Dec 03 06:47:28 crc kubenswrapper[4475]: I1203 06:47:28.392187 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/065ad72f-f4c2-4d51-a856-a915ad7f555b-image-registry-operator-tls\") pod \"cluster-image-registry-operator-dc59b4c8b-lw5ml\" (UID: \"065ad72f-f4c2-4d51-a856-a915ad7f555b\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-lw5ml" Dec 03 06:47:28 crc kubenswrapper[4475]: I1203 06:47:28.392220 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/fab1c20e-bbf0-442f-ada0-5647d493ad6c-etcd-client\") pod \"apiserver-7bbb656c7d-tjg56\" (UID: \"fab1c20e-bbf0-442f-ada0-5647d493ad6c\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-tjg56" Dec 03 06:47:28 crc kubenswrapper[4475]: I1203 06:47:28.392238 4475 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/dcef9514-d760-40e8-9054-75b17a2dde9f-config\") pod \"machine-approver-56656f9798-8pdgn\" (UID: \"dcef9514-d760-40e8-9054-75b17a2dde9f\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-8pdgn" Dec 03 06:47:28 crc kubenswrapper[4475]: I1203 06:47:28.392254 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gj6wg\" (UniqueName: \"kubernetes.io/projected/dcef9514-d760-40e8-9054-75b17a2dde9f-kube-api-access-gj6wg\") pod \"machine-approver-56656f9798-8pdgn\" (UID: \"dcef9514-d760-40e8-9054-75b17a2dde9f\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-8pdgn" Dec 03 06:47:28 crc kubenswrapper[4475]: I1203 06:47:28.392269 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/d52a94b2-a290-48af-b060-5f3662029280-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-558db77b4-6r542\" (UID: \"d52a94b2-a290-48af-b060-5f3662029280\") " pod="openshift-authentication/oauth-openshift-558db77b4-6r542" Dec 03 06:47:28 crc kubenswrapper[4475]: I1203 06:47:28.392285 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/07c84deb-ccb6-4597-a122-fdc9f6acb015-trusted-ca-bundle\") pod \"authentication-operator-69f744f599-6lqs9\" (UID: \"07c84deb-ccb6-4597-a122-fdc9f6acb015\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-6lqs9" Dec 03 06:47:28 crc kubenswrapper[4475]: I1203 06:47:28.392301 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/09928a8e-a70b-4916-9ae2-4dbe952aa514-service-ca\") pod \"console-f9d7485db-dbxhk\" (UID: \"09928a8e-a70b-4916-9ae2-4dbe952aa514\") " 
pod="openshift-console/console-f9d7485db-dbxhk" Dec 03 06:47:28 crc kubenswrapper[4475]: I1203 06:47:28.392315 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/d52a94b2-a290-48af-b060-5f3662029280-v4-0-config-system-session\") pod \"oauth-openshift-558db77b4-6r542\" (UID: \"d52a94b2-a290-48af-b060-5f3662029280\") " pod="openshift-authentication/oauth-openshift-558db77b4-6r542" Dec 03 06:47:28 crc kubenswrapper[4475]: I1203 06:47:28.392329 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-msq6m\" (UniqueName: \"kubernetes.io/projected/07c84deb-ccb6-4597-a122-fdc9f6acb015-kube-api-access-msq6m\") pod \"authentication-operator-69f744f599-6lqs9\" (UID: \"07c84deb-ccb6-4597-a122-fdc9f6acb015\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-6lqs9" Dec 03 06:47:28 crc kubenswrapper[4475]: I1203 06:47:28.392344 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/dea9abbf-a733-493b-b807-70ee9fa19fd1-trusted-ca\") pod \"console-operator-58897d9998-7m86k\" (UID: \"dea9abbf-a733-493b-b807-70ee9fa19fd1\") " pod="openshift-console-operator/console-operator-58897d9998-7m86k" Dec 03 06:47:28 crc kubenswrapper[4475]: I1203 06:47:28.392359 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/64eee81d-9ee2-4f0a-a95d-f32f9159e2a4-images\") pod \"machine-api-operator-5694c8668f-n4bbj\" (UID: \"64eee81d-9ee2-4f0a-a95d-f32f9159e2a4\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-n4bbj" Dec 03 06:47:28 crc kubenswrapper[4475]: I1203 06:47:28.392376 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: 
\"kubernetes.io/secret/842ad738-0ffb-4986-9372-a26f8bc6119a-proxy-tls\") pod \"machine-config-controller-84d6567774-xv2gh\" (UID: \"842ad738-0ffb-4986-9372-a26f8bc6119a\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-xv2gh" Dec 03 06:47:28 crc kubenswrapper[4475]: I1203 06:47:28.392390 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/d52a94b2-a290-48af-b060-5f3662029280-audit-policies\") pod \"oauth-openshift-558db77b4-6r542\" (UID: \"d52a94b2-a290-48af-b060-5f3662029280\") " pod="openshift-authentication/oauth-openshift-558db77b4-6r542" Dec 03 06:47:28 crc kubenswrapper[4475]: I1203 06:47:28.392405 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5e8a37fc-fca6-43a2-83f2-e4c3d7916343-config\") pod \"service-ca-operator-777779d784-mw2kv\" (UID: \"5e8a37fc-fca6-43a2-83f2-e4c3d7916343\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-mw2kv" Dec 03 06:47:28 crc kubenswrapper[4475]: I1203 06:47:28.392422 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/21a4d7e9-ea88-4f43-9d43-109df1bb4766-default-certificate\") pod \"router-default-5444994796-wkrx4\" (UID: \"21a4d7e9-ea88-4f43-9d43-109df1bb4766\") " pod="openshift-ingress/router-default-5444994796-wkrx4" Dec 03 06:47:28 crc kubenswrapper[4475]: I1203 06:47:28.392434 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/21a4d7e9-ea88-4f43-9d43-109df1bb4766-metrics-certs\") pod \"router-default-5444994796-wkrx4\" (UID: \"21a4d7e9-ea88-4f43-9d43-109df1bb4766\") " pod="openshift-ingress/router-default-5444994796-wkrx4" Dec 03 06:47:28 crc kubenswrapper[4475]: I1203 
06:47:28.392469 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pk6mb\" (UniqueName: \"kubernetes.io/projected/83c4eef5-5508-470d-8b7a-b7da9d4706d4-kube-api-access-pk6mb\") pod \"collect-profiles-29412405-wwr7n\" (UID: \"83c4eef5-5508-470d-8b7a-b7da9d4706d4\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29412405-wwr7n" Dec 03 06:47:28 crc kubenswrapper[4475]: I1203 06:47:28.392486 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3263d9b9-b7e8-4758-a6a0-85749e84317a-config\") pod \"controller-manager-879f6c89f-7kcnv\" (UID: \"3263d9b9-b7e8-4758-a6a0-85749e84317a\") " pod="openshift-controller-manager/controller-manager-879f6c89f-7kcnv" Dec 03 06:47:28 crc kubenswrapper[4475]: I1203 06:47:28.392506 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/dcef9514-d760-40e8-9054-75b17a2dde9f-machine-approver-tls\") pod \"machine-approver-56656f9798-8pdgn\" (UID: \"dcef9514-d760-40e8-9054-75b17a2dde9f\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-8pdgn" Dec 03 06:47:28 crc kubenswrapper[4475]: I1203 06:47:28.392521 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/64eee81d-9ee2-4f0a-a95d-f32f9159e2a4-config\") pod \"machine-api-operator-5694c8668f-n4bbj\" (UID: \"64eee81d-9ee2-4f0a-a95d-f32f9159e2a4\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-n4bbj" Dec 03 06:47:28 crc kubenswrapper[4475]: I1203 06:47:28.392543 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/09928a8e-a70b-4916-9ae2-4dbe952aa514-oauth-serving-cert\") pod \"console-f9d7485db-dbxhk\" (UID: \"09928a8e-a70b-4916-9ae2-4dbe952aa514\") " 
pod="openshift-console/console-f9d7485db-dbxhk" Dec 03 06:47:28 crc kubenswrapper[4475]: I1203 06:47:28.392559 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/d52a94b2-a290-48af-b060-5f3662029280-v4-0-config-system-serving-cert\") pod \"oauth-openshift-558db77b4-6r542\" (UID: \"d52a94b2-a290-48af-b060-5f3662029280\") " pod="openshift-authentication/oauth-openshift-558db77b4-6r542" Dec 03 06:47:28 crc kubenswrapper[4475]: I1203 06:47:28.392572 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/fab1c20e-bbf0-442f-ada0-5647d493ad6c-serving-cert\") pod \"apiserver-7bbb656c7d-tjg56\" (UID: \"fab1c20e-bbf0-442f-ada0-5647d493ad6c\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-tjg56" Dec 03 06:47:28 crc kubenswrapper[4475]: I1203 06:47:28.392588 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/09928a8e-a70b-4916-9ae2-4dbe952aa514-console-oauth-config\") pod \"console-f9d7485db-dbxhk\" (UID: \"09928a8e-a70b-4916-9ae2-4dbe952aa514\") " pod="openshift-console/console-f9d7485db-dbxhk" Dec 03 06:47:28 crc kubenswrapper[4475]: I1203 06:47:28.392602 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/842ad738-0ffb-4986-9372-a26f8bc6119a-mcc-auth-proxy-config\") pod \"machine-config-controller-84d6567774-xv2gh\" (UID: \"842ad738-0ffb-4986-9372-a26f8bc6119a\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-xv2gh" Dec 03 06:47:28 crc kubenswrapper[4475]: I1203 06:47:28.392617 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-certs\" (UniqueName: 
\"kubernetes.io/secret/c0036df6-fc1c-4945-97b0-7c6ce6e5f806-webhook-certs\") pod \"multus-admission-controller-857f4d67dd-bghqv\" (UID: \"c0036df6-fc1c-4945-97b0-7c6ce6e5f806\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-bghqv" Dec 03 06:47:28 crc kubenswrapper[4475]: I1203 06:47:28.392630 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/21a4d7e9-ea88-4f43-9d43-109df1bb4766-stats-auth\") pod \"router-default-5444994796-wkrx4\" (UID: \"21a4d7e9-ea88-4f43-9d43-109df1bb4766\") " pod="openshift-ingress/router-default-5444994796-wkrx4" Dec 03 06:47:28 crc kubenswrapper[4475]: I1203 06:47:28.392658 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/07c84deb-ccb6-4597-a122-fdc9f6acb015-config\") pod \"authentication-operator-69f744f599-6lqs9\" (UID: \"07c84deb-ccb6-4597-a122-fdc9f6acb015\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-6lqs9" Dec 03 06:47:28 crc kubenswrapper[4475]: I1203 06:47:28.392675 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4q8tv\" (UniqueName: \"kubernetes.io/projected/64eee81d-9ee2-4f0a-a95d-f32f9159e2a4-kube-api-access-4q8tv\") pod \"machine-api-operator-5694c8668f-n4bbj\" (UID: \"64eee81d-9ee2-4f0a-a95d-f32f9159e2a4\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-n4bbj" Dec 03 06:47:28 crc kubenswrapper[4475]: I1203 06:47:28.392690 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xzxvx\" (UniqueName: \"kubernetes.io/projected/842ad738-0ffb-4986-9372-a26f8bc6119a-kube-api-access-xzxvx\") pod \"machine-config-controller-84d6567774-xv2gh\" (UID: \"842ad738-0ffb-4986-9372-a26f8bc6119a\") " 
pod="openshift-machine-config-operator/machine-config-controller-84d6567774-xv2gh" Dec 03 06:47:28 crc kubenswrapper[4475]: I1203 06:47:28.392704 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/d52a94b2-a290-48af-b060-5f3662029280-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-558db77b4-6r542\" (UID: \"d52a94b2-a290-48af-b060-5f3662029280\") " pod="openshift-authentication/oauth-openshift-558db77b4-6r542" Dec 03 06:47:28 crc kubenswrapper[4475]: I1203 06:47:28.392723 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/d52a94b2-a290-48af-b060-5f3662029280-v4-0-config-user-template-login\") pod \"oauth-openshift-558db77b4-6r542\" (UID: \"d52a94b2-a290-48af-b060-5f3662029280\") " pod="openshift-authentication/oauth-openshift-558db77b4-6r542" Dec 03 06:47:28 crc kubenswrapper[4475]: I1203 06:47:28.392738 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fvgf8\" (UniqueName: \"kubernetes.io/projected/3263d9b9-b7e8-4758-a6a0-85749e84317a-kube-api-access-fvgf8\") pod \"controller-manager-879f6c89f-7kcnv\" (UID: \"3263d9b9-b7e8-4758-a6a0-85749e84317a\") " pod="openshift-controller-manager/controller-manager-879f6c89f-7kcnv" Dec 03 06:47:28 crc kubenswrapper[4475]: I1203 06:47:28.392760 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8fds5\" (UniqueName: \"kubernetes.io/projected/dea9abbf-a733-493b-b807-70ee9fa19fd1-kube-api-access-8fds5\") pod \"console-operator-58897d9998-7m86k\" (UID: \"dea9abbf-a733-493b-b807-70ee9fa19fd1\") " pod="openshift-console-operator/console-operator-58897d9998-7m86k" Dec 03 06:47:28 crc kubenswrapper[4475]: I1203 06:47:28.392774 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/065ad72f-f4c2-4d51-a856-a915ad7f555b-bound-sa-token\") pod \"cluster-image-registry-operator-dc59b4c8b-lw5ml\" (UID: \"065ad72f-f4c2-4d51-a856-a915ad7f555b\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-lw5ml" Dec 03 06:47:28 crc kubenswrapper[4475]: I1203 06:47:28.392789 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/65b544d3-889f-4b29-ba88-961ad04782bf-serving-cert\") pod \"openshift-config-operator-7777fb866f-q5cjz\" (UID: \"65b544d3-889f-4b29-ba88-961ad04782bf\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-q5cjz" Dec 03 06:47:28 crc kubenswrapper[4475]: I1203 06:47:28.392812 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6xk76\" (UniqueName: \"kubernetes.io/projected/65b544d3-889f-4b29-ba88-961ad04782bf-kube-api-access-6xk76\") pod \"openshift-config-operator-7777fb866f-q5cjz\" (UID: \"65b544d3-889f-4b29-ba88-961ad04782bf\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-q5cjz" Dec 03 06:47:28 crc kubenswrapper[4475]: I1203 06:47:28.392827 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0fd3dcd7-41fd-4e0c-be75-e8464be7696e-config\") pod \"kube-controller-manager-operator-78b949d7b-28t76\" (UID: \"0fd3dcd7-41fd-4e0c-be75-e8464be7696e\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-28t76" Dec 03 06:47:28 crc kubenswrapper[4475]: I1203 06:47:28.392841 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/fb67a319-3ec2-4759-bdfb-46452f4f7010-etcd-service-ca\") pod \"etcd-operator-b45778765-chxcn\" (UID: \"fb67a319-3ec2-4759-bdfb-46452f4f7010\") 
" pod="openshift-etcd-operator/etcd-operator-b45778765-chxcn" Dec 03 06:47:28 crc kubenswrapper[4475]: I1203 06:47:28.392853 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/21a4d7e9-ea88-4f43-9d43-109df1bb4766-service-ca-bundle\") pod \"router-default-5444994796-wkrx4\" (UID: \"21a4d7e9-ea88-4f43-9d43-109df1bb4766\") " pod="openshift-ingress/router-default-5444994796-wkrx4" Dec 03 06:47:28 crc kubenswrapper[4475]: I1203 06:47:28.392868 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/d52a94b2-a290-48af-b060-5f3662029280-v4-0-config-system-service-ca\") pod \"oauth-openshift-558db77b4-6r542\" (UID: \"d52a94b2-a290-48af-b060-5f3662029280\") " pod="openshift-authentication/oauth-openshift-558db77b4-6r542" Dec 03 06:47:28 crc kubenswrapper[4475]: I1203 06:47:28.392882 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/dea9abbf-a733-493b-b807-70ee9fa19fd1-serving-cert\") pod \"console-operator-58897d9998-7m86k\" (UID: \"dea9abbf-a733-493b-b807-70ee9fa19fd1\") " pod="openshift-console-operator/console-operator-58897d9998-7m86k" Dec 03 06:47:28 crc kubenswrapper[4475]: I1203 06:47:28.392895 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0fd3dcd7-41fd-4e0c-be75-e8464be7696e-serving-cert\") pod \"kube-controller-manager-operator-78b949d7b-28t76\" (UID: \"0fd3dcd7-41fd-4e0c-be75-e8464be7696e\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-28t76" Dec 03 06:47:28 crc kubenswrapper[4475]: I1203 06:47:28.392911 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"kube-api-access-v5xn2\" (UniqueName: \"kubernetes.io/projected/6abdfe97-cf24-4ec5-8aee-f67cf30bb2e3-kube-api-access-v5xn2\") pod \"kube-storage-version-migrator-operator-b67b599dd-9tsjv\" (UID: \"6abdfe97-cf24-4ec5-8aee-f67cf30bb2e3\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-9tsjv" Dec 03 06:47:28 crc kubenswrapper[4475]: I1203 06:47:28.392924 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/07c84deb-ccb6-4597-a122-fdc9f6acb015-service-ca-bundle\") pod \"authentication-operator-69f744f599-6lqs9\" (UID: \"07c84deb-ccb6-4597-a122-fdc9f6acb015\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-6lqs9" Dec 03 06:47:28 crc kubenswrapper[4475]: I1203 06:47:28.392941 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rx6ft\" (UniqueName: \"kubernetes.io/projected/d52a94b2-a290-48af-b060-5f3662029280-kube-api-access-rx6ft\") pod \"oauth-openshift-558db77b4-6r542\" (UID: \"d52a94b2-a290-48af-b060-5f3662029280\") " pod="openshift-authentication/oauth-openshift-558db77b4-6r542" Dec 03 06:47:28 crc kubenswrapper[4475]: I1203 06:47:28.392955 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/8ccba36e-7b88-4f9b-9706-1e441aa2c59a-webhook-cert\") pod \"packageserver-d55dfcdfc-x5sj6\" (UID: \"8ccba36e-7b88-4f9b-9706-1e441aa2c59a\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-x5sj6" Dec 03 06:47:28 crc kubenswrapper[4475]: I1203 06:47:28.392972 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/d52a94b2-a290-48af-b060-5f3662029280-v4-0-config-user-template-error\") pod \"oauth-openshift-558db77b4-6r542\" (UID: 
\"d52a94b2-a290-48af-b060-5f3662029280\") " pod="openshift-authentication/oauth-openshift-558db77b4-6r542" Dec 03 06:47:28 crc kubenswrapper[4475]: I1203 06:47:28.392986 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/0fd3dcd7-41fd-4e0c-be75-e8464be7696e-kube-api-access\") pod \"kube-controller-manager-operator-78b949d7b-28t76\" (UID: \"0fd3dcd7-41fd-4e0c-be75-e8464be7696e\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-28t76" Dec 03 06:47:28 crc kubenswrapper[4475]: I1203 06:47:28.392999 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/83c4eef5-5508-470d-8b7a-b7da9d4706d4-config-volume\") pod \"collect-profiles-29412405-wwr7n\" (UID: \"83c4eef5-5508-470d-8b7a-b7da9d4706d4\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29412405-wwr7n" Dec 03 06:47:28 crc kubenswrapper[4475]: I1203 06:47:28.393023 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/30615409-a282-4405-afab-4802d9c27a3a-client-ca\") pod \"route-controller-manager-6576b87f9c-dlggp\" (UID: \"30615409-a282-4405-afab-4802d9c27a3a\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-dlggp" Dec 03 06:47:28 crc kubenswrapper[4475]: I1203 06:47:28.393039 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/30615409-a282-4405-afab-4802d9c27a3a-serving-cert\") pod \"route-controller-manager-6576b87f9c-dlggp\" (UID: \"30615409-a282-4405-afab-4802d9c27a3a\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-dlggp" Dec 03 06:47:28 crc kubenswrapper[4475]: I1203 06:47:28.393052 4475 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/09928a8e-a70b-4916-9ae2-4dbe952aa514-console-serving-cert\") pod \"console-f9d7485db-dbxhk\" (UID: \"09928a8e-a70b-4916-9ae2-4dbe952aa514\") " pod="openshift-console/console-f9d7485db-dbxhk" Dec 03 06:47:28 crc kubenswrapper[4475]: I1203 06:47:28.393067 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-h6g4x\" (UniqueName: \"kubernetes.io/projected/30615409-a282-4405-afab-4802d9c27a3a-kube-api-access-h6g4x\") pod \"route-controller-manager-6576b87f9c-dlggp\" (UID: \"30615409-a282-4405-afab-4802d9c27a3a\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-dlggp" Dec 03 06:47:28 crc kubenswrapper[4475]: I1203 06:47:28.393083 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/7f6163e8-ce0d-481b-8483-4b9e04d381e6-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-kfvwc\" (UID: \"7f6163e8-ce0d-481b-8483-4b9e04d381e6\") " pod="openshift-marketplace/marketplace-operator-79b997595-kfvwc" Dec 03 06:47:28 crc kubenswrapper[4475]: I1203 06:47:28.393098 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n4kdx\" (UniqueName: \"kubernetes.io/projected/8ccba36e-7b88-4f9b-9706-1e441aa2c59a-kube-api-access-n4kdx\") pod \"packageserver-d55dfcdfc-x5sj6\" (UID: \"8ccba36e-7b88-4f9b-9706-1e441aa2c59a\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-x5sj6" Dec 03 06:47:28 crc kubenswrapper[4475]: I1203 06:47:28.393113 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qpt8q\" (UniqueName: \"kubernetes.io/projected/423a50bb-8a96-49ba-99da-7258729fd2af-kube-api-access-qpt8q\") pod 
\"openshift-controller-manager-operator-756b6f6bc6-6gvwt\" (UID: \"423a50bb-8a96-49ba-99da-7258729fd2af\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-6gvwt" Dec 03 06:47:28 crc kubenswrapper[4475]: I1203 06:47:28.393135 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/fab1c20e-bbf0-442f-ada0-5647d493ad6c-encryption-config\") pod \"apiserver-7bbb656c7d-tjg56\" (UID: \"fab1c20e-bbf0-442f-ada0-5647d493ad6c\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-tjg56" Dec 03 06:47:28 crc kubenswrapper[4475]: I1203 06:47:28.393149 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/d52a94b2-a290-48af-b060-5f3662029280-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-558db77b4-6r542\" (UID: \"d52a94b2-a290-48af-b060-5f3662029280\") " pod="openshift-authentication/oauth-openshift-558db77b4-6r542" Dec 03 06:47:28 crc kubenswrapper[4475]: I1203 06:47:28.393175 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/423a50bb-8a96-49ba-99da-7258729fd2af-serving-cert\") pod \"openshift-controller-manager-operator-756b6f6bc6-6gvwt\" (UID: \"423a50bb-8a96-49ba-99da-7258729fd2af\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-6gvwt" Dec 03 06:47:28 crc kubenswrapper[4475]: I1203 06:47:28.393196 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/3263d9b9-b7e8-4758-a6a0-85749e84317a-client-ca\") pod \"controller-manager-879f6c89f-7kcnv\" (UID: \"3263d9b9-b7e8-4758-a6a0-85749e84317a\") " pod="openshift-controller-manager/controller-manager-879f6c89f-7kcnv" Dec 03 06:47:28 crc 
kubenswrapper[4475]: I1203 06:47:28.393209 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/8ccba36e-7b88-4f9b-9706-1e441aa2c59a-tmpfs\") pod \"packageserver-d55dfcdfc-x5sj6\" (UID: \"8ccba36e-7b88-4f9b-9706-1e441aa2c59a\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-x5sj6" Dec 03 06:47:28 crc kubenswrapper[4475]: I1203 06:47:28.393222 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/fb67a319-3ec2-4759-bdfb-46452f4f7010-serving-cert\") pod \"etcd-operator-b45778765-chxcn\" (UID: \"fb67a319-3ec2-4759-bdfb-46452f4f7010\") " pod="openshift-etcd-operator/etcd-operator-b45778765-chxcn" Dec 03 06:47:28 crc kubenswrapper[4475]: I1203 06:47:28.393236 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/3263d9b9-b7e8-4758-a6a0-85749e84317a-proxy-ca-bundles\") pod \"controller-manager-879f6c89f-7kcnv\" (UID: \"3263d9b9-b7e8-4758-a6a0-85749e84317a\") " pod="openshift-controller-manager/controller-manager-879f6c89f-7kcnv" Dec 03 06:47:28 crc kubenswrapper[4475]: I1203 06:47:28.393251 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/64eee81d-9ee2-4f0a-a95d-f32f9159e2a4-machine-api-operator-tls\") pod \"machine-api-operator-5694c8668f-n4bbj\" (UID: \"64eee81d-9ee2-4f0a-a95d-f32f9159e2a4\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-n4bbj" Dec 03 06:47:28 crc kubenswrapper[4475]: I1203 06:47:28.393265 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/09928a8e-a70b-4916-9ae2-4dbe952aa514-trusted-ca-bundle\") pod \"console-f9d7485db-dbxhk\" (UID: 
\"09928a8e-a70b-4916-9ae2-4dbe952aa514\") " pod="openshift-console/console-f9d7485db-dbxhk" Dec 03 06:47:28 crc kubenswrapper[4475]: I1203 06:47:28.393278 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xrvnc\" (UniqueName: \"kubernetes.io/projected/7f6163e8-ce0d-481b-8483-4b9e04d381e6-kube-api-access-xrvnc\") pod \"marketplace-operator-79b997595-kfvwc\" (UID: \"7f6163e8-ce0d-481b-8483-4b9e04d381e6\") " pod="openshift-marketplace/marketplace-operator-79b997595-kfvwc" Dec 03 06:47:28 crc kubenswrapper[4475]: I1203 06:47:28.393293 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/07c84deb-ccb6-4597-a122-fdc9f6acb015-serving-cert\") pod \"authentication-operator-69f744f599-6lqs9\" (UID: \"07c84deb-ccb6-4597-a122-fdc9f6acb015\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-6lqs9" Dec 03 06:47:28 crc kubenswrapper[4475]: I1203 06:47:28.393307 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/fab1c20e-bbf0-442f-ada0-5647d493ad6c-audit-dir\") pod \"apiserver-7bbb656c7d-tjg56\" (UID: \"fab1c20e-bbf0-442f-ada0-5647d493ad6c\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-tjg56" Dec 03 06:47:28 crc kubenswrapper[4475]: I1203 06:47:28.393321 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2bp2w\" (UniqueName: \"kubernetes.io/projected/fab1c20e-bbf0-442f-ada0-5647d493ad6c-kube-api-access-2bp2w\") pod \"apiserver-7bbb656c7d-tjg56\" (UID: \"fab1c20e-bbf0-442f-ada0-5647d493ad6c\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-tjg56" Dec 03 06:47:28 crc kubenswrapper[4475]: I1203 06:47:28.393335 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: 
\"kubernetes.io/host-path/d52a94b2-a290-48af-b060-5f3662029280-audit-dir\") pod \"oauth-openshift-558db77b4-6r542\" (UID: \"d52a94b2-a290-48af-b060-5f3662029280\") " pod="openshift-authentication/oauth-openshift-558db77b4-6r542" Dec 03 06:47:28 crc kubenswrapper[4475]: I1203 06:47:28.393349 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/dea9abbf-a733-493b-b807-70ee9fa19fd1-config\") pod \"console-operator-58897d9998-7m86k\" (UID: \"dea9abbf-a733-493b-b807-70ee9fa19fd1\") " pod="openshift-console-operator/console-operator-58897d9998-7m86k" Dec 03 06:47:28 crc kubenswrapper[4475]: I1203 06:47:28.393362 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-thrvj\" (UniqueName: \"kubernetes.io/projected/fb67a319-3ec2-4759-bdfb-46452f4f7010-kube-api-access-thrvj\") pod \"etcd-operator-b45778765-chxcn\" (UID: \"fb67a319-3ec2-4759-bdfb-46452f4f7010\") " pod="openshift-etcd-operator/etcd-operator-b45778765-chxcn" Dec 03 06:47:28 crc kubenswrapper[4475]: I1203 06:47:28.393377 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/09928a8e-a70b-4916-9ae2-4dbe952aa514-console-config\") pod \"console-f9d7485db-dbxhk\" (UID: \"09928a8e-a70b-4916-9ae2-4dbe952aa514\") " pod="openshift-console/console-f9d7485db-dbxhk" Dec 03 06:47:28 crc kubenswrapper[4475]: I1203 06:47:28.393391 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8smn2\" (UniqueName: \"kubernetes.io/projected/09928a8e-a70b-4916-9ae2-4dbe952aa514-kube-api-access-8smn2\") pod \"console-f9d7485db-dbxhk\" (UID: \"09928a8e-a70b-4916-9ae2-4dbe952aa514\") " pod="openshift-console/console-f9d7485db-dbxhk" Dec 03 06:47:28 crc kubenswrapper[4475]: I1203 06:47:28.393404 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d52a94b2-a290-48af-b060-5f3662029280-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-558db77b4-6r542\" (UID: \"d52a94b2-a290-48af-b060-5f3662029280\") " pod="openshift-authentication/oauth-openshift-558db77b4-6r542" Dec 03 06:47:28 crc kubenswrapper[4475]: I1203 06:47:28.393419 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5e8a37fc-fca6-43a2-83f2-e4c3d7916343-serving-cert\") pod \"service-ca-operator-777779d784-mw2kv\" (UID: \"5e8a37fc-fca6-43a2-83f2-e4c3d7916343\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-mw2kv" Dec 03 06:47:28 crc kubenswrapper[4475]: I1203 06:47:28.393435 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6v9ll\" (UniqueName: \"kubernetes.io/projected/5e8a37fc-fca6-43a2-83f2-e4c3d7916343-kube-api-access-6v9ll\") pod \"service-ca-operator-777779d784-mw2kv\" (UID: \"5e8a37fc-fca6-43a2-83f2-e4c3d7916343\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-mw2kv" Dec 03 06:47:28 crc kubenswrapper[4475]: I1203 06:47:28.393467 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/fb67a319-3ec2-4759-bdfb-46452f4f7010-etcd-ca\") pod \"etcd-operator-b45778765-chxcn\" (UID: \"fb67a319-3ec2-4759-bdfb-46452f4f7010\") " pod="openshift-etcd-operator/etcd-operator-b45778765-chxcn" Dec 03 06:47:28 crc kubenswrapper[4475]: I1203 06:47:28.393483 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/30615409-a282-4405-afab-4802d9c27a3a-config\") pod \"route-controller-manager-6576b87f9c-dlggp\" (UID: \"30615409-a282-4405-afab-4802d9c27a3a\") " 
pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-dlggp" Dec 03 06:47:28 crc kubenswrapper[4475]: I1203 06:47:28.393499 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/065ad72f-f4c2-4d51-a856-a915ad7f555b-trusted-ca\") pod \"cluster-image-registry-operator-dc59b4c8b-lw5ml\" (UID: \"065ad72f-f4c2-4d51-a856-a915ad7f555b\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-lw5ml" Dec 03 06:47:28 crc kubenswrapper[4475]: I1203 06:47:28.393514 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/d52a94b2-a290-48af-b060-5f3662029280-v4-0-config-system-router-certs\") pod \"oauth-openshift-558db77b4-6r542\" (UID: \"d52a94b2-a290-48af-b060-5f3662029280\") " pod="openshift-authentication/oauth-openshift-558db77b4-6r542" Dec 03 06:47:28 crc kubenswrapper[4475]: I1203 06:47:28.393528 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/8ccba36e-7b88-4f9b-9706-1e441aa2c59a-apiservice-cert\") pod \"packageserver-d55dfcdfc-x5sj6\" (UID: \"8ccba36e-7b88-4f9b-9706-1e441aa2c59a\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-x5sj6" Dec 03 06:47:28 crc kubenswrapper[4475]: I1203 06:47:28.393543 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/fb67a319-3ec2-4759-bdfb-46452f4f7010-etcd-client\") pod \"etcd-operator-b45778765-chxcn\" (UID: \"fb67a319-3ec2-4759-bdfb-46452f4f7010\") " pod="openshift-etcd-operator/etcd-operator-b45778765-chxcn" Dec 03 06:47:28 crc kubenswrapper[4475]: I1203 06:47:28.393733 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4z2px\" 
(UniqueName: \"kubernetes.io/projected/80a45948-f52d-4e57-8611-37ea99eefb3c-kube-api-access-4z2px\") pod \"migrator-59844c95c7-4845w\" (UID: \"80a45948-f52d-4e57-8611-37ea99eefb3c\") " pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-4845w" Dec 03 06:47:28 crc kubenswrapper[4475]: I1203 06:47:28.393753 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5spc7\" (UniqueName: \"kubernetes.io/projected/065ad72f-f4c2-4d51-a856-a915ad7f555b-kube-api-access-5spc7\") pod \"cluster-image-registry-operator-dc59b4c8b-lw5ml\" (UID: \"065ad72f-f4c2-4d51-a856-a915ad7f555b\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-lw5ml" Dec 03 06:47:28 crc kubenswrapper[4475]: I1203 06:47:28.393768 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/fab1c20e-bbf0-442f-ada0-5647d493ad6c-etcd-serving-ca\") pod \"apiserver-7bbb656c7d-tjg56\" (UID: \"fab1c20e-bbf0-442f-ada0-5647d493ad6c\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-tjg56" Dec 03 06:47:28 crc kubenswrapper[4475]: I1203 06:47:28.393783 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/65b544d3-889f-4b29-ba88-961ad04782bf-available-featuregates\") pod \"openshift-config-operator-7777fb866f-q5cjz\" (UID: \"65b544d3-889f-4b29-ba88-961ad04782bf\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-q5cjz" Dec 03 06:47:28 crc kubenswrapper[4475]: I1203 06:47:28.393807 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/d52a94b2-a290-48af-b060-5f3662029280-v4-0-config-system-cliconfig\") pod \"oauth-openshift-558db77b4-6r542\" (UID: \"d52a94b2-a290-48af-b060-5f3662029280\") " 
pod="openshift-authentication/oauth-openshift-558db77b4-6r542" Dec 03 06:47:28 crc kubenswrapper[4475]: I1203 06:47:28.393824 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/3263d9b9-b7e8-4758-a6a0-85749e84317a-serving-cert\") pod \"controller-manager-879f6c89f-7kcnv\" (UID: \"3263d9b9-b7e8-4758-a6a0-85749e84317a\") " pod="openshift-controller-manager/controller-manager-879f6c89f-7kcnv" Dec 03 06:47:28 crc kubenswrapper[4475]: I1203 06:47:28.393838 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/fab1c20e-bbf0-442f-ada0-5647d493ad6c-trusted-ca-bundle\") pod \"apiserver-7bbb656c7d-tjg56\" (UID: \"fab1c20e-bbf0-442f-ada0-5647d493ad6c\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-tjg56" Dec 03 06:47:28 crc kubenswrapper[4475]: I1203 06:47:28.393852 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/dcef9514-d760-40e8-9054-75b17a2dde9f-auth-proxy-config\") pod \"machine-approver-56656f9798-8pdgn\" (UID: \"dcef9514-d760-40e8-9054-75b17a2dde9f\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-8pdgn" Dec 03 06:47:28 crc kubenswrapper[4475]: I1203 06:47:28.393868 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/7f6163e8-ce0d-481b-8483-4b9e04d381e6-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-kfvwc\" (UID: \"7f6163e8-ce0d-481b-8483-4b9e04d381e6\") " pod="openshift-marketplace/marketplace-operator-79b997595-kfvwc" Dec 03 06:47:28 crc kubenswrapper[4475]: I1203 06:47:28.393883 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/fb67a319-3ec2-4759-bdfb-46452f4f7010-config\") pod \"etcd-operator-b45778765-chxcn\" (UID: \"fb67a319-3ec2-4759-bdfb-46452f4f7010\") " pod="openshift-etcd-operator/etcd-operator-b45778765-chxcn" Dec 03 06:47:28 crc kubenswrapper[4475]: I1203 06:47:28.393899 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/fab1c20e-bbf0-442f-ada0-5647d493ad6c-audit-policies\") pod \"apiserver-7bbb656c7d-tjg56\" (UID: \"fab1c20e-bbf0-442f-ada0-5647d493ad6c\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-tjg56" Dec 03 06:47:28 crc kubenswrapper[4475]: I1203 06:47:28.393922 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/423a50bb-8a96-49ba-99da-7258729fd2af-config\") pod \"openshift-controller-manager-operator-756b6f6bc6-6gvwt\" (UID: \"423a50bb-8a96-49ba-99da-7258729fd2af\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-6gvwt" Dec 03 06:47:28 crc kubenswrapper[4475]: I1203 06:47:28.393937 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6abdfe97-cf24-4ec5-8aee-f67cf30bb2e3-serving-cert\") pod \"kube-storage-version-migrator-operator-b67b599dd-9tsjv\" (UID: \"6abdfe97-cf24-4ec5-8aee-f67cf30bb2e3\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-9tsjv" Dec 03 06:47:28 crc kubenswrapper[4475]: I1203 06:47:28.393951 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6abdfe97-cf24-4ec5-8aee-f67cf30bb2e3-config\") pod \"kube-storage-version-migrator-operator-b67b599dd-9tsjv\" (UID: \"6abdfe97-cf24-4ec5-8aee-f67cf30bb2e3\") " 
pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-9tsjv" Dec 03 06:47:28 crc kubenswrapper[4475]: I1203 06:47:28.393971 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-85526\" (UniqueName: \"kubernetes.io/projected/c0036df6-fc1c-4945-97b0-7c6ce6e5f806-kube-api-access-85526\") pod \"multus-admission-controller-857f4d67dd-bghqv\" (UID: \"c0036df6-fc1c-4945-97b0-7c6ce6e5f806\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-bghqv" Dec 03 06:47:28 crc kubenswrapper[4475]: I1203 06:47:28.393985 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nd2xx\" (UniqueName: \"kubernetes.io/projected/21a4d7e9-ea88-4f43-9d43-109df1bb4766-kube-api-access-nd2xx\") pod \"router-default-5444994796-wkrx4\" (UID: \"21a4d7e9-ea88-4f43-9d43-109df1bb4766\") " pod="openshift-ingress/router-default-5444994796-wkrx4" Dec 03 06:47:28 crc kubenswrapper[4475]: I1203 06:47:28.393999 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/83c4eef5-5508-470d-8b7a-b7da9d4706d4-secret-volume\") pod \"collect-profiles-29412405-wwr7n\" (UID: \"83c4eef5-5508-470d-8b7a-b7da9d4706d4\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29412405-wwr7n" Dec 03 06:47:28 crc kubenswrapper[4475]: I1203 06:47:28.394889 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/3263d9b9-b7e8-4758-a6a0-85749e84317a-proxy-ca-bundles\") pod \"controller-manager-879f6c89f-7kcnv\" (UID: \"3263d9b9-b7e8-4758-a6a0-85749e84317a\") " pod="openshift-controller-manager/controller-manager-879f6c89f-7kcnv" Dec 03 06:47:28 crc kubenswrapper[4475]: I1203 06:47:28.394920 4475 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-marketplace"/"marketplace-trusted-ca" Dec 03 06:47:28 crc kubenswrapper[4475]: I1203 06:47:28.395469 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/fab1c20e-bbf0-442f-ada0-5647d493ad6c-etcd-serving-ca\") pod \"apiserver-7bbb656c7d-tjg56\" (UID: \"fab1c20e-bbf0-442f-ada0-5647d493ad6c\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-tjg56" Dec 03 06:47:28 crc kubenswrapper[4475]: I1203 06:47:28.395645 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/30615409-a282-4405-afab-4802d9c27a3a-client-ca\") pod \"route-controller-manager-6576b87f9c-dlggp\" (UID: \"30615409-a282-4405-afab-4802d9c27a3a\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-dlggp" Dec 03 06:47:28 crc kubenswrapper[4475]: I1203 06:47:28.395673 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/65b544d3-889f-4b29-ba88-961ad04782bf-available-featuregates\") pod \"openshift-config-operator-7777fb866f-q5cjz\" (UID: \"65b544d3-889f-4b29-ba88-961ad04782bf\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-q5cjz" Dec 03 06:47:28 crc kubenswrapper[4475]: I1203 06:47:28.395880 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/fab1c20e-bbf0-442f-ada0-5647d493ad6c-audit-policies\") pod \"apiserver-7bbb656c7d-tjg56\" (UID: \"fab1c20e-bbf0-442f-ada0-5647d493ad6c\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-tjg56" Dec 03 06:47:28 crc kubenswrapper[4475]: I1203 06:47:28.396292 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/d52a94b2-a290-48af-b060-5f3662029280-v4-0-config-system-cliconfig\") pod 
\"oauth-openshift-558db77b4-6r542\" (UID: \"d52a94b2-a290-48af-b060-5f3662029280\") " pod="openshift-authentication/oauth-openshift-558db77b4-6r542" Dec 03 06:47:28 crc kubenswrapper[4475]: I1203 06:47:28.397016 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/3263d9b9-b7e8-4758-a6a0-85749e84317a-client-ca\") pod \"controller-manager-879f6c89f-7kcnv\" (UID: \"3263d9b9-b7e8-4758-a6a0-85749e84317a\") " pod="openshift-controller-manager/controller-manager-879f6c89f-7kcnv" Dec 03 06:47:28 crc kubenswrapper[4475]: I1203 06:47:28.397382 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/fab1c20e-bbf0-442f-ada0-5647d493ad6c-audit-dir\") pod \"apiserver-7bbb656c7d-tjg56\" (UID: \"fab1c20e-bbf0-442f-ada0-5647d493ad6c\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-tjg56" Dec 03 06:47:28 crc kubenswrapper[4475]: I1203 06:47:28.397737 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/d52a94b2-a290-48af-b060-5f3662029280-audit-dir\") pod \"oauth-openshift-558db77b4-6r542\" (UID: \"d52a94b2-a290-48af-b060-5f3662029280\") " pod="openshift-authentication/oauth-openshift-558db77b4-6r542" Dec 03 06:47:28 crc kubenswrapper[4475]: I1203 06:47:28.398281 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/09928a8e-a70b-4916-9ae2-4dbe952aa514-trusted-ca-bundle\") pod \"console-f9d7485db-dbxhk\" (UID: \"09928a8e-a70b-4916-9ae2-4dbe952aa514\") " pod="openshift-console/console-f9d7485db-dbxhk" Dec 03 06:47:28 crc kubenswrapper[4475]: I1203 06:47:28.398351 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/dea9abbf-a733-493b-b807-70ee9fa19fd1-config\") pod \"console-operator-58897d9998-7m86k\" (UID: 
\"dea9abbf-a733-493b-b807-70ee9fa19fd1\") " pod="openshift-console-operator/console-operator-58897d9998-7m86k" Dec 03 06:47:28 crc kubenswrapper[4475]: I1203 06:47:28.398933 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/d52a94b2-a290-48af-b060-5f3662029280-v4-0-config-user-template-error\") pod \"oauth-openshift-558db77b4-6r542\" (UID: \"d52a94b2-a290-48af-b060-5f3662029280\") " pod="openshift-authentication/oauth-openshift-558db77b4-6r542" Dec 03 06:47:28 crc kubenswrapper[4475]: I1203 06:47:28.399011 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/09928a8e-a70b-4916-9ae2-4dbe952aa514-console-config\") pod \"console-f9d7485db-dbxhk\" (UID: \"09928a8e-a70b-4916-9ae2-4dbe952aa514\") " pod="openshift-console/console-f9d7485db-dbxhk" Dec 03 06:47:28 crc kubenswrapper[4475]: I1203 06:47:28.399044 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/fab1c20e-bbf0-442f-ada0-5647d493ad6c-trusted-ca-bundle\") pod \"apiserver-7bbb656c7d-tjg56\" (UID: \"fab1c20e-bbf0-442f-ada0-5647d493ad6c\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-tjg56" Dec 03 06:47:28 crc kubenswrapper[4475]: I1203 06:47:28.393498 4475 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-4mfpz"] Dec 03 06:47:28 crc kubenswrapper[4475]: I1203 06:47:28.399184 4475 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver/apiserver-76f77b778f-jf25k"] Dec 03 06:47:28 crc kubenswrapper[4475]: I1203 06:47:28.399196 4475 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["hostpath-provisioner/csi-hostpathplugin-86l88"] Dec 03 06:47:28 crc kubenswrapper[4475]: I1203 06:47:28.399206 4475 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openshift-multus/multus-admission-controller-857f4d67dd-bghqv"] Dec 03 06:47:28 crc kubenswrapper[4475]: I1203 06:47:28.399215 4475 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-dns/dns-default-685k2"] Dec 03 06:47:28 crc kubenswrapper[4475]: I1203 06:47:28.399466 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/3263d9b9-b7e8-4758-a6a0-85749e84317a-serving-cert\") pod \"controller-manager-879f6c89f-7kcnv\" (UID: \"3263d9b9-b7e8-4758-a6a0-85749e84317a\") " pod="openshift-controller-manager/controller-manager-879f6c89f-7kcnv" Dec 03 06:47:28 crc kubenswrapper[4475]: I1203 06:47:28.399511 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/09928a8e-a70b-4916-9ae2-4dbe952aa514-oauth-serving-cert\") pod \"console-f9d7485db-dbxhk\" (UID: \"09928a8e-a70b-4916-9ae2-4dbe952aa514\") " pod="openshift-console/console-f9d7485db-dbxhk" Dec 03 06:47:28 crc kubenswrapper[4475]: I1203 06:47:28.399761 4475 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ingress-canary/ingress-canary-cplh2"] Dec 03 06:47:28 crc kubenswrapper[4475]: I1203 06:47:28.400068 4475 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-operator/ingress-operator-5b745b69d9-lgjdg"] Dec 03 06:47:28 crc kubenswrapper[4475]: I1203 06:47:28.400125 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/64eee81d-9ee2-4f0a-a95d-f32f9159e2a4-machine-api-operator-tls\") pod \"machine-api-operator-5694c8668f-n4bbj\" (UID: \"64eee81d-9ee2-4f0a-a95d-f32f9159e2a4\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-n4bbj" Dec 03 06:47:28 crc kubenswrapper[4475]: I1203 06:47:28.400135 4475 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress-canary/ingress-canary-cplh2" Dec 03 06:47:28 crc kubenswrapper[4475]: I1203 06:47:28.400164 4475 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-685k2" Dec 03 06:47:28 crc kubenswrapper[4475]: I1203 06:47:28.400501 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/065ad72f-f4c2-4d51-a856-a915ad7f555b-image-registry-operator-tls\") pod \"cluster-image-registry-operator-dc59b4c8b-lw5ml\" (UID: \"065ad72f-f4c2-4d51-a856-a915ad7f555b\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-lw5ml" Dec 03 06:47:28 crc kubenswrapper[4475]: I1203 06:47:28.400919 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/dcef9514-d760-40e8-9054-75b17a2dde9f-auth-proxy-config\") pod \"machine-approver-56656f9798-8pdgn\" (UID: \"dcef9514-d760-40e8-9054-75b17a2dde9f\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-8pdgn" Dec 03 06:47:28 crc kubenswrapper[4475]: I1203 06:47:28.401492 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/64eee81d-9ee2-4f0a-a95d-f32f9159e2a4-config\") pod \"machine-api-operator-5694c8668f-n4bbj\" (UID: \"64eee81d-9ee2-4f0a-a95d-f32f9159e2a4\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-n4bbj" Dec 03 06:47:28 crc kubenswrapper[4475]: I1203 06:47:28.401880 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d52a94b2-a290-48af-b060-5f3662029280-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-558db77b4-6r542\" (UID: \"d52a94b2-a290-48af-b060-5f3662029280\") " pod="openshift-authentication/oauth-openshift-558db77b4-6r542" Dec 03 06:47:28 crc 
kubenswrapper[4475]: I1203 06:47:28.401974 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/d52a94b2-a290-48af-b060-5f3662029280-audit-policies\") pod \"oauth-openshift-558db77b4-6r542\" (UID: \"d52a94b2-a290-48af-b060-5f3662029280\") " pod="openshift-authentication/oauth-openshift-558db77b4-6r542" Dec 03 06:47:28 crc kubenswrapper[4475]: I1203 06:47:28.402021 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/dea9abbf-a733-493b-b807-70ee9fa19fd1-trusted-ca\") pod \"console-operator-58897d9998-7m86k\" (UID: \"dea9abbf-a733-493b-b807-70ee9fa19fd1\") " pod="openshift-console-operator/console-operator-58897d9998-7m86k" Dec 03 06:47:28 crc kubenswrapper[4475]: I1203 06:47:28.402422 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3263d9b9-b7e8-4758-a6a0-85749e84317a-config\") pod \"controller-manager-879f6c89f-7kcnv\" (UID: \"3263d9b9-b7e8-4758-a6a0-85749e84317a\") " pod="openshift-controller-manager/controller-manager-879f6c89f-7kcnv" Dec 03 06:47:28 crc kubenswrapper[4475]: I1203 06:47:28.402437 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/09928a8e-a70b-4916-9ae2-4dbe952aa514-service-ca\") pod \"console-f9d7485db-dbxhk\" (UID: \"09928a8e-a70b-4916-9ae2-4dbe952aa514\") " pod="openshift-console/console-f9d7485db-dbxhk" Dec 03 06:47:28 crc kubenswrapper[4475]: I1203 06:47:28.402691 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/fab1c20e-bbf0-442f-ada0-5647d493ad6c-encryption-config\") pod \"apiserver-7bbb656c7d-tjg56\" (UID: \"fab1c20e-bbf0-442f-ada0-5647d493ad6c\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-tjg56" Dec 03 06:47:28 crc kubenswrapper[4475]: I1203 
06:47:28.402757 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/d52a94b2-a290-48af-b060-5f3662029280-v4-0-config-system-router-certs\") pod \"oauth-openshift-558db77b4-6r542\" (UID: \"d52a94b2-a290-48af-b060-5f3662029280\") " pod="openshift-authentication/oauth-openshift-558db77b4-6r542" Dec 03 06:47:28 crc kubenswrapper[4475]: I1203 06:47:28.402792 4475 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-cplh2"] Dec 03 06:47:28 crc kubenswrapper[4475]: I1203 06:47:28.402827 4475 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-685k2"] Dec 03 06:47:28 crc kubenswrapper[4475]: I1203 06:47:28.403401 4475 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns-operator/dns-operator-744455d44c-vfxhs"] Dec 03 06:47:28 crc kubenswrapper[4475]: I1203 06:47:28.403671 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/065ad72f-f4c2-4d51-a856-a915ad7f555b-trusted-ca\") pod \"cluster-image-registry-operator-dc59b4c8b-lw5ml\" (UID: \"065ad72f-f4c2-4d51-a856-a915ad7f555b\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-lw5ml" Dec 03 06:47:28 crc kubenswrapper[4475]: I1203 06:47:28.404059 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/30615409-a282-4405-afab-4802d9c27a3a-config\") pod \"route-controller-manager-6576b87f9c-dlggp\" (UID: \"30615409-a282-4405-afab-4802d9c27a3a\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-dlggp" Dec 03 06:47:28 crc kubenswrapper[4475]: I1203 06:47:28.404057 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/09928a8e-a70b-4916-9ae2-4dbe952aa514-console-serving-cert\") pod 
\"console-f9d7485db-dbxhk\" (UID: \"09928a8e-a70b-4916-9ae2-4dbe952aa514\") " pod="openshift-console/console-f9d7485db-dbxhk" Dec 03 06:47:28 crc kubenswrapper[4475]: I1203 06:47:28.404194 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/d52a94b2-a290-48af-b060-5f3662029280-v4-0-config-system-service-ca\") pod \"oauth-openshift-558db77b4-6r542\" (UID: \"d52a94b2-a290-48af-b060-5f3662029280\") " pod="openshift-authentication/oauth-openshift-558db77b4-6r542" Dec 03 06:47:28 crc kubenswrapper[4475]: I1203 06:47:28.404291 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/d52a94b2-a290-48af-b060-5f3662029280-v4-0-config-system-session\") pod \"oauth-openshift-558db77b4-6r542\" (UID: \"d52a94b2-a290-48af-b060-5f3662029280\") " pod="openshift-authentication/oauth-openshift-558db77b4-6r542" Dec 03 06:47:28 crc kubenswrapper[4475]: I1203 06:47:28.404996 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"images\" (UniqueName: \"kubernetes.io/configmap/64eee81d-9ee2-4f0a-a95d-f32f9159e2a4-images\") pod \"machine-api-operator-5694c8668f-n4bbj\" (UID: \"64eee81d-9ee2-4f0a-a95d-f32f9159e2a4\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-n4bbj" Dec 03 06:47:28 crc kubenswrapper[4475]: I1203 06:47:28.405205 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/dea9abbf-a733-493b-b807-70ee9fa19fd1-serving-cert\") pod \"console-operator-58897d9998-7m86k\" (UID: \"dea9abbf-a733-493b-b807-70ee9fa19fd1\") " pod="openshift-console-operator/console-operator-58897d9998-7m86k" Dec 03 06:47:28 crc kubenswrapper[4475]: I1203 06:47:28.405286 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/dcef9514-d760-40e8-9054-75b17a2dde9f-config\") pod \"machine-approver-56656f9798-8pdgn\" (UID: \"dcef9514-d760-40e8-9054-75b17a2dde9f\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-8pdgn" Dec 03 06:47:28 crc kubenswrapper[4475]: I1203 06:47:28.405393 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/dcef9514-d760-40e8-9054-75b17a2dde9f-machine-approver-tls\") pod \"machine-approver-56656f9798-8pdgn\" (UID: \"dcef9514-d760-40e8-9054-75b17a2dde9f\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-8pdgn" Dec 03 06:47:28 crc kubenswrapper[4475]: I1203 06:47:28.405697 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/09928a8e-a70b-4916-9ae2-4dbe952aa514-console-oauth-config\") pod \"console-f9d7485db-dbxhk\" (UID: \"09928a8e-a70b-4916-9ae2-4dbe952aa514\") " pod="openshift-console/console-f9d7485db-dbxhk" Dec 03 06:47:28 crc kubenswrapper[4475]: I1203 06:47:28.407273 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/d52a94b2-a290-48af-b060-5f3662029280-v4-0-config-user-template-login\") pod \"oauth-openshift-558db77b4-6r542\" (UID: \"d52a94b2-a290-48af-b060-5f3662029280\") " pod="openshift-authentication/oauth-openshift-558db77b4-6r542" Dec 03 06:47:28 crc kubenswrapper[4475]: I1203 06:47:28.407412 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/fab1c20e-bbf0-442f-ada0-5647d493ad6c-serving-cert\") pod \"apiserver-7bbb656c7d-tjg56\" (UID: \"fab1c20e-bbf0-442f-ada0-5647d493ad6c\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-tjg56" Dec 03 06:47:28 crc kubenswrapper[4475]: I1203 06:47:28.407759 4475 operation_generator.go:637] "MountVolume.SetUp succeeded 
for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/d52a94b2-a290-48af-b060-5f3662029280-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-558db77b4-6r542\" (UID: \"d52a94b2-a290-48af-b060-5f3662029280\") " pod="openshift-authentication/oauth-openshift-558db77b4-6r542" Dec 03 06:47:28 crc kubenswrapper[4475]: I1203 06:47:28.407862 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/30615409-a282-4405-afab-4802d9c27a3a-serving-cert\") pod \"route-controller-manager-6576b87f9c-dlggp\" (UID: \"30615409-a282-4405-afab-4802d9c27a3a\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-dlggp" Dec 03 06:47:28 crc kubenswrapper[4475]: I1203 06:47:28.409437 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/d52a94b2-a290-48af-b060-5f3662029280-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-558db77b4-6r542\" (UID: \"d52a94b2-a290-48af-b060-5f3662029280\") " pod="openshift-authentication/oauth-openshift-558db77b4-6r542" Dec 03 06:47:28 crc kubenswrapper[4475]: I1203 06:47:28.409644 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/d52a94b2-a290-48af-b060-5f3662029280-v4-0-config-system-serving-cert\") pod \"oauth-openshift-558db77b4-6r542\" (UID: \"d52a94b2-a290-48af-b060-5f3662029280\") " pod="openshift-authentication/oauth-openshift-558db77b4-6r542" Dec 03 06:47:28 crc kubenswrapper[4475]: I1203 06:47:28.409648 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/65b544d3-889f-4b29-ba88-961ad04782bf-serving-cert\") pod \"openshift-config-operator-7777fb866f-q5cjz\" (UID: \"65b544d3-889f-4b29-ba88-961ad04782bf\") " 
pod="openshift-config-operator/openshift-config-operator-7777fb866f-q5cjz" Dec 03 06:47:28 crc kubenswrapper[4475]: I1203 06:47:28.409940 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/fab1c20e-bbf0-442f-ada0-5647d493ad6c-etcd-client\") pod \"apiserver-7bbb656c7d-tjg56\" (UID: \"fab1c20e-bbf0-442f-ada0-5647d493ad6c\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-tjg56" Dec 03 06:47:28 crc kubenswrapper[4475]: I1203 06:47:28.410733 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/d52a94b2-a290-48af-b060-5f3662029280-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-558db77b4-6r542\" (UID: \"d52a94b2-a290-48af-b060-5f3662029280\") " pod="openshift-authentication/oauth-openshift-558db77b4-6r542" Dec 03 06:47:28 crc kubenswrapper[4475]: I1203 06:47:28.411719 4475 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"kube-root-ca.crt" Dec 03 06:47:28 crc kubenswrapper[4475]: I1203 06:47:28.417372 4475 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-server-ftshs"] Dec 03 06:47:28 crc kubenswrapper[4475]: I1203 06:47:28.418850 4475 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/machine-config-server-ftshs" Dec 03 06:47:28 crc kubenswrapper[4475]: I1203 06:47:28.431166 4475 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mcc-proxy-tls" Dec 03 06:47:28 crc kubenswrapper[4475]: I1203 06:47:28.451216 4475 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"signing-cabundle" Dec 03 06:47:28 crc kubenswrapper[4475]: I1203 06:47:28.471404 4475 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-controller-dockercfg-c2lfx" Dec 03 06:47:28 crc kubenswrapper[4475]: I1203 06:47:28.490677 4475 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"service-ca-dockercfg-pn86c" Dec 03 06:47:28 crc kubenswrapper[4475]: I1203 06:47:28.494616 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/8ccba36e-7b88-4f9b-9706-1e441aa2c59a-webhook-cert\") pod \"packageserver-d55dfcdfc-x5sj6\" (UID: \"8ccba36e-7b88-4f9b-9706-1e441aa2c59a\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-x5sj6" Dec 03 06:47:28 crc kubenswrapper[4475]: I1203 06:47:28.494645 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/0fd3dcd7-41fd-4e0c-be75-e8464be7696e-kube-api-access\") pod \"kube-controller-manager-operator-78b949d7b-28t76\" (UID: \"0fd3dcd7-41fd-4e0c-be75-e8464be7696e\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-28t76" Dec 03 06:47:28 crc kubenswrapper[4475]: I1203 06:47:28.494665 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/83c4eef5-5508-470d-8b7a-b7da9d4706d4-config-volume\") pod 
\"collect-profiles-29412405-wwr7n\" (UID: \"83c4eef5-5508-470d-8b7a-b7da9d4706d4\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29412405-wwr7n" Dec 03 06:47:28 crc kubenswrapper[4475]: I1203 06:47:28.494697 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/7f6163e8-ce0d-481b-8483-4b9e04d381e6-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-kfvwc\" (UID: \"7f6163e8-ce0d-481b-8483-4b9e04d381e6\") " pod="openshift-marketplace/marketplace-operator-79b997595-kfvwc" Dec 03 06:47:28 crc kubenswrapper[4475]: I1203 06:47:28.494712 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-n4kdx\" (UniqueName: \"kubernetes.io/projected/8ccba36e-7b88-4f9b-9706-1e441aa2c59a-kube-api-access-n4kdx\") pod \"packageserver-d55dfcdfc-x5sj6\" (UID: \"8ccba36e-7b88-4f9b-9706-1e441aa2c59a\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-x5sj6" Dec 03 06:47:28 crc kubenswrapper[4475]: I1203 06:47:28.494725 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qpt8q\" (UniqueName: \"kubernetes.io/projected/423a50bb-8a96-49ba-99da-7258729fd2af-kube-api-access-qpt8q\") pod \"openshift-controller-manager-operator-756b6f6bc6-6gvwt\" (UID: \"423a50bb-8a96-49ba-99da-7258729fd2af\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-6gvwt" Dec 03 06:47:28 crc kubenswrapper[4475]: I1203 06:47:28.494745 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/423a50bb-8a96-49ba-99da-7258729fd2af-serving-cert\") pod \"openshift-controller-manager-operator-756b6f6bc6-6gvwt\" (UID: \"423a50bb-8a96-49ba-99da-7258729fd2af\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-6gvwt" Dec 03 06:47:28 crc 
kubenswrapper[4475]: I1203 06:47:28.494766 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/fb67a319-3ec2-4759-bdfb-46452f4f7010-serving-cert\") pod \"etcd-operator-b45778765-chxcn\" (UID: \"fb67a319-3ec2-4759-bdfb-46452f4f7010\") " pod="openshift-etcd-operator/etcd-operator-b45778765-chxcn" Dec 03 06:47:28 crc kubenswrapper[4475]: I1203 06:47:28.494778 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/8ccba36e-7b88-4f9b-9706-1e441aa2c59a-tmpfs\") pod \"packageserver-d55dfcdfc-x5sj6\" (UID: \"8ccba36e-7b88-4f9b-9706-1e441aa2c59a\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-x5sj6" Dec 03 06:47:28 crc kubenswrapper[4475]: I1203 06:47:28.494793 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xrvnc\" (UniqueName: \"kubernetes.io/projected/7f6163e8-ce0d-481b-8483-4b9e04d381e6-kube-api-access-xrvnc\") pod \"marketplace-operator-79b997595-kfvwc\" (UID: \"7f6163e8-ce0d-481b-8483-4b9e04d381e6\") " pod="openshift-marketplace/marketplace-operator-79b997595-kfvwc" Dec 03 06:47:28 crc kubenswrapper[4475]: I1203 06:47:28.494815 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/07c84deb-ccb6-4597-a122-fdc9f6acb015-serving-cert\") pod \"authentication-operator-69f744f599-6lqs9\" (UID: \"07c84deb-ccb6-4597-a122-fdc9f6acb015\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-6lqs9" Dec 03 06:47:28 crc kubenswrapper[4475]: I1203 06:47:28.494835 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-thrvj\" (UniqueName: \"kubernetes.io/projected/fb67a319-3ec2-4759-bdfb-46452f4f7010-kube-api-access-thrvj\") pod \"etcd-operator-b45778765-chxcn\" (UID: \"fb67a319-3ec2-4759-bdfb-46452f4f7010\") " 
pod="openshift-etcd-operator/etcd-operator-b45778765-chxcn" Dec 03 06:47:28 crc kubenswrapper[4475]: I1203 06:47:28.494859 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5e8a37fc-fca6-43a2-83f2-e4c3d7916343-serving-cert\") pod \"service-ca-operator-777779d784-mw2kv\" (UID: \"5e8a37fc-fca6-43a2-83f2-e4c3d7916343\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-mw2kv" Dec 03 06:47:28 crc kubenswrapper[4475]: I1203 06:47:28.494873 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6v9ll\" (UniqueName: \"kubernetes.io/projected/5e8a37fc-fca6-43a2-83f2-e4c3d7916343-kube-api-access-6v9ll\") pod \"service-ca-operator-777779d784-mw2kv\" (UID: \"5e8a37fc-fca6-43a2-83f2-e4c3d7916343\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-mw2kv" Dec 03 06:47:28 crc kubenswrapper[4475]: I1203 06:47:28.494886 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/fb67a319-3ec2-4759-bdfb-46452f4f7010-etcd-ca\") pod \"etcd-operator-b45778765-chxcn\" (UID: \"fb67a319-3ec2-4759-bdfb-46452f4f7010\") " pod="openshift-etcd-operator/etcd-operator-b45778765-chxcn" Dec 03 06:47:28 crc kubenswrapper[4475]: I1203 06:47:28.494902 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/8ccba36e-7b88-4f9b-9706-1e441aa2c59a-apiservice-cert\") pod \"packageserver-d55dfcdfc-x5sj6\" (UID: \"8ccba36e-7b88-4f9b-9706-1e441aa2c59a\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-x5sj6" Dec 03 06:47:28 crc kubenswrapper[4475]: I1203 06:47:28.494914 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/fb67a319-3ec2-4759-bdfb-46452f4f7010-etcd-client\") pod 
\"etcd-operator-b45778765-chxcn\" (UID: \"fb67a319-3ec2-4759-bdfb-46452f4f7010\") " pod="openshift-etcd-operator/etcd-operator-b45778765-chxcn" Dec 03 06:47:28 crc kubenswrapper[4475]: I1203 06:47:28.494926 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4z2px\" (UniqueName: \"kubernetes.io/projected/80a45948-f52d-4e57-8611-37ea99eefb3c-kube-api-access-4z2px\") pod \"migrator-59844c95c7-4845w\" (UID: \"80a45948-f52d-4e57-8611-37ea99eefb3c\") " pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-4845w" Dec 03 06:47:28 crc kubenswrapper[4475]: I1203 06:47:28.494950 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/7f6163e8-ce0d-481b-8483-4b9e04d381e6-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-kfvwc\" (UID: \"7f6163e8-ce0d-481b-8483-4b9e04d381e6\") " pod="openshift-marketplace/marketplace-operator-79b997595-kfvwc" Dec 03 06:47:28 crc kubenswrapper[4475]: I1203 06:47:28.494963 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fb67a319-3ec2-4759-bdfb-46452f4f7010-config\") pod \"etcd-operator-b45778765-chxcn\" (UID: \"fb67a319-3ec2-4759-bdfb-46452f4f7010\") " pod="openshift-etcd-operator/etcd-operator-b45778765-chxcn" Dec 03 06:47:28 crc kubenswrapper[4475]: I1203 06:47:28.494976 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/423a50bb-8a96-49ba-99da-7258729fd2af-config\") pod \"openshift-controller-manager-operator-756b6f6bc6-6gvwt\" (UID: \"423a50bb-8a96-49ba-99da-7258729fd2af\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-6gvwt" Dec 03 06:47:28 crc kubenswrapper[4475]: I1203 06:47:28.494991 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"serving-cert\" (UniqueName: \"kubernetes.io/secret/6abdfe97-cf24-4ec5-8aee-f67cf30bb2e3-serving-cert\") pod \"kube-storage-version-migrator-operator-b67b599dd-9tsjv\" (UID: \"6abdfe97-cf24-4ec5-8aee-f67cf30bb2e3\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-9tsjv" Dec 03 06:47:28 crc kubenswrapper[4475]: I1203 06:47:28.495004 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6abdfe97-cf24-4ec5-8aee-f67cf30bb2e3-config\") pod \"kube-storage-version-migrator-operator-b67b599dd-9tsjv\" (UID: \"6abdfe97-cf24-4ec5-8aee-f67cf30bb2e3\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-9tsjv" Dec 03 06:47:28 crc kubenswrapper[4475]: I1203 06:47:28.495016 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nd2xx\" (UniqueName: \"kubernetes.io/projected/21a4d7e9-ea88-4f43-9d43-109df1bb4766-kube-api-access-nd2xx\") pod \"router-default-5444994796-wkrx4\" (UID: \"21a4d7e9-ea88-4f43-9d43-109df1bb4766\") " pod="openshift-ingress/router-default-5444994796-wkrx4" Dec 03 06:47:28 crc kubenswrapper[4475]: I1203 06:47:28.495029 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/83c4eef5-5508-470d-8b7a-b7da9d4706d4-secret-volume\") pod \"collect-profiles-29412405-wwr7n\" (UID: \"83c4eef5-5508-470d-8b7a-b7da9d4706d4\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29412405-wwr7n" Dec 03 06:47:28 crc kubenswrapper[4475]: I1203 06:47:28.495042 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-85526\" (UniqueName: \"kubernetes.io/projected/c0036df6-fc1c-4945-97b0-7c6ce6e5f806-kube-api-access-85526\") pod \"multus-admission-controller-857f4d67dd-bghqv\" (UID: \"c0036df6-fc1c-4945-97b0-7c6ce6e5f806\") 
" pod="openshift-multus/multus-admission-controller-857f4d67dd-bghqv" Dec 03 06:47:28 crc kubenswrapper[4475]: I1203 06:47:28.495062 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/07c84deb-ccb6-4597-a122-fdc9f6acb015-trusted-ca-bundle\") pod \"authentication-operator-69f744f599-6lqs9\" (UID: \"07c84deb-ccb6-4597-a122-fdc9f6acb015\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-6lqs9" Dec 03 06:47:28 crc kubenswrapper[4475]: I1203 06:47:28.495077 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-msq6m\" (UniqueName: \"kubernetes.io/projected/07c84deb-ccb6-4597-a122-fdc9f6acb015-kube-api-access-msq6m\") pod \"authentication-operator-69f744f599-6lqs9\" (UID: \"07c84deb-ccb6-4597-a122-fdc9f6acb015\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-6lqs9" Dec 03 06:47:28 crc kubenswrapper[4475]: I1203 06:47:28.495091 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/842ad738-0ffb-4986-9372-a26f8bc6119a-proxy-tls\") pod \"machine-config-controller-84d6567774-xv2gh\" (UID: \"842ad738-0ffb-4986-9372-a26f8bc6119a\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-xv2gh" Dec 03 06:47:28 crc kubenswrapper[4475]: I1203 06:47:28.495105 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5e8a37fc-fca6-43a2-83f2-e4c3d7916343-config\") pod \"service-ca-operator-777779d784-mw2kv\" (UID: \"5e8a37fc-fca6-43a2-83f2-e4c3d7916343\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-mw2kv" Dec 03 06:47:28 crc kubenswrapper[4475]: I1203 06:47:28.495117 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"default-certificate\" (UniqueName: 
\"kubernetes.io/secret/21a4d7e9-ea88-4f43-9d43-109df1bb4766-default-certificate\") pod \"router-default-5444994796-wkrx4\" (UID: \"21a4d7e9-ea88-4f43-9d43-109df1bb4766\") " pod="openshift-ingress/router-default-5444994796-wkrx4" Dec 03 06:47:28 crc kubenswrapper[4475]: I1203 06:47:28.495129 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/21a4d7e9-ea88-4f43-9d43-109df1bb4766-metrics-certs\") pod \"router-default-5444994796-wkrx4\" (UID: \"21a4d7e9-ea88-4f43-9d43-109df1bb4766\") " pod="openshift-ingress/router-default-5444994796-wkrx4" Dec 03 06:47:28 crc kubenswrapper[4475]: I1203 06:47:28.495142 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pk6mb\" (UniqueName: \"kubernetes.io/projected/83c4eef5-5508-470d-8b7a-b7da9d4706d4-kube-api-access-pk6mb\") pod \"collect-profiles-29412405-wwr7n\" (UID: \"83c4eef5-5508-470d-8b7a-b7da9d4706d4\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29412405-wwr7n" Dec 03 06:47:28 crc kubenswrapper[4475]: I1203 06:47:28.495164 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/842ad738-0ffb-4986-9372-a26f8bc6119a-mcc-auth-proxy-config\") pod \"machine-config-controller-84d6567774-xv2gh\" (UID: \"842ad738-0ffb-4986-9372-a26f8bc6119a\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-xv2gh" Dec 03 06:47:28 crc kubenswrapper[4475]: I1203 06:47:28.495177 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/c0036df6-fc1c-4945-97b0-7c6ce6e5f806-webhook-certs\") pod \"multus-admission-controller-857f4d67dd-bghqv\" (UID: \"c0036df6-fc1c-4945-97b0-7c6ce6e5f806\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-bghqv" Dec 03 06:47:28 crc kubenswrapper[4475]: I1203 
06:47:28.495189 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/21a4d7e9-ea88-4f43-9d43-109df1bb4766-stats-auth\") pod \"router-default-5444994796-wkrx4\" (UID: \"21a4d7e9-ea88-4f43-9d43-109df1bb4766\") " pod="openshift-ingress/router-default-5444994796-wkrx4" Dec 03 06:47:28 crc kubenswrapper[4475]: I1203 06:47:28.495201 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/07c84deb-ccb6-4597-a122-fdc9f6acb015-config\") pod \"authentication-operator-69f744f599-6lqs9\" (UID: \"07c84deb-ccb6-4597-a122-fdc9f6acb015\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-6lqs9" Dec 03 06:47:28 crc kubenswrapper[4475]: I1203 06:47:28.495220 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xzxvx\" (UniqueName: \"kubernetes.io/projected/842ad738-0ffb-4986-9372-a26f8bc6119a-kube-api-access-xzxvx\") pod \"machine-config-controller-84d6567774-xv2gh\" (UID: \"842ad738-0ffb-4986-9372-a26f8bc6119a\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-xv2gh" Dec 03 06:47:28 crc kubenswrapper[4475]: I1203 06:47:28.495254 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0fd3dcd7-41fd-4e0c-be75-e8464be7696e-config\") pod \"kube-controller-manager-operator-78b949d7b-28t76\" (UID: \"0fd3dcd7-41fd-4e0c-be75-e8464be7696e\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-28t76" Dec 03 06:47:28 crc kubenswrapper[4475]: I1203 06:47:28.495267 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/fb67a319-3ec2-4759-bdfb-46452f4f7010-etcd-service-ca\") pod \"etcd-operator-b45778765-chxcn\" (UID: 
\"fb67a319-3ec2-4759-bdfb-46452f4f7010\") " pod="openshift-etcd-operator/etcd-operator-b45778765-chxcn" Dec 03 06:47:28 crc kubenswrapper[4475]: I1203 06:47:28.495290 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/21a4d7e9-ea88-4f43-9d43-109df1bb4766-service-ca-bundle\") pod \"router-default-5444994796-wkrx4\" (UID: \"21a4d7e9-ea88-4f43-9d43-109df1bb4766\") " pod="openshift-ingress/router-default-5444994796-wkrx4" Dec 03 06:47:28 crc kubenswrapper[4475]: I1203 06:47:28.495308 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0fd3dcd7-41fd-4e0c-be75-e8464be7696e-serving-cert\") pod \"kube-controller-manager-operator-78b949d7b-28t76\" (UID: \"0fd3dcd7-41fd-4e0c-be75-e8464be7696e\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-28t76" Dec 03 06:47:28 crc kubenswrapper[4475]: I1203 06:47:28.495321 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-v5xn2\" (UniqueName: \"kubernetes.io/projected/6abdfe97-cf24-4ec5-8aee-f67cf30bb2e3-kube-api-access-v5xn2\") pod \"kube-storage-version-migrator-operator-b67b599dd-9tsjv\" (UID: \"6abdfe97-cf24-4ec5-8aee-f67cf30bb2e3\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-9tsjv" Dec 03 06:47:28 crc kubenswrapper[4475]: I1203 06:47:28.496381 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/07c84deb-ccb6-4597-a122-fdc9f6acb015-config\") pod \"authentication-operator-69f744f599-6lqs9\" (UID: \"07c84deb-ccb6-4597-a122-fdc9f6acb015\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-6lqs9" Dec 03 06:47:28 crc kubenswrapper[4475]: I1203 06:47:28.497280 4475 operation_generator.go:637] "MountVolume.SetUp succeeded 
for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/842ad738-0ffb-4986-9372-a26f8bc6119a-mcc-auth-proxy-config\") pod \"machine-config-controller-84d6567774-xv2gh\" (UID: \"842ad738-0ffb-4986-9372-a26f8bc6119a\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-xv2gh" Dec 03 06:47:28 crc kubenswrapper[4475]: I1203 06:47:28.497491 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/07c84deb-ccb6-4597-a122-fdc9f6acb015-service-ca-bundle\") pod \"authentication-operator-69f744f599-6lqs9\" (UID: \"07c84deb-ccb6-4597-a122-fdc9f6acb015\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-6lqs9" Dec 03 06:47:28 crc kubenswrapper[4475]: I1203 06:47:28.498178 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/07c84deb-ccb6-4597-a122-fdc9f6acb015-service-ca-bundle\") pod \"authentication-operator-69f744f599-6lqs9\" (UID: \"07c84deb-ccb6-4597-a122-fdc9f6acb015\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-6lqs9" Dec 03 06:47:28 crc kubenswrapper[4475]: I1203 06:47:28.498684 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/842ad738-0ffb-4986-9372-a26f8bc6119a-proxy-tls\") pod \"machine-config-controller-84d6567774-xv2gh\" (UID: \"842ad738-0ffb-4986-9372-a26f8bc6119a\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-xv2gh" Dec 03 06:47:28 crc kubenswrapper[4475]: I1203 06:47:28.499159 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/7f6163e8-ce0d-481b-8483-4b9e04d381e6-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-kfvwc\" (UID: \"7f6163e8-ce0d-481b-8483-4b9e04d381e6\") " 
pod="openshift-marketplace/marketplace-operator-79b997595-kfvwc"
Dec 03 06:47:28 crc kubenswrapper[4475]: I1203 06:47:28.500293 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/8ccba36e-7b88-4f9b-9706-1e441aa2c59a-tmpfs\") pod \"packageserver-d55dfcdfc-x5sj6\" (UID: \"8ccba36e-7b88-4f9b-9706-1e441aa2c59a\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-x5sj6"
Dec 03 06:47:28 crc kubenswrapper[4475]: I1203 06:47:28.500358 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/07c84deb-ccb6-4597-a122-fdc9f6acb015-trusted-ca-bundle\") pod \"authentication-operator-69f744f599-6lqs9\" (UID: \"07c84deb-ccb6-4597-a122-fdc9f6acb015\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-6lqs9"
Dec 03 06:47:28 crc kubenswrapper[4475]: I1203 06:47:28.503311 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/07c84deb-ccb6-4597-a122-fdc9f6acb015-serving-cert\") pod \"authentication-operator-69f744f599-6lqs9\" (UID: \"07c84deb-ccb6-4597-a122-fdc9f6acb015\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-6lqs9"
Dec 03 06:47:28 crc kubenswrapper[4475]: I1203 06:47:28.504838 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/7f6163e8-ce0d-481b-8483-4b9e04d381e6-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-kfvwc\" (UID: \"7f6163e8-ce0d-481b-8483-4b9e04d381e6\") " pod="openshift-marketplace/marketplace-operator-79b997595-kfvwc"
Dec 03 06:47:28 crc kubenswrapper[4475]: I1203 06:47:28.511577 4475 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"signing-key"
Dec 03 06:47:28 crc kubenswrapper[4475]: I1203 06:47:28.530697 4475 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"kube-root-ca.crt"
Dec 03 06:47:28 crc kubenswrapper[4475]: I1203 06:47:28.550991 4475 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"openshift-service-ca.crt"
Dec 03 06:47:28 crc kubenswrapper[4475]: I1203 06:47:28.590994 4475 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-dockercfg-zdk86"
Dec 03 06:47:28 crc kubenswrapper[4475]: I1203 06:47:28.610623 4475 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-certs-default"
Dec 03 06:47:28 crc kubenswrapper[4475]: I1203 06:47:28.619073 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/21a4d7e9-ea88-4f43-9d43-109df1bb4766-default-certificate\") pod \"router-default-5444994796-wkrx4\" (UID: \"21a4d7e9-ea88-4f43-9d43-109df1bb4766\") " pod="openshift-ingress/router-default-5444994796-wkrx4"
Dec 03 06:47:28 crc kubenswrapper[4475]: I1203 06:47:28.631292 4475 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-metrics-certs-default"
Dec 03 06:47:28 crc kubenswrapper[4475]: I1203 06:47:28.639060 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/21a4d7e9-ea88-4f43-9d43-109df1bb4766-metrics-certs\") pod \"router-default-5444994796-wkrx4\" (UID: \"21a4d7e9-ea88-4f43-9d43-109df1bb4766\") " pod="openshift-ingress/router-default-5444994796-wkrx4"
Dec 03 06:47:28 crc kubenswrapper[4475]: I1203 06:47:28.651429 4475 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"service-ca-bundle"
Dec 03 06:47:28 crc kubenswrapper[4475]: I1203 06:47:28.659160 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/21a4d7e9-ea88-4f43-9d43-109df1bb4766-service-ca-bundle\") pod \"router-default-5444994796-wkrx4\" (UID: \"21a4d7e9-ea88-4f43-9d43-109df1bb4766\") " pod="openshift-ingress/router-default-5444994796-wkrx4"
Dec 03 06:47:28 crc kubenswrapper[4475]: I1203 06:47:28.671625 4475 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"kube-root-ca.crt"
Dec 03 06:47:28 crc kubenswrapper[4475]: I1203 06:47:28.691013 4475 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-stats-default"
Dec 03 06:47:28 crc kubenswrapper[4475]: I1203 06:47:28.700074 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/21a4d7e9-ea88-4f43-9d43-109df1bb4766-stats-auth\") pod \"router-default-5444994796-wkrx4\" (UID: \"21a4d7e9-ea88-4f43-9d43-109df1bb4766\") " pod="openshift-ingress/router-default-5444994796-wkrx4"
Dec 03 06:47:28 crc kubenswrapper[4475]: I1203 06:47:28.710807 4475 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"openshift-service-ca.crt"
Dec 03 06:47:28 crc kubenswrapper[4475]: I1203 06:47:28.730640 4475 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"openshift-service-ca.crt"
Dec 03 06:47:28 crc kubenswrapper[4475]: I1203 06:47:28.751035 4475 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator"/"kube-storage-version-migrator-sa-dockercfg-5xfcg"
Dec 03 06:47:28 crc kubenswrapper[4475]: I1203 06:47:28.771066 4475 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"kube-root-ca.crt"
Dec 03 06:47:28 crc kubenswrapper[4475]: I1203 06:47:28.791246 4475 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"registry-dockercfg-kzzsd"
Dec 03 06:47:28 crc kubenswrapper[4475]: I1203 06:47:28.811044 4475 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-tls"
Dec 03 06:47:28 crc kubenswrapper[4475]: I1203 06:47:28.830989 4475 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"installation-pull-secrets"
Dec 03 06:47:28 crc kubenswrapper[4475]: I1203 06:47:28.850816 4475 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-serving-cert"
Dec 03 06:47:28 crc kubenswrapper[4475]: I1203 06:47:28.870849 4475 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-dockercfg-x57mr"
Dec 03 06:47:28 crc kubenswrapper[4475]: I1203 06:47:28.890766 4475 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-root-ca.crt"
Dec 03 06:47:28 crc kubenswrapper[4475]: I1203 06:47:28.910949 4475 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-config"
Dec 03 06:47:28 crc kubenswrapper[4475]: I1203 06:47:28.930659 4475 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-serving-cert"
Dec 03 06:47:28 crc kubenswrapper[4475]: I1203 06:47:28.942567 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/423a50bb-8a96-49ba-99da-7258729fd2af-serving-cert\") pod \"openshift-controller-manager-operator-756b6f6bc6-6gvwt\" (UID: \"423a50bb-8a96-49ba-99da-7258729fd2af\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-6gvwt"
Dec 03 06:47:28 crc kubenswrapper[4475]: I1203 06:47:28.950684 4475 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-config"
Dec 03 06:47:28 crc kubenswrapper[4475]: I1203 06:47:28.960361 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/423a50bb-8a96-49ba-99da-7258729fd2af-config\") pod \"openshift-controller-manager-operator-756b6f6bc6-6gvwt\" (UID: \"423a50bb-8a96-49ba-99da-7258729fd2af\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-6gvwt"
Dec 03 06:47:28 crc kubenswrapper[4475]: I1203 06:47:28.971702 4475 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-dockercfg-vw8fw"
Dec 03 06:47:28 crc kubenswrapper[4475]: I1203 06:47:28.990951 4475 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"kube-root-ca.crt"
Dec 03 06:47:29 crc kubenswrapper[4475]: I1203 06:47:29.010695 4475 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-service-ca.crt"
Dec 03 06:47:29 crc kubenswrapper[4475]: I1203 06:47:29.031399 4475 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-config"
Dec 03 06:47:29 crc kubenswrapper[4475]: I1203 06:47:29.051253 4475 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"kube-root-ca.crt"
Dec 03 06:47:29 crc kubenswrapper[4475]: I1203 06:47:29.071152 4475 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-dockercfg-qt55r"
Dec 03 06:47:29 crc kubenswrapper[4475]: I1203 06:47:29.091162 4475 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"kube-scheduler-operator-serving-cert"
Dec 03 06:47:29 crc kubenswrapper[4475]: I1203 06:47:29.110609 4475 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"openshift-service-ca.crt"
Dec 03 06:47:29 crc kubenswrapper[4475]: I1203 06:47:29.132389 4475 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"catalog-operator-serving-cert"
Dec 03 06:47:29 crc kubenswrapper[4475]: I1203 06:47:29.151553 4475 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serviceaccount-dockercfg-rq7zk"
Dec 03 06:47:29 crc kubenswrapper[4475]: I1203 06:47:29.171439 4475 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serving-cert"
Dec 03 06:47:29 crc kubenswrapper[4475]: I1203 06:47:29.191228 4475 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"pprof-cert"
Dec 03 06:47:29 crc kubenswrapper[4475]: I1203 06:47:29.201799 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/83c4eef5-5508-470d-8b7a-b7da9d4706d4-secret-volume\") pod \"collect-profiles-29412405-wwr7n\" (UID: \"83c4eef5-5508-470d-8b7a-b7da9d4706d4\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29412405-wwr7n"
Dec 03 06:47:29 crc kubenswrapper[4475]: I1203 06:47:29.211046 4475 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"kube-root-ca.crt"
Dec 03 06:47:29 crc kubenswrapper[4475]: I1203 06:47:29.231275 4475 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"package-server-manager-serving-cert"
Dec 03 06:47:29 crc kubenswrapper[4475]: I1203 06:47:29.251415 4475 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"openshift-service-ca.crt"
Dec 03 06:47:29 crc kubenswrapper[4475]: I1203 06:47:29.271471 4475 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"packageserver-service-cert"
Dec 03 06:47:29 crc kubenswrapper[4475]: I1203 06:47:29.277657 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/8ccba36e-7b88-4f9b-9706-1e441aa2c59a-webhook-cert\") pod \"packageserver-d55dfcdfc-x5sj6\" (UID: \"8ccba36e-7b88-4f9b-9706-1e441aa2c59a\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-x5sj6"
Dec 03 06:47:29 crc kubenswrapper[4475]: I1203 06:47:29.277994 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/8ccba36e-7b88-4f9b-9706-1e441aa2c59a-apiservice-cert\") pod \"packageserver-d55dfcdfc-x5sj6\" (UID: \"8ccba36e-7b88-4f9b-9706-1e441aa2c59a\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-x5sj6"
Dec 03 06:47:29 crc kubenswrapper[4475]: I1203 06:47:29.291184 4475 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"config"
Dec 03 06:47:29 crc kubenswrapper[4475]: I1203 06:47:29.299828 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6abdfe97-cf24-4ec5-8aee-f67cf30bb2e3-config\") pod \"kube-storage-version-migrator-operator-b67b599dd-9tsjv\" (UID: \"6abdfe97-cf24-4ec5-8aee-f67cf30bb2e3\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-9tsjv"
Dec 03 06:47:29 crc kubenswrapper[4475]: I1203 06:47:29.311360 4475 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"service-ca-operator-dockercfg-rg9jl"
Dec 03 06:47:29 crc kubenswrapper[4475]: I1203 06:47:29.330269 4475 request.go:700] Waited for 1.014049267s due to client-side throttling, not priority and fairness, request: GET:https://api-int.crc.testing:6443/api/v1/namespaces/openshift-service-ca-operator/secrets?fieldSelector=metadata.name%3Dserving-cert&limit=500&resourceVersion=0
Dec 03 06:47:29 crc kubenswrapper[4475]: I1203 06:47:29.331152 4475 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"serving-cert"
Dec 03 06:47:29 crc kubenswrapper[4475]: I1203 06:47:29.339236 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5e8a37fc-fca6-43a2-83f2-e4c3d7916343-serving-cert\") pod \"service-ca-operator-777779d784-mw2kv\" (UID: \"5e8a37fc-fca6-43a2-83f2-e4c3d7916343\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-mw2kv"
Dec 03 06:47:29 crc kubenswrapper[4475]: I1203 06:47:29.351589 4475 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"service-ca-operator-config"
Dec 03 06:47:29 crc kubenswrapper[4475]: I1203 06:47:29.357916 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5e8a37fc-fca6-43a2-83f2-e4c3d7916343-config\") pod \"service-ca-operator-777779d784-mw2kv\" (UID: \"5e8a37fc-fca6-43a2-83f2-e4c3d7916343\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-mw2kv"
Dec 03 06:47:29 crc kubenswrapper[4475]: I1203 06:47:29.371355 4475 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"kube-root-ca.crt"
Dec 03 06:47:29 crc kubenswrapper[4475]: I1203 06:47:29.391415 4475 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"serving-cert"
Dec 03 06:47:29 crc kubenswrapper[4475]: I1203 06:47:29.400583 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6abdfe97-cf24-4ec5-8aee-f67cf30bb2e3-serving-cert\") pod \"kube-storage-version-migrator-operator-b67b599dd-9tsjv\" (UID: \"6abdfe97-cf24-4ec5-8aee-f67cf30bb2e3\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-9tsjv"
Dec 03 06:47:29 crc kubenswrapper[4475]: I1203 06:47:29.410692 4475 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"kube-storage-version-migrator-operator-dockercfg-2bh8d"
Dec 03 06:47:29 crc kubenswrapper[4475]: I1203 06:47:29.431784 4475 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"kube-root-ca.crt"
Dec 03 06:47:29 crc kubenswrapper[4475]: I1203 06:47:29.451632 4475 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"openshift-service-ca.crt"
Dec 03 06:47:29 crc kubenswrapper[4475]: I1203 06:47:29.471521 4475 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-root-ca.crt"
Dec 03 06:47:29 crc kubenswrapper[4475]: I1203 06:47:29.491200 4475 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ac-dockercfg-9lkdf"
Dec 03 06:47:29 crc kubenswrapper[4475]: E1203 06:47:29.496028 4475 configmap.go:193] Couldn't get configMap openshift-operator-lifecycle-manager/collect-profiles-config: failed to sync configmap cache: timed out waiting for the condition
Dec 03 06:47:29 crc kubenswrapper[4475]: E1203 06:47:29.496086 4475 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/83c4eef5-5508-470d-8b7a-b7da9d4706d4-config-volume podName:83c4eef5-5508-470d-8b7a-b7da9d4706d4 nodeName:}" failed. No retries permitted until 2025-12-03 06:47:29.996071977 +0000 UTC m=+134.800970312 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "config-volume" (UniqueName: "kubernetes.io/configmap/83c4eef5-5508-470d-8b7a-b7da9d4706d4-config-volume") pod "collect-profiles-29412405-wwr7n" (UID: "83c4eef5-5508-470d-8b7a-b7da9d4706d4") : failed to sync configmap cache: timed out waiting for the condition
Dec 03 06:47:29 crc kubenswrapper[4475]: E1203 06:47:29.498162 4475 secret.go:188] Couldn't get secret openshift-etcd-operator/etcd-client: failed to sync secret cache: timed out waiting for the condition
Dec 03 06:47:29 crc kubenswrapper[4475]: E1203 06:47:29.498272 4475 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/fb67a319-3ec2-4759-bdfb-46452f4f7010-etcd-client podName:fb67a319-3ec2-4759-bdfb-46452f4f7010 nodeName:}" failed. No retries permitted until 2025-12-03 06:47:29.998259477 +0000 UTC m=+134.803157811 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "etcd-client" (UniqueName: "kubernetes.io/secret/fb67a319-3ec2-4759-bdfb-46452f4f7010-etcd-client") pod "etcd-operator-b45778765-chxcn" (UID: "fb67a319-3ec2-4759-bdfb-46452f4f7010") : failed to sync secret cache: timed out waiting for the condition
Dec 03 06:47:29 crc kubenswrapper[4475]: E1203 06:47:29.498182 4475 secret.go:188] Couldn't get secret openshift-multus/multus-admission-controller-secret: failed to sync secret cache: timed out waiting for the condition
Dec 03 06:47:29 crc kubenswrapper[4475]: E1203 06:47:29.498332 4475 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c0036df6-fc1c-4945-97b0-7c6ce6e5f806-webhook-certs podName:c0036df6-fc1c-4945-97b0-7c6ce6e5f806 nodeName:}" failed. No retries permitted until 2025-12-03 06:47:29.998323187 +0000 UTC m=+134.803221521 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/c0036df6-fc1c-4945-97b0-7c6ce6e5f806-webhook-certs") pod "multus-admission-controller-857f4d67dd-bghqv" (UID: "c0036df6-fc1c-4945-97b0-7c6ce6e5f806") : failed to sync secret cache: timed out waiting for the condition
Dec 03 06:47:29 crc kubenswrapper[4475]: E1203 06:47:29.498364 4475 configmap.go:193] Couldn't get configMap openshift-kube-controller-manager-operator/kube-controller-manager-operator-config: failed to sync configmap cache: timed out waiting for the condition
Dec 03 06:47:29 crc kubenswrapper[4475]: E1203 06:47:29.498380 4475 configmap.go:193] Couldn't get configMap openshift-etcd-operator/etcd-service-ca-bundle: failed to sync configmap cache: timed out waiting for the condition
Dec 03 06:47:29 crc kubenswrapper[4475]: E1203 06:47:29.498410 4475 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/0fd3dcd7-41fd-4e0c-be75-e8464be7696e-config podName:0fd3dcd7-41fd-4e0c-be75-e8464be7696e nodeName:}" failed. No retries permitted until 2025-12-03 06:47:29.998398387 +0000 UTC m=+134.803296722 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "config" (UniqueName: "kubernetes.io/configmap/0fd3dcd7-41fd-4e0c-be75-e8464be7696e-config") pod "kube-controller-manager-operator-78b949d7b-28t76" (UID: "0fd3dcd7-41fd-4e0c-be75-e8464be7696e") : failed to sync configmap cache: timed out waiting for the condition
Dec 03 06:47:29 crc kubenswrapper[4475]: E1203 06:47:29.498429 4475 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/fb67a319-3ec2-4759-bdfb-46452f4f7010-etcd-service-ca podName:fb67a319-3ec2-4759-bdfb-46452f4f7010 nodeName:}" failed. No retries permitted until 2025-12-03 06:47:29.998418295 +0000 UTC m=+134.803316629 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "etcd-service-ca" (UniqueName: "kubernetes.io/configmap/fb67a319-3ec2-4759-bdfb-46452f4f7010-etcd-service-ca") pod "etcd-operator-b45778765-chxcn" (UID: "fb67a319-3ec2-4759-bdfb-46452f4f7010") : failed to sync configmap cache: timed out waiting for the condition
Dec 03 06:47:29 crc kubenswrapper[4475]: E1203 06:47:29.499278 4475 secret.go:188] Couldn't get secret openshift-kube-controller-manager-operator/kube-controller-manager-operator-serving-cert: failed to sync secret cache: timed out waiting for the condition
Dec 03 06:47:29 crc kubenswrapper[4475]: E1203 06:47:29.499307 4475 configmap.go:193] Couldn't get configMap openshift-etcd-operator/etcd-operator-config: failed to sync configmap cache: timed out waiting for the condition
Dec 03 06:47:29 crc kubenswrapper[4475]: E1203 06:47:29.499317 4475 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/0fd3dcd7-41fd-4e0c-be75-e8464be7696e-serving-cert podName:0fd3dcd7-41fd-4e0c-be75-e8464be7696e nodeName:}" failed. No retries permitted until 2025-12-03 06:47:29.999308322 +0000 UTC m=+134.804206656 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "serving-cert" (UniqueName: "kubernetes.io/secret/0fd3dcd7-41fd-4e0c-be75-e8464be7696e-serving-cert") pod "kube-controller-manager-operator-78b949d7b-28t76" (UID: "0fd3dcd7-41fd-4e0c-be75-e8464be7696e") : failed to sync secret cache: timed out waiting for the condition
Dec 03 06:47:29 crc kubenswrapper[4475]: E1203 06:47:29.499338 4475 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/fb67a319-3ec2-4759-bdfb-46452f4f7010-config podName:fb67a319-3ec2-4759-bdfb-46452f4f7010 nodeName:}" failed. No retries permitted until 2025-12-03 06:47:29.999327598 +0000 UTC m=+134.804225923 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "config" (UniqueName: "kubernetes.io/configmap/fb67a319-3ec2-4759-bdfb-46452f4f7010-config") pod "etcd-operator-b45778765-chxcn" (UID: "fb67a319-3ec2-4759-bdfb-46452f4f7010") : failed to sync configmap cache: timed out waiting for the condition
Dec 03 06:47:29 crc kubenswrapper[4475]: E1203 06:47:29.500377 4475 secret.go:188] Couldn't get secret openshift-etcd-operator/etcd-operator-serving-cert: failed to sync secret cache: timed out waiting for the condition
Dec 03 06:47:29 crc kubenswrapper[4475]: E1203 06:47:29.500406 4475 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/fb67a319-3ec2-4759-bdfb-46452f4f7010-serving-cert podName:fb67a319-3ec2-4759-bdfb-46452f4f7010 nodeName:}" failed. No retries permitted until 2025-12-03 06:47:30.000398606 +0000 UTC m=+134.805296940 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "serving-cert" (UniqueName: "kubernetes.io/secret/fb67a319-3ec2-4759-bdfb-46452f4f7010-serving-cert") pod "etcd-operator-b45778765-chxcn" (UID: "fb67a319-3ec2-4759-bdfb-46452f4f7010") : failed to sync secret cache: timed out waiting for the condition
Dec 03 06:47:29 crc kubenswrapper[4475]: E1203 06:47:29.500413 4475 configmap.go:193] Couldn't get configMap openshift-etcd-operator/etcd-ca-bundle: failed to sync configmap cache: timed out waiting for the condition
Dec 03 06:47:29 crc kubenswrapper[4475]: E1203 06:47:29.500440 4475 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/fb67a319-3ec2-4759-bdfb-46452f4f7010-etcd-ca podName:fb67a319-3ec2-4759-bdfb-46452f4f7010 nodeName:}" failed. No retries permitted until 2025-12-03 06:47:30.000432219 +0000 UTC m=+134.805330553 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "etcd-ca" (UniqueName: "kubernetes.io/configmap/fb67a319-3ec2-4759-bdfb-46452f4f7010-etcd-ca") pod "etcd-operator-b45778765-chxcn" (UID: "fb67a319-3ec2-4759-bdfb-46452f4f7010") : failed to sync configmap cache: timed out waiting for the condition
Dec 03 06:47:29 crc kubenswrapper[4475]: I1203 06:47:29.511414 4475 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-dockercfg-gkqpw"
Dec 03 06:47:29 crc kubenswrapper[4475]: I1203 06:47:29.531353 4475 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-serving-cert"
Dec 03 06:47:29 crc kubenswrapper[4475]: I1203 06:47:29.551375 4475 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-config"
Dec 03 06:47:29 crc kubenswrapper[4475]: I1203 06:47:29.571551 4475 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-admission-controller-secret"
Dec 03 06:47:29 crc kubenswrapper[4475]: I1203 06:47:29.590815 4475 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config"
Dec 03 06:47:29 crc kubenswrapper[4475]: I1203 06:47:29.611121 4475 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t"
Dec 03 06:47:29 crc kubenswrapper[4475]: I1203 06:47:29.631107 4475 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"openshift-service-ca.crt"
Dec 03 06:47:29 crc kubenswrapper[4475]: I1203 06:47:29.651615 4475 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"dns-operator-dockercfg-9mqw5"
Dec 03 06:47:29 crc kubenswrapper[4475]: I1203 06:47:29.670816 4475 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"metrics-tls"
Dec 03 06:47:29 crc kubenswrapper[4475]: I1203 06:47:29.690925 4475 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"kube-root-ca.crt"
Dec 03 06:47:29 crc kubenswrapper[4475]: I1203 06:47:29.711110 4475 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"metrics-tls"
Dec 03 06:47:29 crc kubenswrapper[4475]: I1203 06:47:29.731221 4475 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"openshift-service-ca.crt"
Dec 03 06:47:29 crc kubenswrapper[4475]: I1203 06:47:29.762027 4475 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"trusted-ca"
Dec 03 06:47:29 crc kubenswrapper[4475]: I1203 06:47:29.771292 4475 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"kube-root-ca.crt"
Dec 03 06:47:29 crc kubenswrapper[4475]: I1203 06:47:29.791750 4475 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"ingress-operator-dockercfg-7lnqk"
Dec 03 06:47:29 crc kubenswrapper[4475]: I1203 06:47:29.810718 4475 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"openshift-service-ca.crt"
Dec 03 06:47:29 crc kubenswrapper[4475]: I1203 06:47:29.831322 4475 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-service-ca-bundle"
Dec 03 06:47:29 crc kubenswrapper[4475]: I1203 06:47:29.851119 4475 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"kube-root-ca.crt"
Dec 03 06:47:29 crc kubenswrapper[4475]: I1203 06:47:29.871750 4475 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-serving-cert"
Dec 03 06:47:29 crc kubenswrapper[4475]: I1203 06:47:29.891319 4475 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-ca-bundle"
Dec 03 06:47:29 crc kubenswrapper[4475]: I1203 06:47:29.911143 4475 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-dockercfg-r9srn"
Dec 03 06:47:29 crc kubenswrapper[4475]: I1203 06:47:29.931046 4475 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-client"
Dec 03 06:47:29 crc kubenswrapper[4475]: I1203 06:47:29.951163 4475 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-operator-config"
Dec 03 06:47:29 crc kubenswrapper[4475]: I1203 06:47:29.991657 4475 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"openshift-service-ca.crt"
Dec 03 06:47:30 crc kubenswrapper[4475]: I1203 06:47:30.011020 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/fb67a319-3ec2-4759-bdfb-46452f4f7010-serving-cert\") pod \"etcd-operator-b45778765-chxcn\" (UID: \"fb67a319-3ec2-4759-bdfb-46452f4f7010\") " pod="openshift-etcd-operator/etcd-operator-b45778765-chxcn"
Dec 03 06:47:30 crc kubenswrapper[4475]: I1203 06:47:30.011072 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/fb67a319-3ec2-4759-bdfb-46452f4f7010-etcd-ca\") pod \"etcd-operator-b45778765-chxcn\" (UID: \"fb67a319-3ec2-4759-bdfb-46452f4f7010\") " pod="openshift-etcd-operator/etcd-operator-b45778765-chxcn"
Dec 03 06:47:30 crc kubenswrapper[4475]: I1203 06:47:30.011090 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/fb67a319-3ec2-4759-bdfb-46452f4f7010-etcd-client\") pod \"etcd-operator-b45778765-chxcn\" (UID: \"fb67a319-3ec2-4759-bdfb-46452f4f7010\") " pod="openshift-etcd-operator/etcd-operator-b45778765-chxcn"
Dec 03 06:47:30 crc kubenswrapper[4475]: I1203 06:47:30.011115 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fb67a319-3ec2-4759-bdfb-46452f4f7010-config\") pod \"etcd-operator-b45778765-chxcn\" (UID: \"fb67a319-3ec2-4759-bdfb-46452f4f7010\") " pod="openshift-etcd-operator/etcd-operator-b45778765-chxcn"
Dec 03 06:47:30 crc kubenswrapper[4475]: I1203 06:47:30.011166 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/c0036df6-fc1c-4945-97b0-7c6ce6e5f806-webhook-certs\") pod \"multus-admission-controller-857f4d67dd-bghqv\" (UID: \"c0036df6-fc1c-4945-97b0-7c6ce6e5f806\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-bghqv"
Dec 03 06:47:30 crc kubenswrapper[4475]: I1203 06:47:30.011214 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0fd3dcd7-41fd-4e0c-be75-e8464be7696e-config\") pod \"kube-controller-manager-operator-78b949d7b-28t76\" (UID: \"0fd3dcd7-41fd-4e0c-be75-e8464be7696e\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-28t76"
Dec 03 06:47:30 crc kubenswrapper[4475]: I1203 06:47:30.011228 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/fb67a319-3ec2-4759-bdfb-46452f4f7010-etcd-service-ca\") pod \"etcd-operator-b45778765-chxcn\" (UID: \"fb67a319-3ec2-4759-bdfb-46452f4f7010\") " pod="openshift-etcd-operator/etcd-operator-b45778765-chxcn"
Dec 03 06:47:30 crc kubenswrapper[4475]: I1203 06:47:30.011244 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0fd3dcd7-41fd-4e0c-be75-e8464be7696e-serving-cert\") pod \"kube-controller-manager-operator-78b949d7b-28t76\" (UID: \"0fd3dcd7-41fd-4e0c-be75-e8464be7696e\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-28t76"
Dec 03 06:47:30 crc kubenswrapper[4475]: I1203 06:47:30.011273 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/83c4eef5-5508-470d-8b7a-b7da9d4706d4-config-volume\") pod \"collect-profiles-29412405-wwr7n\" (UID: \"83c4eef5-5508-470d-8b7a-b7da9d4706d4\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29412405-wwr7n"
Dec 03 06:47:30 crc kubenswrapper[4475]: I1203 06:47:30.011793 4475 reflector.go:368] Caches populated for *v1.Secret from object-"hostpath-provisioner"/"csi-hostpath-provisioner-sa-dockercfg-qd74k"
Dec 03 06:47:30 crc kubenswrapper[4475]: I1203 06:47:30.011926 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/83c4eef5-5508-470d-8b7a-b7da9d4706d4-config-volume\") pod \"collect-profiles-29412405-wwr7n\" (UID: \"83c4eef5-5508-470d-8b7a-b7da9d4706d4\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29412405-wwr7n"
Dec 03 06:47:30 crc kubenswrapper[4475]: I1203 06:47:30.012282 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0fd3dcd7-41fd-4e0c-be75-e8464be7696e-config\") pod \"kube-controller-manager-operator-78b949d7b-28t76\" (UID: \"0fd3dcd7-41fd-4e0c-be75-e8464be7696e\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-28t76"
Dec 03 06:47:30 crc kubenswrapper[4475]: I1203 06:47:30.012295 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fb67a319-3ec2-4759-bdfb-46452f4f7010-config\") pod \"etcd-operator-b45778765-chxcn\" (UID: \"fb67a319-3ec2-4759-bdfb-46452f4f7010\") " pod="openshift-etcd-operator/etcd-operator-b45778765-chxcn"
Dec 03 06:47:30 crc kubenswrapper[4475]: I1203 06:47:30.012343 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/fb67a319-3ec2-4759-bdfb-46452f4f7010-etcd-ca\") pod \"etcd-operator-b45778765-chxcn\" (UID: \"fb67a319-3ec2-4759-bdfb-46452f4f7010\") " pod="openshift-etcd-operator/etcd-operator-b45778765-chxcn"
Dec 03 06:47:30 crc kubenswrapper[4475]: I1203 06:47:30.012752 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/fb67a319-3ec2-4759-bdfb-46452f4f7010-etcd-service-ca\") pod \"etcd-operator-b45778765-chxcn\" (UID: \"fb67a319-3ec2-4759-bdfb-46452f4f7010\") " pod="openshift-etcd-operator/etcd-operator-b45778765-chxcn"
Dec 03 06:47:30 crc kubenswrapper[4475]: I1203 06:47:30.013493 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/fb67a319-3ec2-4759-bdfb-46452f4f7010-serving-cert\") pod \"etcd-operator-b45778765-chxcn\" (UID: \"fb67a319-3ec2-4759-bdfb-46452f4f7010\") " pod="openshift-etcd-operator/etcd-operator-b45778765-chxcn"
Dec 03 06:47:30 crc kubenswrapper[4475]: I1203 06:47:30.014122 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/c0036df6-fc1c-4945-97b0-7c6ce6e5f806-webhook-certs\") pod \"multus-admission-controller-857f4d67dd-bghqv\" (UID: \"c0036df6-fc1c-4945-97b0-7c6ce6e5f806\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-bghqv"
Dec 03 06:47:30 crc kubenswrapper[4475]: I1203 06:47:30.014361 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0fd3dcd7-41fd-4e0c-be75-e8464be7696e-serving-cert\") pod \"kube-controller-manager-operator-78b949d7b-28t76\" (UID: \"0fd3dcd7-41fd-4e0c-be75-e8464be7696e\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-28t76"
Dec 03 06:47:30 crc kubenswrapper[4475]: I1203 06:47:30.014656 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/fb67a319-3ec2-4759-bdfb-46452f4f7010-etcd-client\") pod \"etcd-operator-b45778765-chxcn\" (UID: \"fb67a319-3ec2-4759-bdfb-46452f4f7010\") " pod="openshift-etcd-operator/etcd-operator-b45778765-chxcn"
Dec 03 06:47:30 crc kubenswrapper[4475]: I1203 06:47:30.031619 4475 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"kube-root-ca.crt"
Dec 03 06:47:30 crc kubenswrapper[4475]: I1203 06:47:30.061987 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rx6ft\" (UniqueName: \"kubernetes.io/projected/d52a94b2-a290-48af-b060-5f3662029280-kube-api-access-rx6ft\") pod \"oauth-openshift-558db77b4-6r542\" (UID: \"d52a94b2-a290-48af-b060-5f3662029280\") " pod="openshift-authentication/oauth-openshift-558db77b4-6r542"
Dec 03 06:47:30 crc kubenswrapper[4475]: I1203 06:47:30.081796 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5spc7\" (UniqueName: \"kubernetes.io/projected/065ad72f-f4c2-4d51-a856-a915ad7f555b-kube-api-access-5spc7\") pod \"cluster-image-registry-operator-dc59b4c8b-lw5ml\" (UID: \"065ad72f-f4c2-4d51-a856-a915ad7f555b\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-lw5ml"
Dec 03 06:47:30 crc kubenswrapper[4475]: I1203 06:47:30.102917 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2bp2w\" (UniqueName: \"kubernetes.io/projected/fab1c20e-bbf0-442f-ada0-5647d493ad6c-kube-api-access-2bp2w\") pod \"apiserver-7bbb656c7d-tjg56\" (UID: \"fab1c20e-bbf0-442f-ada0-5647d493ad6c\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-tjg56"
Dec 03 06:47:30 crc kubenswrapper[4475]: I1203 06:47:30.123076 4475
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8smn2\" (UniqueName: \"kubernetes.io/projected/09928a8e-a70b-4916-9ae2-4dbe952aa514-kube-api-access-8smn2\") pod \"console-f9d7485db-dbxhk\" (UID: \"09928a8e-a70b-4916-9ae2-4dbe952aa514\") " pod="openshift-console/console-f9d7485db-dbxhk" Dec 03 06:47:30 crc kubenswrapper[4475]: I1203 06:47:30.131674 4475 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"canary-serving-cert" Dec 03 06:47:30 crc kubenswrapper[4475]: I1203 06:47:30.137506 4475 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-6r542" Dec 03 06:47:30 crc kubenswrapper[4475]: I1203 06:47:30.151065 4475 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"default-dockercfg-2llfx" Dec 03 06:47:30 crc kubenswrapper[4475]: I1203 06:47:30.171085 4475 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"openshift-service-ca.crt" Dec 03 06:47:30 crc kubenswrapper[4475]: I1203 06:47:30.191591 4475 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"kube-root-ca.crt" Dec 03 06:47:30 crc kubenswrapper[4475]: I1203 06:47:30.211527 4475 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-dockercfg-jwfmh" Dec 03 06:47:30 crc kubenswrapper[4475]: I1203 06:47:30.231056 4475 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-default-metrics-tls" Dec 03 06:47:30 crc kubenswrapper[4475]: I1203 06:47:30.253509 4475 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-6r542"] Dec 03 06:47:30 crc kubenswrapper[4475]: I1203 06:47:30.253664 4475 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"dns-default" Dec 03 06:47:30 crc kubenswrapper[4475]: W1203 
06:47:30.258717 4475 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd52a94b2_a290_48af_b060_5f3662029280.slice/crio-63edc3f11007b4cf7c102de13b7c35f159d6d6a8f90da408e1e05ab6353abcb1 WatchSource:0}: Error finding container 63edc3f11007b4cf7c102de13b7c35f159d6d6a8f90da408e1e05ab6353abcb1: Status 404 returned error can't find the container with id 63edc3f11007b4cf7c102de13b7c35f159d6d6a8f90da408e1e05ab6353abcb1 Dec 03 06:47:30 crc kubenswrapper[4475]: I1203 06:47:30.283427 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8fds5\" (UniqueName: \"kubernetes.io/projected/dea9abbf-a733-493b-b807-70ee9fa19fd1-kube-api-access-8fds5\") pod \"console-operator-58897d9998-7m86k\" (UID: \"dea9abbf-a733-493b-b807-70ee9fa19fd1\") " pod="openshift-console-operator/console-operator-58897d9998-7m86k" Dec 03 06:47:30 crc kubenswrapper[4475]: I1203 06:47:30.298975 4475 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console-operator/console-operator-58897d9998-7m86k" Dec 03 06:47:30 crc kubenswrapper[4475]: I1203 06:47:30.302281 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4q8tv\" (UniqueName: \"kubernetes.io/projected/64eee81d-9ee2-4f0a-a95d-f32f9159e2a4-kube-api-access-4q8tv\") pod \"machine-api-operator-5694c8668f-n4bbj\" (UID: \"64eee81d-9ee2-4f0a-a95d-f32f9159e2a4\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-n4bbj" Dec 03 06:47:30 crc kubenswrapper[4475]: I1203 06:47:30.311275 4475 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-f9d7485db-dbxhk" Dec 03 06:47:30 crc kubenswrapper[4475]: I1203 06:47:30.322181 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/065ad72f-f4c2-4d51-a856-a915ad7f555b-bound-sa-token\") pod \"cluster-image-registry-operator-dc59b4c8b-lw5ml\" (UID: \"065ad72f-f4c2-4d51-a856-a915ad7f555b\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-lw5ml" Dec 03 06:47:30 crc kubenswrapper[4475]: I1203 06:47:30.342612 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6xk76\" (UniqueName: \"kubernetes.io/projected/65b544d3-889f-4b29-ba88-961ad04782bf-kube-api-access-6xk76\") pod \"openshift-config-operator-7777fb866f-q5cjz\" (UID: \"65b544d3-889f-4b29-ba88-961ad04782bf\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-q5cjz" Dec 03 06:47:30 crc kubenswrapper[4475]: I1203 06:47:30.345443 4475 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-tjg56" Dec 03 06:47:30 crc kubenswrapper[4475]: I1203 06:47:30.350688 4475 request.go:700] Waited for 1.946407972s due to client-side throttling, not priority and fairness, request: POST:https://api-int.crc.testing:6443/api/v1/namespaces/openshift-route-controller-manager/serviceaccounts/route-controller-manager-sa/token Dec 03 06:47:30 crc kubenswrapper[4475]: I1203 06:47:30.364881 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-h6g4x\" (UniqueName: \"kubernetes.io/projected/30615409-a282-4405-afab-4802d9c27a3a-kube-api-access-h6g4x\") pod \"route-controller-manager-6576b87f9c-dlggp\" (UID: \"30615409-a282-4405-afab-4802d9c27a3a\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-dlggp" Dec 03 06:47:30 crc kubenswrapper[4475]: I1203 06:47:30.376584 4475 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-dlggp" Dec 03 06:47:30 crc kubenswrapper[4475]: I1203 06:47:30.388744 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fvgf8\" (UniqueName: \"kubernetes.io/projected/3263d9b9-b7e8-4758-a6a0-85749e84317a-kube-api-access-fvgf8\") pod \"controller-manager-879f6c89f-7kcnv\" (UID: \"3263d9b9-b7e8-4758-a6a0-85749e84317a\") " pod="openshift-controller-manager/controller-manager-879f6c89f-7kcnv" Dec 03 06:47:30 crc kubenswrapper[4475]: I1203 06:47:30.406282 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gj6wg\" (UniqueName: \"kubernetes.io/projected/dcef9514-d760-40e8-9054-75b17a2dde9f-kube-api-access-gj6wg\") pod \"machine-approver-56656f9798-8pdgn\" (UID: \"dcef9514-d760-40e8-9054-75b17a2dde9f\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-8pdgn" Dec 03 06:47:30 crc kubenswrapper[4475]: I1203 06:47:30.406429 4475 
util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-lw5ml" Dec 03 06:47:30 crc kubenswrapper[4475]: I1203 06:47:30.412996 4475 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-dockercfg-qx5rd" Dec 03 06:47:30 crc kubenswrapper[4475]: I1203 06:47:30.430739 4475 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-api/machine-api-operator-5694c8668f-n4bbj" Dec 03 06:47:30 crc kubenswrapper[4475]: I1203 06:47:30.440126 4475 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"node-bootstrapper-token" Dec 03 06:47:30 crc kubenswrapper[4475]: I1203 06:47:30.451631 4475 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-tls" Dec 03 06:47:30 crc kubenswrapper[4475]: I1203 06:47:30.454570 4475 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-8pdgn" Dec 03 06:47:30 crc kubenswrapper[4475]: I1203 06:47:30.458654 4475 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-f9d7485db-dbxhk"] Dec 03 06:47:30 crc kubenswrapper[4475]: I1203 06:47:30.483161 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/0fd3dcd7-41fd-4e0c-be75-e8464be7696e-kube-api-access\") pod \"kube-controller-manager-operator-78b949d7b-28t76\" (UID: \"0fd3dcd7-41fd-4e0c-be75-e8464be7696e\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-28t76" Dec 03 06:47:30 crc kubenswrapper[4475]: I1203 06:47:30.484300 4475 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console-operator/console-operator-58897d9998-7m86k"] Dec 03 06:47:30 crc kubenswrapper[4475]: I1203 06:47:30.504926 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xrvnc\" (UniqueName: \"kubernetes.io/projected/7f6163e8-ce0d-481b-8483-4b9e04d381e6-kube-api-access-xrvnc\") pod \"marketplace-operator-79b997595-kfvwc\" (UID: \"7f6163e8-ce0d-481b-8483-4b9e04d381e6\") " pod="openshift-marketplace/marketplace-operator-79b997595-kfvwc" Dec 03 06:47:30 crc kubenswrapper[4475]: I1203 06:47:30.525918 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-n4kdx\" (UniqueName: \"kubernetes.io/projected/8ccba36e-7b88-4f9b-9706-1e441aa2c59a-kube-api-access-n4kdx\") pod \"packageserver-d55dfcdfc-x5sj6\" (UID: \"8ccba36e-7b88-4f9b-9706-1e441aa2c59a\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-x5sj6" Dec 03 06:47:30 crc kubenswrapper[4475]: I1203 06:47:30.539212 4475 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-oauth-apiserver/apiserver-7bbb656c7d-tjg56"] Dec 03 06:47:30 crc kubenswrapper[4475]: I1203 06:47:30.541742 
4475 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-kfvwc" Dec 03 06:47:30 crc kubenswrapper[4475]: I1203 06:47:30.548644 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pk6mb\" (UniqueName: \"kubernetes.io/projected/83c4eef5-5508-470d-8b7a-b7da9d4706d4-kube-api-access-pk6mb\") pod \"collect-profiles-29412405-wwr7n\" (UID: \"83c4eef5-5508-470d-8b7a-b7da9d4706d4\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29412405-wwr7n" Dec 03 06:47:30 crc kubenswrapper[4475]: I1203 06:47:30.568724 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-thrvj\" (UniqueName: \"kubernetes.io/projected/fb67a319-3ec2-4759-bdfb-46452f4f7010-kube-api-access-thrvj\") pod \"etcd-operator-b45778765-chxcn\" (UID: \"fb67a319-3ec2-4759-bdfb-46452f4f7010\") " pod="openshift-etcd-operator/etcd-operator-b45778765-chxcn" Dec 03 06:47:30 crc kubenswrapper[4475]: I1203 06:47:30.577282 4475 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-dlggp"] Dec 03 06:47:30 crc kubenswrapper[4475]: I1203 06:47:30.582365 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4z2px\" (UniqueName: \"kubernetes.io/projected/80a45948-f52d-4e57-8611-37ea99eefb3c-kube-api-access-4z2px\") pod \"migrator-59844c95c7-4845w\" (UID: \"80a45948-f52d-4e57-8611-37ea99eefb3c\") " pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-4845w" Dec 03 06:47:30 crc kubenswrapper[4475]: W1203 06:47:30.593691 4475 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod30615409_a282_4405_afab_4802d9c27a3a.slice/crio-b6b4d8d85dc53a8ce1100b0f3116b2a699fbfdb38cc1793b0ab6ea1706d2ff62 WatchSource:0}: Error finding container 
b6b4d8d85dc53a8ce1100b0f3116b2a699fbfdb38cc1793b0ab6ea1706d2ff62: Status 404 returned error can't find the container with id b6b4d8d85dc53a8ce1100b0f3116b2a699fbfdb38cc1793b0ab6ea1706d2ff62 Dec 03 06:47:30 crc kubenswrapper[4475]: I1203 06:47:30.607598 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xzxvx\" (UniqueName: \"kubernetes.io/projected/842ad738-0ffb-4986-9372-a26f8bc6119a-kube-api-access-xzxvx\") pod \"machine-config-controller-84d6567774-xv2gh\" (UID: \"842ad738-0ffb-4986-9372-a26f8bc6119a\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-xv2gh" Dec 03 06:47:30 crc kubenswrapper[4475]: I1203 06:47:30.613194 4475 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-lw5ml"] Dec 03 06:47:30 crc kubenswrapper[4475]: I1203 06:47:30.622052 4475 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-7kcnv" Dec 03 06:47:30 crc kubenswrapper[4475]: I1203 06:47:30.625735 4475 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-x5sj6" Dec 03 06:47:30 crc kubenswrapper[4475]: I1203 06:47:30.627492 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-v5xn2\" (UniqueName: \"kubernetes.io/projected/6abdfe97-cf24-4ec5-8aee-f67cf30bb2e3-kube-api-access-v5xn2\") pod \"kube-storage-version-migrator-operator-b67b599dd-9tsjv\" (UID: \"6abdfe97-cf24-4ec5-8aee-f67cf30bb2e3\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-9tsjv" Dec 03 06:47:30 crc kubenswrapper[4475]: I1203 06:47:30.631104 4475 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-9tsjv" Dec 03 06:47:30 crc kubenswrapper[4475]: W1203 06:47:30.631506 4475 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod065ad72f_f4c2_4d51_a856_a915ad7f555b.slice/crio-477617fa33c9ca37962abba60fdb08a5ca904cc8140a6f285a4997cf4c3bca80 WatchSource:0}: Error finding container 477617fa33c9ca37962abba60fdb08a5ca904cc8140a6f285a4997cf4c3bca80: Status 404 returned error can't find the container with id 477617fa33c9ca37962abba60fdb08a5ca904cc8140a6f285a4997cf4c3bca80 Dec 03 06:47:30 crc kubenswrapper[4475]: I1203 06:47:30.638038 4475 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-28t76" Dec 03 06:47:30 crc kubenswrapper[4475]: I1203 06:47:30.638883 4475 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-config-operator/openshift-config-operator-7777fb866f-q5cjz" Dec 03 06:47:30 crc kubenswrapper[4475]: I1203 06:47:30.644173 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nd2xx\" (UniqueName: \"kubernetes.io/projected/21a4d7e9-ea88-4f43-9d43-109df1bb4766-kube-api-access-nd2xx\") pod \"router-default-5444994796-wkrx4\" (UID: \"21a4d7e9-ea88-4f43-9d43-109df1bb4766\") " pod="openshift-ingress/router-default-5444994796-wkrx4" Dec 03 06:47:30 crc kubenswrapper[4475]: I1203 06:47:30.647715 4475 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29412405-wwr7n" Dec 03 06:47:30 crc kubenswrapper[4475]: I1203 06:47:30.662614 4475 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/machine-api-operator-5694c8668f-n4bbj"] Dec 03 06:47:30 crc kubenswrapper[4475]: I1203 06:47:30.665893 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-msq6m\" (UniqueName: \"kubernetes.io/projected/07c84deb-ccb6-4597-a122-fdc9f6acb015-kube-api-access-msq6m\") pod \"authentication-operator-69f744f599-6lqs9\" (UID: \"07c84deb-ccb6-4597-a122-fdc9f6acb015\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-6lqs9" Dec 03 06:47:30 crc kubenswrapper[4475]: I1203 06:47:30.666035 4475 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-etcd-operator/etcd-operator-b45778765-chxcn" Dec 03 06:47:30 crc kubenswrapper[4475]: I1203 06:47:30.686877 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-85526\" (UniqueName: \"kubernetes.io/projected/c0036df6-fc1c-4945-97b0-7c6ce6e5f806-kube-api-access-85526\") pod \"multus-admission-controller-857f4d67dd-bghqv\" (UID: \"c0036df6-fc1c-4945-97b0-7c6ce6e5f806\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-bghqv" Dec 03 06:47:30 crc kubenswrapper[4475]: I1203 06:47:30.707753 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qpt8q\" (UniqueName: \"kubernetes.io/projected/423a50bb-8a96-49ba-99da-7258729fd2af-kube-api-access-qpt8q\") pod \"openshift-controller-manager-operator-756b6f6bc6-6gvwt\" (UID: \"423a50bb-8a96-49ba-99da-7258729fd2af\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-6gvwt" Dec 03 06:47:30 crc kubenswrapper[4475]: I1203 06:47:30.725543 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6v9ll\" 
(UniqueName: \"kubernetes.io/projected/5e8a37fc-fca6-43a2-83f2-e4c3d7916343-kube-api-access-6v9ll\") pod \"service-ca-operator-777779d784-mw2kv\" (UID: \"5e8a37fc-fca6-43a2-83f2-e4c3d7916343\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-mw2kv" Dec 03 06:47:30 crc kubenswrapper[4475]: I1203 06:47:30.763873 4475 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-kfvwc"] Dec 03 06:47:30 crc kubenswrapper[4475]: I1203 06:47:30.789808 4475 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication-operator/authentication-operator-69f744f599-6lqs9" Dec 03 06:47:30 crc kubenswrapper[4475]: W1203 06:47:30.794084 4475 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7f6163e8_ce0d_481b_8483_4b9e04d381e6.slice/crio-ca2780568ea913491e5769586ae8e06b6ef8522ad3cde3c33c744bccf05aedb7 WatchSource:0}: Error finding container ca2780568ea913491e5769586ae8e06b6ef8522ad3cde3c33c744bccf05aedb7: Status 404 returned error can't find the container with id ca2780568ea913491e5769586ae8e06b6ef8522ad3cde3c33c744bccf05aedb7 Dec 03 06:47:30 crc kubenswrapper[4475]: I1203 06:47:30.821158 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/fe214ce1-0821-4547-ac8b-e001a0579495-ca-trust-extracted\") pod \"image-registry-697d97f7c8-dcjv5\" (UID: \"fe214ce1-0821-4547-ac8b-e001a0579495\") " pod="openshift-image-registry/image-registry-697d97f7c8-dcjv5" Dec 03 06:47:30 crc kubenswrapper[4475]: I1203 06:47:30.821185 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/abd6b2f3-d4e8-4a7d-9e60-f04e50130dbf-samples-operator-tls\") pod \"cluster-samples-operator-665b6dd947-xklvd\" (UID: 
\"abd6b2f3-d4e8-4a7d-9e60-f04e50130dbf\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-xklvd" Dec 03 06:47:30 crc kubenswrapper[4475]: I1203 06:47:30.821203 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5rmsw\" (UniqueName: \"kubernetes.io/projected/fe214ce1-0821-4547-ac8b-e001a0579495-kube-api-access-5rmsw\") pod \"image-registry-697d97f7c8-dcjv5\" (UID: \"fe214ce1-0821-4547-ac8b-e001a0579495\") " pod="openshift-image-registry/image-registry-697d97f7c8-dcjv5" Dec 03 06:47:30 crc kubenswrapper[4475]: I1203 06:47:30.821219 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h44ph\" (UniqueName: \"kubernetes.io/projected/abd6b2f3-d4e8-4a7d-9e60-f04e50130dbf-kube-api-access-h44ph\") pod \"cluster-samples-operator-665b6dd947-xklvd\" (UID: \"abd6b2f3-d4e8-4a7d-9e60-f04e50130dbf\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-xklvd" Dec 03 06:47:30 crc kubenswrapper[4475]: I1203 06:47:30.821235 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lmztt\" (UniqueName: \"kubernetes.io/projected/7512ca6f-f6ab-4fd1-8b6a-96f221b07b95-kube-api-access-lmztt\") pod \"service-ca-9c57cc56f-9dd2n\" (UID: \"7512ca6f-f6ab-4fd1-8b6a-96f221b07b95\") " pod="openshift-service-ca/service-ca-9c57cc56f-9dd2n" Dec 03 06:47:30 crc kubenswrapper[4475]: I1203 06:47:30.821262 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6c4rt\" (UniqueName: \"kubernetes.io/projected/b67ad2cb-0cdd-4906-81c9-5c4597207aa3-kube-api-access-6c4rt\") pod \"machine-config-operator-74547568cd-gcnjp\" (UID: \"b67ad2cb-0cdd-4906-81c9-5c4597207aa3\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-gcnjp" Dec 03 06:47:30 crc kubenswrapper[4475]: I1203 
06:47:30.821279 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/825ff335-38b1-481e-b6e4-ac1cee0d4408-package-server-manager-serving-cert\") pod \"package-server-manager-789f6589d5-6bm4d\" (UID: \"825ff335-38b1-481e-b6e4-ac1cee0d4408\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-6bm4d" Dec 03 06:47:30 crc kubenswrapper[4475]: I1203 06:47:30.821297 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/fd3e6416-4e59-4ef8-a778-91dc78b6fc71-trusted-ca\") pod \"ingress-operator-5b745b69d9-lgjdg\" (UID: \"fd3e6416-4e59-4ef8-a778-91dc78b6fc71\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-lgjdg" Dec 03 06:47:30 crc kubenswrapper[4475]: I1203 06:47:30.821311 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/cf62ac16-a57b-4f21-8cb1-97dfbc8b779a-config\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-mk54s\" (UID: \"cf62ac16-a57b-4f21-8cb1-97dfbc8b779a\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-mk54s" Dec 03 06:47:30 crc kubenswrapper[4475]: I1203 06:47:30.821336 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/7512ca6f-f6ab-4fd1-8b6a-96f221b07b95-signing-key\") pod \"service-ca-9c57cc56f-9dd2n\" (UID: \"7512ca6f-f6ab-4fd1-8b6a-96f221b07b95\") " pod="openshift-service-ca/service-ca-9c57cc56f-9dd2n" Dec 03 06:47:30 crc kubenswrapper[4475]: I1203 06:47:30.821350 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/cf62ac16-a57b-4f21-8cb1-97dfbc8b779a-serving-cert\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-mk54s\" (UID: \"cf62ac16-a57b-4f21-8cb1-97dfbc8b779a\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-mk54s" Dec 03 06:47:30 crc kubenswrapper[4475]: I1203 06:47:30.821372 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/5f1ef448-9656-4d56-9d7e-d0992ec24085-encryption-config\") pod \"apiserver-76f77b778f-jf25k\" (UID: \"5f1ef448-9656-4d56-9d7e-d0992ec24085\") " pod="openshift-apiserver/apiserver-76f77b778f-jf25k" Dec 03 06:47:30 crc kubenswrapper[4475]: I1203 06:47:30.821396 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2hf6l\" (UniqueName: \"kubernetes.io/projected/bc1b6bc4-044a-429f-bcc3-9afc4be0acef-kube-api-access-2hf6l\") pod \"catalog-operator-68c6474976-q47j6\" (UID: \"bc1b6bc4-044a-429f-bcc3-9afc4be0acef\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-q47j6" Dec 03 06:47:30 crc kubenswrapper[4475]: I1203 06:47:30.821410 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5f1ef448-9656-4d56-9d7e-d0992ec24085-config\") pod \"apiserver-76f77b778f-jf25k\" (UID: \"5f1ef448-9656-4d56-9d7e-d0992ec24085\") " pod="openshift-apiserver/apiserver-76f77b778f-jf25k" Dec 03 06:47:30 crc kubenswrapper[4475]: I1203 06:47:30.821425 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/bc1b6bc4-044a-429f-bcc3-9afc4be0acef-profile-collector-cert\") pod \"catalog-operator-68c6474976-q47j6\" (UID: \"bc1b6bc4-044a-429f-bcc3-9afc4be0acef\") " 
pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-q47j6" Dec 03 06:47:30 crc kubenswrapper[4475]: I1203 06:47:30.821440 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/5f1ef448-9656-4d56-9d7e-d0992ec24085-audit\") pod \"apiserver-76f77b778f-jf25k\" (UID: \"5f1ef448-9656-4d56-9d7e-d0992ec24085\") " pod="openshift-apiserver/apiserver-76f77b778f-jf25k" Dec 03 06:47:30 crc kubenswrapper[4475]: I1203 06:47:30.821495 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/fe214ce1-0821-4547-ac8b-e001a0579495-registry-tls\") pod \"image-registry-697d97f7c8-dcjv5\" (UID: \"fe214ce1-0821-4547-ac8b-e001a0579495\") " pod="openshift-image-registry/image-registry-697d97f7c8-dcjv5" Dec 03 06:47:30 crc kubenswrapper[4475]: I1203 06:47:30.821510 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-chmnj\" (UniqueName: \"kubernetes.io/projected/fd3e6416-4e59-4ef8-a778-91dc78b6fc71-kube-api-access-chmnj\") pod \"ingress-operator-5b745b69d9-lgjdg\" (UID: \"fd3e6416-4e59-4ef8-a778-91dc78b6fc71\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-lgjdg" Dec 03 06:47:30 crc kubenswrapper[4475]: I1203 06:47:30.821527 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/68fc03c8-5d8c-4a1d-8987-474a75454d0f-control-plane-machine-set-operator-tls\") pod \"control-plane-machine-set-operator-78cbb6b69f-48hz9\" (UID: \"68fc03c8-5d8c-4a1d-8987-474a75454d0f\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-48hz9" Dec 03 06:47:30 crc kubenswrapper[4475]: I1203 06:47:30.821540 4475 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/fe214ce1-0821-4547-ac8b-e001a0579495-bound-sa-token\") pod \"image-registry-697d97f7c8-dcjv5\" (UID: \"fe214ce1-0821-4547-ac8b-e001a0579495\") " pod="openshift-image-registry/image-registry-697d97f7c8-dcjv5" Dec 03 06:47:30 crc kubenswrapper[4475]: I1203 06:47:30.821561 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/fd3e6416-4e59-4ef8-a778-91dc78b6fc71-bound-sa-token\") pod \"ingress-operator-5b745b69d9-lgjdg\" (UID: \"fd3e6416-4e59-4ef8-a778-91dc78b6fc71\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-lgjdg" Dec 03 06:47:30 crc kubenswrapper[4475]: I1203 06:47:30.821573 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/bc1b6bc4-044a-429f-bcc3-9afc4be0acef-srv-cert\") pod \"catalog-operator-68c6474976-q47j6\" (UID: \"bc1b6bc4-044a-429f-bcc3-9afc4be0acef\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-q47j6" Dec 03 06:47:30 crc kubenswrapper[4475]: I1203 06:47:30.821585 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/fd3e6416-4e59-4ef8-a778-91dc78b6fc71-metrics-tls\") pod \"ingress-operator-5b745b69d9-lgjdg\" (UID: \"fd3e6416-4e59-4ef8-a778-91dc78b6fc71\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-lgjdg" Dec 03 06:47:30 crc kubenswrapper[4475]: I1203 06:47:30.821599 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/fe214ce1-0821-4547-ac8b-e001a0579495-trusted-ca\") pod \"image-registry-697d97f7c8-dcjv5\" (UID: \"fe214ce1-0821-4547-ac8b-e001a0579495\") " 
pod="openshift-image-registry/image-registry-697d97f7c8-dcjv5" Dec 03 06:47:30 crc kubenswrapper[4475]: I1203 06:47:30.821615 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/5f1ef448-9656-4d56-9d7e-d0992ec24085-etcd-client\") pod \"apiserver-76f77b778f-jf25k\" (UID: \"5f1ef448-9656-4d56-9d7e-d0992ec24085\") " pod="openshift-apiserver/apiserver-76f77b778f-jf25k" Dec 03 06:47:30 crc kubenswrapper[4475]: I1203 06:47:30.821629 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/fe214ce1-0821-4547-ac8b-e001a0579495-registry-certificates\") pod \"image-registry-697d97f7c8-dcjv5\" (UID: \"fe214ce1-0821-4547-ac8b-e001a0579495\") " pod="openshift-image-registry/image-registry-697d97f7c8-dcjv5" Dec 03 06:47:30 crc kubenswrapper[4475]: I1203 06:47:30.821649 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/5f1ef448-9656-4d56-9d7e-d0992ec24085-etcd-serving-ca\") pod \"apiserver-76f77b778f-jf25k\" (UID: \"5f1ef448-9656-4d56-9d7e-d0992ec24085\") " pod="openshift-apiserver/apiserver-76f77b778f-jf25k" Dec 03 06:47:30 crc kubenswrapper[4475]: I1203 06:47:30.821662 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/d6480a9d-cca4-48ca-92e3-85d84e96f012-profile-collector-cert\") pod \"olm-operator-6b444d44fb-t2thq\" (UID: \"d6480a9d-cca4-48ca-92e3-85d84e96f012\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-t2thq" Dec 03 06:47:30 crc kubenswrapper[4475]: I1203 06:47:30.821678 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/4bcc3f4b-e69d-41c0-8940-03a15582a5fa-config\") pod \"openshift-apiserver-operator-796bbdcf4f-s9rqq\" (UID: \"4bcc3f4b-e69d-41c0-8940-03a15582a5fa\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-s9rqq" Dec 03 06:47:30 crc kubenswrapper[4475]: I1203 06:47:30.821690 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fq9gs\" (UniqueName: \"kubernetes.io/projected/4bcc3f4b-e69d-41c0-8940-03a15582a5fa-kube-api-access-fq9gs\") pod \"openshift-apiserver-operator-796bbdcf4f-s9rqq\" (UID: \"4bcc3f4b-e69d-41c0-8940-03a15582a5fa\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-s9rqq" Dec 03 06:47:30 crc kubenswrapper[4475]: I1203 06:47:30.821704 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q8phn\" (UniqueName: \"kubernetes.io/projected/c279294d-fe07-48d3-800d-7d73eba69c17-kube-api-access-q8phn\") pod \"downloads-7954f5f757-dbjfd\" (UID: \"c279294d-fe07-48d3-800d-7d73eba69c17\") " pod="openshift-console/downloads-7954f5f757-dbjfd" Dec 03 06:47:30 crc kubenswrapper[4475]: I1203 06:47:30.821718 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/49030b3f-8084-40f7-a015-beaa64adcfa1-metrics-tls\") pod \"dns-operator-744455d44c-vfxhs\" (UID: \"49030b3f-8084-40f7-a015-beaa64adcfa1\") " pod="openshift-dns-operator/dns-operator-744455d44c-vfxhs" Dec 03 06:47:30 crc kubenswrapper[4475]: I1203 06:47:30.821730 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/b67ad2cb-0cdd-4906-81c9-5c4597207aa3-proxy-tls\") pod \"machine-config-operator-74547568cd-gcnjp\" (UID: \"b67ad2cb-0cdd-4906-81c9-5c4597207aa3\") " 
pod="openshift-machine-config-operator/machine-config-operator-74547568cd-gcnjp" Dec 03 06:47:30 crc kubenswrapper[4475]: I1203 06:47:30.821743 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rcn9q\" (UniqueName: \"kubernetes.io/projected/825ff335-38b1-481e-b6e4-ac1cee0d4408-kube-api-access-rcn9q\") pod \"package-server-manager-789f6589d5-6bm4d\" (UID: \"825ff335-38b1-481e-b6e4-ac1cee0d4408\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-6bm4d" Dec 03 06:47:30 crc kubenswrapper[4475]: I1203 06:47:30.821758 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-85hg7\" (UniqueName: \"kubernetes.io/projected/d6480a9d-cca4-48ca-92e3-85d84e96f012-kube-api-access-85hg7\") pod \"olm-operator-6b444d44fb-t2thq\" (UID: \"d6480a9d-cca4-48ca-92e3-85d84e96f012\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-t2thq" Dec 03 06:47:30 crc kubenswrapper[4475]: I1203 06:47:30.821771 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/b67ad2cb-0cdd-4906-81c9-5c4597207aa3-images\") pod \"machine-config-operator-74547568cd-gcnjp\" (UID: \"b67ad2cb-0cdd-4906-81c9-5c4597207aa3\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-gcnjp" Dec 03 06:47:30 crc kubenswrapper[4475]: I1203 06:47:30.821792 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/5f1ef448-9656-4d56-9d7e-d0992ec24085-node-pullsecrets\") pod \"apiserver-76f77b778f-jf25k\" (UID: \"5f1ef448-9656-4d56-9d7e-d0992ec24085\") " pod="openshift-apiserver/apiserver-76f77b778f-jf25k" Dec 03 06:47:30 crc kubenswrapper[4475]: I1203 06:47:30.821816 4475 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/5f1ef448-9656-4d56-9d7e-d0992ec24085-audit-dir\") pod \"apiserver-76f77b778f-jf25k\" (UID: \"5f1ef448-9656-4d56-9d7e-d0992ec24085\") " pod="openshift-apiserver/apiserver-76f77b778f-jf25k" Dec 03 06:47:30 crc kubenswrapper[4475]: I1203 06:47:30.821830 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f82e633a-87fa-4fe2-b664-b831eaa4d67a-config\") pod \"kube-apiserver-operator-766d6c64bb-4mfpz\" (UID: \"f82e633a-87fa-4fe2-b664-b831eaa4d67a\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-4mfpz" Dec 03 06:47:30 crc kubenswrapper[4475]: I1203 06:47:30.821844 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/b67ad2cb-0cdd-4906-81c9-5c4597207aa3-auth-proxy-config\") pod \"machine-config-operator-74547568cd-gcnjp\" (UID: \"b67ad2cb-0cdd-4906-81c9-5c4597207aa3\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-gcnjp" Dec 03 06:47:30 crc kubenswrapper[4475]: I1203 06:47:30.821868 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/cf62ac16-a57b-4f21-8cb1-97dfbc8b779a-kube-api-access\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-mk54s\" (UID: \"cf62ac16-a57b-4f21-8cb1-97dfbc8b779a\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-mk54s" Dec 03 06:47:30 crc kubenswrapper[4475]: I1203 06:47:30.821882 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f82e633a-87fa-4fe2-b664-b831eaa4d67a-serving-cert\") pod 
\"kube-apiserver-operator-766d6c64bb-4mfpz\" (UID: \"f82e633a-87fa-4fe2-b664-b831eaa4d67a\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-4mfpz" Dec 03 06:47:30 crc kubenswrapper[4475]: I1203 06:47:30.821894 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/d6480a9d-cca4-48ca-92e3-85d84e96f012-srv-cert\") pod \"olm-operator-6b444d44fb-t2thq\" (UID: \"d6480a9d-cca4-48ca-92e3-85d84e96f012\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-t2thq" Dec 03 06:47:30 crc kubenswrapper[4475]: I1203 06:47:30.821923 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b59ff\" (UniqueName: \"kubernetes.io/projected/68fc03c8-5d8c-4a1d-8987-474a75454d0f-kube-api-access-b59ff\") pod \"control-plane-machine-set-operator-78cbb6b69f-48hz9\" (UID: \"68fc03c8-5d8c-4a1d-8987-474a75454d0f\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-48hz9" Dec 03 06:47:30 crc kubenswrapper[4475]: I1203 06:47:30.821937 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5f1ef448-9656-4d56-9d7e-d0992ec24085-serving-cert\") pod \"apiserver-76f77b778f-jf25k\" (UID: \"5f1ef448-9656-4d56-9d7e-d0992ec24085\") " pod="openshift-apiserver/apiserver-76f77b778f-jf25k" Dec 03 06:47:30 crc kubenswrapper[4475]: I1203 06:47:30.821952 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/fe214ce1-0821-4547-ac8b-e001a0579495-installation-pull-secrets\") pod \"image-registry-697d97f7c8-dcjv5\" (UID: \"fe214ce1-0821-4547-ac8b-e001a0579495\") " pod="openshift-image-registry/image-registry-697d97f7c8-dcjv5" Dec 03 06:47:30 crc kubenswrapper[4475]: I1203 06:47:30.821965 
4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/7512ca6f-f6ab-4fd1-8b6a-96f221b07b95-signing-cabundle\") pod \"service-ca-9c57cc56f-9dd2n\" (UID: \"7512ca6f-f6ab-4fd1-8b6a-96f221b07b95\") " pod="openshift-service-ca/service-ca-9c57cc56f-9dd2n" Dec 03 06:47:30 crc kubenswrapper[4475]: I1203 06:47:30.821980 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/4bcc3f4b-e69d-41c0-8940-03a15582a5fa-serving-cert\") pod \"openshift-apiserver-operator-796bbdcf4f-s9rqq\" (UID: \"4bcc3f4b-e69d-41c0-8940-03a15582a5fa\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-s9rqq" Dec 03 06:47:30 crc kubenswrapper[4475]: I1203 06:47:30.821999 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-dcjv5\" (UID: \"fe214ce1-0821-4547-ac8b-e001a0579495\") " pod="openshift-image-registry/image-registry-697d97f7c8-dcjv5" Dec 03 06:47:30 crc kubenswrapper[4475]: I1203 06:47:30.822012 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fx45g\" (UniqueName: \"kubernetes.io/projected/5f1ef448-9656-4d56-9d7e-d0992ec24085-kube-api-access-fx45g\") pod \"apiserver-76f77b778f-jf25k\" (UID: \"5f1ef448-9656-4d56-9d7e-d0992ec24085\") " pod="openshift-apiserver/apiserver-76f77b778f-jf25k" Dec 03 06:47:30 crc kubenswrapper[4475]: I1203 06:47:30.822026 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g7kqv\" (UniqueName: \"kubernetes.io/projected/49030b3f-8084-40f7-a015-beaa64adcfa1-kube-api-access-g7kqv\") pod 
\"dns-operator-744455d44c-vfxhs\" (UID: \"49030b3f-8084-40f7-a015-beaa64adcfa1\") " pod="openshift-dns-operator/dns-operator-744455d44c-vfxhs" Dec 03 06:47:30 crc kubenswrapper[4475]: I1203 06:47:30.822060 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/5f1ef448-9656-4d56-9d7e-d0992ec24085-trusted-ca-bundle\") pod \"apiserver-76f77b778f-jf25k\" (UID: \"5f1ef448-9656-4d56-9d7e-d0992ec24085\") " pod="openshift-apiserver/apiserver-76f77b778f-jf25k" Dec 03 06:47:30 crc kubenswrapper[4475]: I1203 06:47:30.822080 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/5f1ef448-9656-4d56-9d7e-d0992ec24085-image-import-ca\") pod \"apiserver-76f77b778f-jf25k\" (UID: \"5f1ef448-9656-4d56-9d7e-d0992ec24085\") " pod="openshift-apiserver/apiserver-76f77b778f-jf25k" Dec 03 06:47:30 crc kubenswrapper[4475]: I1203 06:47:30.822093 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/f82e633a-87fa-4fe2-b664-b831eaa4d67a-kube-api-access\") pod \"kube-apiserver-operator-766d6c64bb-4mfpz\" (UID: \"f82e633a-87fa-4fe2-b664-b831eaa4d67a\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-4mfpz" Dec 03 06:47:30 crc kubenswrapper[4475]: E1203 06:47:30.826214 4475 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-03 06:47:31.326201685 +0000 UTC m=+136.131100019 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-dcjv5" (UID: "fe214ce1-0821-4547-ac8b-e001a0579495") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 03 06:47:30 crc kubenswrapper[4475]: I1203 06:47:30.827194 4475 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-x5sj6"] Dec 03 06:47:30 crc kubenswrapper[4475]: I1203 06:47:30.848216 4475 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-xv2gh" Dec 03 06:47:30 crc kubenswrapper[4475]: I1203 06:47:30.858323 4475 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/router-default-5444994796-wkrx4" Dec 03 06:47:30 crc kubenswrapper[4475]: I1203 06:47:30.865255 4475 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-4845w" Dec 03 06:47:30 crc kubenswrapper[4475]: I1203 06:47:30.879995 4475 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-6gvwt" Dec 03 06:47:30 crc kubenswrapper[4475]: I1203 06:47:30.906952 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-kfvwc" event={"ID":"7f6163e8-ce0d-481b-8483-4b9e04d381e6","Type":"ContainerStarted","Data":"ca2780568ea913491e5769586ae8e06b6ef8522ad3cde3c33c744bccf05aedb7"} Dec 03 06:47:30 crc kubenswrapper[4475]: I1203 06:47:30.920406 4475 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-service-ca-operator/service-ca-operator-777779d784-mw2kv" Dec 03 06:47:30 crc kubenswrapper[4475]: I1203 06:47:30.920713 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-58897d9998-7m86k" event={"ID":"dea9abbf-a733-493b-b807-70ee9fa19fd1","Type":"ContainerStarted","Data":"7d7336945e123b8cde30a43540f562bfee294c5c1b7965d91c23331db719ca0c"} Dec 03 06:47:30 crc kubenswrapper[4475]: I1203 06:47:30.920771 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-58897d9998-7m86k" event={"ID":"dea9abbf-a733-493b-b807-70ee9fa19fd1","Type":"ContainerStarted","Data":"2cfad70dd96af7bdbe1b4757e88f0950542d08e802f802b6ed5bb4c5fcebec90"} Dec 03 06:47:30 crc kubenswrapper[4475]: I1203 06:47:30.921054 4475 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console-operator/console-operator-58897d9998-7m86k" Dec 03 06:47:30 crc kubenswrapper[4475]: I1203 06:47:30.922551 4475 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 03 06:47:30 crc kubenswrapper[4475]: I1203 06:47:30.922748 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/fe214ce1-0821-4547-ac8b-e001a0579495-registry-tls\") pod \"image-registry-697d97f7c8-dcjv5\" (UID: \"fe214ce1-0821-4547-ac8b-e001a0579495\") " pod="openshift-image-registry/image-registry-697d97f7c8-dcjv5" Dec 03 06:47:30 crc kubenswrapper[4475]: I1203 06:47:30.922769 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-chmnj\" (UniqueName: 
\"kubernetes.io/projected/fd3e6416-4e59-4ef8-a778-91dc78b6fc71-kube-api-access-chmnj\") pod \"ingress-operator-5b745b69d9-lgjdg\" (UID: \"fd3e6416-4e59-4ef8-a778-91dc78b6fc71\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-lgjdg" Dec 03 06:47:30 crc kubenswrapper[4475]: I1203 06:47:30.922787 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/fe214ce1-0821-4547-ac8b-e001a0579495-bound-sa-token\") pod \"image-registry-697d97f7c8-dcjv5\" (UID: \"fe214ce1-0821-4547-ac8b-e001a0579495\") " pod="openshift-image-registry/image-registry-697d97f7c8-dcjv5" Dec 03 06:47:30 crc kubenswrapper[4475]: I1203 06:47:30.922840 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/68fc03c8-5d8c-4a1d-8987-474a75454d0f-control-plane-machine-set-operator-tls\") pod \"control-plane-machine-set-operator-78cbb6b69f-48hz9\" (UID: \"68fc03c8-5d8c-4a1d-8987-474a75454d0f\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-48hz9" Dec 03 06:47:30 crc kubenswrapper[4475]: I1203 06:47:30.922875 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/secret/94977097-803b-4e49-a295-fd4bb3925da0-certs\") pod \"machine-config-server-ftshs\" (UID: \"94977097-803b-4e49-a295-fd4bb3925da0\") " pod="openshift-machine-config-operator/machine-config-server-ftshs" Dec 03 06:47:30 crc kubenswrapper[4475]: I1203 06:47:30.922896 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/bc1b6bc4-044a-429f-bcc3-9afc4be0acef-srv-cert\") pod \"catalog-operator-68c6474976-q47j6\" (UID: \"bc1b6bc4-044a-429f-bcc3-9afc4be0acef\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-q47j6" Dec 03 06:47:30 crc 
kubenswrapper[4475]: I1203 06:47:30.922913 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/fd3e6416-4e59-4ef8-a778-91dc78b6fc71-metrics-tls\") pod \"ingress-operator-5b745b69d9-lgjdg\" (UID: \"fd3e6416-4e59-4ef8-a778-91dc78b6fc71\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-lgjdg" Dec 03 06:47:30 crc kubenswrapper[4475]: I1203 06:47:30.922933 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/fd3e6416-4e59-4ef8-a778-91dc78b6fc71-bound-sa-token\") pod \"ingress-operator-5b745b69d9-lgjdg\" (UID: \"fd3e6416-4e59-4ef8-a778-91dc78b6fc71\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-lgjdg" Dec 03 06:47:30 crc kubenswrapper[4475]: I1203 06:47:30.922962 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/fe214ce1-0821-4547-ac8b-e001a0579495-trusted-ca\") pod \"image-registry-697d97f7c8-dcjv5\" (UID: \"fe214ce1-0821-4547-ac8b-e001a0579495\") " pod="openshift-image-registry/image-registry-697d97f7c8-dcjv5" Dec 03 06:47:30 crc kubenswrapper[4475]: I1203 06:47:30.927406 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/fe214ce1-0821-4547-ac8b-e001a0579495-registry-certificates\") pod \"image-registry-697d97f7c8-dcjv5\" (UID: \"fe214ce1-0821-4547-ac8b-e001a0579495\") " pod="openshift-image-registry/image-registry-697d97f7c8-dcjv5" Dec 03 06:47:30 crc kubenswrapper[4475]: I1203 06:47:30.927437 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/5f1ef448-9656-4d56-9d7e-d0992ec24085-etcd-client\") pod \"apiserver-76f77b778f-jf25k\" (UID: \"5f1ef448-9656-4d56-9d7e-d0992ec24085\") " 
pod="openshift-apiserver/apiserver-76f77b778f-jf25k" Dec 03 06:47:30 crc kubenswrapper[4475]: I1203 06:47:30.927494 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/5f1ef448-9656-4d56-9d7e-d0992ec24085-etcd-serving-ca\") pod \"apiserver-76f77b778f-jf25k\" (UID: \"5f1ef448-9656-4d56-9d7e-d0992ec24085\") " pod="openshift-apiserver/apiserver-76f77b778f-jf25k" Dec 03 06:47:30 crc kubenswrapper[4475]: I1203 06:47:30.927508 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/d6480a9d-cca4-48ca-92e3-85d84e96f012-profile-collector-cert\") pod \"olm-operator-6b444d44fb-t2thq\" (UID: \"d6480a9d-cca4-48ca-92e3-85d84e96f012\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-t2thq" Dec 03 06:47:30 crc kubenswrapper[4475]: I1203 06:47:30.927547 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-q8phn\" (UniqueName: \"kubernetes.io/projected/c279294d-fe07-48d3-800d-7d73eba69c17-kube-api-access-q8phn\") pod \"downloads-7954f5f757-dbjfd\" (UID: \"c279294d-fe07-48d3-800d-7d73eba69c17\") " pod="openshift-console/downloads-7954f5f757-dbjfd" Dec 03 06:47:30 crc kubenswrapper[4475]: I1203 06:47:30.927576 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4bcc3f4b-e69d-41c0-8940-03a15582a5fa-config\") pod \"openshift-apiserver-operator-796bbdcf4f-s9rqq\" (UID: \"4bcc3f4b-e69d-41c0-8940-03a15582a5fa\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-s9rqq" Dec 03 06:47:30 crc kubenswrapper[4475]: I1203 06:47:30.927593 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fq9gs\" (UniqueName: \"kubernetes.io/projected/4bcc3f4b-e69d-41c0-8940-03a15582a5fa-kube-api-access-fq9gs\") pod 
\"openshift-apiserver-operator-796bbdcf4f-s9rqq\" (UID: \"4bcc3f4b-e69d-41c0-8940-03a15582a5fa\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-s9rqq" Dec 03 06:47:30 crc kubenswrapper[4475]: I1203 06:47:30.927609 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/49030b3f-8084-40f7-a015-beaa64adcfa1-metrics-tls\") pod \"dns-operator-744455d44c-vfxhs\" (UID: \"49030b3f-8084-40f7-a015-beaa64adcfa1\") " pod="openshift-dns-operator/dns-operator-744455d44c-vfxhs" Dec 03 06:47:30 crc kubenswrapper[4475]: I1203 06:47:30.927634 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/b67ad2cb-0cdd-4906-81c9-5c4597207aa3-proxy-tls\") pod \"machine-config-operator-74547568cd-gcnjp\" (UID: \"b67ad2cb-0cdd-4906-81c9-5c4597207aa3\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-gcnjp" Dec 03 06:47:30 crc kubenswrapper[4475]: I1203 06:47:30.927651 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rcn9q\" (UniqueName: \"kubernetes.io/projected/825ff335-38b1-481e-b6e4-ac1cee0d4408-kube-api-access-rcn9q\") pod \"package-server-manager-789f6589d5-6bm4d\" (UID: \"825ff335-38b1-481e-b6e4-ac1cee0d4408\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-6bm4d" Dec 03 06:47:30 crc kubenswrapper[4475]: I1203 06:47:30.927671 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cv8wf\" (UniqueName: \"kubernetes.io/projected/ba21124e-33f1-4cf7-8bc2-483a5810191d-kube-api-access-cv8wf\") pod \"csi-hostpathplugin-86l88\" (UID: \"ba21124e-33f1-4cf7-8bc2-483a5810191d\") " pod="hostpath-provisioner/csi-hostpathplugin-86l88" Dec 03 06:47:30 crc kubenswrapper[4475]: I1203 06:47:30.927719 4475 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-85hg7\" (UniqueName: \"kubernetes.io/projected/d6480a9d-cca4-48ca-92e3-85d84e96f012-kube-api-access-85hg7\") pod \"olm-operator-6b444d44fb-t2thq\" (UID: \"d6480a9d-cca4-48ca-92e3-85d84e96f012\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-t2thq" Dec 03 06:47:30 crc kubenswrapper[4475]: I1203 06:47:30.927753 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/b67ad2cb-0cdd-4906-81c9-5c4597207aa3-images\") pod \"machine-config-operator-74547568cd-gcnjp\" (UID: \"b67ad2cb-0cdd-4906-81c9-5c4597207aa3\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-gcnjp" Dec 03 06:47:30 crc kubenswrapper[4475]: I1203 06:47:30.927773 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/5f1ef448-9656-4d56-9d7e-d0992ec24085-node-pullsecrets\") pod \"apiserver-76f77b778f-jf25k\" (UID: \"5f1ef448-9656-4d56-9d7e-d0992ec24085\") " pod="openshift-apiserver/apiserver-76f77b778f-jf25k" Dec 03 06:47:30 crc kubenswrapper[4475]: I1203 06:47:30.927822 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/5f1ef448-9656-4d56-9d7e-d0992ec24085-audit-dir\") pod \"apiserver-76f77b778f-jf25k\" (UID: \"5f1ef448-9656-4d56-9d7e-d0992ec24085\") " pod="openshift-apiserver/apiserver-76f77b778f-jf25k" Dec 03 06:47:30 crc kubenswrapper[4475]: I1203 06:47:30.927840 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f82e633a-87fa-4fe2-b664-b831eaa4d67a-config\") pod \"kube-apiserver-operator-766d6c64bb-4mfpz\" (UID: \"f82e633a-87fa-4fe2-b664-b831eaa4d67a\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-4mfpz" Dec 03 06:47:30 crc 
kubenswrapper[4475]: I1203 06:47:30.927856 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/b67ad2cb-0cdd-4906-81c9-5c4597207aa3-auth-proxy-config\") pod \"machine-config-operator-74547568cd-gcnjp\" (UID: \"b67ad2cb-0cdd-4906-81c9-5c4597207aa3\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-gcnjp" Dec 03 06:47:30 crc kubenswrapper[4475]: I1203 06:47:30.927889 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"csi-data-dir\" (UniqueName: \"kubernetes.io/host-path/ba21124e-33f1-4cf7-8bc2-483a5810191d-csi-data-dir\") pod \"csi-hostpathplugin-86l88\" (UID: \"ba21124e-33f1-4cf7-8bc2-483a5810191d\") " pod="hostpath-provisioner/csi-hostpathplugin-86l88" Dec 03 06:47:30 crc kubenswrapper[4475]: I1203 06:47:30.927920 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/cf62ac16-a57b-4f21-8cb1-97dfbc8b779a-kube-api-access\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-mk54s\" (UID: \"cf62ac16-a57b-4f21-8cb1-97dfbc8b779a\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-mk54s" Dec 03 06:47:30 crc kubenswrapper[4475]: I1203 06:47:30.927976 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f82e633a-87fa-4fe2-b664-b831eaa4d67a-serving-cert\") pod \"kube-apiserver-operator-766d6c64bb-4mfpz\" (UID: \"f82e633a-87fa-4fe2-b664-b831eaa4d67a\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-4mfpz" Dec 03 06:47:30 crc kubenswrapper[4475]: I1203 06:47:30.927989 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/d6480a9d-cca4-48ca-92e3-85d84e96f012-srv-cert\") pod \"olm-operator-6b444d44fb-t2thq\" 
(UID: \"d6480a9d-cca4-48ca-92e3-85d84e96f012\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-t2thq" Dec 03 06:47:30 crc kubenswrapper[4475]: I1203 06:47:30.928029 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-b59ff\" (UniqueName: \"kubernetes.io/projected/68fc03c8-5d8c-4a1d-8987-474a75454d0f-kube-api-access-b59ff\") pod \"control-plane-machine-set-operator-78cbb6b69f-48hz9\" (UID: \"68fc03c8-5d8c-4a1d-8987-474a75454d0f\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-48hz9" Dec 03 06:47:30 crc kubenswrapper[4475]: I1203 06:47:30.928042 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5f1ef448-9656-4d56-9d7e-d0992ec24085-serving-cert\") pod \"apiserver-76f77b778f-jf25k\" (UID: \"5f1ef448-9656-4d56-9d7e-d0992ec24085\") " pod="openshift-apiserver/apiserver-76f77b778f-jf25k" Dec 03 06:47:30 crc kubenswrapper[4475]: I1203 06:47:30.928057 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8nm78\" (UniqueName: \"kubernetes.io/projected/94977097-803b-4e49-a295-fd4bb3925da0-kube-api-access-8nm78\") pod \"machine-config-server-ftshs\" (UID: \"94977097-803b-4e49-a295-fd4bb3925da0\") " pod="openshift-machine-config-operator/machine-config-server-ftshs" Dec 03 06:47:30 crc kubenswrapper[4475]: I1203 06:47:30.928093 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/fe214ce1-0821-4547-ac8b-e001a0579495-installation-pull-secrets\") pod \"image-registry-697d97f7c8-dcjv5\" (UID: \"fe214ce1-0821-4547-ac8b-e001a0579495\") " pod="openshift-image-registry/image-registry-697d97f7c8-dcjv5" Dec 03 06:47:30 crc kubenswrapper[4475]: I1203 06:47:30.928109 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/7512ca6f-f6ab-4fd1-8b6a-96f221b07b95-signing-cabundle\") pod \"service-ca-9c57cc56f-9dd2n\" (UID: \"7512ca6f-f6ab-4fd1-8b6a-96f221b07b95\") " pod="openshift-service-ca/service-ca-9c57cc56f-9dd2n" Dec 03 06:47:30 crc kubenswrapper[4475]: I1203 06:47:30.928133 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"mountpoint-dir\" (UniqueName: \"kubernetes.io/host-path/ba21124e-33f1-4cf7-8bc2-483a5810191d-mountpoint-dir\") pod \"csi-hostpathplugin-86l88\" (UID: \"ba21124e-33f1-4cf7-8bc2-483a5810191d\") " pod="hostpath-provisioner/csi-hostpathplugin-86l88" Dec 03 06:47:30 crc kubenswrapper[4475]: I1203 06:47:30.928147 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/4bcc3f4b-e69d-41c0-8940-03a15582a5fa-serving-cert\") pod \"openshift-apiserver-operator-796bbdcf4f-s9rqq\" (UID: \"4bcc3f4b-e69d-41c0-8940-03a15582a5fa\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-s9rqq" Dec 03 06:47:30 crc kubenswrapper[4475]: I1203 06:47:30.928176 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fx45g\" (UniqueName: \"kubernetes.io/projected/5f1ef448-9656-4d56-9d7e-d0992ec24085-kube-api-access-fx45g\") pod \"apiserver-76f77b778f-jf25k\" (UID: \"5f1ef448-9656-4d56-9d7e-d0992ec24085\") " pod="openshift-apiserver/apiserver-76f77b778f-jf25k" Dec 03 06:47:30 crc kubenswrapper[4475]: I1203 06:47:30.928192 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-g7kqv\" (UniqueName: \"kubernetes.io/projected/49030b3f-8084-40f7-a015-beaa64adcfa1-kube-api-access-g7kqv\") pod \"dns-operator-744455d44c-vfxhs\" (UID: \"49030b3f-8084-40f7-a015-beaa64adcfa1\") " pod="openshift-dns-operator/dns-operator-744455d44c-vfxhs" Dec 03 06:47:30 crc kubenswrapper[4475]: I1203 06:47:30.928208 4475 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/ba21124e-33f1-4cf7-8bc2-483a5810191d-registration-dir\") pod \"csi-hostpathplugin-86l88\" (UID: \"ba21124e-33f1-4cf7-8bc2-483a5810191d\") " pod="hostpath-provisioner/csi-hostpathplugin-86l88" Dec 03 06:47:30 crc kubenswrapper[4475]: I1203 06:47:30.928223 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/8b190c59-5d6a-403c-8610-da9297d3828a-config-volume\") pod \"dns-default-685k2\" (UID: \"8b190c59-5d6a-403c-8610-da9297d3828a\") " pod="openshift-dns/dns-default-685k2" Dec 03 06:47:30 crc kubenswrapper[4475]: I1203 06:47:30.928238 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hrcwg\" (UniqueName: \"kubernetes.io/projected/c78eadf7-ca95-40dd-b425-da00c2875c8f-kube-api-access-hrcwg\") pod \"ingress-canary-cplh2\" (UID: \"c78eadf7-ca95-40dd-b425-da00c2875c8f\") " pod="openshift-ingress-canary/ingress-canary-cplh2" Dec 03 06:47:30 crc kubenswrapper[4475]: I1203 06:47:30.928278 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/5f1ef448-9656-4d56-9d7e-d0992ec24085-trusted-ca-bundle\") pod \"apiserver-76f77b778f-jf25k\" (UID: \"5f1ef448-9656-4d56-9d7e-d0992ec24085\") " pod="openshift-apiserver/apiserver-76f77b778f-jf25k" Dec 03 06:47:30 crc kubenswrapper[4475]: I1203 06:47:30.928292 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/ba21124e-33f1-4cf7-8bc2-483a5810191d-socket-dir\") pod \"csi-hostpathplugin-86l88\" (UID: \"ba21124e-33f1-4cf7-8bc2-483a5810191d\") " pod="hostpath-provisioner/csi-hostpathplugin-86l88" Dec 03 06:47:30 crc 
kubenswrapper[4475]: I1203 06:47:30.928307 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/f82e633a-87fa-4fe2-b664-b831eaa4d67a-kube-api-access\") pod \"kube-apiserver-operator-766d6c64bb-4mfpz\" (UID: \"f82e633a-87fa-4fe2-b664-b831eaa4d67a\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-4mfpz" Dec 03 06:47:30 crc kubenswrapper[4475]: I1203 06:47:30.928329 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/5f1ef448-9656-4d56-9d7e-d0992ec24085-image-import-ca\") pod \"apiserver-76f77b778f-jf25k\" (UID: \"5f1ef448-9656-4d56-9d7e-d0992ec24085\") " pod="openshift-apiserver/apiserver-76f77b778f-jf25k" Dec 03 06:47:30 crc kubenswrapper[4475]: I1203 06:47:30.928354 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/fe214ce1-0821-4547-ac8b-e001a0579495-ca-trust-extracted\") pod \"image-registry-697d97f7c8-dcjv5\" (UID: \"fe214ce1-0821-4547-ac8b-e001a0579495\") " pod="openshift-image-registry/image-registry-697d97f7c8-dcjv5" Dec 03 06:47:30 crc kubenswrapper[4475]: I1203 06:47:30.928367 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/abd6b2f3-d4e8-4a7d-9e60-f04e50130dbf-samples-operator-tls\") pod \"cluster-samples-operator-665b6dd947-xklvd\" (UID: \"abd6b2f3-d4e8-4a7d-9e60-f04e50130dbf\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-xklvd" Dec 03 06:47:30 crc kubenswrapper[4475]: I1203 06:47:30.928383 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/94977097-803b-4e49-a295-fd4bb3925da0-node-bootstrap-token\") pod 
\"machine-config-server-ftshs\" (UID: \"94977097-803b-4e49-a295-fd4bb3925da0\") " pod="openshift-machine-config-operator/machine-config-server-ftshs" Dec 03 06:47:30 crc kubenswrapper[4475]: I1203 06:47:30.928406 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5rmsw\" (UniqueName: \"kubernetes.io/projected/fe214ce1-0821-4547-ac8b-e001a0579495-kube-api-access-5rmsw\") pod \"image-registry-697d97f7c8-dcjv5\" (UID: \"fe214ce1-0821-4547-ac8b-e001a0579495\") " pod="openshift-image-registry/image-registry-697d97f7c8-dcjv5" Dec 03 06:47:30 crc kubenswrapper[4475]: I1203 06:47:30.928420 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8zzxw\" (UniqueName: \"kubernetes.io/projected/8b190c59-5d6a-403c-8610-da9297d3828a-kube-api-access-8zzxw\") pod \"dns-default-685k2\" (UID: \"8b190c59-5d6a-403c-8610-da9297d3828a\") " pod="openshift-dns/dns-default-685k2" Dec 03 06:47:30 crc kubenswrapper[4475]: I1203 06:47:30.928445 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-h44ph\" (UniqueName: \"kubernetes.io/projected/abd6b2f3-d4e8-4a7d-9e60-f04e50130dbf-kube-api-access-h44ph\") pod \"cluster-samples-operator-665b6dd947-xklvd\" (UID: \"abd6b2f3-d4e8-4a7d-9e60-f04e50130dbf\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-xklvd" Dec 03 06:47:30 crc kubenswrapper[4475]: I1203 06:47:30.928489 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lmztt\" (UniqueName: \"kubernetes.io/projected/7512ca6f-f6ab-4fd1-8b6a-96f221b07b95-kube-api-access-lmztt\") pod \"service-ca-9c57cc56f-9dd2n\" (UID: \"7512ca6f-f6ab-4fd1-8b6a-96f221b07b95\") " pod="openshift-service-ca/service-ca-9c57cc56f-9dd2n" Dec 03 06:47:30 crc kubenswrapper[4475]: I1203 06:47:30.928503 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/8b190c59-5d6a-403c-8610-da9297d3828a-metrics-tls\") pod \"dns-default-685k2\" (UID: \"8b190c59-5d6a-403c-8610-da9297d3828a\") " pod="openshift-dns/dns-default-685k2" Dec 03 06:47:30 crc kubenswrapper[4475]: I1203 06:47:30.928517 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/825ff335-38b1-481e-b6e4-ac1cee0d4408-package-server-manager-serving-cert\") pod \"package-server-manager-789f6589d5-6bm4d\" (UID: \"825ff335-38b1-481e-b6e4-ac1cee0d4408\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-6bm4d" Dec 03 06:47:30 crc kubenswrapper[4475]: I1203 06:47:30.928532 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6c4rt\" (UniqueName: \"kubernetes.io/projected/b67ad2cb-0cdd-4906-81c9-5c4597207aa3-kube-api-access-6c4rt\") pod \"machine-config-operator-74547568cd-gcnjp\" (UID: \"b67ad2cb-0cdd-4906-81c9-5c4597207aa3\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-gcnjp" Dec 03 06:47:30 crc kubenswrapper[4475]: I1203 06:47:30.928547 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/fd3e6416-4e59-4ef8-a778-91dc78b6fc71-trusted-ca\") pod \"ingress-operator-5b745b69d9-lgjdg\" (UID: \"fd3e6416-4e59-4ef8-a778-91dc78b6fc71\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-lgjdg" Dec 03 06:47:30 crc kubenswrapper[4475]: I1203 06:47:30.928565 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/cf62ac16-a57b-4f21-8cb1-97dfbc8b779a-config\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-mk54s\" (UID: \"cf62ac16-a57b-4f21-8cb1-97dfbc8b779a\") " 
pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-mk54s" Dec 03 06:47:30 crc kubenswrapper[4475]: I1203 06:47:30.928591 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/7512ca6f-f6ab-4fd1-8b6a-96f221b07b95-signing-key\") pod \"service-ca-9c57cc56f-9dd2n\" (UID: \"7512ca6f-f6ab-4fd1-8b6a-96f221b07b95\") " pod="openshift-service-ca/service-ca-9c57cc56f-9dd2n" Dec 03 06:47:30 crc kubenswrapper[4475]: I1203 06:47:30.928642 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/cf62ac16-a57b-4f21-8cb1-97dfbc8b779a-serving-cert\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-mk54s\" (UID: \"cf62ac16-a57b-4f21-8cb1-97dfbc8b779a\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-mk54s" Dec 03 06:47:30 crc kubenswrapper[4475]: I1203 06:47:30.928657 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-dir\" (UniqueName: \"kubernetes.io/host-path/ba21124e-33f1-4cf7-8bc2-483a5810191d-plugins-dir\") pod \"csi-hostpathplugin-86l88\" (UID: \"ba21124e-33f1-4cf7-8bc2-483a5810191d\") " pod="hostpath-provisioner/csi-hostpathplugin-86l88" Dec 03 06:47:30 crc kubenswrapper[4475]: I1203 06:47:30.928681 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/5f1ef448-9656-4d56-9d7e-d0992ec24085-encryption-config\") pod \"apiserver-76f77b778f-jf25k\" (UID: \"5f1ef448-9656-4d56-9d7e-d0992ec24085\") " pod="openshift-apiserver/apiserver-76f77b778f-jf25k" Dec 03 06:47:30 crc kubenswrapper[4475]: I1203 06:47:30.928697 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2hf6l\" (UniqueName: 
\"kubernetes.io/projected/bc1b6bc4-044a-429f-bcc3-9afc4be0acef-kube-api-access-2hf6l\") pod \"catalog-operator-68c6474976-q47j6\" (UID: \"bc1b6bc4-044a-429f-bcc3-9afc4be0acef\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-q47j6" Dec 03 06:47:30 crc kubenswrapper[4475]: I1203 06:47:30.928720 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5f1ef448-9656-4d56-9d7e-d0992ec24085-config\") pod \"apiserver-76f77b778f-jf25k\" (UID: \"5f1ef448-9656-4d56-9d7e-d0992ec24085\") " pod="openshift-apiserver/apiserver-76f77b778f-jf25k" Dec 03 06:47:30 crc kubenswrapper[4475]: I1203 06:47:30.928738 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/c78eadf7-ca95-40dd-b425-da00c2875c8f-cert\") pod \"ingress-canary-cplh2\" (UID: \"c78eadf7-ca95-40dd-b425-da00c2875c8f\") " pod="openshift-ingress-canary/ingress-canary-cplh2" Dec 03 06:47:30 crc kubenswrapper[4475]: I1203 06:47:30.928753 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/bc1b6bc4-044a-429f-bcc3-9afc4be0acef-profile-collector-cert\") pod \"catalog-operator-68c6474976-q47j6\" (UID: \"bc1b6bc4-044a-429f-bcc3-9afc4be0acef\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-q47j6" Dec 03 06:47:30 crc kubenswrapper[4475]: I1203 06:47:30.928817 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/5f1ef448-9656-4d56-9d7e-d0992ec24085-audit\") pod \"apiserver-76f77b778f-jf25k\" (UID: \"5f1ef448-9656-4d56-9d7e-d0992ec24085\") " pod="openshift-apiserver/apiserver-76f77b778f-jf25k" Dec 03 06:47:30 crc kubenswrapper[4475]: I1203 06:47:30.929682 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit\" (UniqueName: 
\"kubernetes.io/configmap/5f1ef448-9656-4d56-9d7e-d0992ec24085-audit\") pod \"apiserver-76f77b778f-jf25k\" (UID: \"5f1ef448-9656-4d56-9d7e-d0992ec24085\") " pod="openshift-apiserver/apiserver-76f77b778f-jf25k" Dec 03 06:47:30 crc kubenswrapper[4475]: E1203 06:47:30.929764 4475 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-03 06:47:31.429752027 +0000 UTC m=+136.234650361 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 03 06:47:30 crc kubenswrapper[4475]: I1203 06:47:30.945475 4475 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-admission-controller-857f4d67dd-bghqv" Dec 03 06:47:30 crc kubenswrapper[4475]: I1203 06:47:30.945544 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/5f1ef448-9656-4d56-9d7e-d0992ec24085-node-pullsecrets\") pod \"apiserver-76f77b778f-jf25k\" (UID: \"5f1ef448-9656-4d56-9d7e-d0992ec24085\") " pod="openshift-apiserver/apiserver-76f77b778f-jf25k" Dec 03 06:47:30 crc kubenswrapper[4475]: I1203 06:47:30.945477 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/fe214ce1-0821-4547-ac8b-e001a0579495-registry-tls\") pod \"image-registry-697d97f7c8-dcjv5\" (UID: \"fe214ce1-0821-4547-ac8b-e001a0579495\") " pod="openshift-image-registry/image-registry-697d97f7c8-dcjv5" Dec 03 06:47:30 crc kubenswrapper[4475]: I1203 06:47:30.946837 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/5f1ef448-9656-4d56-9d7e-d0992ec24085-audit-dir\") pod \"apiserver-76f77b778f-jf25k\" (UID: \"5f1ef448-9656-4d56-9d7e-d0992ec24085\") " pod="openshift-apiserver/apiserver-76f77b778f-jf25k" Dec 03 06:47:30 crc kubenswrapper[4475]: I1203 06:47:30.948377 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/7512ca6f-f6ab-4fd1-8b6a-96f221b07b95-signing-cabundle\") pod \"service-ca-9c57cc56f-9dd2n\" (UID: \"7512ca6f-f6ab-4fd1-8b6a-96f221b07b95\") " pod="openshift-service-ca/service-ca-9c57cc56f-9dd2n" Dec 03 06:47:30 crc kubenswrapper[4475]: I1203 06:47:30.948673 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/5f1ef448-9656-4d56-9d7e-d0992ec24085-trusted-ca-bundle\") pod \"apiserver-76f77b778f-jf25k\" (UID: \"5f1ef448-9656-4d56-9d7e-d0992ec24085\") " 
pod="openshift-apiserver/apiserver-76f77b778f-jf25k" Dec 03 06:47:30 crc kubenswrapper[4475]: I1203 06:47:30.949328 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/fe214ce1-0821-4547-ac8b-e001a0579495-ca-trust-extracted\") pod \"image-registry-697d97f7c8-dcjv5\" (UID: \"fe214ce1-0821-4547-ac8b-e001a0579495\") " pod="openshift-image-registry/image-registry-697d97f7c8-dcjv5" Dec 03 06:47:30 crc kubenswrapper[4475]: I1203 06:47:30.952857 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/5f1ef448-9656-4d56-9d7e-d0992ec24085-etcd-serving-ca\") pod \"apiserver-76f77b778f-jf25k\" (UID: \"5f1ef448-9656-4d56-9d7e-d0992ec24085\") " pod="openshift-apiserver/apiserver-76f77b778f-jf25k" Dec 03 06:47:30 crc kubenswrapper[4475]: I1203 06:47:30.953127 4475 patch_prober.go:28] interesting pod/console-operator-58897d9998-7m86k container/console-operator namespace/openshift-console-operator: Readiness probe status=failure output="Get \"https://10.217.0.5:8443/readyz\": dial tcp 10.217.0.5:8443: connect: connection refused" start-of-body= Dec 03 06:47:30 crc kubenswrapper[4475]: I1203 06:47:30.956054 4475 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console-operator/console-operator-58897d9998-7m86k" podUID="dea9abbf-a733-493b-b807-70ee9fa19fd1" containerName="console-operator" probeResult="failure" output="Get \"https://10.217.0.5:8443/readyz\": dial tcp 10.217.0.5:8443: connect: connection refused" Dec 03 06:47:30 crc kubenswrapper[4475]: I1203 06:47:30.954402 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4bcc3f4b-e69d-41c0-8940-03a15582a5fa-config\") pod \"openshift-apiserver-operator-796bbdcf4f-s9rqq\" (UID: \"4bcc3f4b-e69d-41c0-8940-03a15582a5fa\") " 
pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-s9rqq" Dec 03 06:47:30 crc kubenswrapper[4475]: I1203 06:47:30.954692 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/fd3e6416-4e59-4ef8-a778-91dc78b6fc71-trusted-ca\") pod \"ingress-operator-5b745b69d9-lgjdg\" (UID: \"fd3e6416-4e59-4ef8-a778-91dc78b6fc71\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-lgjdg" Dec 03 06:47:30 crc kubenswrapper[4475]: I1203 06:47:30.954855 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/abd6b2f3-d4e8-4a7d-9e60-f04e50130dbf-samples-operator-tls\") pod \"cluster-samples-operator-665b6dd947-xklvd\" (UID: \"abd6b2f3-d4e8-4a7d-9e60-f04e50130dbf\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-xklvd" Dec 03 06:47:30 crc kubenswrapper[4475]: I1203 06:47:30.955105 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/cf62ac16-a57b-4f21-8cb1-97dfbc8b779a-config\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-mk54s\" (UID: \"cf62ac16-a57b-4f21-8cb1-97dfbc8b779a\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-mk54s" Dec 03 06:47:30 crc kubenswrapper[4475]: I1203 06:47:30.955503 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/b67ad2cb-0cdd-4906-81c9-5c4597207aa3-auth-proxy-config\") pod \"machine-config-operator-74547568cd-gcnjp\" (UID: \"b67ad2cb-0cdd-4906-81c9-5c4597207aa3\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-gcnjp" Dec 03 06:47:30 crc kubenswrapper[4475]: I1203 06:47:30.956005 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/f82e633a-87fa-4fe2-b664-b831eaa4d67a-config\") pod \"kube-apiserver-operator-766d6c64bb-4mfpz\" (UID: \"f82e633a-87fa-4fe2-b664-b831eaa4d67a\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-4mfpz" Dec 03 06:47:30 crc kubenswrapper[4475]: I1203 06:47:30.953847 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/fe214ce1-0821-4547-ac8b-e001a0579495-trusted-ca\") pod \"image-registry-697d97f7c8-dcjv5\" (UID: \"fe214ce1-0821-4547-ac8b-e001a0579495\") " pod="openshift-image-registry/image-registry-697d97f7c8-dcjv5" Dec 03 06:47:30 crc kubenswrapper[4475]: I1203 06:47:30.957588 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/5f1ef448-9656-4d56-9d7e-d0992ec24085-encryption-config\") pod \"apiserver-76f77b778f-jf25k\" (UID: \"5f1ef448-9656-4d56-9d7e-d0992ec24085\") " pod="openshift-apiserver/apiserver-76f77b778f-jf25k" Dec 03 06:47:30 crc kubenswrapper[4475]: I1203 06:47:30.960431 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/fe214ce1-0821-4547-ac8b-e001a0579495-installation-pull-secrets\") pod \"image-registry-697d97f7c8-dcjv5\" (UID: \"fe214ce1-0821-4547-ac8b-e001a0579495\") " pod="openshift-image-registry/image-registry-697d97f7c8-dcjv5" Dec 03 06:47:30 crc kubenswrapper[4475]: I1203 06:47:30.964056 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/5f1ef448-9656-4d56-9d7e-d0992ec24085-image-import-ca\") pod \"apiserver-76f77b778f-jf25k\" (UID: \"5f1ef448-9656-4d56-9d7e-d0992ec24085\") " pod="openshift-apiserver/apiserver-76f77b778f-jf25k" Dec 03 06:47:30 crc kubenswrapper[4475]: I1203 06:47:30.964994 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" 
(UniqueName: \"kubernetes.io/configmap/5f1ef448-9656-4d56-9d7e-d0992ec24085-config\") pod \"apiserver-76f77b778f-jf25k\" (UID: \"5f1ef448-9656-4d56-9d7e-d0992ec24085\") " pod="openshift-apiserver/apiserver-76f77b778f-jf25k" Dec 03 06:47:30 crc kubenswrapper[4475]: I1203 06:47:30.967707 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"images\" (UniqueName: \"kubernetes.io/configmap/b67ad2cb-0cdd-4906-81c9-5c4597207aa3-images\") pod \"machine-config-operator-74547568cd-gcnjp\" (UID: \"b67ad2cb-0cdd-4906-81c9-5c4597207aa3\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-gcnjp" Dec 03 06:47:30 crc kubenswrapper[4475]: I1203 06:47:30.968680 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-6r542" event={"ID":"d52a94b2-a290-48af-b060-5f3662029280","Type":"ContainerStarted","Data":"cef3dd2f3d6520eb005a3b365aeea38701ccdbe3533be1f4a38acb48d13996b0"} Dec 03 06:47:30 crc kubenswrapper[4475]: I1203 06:47:30.968710 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-6r542" event={"ID":"d52a94b2-a290-48af-b060-5f3662029280","Type":"ContainerStarted","Data":"63edc3f11007b4cf7c102de13b7c35f159d6d6a8f90da408e1e05ab6353abcb1"} Dec 03 06:47:30 crc kubenswrapper[4475]: I1203 06:47:30.971911 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/68fc03c8-5d8c-4a1d-8987-474a75454d0f-control-plane-machine-set-operator-tls\") pod \"control-plane-machine-set-operator-78cbb6b69f-48hz9\" (UID: \"68fc03c8-5d8c-4a1d-8987-474a75454d0f\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-48hz9" Dec 03 06:47:30 crc kubenswrapper[4475]: I1203 06:47:30.972043 4475 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-authentication/oauth-openshift-558db77b4-6r542" Dec 03 06:47:30 
crc kubenswrapper[4475]: I1203 06:47:30.972921 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/4bcc3f4b-e69d-41c0-8940-03a15582a5fa-serving-cert\") pod \"openshift-apiserver-operator-796bbdcf4f-s9rqq\" (UID: \"4bcc3f4b-e69d-41c0-8940-03a15582a5fa\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-s9rqq" Dec 03 06:47:30 crc kubenswrapper[4475]: I1203 06:47:30.973921 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-x5sj6" event={"ID":"8ccba36e-7b88-4f9b-9706-1e441aa2c59a","Type":"ContainerStarted","Data":"31355020f5a8e622c1cc1007ebebab37bbb516a47e32a449b6d8589fefa1d5e8"} Dec 03 06:47:30 crc kubenswrapper[4475]: I1203 06:47:30.974314 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/fe214ce1-0821-4547-ac8b-e001a0579495-registry-certificates\") pod \"image-registry-697d97f7c8-dcjv5\" (UID: \"fe214ce1-0821-4547-ac8b-e001a0579495\") " pod="openshift-image-registry/image-registry-697d97f7c8-dcjv5" Dec 03 06:47:30 crc kubenswrapper[4475]: I1203 06:47:30.976069 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/49030b3f-8084-40f7-a015-beaa64adcfa1-metrics-tls\") pod \"dns-operator-744455d44c-vfxhs\" (UID: \"49030b3f-8084-40f7-a015-beaa64adcfa1\") " pod="openshift-dns-operator/dns-operator-744455d44c-vfxhs" Dec 03 06:47:30 crc kubenswrapper[4475]: I1203 06:47:30.976496 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/bc1b6bc4-044a-429f-bcc3-9afc4be0acef-srv-cert\") pod \"catalog-operator-68c6474976-q47j6\" (UID: \"bc1b6bc4-044a-429f-bcc3-9afc4be0acef\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-q47j6" Dec 03 06:47:30 crc 
kubenswrapper[4475]: I1203 06:47:30.976875 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/7512ca6f-f6ab-4fd1-8b6a-96f221b07b95-signing-key\") pod \"service-ca-9c57cc56f-9dd2n\" (UID: \"7512ca6f-f6ab-4fd1-8b6a-96f221b07b95\") " pod="openshift-service-ca/service-ca-9c57cc56f-9dd2n" Dec 03 06:47:30 crc kubenswrapper[4475]: I1203 06:47:30.978266 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/d6480a9d-cca4-48ca-92e3-85d84e96f012-profile-collector-cert\") pod \"olm-operator-6b444d44fb-t2thq\" (UID: \"d6480a9d-cca4-48ca-92e3-85d84e96f012\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-t2thq" Dec 03 06:47:30 crc kubenswrapper[4475]: I1203 06:47:30.979241 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-chmnj\" (UniqueName: \"kubernetes.io/projected/fd3e6416-4e59-4ef8-a778-91dc78b6fc71-kube-api-access-chmnj\") pod \"ingress-operator-5b745b69d9-lgjdg\" (UID: \"fd3e6416-4e59-4ef8-a778-91dc78b6fc71\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-lgjdg" Dec 03 06:47:30 crc kubenswrapper[4475]: I1203 06:47:30.979380 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-tjg56" event={"ID":"fab1c20e-bbf0-442f-ada0-5647d493ad6c","Type":"ContainerStarted","Data":"b64b8bb1a7f20de236fe44c924f432b23e09722cd5816f0f060167b472d2a47b"} Dec 03 06:47:30 crc kubenswrapper[4475]: I1203 06:47:30.981770 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f82e633a-87fa-4fe2-b664-b831eaa4d67a-serving-cert\") pod \"kube-apiserver-operator-766d6c64bb-4mfpz\" (UID: \"f82e633a-87fa-4fe2-b664-b831eaa4d67a\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-4mfpz" Dec 03 06:47:30 crc kubenswrapper[4475]: 
I1203 06:47:30.982146 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/b67ad2cb-0cdd-4906-81c9-5c4597207aa3-proxy-tls\") pod \"machine-config-operator-74547568cd-gcnjp\" (UID: \"b67ad2cb-0cdd-4906-81c9-5c4597207aa3\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-gcnjp" Dec 03 06:47:30 crc kubenswrapper[4475]: I1203 06:47:30.982261 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/cf62ac16-a57b-4f21-8cb1-97dfbc8b779a-serving-cert\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-mk54s\" (UID: \"cf62ac16-a57b-4f21-8cb1-97dfbc8b779a\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-mk54s" Dec 03 06:47:30 crc kubenswrapper[4475]: I1203 06:47:30.982950 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/fd3e6416-4e59-4ef8-a778-91dc78b6fc71-metrics-tls\") pod \"ingress-operator-5b745b69d9-lgjdg\" (UID: \"fd3e6416-4e59-4ef8-a778-91dc78b6fc71\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-lgjdg" Dec 03 06:47:30 crc kubenswrapper[4475]: I1203 06:47:30.983133 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/825ff335-38b1-481e-b6e4-ac1cee0d4408-package-server-manager-serving-cert\") pod \"package-server-manager-789f6589d5-6bm4d\" (UID: \"825ff335-38b1-481e-b6e4-ac1cee0d4408\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-6bm4d" Dec 03 06:47:30 crc kubenswrapper[4475]: I1203 06:47:30.983598 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/d6480a9d-cca4-48ca-92e3-85d84e96f012-srv-cert\") pod \"olm-operator-6b444d44fb-t2thq\" (UID: 
\"d6480a9d-cca4-48ca-92e3-85d84e96f012\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-t2thq" Dec 03 06:47:30 crc kubenswrapper[4475]: I1203 06:47:30.984646 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/5f1ef448-9656-4d56-9d7e-d0992ec24085-etcd-client\") pod \"apiserver-76f77b778f-jf25k\" (UID: \"5f1ef448-9656-4d56-9d7e-d0992ec24085\") " pod="openshift-apiserver/apiserver-76f77b778f-jf25k" Dec 03 06:47:30 crc kubenswrapper[4475]: I1203 06:47:30.986317 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-lw5ml" event={"ID":"065ad72f-f4c2-4d51-a856-a915ad7f555b","Type":"ContainerStarted","Data":"b6f61a661089cdb44748b2305e458149cd051518c148da35f821792aee5c9c56"} Dec 03 06:47:30 crc kubenswrapper[4475]: I1203 06:47:30.986341 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-lw5ml" event={"ID":"065ad72f-f4c2-4d51-a856-a915ad7f555b","Type":"ContainerStarted","Data":"477617fa33c9ca37962abba60fdb08a5ca904cc8140a6f285a4997cf4c3bca80"} Dec 03 06:47:30 crc kubenswrapper[4475]: I1203 06:47:30.989037 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/bc1b6bc4-044a-429f-bcc3-9afc4be0acef-profile-collector-cert\") pod \"catalog-operator-68c6474976-q47j6\" (UID: \"bc1b6bc4-044a-429f-bcc3-9afc4be0acef\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-q47j6" Dec 03 06:47:30 crc kubenswrapper[4475]: I1203 06:47:30.989574 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-dbxhk" event={"ID":"09928a8e-a70b-4916-9ae2-4dbe952aa514","Type":"ContainerStarted","Data":"ff221eab7072c1a47c65c1bf0b2f037bea028f49986757610b09d9f683dd4157"} Dec 03 06:47:30 crc kubenswrapper[4475]: I1203 
06:47:30.989601 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-dbxhk" event={"ID":"09928a8e-a70b-4916-9ae2-4dbe952aa514","Type":"ContainerStarted","Data":"737f45ed55d673b657d03e384499671b6d6c3703c989815d081e556197f7fb49"} Dec 03 06:47:30 crc kubenswrapper[4475]: I1203 06:47:30.992611 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5f1ef448-9656-4d56-9d7e-d0992ec24085-serving-cert\") pod \"apiserver-76f77b778f-jf25k\" (UID: \"5f1ef448-9656-4d56-9d7e-d0992ec24085\") " pod="openshift-apiserver/apiserver-76f77b778f-jf25k" Dec 03 06:47:30 crc kubenswrapper[4475]: I1203 06:47:30.995772 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-8pdgn" event={"ID":"dcef9514-d760-40e8-9054-75b17a2dde9f","Type":"ContainerStarted","Data":"426e5ffc6ad0ab59aab84047843330f8fa1bae925259e327416672e2237d9d64"} Dec 03 06:47:30 crc kubenswrapper[4475]: I1203 06:47:30.995811 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-8pdgn" event={"ID":"dcef9514-d760-40e8-9054-75b17a2dde9f","Type":"ContainerStarted","Data":"6e54b3b5c584afdc4a51c4511ce1106f6d8b7674613c943c3b0c7fea67be4b61"} Dec 03 06:47:30 crc kubenswrapper[4475]: I1203 06:47:30.997144 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-h44ph\" (UniqueName: \"kubernetes.io/projected/abd6b2f3-d4e8-4a7d-9e60-f04e50130dbf-kube-api-access-h44ph\") pod \"cluster-samples-operator-665b6dd947-xklvd\" (UID: \"abd6b2f3-d4e8-4a7d-9e60-f04e50130dbf\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-xklvd" Dec 03 06:47:31 crc kubenswrapper[4475]: I1203 06:47:31.002494 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: 
\"kubernetes.io/projected/fe214ce1-0821-4547-ac8b-e001a0579495-bound-sa-token\") pod \"image-registry-697d97f7c8-dcjv5\" (UID: \"fe214ce1-0821-4547-ac8b-e001a0579495\") " pod="openshift-image-registry/image-registry-697d97f7c8-dcjv5" Dec 03 06:47:31 crc kubenswrapper[4475]: I1203 06:47:31.009346 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/machine-api-operator-5694c8668f-n4bbj" event={"ID":"64eee81d-9ee2-4f0a-a95d-f32f9159e2a4","Type":"ContainerStarted","Data":"0ce3afe9673f8ab279f7ba6e34a23660bb6595e4e69dfef54dfb90e7ecfc3023"} Dec 03 06:47:31 crc kubenswrapper[4475]: I1203 06:47:31.012333 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/cf62ac16-a57b-4f21-8cb1-97dfbc8b779a-kube-api-access\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-mk54s\" (UID: \"cf62ac16-a57b-4f21-8cb1-97dfbc8b779a\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-mk54s" Dec 03 06:47:31 crc kubenswrapper[4475]: I1203 06:47:31.014205 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-dlggp" event={"ID":"30615409-a282-4405-afab-4802d9c27a3a","Type":"ContainerStarted","Data":"134729a711b54e662592ffc699417377c72db1ec72e91b49a3cb56219c1b0fa7"} Dec 03 06:47:31 crc kubenswrapper[4475]: I1203 06:47:31.014229 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-dlggp" event={"ID":"30615409-a282-4405-afab-4802d9c27a3a","Type":"ContainerStarted","Data":"b6b4d8d85dc53a8ce1100b0f3116b2a699fbfdb38cc1793b0ab6ea1706d2ff62"} Dec 03 06:47:31 crc kubenswrapper[4475]: I1203 06:47:31.014537 4475 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-dlggp" Dec 03 06:47:31 crc kubenswrapper[4475]: I1203 
06:47:31.018672 4475 patch_prober.go:28] interesting pod/oauth-openshift-558db77b4-6r542 container/oauth-openshift namespace/openshift-authentication: Readiness probe status=failure output="Get \"https://10.217.0.12:6443/healthz\": dial tcp 10.217.0.12:6443: connect: connection refused" start-of-body= Dec 03 06:47:31 crc kubenswrapper[4475]: I1203 06:47:31.018708 4475 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-authentication/oauth-openshift-558db77b4-6r542" podUID="d52a94b2-a290-48af-b060-5f3662029280" containerName="oauth-openshift" probeResult="failure" output="Get \"https://10.217.0.12:6443/healthz\": dial tcp 10.217.0.12:6443: connect: connection refused" Dec 03 06:47:31 crc kubenswrapper[4475]: I1203 06:47:31.018811 4475 patch_prober.go:28] interesting pod/route-controller-manager-6576b87f9c-dlggp container/route-controller-manager namespace/openshift-route-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.16:8443/healthz\": dial tcp 10.217.0.16:8443: connect: connection refused" start-of-body= Dec 03 06:47:31 crc kubenswrapper[4475]: I1203 06:47:31.018875 4475 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-dlggp" podUID="30615409-a282-4405-afab-4802d9c27a3a" containerName="route-controller-manager" probeResult="failure" output="Get \"https://10.217.0.16:8443/healthz\": dial tcp 10.217.0.16:8443: connect: connection refused" Dec 03 06:47:31 crc kubenswrapper[4475]: I1203 06:47:31.023310 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lmztt\" (UniqueName: \"kubernetes.io/projected/7512ca6f-f6ab-4fd1-8b6a-96f221b07b95-kube-api-access-lmztt\") pod \"service-ca-9c57cc56f-9dd2n\" (UID: \"7512ca6f-f6ab-4fd1-8b6a-96f221b07b95\") " pod="openshift-service-ca/service-ca-9c57cc56f-9dd2n" Dec 03 06:47:31 crc kubenswrapper[4475]: I1203 06:47:31.034622 4475 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"csi-data-dir\" (UniqueName: \"kubernetes.io/host-path/ba21124e-33f1-4cf7-8bc2-483a5810191d-csi-data-dir\") pod \"csi-hostpathplugin-86l88\" (UID: \"ba21124e-33f1-4cf7-8bc2-483a5810191d\") " pod="hostpath-provisioner/csi-hostpathplugin-86l88" Dec 03 06:47:31 crc kubenswrapper[4475]: I1203 06:47:31.034677 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8nm78\" (UniqueName: \"kubernetes.io/projected/94977097-803b-4e49-a295-fd4bb3925da0-kube-api-access-8nm78\") pod \"machine-config-server-ftshs\" (UID: \"94977097-803b-4e49-a295-fd4bb3925da0\") " pod="openshift-machine-config-operator/machine-config-server-ftshs" Dec 03 06:47:31 crc kubenswrapper[4475]: I1203 06:47:31.034713 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"mountpoint-dir\" (UniqueName: \"kubernetes.io/host-path/ba21124e-33f1-4cf7-8bc2-483a5810191d-mountpoint-dir\") pod \"csi-hostpathplugin-86l88\" (UID: \"ba21124e-33f1-4cf7-8bc2-483a5810191d\") " pod="hostpath-provisioner/csi-hostpathplugin-86l88" Dec 03 06:47:31 crc kubenswrapper[4475]: I1203 06:47:31.034731 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-dcjv5\" (UID: \"fe214ce1-0821-4547-ac8b-e001a0579495\") " pod="openshift-image-registry/image-registry-697d97f7c8-dcjv5" Dec 03 06:47:31 crc kubenswrapper[4475]: I1203 06:47:31.034754 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/ba21124e-33f1-4cf7-8bc2-483a5810191d-registration-dir\") pod \"csi-hostpathplugin-86l88\" (UID: \"ba21124e-33f1-4cf7-8bc2-483a5810191d\") " pod="hostpath-provisioner/csi-hostpathplugin-86l88" Dec 03 06:47:31 crc kubenswrapper[4475]: 
I1203 06:47:31.034767 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/8b190c59-5d6a-403c-8610-da9297d3828a-config-volume\") pod \"dns-default-685k2\" (UID: \"8b190c59-5d6a-403c-8610-da9297d3828a\") " pod="openshift-dns/dns-default-685k2" Dec 03 06:47:31 crc kubenswrapper[4475]: I1203 06:47:31.034782 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hrcwg\" (UniqueName: \"kubernetes.io/projected/c78eadf7-ca95-40dd-b425-da00c2875c8f-kube-api-access-hrcwg\") pod \"ingress-canary-cplh2\" (UID: \"c78eadf7-ca95-40dd-b425-da00c2875c8f\") " pod="openshift-ingress-canary/ingress-canary-cplh2" Dec 03 06:47:31 crc kubenswrapper[4475]: I1203 06:47:31.034814 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/ba21124e-33f1-4cf7-8bc2-483a5810191d-socket-dir\") pod \"csi-hostpathplugin-86l88\" (UID: \"ba21124e-33f1-4cf7-8bc2-483a5810191d\") " pod="hostpath-provisioner/csi-hostpathplugin-86l88" Dec 03 06:47:31 crc kubenswrapper[4475]: I1203 06:47:31.034840 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/94977097-803b-4e49-a295-fd4bb3925da0-node-bootstrap-token\") pod \"machine-config-server-ftshs\" (UID: \"94977097-803b-4e49-a295-fd4bb3925da0\") " pod="openshift-machine-config-operator/machine-config-server-ftshs" Dec 03 06:47:31 crc kubenswrapper[4475]: I1203 06:47:31.034858 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8zzxw\" (UniqueName: \"kubernetes.io/projected/8b190c59-5d6a-403c-8610-da9297d3828a-kube-api-access-8zzxw\") pod \"dns-default-685k2\" (UID: \"8b190c59-5d6a-403c-8610-da9297d3828a\") " pod="openshift-dns/dns-default-685k2" Dec 03 06:47:31 crc kubenswrapper[4475]: I1203 06:47:31.034873 4475 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/8b190c59-5d6a-403c-8610-da9297d3828a-metrics-tls\") pod \"dns-default-685k2\" (UID: \"8b190c59-5d6a-403c-8610-da9297d3828a\") " pod="openshift-dns/dns-default-685k2" Dec 03 06:47:31 crc kubenswrapper[4475]: I1203 06:47:31.034895 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-dir\" (UniqueName: \"kubernetes.io/host-path/ba21124e-33f1-4cf7-8bc2-483a5810191d-plugins-dir\") pod \"csi-hostpathplugin-86l88\" (UID: \"ba21124e-33f1-4cf7-8bc2-483a5810191d\") " pod="hostpath-provisioner/csi-hostpathplugin-86l88" Dec 03 06:47:31 crc kubenswrapper[4475]: I1203 06:47:31.034924 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/c78eadf7-ca95-40dd-b425-da00c2875c8f-cert\") pod \"ingress-canary-cplh2\" (UID: \"c78eadf7-ca95-40dd-b425-da00c2875c8f\") " pod="openshift-ingress-canary/ingress-canary-cplh2" Dec 03 06:47:31 crc kubenswrapper[4475]: I1203 06:47:31.034940 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/secret/94977097-803b-4e49-a295-fd4bb3925da0-certs\") pod \"machine-config-server-ftshs\" (UID: \"94977097-803b-4e49-a295-fd4bb3925da0\") " pod="openshift-machine-config-operator/machine-config-server-ftshs" Dec 03 06:47:31 crc kubenswrapper[4475]: I1203 06:47:31.035002 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cv8wf\" (UniqueName: \"kubernetes.io/projected/ba21124e-33f1-4cf7-8bc2-483a5810191d-kube-api-access-cv8wf\") pod \"csi-hostpathplugin-86l88\" (UID: \"ba21124e-33f1-4cf7-8bc2-483a5810191d\") " pod="hostpath-provisioner/csi-hostpathplugin-86l88" Dec 03 06:47:31 crc kubenswrapper[4475]: I1203 06:47:31.035192 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"csi-data-dir\" 
(UniqueName: \"kubernetes.io/host-path/ba21124e-33f1-4cf7-8bc2-483a5810191d-csi-data-dir\") pod \"csi-hostpathplugin-86l88\" (UID: \"ba21124e-33f1-4cf7-8bc2-483a5810191d\") " pod="hostpath-provisioner/csi-hostpathplugin-86l88" Dec 03 06:47:31 crc kubenswrapper[4475]: I1203 06:47:31.035987 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"mountpoint-dir\" (UniqueName: \"kubernetes.io/host-path/ba21124e-33f1-4cf7-8bc2-483a5810191d-mountpoint-dir\") pod \"csi-hostpathplugin-86l88\" (UID: \"ba21124e-33f1-4cf7-8bc2-483a5810191d\") " pod="hostpath-provisioner/csi-hostpathplugin-86l88" Dec 03 06:47:31 crc kubenswrapper[4475]: E1203 06:47:31.036168 4475 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-03 06:47:31.536158353 +0000 UTC m=+136.341056686 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-dcjv5" (UID: "fe214ce1-0821-4547-ac8b-e001a0579495") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 03 06:47:31 crc kubenswrapper[4475]: I1203 06:47:31.036429 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/ba21124e-33f1-4cf7-8bc2-483a5810191d-registration-dir\") pod \"csi-hostpathplugin-86l88\" (UID: \"ba21124e-33f1-4cf7-8bc2-483a5810191d\") " pod="hostpath-provisioner/csi-hostpathplugin-86l88" Dec 03 06:47:31 crc kubenswrapper[4475]: I1203 06:47:31.037205 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: 
\"kubernetes.io/configmap/8b190c59-5d6a-403c-8610-da9297d3828a-config-volume\") pod \"dns-default-685k2\" (UID: \"8b190c59-5d6a-403c-8610-da9297d3828a\") " pod="openshift-dns/dns-default-685k2" Dec 03 06:47:31 crc kubenswrapper[4475]: I1203 06:47:31.037359 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/ba21124e-33f1-4cf7-8bc2-483a5810191d-socket-dir\") pod \"csi-hostpathplugin-86l88\" (UID: \"ba21124e-33f1-4cf7-8bc2-483a5810191d\") " pod="hostpath-provisioner/csi-hostpathplugin-86l88" Dec 03 06:47:31 crc kubenswrapper[4475]: I1203 06:47:31.038382 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-dir\" (UniqueName: \"kubernetes.io/host-path/ba21124e-33f1-4cf7-8bc2-483a5810191d-plugins-dir\") pod \"csi-hostpathplugin-86l88\" (UID: \"ba21124e-33f1-4cf7-8bc2-483a5810191d\") " pod="hostpath-provisioner/csi-hostpathplugin-86l88" Dec 03 06:47:31 crc kubenswrapper[4475]: I1203 06:47:31.044427 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/c78eadf7-ca95-40dd-b425-da00c2875c8f-cert\") pod \"ingress-canary-cplh2\" (UID: \"c78eadf7-ca95-40dd-b425-da00c2875c8f\") " pod="openshift-ingress-canary/ingress-canary-cplh2" Dec 03 06:47:31 crc kubenswrapper[4475]: I1203 06:47:31.045988 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/8b190c59-5d6a-403c-8610-da9297d3828a-metrics-tls\") pod \"dns-default-685k2\" (UID: \"8b190c59-5d6a-403c-8610-da9297d3828a\") " pod="openshift-dns/dns-default-685k2" Dec 03 06:47:31 crc kubenswrapper[4475]: I1203 06:47:31.059105 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/fd3e6416-4e59-4ef8-a778-91dc78b6fc71-bound-sa-token\") pod \"ingress-operator-5b745b69d9-lgjdg\" (UID: \"fd3e6416-4e59-4ef8-a778-91dc78b6fc71\") " 
pod="openshift-ingress-operator/ingress-operator-5b745b69d9-lgjdg" Dec 03 06:47:31 crc kubenswrapper[4475]: I1203 06:47:31.059264 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"certs\" (UniqueName: \"kubernetes.io/secret/94977097-803b-4e49-a295-fd4bb3925da0-certs\") pod \"machine-config-server-ftshs\" (UID: \"94977097-803b-4e49-a295-fd4bb3925da0\") " pod="openshift-machine-config-operator/machine-config-server-ftshs" Dec 03 06:47:31 crc kubenswrapper[4475]: I1203 06:47:31.060942 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/94977097-803b-4e49-a295-fd4bb3925da0-node-bootstrap-token\") pod \"machine-config-server-ftshs\" (UID: \"94977097-803b-4e49-a295-fd4bb3925da0\") " pod="openshift-machine-config-operator/machine-config-server-ftshs" Dec 03 06:47:31 crc kubenswrapper[4475]: I1203 06:47:31.079262 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/f82e633a-87fa-4fe2-b664-b831eaa4d67a-kube-api-access\") pod \"kube-apiserver-operator-766d6c64bb-4mfpz\" (UID: \"f82e633a-87fa-4fe2-b664-b831eaa4d67a\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-4mfpz" Dec 03 06:47:31 crc kubenswrapper[4475]: I1203 06:47:31.082068 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-q8phn\" (UniqueName: \"kubernetes.io/projected/c279294d-fe07-48d3-800d-7d73eba69c17-kube-api-access-q8phn\") pod \"downloads-7954f5f757-dbjfd\" (UID: \"c279294d-fe07-48d3-800d-7d73eba69c17\") " pod="openshift-console/downloads-7954f5f757-dbjfd" Dec 03 06:47:31 crc kubenswrapper[4475]: I1203 06:47:31.084729 4475 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-xklvd" Dec 03 06:47:31 crc kubenswrapper[4475]: I1203 06:47:31.105406 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fq9gs\" (UniqueName: \"kubernetes.io/projected/4bcc3f4b-e69d-41c0-8940-03a15582a5fa-kube-api-access-fq9gs\") pod \"openshift-apiserver-operator-796bbdcf4f-s9rqq\" (UID: \"4bcc3f4b-e69d-41c0-8940-03a15582a5fa\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-s9rqq" Dec 03 06:47:31 crc kubenswrapper[4475]: I1203 06:47:31.127361 4475 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-28t76"] Dec 03 06:47:31 crc kubenswrapper[4475]: I1203 06:47:31.129901 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fx45g\" (UniqueName: \"kubernetes.io/projected/5f1ef448-9656-4d56-9d7e-d0992ec24085-kube-api-access-fx45g\") pod \"apiserver-76f77b778f-jf25k\" (UID: \"5f1ef448-9656-4d56-9d7e-d0992ec24085\") " pod="openshift-apiserver/apiserver-76f77b778f-jf25k" Dec 03 06:47:31 crc kubenswrapper[4475]: I1203 06:47:31.135882 4475 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 03 06:47:31 crc kubenswrapper[4475]: E1203 06:47:31.137086 4475 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-03 06:47:31.637070872 +0000 UTC m=+136.441969206 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 03 06:47:31 crc kubenswrapper[4475]: I1203 06:47:31.152602 4475 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca/service-ca-9c57cc56f-9dd2n" Dec 03 06:47:31 crc kubenswrapper[4475]: I1203 06:47:31.167573 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5rmsw\" (UniqueName: \"kubernetes.io/projected/fe214ce1-0821-4547-ac8b-e001a0579495-kube-api-access-5rmsw\") pod \"image-registry-697d97f7c8-dcjv5\" (UID: \"fe214ce1-0821-4547-ac8b-e001a0579495\") " pod="openshift-image-registry/image-registry-697d97f7c8-dcjv5" Dec 03 06:47:31 crc kubenswrapper[4475]: I1203 06:47:31.173872 4475 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-4mfpz" Dec 03 06:47:31 crc kubenswrapper[4475]: I1203 06:47:31.176698 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-g7kqv\" (UniqueName: \"kubernetes.io/projected/49030b3f-8084-40f7-a015-beaa64adcfa1-kube-api-access-g7kqv\") pod \"dns-operator-744455d44c-vfxhs\" (UID: \"49030b3f-8084-40f7-a015-beaa64adcfa1\") " pod="openshift-dns-operator/dns-operator-744455d44c-vfxhs" Dec 03 06:47:31 crc kubenswrapper[4475]: I1203 06:47:31.192911 4475 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-mk54s" Dec 03 06:47:31 crc kubenswrapper[4475]: I1203 06:47:31.193353 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2hf6l\" (UniqueName: \"kubernetes.io/projected/bc1b6bc4-044a-429f-bcc3-9afc4be0acef-kube-api-access-2hf6l\") pod \"catalog-operator-68c6474976-q47j6\" (UID: \"bc1b6bc4-044a-429f-bcc3-9afc4be0acef\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-q47j6" Dec 03 06:47:31 crc kubenswrapper[4475]: I1203 06:47:31.196725 4475 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-q47j6" Dec 03 06:47:31 crc kubenswrapper[4475]: I1203 06:47:31.209342 4475 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-7kcnv"] Dec 03 06:47:31 crc kubenswrapper[4475]: I1203 06:47:31.219732 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6c4rt\" (UniqueName: \"kubernetes.io/projected/b67ad2cb-0cdd-4906-81c9-5c4597207aa3-kube-api-access-6c4rt\") pod \"machine-config-operator-74547568cd-gcnjp\" (UID: \"b67ad2cb-0cdd-4906-81c9-5c4597207aa3\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-gcnjp" Dec 03 06:47:31 crc kubenswrapper[4475]: W1203 06:47:31.235077 4475 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0fd3dcd7_41fd_4e0c_be75_e8464be7696e.slice/crio-d59e341cccc27ba6bcc48096f97da7941bee6c558c21fcba5dc331662912fd6d WatchSource:0}: Error finding container d59e341cccc27ba6bcc48096f97da7941bee6c558c21fcba5dc331662912fd6d: Status 404 returned error can't find the container with id d59e341cccc27ba6bcc48096f97da7941bee6c558c21fcba5dc331662912fd6d Dec 03 06:47:31 crc kubenswrapper[4475]: I1203 06:47:31.238203 
4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-dcjv5\" (UID: \"fe214ce1-0821-4547-ac8b-e001a0579495\") " pod="openshift-image-registry/image-registry-697d97f7c8-dcjv5" Dec 03 06:47:31 crc kubenswrapper[4475]: E1203 06:47:31.238505 4475 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-03 06:47:31.738495892 +0000 UTC m=+136.543394226 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-dcjv5" (UID: "fe214ce1-0821-4547-ac8b-e001a0579495") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 03 06:47:31 crc kubenswrapper[4475]: I1203 06:47:31.244544 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-85hg7\" (UniqueName: \"kubernetes.io/projected/d6480a9d-cca4-48ca-92e3-85d84e96f012-kube-api-access-85hg7\") pod \"olm-operator-6b444d44fb-t2thq\" (UID: \"d6480a9d-cca4-48ca-92e3-85d84e96f012\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-t2thq" Dec 03 06:47:31 crc kubenswrapper[4475]: I1203 06:47:31.253137 4475 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns-operator/dns-operator-744455d44c-vfxhs" Dec 03 06:47:31 crc kubenswrapper[4475]: I1203 06:47:31.259630 4475 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-lgjdg" Dec 03 06:47:31 crc kubenswrapper[4475]: I1203 06:47:31.286137 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-b59ff\" (UniqueName: \"kubernetes.io/projected/68fc03c8-5d8c-4a1d-8987-474a75454d0f-kube-api-access-b59ff\") pod \"control-plane-machine-set-operator-78cbb6b69f-48hz9\" (UID: \"68fc03c8-5d8c-4a1d-8987-474a75454d0f\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-48hz9" Dec 03 06:47:31 crc kubenswrapper[4475]: I1203 06:47:31.293330 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rcn9q\" (UniqueName: \"kubernetes.io/projected/825ff335-38b1-481e-b6e4-ac1cee0d4408-kube-api-access-rcn9q\") pod \"package-server-manager-789f6589d5-6bm4d\" (UID: \"825ff335-38b1-481e-b6e4-ac1cee0d4408\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-6bm4d" Dec 03 06:47:31 crc kubenswrapper[4475]: I1203 06:47:31.312604 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cv8wf\" (UniqueName: \"kubernetes.io/projected/ba21124e-33f1-4cf7-8bc2-483a5810191d-kube-api-access-cv8wf\") pod \"csi-hostpathplugin-86l88\" (UID: \"ba21124e-33f1-4cf7-8bc2-483a5810191d\") " pod="hostpath-provisioner/csi-hostpathplugin-86l88" Dec 03 06:47:31 crc kubenswrapper[4475]: I1203 06:47:31.335832 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8nm78\" (UniqueName: \"kubernetes.io/projected/94977097-803b-4e49-a295-fd4bb3925da0-kube-api-access-8nm78\") pod \"machine-config-server-ftshs\" (UID: \"94977097-803b-4e49-a295-fd4bb3925da0\") " pod="openshift-machine-config-operator/machine-config-server-ftshs" Dec 03 06:47:31 crc kubenswrapper[4475]: I1203 06:47:31.342887 4475 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" 
(UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 03 06:47:31 crc kubenswrapper[4475]: E1203 06:47:31.343179 4475 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-03 06:47:31.843165011 +0000 UTC m=+136.648063346 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 03 06:47:31 crc kubenswrapper[4475]: I1203 06:47:31.352057 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hrcwg\" (UniqueName: \"kubernetes.io/projected/c78eadf7-ca95-40dd-b425-da00c2875c8f-kube-api-access-hrcwg\") pod \"ingress-canary-cplh2\" (UID: \"c78eadf7-ca95-40dd-b425-da00c2875c8f\") " pod="openshift-ingress-canary/ingress-canary-cplh2" Dec 03 06:47:31 crc kubenswrapper[4475]: I1203 06:47:31.362659 4475 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-apiserver/apiserver-76f77b778f-jf25k" Dec 03 06:47:31 crc kubenswrapper[4475]: I1203 06:47:31.373873 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8zzxw\" (UniqueName: \"kubernetes.io/projected/8b190c59-5d6a-403c-8610-da9297d3828a-kube-api-access-8zzxw\") pod \"dns-default-685k2\" (UID: \"8b190c59-5d6a-403c-8610-da9297d3828a\") " pod="openshift-dns/dns-default-685k2" Dec 03 06:47:31 crc kubenswrapper[4475]: I1203 06:47:31.374967 4475 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-s9rqq" Dec 03 06:47:31 crc kubenswrapper[4475]: I1203 06:47:31.379900 4475 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/downloads-7954f5f757-dbjfd" Dec 03 06:47:31 crc kubenswrapper[4475]: I1203 06:47:31.430198 4475 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-gcnjp" Dec 03 06:47:31 crc kubenswrapper[4475]: I1203 06:47:31.438766 4475 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-etcd-operator/etcd-operator-b45778765-chxcn"] Dec 03 06:47:31 crc kubenswrapper[4475]: I1203 06:47:31.438811 4475 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-controller-84d6567774-xv2gh"] Dec 03 06:47:31 crc kubenswrapper[4475]: I1203 06:47:31.439519 4475 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-48hz9" Dec 03 06:47:31 crc kubenswrapper[4475]: I1203 06:47:31.444174 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-dcjv5\" (UID: \"fe214ce1-0821-4547-ac8b-e001a0579495\") " pod="openshift-image-registry/image-registry-697d97f7c8-dcjv5" Dec 03 06:47:31 crc kubenswrapper[4475]: E1203 06:47:31.444406 4475 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-03 06:47:31.944396759 +0000 UTC m=+136.749295093 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-dcjv5" (UID: "fe214ce1-0821-4547-ac8b-e001a0579495") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 03 06:47:31 crc kubenswrapper[4475]: I1203 06:47:31.476676 4475 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication-operator/authentication-operator-69f744f599-6lqs9"] Dec 03 06:47:31 crc kubenswrapper[4475]: I1203 06:47:31.494991 4475 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-9tsjv"] Dec 03 06:47:31 crc kubenswrapper[4475]: I1203 06:47:31.497559 4475 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-t2thq" Dec 03 06:47:31 crc kubenswrapper[4475]: I1203 06:47:31.515694 4475 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-6bm4d" Dec 03 06:47:31 crc kubenswrapper[4475]: I1203 06:47:31.520251 4475 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-config-operator/openshift-config-operator-7777fb866f-q5cjz"] Dec 03 06:47:31 crc kubenswrapper[4475]: I1203 06:47:31.526310 4475 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-6gvwt"] Dec 03 06:47:31 crc kubenswrapper[4475]: I1203 06:47:31.542863 4475 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca-operator/service-ca-operator-777779d784-mw2kv"] Dec 03 06:47:31 crc kubenswrapper[4475]: I1203 06:47:31.544913 4475 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 03 06:47:31 crc kubenswrapper[4475]: E1203 06:47:31.545327 4475 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-03 06:47:32.045312975 +0000 UTC m=+136.850211309 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 03 06:47:31 crc kubenswrapper[4475]: I1203 06:47:31.550467 4475 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29412405-wwr7n"] Dec 03 06:47:31 crc kubenswrapper[4475]: I1203 06:47:31.581656 4475 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="hostpath-provisioner/csi-hostpathplugin-86l88" Dec 03 06:47:31 crc kubenswrapper[4475]: I1203 06:47:31.586911 4475 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-cplh2" Dec 03 06:47:31 crc kubenswrapper[4475]: I1203 06:47:31.594415 4475 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-685k2" Dec 03 06:47:31 crc kubenswrapper[4475]: I1203 06:47:31.601397 4475 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/machine-config-server-ftshs" Dec 03 06:47:31 crc kubenswrapper[4475]: I1203 06:47:31.624665 4475 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-xklvd"] Dec 03 06:47:31 crc kubenswrapper[4475]: W1203 06:47:31.645032 4475 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6abdfe97_cf24_4ec5_8aee_f67cf30bb2e3.slice/crio-d7964de67cc2f4ab75ab26c4b3dcdc93b449f48eb54ea604dff06a36644912e3 WatchSource:0}: Error finding container d7964de67cc2f4ab75ab26c4b3dcdc93b449f48eb54ea604dff06a36644912e3: Status 404 returned error can't find the container with id d7964de67cc2f4ab75ab26c4b3dcdc93b449f48eb54ea604dff06a36644912e3 Dec 03 06:47:31 crc kubenswrapper[4475]: I1203 06:47:31.647265 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-dcjv5\" (UID: \"fe214ce1-0821-4547-ac8b-e001a0579495\") " pod="openshift-image-registry/image-registry-697d97f7c8-dcjv5" Dec 03 06:47:31 crc kubenswrapper[4475]: E1203 06:47:31.647527 4475 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-03 06:47:32.147517557 +0000 UTC m=+136.952415890 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-dcjv5" (UID: "fe214ce1-0821-4547-ac8b-e001a0579495") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 03 06:47:31 crc kubenswrapper[4475]: I1203 06:47:31.674501 4475 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator/migrator-59844c95c7-4845w"] Dec 03 06:47:31 crc kubenswrapper[4475]: I1203 06:47:31.707640 4475 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca/service-ca-9c57cc56f-9dd2n"] Dec 03 06:47:31 crc kubenswrapper[4475]: W1203 06:47:31.744316 4475 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5e8a37fc_fca6_43a2_83f2_e4c3d7916343.slice/crio-50f5b6d130acb4378efb12e55c85db5f4223ecbd5b87a697781d5eaf62384457 WatchSource:0}: Error finding container 50f5b6d130acb4378efb12e55c85db5f4223ecbd5b87a697781d5eaf62384457: Status 404 returned error can't find the container with id 50f5b6d130acb4378efb12e55c85db5f4223ecbd5b87a697781d5eaf62384457 Dec 03 06:47:31 crc kubenswrapper[4475]: I1203 06:47:31.748365 4475 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 03 06:47:31 crc kubenswrapper[4475]: E1203 06:47:31.748630 4475 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 
podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-03 06:47:32.248616886 +0000 UTC m=+137.053515221 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 03 06:47:31 crc kubenswrapper[4475]: I1203 06:47:31.846283 4475 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-4mfpz"] Dec 03 06:47:31 crc kubenswrapper[4475]: I1203 06:47:31.859056 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-dcjv5\" (UID: \"fe214ce1-0821-4547-ac8b-e001a0579495\") " pod="openshift-image-registry/image-registry-697d97f7c8-dcjv5" Dec 03 06:47:31 crc kubenswrapper[4475]: E1203 06:47:31.860840 4475 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-03 06:47:32.360826947 +0000 UTC m=+137.165725281 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-dcjv5" (UID: "fe214ce1-0821-4547-ac8b-e001a0579495") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 03 06:47:31 crc kubenswrapper[4475]: I1203 06:47:31.887231 4475 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/multus-admission-controller-857f4d67dd-bghqv"] Dec 03 06:47:31 crc kubenswrapper[4475]: I1203 06:47:31.899312 4475 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-mk54s"] Dec 03 06:47:31 crc kubenswrapper[4475]: I1203 06:47:31.967107 4475 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 03 06:47:31 crc kubenswrapper[4475]: E1203 06:47:31.967622 4475 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-03 06:47:32.467609417 +0000 UTC m=+137.272507751 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 03 06:47:32 crc kubenswrapper[4475]: I1203 06:47:32.064246 4475 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-dlggp" podStartSLOduration=118.064215898 podStartE2EDuration="1m58.064215898s" podCreationTimestamp="2025-12-03 06:45:34 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 06:47:32.062526652 +0000 UTC m=+136.867424986" watchObservedRunningTime="2025-12-03 06:47:32.064215898 +0000 UTC m=+136.869114232" Dec 03 06:47:32 crc kubenswrapper[4475]: I1203 06:47:32.067196 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-28t76" event={"ID":"0fd3dcd7-41fd-4e0c-be75-e8464be7696e","Type":"ContainerStarted","Data":"70dba1fede53f576be79db68186d25ef658db82fadcbfdb3501faa299a658a50"} Dec 03 06:47:32 crc kubenswrapper[4475]: I1203 06:47:32.067229 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-28t76" event={"ID":"0fd3dcd7-41fd-4e0c-be75-e8464be7696e","Type":"ContainerStarted","Data":"d59e341cccc27ba6bcc48096f97da7941bee6c558c21fcba5dc331662912fd6d"} Dec 03 06:47:32 crc kubenswrapper[4475]: I1203 06:47:32.068061 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: 
\"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-dcjv5\" (UID: \"fe214ce1-0821-4547-ac8b-e001a0579495\") " pod="openshift-image-registry/image-registry-697d97f7c8-dcjv5" Dec 03 06:47:32 crc kubenswrapper[4475]: E1203 06:47:32.068351 4475 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-03 06:47:32.568341478 +0000 UTC m=+137.373239812 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-dcjv5" (UID: "fe214ce1-0821-4547-ac8b-e001a0579495") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 03 06:47:32 crc kubenswrapper[4475]: I1203 06:47:32.073890 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-xv2gh" event={"ID":"842ad738-0ffb-4986-9372-a26f8bc6119a","Type":"ContainerStarted","Data":"6fa232e9aa5d0a14e2b0264a08def2102e58a64dc743875374d2a4e5b4114492"} Dec 03 06:47:32 crc kubenswrapper[4475]: I1203 06:47:32.073914 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-xv2gh" event={"ID":"842ad738-0ffb-4986-9372-a26f8bc6119a","Type":"ContainerStarted","Data":"629eab40a34a46037c957b6e15a24b47b349b029571b9318683bc64d4c7ade32"} Dec 03 06:47:32 crc kubenswrapper[4475]: I1203 06:47:32.085091 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-5444994796-wkrx4" 
event={"ID":"21a4d7e9-ea88-4f43-9d43-109df1bb4766","Type":"ContainerStarted","Data":"5eb87193fab95b171b6c7f475f7f0aa7d9f0db0ab51ebac96793319ec76bdc16"} Dec 03 06:47:32 crc kubenswrapper[4475]: I1203 06:47:32.085116 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-5444994796-wkrx4" event={"ID":"21a4d7e9-ea88-4f43-9d43-109df1bb4766","Type":"ContainerStarted","Data":"9d5b62220df5f583660bd29cf177bab4e97597153bd43a3d3b0e250150bea273"} Dec 03 06:47:32 crc kubenswrapper[4475]: I1203 06:47:32.097169 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-777779d784-mw2kv" event={"ID":"5e8a37fc-fca6-43a2-83f2-e4c3d7916343","Type":"ContainerStarted","Data":"50f5b6d130acb4378efb12e55c85db5f4223ecbd5b87a697781d5eaf62384457"} Dec 03 06:47:32 crc kubenswrapper[4475]: I1203 06:47:32.103890 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-7kcnv" event={"ID":"3263d9b9-b7e8-4758-a6a0-85749e84317a","Type":"ContainerStarted","Data":"8cb7909070a58477b0f2d4a663a9e063f9a3f518c1e913b9d922836d450c50f7"} Dec 03 06:47:32 crc kubenswrapper[4475]: I1203 06:47:32.103924 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-7kcnv" event={"ID":"3263d9b9-b7e8-4758-a6a0-85749e84317a","Type":"ContainerStarted","Data":"75518b836b3f0c095d60b2d9b2ceb070eec566ed5f4a41c5f46f1cea0043159e"} Dec 03 06:47:32 crc kubenswrapper[4475]: I1203 06:47:32.103938 4475 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-879f6c89f-7kcnv" Dec 03 06:47:32 crc kubenswrapper[4475]: I1203 06:47:32.106180 4475 patch_prober.go:28] interesting pod/controller-manager-879f6c89f-7kcnv container/controller-manager namespace/openshift-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.7:8443/healthz\": dial tcp 
10.217.0.7:8443: connect: connection refused" start-of-body= Dec 03 06:47:32 crc kubenswrapper[4475]: I1203 06:47:32.106218 4475 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-controller-manager/controller-manager-879f6c89f-7kcnv" podUID="3263d9b9-b7e8-4758-a6a0-85749e84317a" containerName="controller-manager" probeResult="failure" output="Get \"https://10.217.0.7:8443/healthz\": dial tcp 10.217.0.7:8443: connect: connection refused" Dec 03 06:47:32 crc kubenswrapper[4475]: I1203 06:47:32.123889 4475 generic.go:334] "Generic (PLEG): container finished" podID="fab1c20e-bbf0-442f-ada0-5647d493ad6c" containerID="98664a9a22534d28521b48f40dc6c4703de2a88f0e038a37d1aeefe373a5e2b6" exitCode=0 Dec 03 06:47:32 crc kubenswrapper[4475]: I1203 06:47:32.123954 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-tjg56" event={"ID":"fab1c20e-bbf0-442f-ada0-5647d493ad6c","Type":"ContainerDied","Data":"98664a9a22534d28521b48f40dc6c4703de2a88f0e038a37d1aeefe373a5e2b6"} Dec 03 06:47:32 crc kubenswrapper[4475]: I1203 06:47:32.127481 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-mk54s" event={"ID":"cf62ac16-a57b-4f21-8cb1-97dfbc8b779a","Type":"ContainerStarted","Data":"230e16f785dc1fbe4f06d91520868575d7033f1d530d20090ee53a8a31064211"} Dec 03 06:47:32 crc kubenswrapper[4475]: I1203 06:47:32.136590 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-8pdgn" event={"ID":"dcef9514-d760-40e8-9054-75b17a2dde9f","Type":"ContainerStarted","Data":"28d7d215cded53060a27fc065232aba14708f0199d5ab5b225ae719d67ccbf59"} Dec 03 06:47:32 crc kubenswrapper[4475]: I1203 06:47:32.141487 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-x5sj6" 
event={"ID":"8ccba36e-7b88-4f9b-9706-1e441aa2c59a","Type":"ContainerStarted","Data":"9f6d66a755405a8d4b0da7bf6816eb09a4f0a397bbfad40d96bfb393b2c7f694"} Dec 03 06:47:32 crc kubenswrapper[4475]: I1203 06:47:32.141929 4475 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-x5sj6" Dec 03 06:47:32 crc kubenswrapper[4475]: I1203 06:47:32.150174 4475 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns-operator/dns-operator-744455d44c-vfxhs"] Dec 03 06:47:32 crc kubenswrapper[4475]: I1203 06:47:32.158343 4475 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-lw5ml" podStartSLOduration=118.158327182 podStartE2EDuration="1m58.158327182s" podCreationTimestamp="2025-12-03 06:45:34 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 06:47:32.154542428 +0000 UTC m=+136.959440762" watchObservedRunningTime="2025-12-03 06:47:32.158327182 +0000 UTC m=+136.963225516" Dec 03 06:47:32 crc kubenswrapper[4475]: I1203 06:47:32.172922 4475 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 03 06:47:32 crc kubenswrapper[4475]: E1203 06:47:32.173913 4475 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-03 06:47:32.673899915 +0000 UTC m=+137.478798248 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 03 06:47:32 crc kubenswrapper[4475]: I1203 06:47:32.190379 4475 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/downloads-7954f5f757-dbjfd"] Dec 03 06:47:32 crc kubenswrapper[4475]: I1203 06:47:32.194838 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-xklvd" event={"ID":"abd6b2f3-d4e8-4a7d-9e60-f04e50130dbf","Type":"ContainerStarted","Data":"2f569e561e4406051a06456f76d1cbfc1fab7f60edaa979c0f69172930cc6f66"} Dec 03 06:47:32 crc kubenswrapper[4475]: I1203 06:47:32.211991 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-kfvwc" event={"ID":"7f6163e8-ce0d-481b-8483-4b9e04d381e6","Type":"ContainerStarted","Data":"2bfaf119407f4b0f1539a6e9269b77d93580b274d0cc224ede9a1a17eb354cdc"} Dec 03 06:47:32 crc kubenswrapper[4475]: I1203 06:47:32.213267 4475 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/marketplace-operator-79b997595-kfvwc" Dec 03 06:47:32 crc kubenswrapper[4475]: I1203 06:47:32.215541 4475 patch_prober.go:28] interesting pod/marketplace-operator-79b997595-kfvwc container/marketplace-operator namespace/openshift-marketplace: Readiness probe status=failure output="Get \"http://10.217.0.28:8080/healthz\": dial tcp 10.217.0.28:8080: connect: connection refused" start-of-body= Dec 03 06:47:32 crc kubenswrapper[4475]: I1203 06:47:32.215577 4475 prober.go:107] "Probe failed" probeType="Readiness" 
pod="openshift-marketplace/marketplace-operator-79b997595-kfvwc" podUID="7f6163e8-ce0d-481b-8483-4b9e04d381e6" containerName="marketplace-operator" probeResult="failure" output="Get \"http://10.217.0.28:8080/healthz\": dial tcp 10.217.0.28:8080: connect: connection refused" Dec 03 06:47:32 crc kubenswrapper[4475]: I1203 06:47:32.216180 4475 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-f9d7485db-dbxhk" podStartSLOduration=118.216169347 podStartE2EDuration="1m58.216169347s" podCreationTimestamp="2025-12-03 06:45:34 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 06:47:32.190694278 +0000 UTC m=+136.995592612" watchObservedRunningTime="2025-12-03 06:47:32.216169347 +0000 UTC m=+137.021067681" Dec 03 06:47:32 crc kubenswrapper[4475]: I1203 06:47:32.217627 4475 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-operator/ingress-operator-5b745b69d9-lgjdg"] Dec 03 06:47:32 crc kubenswrapper[4475]: I1203 06:47:32.240738 4475 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/catalog-operator-68c6474976-q47j6"] Dec 03 06:47:32 crc kubenswrapper[4475]: I1203 06:47:32.247936 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/machine-api-operator-5694c8668f-n4bbj" event={"ID":"64eee81d-9ee2-4f0a-a95d-f32f9159e2a4","Type":"ContainerStarted","Data":"4483bf2f23732c3e51561b3ae80cc8660df4883f6be536fe0870813dd4b8e3db"} Dec 03 06:47:32 crc kubenswrapper[4475]: I1203 06:47:32.247977 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/machine-api-operator-5694c8668f-n4bbj" event={"ID":"64eee81d-9ee2-4f0a-a95d-f32f9159e2a4","Type":"ContainerStarted","Data":"0c16b4ee9a0cba9f3975b53a190227f15a5a96db533ff9d83263da75a252e381"} Dec 03 06:47:32 crc kubenswrapper[4475]: I1203 06:47:32.253015 4475 kubelet.go:2428] "SyncLoop 
UPDATE" source="api" pods=["openshift-apiserver/apiserver-76f77b778f-jf25k"] Dec 03 06:47:32 crc kubenswrapper[4475]: I1203 06:47:32.256629 4475 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-s9rqq"] Dec 03 06:47:32 crc kubenswrapper[4475]: I1203 06:47:32.270003 4475 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console-operator/console-operator-58897d9998-7m86k" podStartSLOduration=118.269993052 podStartE2EDuration="1m58.269993052s" podCreationTimestamp="2025-12-03 06:45:34 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 06:47:32.269699272 +0000 UTC m=+137.074597606" watchObservedRunningTime="2025-12-03 06:47:32.269993052 +0000 UTC m=+137.074891387" Dec 03 06:47:32 crc kubenswrapper[4475]: I1203 06:47:32.271333 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd-operator/etcd-operator-b45778765-chxcn" event={"ID":"fb67a319-3ec2-4759-bdfb-46452f4f7010","Type":"ContainerStarted","Data":"e7aa1d77f0918450620341660d3f457003490cba9a34b4e8e61afc5ca83603ca"} Dec 03 06:47:32 crc kubenswrapper[4475]: I1203 06:47:32.272857 4475 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-t2thq"] Dec 03 06:47:32 crc kubenswrapper[4475]: I1203 06:47:32.275156 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-dcjv5\" (UID: \"fe214ce1-0821-4547-ac8b-e001a0579495\") " pod="openshift-image-registry/image-registry-697d97f7c8-dcjv5" Dec 03 06:47:32 crc kubenswrapper[4475]: E1203 06:47:32.277305 4475 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-03 06:47:32.777296299 +0000 UTC m=+137.582194633 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-dcjv5" (UID: "fe214ce1-0821-4547-ac8b-e001a0579495") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 03 06:47:32 crc kubenswrapper[4475]: I1203 06:47:32.286022 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-857f4d67dd-bghqv" event={"ID":"c0036df6-fc1c-4945-97b0-7c6ce6e5f806","Type":"ContainerStarted","Data":"a9e40d5c0f13851583d934db9f109aaa6a632f80364a6b26b33689310221c63d"} Dec 03 06:47:32 crc kubenswrapper[4475]: I1203 06:47:32.293393 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-6gvwt" event={"ID":"423a50bb-8a96-49ba-99da-7258729fd2af","Type":"ContainerStarted","Data":"c07de6cc7ffc7afd0261a156894f4e489af2c7363b334d4a839627e8cc9b6302"} Dec 03 06:47:32 crc kubenswrapper[4475]: I1203 06:47:32.294340 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-9tsjv" event={"ID":"6abdfe97-cf24-4ec5-8aee-f67cf30bb2e3","Type":"ContainerStarted","Data":"d7964de67cc2f4ab75ab26c4b3dcdc93b449f48eb54ea604dff06a36644912e3"} Dec 03 06:47:32 crc kubenswrapper[4475]: I1203 06:47:32.296237 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-4845w" 
event={"ID":"80a45948-f52d-4e57-8611-37ea99eefb3c","Type":"ContainerStarted","Data":"475152484c5e8390bf80e114e77535a75c934f58a5e2c0e9dd4a6c2693d3db69"} Dec 03 06:47:32 crc kubenswrapper[4475]: I1203 06:47:32.303744 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca/service-ca-9c57cc56f-9dd2n" event={"ID":"7512ca6f-f6ab-4fd1-8b6a-96f221b07b95","Type":"ContainerStarted","Data":"e287ef67698de1e2232997e9b7023944b142e3f01a3a6559b810c4b84aee4586"} Dec 03 06:47:32 crc kubenswrapper[4475]: I1203 06:47:32.307570 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29412405-wwr7n" event={"ID":"83c4eef5-5508-470d-8b7a-b7da9d4706d4","Type":"ContainerStarted","Data":"41072a08398a16116e8aeb3f8cae52459d8c4d629f86ee71a567460fc26aae85"} Dec 03 06:47:32 crc kubenswrapper[4475]: I1203 06:47:32.311868 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-7777fb866f-q5cjz" event={"ID":"65b544d3-889f-4b29-ba88-961ad04782bf","Type":"ContainerStarted","Data":"b7df10beaa1e9cca4a717e6fc79452cb099e389054045e0e1a656a2704c7e4e3"} Dec 03 06:47:32 crc kubenswrapper[4475]: I1203 06:47:32.320615 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-4mfpz" event={"ID":"f82e633a-87fa-4fe2-b664-b831eaa4d67a","Type":"ContainerStarted","Data":"d7bf710f2fd72b86a6e7d65407e0c05cabcee7b44e8ba840d5e643e060c0a245"} Dec 03 06:47:32 crc kubenswrapper[4475]: I1203 06:47:32.332005 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication-operator/authentication-operator-69f744f599-6lqs9" event={"ID":"07c84deb-ccb6-4597-a122-fdc9f6acb015","Type":"ContainerStarted","Data":"60c8b9d0186bf6fe9b0555d1a9dfb5460612a586756ccc6c291e69c5d528a98f"} Dec 03 06:47:32 crc kubenswrapper[4475]: I1203 06:47:32.356427 4475 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" 
status="ready" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-dlggp" Dec 03 06:47:32 crc kubenswrapper[4475]: I1203 06:47:32.359900 4475 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console-operator/console-operator-58897d9998-7m86k" Dec 03 06:47:32 crc kubenswrapper[4475]: I1203 06:47:32.359924 4475 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-48hz9"] Dec 03 06:47:32 crc kubenswrapper[4475]: I1203 06:47:32.364918 4475 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-x5sj6" Dec 03 06:47:32 crc kubenswrapper[4475]: I1203 06:47:32.369026 4475 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-authentication/oauth-openshift-558db77b4-6r542" Dec 03 06:47:32 crc kubenswrapper[4475]: I1203 06:47:32.403795 4475 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 03 06:47:32 crc kubenswrapper[4475]: E1203 06:47:32.406103 4475 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-03 06:47:32.906087002 +0000 UTC m=+137.710985337 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 03 06:47:32 crc kubenswrapper[4475]: W1203 06:47:32.406697 4475 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod68fc03c8_5d8c_4a1d_8987_474a75454d0f.slice/crio-a311489ed18ff9929e5789dda6a0a883bdf3f82ace8f787f0921ed73d1270393 WatchSource:0}: Error finding container a311489ed18ff9929e5789dda6a0a883bdf3f82ace8f787f0921ed73d1270393: Status 404 returned error can't find the container with id a311489ed18ff9929e5789dda6a0a883bdf3f82ace8f787f0921ed73d1270393 Dec 03 06:47:32 crc kubenswrapper[4475]: I1203 06:47:32.413759 4475 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-operator-74547568cd-gcnjp"] Dec 03 06:47:32 crc kubenswrapper[4475]: W1203 06:47:32.481359 4475 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb67ad2cb_0cdd_4906_81c9_5c4597207aa3.slice/crio-d8143098d5e051310c7b211405616604df9e0860072f00fc695169b0ec7ebb0e WatchSource:0}: Error finding container d8143098d5e051310c7b211405616604df9e0860072f00fc695169b0ec7ebb0e: Status 404 returned error can't find the container with id d8143098d5e051310c7b211405616604df9e0860072f00fc695169b0ec7ebb0e Dec 03 06:47:32 crc kubenswrapper[4475]: I1203 06:47:32.506744 4475 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["hostpath-provisioner/csi-hostpathplugin-86l88"] Dec 03 06:47:32 crc kubenswrapper[4475]: I1203 06:47:32.506984 4475 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-dcjv5\" (UID: \"fe214ce1-0821-4547-ac8b-e001a0579495\") " pod="openshift-image-registry/image-registry-697d97f7c8-dcjv5" Dec 03 06:47:32 crc kubenswrapper[4475]: E1203 06:47:32.509137 4475 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-03 06:47:33.009125465 +0000 UTC m=+137.814023800 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-dcjv5" (UID: "fe214ce1-0821-4547-ac8b-e001a0579495") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 03 06:47:32 crc kubenswrapper[4475]: I1203 06:47:32.514244 4475 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-authentication/oauth-openshift-558db77b4-6r542" podStartSLOduration=119.514231545 podStartE2EDuration="1m59.514231545s" podCreationTimestamp="2025-12-03 06:45:33 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 06:47:32.505965895 +0000 UTC m=+137.310864239" watchObservedRunningTime="2025-12-03 06:47:32.514231545 +0000 UTC m=+137.319129879" Dec 03 06:47:32 crc kubenswrapper[4475]: I1203 06:47:32.608547 4475 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: 
\"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 03 06:47:32 crc kubenswrapper[4475]: E1203 06:47:32.608881 4475 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-03 06:47:33.108868103 +0000 UTC m=+137.913766437 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 03 06:47:32 crc kubenswrapper[4475]: I1203 06:47:32.612045 4475 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-685k2"] Dec 03 06:47:32 crc kubenswrapper[4475]: I1203 06:47:32.661755 4475 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-6bm4d"] Dec 03 06:47:32 crc kubenswrapper[4475]: I1203 06:47:32.712052 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-dcjv5\" (UID: \"fe214ce1-0821-4547-ac8b-e001a0579495\") " pod="openshift-image-registry/image-registry-697d97f7c8-dcjv5" Dec 03 06:47:32 crc kubenswrapper[4475]: E1203 06:47:32.712778 4475 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-03 06:47:33.212760978 +0000 UTC m=+138.017659311 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-dcjv5" (UID: "fe214ce1-0821-4547-ac8b-e001a0579495") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 03 06:47:32 crc kubenswrapper[4475]: I1203 06:47:32.816223 4475 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 03 06:47:32 crc kubenswrapper[4475]: E1203 06:47:32.819081 4475 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-03 06:47:33.319066163 +0000 UTC m=+138.123964498 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 03 06:47:32 crc kubenswrapper[4475]: I1203 06:47:32.819638 4475 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-cplh2"] Dec 03 06:47:32 crc kubenswrapper[4475]: I1203 06:47:32.858929 4475 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-ingress/router-default-5444994796-wkrx4" Dec 03 06:47:32 crc kubenswrapper[4475]: I1203 06:47:32.861325 4475 patch_prober.go:28] interesting pod/router-default-5444994796-wkrx4 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 03 06:47:32 crc kubenswrapper[4475]: [-]has-synced failed: reason withheld Dec 03 06:47:32 crc kubenswrapper[4475]: [+]process-running ok Dec 03 06:47:32 crc kubenswrapper[4475]: healthz check failed Dec 03 06:47:32 crc kubenswrapper[4475]: I1203 06:47:32.861363 4475 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-wkrx4" podUID="21a4d7e9-ea88-4f43-9d43-109df1bb4766" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 03 06:47:32 crc kubenswrapper[4475]: I1203 06:47:32.921655 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-dcjv5\" (UID: 
\"fe214ce1-0821-4547-ac8b-e001a0579495\") " pod="openshift-image-registry/image-registry-697d97f7c8-dcjv5" Dec 03 06:47:32 crc kubenswrapper[4475]: E1203 06:47:32.921914 4475 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-03 06:47:33.421904993 +0000 UTC m=+138.226803326 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-dcjv5" (UID: "fe214ce1-0821-4547-ac8b-e001a0579495") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 03 06:47:33 crc kubenswrapper[4475]: I1203 06:47:33.027966 4475 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 03 06:47:33 crc kubenswrapper[4475]: E1203 06:47:33.028090 4475 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-03 06:47:33.528071288 +0000 UTC m=+138.332969622 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 03 06:47:33 crc kubenswrapper[4475]: I1203 06:47:33.028324 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-dcjv5\" (UID: \"fe214ce1-0821-4547-ac8b-e001a0579495\") " pod="openshift-image-registry/image-registry-697d97f7c8-dcjv5" Dec 03 06:47:33 crc kubenswrapper[4475]: E1203 06:47:33.028576 4475 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-03 06:47:33.528564723 +0000 UTC m=+138.333463057 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-dcjv5" (UID: "fe214ce1-0821-4547-ac8b-e001a0579495") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 03 06:47:33 crc kubenswrapper[4475]: I1203 06:47:33.133212 4475 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 03 06:47:33 crc kubenswrapper[4475]: E1203 06:47:33.133557 4475 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-03 06:47:33.633545375 +0000 UTC m=+138.438443709 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 03 06:47:33 crc kubenswrapper[4475]: I1203 06:47:33.217207 4475 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress/router-default-5444994796-wkrx4" podStartSLOduration=119.217191508 podStartE2EDuration="1m59.217191508s" podCreationTimestamp="2025-12-03 06:45:34 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 06:47:33.217186369 +0000 UTC m=+138.022084703" watchObservedRunningTime="2025-12-03 06:47:33.217191508 +0000 UTC m=+138.022089842" Dec 03 06:47:33 crc kubenswrapper[4475]: I1203 06:47:33.240874 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-dcjv5\" (UID: \"fe214ce1-0821-4547-ac8b-e001a0579495\") " pod="openshift-image-registry/image-registry-697d97f7c8-dcjv5" Dec 03 06:47:33 crc kubenswrapper[4475]: E1203 06:47:33.241181 4475 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-03 06:47:33.741171797 +0000 UTC m=+138.546070131 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-dcjv5" (UID: "fe214ce1-0821-4547-ac8b-e001a0579495") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 03 06:47:33 crc kubenswrapper[4475]: I1203 06:47:33.283984 4475 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-28t76" podStartSLOduration=119.28396997 podStartE2EDuration="1m59.28396997s" podCreationTimestamp="2025-12-03 06:45:34 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 06:47:33.281929927 +0000 UTC m=+138.086828281" watchObservedRunningTime="2025-12-03 06:47:33.28396997 +0000 UTC m=+138.088868304" Dec 03 06:47:33 crc kubenswrapper[4475]: I1203 06:47:33.341346 4475 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 03 06:47:33 crc kubenswrapper[4475]: E1203 06:47:33.341657 4475 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-03 06:47:33.841643681 +0000 UTC m=+138.646542015 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 03 06:47:33 crc kubenswrapper[4475]: I1203 06:47:33.352287 4475 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/marketplace-operator-79b997595-kfvwc" podStartSLOduration=119.352277157 podStartE2EDuration="1m59.352277157s" podCreationTimestamp="2025-12-03 06:45:34 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 06:47:33.351747174 +0000 UTC m=+138.156645508" watchObservedRunningTime="2025-12-03 06:47:33.352277157 +0000 UTC m=+138.157175492" Dec 03 06:47:33 crc kubenswrapper[4475]: I1203 06:47:33.396794 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-xv2gh" event={"ID":"842ad738-0ffb-4986-9372-a26f8bc6119a","Type":"ContainerStarted","Data":"ff2205a708f9df94f5afea040b85b4775d90158f98d39fc9accc03fd333340a7"} Dec 03 06:47:33 crc kubenswrapper[4475]: I1203 06:47:33.401900 4475 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-x5sj6" podStartSLOduration=119.401884281 podStartE2EDuration="1m59.401884281s" podCreationTimestamp="2025-12-03 06:45:34 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 06:47:33.395720299 +0000 UTC m=+138.200618634" watchObservedRunningTime="2025-12-03 06:47:33.401884281 +0000 UTC m=+138.206782615" Dec 
03 06:47:33 crc kubenswrapper[4475]: I1203 06:47:33.444246 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-dcjv5\" (UID: \"fe214ce1-0821-4547-ac8b-e001a0579495\") " pod="openshift-image-registry/image-registry-697d97f7c8-dcjv5" Dec 03 06:47:33 crc kubenswrapper[4475]: E1203 06:47:33.444695 4475 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-03 06:47:33.944684548 +0000 UTC m=+138.749582883 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-dcjv5" (UID: "fe214ce1-0821-4547-ac8b-e001a0579495") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 03 06:47:33 crc kubenswrapper[4475]: I1203 06:47:33.468968 4475 generic.go:334] "Generic (PLEG): container finished" podID="65b544d3-889f-4b29-ba88-961ad04782bf" containerID="8421e5ee072e8da7a23df8b686de39dde9615bb678e24780fc12a05079d1f4de" exitCode=0 Dec 03 06:47:33 crc kubenswrapper[4475]: I1203 06:47:33.469400 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-7777fb866f-q5cjz" event={"ID":"65b544d3-889f-4b29-ba88-961ad04782bf","Type":"ContainerDied","Data":"8421e5ee072e8da7a23df8b686de39dde9615bb678e24780fc12a05079d1f4de"} Dec 03 06:47:33 crc kubenswrapper[4475]: I1203 06:47:33.517889 4475 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openshift-controller-manager/controller-manager-879f6c89f-7kcnv" podStartSLOduration=119.517874877 podStartE2EDuration="1m59.517874877s" podCreationTimestamp="2025-12-03 06:45:34 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 06:47:33.517626001 +0000 UTC m=+138.322524335" watchObservedRunningTime="2025-12-03 06:47:33.517874877 +0000 UTC m=+138.322773211" Dec 03 06:47:33 crc kubenswrapper[4475]: I1203 06:47:33.548897 4475 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 03 06:47:33 crc kubenswrapper[4475]: E1203 06:47:33.549463 4475 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-03 06:47:34.049431554 +0000 UTC m=+138.854329888 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 03 06:47:33 crc kubenswrapper[4475]: I1203 06:47:33.573418 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca/service-ca-9c57cc56f-9dd2n" event={"ID":"7512ca6f-f6ab-4fd1-8b6a-96f221b07b95","Type":"ContainerStarted","Data":"e567786990c4b93c1f707ba9a8d77720af3cd47f1af3a69691d07808ec32ba6e"} Dec 03 06:47:33 crc kubenswrapper[4475]: I1203 06:47:33.602140 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-t2thq" event={"ID":"d6480a9d-cca4-48ca-92e3-85d84e96f012","Type":"ContainerStarted","Data":"75ba528f1e4ae742745d05c5fb8e847765def3cfa7e83d1e9e74592d5753fa5e"} Dec 03 06:47:33 crc kubenswrapper[4475]: I1203 06:47:33.602208 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-t2thq" event={"ID":"d6480a9d-cca4-48ca-92e3-85d84e96f012","Type":"ContainerStarted","Data":"859ec779cae2156e7d6a80862bdc666023f85d2411b52f7a4480f99506c86b62"} Dec 03 06:47:33 crc kubenswrapper[4475]: I1203 06:47:33.602954 4475 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-t2thq" Dec 03 06:47:33 crc kubenswrapper[4475]: I1203 06:47:33.605544 4475 patch_prober.go:28] interesting pod/olm-operator-6b444d44fb-t2thq container/olm-operator namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.34:8443/healthz\": dial tcp 10.217.0.34:8443: connect: connection 
refused" start-of-body= Dec 03 06:47:33 crc kubenswrapper[4475]: I1203 06:47:33.605582 4475 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-t2thq" podUID="d6480a9d-cca4-48ca-92e3-85d84e96f012" containerName="olm-operator" probeResult="failure" output="Get \"https://10.217.0.34:8443/healthz\": dial tcp 10.217.0.34:8443: connect: connection refused" Dec 03 06:47:33 crc kubenswrapper[4475]: I1203 06:47:33.621604 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-7954f5f757-dbjfd" event={"ID":"c279294d-fe07-48d3-800d-7d73eba69c17","Type":"ContainerStarted","Data":"c0763a7f5e9e79d618820c62246005706085e659b236da99eb50d2a1c7905f80"} Dec 03 06:47:33 crc kubenswrapper[4475]: I1203 06:47:33.649572 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns-operator/dns-operator-744455d44c-vfxhs" event={"ID":"49030b3f-8084-40f7-a015-beaa64adcfa1","Type":"ContainerStarted","Data":"9d0d8304e7e2199fb336f6e90a6fe87b4ecfeed95caf8046384326b8bf1c4c01"} Dec 03 06:47:33 crc kubenswrapper[4475]: I1203 06:47:33.650472 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-dcjv5\" (UID: \"fe214ce1-0821-4547-ac8b-e001a0579495\") " pod="openshift-image-registry/image-registry-697d97f7c8-dcjv5" Dec 03 06:47:33 crc kubenswrapper[4475]: E1203 06:47:33.651738 4475 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-03 06:47:34.151728368 +0000 UTC m=+138.956626702 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-dcjv5" (UID: "fe214ce1-0821-4547-ac8b-e001a0579495") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 03 06:47:33 crc kubenswrapper[4475]: I1203 06:47:33.680100 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-jf25k" event={"ID":"5f1ef448-9656-4d56-9d7e-d0992ec24085","Type":"ContainerStarted","Data":"3b5e424e7ec30aff2947a88e66fb5896b5cd8d8d7b91509c39f21d5a3da4fe6c"} Dec 03 06:47:33 crc kubenswrapper[4475]: I1203 06:47:33.690654 4475 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-api/machine-api-operator-5694c8668f-n4bbj" podStartSLOduration=119.690642934 podStartE2EDuration="1m59.690642934s" podCreationTimestamp="2025-12-03 06:45:34 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 06:47:33.648188204 +0000 UTC m=+138.453086537" watchObservedRunningTime="2025-12-03 06:47:33.690642934 +0000 UTC m=+138.495541268" Dec 03 06:47:33 crc kubenswrapper[4475]: I1203 06:47:33.702882 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-48hz9" event={"ID":"68fc03c8-5d8c-4a1d-8987-474a75454d0f","Type":"ContainerStarted","Data":"a311489ed18ff9929e5789dda6a0a883bdf3f82ace8f787f0921ed73d1270393"} Dec 03 06:47:33 crc kubenswrapper[4475]: I1203 06:47:33.710502 4475 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-service-ca/service-ca-9c57cc56f-9dd2n" podStartSLOduration=119.710487192 podStartE2EDuration="1m59.710487192s" 
podCreationTimestamp="2025-12-03 06:45:34 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 06:47:33.710376705 +0000 UTC m=+138.515275038" watchObservedRunningTime="2025-12-03 06:47:33.710487192 +0000 UTC m=+138.515385526" Dec 03 06:47:33 crc kubenswrapper[4475]: I1203 06:47:33.712642 4475 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-8pdgn" podStartSLOduration=120.712636479 podStartE2EDuration="2m0.712636479s" podCreationTimestamp="2025-12-03 06:45:33 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 06:47:33.691302349 +0000 UTC m=+138.496200683" watchObservedRunningTime="2025-12-03 06:47:33.712636479 +0000 UTC m=+138.517534813" Dec 03 06:47:33 crc kubenswrapper[4475]: I1203 06:47:33.731940 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-s9rqq" event={"ID":"4bcc3f4b-e69d-41c0-8940-03a15582a5fa","Type":"ContainerStarted","Data":"0801ea1776d58d9ee612e4f3740a64933df300227f219f5740bf8b019239934a"} Dec 03 06:47:33 crc kubenswrapper[4475]: I1203 06:47:33.731969 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-s9rqq" event={"ID":"4bcc3f4b-e69d-41c0-8940-03a15582a5fa","Type":"ContainerStarted","Data":"214eeab833c3abd3ee5c3118313534f14e2aa2e4a482b6694f07a1a3503978c3"} Dec 03 06:47:33 crc kubenswrapper[4475]: I1203 06:47:33.733303 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29412405-wwr7n" event={"ID":"83c4eef5-5508-470d-8b7a-b7da9d4706d4","Type":"ContainerStarted","Data":"64bbe628906ffd7c485ec8cc71ede08aea8875194cb357d96894ab844be9e9f5"} Dec 03 06:47:33 crc 
kubenswrapper[4475]: I1203 06:47:33.743725 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd-operator/etcd-operator-b45778765-chxcn" event={"ID":"fb67a319-3ec2-4759-bdfb-46452f4f7010","Type":"ContainerStarted","Data":"e2ff66d010798e15864bee0acfc49cb82bb448e4997a0d67d8e546e6110928e4"} Dec 03 06:47:33 crc kubenswrapper[4475]: I1203 06:47:33.751001 4475 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 03 06:47:33 crc kubenswrapper[4475]: E1203 06:47:33.751091 4475 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-03 06:47:34.251079231 +0000 UTC m=+139.055977565 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 03 06:47:33 crc kubenswrapper[4475]: I1203 06:47:33.751329 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-dcjv5\" (UID: \"fe214ce1-0821-4547-ac8b-e001a0579495\") " pod="openshift-image-registry/image-registry-697d97f7c8-dcjv5" Dec 03 06:47:33 crc kubenswrapper[4475]: E1203 06:47:33.753210 4475 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-03 06:47:34.253201939 +0000 UTC m=+139.058100273 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-dcjv5" (UID: "fe214ce1-0821-4547-ac8b-e001a0579495") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 03 06:47:33 crc kubenswrapper[4475]: I1203 06:47:33.762305 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication-operator/authentication-operator-69f744f599-6lqs9" event={"ID":"07c84deb-ccb6-4597-a122-fdc9f6acb015","Type":"ContainerStarted","Data":"f4338a299809a3752906ebbebe2725207d50feaa37c39cca94fad62856c4c66f"} Dec 03 06:47:33 crc kubenswrapper[4475]: I1203 06:47:33.763637 4475 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-xv2gh" podStartSLOduration=119.76362456 podStartE2EDuration="1m59.76362456s" podCreationTimestamp="2025-12-03 06:45:34 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 06:47:33.762913799 +0000 UTC m=+138.567812143" watchObservedRunningTime="2025-12-03 06:47:33.76362456 +0000 UTC m=+138.568522895" Dec 03 06:47:33 crc kubenswrapper[4475]: I1203 06:47:33.789763 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-9tsjv" event={"ID":"6abdfe97-cf24-4ec5-8aee-f67cf30bb2e3","Type":"ContainerStarted","Data":"21c962bb24396e903868e30ab40c0fc7b9445e95434ac03468e359a4bfde0c28"} Dec 03 06:47:33 crc kubenswrapper[4475]: I1203 06:47:33.847442 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-685k2" 
event={"ID":"8b190c59-5d6a-403c-8610-da9297d3828a","Type":"ContainerStarted","Data":"7f1c3a918abcbdccebad4509a56f38358a05d05d63f69ab1d1f9331e981953be"} Dec 03 06:47:33 crc kubenswrapper[4475]: I1203 06:47:33.854165 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-6bm4d" event={"ID":"825ff335-38b1-481e-b6e4-ac1cee0d4408","Type":"ContainerStarted","Data":"859be37a92ac2b63eb2a60dce8b472713e01556b6fe7e4accdf5348553d7dd3a"} Dec 03 06:47:33 crc kubenswrapper[4475]: I1203 06:47:33.862259 4475 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 03 06:47:33 crc kubenswrapper[4475]: E1203 06:47:33.863064 4475 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-03 06:47:34.363051636 +0000 UTC m=+139.167949971 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 03 06:47:33 crc kubenswrapper[4475]: I1203 06:47:33.865323 4475 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-t2thq" podStartSLOduration=119.865313496 podStartE2EDuration="1m59.865313496s" podCreationTimestamp="2025-12-03 06:45:34 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 06:47:33.809293114 +0000 UTC m=+138.614191447" watchObservedRunningTime="2025-12-03 06:47:33.865313496 +0000 UTC m=+138.670211829" Dec 03 06:47:33 crc kubenswrapper[4475]: I1203 06:47:33.872863 4475 patch_prober.go:28] interesting pod/router-default-5444994796-wkrx4 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 03 06:47:33 crc kubenswrapper[4475]: [-]has-synced failed: reason withheld Dec 03 06:47:33 crc kubenswrapper[4475]: [+]process-running ok Dec 03 06:47:33 crc kubenswrapper[4475]: healthz check failed Dec 03 06:47:33 crc kubenswrapper[4475]: I1203 06:47:33.883007 4475 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-wkrx4" podUID="21a4d7e9-ea88-4f43-9d43-109df1bb4766" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 03 06:47:33 crc kubenswrapper[4475]: I1203 06:47:33.882967 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="hostpath-provisioner/csi-hostpathplugin-86l88" event={"ID":"ba21124e-33f1-4cf7-8bc2-483a5810191d","Type":"ContainerStarted","Data":"c8310ba2cc723d8a8beca54c81d78469015b04b65ed19ef50e1b96faff6c1afe"} Dec 03 06:47:33 crc kubenswrapper[4475]: I1203 06:47:33.911505 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-4mfpz" event={"ID":"f82e633a-87fa-4fe2-b664-b831eaa4d67a","Type":"ContainerStarted","Data":"63cf68851d791eadf9872ba6a03ea12efe5f4fe74ab43356cfecb4da6402816d"} Dec 03 06:47:33 crc kubenswrapper[4475]: I1203 06:47:33.919647 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-777779d784-mw2kv" event={"ID":"5e8a37fc-fca6-43a2-83f2-e4c3d7916343","Type":"ContainerStarted","Data":"4d7ee56020af30ea91ef4bceb272683e9a835fe8829f2c58550a6832e0ad70d4"} Dec 03 06:47:33 crc kubenswrapper[4475]: I1203 06:47:33.942358 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-xklvd" event={"ID":"abd6b2f3-d4e8-4a7d-9e60-f04e50130dbf","Type":"ContainerStarted","Data":"cc4f321fde713b1667c2dde102c3050b70299fd6d124fff9d8735d1d10960d4b"} Dec 03 06:47:33 crc kubenswrapper[4475]: I1203 06:47:33.957262 4475 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-mk54s" podStartSLOduration=119.957248611 podStartE2EDuration="1m59.957248611s" podCreationTimestamp="2025-12-03 06:45:34 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 06:47:33.914960403 +0000 UTC m=+138.719858737" watchObservedRunningTime="2025-12-03 06:47:33.957248611 +0000 UTC m=+138.762146945" Dec 03 06:47:33 crc kubenswrapper[4475]: I1203 06:47:33.958400 4475 pod_startup_latency_tracker.go:104] "Observed pod startup 
duration" pod="openshift-authentication-operator/authentication-operator-69f744f599-6lqs9" podStartSLOduration=120.958391102 podStartE2EDuration="2m0.958391102s" podCreationTimestamp="2025-12-03 06:45:33 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 06:47:33.956015721 +0000 UTC m=+138.760914055" watchObservedRunningTime="2025-12-03 06:47:33.958391102 +0000 UTC m=+138.763289436" Dec 03 06:47:33 crc kubenswrapper[4475]: I1203 06:47:33.964030 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-dcjv5\" (UID: \"fe214ce1-0821-4547-ac8b-e001a0579495\") " pod="openshift-image-registry/image-registry-697d97f7c8-dcjv5" Dec 03 06:47:33 crc kubenswrapper[4475]: E1203 06:47:33.965212 4475 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-03 06:47:34.465200112 +0000 UTC m=+139.270098446 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-dcjv5" (UID: "fe214ce1-0821-4547-ac8b-e001a0579495") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 03 06:47:33 crc kubenswrapper[4475]: I1203 06:47:33.970200 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-q47j6" event={"ID":"bc1b6bc4-044a-429f-bcc3-9afc4be0acef","Type":"ContainerStarted","Data":"379b450cdc1d92df5483553ffbb965ee741158b8e2b307419c53866763b5253e"} Dec 03 06:47:34 crc kubenswrapper[4475]: I1203 06:47:34.000982 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-cplh2" event={"ID":"c78eadf7-ca95-40dd-b425-da00c2875c8f","Type":"ContainerStarted","Data":"b3fe48f720f8bed857d0ce2dbfd3874f81b0a34e409670e980e394d20060d2f2"} Dec 03 06:47:34 crc kubenswrapper[4475]: I1203 06:47:34.014087 4475 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-etcd-operator/etcd-operator-b45778765-chxcn" podStartSLOduration=120.014076366 podStartE2EDuration="2m0.014076366s" podCreationTimestamp="2025-12-03 06:45:34 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 06:47:34.013409527 +0000 UTC m=+138.818307861" watchObservedRunningTime="2025-12-03 06:47:34.014076366 +0000 UTC m=+138.818974701" Dec 03 06:47:34 crc kubenswrapper[4475]: I1203 06:47:34.014813 4475 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/collect-profiles-29412405-wwr7n" podStartSLOduration=120.014807747 podStartE2EDuration="2m0.014807747s" 
podCreationTimestamp="2025-12-03 06:45:34 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 06:47:33.988038301 +0000 UTC m=+138.792936636" watchObservedRunningTime="2025-12-03 06:47:34.014807747 +0000 UTC m=+138.819706081" Dec 03 06:47:34 crc kubenswrapper[4475]: I1203 06:47:34.021986 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-lgjdg" event={"ID":"fd3e6416-4e59-4ef8-a778-91dc78b6fc71","Type":"ContainerStarted","Data":"a7766472857a56b7d21be8a838bc9ec5f0403010c6cd0173853686a0da7e9cab"} Dec 03 06:47:34 crc kubenswrapper[4475]: I1203 06:47:34.023801 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-gcnjp" event={"ID":"b67ad2cb-0cdd-4906-81c9-5c4597207aa3","Type":"ContainerStarted","Data":"092ba3bf4d9cbbc185f66e1d95bd4a71c119b634419c119723333d89e0bc29f0"} Dec 03 06:47:34 crc kubenswrapper[4475]: I1203 06:47:34.023826 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-gcnjp" event={"ID":"b67ad2cb-0cdd-4906-81c9-5c4597207aa3","Type":"ContainerStarted","Data":"d8143098d5e051310c7b211405616604df9e0860072f00fc695169b0ec7ebb0e"} Dec 03 06:47:34 crc kubenswrapper[4475]: I1203 06:47:34.042445 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-server-ftshs" event={"ID":"94977097-803b-4e49-a295-fd4bb3925da0","Type":"ContainerStarted","Data":"285afa1917f7bde29545dc1c5ae20c97bc7219ed232ae88ec36649420c765d35"} Dec 03 06:47:34 crc kubenswrapper[4475]: I1203 06:47:34.052597 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-4845w" 
event={"ID":"80a45948-f52d-4e57-8611-37ea99eefb3c","Type":"ContainerStarted","Data":"03bcadd6eba4aa7d099f5ab3c1b9b4b5d88f218f736a488846f85150d24f3977"} Dec 03 06:47:34 crc kubenswrapper[4475]: I1203 06:47:34.070018 4475 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 03 06:47:34 crc kubenswrapper[4475]: E1203 06:47:34.070183 4475 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-03 06:47:34.570166289 +0000 UTC m=+139.375064623 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 03 06:47:34 crc kubenswrapper[4475]: I1203 06:47:34.070251 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-dcjv5\" (UID: \"fe214ce1-0821-4547-ac8b-e001a0579495\") " pod="openshift-image-registry/image-registry-697d97f7c8-dcjv5" Dec 03 06:47:34 crc kubenswrapper[4475]: E1203 06:47:34.071032 4475 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-03 06:47:34.571024347 +0000 UTC m=+139.375922681 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-dcjv5" (UID: "fe214ce1-0821-4547-ac8b-e001a0579495") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 03 06:47:34 crc kubenswrapper[4475]: I1203 06:47:34.072872 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-6gvwt" event={"ID":"423a50bb-8a96-49ba-99da-7258729fd2af","Type":"ContainerStarted","Data":"b848abc5a77ccb3c1ba753ff1608c67f0931b5e30755328670919f130cb67219"} Dec 03 06:47:34 crc kubenswrapper[4475]: I1203 06:47:34.075858 4475 patch_prober.go:28] interesting pod/marketplace-operator-79b997595-kfvwc container/marketplace-operator namespace/openshift-marketplace: Readiness probe status=failure output="Get \"http://10.217.0.28:8080/healthz\": dial tcp 10.217.0.28:8080: connect: connection refused" start-of-body= Dec 03 06:47:34 crc kubenswrapper[4475]: I1203 06:47:34.075939 4475 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-marketplace/marketplace-operator-79b997595-kfvwc" podUID="7f6163e8-ce0d-481b-8483-4b9e04d381e6" containerName="marketplace-operator" probeResult="failure" output="Get \"http://10.217.0.28:8080/healthz\": dial tcp 10.217.0.28:8080: connect: connection refused" Dec 03 06:47:34 crc kubenswrapper[4475]: I1203 06:47:34.096658 4475 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openshift-controller-manager/controller-manager-879f6c89f-7kcnv" Dec 03 06:47:34 crc kubenswrapper[4475]: I1203 06:47:34.113411 4475 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-4mfpz" podStartSLOduration=120.113403304 podStartE2EDuration="2m0.113403304s" podCreationTimestamp="2025-12-03 06:45:34 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 06:47:34.048172342 +0000 UTC m=+138.853070676" watchObservedRunningTime="2025-12-03 06:47:34.113403304 +0000 UTC m=+138.918301639" Dec 03 06:47:34 crc kubenswrapper[4475]: I1203 06:47:34.114017 4475 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-s9rqq" podStartSLOduration=121.114011784 podStartE2EDuration="2m1.114011784s" podCreationTimestamp="2025-12-03 06:45:33 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 06:47:34.111652093 +0000 UTC m=+138.916550427" watchObservedRunningTime="2025-12-03 06:47:34.114011784 +0000 UTC m=+138.918910118" Dec 03 06:47:34 crc kubenswrapper[4475]: I1203 06:47:34.152177 4475 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-9tsjv" podStartSLOduration=120.1521634 podStartE2EDuration="2m0.1521634s" podCreationTimestamp="2025-12-03 06:45:34 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 06:47:34.148644957 +0000 UTC m=+138.953543292" watchObservedRunningTime="2025-12-03 06:47:34.1521634 +0000 UTC m=+138.957061734" Dec 03 06:47:34 crc kubenswrapper[4475]: I1203 06:47:34.170872 4475 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 03 06:47:34 crc kubenswrapper[4475]: E1203 06:47:34.172078 4475 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-03 06:47:34.672065427 +0000 UTC m=+139.476963761 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 03 06:47:34 crc kubenswrapper[4475]: I1203 06:47:34.273381 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-dcjv5\" (UID: \"fe214ce1-0821-4547-ac8b-e001a0579495\") " pod="openshift-image-registry/image-registry-697d97f7c8-dcjv5" Dec 03 06:47:34 crc kubenswrapper[4475]: I1203 06:47:34.274186 4475 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-4845w" podStartSLOduration=120.274175771 podStartE2EDuration="2m0.274175771s" podCreationTimestamp="2025-12-03 06:45:34 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 
+0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 06:47:34.272901994 +0000 UTC m=+139.077800328" watchObservedRunningTime="2025-12-03 06:47:34.274175771 +0000 UTC m=+139.079074105" Dec 03 06:47:34 crc kubenswrapper[4475]: E1203 06:47:34.273665 4475 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-03 06:47:34.773655046 +0000 UTC m=+139.578553379 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-dcjv5" (UID: "fe214ce1-0821-4547-ac8b-e001a0579495") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 03 06:47:34 crc kubenswrapper[4475]: I1203 06:47:34.373766 4475 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 03 06:47:34 crc kubenswrapper[4475]: E1203 06:47:34.373975 4475 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-03 06:47:34.873964385 +0000 UTC m=+139.678862719 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 03 06:47:34 crc kubenswrapper[4475]: I1203 06:47:34.374297 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-dcjv5\" (UID: \"fe214ce1-0821-4547-ac8b-e001a0579495\") " pod="openshift-image-registry/image-registry-697d97f7c8-dcjv5" Dec 03 06:47:34 crc kubenswrapper[4475]: E1203 06:47:34.374536 4475 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-03 06:47:34.874529353 +0000 UTC m=+139.679427688 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-dcjv5" (UID: "fe214ce1-0821-4547-ac8b-e001a0579495") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 03 06:47:34 crc kubenswrapper[4475]: I1203 06:47:34.386000 4475 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-gcnjp" podStartSLOduration=120.385984049 podStartE2EDuration="2m0.385984049s" podCreationTimestamp="2025-12-03 06:45:34 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 06:47:34.333655827 +0000 UTC m=+139.138554161" watchObservedRunningTime="2025-12-03 06:47:34.385984049 +0000 UTC m=+139.190882383" Dec 03 06:47:34 crc kubenswrapper[4475]: I1203 06:47:34.412516 4475 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-6gvwt" podStartSLOduration=120.412498497 podStartE2EDuration="2m0.412498497s" podCreationTimestamp="2025-12-03 06:45:34 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 06:47:34.411518581 +0000 UTC m=+139.216416916" watchObservedRunningTime="2025-12-03 06:47:34.412498497 +0000 UTC m=+139.217396831" Dec 03 06:47:34 crc kubenswrapper[4475]: I1203 06:47:34.413522 4475 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-canary/ingress-canary-cplh2" podStartSLOduration=6.41351363 podStartE2EDuration="6.41351363s" podCreationTimestamp="2025-12-03 06:47:28 +0000 UTC" 
firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 06:47:34.387348496 +0000 UTC m=+139.192246830" watchObservedRunningTime="2025-12-03 06:47:34.41351363 +0000 UTC m=+139.218411964" Dec 03 06:47:34 crc kubenswrapper[4475]: I1203 06:47:34.477882 4475 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 03 06:47:34 crc kubenswrapper[4475]: E1203 06:47:34.478170 4475 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-03 06:47:34.978158474 +0000 UTC m=+139.783056808 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 03 06:47:34 crc kubenswrapper[4475]: I1203 06:47:34.483222 4475 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-server-ftshs" podStartSLOduration=6.483209951 podStartE2EDuration="6.483209951s" podCreationTimestamp="2025-12-03 06:47:28 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 06:47:34.447945465 +0000 UTC m=+139.252843798" watchObservedRunningTime="2025-12-03 06:47:34.483209951 +0000 UTC m=+139.288108284" Dec 03 06:47:34 crc kubenswrapper[4475]: I1203 06:47:34.484965 4475 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-service-ca-operator/service-ca-operator-777779d784-mw2kv" podStartSLOduration=120.484957555 podStartE2EDuration="2m0.484957555s" podCreationTimestamp="2025-12-03 06:45:34 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 06:47:34.482973417 +0000 UTC m=+139.287871751" watchObservedRunningTime="2025-12-03 06:47:34.484957555 +0000 UTC m=+139.289855889" Dec 03 06:47:34 crc kubenswrapper[4475]: I1203 06:47:34.580063 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-dcjv5\" (UID: 
\"fe214ce1-0821-4547-ac8b-e001a0579495\") " pod="openshift-image-registry/image-registry-697d97f7c8-dcjv5" Dec 03 06:47:34 crc kubenswrapper[4475]: E1203 06:47:34.580360 4475 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-03 06:47:35.08035037 +0000 UTC m=+139.885248705 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-dcjv5" (UID: "fe214ce1-0821-4547-ac8b-e001a0579495") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 03 06:47:34 crc kubenswrapper[4475]: I1203 06:47:34.680899 4475 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 03 06:47:34 crc kubenswrapper[4475]: E1203 06:47:34.681197 4475 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-03 06:47:35.181183171 +0000 UTC m=+139.986081505 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 03 06:47:34 crc kubenswrapper[4475]: I1203 06:47:34.681523 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-dcjv5\" (UID: \"fe214ce1-0821-4547-ac8b-e001a0579495\") " pod="openshift-image-registry/image-registry-697d97f7c8-dcjv5" Dec 03 06:47:34 crc kubenswrapper[4475]: E1203 06:47:34.681888 4475 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-03 06:47:35.181874546 +0000 UTC m=+139.986772881 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-dcjv5" (UID: "fe214ce1-0821-4547-ac8b-e001a0579495") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 03 06:47:34 crc kubenswrapper[4475]: I1203 06:47:34.782140 4475 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 03 06:47:34 crc kubenswrapper[4475]: E1203 06:47:34.782581 4475 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-03 06:47:35.282568026 +0000 UTC m=+140.087466359 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 03 06:47:34 crc kubenswrapper[4475]: I1203 06:47:34.861373 4475 patch_prober.go:28] interesting pod/router-default-5444994796-wkrx4 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 03 06:47:34 crc kubenswrapper[4475]: [-]has-synced failed: reason withheld Dec 03 06:47:34 crc kubenswrapper[4475]: [+]process-running ok Dec 03 06:47:34 crc kubenswrapper[4475]: healthz check failed Dec 03 06:47:34 crc kubenswrapper[4475]: I1203 06:47:34.861412 4475 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-wkrx4" podUID="21a4d7e9-ea88-4f43-9d43-109df1bb4766" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 03 06:47:34 crc kubenswrapper[4475]: I1203 06:47:34.883714 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-dcjv5\" (UID: \"fe214ce1-0821-4547-ac8b-e001a0579495\") " pod="openshift-image-registry/image-registry-697d97f7c8-dcjv5" Dec 03 06:47:34 crc kubenswrapper[4475]: E1203 06:47:34.884157 4475 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. 
No retries permitted until 2025-12-03 06:47:35.384147545 +0000 UTC m=+140.189045879 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-dcjv5" (UID: "fe214ce1-0821-4547-ac8b-e001a0579495") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 03 06:47:34 crc kubenswrapper[4475]: I1203 06:47:34.984680 4475 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 03 06:47:34 crc kubenswrapper[4475]: E1203 06:47:34.984901 4475 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-03 06:47:35.484889044 +0000 UTC m=+140.289787378 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 03 06:47:35 crc kubenswrapper[4475]: I1203 06:47:35.078662 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-mk54s" event={"ID":"cf62ac16-a57b-4f21-8cb1-97dfbc8b779a","Type":"ContainerStarted","Data":"24b20f58ea84cbffecf4beec8860608ac8981e006a834bb70847a06c79261c27"} Dec 03 06:47:35 crc kubenswrapper[4475]: I1203 06:47:35.080277 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-857f4d67dd-bghqv" event={"ID":"c0036df6-fc1c-4945-97b0-7c6ce6e5f806","Type":"ContainerStarted","Data":"c151b7908a1ce02e954fef97920dc67d2b644ea0c285312d8d105afa54f9f15f"} Dec 03 06:47:35 crc kubenswrapper[4475]: I1203 06:47:35.080299 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-857f4d67dd-bghqv" event={"ID":"c0036df6-fc1c-4945-97b0-7c6ce6e5f806","Type":"ContainerStarted","Data":"497125bc41395200a204266a3abed13e61f46c7a6eb577a5d413f3d9bb76d6e6"} Dec 03 06:47:35 crc kubenswrapper[4475]: I1203 06:47:35.082054 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-lgjdg" event={"ID":"fd3e6416-4e59-4ef8-a778-91dc78b6fc71","Type":"ContainerStarted","Data":"05cbd22f836f3a5cfbf9ce4c3b1b3818442e2bb587c5ca3640f37d3497957e64"} Dec 03 06:47:35 crc kubenswrapper[4475]: I1203 06:47:35.082145 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-ingress-operator/ingress-operator-5b745b69d9-lgjdg" event={"ID":"fd3e6416-4e59-4ef8-a778-91dc78b6fc71","Type":"ContainerStarted","Data":"cd0a5b0878c7560dfe7e1f7636fc05ec1d927b6ba5a7d6e638202c309d97c41a"} Dec 03 06:47:35 crc kubenswrapper[4475]: I1203 06:47:35.083121 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-q47j6" event={"ID":"bc1b6bc4-044a-429f-bcc3-9afc4be0acef","Type":"ContainerStarted","Data":"7880b6ead90c7233c8924f0655ff4a69cfd48a56f3d7b6d9900e67c0d84da8b0"} Dec 03 06:47:35 crc kubenswrapper[4475]: I1203 06:47:35.083379 4475 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-q47j6" Dec 03 06:47:35 crc kubenswrapper[4475]: I1203 06:47:35.084007 4475 patch_prober.go:28] interesting pod/catalog-operator-68c6474976-q47j6 container/catalog-operator namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.30:8443/healthz\": dial tcp 10.217.0.30:8443: connect: connection refused" start-of-body= Dec 03 06:47:35 crc kubenswrapper[4475]: I1203 06:47:35.084045 4475 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-q47j6" podUID="bc1b6bc4-044a-429f-bcc3-9afc4be0acef" containerName="catalog-operator" probeResult="failure" output="Get \"https://10.217.0.30:8443/healthz\": dial tcp 10.217.0.30:8443: connect: connection refused" Dec 03 06:47:35 crc kubenswrapper[4475]: I1203 06:47:35.085243 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-dcjv5\" (UID: \"fe214ce1-0821-4547-ac8b-e001a0579495\") " pod="openshift-image-registry/image-registry-697d97f7c8-dcjv5" Dec 
03 06:47:35 crc kubenswrapper[4475]: E1203 06:47:35.085567 4475 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-03 06:47:35.585556375 +0000 UTC m=+140.390454709 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-dcjv5" (UID: "fe214ce1-0821-4547-ac8b-e001a0579495") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 03 06:47:35 crc kubenswrapper[4475]: I1203 06:47:35.085999 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-gcnjp" event={"ID":"b67ad2cb-0cdd-4906-81c9-5c4597207aa3","Type":"ContainerStarted","Data":"fccbd99420c9f27bd964c67412a54e145132be7488ddcbdb78c8a7011a894526"} Dec 03 06:47:35 crc kubenswrapper[4475]: I1203 06:47:35.088101 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-4845w" event={"ID":"80a45948-f52d-4e57-8611-37ea99eefb3c","Type":"ContainerStarted","Data":"4dcff5935b4504622b4ffb667f36fb94fc22f4e43a2937f60621c7c67b87bbc9"} Dec 03 06:47:35 crc kubenswrapper[4475]: I1203 06:47:35.089971 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-685k2" event={"ID":"8b190c59-5d6a-403c-8610-da9297d3828a","Type":"ContainerStarted","Data":"32a9e8a131b38459558f595775a3d71ae7c1e408eccfa632ebfdf62426364234"} Dec 03 06:47:35 crc kubenswrapper[4475]: I1203 06:47:35.089994 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-685k2" 
event={"ID":"8b190c59-5d6a-403c-8610-da9297d3828a","Type":"ContainerStarted","Data":"dfe4fb4fb97bd175108575334351184d8919827af3cb03ba93d9ccc3de5c76d6"} Dec 03 06:47:35 crc kubenswrapper[4475]: I1203 06:47:35.090015 4475 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-dns/dns-default-685k2" Dec 03 06:47:35 crc kubenswrapper[4475]: I1203 06:47:35.092087 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns-operator/dns-operator-744455d44c-vfxhs" event={"ID":"49030b3f-8084-40f7-a015-beaa64adcfa1","Type":"ContainerStarted","Data":"d6865c901de5f7086a77bdc214e4aba7e05c50d6dd287d8d8f10425cf3476449"} Dec 03 06:47:35 crc kubenswrapper[4475]: I1203 06:47:35.092108 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns-operator/dns-operator-744455d44c-vfxhs" event={"ID":"49030b3f-8084-40f7-a015-beaa64adcfa1","Type":"ContainerStarted","Data":"6b8a9731153085d4c15970497997abef74ab4d688e426bb3ce8d2345fe025729"} Dec 03 06:47:35 crc kubenswrapper[4475]: I1203 06:47:35.093704 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-7777fb866f-q5cjz" event={"ID":"65b544d3-889f-4b29-ba88-961ad04782bf","Type":"ContainerStarted","Data":"0de7c57ddf245bd0693e7961f9a939551c9fbf5005a01d4d44b12d589b48ee4b"} Dec 03 06:47:35 crc kubenswrapper[4475]: I1203 06:47:35.094021 4475 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-config-operator/openshift-config-operator-7777fb866f-q5cjz" Dec 03 06:47:35 crc kubenswrapper[4475]: I1203 06:47:35.099122 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-48hz9" event={"ID":"68fc03c8-5d8c-4a1d-8987-474a75454d0f","Type":"ContainerStarted","Data":"17a4ed9cbe5baf8dee81c289f7390060ca016442fcff35b5612ae298315bc212"} Dec 03 06:47:35 crc kubenswrapper[4475]: I1203 06:47:35.100706 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-console/downloads-7954f5f757-dbjfd" event={"ID":"c279294d-fe07-48d3-800d-7d73eba69c17","Type":"ContainerStarted","Data":"5dcb525837fd4cef041a16d126f593ecbf6217a4eb54ddad347405d6690e6d8c"} Dec 03 06:47:35 crc kubenswrapper[4475]: I1203 06:47:35.101211 4475 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/downloads-7954f5f757-dbjfd" Dec 03 06:47:35 crc kubenswrapper[4475]: I1203 06:47:35.102586 4475 patch_prober.go:28] interesting pod/downloads-7954f5f757-dbjfd container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.14:8080/\": dial tcp 10.217.0.14:8080: connect: connection refused" start-of-body= Dec 03 06:47:35 crc kubenswrapper[4475]: I1203 06:47:35.102678 4475 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-dbjfd" podUID="c279294d-fe07-48d3-800d-7d73eba69c17" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.14:8080/\": dial tcp 10.217.0.14:8080: connect: connection refused" Dec 03 06:47:35 crc kubenswrapper[4475]: I1203 06:47:35.104055 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-xklvd" event={"ID":"abd6b2f3-d4e8-4a7d-9e60-f04e50130dbf","Type":"ContainerStarted","Data":"b6538971028743a8665ae3d69f9b7131b202bff44ecfdacab45967e622da3031"} Dec 03 06:47:35 crc kubenswrapper[4475]: I1203 06:47:35.108089 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-server-ftshs" event={"ID":"94977097-803b-4e49-a295-fd4bb3925da0","Type":"ContainerStarted","Data":"42115c380df1fc95c1c7d7b944daf048e71afcce3f99e27f588cc84b149ef25b"} Dec 03 06:47:35 crc kubenswrapper[4475]: I1203 06:47:35.119705 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-6bm4d" 
event={"ID":"825ff335-38b1-481e-b6e4-ac1cee0d4408","Type":"ContainerStarted","Data":"3dbb2772e522f239b7da844ec67f5ae0ca0d89ed5b2266fd5846cf6a4abcb34b"} Dec 03 06:47:35 crc kubenswrapper[4475]: I1203 06:47:35.119805 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-6bm4d" event={"ID":"825ff335-38b1-481e-b6e4-ac1cee0d4408","Type":"ContainerStarted","Data":"8968bc9a99bad020e953704734a4ab44582b9837861d07b7a59c829f47ee2871"} Dec 03 06:47:35 crc kubenswrapper[4475]: I1203 06:47:35.120168 4475 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-6bm4d" Dec 03 06:47:35 crc kubenswrapper[4475]: I1203 06:47:35.132693 4475 generic.go:334] "Generic (PLEG): container finished" podID="5f1ef448-9656-4d56-9d7e-d0992ec24085" containerID="5148dbf3b719fe1546baac380f3fade97dfe0d1a495b032f6166490ee2ade31f" exitCode=0 Dec 03 06:47:35 crc kubenswrapper[4475]: I1203 06:47:35.132975 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-jf25k" event={"ID":"5f1ef448-9656-4d56-9d7e-d0992ec24085","Type":"ContainerDied","Data":"5148dbf3b719fe1546baac380f3fade97dfe0d1a495b032f6166490ee2ade31f"} Dec 03 06:47:35 crc kubenswrapper[4475]: I1203 06:47:35.138311 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-cplh2" event={"ID":"c78eadf7-ca95-40dd-b425-da00c2875c8f","Type":"ContainerStarted","Data":"de818d4c8d15861c7be6c5e458ac0191978b6e760d485d6069b524e98b675836"} Dec 03 06:47:35 crc kubenswrapper[4475]: I1203 06:47:35.154621 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-86l88" event={"ID":"ba21124e-33f1-4cf7-8bc2-483a5810191d","Type":"ContainerStarted","Data":"c34c16eee8d0abfcbe954c07d6b69030d31aba8786f7b65d79cb470c149e448c"} Dec 03 06:47:35 crc kubenswrapper[4475]: I1203 
06:47:35.174388 4475 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-admission-controller-857f4d67dd-bghqv" podStartSLOduration=121.174376052 podStartE2EDuration="2m1.174376052s" podCreationTimestamp="2025-12-03 06:45:34 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 06:47:35.149492281 +0000 UTC m=+139.954390614" watchObservedRunningTime="2025-12-03 06:47:35.174376052 +0000 UTC m=+139.979274387" Dec 03 06:47:35 crc kubenswrapper[4475]: I1203 06:47:35.188700 4475 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 03 06:47:35 crc kubenswrapper[4475]: E1203 06:47:35.189841 4475 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-03 06:47:35.689828009 +0000 UTC m=+140.494726343 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 03 06:47:35 crc kubenswrapper[4475]: I1203 06:47:35.195575 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-tjg56" event={"ID":"fab1c20e-bbf0-442f-ada0-5647d493ad6c","Type":"ContainerStarted","Data":"93ef7e5031cb27641a25eee77494c0a7f1e730b9481fe2098b1499c04f65b328"} Dec 03 06:47:35 crc kubenswrapper[4475]: I1203 06:47:35.218699 4475 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-t2thq" Dec 03 06:47:35 crc kubenswrapper[4475]: I1203 06:47:35.223611 4475 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/marketplace-operator-79b997595-kfvwc" Dec 03 06:47:35 crc kubenswrapper[4475]: I1203 06:47:35.229987 4475 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-48hz9" podStartSLOduration=121.229962771 podStartE2EDuration="2m1.229962771s" podCreationTimestamp="2025-12-03 06:45:34 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 06:47:35.228747244 +0000 UTC m=+140.033645577" watchObservedRunningTime="2025-12-03 06:47:35.229962771 +0000 UTC m=+140.034861105" Dec 03 06:47:35 crc kubenswrapper[4475]: I1203 06:47:35.294324 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: 
\"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-dcjv5\" (UID: \"fe214ce1-0821-4547-ac8b-e001a0579495\") " pod="openshift-image-registry/image-registry-697d97f7c8-dcjv5" Dec 03 06:47:35 crc kubenswrapper[4475]: E1203 06:47:35.296291 4475 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-03 06:47:35.796279319 +0000 UTC m=+140.601177652 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-dcjv5" (UID: "fe214ce1-0821-4547-ac8b-e001a0579495") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 03 06:47:35 crc kubenswrapper[4475]: I1203 06:47:35.349617 4475 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-tjg56" Dec 03 06:47:35 crc kubenswrapper[4475]: I1203 06:47:35.349785 4475 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-tjg56" Dec 03 06:47:35 crc kubenswrapper[4475]: I1203 06:47:35.371837 4475 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-q47j6" podStartSLOduration=121.371824139 podStartE2EDuration="2m1.371824139s" podCreationTimestamp="2025-12-03 06:45:34 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 06:47:35.295863919 +0000 UTC m=+140.100762254" 
watchObservedRunningTime="2025-12-03 06:47:35.371824139 +0000 UTC m=+140.176722473" Dec 03 06:47:35 crc kubenswrapper[4475]: I1203 06:47:35.400875 4475 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 03 06:47:35 crc kubenswrapper[4475]: E1203 06:47:35.401272 4475 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-03 06:47:35.901256745 +0000 UTC m=+140.706155079 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 03 06:47:35 crc kubenswrapper[4475]: I1203 06:47:35.429848 4475 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-lgjdg" podStartSLOduration=121.42983446 podStartE2EDuration="2m1.42983446s" podCreationTimestamp="2025-12-03 06:45:34 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 06:47:35.373199206 +0000 UTC m=+140.178097550" watchObservedRunningTime="2025-12-03 06:47:35.42983446 +0000 UTC m=+140.234732794" Dec 03 06:47:35 crc kubenswrapper[4475]: I1203 06:47:35.502366 4475 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-dcjv5\" (UID: \"fe214ce1-0821-4547-ac8b-e001a0579495\") " pod="openshift-image-registry/image-registry-697d97f7c8-dcjv5" Dec 03 06:47:35 crc kubenswrapper[4475]: E1203 06:47:35.502792 4475 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-03 06:47:36.002776854 +0000 UTC m=+140.807675187 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-dcjv5" (UID: "fe214ce1-0821-4547-ac8b-e001a0579495") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 03 06:47:35 crc kubenswrapper[4475]: I1203 06:47:35.538600 4475 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/downloads-7954f5f757-dbjfd" podStartSLOduration=121.538588154 podStartE2EDuration="2m1.538588154s" podCreationTimestamp="2025-12-03 06:45:34 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 06:47:35.435726722 +0000 UTC m=+140.240625056" watchObservedRunningTime="2025-12-03 06:47:35.538588154 +0000 UTC m=+140.343486488" Dec 03 06:47:35 crc kubenswrapper[4475]: I1203 06:47:35.598656 4475 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-xklvd" 
podStartSLOduration=121.598642065 podStartE2EDuration="2m1.598642065s" podCreationTimestamp="2025-12-03 06:45:34 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 06:47:35.59778038 +0000 UTC m=+140.402678714" watchObservedRunningTime="2025-12-03 06:47:35.598642065 +0000 UTC m=+140.403540399" Dec 03 06:47:35 crc kubenswrapper[4475]: I1203 06:47:35.599258 4475 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-config-operator/openshift-config-operator-7777fb866f-q5cjz" podStartSLOduration=121.599251917 podStartE2EDuration="2m1.599251917s" podCreationTimestamp="2025-12-03 06:45:34 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 06:47:35.550582381 +0000 UTC m=+140.355480715" watchObservedRunningTime="2025-12-03 06:47:35.599251917 +0000 UTC m=+140.404150252" Dec 03 06:47:35 crc kubenswrapper[4475]: I1203 06:47:35.604781 4475 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 03 06:47:35 crc kubenswrapper[4475]: E1203 06:47:35.605131 4475 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-03 06:47:36.105111118 +0000 UTC m=+140.910009452 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 03 06:47:35 crc kubenswrapper[4475]: I1203 06:47:35.707035 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-dcjv5\" (UID: \"fe214ce1-0821-4547-ac8b-e001a0579495\") " pod="openshift-image-registry/image-registry-697d97f7c8-dcjv5" Dec 03 06:47:35 crc kubenswrapper[4475]: E1203 06:47:35.707361 4475 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-03 06:47:36.207350734 +0000 UTC m=+141.012249068 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-dcjv5" (UID: "fe214ce1-0821-4547-ac8b-e001a0579495") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 03 06:47:35 crc kubenswrapper[4475]: I1203 06:47:35.753228 4475 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-tjg56" Dec 03 06:47:35 crc kubenswrapper[4475]: I1203 06:47:35.780122 4475 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns-operator/dns-operator-744455d44c-vfxhs" podStartSLOduration=121.780108412 podStartE2EDuration="2m1.780108412s" podCreationTimestamp="2025-12-03 06:45:34 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 06:47:35.734032926 +0000 UTC m=+140.538931260" watchObservedRunningTime="2025-12-03 06:47:35.780108412 +0000 UTC m=+140.585006745" Dec 03 06:47:35 crc kubenswrapper[4475]: I1203 06:47:35.780650 4475 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-6bm4d" podStartSLOduration=121.78064649 podStartE2EDuration="2m1.78064649s" podCreationTimestamp="2025-12-03 06:45:34 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 06:47:35.779060388 +0000 UTC m=+140.583958722" watchObservedRunningTime="2025-12-03 06:47:35.78064649 +0000 UTC m=+140.585544824" Dec 03 06:47:35 crc kubenswrapper[4475]: I1203 06:47:35.807910 4475 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 03 06:47:35 crc kubenswrapper[4475]: E1203 06:47:35.808199 4475 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-03 06:47:36.308187813 +0000 UTC m=+141.113086147 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 03 06:47:35 crc kubenswrapper[4475]: I1203 06:47:35.863441 4475 patch_prober.go:28] interesting pod/router-default-5444994796-wkrx4 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 03 06:47:35 crc kubenswrapper[4475]: [-]has-synced failed: reason withheld Dec 03 06:47:35 crc kubenswrapper[4475]: [+]process-running ok Dec 03 06:47:35 crc kubenswrapper[4475]: healthz check failed Dec 03 06:47:35 crc kubenswrapper[4475]: I1203 06:47:35.863500 4475 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-wkrx4" podUID="21a4d7e9-ea88-4f43-9d43-109df1bb4766" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 03 06:47:35 crc kubenswrapper[4475]: I1203 06:47:35.875315 4475 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-tjg56" podStartSLOduration=121.875301383 podStartE2EDuration="2m1.875301383s" podCreationTimestamp="2025-12-03 06:45:34 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 06:47:35.87212492 +0000 UTC m=+140.677023254" watchObservedRunningTime="2025-12-03 06:47:35.875301383 +0000 UTC m=+140.680199717" Dec 03 06:47:35 crc kubenswrapper[4475]: I1203 06:47:35.875743 4475 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/dns-default-685k2" podStartSLOduration=7.875738342 podStartE2EDuration="7.875738342s" podCreationTimestamp="2025-12-03 06:47:28 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 06:47:35.827146501 +0000 UTC m=+140.632044835" watchObservedRunningTime="2025-12-03 06:47:35.875738342 +0000 UTC m=+140.680636675" Dec 03 06:47:35 crc kubenswrapper[4475]: I1203 06:47:35.908905 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-dcjv5\" (UID: \"fe214ce1-0821-4547-ac8b-e001a0579495\") " pod="openshift-image-registry/image-registry-697d97f7c8-dcjv5" Dec 03 06:47:35 crc kubenswrapper[4475]: E1203 06:47:35.909223 4475 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-03 06:47:36.409209767 +0000 UTC m=+141.214108101 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-dcjv5" (UID: "fe214ce1-0821-4547-ac8b-e001a0579495") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 03 06:47:36 crc kubenswrapper[4475]: I1203 06:47:36.010134 4475 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 03 06:47:36 crc kubenswrapper[4475]: E1203 06:47:36.010530 4475 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-03 06:47:36.510517017 +0000 UTC m=+141.315415351 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 03 06:47:36 crc kubenswrapper[4475]: I1203 06:47:36.111901 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-dcjv5\" (UID: \"fe214ce1-0821-4547-ac8b-e001a0579495\") " pod="openshift-image-registry/image-registry-697d97f7c8-dcjv5" Dec 03 06:47:36 crc kubenswrapper[4475]: E1203 06:47:36.112381 4475 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-03 06:47:36.612368315 +0000 UTC m=+141.417266649 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-dcjv5" (UID: "fe214ce1-0821-4547-ac8b-e001a0579495") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 03 06:47:36 crc kubenswrapper[4475]: I1203 06:47:36.212686 4475 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 03 06:47:36 crc kubenswrapper[4475]: E1203 06:47:36.212788 4475 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-03 06:47:36.712774997 +0000 UTC m=+141.517673332 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 03 06:47:36 crc kubenswrapper[4475]: I1203 06:47:36.213066 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-dcjv5\" (UID: \"fe214ce1-0821-4547-ac8b-e001a0579495\") " pod="openshift-image-registry/image-registry-697d97f7c8-dcjv5" Dec 03 06:47:36 crc kubenswrapper[4475]: E1203 06:47:36.213307 4475 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-03 06:47:36.71329917 +0000 UTC m=+141.518197503 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-dcjv5" (UID: "fe214ce1-0821-4547-ac8b-e001a0579495") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 03 06:47:36 crc kubenswrapper[4475]: I1203 06:47:36.228942 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-jf25k" event={"ID":"5f1ef448-9656-4d56-9d7e-d0992ec24085","Type":"ContainerStarted","Data":"73c96b5b9abce75f762240d4a31884f342cc2c661e1fc199fa74367f40085750"} Dec 03 06:47:36 crc kubenswrapper[4475]: I1203 06:47:36.228973 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-jf25k" event={"ID":"5f1ef448-9656-4d56-9d7e-d0992ec24085","Type":"ContainerStarted","Data":"6b37c25f8f7cebe3965ecfbed0704920c301997d40cd641d98ebf897b097080a"} Dec 03 06:47:36 crc kubenswrapper[4475]: I1203 06:47:36.262205 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-86l88" event={"ID":"ba21124e-33f1-4cf7-8bc2-483a5810191d","Type":"ContainerStarted","Data":"c508051249c42c0ce7f3a42077b795275a1ac07e93a9b2bb853d72591e31509f"} Dec 03 06:47:36 crc kubenswrapper[4475]: I1203 06:47:36.262234 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-86l88" event={"ID":"ba21124e-33f1-4cf7-8bc2-483a5810191d","Type":"ContainerStarted","Data":"3d8337b71f60f63fece2095c096c08984308d7fd2bbb14ac0734f560d50a0d47"} Dec 03 06:47:36 crc kubenswrapper[4475]: I1203 06:47:36.264155 4475 patch_prober.go:28] interesting pod/downloads-7954f5f757-dbjfd container/download-server namespace/openshift-console: Readiness probe status=failure output="Get 
\"http://10.217.0.14:8080/\": dial tcp 10.217.0.14:8080: connect: connection refused" start-of-body= Dec 03 06:47:36 crc kubenswrapper[4475]: I1203 06:47:36.264186 4475 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-dbjfd" podUID="c279294d-fe07-48d3-800d-7d73eba69c17" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.14:8080/\": dial tcp 10.217.0.14:8080: connect: connection refused" Dec 03 06:47:36 crc kubenswrapper[4475]: I1203 06:47:36.268668 4475 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-q47j6" Dec 03 06:47:36 crc kubenswrapper[4475]: I1203 06:47:36.274747 4475 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-tjg56" Dec 03 06:47:36 crc kubenswrapper[4475]: I1203 06:47:36.314144 4475 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 03 06:47:36 crc kubenswrapper[4475]: E1203 06:47:36.314283 4475 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-03 06:47:36.814263937 +0000 UTC m=+141.619162271 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 03 06:47:36 crc kubenswrapper[4475]: I1203 06:47:36.314818 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-dcjv5\" (UID: \"fe214ce1-0821-4547-ac8b-e001a0579495\") " pod="openshift-image-registry/image-registry-697d97f7c8-dcjv5" Dec 03 06:47:36 crc kubenswrapper[4475]: E1203 06:47:36.319878 4475 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-03 06:47:36.819861877 +0000 UTC m=+141.624760211 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-dcjv5" (UID: "fe214ce1-0821-4547-ac8b-e001a0579495") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 03 06:47:36 crc kubenswrapper[4475]: I1203 06:47:36.333492 4475 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-apiserver/apiserver-76f77b778f-jf25k" podStartSLOduration=123.333474436 podStartE2EDuration="2m3.333474436s" podCreationTimestamp="2025-12-03 06:45:33 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 06:47:36.332772261 +0000 UTC m=+141.137670605" watchObservedRunningTime="2025-12-03 06:47:36.333474436 +0000 UTC m=+141.138372771" Dec 03 06:47:36 crc kubenswrapper[4475]: I1203 06:47:36.365231 4475 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-apiserver/apiserver-76f77b778f-jf25k" Dec 03 06:47:36 crc kubenswrapper[4475]: I1203 06:47:36.365577 4475 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-apiserver/apiserver-76f77b778f-jf25k" Dec 03 06:47:36 crc kubenswrapper[4475]: I1203 06:47:36.367044 4475 patch_prober.go:28] interesting pod/apiserver-76f77b778f-jf25k container/openshift-apiserver namespace/openshift-apiserver: Startup probe status=failure output="Get \"https://10.217.0.22:8443/livez\": dial tcp 10.217.0.22:8443: connect: connection refused" start-of-body= Dec 03 06:47:36 crc kubenswrapper[4475]: I1203 06:47:36.367084 4475 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-apiserver/apiserver-76f77b778f-jf25k" podUID="5f1ef448-9656-4d56-9d7e-d0992ec24085" 
containerName="openshift-apiserver" probeResult="failure" output="Get \"https://10.217.0.22:8443/livez\": dial tcp 10.217.0.22:8443: connect: connection refused" Dec 03 06:47:36 crc kubenswrapper[4475]: I1203 06:47:36.422768 4475 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 03 06:47:36 crc kubenswrapper[4475]: E1203 06:47:36.422901 4475 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-03 06:47:36.922882467 +0000 UTC m=+141.727780802 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 03 06:47:36 crc kubenswrapper[4475]: I1203 06:47:36.423017 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-dcjv5\" (UID: \"fe214ce1-0821-4547-ac8b-e001a0579495\") " pod="openshift-image-registry/image-registry-697d97f7c8-dcjv5" Dec 03 06:47:36 crc kubenswrapper[4475]: E1203 06:47:36.423330 4475 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-03 06:47:36.923321961 +0000 UTC m=+141.728220284 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-dcjv5" (UID: "fe214ce1-0821-4547-ac8b-e001a0579495") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 03 06:47:36 crc kubenswrapper[4475]: I1203 06:47:36.524044 4475 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 03 06:47:36 crc kubenswrapper[4475]: E1203 06:47:36.524145 4475 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-03 06:47:37.02412738 +0000 UTC m=+141.829025713 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 03 06:47:36 crc kubenswrapper[4475]: I1203 06:47:36.524501 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-dcjv5\" (UID: \"fe214ce1-0821-4547-ac8b-e001a0579495\") " pod="openshift-image-registry/image-registry-697d97f7c8-dcjv5" Dec 03 06:47:36 crc kubenswrapper[4475]: E1203 06:47:36.524796 4475 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-03 06:47:37.024785353 +0000 UTC m=+141.829683687 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-dcjv5" (UID: "fe214ce1-0821-4547-ac8b-e001a0579495") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 03 06:47:36 crc kubenswrapper[4475]: I1203 06:47:36.625363 4475 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 03 06:47:36 crc kubenswrapper[4475]: E1203 06:47:36.625595 4475 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-03 06:47:37.125576354 +0000 UTC m=+141.930474689 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 03 06:47:36 crc kubenswrapper[4475]: I1203 06:47:36.625688 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-dcjv5\" (UID: \"fe214ce1-0821-4547-ac8b-e001a0579495\") " pod="openshift-image-registry/image-registry-697d97f7c8-dcjv5" Dec 03 06:47:36 crc kubenswrapper[4475]: E1203 06:47:36.625971 4475 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-03 06:47:37.125960334 +0000 UTC m=+141.930858667 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-dcjv5" (UID: "fe214ce1-0821-4547-ac8b-e001a0579495") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 03 06:47:36 crc kubenswrapper[4475]: I1203 06:47:36.726132 4475 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 03 06:47:36 crc kubenswrapper[4475]: E1203 06:47:36.726298 4475 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-03 06:47:37.226277057 +0000 UTC m=+142.031175391 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 03 06:47:36 crc kubenswrapper[4475]: I1203 06:47:36.726522 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-dcjv5\" (UID: \"fe214ce1-0821-4547-ac8b-e001a0579495\") " pod="openshift-image-registry/image-registry-697d97f7c8-dcjv5" Dec 03 06:47:36 crc kubenswrapper[4475]: E1203 06:47:36.726820 4475 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-03 06:47:37.226799776 +0000 UTC m=+142.031698110 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-dcjv5" (UID: "fe214ce1-0821-4547-ac8b-e001a0579495") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 03 06:47:36 crc kubenswrapper[4475]: I1203 06:47:36.827718 4475 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 03 06:47:36 crc kubenswrapper[4475]: E1203 06:47:36.827849 4475 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-03 06:47:37.327833163 +0000 UTC m=+142.132731497 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 03 06:47:36 crc kubenswrapper[4475]: I1203 06:47:36.828106 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-dcjv5\" (UID: \"fe214ce1-0821-4547-ac8b-e001a0579495\") " pod="openshift-image-registry/image-registry-697d97f7c8-dcjv5" Dec 03 06:47:36 crc kubenswrapper[4475]: E1203 06:47:36.828391 4475 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-03 06:47:37.328382673 +0000 UTC m=+142.133281006 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-dcjv5" (UID: "fe214ce1-0821-4547-ac8b-e001a0579495") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 03 06:47:36 crc kubenswrapper[4475]: I1203 06:47:36.862024 4475 patch_prober.go:28] interesting pod/router-default-5444994796-wkrx4 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 03 06:47:36 crc kubenswrapper[4475]: [-]has-synced failed: reason withheld Dec 03 06:47:36 crc kubenswrapper[4475]: [+]process-running ok Dec 03 06:47:36 crc kubenswrapper[4475]: healthz check failed Dec 03 06:47:36 crc kubenswrapper[4475]: I1203 06:47:36.862308 4475 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-wkrx4" podUID="21a4d7e9-ea88-4f43-9d43-109df1bb4766" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 03 06:47:36 crc kubenswrapper[4475]: I1203 06:47:36.904067 4475 plugin_watcher.go:194] "Adding socket path or updating timestamp to desired state cache" path="/var/lib/kubelet/plugins_registry/kubevirt.io.hostpath-provisioner-reg.sock" Dec 03 06:47:36 crc kubenswrapper[4475]: I1203 06:47:36.929598 4475 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 03 06:47:36 crc kubenswrapper[4475]: E1203 06:47:36.929979 4475 
nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-03 06:47:37.429959357 +0000 UTC m=+142.234857691 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 03 06:47:37 crc kubenswrapper[4475]: I1203 06:47:37.030535 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-dcjv5\" (UID: \"fe214ce1-0821-4547-ac8b-e001a0579495\") " pod="openshift-image-registry/image-registry-697d97f7c8-dcjv5" Dec 03 06:47:37 crc kubenswrapper[4475]: E1203 06:47:37.030834 4475 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-03 06:47:37.530824498 +0000 UTC m=+142.335722832 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-dcjv5" (UID: "fe214ce1-0821-4547-ac8b-e001a0579495") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 03 06:47:37 crc kubenswrapper[4475]: I1203 06:47:37.131398 4475 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 03 06:47:37 crc kubenswrapper[4475]: E1203 06:47:37.131773 4475 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-03 06:47:37.631747547 +0000 UTC m=+142.436645881 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 03 06:47:37 crc kubenswrapper[4475]: I1203 06:47:37.233050 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-dcjv5\" (UID: \"fe214ce1-0821-4547-ac8b-e001a0579495\") " pod="openshift-image-registry/image-registry-697d97f7c8-dcjv5" Dec 03 06:47:37 crc kubenswrapper[4475]: E1203 06:47:37.233310 4475 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-03 06:47:37.733300167 +0000 UTC m=+142.538198500 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-dcjv5" (UID: "fe214ce1-0821-4547-ac8b-e001a0579495") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 03 06:47:37 crc kubenswrapper[4475]: I1203 06:47:37.267856 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-86l88" event={"ID":"ba21124e-33f1-4cf7-8bc2-483a5810191d","Type":"ContainerStarted","Data":"cd7de2b42bced5ecae58a140dfecaa96f7c9fe107334077d90f7aae89d524c72"} Dec 03 06:47:37 crc kubenswrapper[4475]: I1203 06:47:37.269537 4475 patch_prober.go:28] interesting pod/downloads-7954f5f757-dbjfd container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.14:8080/\": dial tcp 10.217.0.14:8080: connect: connection refused" start-of-body= Dec 03 06:47:37 crc kubenswrapper[4475]: I1203 06:47:37.269596 4475 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-dbjfd" podUID="c279294d-fe07-48d3-800d-7d73eba69c17" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.14:8080/\": dial tcp 10.217.0.14:8080: connect: connection refused" Dec 03 06:47:37 crc kubenswrapper[4475]: I1203 06:47:37.277547 4475 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-xsp8w"] Dec 03 06:47:37 crc kubenswrapper[4475]: I1203 06:47:37.278249 4475 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-xsp8w" Dec 03 06:47:37 crc kubenswrapper[4475]: I1203 06:47:37.283371 4475 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"community-operators-dockercfg-dmngl" Dec 03 06:47:37 crc kubenswrapper[4475]: I1203 06:47:37.298822 4475 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="hostpath-provisioner/csi-hostpathplugin-86l88" podStartSLOduration=9.298806935 podStartE2EDuration="9.298806935s" podCreationTimestamp="2025-12-03 06:47:28 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 06:47:37.289094634 +0000 UTC m=+142.093992969" watchObservedRunningTime="2025-12-03 06:47:37.298806935 +0000 UTC m=+142.103705270" Dec 03 06:47:37 crc kubenswrapper[4475]: I1203 06:47:37.299620 4475 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-xsp8w"] Dec 03 06:47:37 crc kubenswrapper[4475]: I1203 06:47:37.304524 4475 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-config-operator/openshift-config-operator-7777fb866f-q5cjz" Dec 03 06:47:37 crc kubenswrapper[4475]: I1203 06:47:37.334076 4475 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 03 06:47:37 crc kubenswrapper[4475]: E1203 06:47:37.334167 4475 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. 
No retries permitted until 2025-12-03 06:47:37.834153344 +0000 UTC m=+142.639051678 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 03 06:47:37 crc kubenswrapper[4475]: I1203 06:47:37.334276 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-dcjv5\" (UID: \"fe214ce1-0821-4547-ac8b-e001a0579495\") " pod="openshift-image-registry/image-registry-697d97f7c8-dcjv5" Dec 03 06:47:37 crc kubenswrapper[4475]: I1203 06:47:37.334471 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c6ab24e6-93f2-46dd-aace-3de3344bd9f1-catalog-content\") pod \"community-operators-xsp8w\" (UID: \"c6ab24e6-93f2-46dd-aace-3de3344bd9f1\") " pod="openshift-marketplace/community-operators-xsp8w" Dec 03 06:47:37 crc kubenswrapper[4475]: I1203 06:47:37.334550 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c6ab24e6-93f2-46dd-aace-3de3344bd9f1-utilities\") pod \"community-operators-xsp8w\" (UID: \"c6ab24e6-93f2-46dd-aace-3de3344bd9f1\") " pod="openshift-marketplace/community-operators-xsp8w" Dec 03 06:47:37 crc kubenswrapper[4475]: I1203 06:47:37.334575 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"kube-api-access-t46l5\" (UniqueName: \"kubernetes.io/projected/c6ab24e6-93f2-46dd-aace-3de3344bd9f1-kube-api-access-t46l5\") pod \"community-operators-xsp8w\" (UID: \"c6ab24e6-93f2-46dd-aace-3de3344bd9f1\") " pod="openshift-marketplace/community-operators-xsp8w" Dec 03 06:47:37 crc kubenswrapper[4475]: E1203 06:47:37.335491 4475 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-03 06:47:37.835483337 +0000 UTC m=+142.640381660 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-dcjv5" (UID: "fe214ce1-0821-4547-ac8b-e001a0579495") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 03 06:47:37 crc kubenswrapper[4475]: I1203 06:47:37.435017 4475 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 03 06:47:37 crc kubenswrapper[4475]: E1203 06:47:37.435148 4475 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-03 06:47:37.935131567 +0000 UTC m=+142.740029901 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 03 06:47:37 crc kubenswrapper[4475]: I1203 06:47:37.435233 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c6ab24e6-93f2-46dd-aace-3de3344bd9f1-utilities\") pod \"community-operators-xsp8w\" (UID: \"c6ab24e6-93f2-46dd-aace-3de3344bd9f1\") " pod="openshift-marketplace/community-operators-xsp8w" Dec 03 06:47:37 crc kubenswrapper[4475]: I1203 06:47:37.435257 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-t46l5\" (UniqueName: \"kubernetes.io/projected/c6ab24e6-93f2-46dd-aace-3de3344bd9f1-kube-api-access-t46l5\") pod \"community-operators-xsp8w\" (UID: \"c6ab24e6-93f2-46dd-aace-3de3344bd9f1\") " pod="openshift-marketplace/community-operators-xsp8w" Dec 03 06:47:37 crc kubenswrapper[4475]: I1203 06:47:37.435293 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-dcjv5\" (UID: \"fe214ce1-0821-4547-ac8b-e001a0579495\") " pod="openshift-image-registry/image-registry-697d97f7c8-dcjv5" Dec 03 06:47:37 crc kubenswrapper[4475]: I1203 06:47:37.435371 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c6ab24e6-93f2-46dd-aace-3de3344bd9f1-catalog-content\") pod \"community-operators-xsp8w\" (UID: 
\"c6ab24e6-93f2-46dd-aace-3de3344bd9f1\") " pod="openshift-marketplace/community-operators-xsp8w" Dec 03 06:47:37 crc kubenswrapper[4475]: E1203 06:47:37.435668 4475 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-03 06:47:37.935656831 +0000 UTC m=+142.740555165 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-dcjv5" (UID: "fe214ce1-0821-4547-ac8b-e001a0579495") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 03 06:47:37 crc kubenswrapper[4475]: I1203 06:47:37.435750 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c6ab24e6-93f2-46dd-aace-3de3344bd9f1-catalog-content\") pod \"community-operators-xsp8w\" (UID: \"c6ab24e6-93f2-46dd-aace-3de3344bd9f1\") " pod="openshift-marketplace/community-operators-xsp8w" Dec 03 06:47:37 crc kubenswrapper[4475]: I1203 06:47:37.435816 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c6ab24e6-93f2-46dd-aace-3de3344bd9f1-utilities\") pod \"community-operators-xsp8w\" (UID: \"c6ab24e6-93f2-46dd-aace-3de3344bd9f1\") " pod="openshift-marketplace/community-operators-xsp8w" Dec 03 06:47:37 crc kubenswrapper[4475]: I1203 06:47:37.436527 4475 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-controller-manager/revision-pruner-9-crc"] Dec 03 06:47:37 crc kubenswrapper[4475]: I1203 06:47:37.437012 4475 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Dec 03 06:47:37 crc kubenswrapper[4475]: I1203 06:47:37.438883 4475 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager"/"installer-sa-dockercfg-kjl2n" Dec 03 06:47:37 crc kubenswrapper[4475]: I1203 06:47:37.439144 4475 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager"/"kube-root-ca.crt" Dec 03 06:47:37 crc kubenswrapper[4475]: I1203 06:47:37.445093 4475 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager/revision-pruner-9-crc"] Dec 03 06:47:37 crc kubenswrapper[4475]: I1203 06:47:37.464066 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-t46l5\" (UniqueName: \"kubernetes.io/projected/c6ab24e6-93f2-46dd-aace-3de3344bd9f1-kube-api-access-t46l5\") pod \"community-operators-xsp8w\" (UID: \"c6ab24e6-93f2-46dd-aace-3de3344bd9f1\") " pod="openshift-marketplace/community-operators-xsp8w" Dec 03 06:47:37 crc kubenswrapper[4475]: I1203 06:47:37.474741 4475 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-qkcc8"] Dec 03 06:47:37 crc kubenswrapper[4475]: I1203 06:47:37.475484 4475 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-qkcc8" Dec 03 06:47:37 crc kubenswrapper[4475]: I1203 06:47:37.478791 4475 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"certified-operators-dockercfg-4rs5g" Dec 03 06:47:37 crc kubenswrapper[4475]: I1203 06:47:37.487088 4475 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-qkcc8"] Dec 03 06:47:37 crc kubenswrapper[4475]: I1203 06:47:37.535799 4475 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 03 06:47:37 crc kubenswrapper[4475]: E1203 06:47:37.535936 4475 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-03 06:47:38.035920044 +0000 UTC m=+142.840818378 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 03 06:47:37 crc kubenswrapper[4475]: I1203 06:47:37.536117 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s67lb\" (UniqueName: \"kubernetes.io/projected/30be3012-ac26-4a64-b650-66174f25549a-kube-api-access-s67lb\") pod \"certified-operators-qkcc8\" (UID: \"30be3012-ac26-4a64-b650-66174f25549a\") " pod="openshift-marketplace/certified-operators-qkcc8" Dec 03 06:47:37 crc kubenswrapper[4475]: I1203 06:47:37.536241 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-dcjv5\" (UID: \"fe214ce1-0821-4547-ac8b-e001a0579495\") " pod="openshift-image-registry/image-registry-697d97f7c8-dcjv5" Dec 03 06:47:37 crc kubenswrapper[4475]: E1203 06:47:37.536519 4475 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-03 06:47:38.036508587 +0000 UTC m=+142.841406921 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-dcjv5" (UID: "fe214ce1-0821-4547-ac8b-e001a0579495") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 03 06:47:37 crc kubenswrapper[4475]: I1203 06:47:37.536845 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/30be3012-ac26-4a64-b650-66174f25549a-catalog-content\") pod \"certified-operators-qkcc8\" (UID: \"30be3012-ac26-4a64-b650-66174f25549a\") " pod="openshift-marketplace/certified-operators-qkcc8" Dec 03 06:47:37 crc kubenswrapper[4475]: I1203 06:47:37.536975 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/30be3012-ac26-4a64-b650-66174f25549a-utilities\") pod \"certified-operators-qkcc8\" (UID: \"30be3012-ac26-4a64-b650-66174f25549a\") " pod="openshift-marketplace/certified-operators-qkcc8" Dec 03 06:47:37 crc kubenswrapper[4475]: I1203 06:47:37.537103 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/bd1b330e-1c1d-4525-9294-7be97dc1428b-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"bd1b330e-1c1d-4525-9294-7be97dc1428b\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Dec 03 06:47:37 crc kubenswrapper[4475]: I1203 06:47:37.537230 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/bd1b330e-1c1d-4525-9294-7be97dc1428b-kubelet-dir\") pod \"revision-pruner-9-crc\" 
(UID: \"bd1b330e-1c1d-4525-9294-7be97dc1428b\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Dec 03 06:47:37 crc kubenswrapper[4475]: I1203 06:47:37.590424 4475 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-xsp8w" Dec 03 06:47:37 crc kubenswrapper[4475]: I1203 06:47:37.620241 4475 reconciler.go:161] "OperationExecutor.RegisterPlugin started" plugin={"SocketPath":"/var/lib/kubelet/plugins_registry/kubevirt.io.hostpath-provisioner-reg.sock","Timestamp":"2025-12-03T06:47:36.904086351Z","Handler":null,"Name":""} Dec 03 06:47:37 crc kubenswrapper[4475]: I1203 06:47:37.624042 4475 csi_plugin.go:100] kubernetes.io/csi: Trying to validate a new CSI Driver with name: kubevirt.io.hostpath-provisioner endpoint: /var/lib/kubelet/plugins/csi-hostpath/csi.sock versions: 1.0.0 Dec 03 06:47:37 crc kubenswrapper[4475]: I1203 06:47:37.624069 4475 csi_plugin.go:113] kubernetes.io/csi: Register new plugin with name: kubevirt.io.hostpath-provisioner at endpoint: /var/lib/kubelet/plugins/csi-hostpath/csi.sock Dec 03 06:47:37 crc kubenswrapper[4475]: I1203 06:47:37.637591 4475 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 03 06:47:37 crc kubenswrapper[4475]: I1203 06:47:37.637748 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/bd1b330e-1c1d-4525-9294-7be97dc1428b-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"bd1b330e-1c1d-4525-9294-7be97dc1428b\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Dec 03 06:47:37 crc kubenswrapper[4475]: I1203 06:47:37.637780 4475 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/bd1b330e-1c1d-4525-9294-7be97dc1428b-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"bd1b330e-1c1d-4525-9294-7be97dc1428b\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Dec 03 06:47:37 crc kubenswrapper[4475]: I1203 06:47:37.637818 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s67lb\" (UniqueName: \"kubernetes.io/projected/30be3012-ac26-4a64-b650-66174f25549a-kube-api-access-s67lb\") pod \"certified-operators-qkcc8\" (UID: \"30be3012-ac26-4a64-b650-66174f25549a\") " pod="openshift-marketplace/certified-operators-qkcc8" Dec 03 06:47:37 crc kubenswrapper[4475]: I1203 06:47:37.637874 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/30be3012-ac26-4a64-b650-66174f25549a-catalog-content\") pod \"certified-operators-qkcc8\" (UID: \"30be3012-ac26-4a64-b650-66174f25549a\") " pod="openshift-marketplace/certified-operators-qkcc8" Dec 03 06:47:37 crc kubenswrapper[4475]: I1203 06:47:37.637897 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/30be3012-ac26-4a64-b650-66174f25549a-utilities\") pod \"certified-operators-qkcc8\" (UID: \"30be3012-ac26-4a64-b650-66174f25549a\") " pod="openshift-marketplace/certified-operators-qkcc8" Dec 03 06:47:37 crc kubenswrapper[4475]: I1203 06:47:37.637990 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/bd1b330e-1c1d-4525-9294-7be97dc1428b-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"bd1b330e-1c1d-4525-9294-7be97dc1428b\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Dec 03 06:47:37 crc kubenswrapper[4475]: I1203 06:47:37.638215 4475 operation_generator.go:637] "MountVolume.SetUp succeeded 
for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/30be3012-ac26-4a64-b650-66174f25549a-utilities\") pod \"certified-operators-qkcc8\" (UID: \"30be3012-ac26-4a64-b650-66174f25549a\") " pod="openshift-marketplace/certified-operators-qkcc8" Dec 03 06:47:37 crc kubenswrapper[4475]: I1203 06:47:37.638358 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/30be3012-ac26-4a64-b650-66174f25549a-catalog-content\") pod \"certified-operators-qkcc8\" (UID: \"30be3012-ac26-4a64-b650-66174f25549a\") " pod="openshift-marketplace/certified-operators-qkcc8" Dec 03 06:47:37 crc kubenswrapper[4475]: I1203 06:47:37.653993 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/bd1b330e-1c1d-4525-9294-7be97dc1428b-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"bd1b330e-1c1d-4525-9294-7be97dc1428b\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Dec 03 06:47:37 crc kubenswrapper[4475]: I1203 06:47:37.661972 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s67lb\" (UniqueName: \"kubernetes.io/projected/30be3012-ac26-4a64-b650-66174f25549a-kube-api-access-s67lb\") pod \"certified-operators-qkcc8\" (UID: \"30be3012-ac26-4a64-b650-66174f25549a\") " pod="openshift-marketplace/certified-operators-qkcc8" Dec 03 06:47:37 crc kubenswrapper[4475]: I1203 06:47:37.668023 4475 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-kp6xz"] Dec 03 06:47:37 crc kubenswrapper[4475]: I1203 06:47:37.669111 4475 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-kp6xz" Dec 03 06:47:37 crc kubenswrapper[4475]: I1203 06:47:37.685510 4475 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (OuterVolumeSpecName: "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8". PluginName "kubernetes.io/csi", VolumeGidValue "" Dec 03 06:47:37 crc kubenswrapper[4475]: I1203 06:47:37.721479 4475 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-kp6xz"] Dec 03 06:47:37 crc kubenswrapper[4475]: I1203 06:47:37.739092 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q6nql\" (UniqueName: \"kubernetes.io/projected/5d1dbc1b-9217-4212-bbde-d79dd2ec15f6-kube-api-access-q6nql\") pod \"community-operators-kp6xz\" (UID: \"5d1dbc1b-9217-4212-bbde-d79dd2ec15f6\") " pod="openshift-marketplace/community-operators-kp6xz" Dec 03 06:47:37 crc kubenswrapper[4475]: I1203 06:47:37.739157 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5d1dbc1b-9217-4212-bbde-d79dd2ec15f6-catalog-content\") pod \"community-operators-kp6xz\" (UID: \"5d1dbc1b-9217-4212-bbde-d79dd2ec15f6\") " pod="openshift-marketplace/community-operators-kp6xz" Dec 03 06:47:37 crc kubenswrapper[4475]: I1203 06:47:37.739322 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-dcjv5\" (UID: \"fe214ce1-0821-4547-ac8b-e001a0579495\") " 
pod="openshift-image-registry/image-registry-697d97f7c8-dcjv5" Dec 03 06:47:37 crc kubenswrapper[4475]: I1203 06:47:37.739417 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5d1dbc1b-9217-4212-bbde-d79dd2ec15f6-utilities\") pod \"community-operators-kp6xz\" (UID: \"5d1dbc1b-9217-4212-bbde-d79dd2ec15f6\") " pod="openshift-marketplace/community-operators-kp6xz" Dec 03 06:47:37 crc kubenswrapper[4475]: I1203 06:47:37.741777 4475 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... Dec 03 06:47:37 crc kubenswrapper[4475]: I1203 06:47:37.741810 4475 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-dcjv5\" (UID: \"fe214ce1-0821-4547-ac8b-e001a0579495\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/globalmount\"" pod="openshift-image-registry/image-registry-697d97f7c8-dcjv5" Dec 03 06:47:37 crc kubenswrapper[4475]: I1203 06:47:37.747218 4475 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Dec 03 06:47:37 crc kubenswrapper[4475]: I1203 06:47:37.805940 4475 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-qkcc8" Dec 03 06:47:37 crc kubenswrapper[4475]: I1203 06:47:37.809326 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-dcjv5\" (UID: \"fe214ce1-0821-4547-ac8b-e001a0579495\") " pod="openshift-image-registry/image-registry-697d97f7c8-dcjv5" Dec 03 06:47:37 crc kubenswrapper[4475]: I1203 06:47:37.838959 4475 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-xsp8w"] Dec 03 06:47:37 crc kubenswrapper[4475]: I1203 06:47:37.840020 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5d1dbc1b-9217-4212-bbde-d79dd2ec15f6-utilities\") pod \"community-operators-kp6xz\" (UID: \"5d1dbc1b-9217-4212-bbde-d79dd2ec15f6\") " pod="openshift-marketplace/community-operators-kp6xz" Dec 03 06:47:37 crc kubenswrapper[4475]: I1203 06:47:37.840065 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-q6nql\" (UniqueName: \"kubernetes.io/projected/5d1dbc1b-9217-4212-bbde-d79dd2ec15f6-kube-api-access-q6nql\") pod \"community-operators-kp6xz\" (UID: \"5d1dbc1b-9217-4212-bbde-d79dd2ec15f6\") " pod="openshift-marketplace/community-operators-kp6xz" Dec 03 06:47:37 crc kubenswrapper[4475]: I1203 06:47:37.840117 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5d1dbc1b-9217-4212-bbde-d79dd2ec15f6-catalog-content\") pod \"community-operators-kp6xz\" (UID: \"5d1dbc1b-9217-4212-bbde-d79dd2ec15f6\") " pod="openshift-marketplace/community-operators-kp6xz" Dec 03 06:47:37 crc kubenswrapper[4475]: I1203 06:47:37.840535 4475 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5d1dbc1b-9217-4212-bbde-d79dd2ec15f6-catalog-content\") pod \"community-operators-kp6xz\" (UID: \"5d1dbc1b-9217-4212-bbde-d79dd2ec15f6\") " pod="openshift-marketplace/community-operators-kp6xz" Dec 03 06:47:37 crc kubenswrapper[4475]: I1203 06:47:37.840734 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5d1dbc1b-9217-4212-bbde-d79dd2ec15f6-utilities\") pod \"community-operators-kp6xz\" (UID: \"5d1dbc1b-9217-4212-bbde-d79dd2ec15f6\") " pod="openshift-marketplace/community-operators-kp6xz" Dec 03 06:47:37 crc kubenswrapper[4475]: W1203 06:47:37.851764 4475 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc6ab24e6_93f2_46dd_aace_3de3344bd9f1.slice/crio-3c41e97b4318fff415633c9ff33212f4dbac5a35e477860297daac166242b261 WatchSource:0}: Error finding container 3c41e97b4318fff415633c9ff33212f4dbac5a35e477860297daac166242b261: Status 404 returned error can't find the container with id 3c41e97b4318fff415633c9ff33212f4dbac5a35e477860297daac166242b261 Dec 03 06:47:37 crc kubenswrapper[4475]: I1203 06:47:37.859464 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-q6nql\" (UniqueName: \"kubernetes.io/projected/5d1dbc1b-9217-4212-bbde-d79dd2ec15f6-kube-api-access-q6nql\") pod \"community-operators-kp6xz\" (UID: \"5d1dbc1b-9217-4212-bbde-d79dd2ec15f6\") " pod="openshift-marketplace/community-operators-kp6xz" Dec 03 06:47:37 crc kubenswrapper[4475]: I1203 06:47:37.865075 4475 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-s48nx"] Dec 03 06:47:37 crc kubenswrapper[4475]: I1203 06:47:37.865115 4475 patch_prober.go:28] interesting pod/router-default-5444994796-wkrx4 container/router namespace/openshift-ingress: Startup probe status=failure 
output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 03 06:47:37 crc kubenswrapper[4475]: [-]has-synced failed: reason withheld Dec 03 06:47:37 crc kubenswrapper[4475]: [+]process-running ok Dec 03 06:47:37 crc kubenswrapper[4475]: healthz check failed Dec 03 06:47:37 crc kubenswrapper[4475]: I1203 06:47:37.865146 4475 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-wkrx4" podUID="21a4d7e9-ea88-4f43-9d43-109df1bb4766" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 03 06:47:37 crc kubenswrapper[4475]: I1203 06:47:37.865914 4475 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-s48nx" Dec 03 06:47:37 crc kubenswrapper[4475]: I1203 06:47:37.876490 4475 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-s48nx"] Dec 03 06:47:37 crc kubenswrapper[4475]: I1203 06:47:37.943098 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a8d946f2-23cd-4863-8105-06d4ea2e1205-catalog-content\") pod \"certified-operators-s48nx\" (UID: \"a8d946f2-23cd-4863-8105-06d4ea2e1205\") " pod="openshift-marketplace/certified-operators-s48nx" Dec 03 06:47:37 crc kubenswrapper[4475]: I1203 06:47:37.943180 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a8d946f2-23cd-4863-8105-06d4ea2e1205-utilities\") pod \"certified-operators-s48nx\" (UID: \"a8d946f2-23cd-4863-8105-06d4ea2e1205\") " pod="openshift-marketplace/certified-operators-s48nx" Dec 03 06:47:37 crc kubenswrapper[4475]: I1203 06:47:37.943206 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s6wrm\" 
(UniqueName: \"kubernetes.io/projected/a8d946f2-23cd-4863-8105-06d4ea2e1205-kube-api-access-s6wrm\") pod \"certified-operators-s48nx\" (UID: \"a8d946f2-23cd-4863-8105-06d4ea2e1205\") " pod="openshift-marketplace/certified-operators-s48nx" Dec 03 06:47:38 crc kubenswrapper[4475]: I1203 06:47:38.022804 4475 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-kp6xz" Dec 03 06:47:38 crc kubenswrapper[4475]: I1203 06:47:38.032794 4475 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager/revision-pruner-9-crc"] Dec 03 06:47:38 crc kubenswrapper[4475]: I1203 06:47:38.046891 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a8d946f2-23cd-4863-8105-06d4ea2e1205-catalog-content\") pod \"certified-operators-s48nx\" (UID: \"a8d946f2-23cd-4863-8105-06d4ea2e1205\") " pod="openshift-marketplace/certified-operators-s48nx" Dec 03 06:47:38 crc kubenswrapper[4475]: I1203 06:47:38.046971 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a8d946f2-23cd-4863-8105-06d4ea2e1205-utilities\") pod \"certified-operators-s48nx\" (UID: \"a8d946f2-23cd-4863-8105-06d4ea2e1205\") " pod="openshift-marketplace/certified-operators-s48nx" Dec 03 06:47:38 crc kubenswrapper[4475]: I1203 06:47:38.046995 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s6wrm\" (UniqueName: \"kubernetes.io/projected/a8d946f2-23cd-4863-8105-06d4ea2e1205-kube-api-access-s6wrm\") pod \"certified-operators-s48nx\" (UID: \"a8d946f2-23cd-4863-8105-06d4ea2e1205\") " pod="openshift-marketplace/certified-operators-s48nx" Dec 03 06:47:38 crc kubenswrapper[4475]: I1203 06:47:38.047698 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: 
\"kubernetes.io/empty-dir/a8d946f2-23cd-4863-8105-06d4ea2e1205-catalog-content\") pod \"certified-operators-s48nx\" (UID: \"a8d946f2-23cd-4863-8105-06d4ea2e1205\") " pod="openshift-marketplace/certified-operators-s48nx" Dec 03 06:47:38 crc kubenswrapper[4475]: I1203 06:47:38.047709 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a8d946f2-23cd-4863-8105-06d4ea2e1205-utilities\") pod \"certified-operators-s48nx\" (UID: \"a8d946f2-23cd-4863-8105-06d4ea2e1205\") " pod="openshift-marketplace/certified-operators-s48nx" Dec 03 06:47:38 crc kubenswrapper[4475]: I1203 06:47:38.063927 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s6wrm\" (UniqueName: \"kubernetes.io/projected/a8d946f2-23cd-4863-8105-06d4ea2e1205-kube-api-access-s6wrm\") pod \"certified-operators-s48nx\" (UID: \"a8d946f2-23cd-4863-8105-06d4ea2e1205\") " pod="openshift-marketplace/certified-operators-s48nx" Dec 03 06:47:38 crc kubenswrapper[4475]: I1203 06:47:38.071837 4475 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-dcjv5" Dec 03 06:47:38 crc kubenswrapper[4475]: I1203 06:47:38.155620 4475 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-qkcc8"] Dec 03 06:47:38 crc kubenswrapper[4475]: W1203 06:47:38.181366 4475 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod30be3012_ac26_4a64_b650_66174f25549a.slice/crio-eaeb08705422ed6be4da4d92c90bd6169de7c10bad17b34b840e5ea35063ef32 WatchSource:0}: Error finding container eaeb08705422ed6be4da4d92c90bd6169de7c10bad17b34b840e5ea35063ef32: Status 404 returned error can't find the container with id eaeb08705422ed6be4da4d92c90bd6169de7c10bad17b34b840e5ea35063ef32 Dec 03 06:47:38 crc kubenswrapper[4475]: I1203 06:47:38.185659 4475 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-s48nx" Dec 03 06:47:38 crc kubenswrapper[4475]: I1203 06:47:38.293121 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-qkcc8" event={"ID":"30be3012-ac26-4a64-b650-66174f25549a","Type":"ContainerStarted","Data":"eaeb08705422ed6be4da4d92c90bd6169de7c10bad17b34b840e5ea35063ef32"} Dec 03 06:47:38 crc kubenswrapper[4475]: I1203 06:47:38.295763 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"bd1b330e-1c1d-4525-9294-7be97dc1428b","Type":"ContainerStarted","Data":"b3aa7b5e2f28ebdd1158f55936c4877a38d5fcc8fe38a9e28457e314acfb7225"} Dec 03 06:47:38 crc kubenswrapper[4475]: I1203 06:47:38.298194 4475 generic.go:334] "Generic (PLEG): container finished" podID="c6ab24e6-93f2-46dd-aace-3de3344bd9f1" containerID="83c685a6bd0714461f311975cbf6222bb99444a2045f4d57a89aeffd226bcac0" exitCode=0 Dec 03 06:47:38 crc kubenswrapper[4475]: I1203 06:47:38.298332 4475 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="openshift-marketplace/community-operators-xsp8w" event={"ID":"c6ab24e6-93f2-46dd-aace-3de3344bd9f1","Type":"ContainerDied","Data":"83c685a6bd0714461f311975cbf6222bb99444a2045f4d57a89aeffd226bcac0"} Dec 03 06:47:38 crc kubenswrapper[4475]: I1203 06:47:38.298370 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-xsp8w" event={"ID":"c6ab24e6-93f2-46dd-aace-3de3344bd9f1","Type":"ContainerStarted","Data":"3c41e97b4318fff415633c9ff33212f4dbac5a35e477860297daac166242b261"} Dec 03 06:47:38 crc kubenswrapper[4475]: I1203 06:47:38.308865 4475 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Dec 03 06:47:38 crc kubenswrapper[4475]: I1203 06:47:38.531365 4475 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-kp6xz"] Dec 03 06:47:38 crc kubenswrapper[4475]: W1203 06:47:38.536956 4475 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5d1dbc1b_9217_4212_bbde_d79dd2ec15f6.slice/crio-6f6a294675626cb6bb2bca80f80f6fc943d11e30332b5d14924fd8de4ade3307 WatchSource:0}: Error finding container 6f6a294675626cb6bb2bca80f80f6fc943d11e30332b5d14924fd8de4ade3307: Status 404 returned error can't find the container with id 6f6a294675626cb6bb2bca80f80f6fc943d11e30332b5d14924fd8de4ade3307 Dec 03 06:47:38 crc kubenswrapper[4475]: I1203 06:47:38.636628 4475 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-s48nx"] Dec 03 06:47:38 crc kubenswrapper[4475]: I1203 06:47:38.660937 4475 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-dcjv5"] Dec 03 06:47:38 crc kubenswrapper[4475]: I1203 06:47:38.861908 4475 patch_prober.go:28] interesting pod/router-default-5444994796-wkrx4 container/router namespace/openshift-ingress: Startup probe status=failure 
output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 03 06:47:38 crc kubenswrapper[4475]: [-]has-synced failed: reason withheld Dec 03 06:47:38 crc kubenswrapper[4475]: [+]process-running ok Dec 03 06:47:38 crc kubenswrapper[4475]: healthz check failed Dec 03 06:47:38 crc kubenswrapper[4475]: I1203 06:47:38.862076 4475 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-wkrx4" podUID="21a4d7e9-ea88-4f43-9d43-109df1bb4766" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 03 06:47:39 crc kubenswrapper[4475]: I1203 06:47:39.262059 4475 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-q8f5d"] Dec 03 06:47:39 crc kubenswrapper[4475]: I1203 06:47:39.270650 4475 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-q8f5d" Dec 03 06:47:39 crc kubenswrapper[4475]: I1203 06:47:39.279000 4475 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-marketplace-dockercfg-x2ctb" Dec 03 06:47:39 crc kubenswrapper[4475]: I1203 06:47:39.281358 4475 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-q8f5d"] Dec 03 06:47:39 crc kubenswrapper[4475]: I1203 06:47:39.314843 4475 generic.go:334] "Generic (PLEG): container finished" podID="5d1dbc1b-9217-4212-bbde-d79dd2ec15f6" containerID="50ab3f7abf297c2f8c5a9a123c53490f7f5f222ed1651d7e49a83a35e17553ee" exitCode=0 Dec 03 06:47:39 crc kubenswrapper[4475]: I1203 06:47:39.314921 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-kp6xz" event={"ID":"5d1dbc1b-9217-4212-bbde-d79dd2ec15f6","Type":"ContainerDied","Data":"50ab3f7abf297c2f8c5a9a123c53490f7f5f222ed1651d7e49a83a35e17553ee"} Dec 03 06:47:39 crc kubenswrapper[4475]: I1203 06:47:39.315039 4475 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-kp6xz" event={"ID":"5d1dbc1b-9217-4212-bbde-d79dd2ec15f6","Type":"ContainerStarted","Data":"6f6a294675626cb6bb2bca80f80f6fc943d11e30332b5d14924fd8de4ade3307"} Dec 03 06:47:39 crc kubenswrapper[4475]: I1203 06:47:39.331015 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-dcjv5" event={"ID":"fe214ce1-0821-4547-ac8b-e001a0579495","Type":"ContainerStarted","Data":"709d821e9862754d261f5fd1eade869c36d7ef8db10641b46bbb5ee829981e10"} Dec 03 06:47:39 crc kubenswrapper[4475]: I1203 06:47:39.331056 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-dcjv5" event={"ID":"fe214ce1-0821-4547-ac8b-e001a0579495","Type":"ContainerStarted","Data":"04d46d465e41ea7893dce520d9627958de1d845ff76685f4558b6580f429a803"} Dec 03 06:47:39 crc kubenswrapper[4475]: I1203 06:47:39.331716 4475 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-image-registry/image-registry-697d97f7c8-dcjv5" Dec 03 06:47:39 crc kubenswrapper[4475]: I1203 06:47:39.339196 4475 generic.go:334] "Generic (PLEG): container finished" podID="30be3012-ac26-4a64-b650-66174f25549a" containerID="6d2e38a2d61a5d81e356cfccb9f6914d9e1af1fb59d1da8fe547615a267d12af" exitCode=0 Dec 03 06:47:39 crc kubenswrapper[4475]: I1203 06:47:39.339251 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-qkcc8" event={"ID":"30be3012-ac26-4a64-b650-66174f25549a","Type":"ContainerDied","Data":"6d2e38a2d61a5d81e356cfccb9f6914d9e1af1fb59d1da8fe547615a267d12af"} Dec 03 06:47:39 crc kubenswrapper[4475]: I1203 06:47:39.347478 4475 generic.go:334] "Generic (PLEG): container finished" podID="a8d946f2-23cd-4863-8105-06d4ea2e1205" containerID="b983aabab7a5dfdec894132e31e79db5979955d1fb44d5f79ddc5d59490ef380" exitCode=0 Dec 03 06:47:39 crc kubenswrapper[4475]: I1203 
06:47:39.347646 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-s48nx" event={"ID":"a8d946f2-23cd-4863-8105-06d4ea2e1205","Type":"ContainerDied","Data":"b983aabab7a5dfdec894132e31e79db5979955d1fb44d5f79ddc5d59490ef380"} Dec 03 06:47:39 crc kubenswrapper[4475]: I1203 06:47:39.347683 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-s48nx" event={"ID":"a8d946f2-23cd-4863-8105-06d4ea2e1205","Type":"ContainerStarted","Data":"f663600ec7a21d6b713e56b5206e7fb5bd7092e3e543192c6a705691f4c264d9"} Dec 03 06:47:39 crc kubenswrapper[4475]: I1203 06:47:39.349943 4475 generic.go:334] "Generic (PLEG): container finished" podID="bd1b330e-1c1d-4525-9294-7be97dc1428b" containerID="8a4ee7885a951aae67394379898a4d497114268a1fbd1b492904b90a134b76ff" exitCode=0 Dec 03 06:47:39 crc kubenswrapper[4475]: I1203 06:47:39.349979 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"bd1b330e-1c1d-4525-9294-7be97dc1428b","Type":"ContainerDied","Data":"8a4ee7885a951aae67394379898a4d497114268a1fbd1b492904b90a134b76ff"} Dec 03 06:47:39 crc kubenswrapper[4475]: I1203 06:47:39.358390 4475 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-registry-697d97f7c8-dcjv5" podStartSLOduration=125.358365869 podStartE2EDuration="2m5.358365869s" podCreationTimestamp="2025-12-03 06:45:34 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 06:47:39.355398618 +0000 UTC m=+144.160296952" watchObservedRunningTime="2025-12-03 06:47:39.358365869 +0000 UTC m=+144.163264203" Dec 03 06:47:39 crc kubenswrapper[4475]: I1203 06:47:39.362331 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: 
\"kubernetes.io/empty-dir/ece5cd8d-8d3a-48ab-b2dc-194339fb1ff2-catalog-content\") pod \"redhat-marketplace-q8f5d\" (UID: \"ece5cd8d-8d3a-48ab-b2dc-194339fb1ff2\") " pod="openshift-marketplace/redhat-marketplace-q8f5d" Dec 03 06:47:39 crc kubenswrapper[4475]: I1203 06:47:39.362402 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ece5cd8d-8d3a-48ab-b2dc-194339fb1ff2-utilities\") pod \"redhat-marketplace-q8f5d\" (UID: \"ece5cd8d-8d3a-48ab-b2dc-194339fb1ff2\") " pod="openshift-marketplace/redhat-marketplace-q8f5d" Dec 03 06:47:39 crc kubenswrapper[4475]: I1203 06:47:39.362563 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q8fx4\" (UniqueName: \"kubernetes.io/projected/ece5cd8d-8d3a-48ab-b2dc-194339fb1ff2-kube-api-access-q8fx4\") pod \"redhat-marketplace-q8f5d\" (UID: \"ece5cd8d-8d3a-48ab-b2dc-194339fb1ff2\") " pod="openshift-marketplace/redhat-marketplace-q8f5d" Dec 03 06:47:39 crc kubenswrapper[4475]: I1203 06:47:39.463268 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-q8fx4\" (UniqueName: \"kubernetes.io/projected/ece5cd8d-8d3a-48ab-b2dc-194339fb1ff2-kube-api-access-q8fx4\") pod \"redhat-marketplace-q8f5d\" (UID: \"ece5cd8d-8d3a-48ab-b2dc-194339fb1ff2\") " pod="openshift-marketplace/redhat-marketplace-q8f5d" Dec 03 06:47:39 crc kubenswrapper[4475]: I1203 06:47:39.463318 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ece5cd8d-8d3a-48ab-b2dc-194339fb1ff2-catalog-content\") pod \"redhat-marketplace-q8f5d\" (UID: \"ece5cd8d-8d3a-48ab-b2dc-194339fb1ff2\") " pod="openshift-marketplace/redhat-marketplace-q8f5d" Dec 03 06:47:39 crc kubenswrapper[4475]: I1203 06:47:39.463346 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ece5cd8d-8d3a-48ab-b2dc-194339fb1ff2-utilities\") pod \"redhat-marketplace-q8f5d\" (UID: \"ece5cd8d-8d3a-48ab-b2dc-194339fb1ff2\") " pod="openshift-marketplace/redhat-marketplace-q8f5d" Dec 03 06:47:39 crc kubenswrapper[4475]: I1203 06:47:39.463728 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ece5cd8d-8d3a-48ab-b2dc-194339fb1ff2-utilities\") pod \"redhat-marketplace-q8f5d\" (UID: \"ece5cd8d-8d3a-48ab-b2dc-194339fb1ff2\") " pod="openshift-marketplace/redhat-marketplace-q8f5d" Dec 03 06:47:39 crc kubenswrapper[4475]: I1203 06:47:39.463965 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ece5cd8d-8d3a-48ab-b2dc-194339fb1ff2-catalog-content\") pod \"redhat-marketplace-q8f5d\" (UID: \"ece5cd8d-8d3a-48ab-b2dc-194339fb1ff2\") " pod="openshift-marketplace/redhat-marketplace-q8f5d" Dec 03 06:47:39 crc kubenswrapper[4475]: I1203 06:47:39.479252 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-q8fx4\" (UniqueName: \"kubernetes.io/projected/ece5cd8d-8d3a-48ab-b2dc-194339fb1ff2-kube-api-access-q8fx4\") pod \"redhat-marketplace-q8f5d\" (UID: \"ece5cd8d-8d3a-48ab-b2dc-194339fb1ff2\") " pod="openshift-marketplace/redhat-marketplace-q8f5d" Dec 03 06:47:39 crc kubenswrapper[4475]: I1203 06:47:39.498923 4475 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8f668bae-612b-4b75-9490-919e737c6a3b" path="/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes" Dec 03 06:47:39 crc kubenswrapper[4475]: I1203 06:47:39.599428 4475 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-q8f5d" Dec 03 06:47:39 crc kubenswrapper[4475]: I1203 06:47:39.664416 4475 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-p2nkg"] Dec 03 06:47:39 crc kubenswrapper[4475]: I1203 06:47:39.665570 4475 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-p2nkg" Dec 03 06:47:39 crc kubenswrapper[4475]: I1203 06:47:39.672753 4475 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-p2nkg"] Dec 03 06:47:39 crc kubenswrapper[4475]: I1203 06:47:39.766617 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m8k2w\" (UniqueName: \"kubernetes.io/projected/f4657474-771f-4f94-a2e3-a8262e7ce1b6-kube-api-access-m8k2w\") pod \"redhat-marketplace-p2nkg\" (UID: \"f4657474-771f-4f94-a2e3-a8262e7ce1b6\") " pod="openshift-marketplace/redhat-marketplace-p2nkg" Dec 03 06:47:39 crc kubenswrapper[4475]: I1203 06:47:39.766677 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f4657474-771f-4f94-a2e3-a8262e7ce1b6-catalog-content\") pod \"redhat-marketplace-p2nkg\" (UID: \"f4657474-771f-4f94-a2e3-a8262e7ce1b6\") " pod="openshift-marketplace/redhat-marketplace-p2nkg" Dec 03 06:47:39 crc kubenswrapper[4475]: I1203 06:47:39.766730 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f4657474-771f-4f94-a2e3-a8262e7ce1b6-utilities\") pod \"redhat-marketplace-p2nkg\" (UID: \"f4657474-771f-4f94-a2e3-a8262e7ce1b6\") " pod="openshift-marketplace/redhat-marketplace-p2nkg" Dec 03 06:47:39 crc kubenswrapper[4475]: I1203 06:47:39.777347 4475 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openshift-marketplace/redhat-marketplace-q8f5d"] Dec 03 06:47:39 crc kubenswrapper[4475]: I1203 06:47:39.860812 4475 patch_prober.go:28] interesting pod/router-default-5444994796-wkrx4 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 03 06:47:39 crc kubenswrapper[4475]: [-]has-synced failed: reason withheld Dec 03 06:47:39 crc kubenswrapper[4475]: [+]process-running ok Dec 03 06:47:39 crc kubenswrapper[4475]: healthz check failed Dec 03 06:47:39 crc kubenswrapper[4475]: I1203 06:47:39.860856 4475 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-wkrx4" podUID="21a4d7e9-ea88-4f43-9d43-109df1bb4766" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 03 06:47:39 crc kubenswrapper[4475]: I1203 06:47:39.867327 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-m8k2w\" (UniqueName: \"kubernetes.io/projected/f4657474-771f-4f94-a2e3-a8262e7ce1b6-kube-api-access-m8k2w\") pod \"redhat-marketplace-p2nkg\" (UID: \"f4657474-771f-4f94-a2e3-a8262e7ce1b6\") " pod="openshift-marketplace/redhat-marketplace-p2nkg" Dec 03 06:47:39 crc kubenswrapper[4475]: I1203 06:47:39.867495 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f4657474-771f-4f94-a2e3-a8262e7ce1b6-catalog-content\") pod \"redhat-marketplace-p2nkg\" (UID: \"f4657474-771f-4f94-a2e3-a8262e7ce1b6\") " pod="openshift-marketplace/redhat-marketplace-p2nkg" Dec 03 06:47:39 crc kubenswrapper[4475]: I1203 06:47:39.867661 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f4657474-771f-4f94-a2e3-a8262e7ce1b6-utilities\") pod \"redhat-marketplace-p2nkg\" (UID: 
\"f4657474-771f-4f94-a2e3-a8262e7ce1b6\") " pod="openshift-marketplace/redhat-marketplace-p2nkg" Dec 03 06:47:39 crc kubenswrapper[4475]: I1203 06:47:39.868058 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f4657474-771f-4f94-a2e3-a8262e7ce1b6-catalog-content\") pod \"redhat-marketplace-p2nkg\" (UID: \"f4657474-771f-4f94-a2e3-a8262e7ce1b6\") " pod="openshift-marketplace/redhat-marketplace-p2nkg" Dec 03 06:47:39 crc kubenswrapper[4475]: I1203 06:47:39.868201 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f4657474-771f-4f94-a2e3-a8262e7ce1b6-utilities\") pod \"redhat-marketplace-p2nkg\" (UID: \"f4657474-771f-4f94-a2e3-a8262e7ce1b6\") " pod="openshift-marketplace/redhat-marketplace-p2nkg" Dec 03 06:47:39 crc kubenswrapper[4475]: I1203 06:47:39.881377 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-m8k2w\" (UniqueName: \"kubernetes.io/projected/f4657474-771f-4f94-a2e3-a8262e7ce1b6-kube-api-access-m8k2w\") pod \"redhat-marketplace-p2nkg\" (UID: \"f4657474-771f-4f94-a2e3-a8262e7ce1b6\") " pod="openshift-marketplace/redhat-marketplace-p2nkg" Dec 03 06:47:40 crc kubenswrapper[4475]: I1203 06:47:40.026772 4475 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-p2nkg" Dec 03 06:47:40 crc kubenswrapper[4475]: I1203 06:47:40.272666 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 03 06:47:40 crc kubenswrapper[4475]: I1203 06:47:40.272704 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 03 06:47:40 crc kubenswrapper[4475]: I1203 06:47:40.272757 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 03 06:47:40 crc kubenswrapper[4475]: I1203 06:47:40.272778 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 03 06:47:40 crc kubenswrapper[4475]: I1203 06:47:40.274232 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nginx-conf\" (UniqueName: 
\"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 03 06:47:40 crc kubenswrapper[4475]: I1203 06:47:40.275864 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 03 06:47:40 crc kubenswrapper[4475]: I1203 06:47:40.276295 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 03 06:47:40 crc kubenswrapper[4475]: I1203 06:47:40.277779 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 03 06:47:40 crc kubenswrapper[4475]: I1203 06:47:40.300311 4475 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 03 06:47:40 crc kubenswrapper[4475]: I1203 06:47:40.306804 4475 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 03 06:47:40 crc kubenswrapper[4475]: I1203 06:47:40.310564 4475 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 03 06:47:40 crc kubenswrapper[4475]: I1203 06:47:40.311580 4475 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/console-f9d7485db-dbxhk" Dec 03 06:47:40 crc kubenswrapper[4475]: I1203 06:47:40.312014 4475 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-f9d7485db-dbxhk" Dec 03 06:47:40 crc kubenswrapper[4475]: I1203 06:47:40.314764 4475 patch_prober.go:28] interesting pod/console-f9d7485db-dbxhk container/console namespace/openshift-console: Startup probe status=failure output="Get \"https://10.217.0.6:8443/health\": dial tcp 10.217.0.6:8443: connect: connection refused" start-of-body= Dec 03 06:47:40 crc kubenswrapper[4475]: I1203 06:47:40.314897 4475 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-console/console-f9d7485db-dbxhk" podUID="09928a8e-a70b-4916-9ae2-4dbe952aa514" containerName="console" probeResult="failure" output="Get \"https://10.217.0.6:8443/health\": dial tcp 10.217.0.6:8443: connect: connection refused" Dec 03 06:47:40 crc kubenswrapper[4475]: I1203 06:47:40.364570 4475 generic.go:334] "Generic (PLEG): container finished" podID="83c4eef5-5508-470d-8b7a-b7da9d4706d4" containerID="64bbe628906ffd7c485ec8cc71ede08aea8875194cb357d96894ab844be9e9f5" exitCode=0 Dec 03 06:47:40 crc kubenswrapper[4475]: I1203 06:47:40.364633 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29412405-wwr7n" event={"ID":"83c4eef5-5508-470d-8b7a-b7da9d4706d4","Type":"ContainerDied","Data":"64bbe628906ffd7c485ec8cc71ede08aea8875194cb357d96894ab844be9e9f5"} Dec 03 06:47:40 crc kubenswrapper[4475]: I1203 
06:47:40.369875 4475 generic.go:334] "Generic (PLEG): container finished" podID="ece5cd8d-8d3a-48ab-b2dc-194339fb1ff2" containerID="1359b6dad0c2a395d812df37d975820ee838952a1570715f61f712b7761e7a9e" exitCode=0 Dec 03 06:47:40 crc kubenswrapper[4475]: I1203 06:47:40.369938 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-q8f5d" event={"ID":"ece5cd8d-8d3a-48ab-b2dc-194339fb1ff2","Type":"ContainerDied","Data":"1359b6dad0c2a395d812df37d975820ee838952a1570715f61f712b7761e7a9e"} Dec 03 06:47:40 crc kubenswrapper[4475]: I1203 06:47:40.370299 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-q8f5d" event={"ID":"ece5cd8d-8d3a-48ab-b2dc-194339fb1ff2","Type":"ContainerStarted","Data":"29ac8bcf2e100c2987b9604e671bfd38448d01af63b6105a0a6129c9df666a2d"} Dec 03 06:47:40 crc kubenswrapper[4475]: I1203 06:47:40.405376 4475 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-p2nkg"] Dec 03 06:47:40 crc kubenswrapper[4475]: I1203 06:47:40.660809 4475 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Dec 03 06:47:40 crc kubenswrapper[4475]: I1203 06:47:40.668658 4475 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-tnqbz"] Dec 03 06:47:40 crc kubenswrapper[4475]: E1203 06:47:40.675518 4475 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bd1b330e-1c1d-4525-9294-7be97dc1428b" containerName="pruner" Dec 03 06:47:40 crc kubenswrapper[4475]: I1203 06:47:40.675539 4475 state_mem.go:107] "Deleted CPUSet assignment" podUID="bd1b330e-1c1d-4525-9294-7be97dc1428b" containerName="pruner" Dec 03 06:47:40 crc kubenswrapper[4475]: I1203 06:47:40.675645 4475 memory_manager.go:354] "RemoveStaleState removing state" podUID="bd1b330e-1c1d-4525-9294-7be97dc1428b" containerName="pruner" Dec 03 06:47:40 crc kubenswrapper[4475]: I1203 06:47:40.676310 4475 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-tnqbz" Dec 03 06:47:40 crc kubenswrapper[4475]: I1203 06:47:40.677466 4475 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-tnqbz"] Dec 03 06:47:40 crc kubenswrapper[4475]: I1203 06:47:40.677763 4475 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/bd1b330e-1c1d-4525-9294-7be97dc1428b-kube-api-access\") pod \"bd1b330e-1c1d-4525-9294-7be97dc1428b\" (UID: \"bd1b330e-1c1d-4525-9294-7be97dc1428b\") " Dec 03 06:47:40 crc kubenswrapper[4475]: I1203 06:47:40.677839 4475 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/bd1b330e-1c1d-4525-9294-7be97dc1428b-kubelet-dir\") pod \"bd1b330e-1c1d-4525-9294-7be97dc1428b\" (UID: \"bd1b330e-1c1d-4525-9294-7be97dc1428b\") " Dec 03 06:47:40 crc kubenswrapper[4475]: I1203 06:47:40.678096 4475 operation_generator.go:803] 
UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/bd1b330e-1c1d-4525-9294-7be97dc1428b-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "bd1b330e-1c1d-4525-9294-7be97dc1428b" (UID: "bd1b330e-1c1d-4525-9294-7be97dc1428b"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 03 06:47:40 crc kubenswrapper[4475]: I1203 06:47:40.679484 4475 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-operators-dockercfg-ct8rh" Dec 03 06:47:40 crc kubenswrapper[4475]: I1203 06:47:40.689308 4475 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bd1b330e-1c1d-4525-9294-7be97dc1428b-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "bd1b330e-1c1d-4525-9294-7be97dc1428b" (UID: "bd1b330e-1c1d-4525-9294-7be97dc1428b"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 06:47:40 crc kubenswrapper[4475]: I1203 06:47:40.778934 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d75bb35d-29b7-4994-9bc6-756f2950d3fd-utilities\") pod \"redhat-operators-tnqbz\" (UID: \"d75bb35d-29b7-4994-9bc6-756f2950d3fd\") " pod="openshift-marketplace/redhat-operators-tnqbz" Dec 03 06:47:40 crc kubenswrapper[4475]: I1203 06:47:40.779065 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mhvzm\" (UniqueName: \"kubernetes.io/projected/d75bb35d-29b7-4994-9bc6-756f2950d3fd-kube-api-access-mhvzm\") pod \"redhat-operators-tnqbz\" (UID: \"d75bb35d-29b7-4994-9bc6-756f2950d3fd\") " pod="openshift-marketplace/redhat-operators-tnqbz" Dec 03 06:47:40 crc kubenswrapper[4475]: I1203 06:47:40.779091 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: 
\"kubernetes.io/empty-dir/d75bb35d-29b7-4994-9bc6-756f2950d3fd-catalog-content\") pod \"redhat-operators-tnqbz\" (UID: \"d75bb35d-29b7-4994-9bc6-756f2950d3fd\") " pod="openshift-marketplace/redhat-operators-tnqbz" Dec 03 06:47:40 crc kubenswrapper[4475]: I1203 06:47:40.779213 4475 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/bd1b330e-1c1d-4525-9294-7be97dc1428b-kubelet-dir\") on node \"crc\" DevicePath \"\"" Dec 03 06:47:40 crc kubenswrapper[4475]: I1203 06:47:40.779226 4475 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/bd1b330e-1c1d-4525-9294-7be97dc1428b-kube-api-access\") on node \"crc\" DevicePath \"\"" Dec 03 06:47:40 crc kubenswrapper[4475]: W1203 06:47:40.846758 4475 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5fe485a1_e14f_4c09_b5b9_f252bc42b7e8.slice/crio-5f80a8ee837fcebbb640b117495e822410e75e153eaf953cb82f49a182645abb WatchSource:0}: Error finding container 5f80a8ee837fcebbb640b117495e822410e75e153eaf953cb82f49a182645abb: Status 404 returned error can't find the container with id 5f80a8ee837fcebbb640b117495e822410e75e153eaf953cb82f49a182645abb Dec 03 06:47:40 crc kubenswrapper[4475]: I1203 06:47:40.858581 4475 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ingress/router-default-5444994796-wkrx4" Dec 03 06:47:40 crc kubenswrapper[4475]: I1203 06:47:40.860372 4475 patch_prober.go:28] interesting pod/router-default-5444994796-wkrx4 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 03 06:47:40 crc kubenswrapper[4475]: [-]has-synced failed: reason withheld Dec 03 06:47:40 crc kubenswrapper[4475]: [+]process-running ok Dec 03 06:47:40 crc kubenswrapper[4475]: healthz check failed Dec 03 
06:47:40 crc kubenswrapper[4475]: I1203 06:47:40.860420 4475 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-wkrx4" podUID="21a4d7e9-ea88-4f43-9d43-109df1bb4766" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 03 06:47:40 crc kubenswrapper[4475]: I1203 06:47:40.880890 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mhvzm\" (UniqueName: \"kubernetes.io/projected/d75bb35d-29b7-4994-9bc6-756f2950d3fd-kube-api-access-mhvzm\") pod \"redhat-operators-tnqbz\" (UID: \"d75bb35d-29b7-4994-9bc6-756f2950d3fd\") " pod="openshift-marketplace/redhat-operators-tnqbz" Dec 03 06:47:40 crc kubenswrapper[4475]: I1203 06:47:40.880922 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d75bb35d-29b7-4994-9bc6-756f2950d3fd-catalog-content\") pod \"redhat-operators-tnqbz\" (UID: \"d75bb35d-29b7-4994-9bc6-756f2950d3fd\") " pod="openshift-marketplace/redhat-operators-tnqbz" Dec 03 06:47:40 crc kubenswrapper[4475]: I1203 06:47:40.880994 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d75bb35d-29b7-4994-9bc6-756f2950d3fd-utilities\") pod \"redhat-operators-tnqbz\" (UID: \"d75bb35d-29b7-4994-9bc6-756f2950d3fd\") " pod="openshift-marketplace/redhat-operators-tnqbz" Dec 03 06:47:40 crc kubenswrapper[4475]: I1203 06:47:40.882171 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d75bb35d-29b7-4994-9bc6-756f2950d3fd-catalog-content\") pod \"redhat-operators-tnqbz\" (UID: \"d75bb35d-29b7-4994-9bc6-756f2950d3fd\") " pod="openshift-marketplace/redhat-operators-tnqbz" Dec 03 06:47:40 crc kubenswrapper[4475]: I1203 06:47:40.883123 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d75bb35d-29b7-4994-9bc6-756f2950d3fd-utilities\") pod \"redhat-operators-tnqbz\" (UID: \"d75bb35d-29b7-4994-9bc6-756f2950d3fd\") " pod="openshift-marketplace/redhat-operators-tnqbz" Dec 03 06:47:40 crc kubenswrapper[4475]: I1203 06:47:40.895045 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mhvzm\" (UniqueName: \"kubernetes.io/projected/d75bb35d-29b7-4994-9bc6-756f2950d3fd-kube-api-access-mhvzm\") pod \"redhat-operators-tnqbz\" (UID: \"d75bb35d-29b7-4994-9bc6-756f2950d3fd\") " pod="openshift-marketplace/redhat-operators-tnqbz" Dec 03 06:47:40 crc kubenswrapper[4475]: W1203 06:47:40.947055 4475 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3b6479f0_333b_4a96_9adf_2099afdc2447.slice/crio-93531bf859dc1a0bd39351d4fe90cfa5163604a4c62b051bbbec512670d0267a WatchSource:0}: Error finding container 93531bf859dc1a0bd39351d4fe90cfa5163604a4c62b051bbbec512670d0267a: Status 404 returned error can't find the container with id 93531bf859dc1a0bd39351d4fe90cfa5163604a4c62b051bbbec512670d0267a Dec 03 06:47:40 crc kubenswrapper[4475]: W1203 06:47:40.949083 4475 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9d751cbb_f2e2_430d_9754_c882a5e924a5.slice/crio-6be7b3edbe5fdaefa067447911f30fb7d859bcf52911a24596aa8c4781573710 WatchSource:0}: Error finding container 6be7b3edbe5fdaefa067447911f30fb7d859bcf52911a24596aa8c4781573710: Status 404 returned error can't find the container with id 6be7b3edbe5fdaefa067447911f30fb7d859bcf52911a24596aa8c4781573710 Dec 03 06:47:40 crc kubenswrapper[4475]: I1203 06:47:40.990061 4475 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-tnqbz" Dec 03 06:47:41 crc kubenswrapper[4475]: I1203 06:47:41.068073 4475 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-2lbjz"] Dec 03 06:47:41 crc kubenswrapper[4475]: I1203 06:47:41.069394 4475 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-2lbjz" Dec 03 06:47:41 crc kubenswrapper[4475]: I1203 06:47:41.075158 4475 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-2lbjz"] Dec 03 06:47:41 crc kubenswrapper[4475]: I1203 06:47:41.084271 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6ec591ee-4ce6-4e39-a38e-6f3d9dda0da5-catalog-content\") pod \"redhat-operators-2lbjz\" (UID: \"6ec591ee-4ce6-4e39-a38e-6f3d9dda0da5\") " pod="openshift-marketplace/redhat-operators-2lbjz" Dec 03 06:47:41 crc kubenswrapper[4475]: I1203 06:47:41.084416 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6ec591ee-4ce6-4e39-a38e-6f3d9dda0da5-utilities\") pod \"redhat-operators-2lbjz\" (UID: \"6ec591ee-4ce6-4e39-a38e-6f3d9dda0da5\") " pod="openshift-marketplace/redhat-operators-2lbjz" Dec 03 06:47:41 crc kubenswrapper[4475]: I1203 06:47:41.084576 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ptvfh\" (UniqueName: \"kubernetes.io/projected/6ec591ee-4ce6-4e39-a38e-6f3d9dda0da5-kube-api-access-ptvfh\") pod \"redhat-operators-2lbjz\" (UID: \"6ec591ee-4ce6-4e39-a38e-6f3d9dda0da5\") " pod="openshift-marketplace/redhat-operators-2lbjz" Dec 03 06:47:41 crc kubenswrapper[4475]: I1203 06:47:41.185996 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access-ptvfh\" (UniqueName: \"kubernetes.io/projected/6ec591ee-4ce6-4e39-a38e-6f3d9dda0da5-kube-api-access-ptvfh\") pod \"redhat-operators-2lbjz\" (UID: \"6ec591ee-4ce6-4e39-a38e-6f3d9dda0da5\") " pod="openshift-marketplace/redhat-operators-2lbjz" Dec 03 06:47:41 crc kubenswrapper[4475]: I1203 06:47:41.186178 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6ec591ee-4ce6-4e39-a38e-6f3d9dda0da5-catalog-content\") pod \"redhat-operators-2lbjz\" (UID: \"6ec591ee-4ce6-4e39-a38e-6f3d9dda0da5\") " pod="openshift-marketplace/redhat-operators-2lbjz" Dec 03 06:47:41 crc kubenswrapper[4475]: I1203 06:47:41.186312 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6ec591ee-4ce6-4e39-a38e-6f3d9dda0da5-utilities\") pod \"redhat-operators-2lbjz\" (UID: \"6ec591ee-4ce6-4e39-a38e-6f3d9dda0da5\") " pod="openshift-marketplace/redhat-operators-2lbjz" Dec 03 06:47:41 crc kubenswrapper[4475]: I1203 06:47:41.186823 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6ec591ee-4ce6-4e39-a38e-6f3d9dda0da5-catalog-content\") pod \"redhat-operators-2lbjz\" (UID: \"6ec591ee-4ce6-4e39-a38e-6f3d9dda0da5\") " pod="openshift-marketplace/redhat-operators-2lbjz" Dec 03 06:47:41 crc kubenswrapper[4475]: I1203 06:47:41.187378 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6ec591ee-4ce6-4e39-a38e-6f3d9dda0da5-utilities\") pod \"redhat-operators-2lbjz\" (UID: \"6ec591ee-4ce6-4e39-a38e-6f3d9dda0da5\") " pod="openshift-marketplace/redhat-operators-2lbjz" Dec 03 06:47:41 crc kubenswrapper[4475]: I1203 06:47:41.212554 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ptvfh\" (UniqueName: 
\"kubernetes.io/projected/6ec591ee-4ce6-4e39-a38e-6f3d9dda0da5-kube-api-access-ptvfh\") pod \"redhat-operators-2lbjz\" (UID: \"6ec591ee-4ce6-4e39-a38e-6f3d9dda0da5\") " pod="openshift-marketplace/redhat-operators-2lbjz" Dec 03 06:47:41 crc kubenswrapper[4475]: I1203 06:47:41.311444 4475 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-tnqbz"] Dec 03 06:47:41 crc kubenswrapper[4475]: I1203 06:47:41.368766 4475 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-apiserver/apiserver-76f77b778f-jf25k" Dec 03 06:47:41 crc kubenswrapper[4475]: I1203 06:47:41.374812 4475 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-apiserver/apiserver-76f77b778f-jf25k" Dec 03 06:47:41 crc kubenswrapper[4475]: W1203 06:47:41.376663 4475 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd75bb35d_29b7_4994_9bc6_756f2950d3fd.slice/crio-382354460bf86b5d7d60696354b8158e09287df82f5b8cd328c22b074ac5530d WatchSource:0}: Error finding container 382354460bf86b5d7d60696354b8158e09287df82f5b8cd328c22b074ac5530d: Status 404 returned error can't find the container with id 382354460bf86b5d7d60696354b8158e09287df82f5b8cd328c22b074ac5530d Dec 03 06:47:41 crc kubenswrapper[4475]: I1203 06:47:41.381192 4475 patch_prober.go:28] interesting pod/downloads-7954f5f757-dbjfd container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.14:8080/\": dial tcp 10.217.0.14:8080: connect: connection refused" start-of-body= Dec 03 06:47:41 crc kubenswrapper[4475]: I1203 06:47:41.381228 4475 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-dbjfd" podUID="c279294d-fe07-48d3-800d-7d73eba69c17" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.14:8080/\": dial tcp 10.217.0.14:8080: connect: connection 
refused" Dec 03 06:47:41 crc kubenswrapper[4475]: I1203 06:47:41.381200 4475 patch_prober.go:28] interesting pod/downloads-7954f5f757-dbjfd container/download-server namespace/openshift-console: Liveness probe status=failure output="Get \"http://10.217.0.14:8080/\": dial tcp 10.217.0.14:8080: connect: connection refused" start-of-body= Dec 03 06:47:41 crc kubenswrapper[4475]: I1203 06:47:41.381266 4475 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-console/downloads-7954f5f757-dbjfd" podUID="c279294d-fe07-48d3-800d-7d73eba69c17" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.14:8080/\": dial tcp 10.217.0.14:8080: connect: connection refused" Dec 03 06:47:41 crc kubenswrapper[4475]: I1203 06:47:41.401288 4475 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-2lbjz" Dec 03 06:47:41 crc kubenswrapper[4475]: I1203 06:47:41.402889 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"bd1b330e-1c1d-4525-9294-7be97dc1428b","Type":"ContainerDied","Data":"b3aa7b5e2f28ebdd1158f55936c4877a38d5fcc8fe38a9e28457e314acfb7225"} Dec 03 06:47:41 crc kubenswrapper[4475]: I1203 06:47:41.402917 4475 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b3aa7b5e2f28ebdd1158f55936c4877a38d5fcc8fe38a9e28457e314acfb7225" Dec 03 06:47:41 crc kubenswrapper[4475]: I1203 06:47:41.402912 4475 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Dec 03 06:47:41 crc kubenswrapper[4475]: I1203 06:47:41.412419 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" event={"ID":"9d751cbb-f2e2-430d-9754-c882a5e924a5","Type":"ContainerStarted","Data":"c4275d2acfd9a918d98b13dffa354984edcb2329db557b34ca3bbb71ea16813c"} Dec 03 06:47:41 crc kubenswrapper[4475]: I1203 06:47:41.412445 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" event={"ID":"9d751cbb-f2e2-430d-9754-c882a5e924a5","Type":"ContainerStarted","Data":"6be7b3edbe5fdaefa067447911f30fb7d859bcf52911a24596aa8c4781573710"} Dec 03 06:47:41 crc kubenswrapper[4475]: I1203 06:47:41.433125 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" event={"ID":"3b6479f0-333b-4a96-9adf-2099afdc2447","Type":"ContainerStarted","Data":"b9473e641057c93344142ceee1789b2dd3722c4b73b947065bb62935008c2ef4"} Dec 03 06:47:41 crc kubenswrapper[4475]: I1203 06:47:41.433156 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" event={"ID":"3b6479f0-333b-4a96-9adf-2099afdc2447","Type":"ContainerStarted","Data":"93531bf859dc1a0bd39351d4fe90cfa5163604a4c62b051bbbec512670d0267a"} Dec 03 06:47:41 crc kubenswrapper[4475]: I1203 06:47:41.433595 4475 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 03 06:47:41 crc kubenswrapper[4475]: I1203 06:47:41.454678 4475 generic.go:334] "Generic (PLEG): container finished" podID="f4657474-771f-4f94-a2e3-a8262e7ce1b6" containerID="2bb385c5810b96d703e5a99ba00b92d4afb3177e2a3c59740ee915f8d74b9f34" exitCode=0 Dec 03 06:47:41 crc kubenswrapper[4475]: I1203 06:47:41.455113 4475 kubelet.go:2453] "SyncLoop (PLEG): 
event for pod" pod="openshift-marketplace/redhat-marketplace-p2nkg" event={"ID":"f4657474-771f-4f94-a2e3-a8262e7ce1b6","Type":"ContainerDied","Data":"2bb385c5810b96d703e5a99ba00b92d4afb3177e2a3c59740ee915f8d74b9f34"} Dec 03 06:47:41 crc kubenswrapper[4475]: I1203 06:47:41.455143 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-p2nkg" event={"ID":"f4657474-771f-4f94-a2e3-a8262e7ce1b6","Type":"ContainerStarted","Data":"cfbd8717518c0d98cc36004a7cda9fa242cec8aae6093819788064dc9897ff96"} Dec 03 06:47:41 crc kubenswrapper[4475]: I1203 06:47:41.481618 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" event={"ID":"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8","Type":"ContainerStarted","Data":"772529b5f36350f1726cdce63cc042b3a8553f89c0fba72f468b66e5b9c270d4"} Dec 03 06:47:41 crc kubenswrapper[4475]: I1203 06:47:41.481673 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" event={"ID":"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8","Type":"ContainerStarted","Data":"5f80a8ee837fcebbb640b117495e822410e75e153eaf953cb82f49a182645abb"} Dec 03 06:47:41 crc kubenswrapper[4475]: I1203 06:47:41.861578 4475 patch_prober.go:28] interesting pod/router-default-5444994796-wkrx4 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 03 06:47:41 crc kubenswrapper[4475]: [-]has-synced failed: reason withheld Dec 03 06:47:41 crc kubenswrapper[4475]: [+]process-running ok Dec 03 06:47:41 crc kubenswrapper[4475]: healthz check failed Dec 03 06:47:41 crc kubenswrapper[4475]: I1203 06:47:41.861800 4475 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-wkrx4" podUID="21a4d7e9-ea88-4f43-9d43-109df1bb4766" containerName="router" 
probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 03 06:47:41 crc kubenswrapper[4475]: I1203 06:47:41.917600 4475 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-2lbjz"] Dec 03 06:47:42 crc kubenswrapper[4475]: I1203 06:47:42.520089 4475 generic.go:334] "Generic (PLEG): container finished" podID="d75bb35d-29b7-4994-9bc6-756f2950d3fd" containerID="829b0bb16a1b8e69d92eb455d12d02cc38946b63e5109556b2dbc8d04737bdff" exitCode=0 Dec 03 06:47:42 crc kubenswrapper[4475]: I1203 06:47:42.520767 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-tnqbz" event={"ID":"d75bb35d-29b7-4994-9bc6-756f2950d3fd","Type":"ContainerDied","Data":"829b0bb16a1b8e69d92eb455d12d02cc38946b63e5109556b2dbc8d04737bdff"} Dec 03 06:47:42 crc kubenswrapper[4475]: I1203 06:47:42.520791 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-tnqbz" event={"ID":"d75bb35d-29b7-4994-9bc6-756f2950d3fd","Type":"ContainerStarted","Data":"382354460bf86b5d7d60696354b8158e09287df82f5b8cd328c22b074ac5530d"} Dec 03 06:47:42 crc kubenswrapper[4475]: I1203 06:47:42.863066 4475 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-ingress/router-default-5444994796-wkrx4" Dec 03 06:47:42 crc kubenswrapper[4475]: I1203 06:47:42.866707 4475 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ingress/router-default-5444994796-wkrx4" Dec 03 06:47:43 crc kubenswrapper[4475]: I1203 06:47:43.394692 4475 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/revision-pruner-8-crc"] Dec 03 06:47:43 crc kubenswrapper[4475]: I1203 06:47:43.395304 4475 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Dec 03 06:47:43 crc kubenswrapper[4475]: I1203 06:47:43.397489 4475 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver"/"installer-sa-dockercfg-5pr6n" Dec 03 06:47:43 crc kubenswrapper[4475]: I1203 06:47:43.398069 4475 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver"/"kube-root-ca.crt" Dec 03 06:47:43 crc kubenswrapper[4475]: I1203 06:47:43.430151 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/9c73b77a-482f-4cfa-b3ed-149a5ed4f2d8-kubelet-dir\") pod \"revision-pruner-8-crc\" (UID: \"9c73b77a-482f-4cfa-b3ed-149a5ed4f2d8\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Dec 03 06:47:43 crc kubenswrapper[4475]: I1203 06:47:43.430248 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/9c73b77a-482f-4cfa-b3ed-149a5ed4f2d8-kube-api-access\") pod \"revision-pruner-8-crc\" (UID: \"9c73b77a-482f-4cfa-b3ed-149a5ed4f2d8\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Dec 03 06:47:43 crc kubenswrapper[4475]: I1203 06:47:43.433795 4475 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-8-crc"] Dec 03 06:47:43 crc kubenswrapper[4475]: I1203 06:47:43.530977 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/9c73b77a-482f-4cfa-b3ed-149a5ed4f2d8-kubelet-dir\") pod \"revision-pruner-8-crc\" (UID: \"9c73b77a-482f-4cfa-b3ed-149a5ed4f2d8\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Dec 03 06:47:43 crc kubenswrapper[4475]: I1203 06:47:43.531049 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: 
\"kubernetes.io/projected/9c73b77a-482f-4cfa-b3ed-149a5ed4f2d8-kube-api-access\") pod \"revision-pruner-8-crc\" (UID: \"9c73b77a-482f-4cfa-b3ed-149a5ed4f2d8\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Dec 03 06:47:43 crc kubenswrapper[4475]: I1203 06:47:43.531258 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/9c73b77a-482f-4cfa-b3ed-149a5ed4f2d8-kubelet-dir\") pod \"revision-pruner-8-crc\" (UID: \"9c73b77a-482f-4cfa-b3ed-149a5ed4f2d8\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Dec 03 06:47:43 crc kubenswrapper[4475]: I1203 06:47:43.544992 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/9c73b77a-482f-4cfa-b3ed-149a5ed4f2d8-kube-api-access\") pod \"revision-pruner-8-crc\" (UID: \"9c73b77a-482f-4cfa-b3ed-149a5ed4f2d8\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Dec 03 06:47:43 crc kubenswrapper[4475]: I1203 06:47:43.730374 4475 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Dec 03 06:47:46 crc kubenswrapper[4475]: I1203 06:47:46.596428 4475 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-dns/dns-default-685k2" Dec 03 06:47:47 crc kubenswrapper[4475]: W1203 06:47:47.914380 4475 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6ec591ee_4ce6_4e39_a38e_6f3d9dda0da5.slice/crio-19f0fc9d4d8fe91ce3cbde7e5750d2bce814ccb8e0f958f52b8773d8f79a93dc WatchSource:0}: Error finding container 19f0fc9d4d8fe91ce3cbde7e5750d2bce814ccb8e0f958f52b8773d8f79a93dc: Status 404 returned error can't find the container with id 19f0fc9d4d8fe91ce3cbde7e5750d2bce814ccb8e0f958f52b8773d8f79a93dc Dec 03 06:47:48 crc kubenswrapper[4475]: I1203 06:47:48.018135 4475 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29412405-wwr7n" Dec 03 06:47:48 crc kubenswrapper[4475]: I1203 06:47:48.122342 4475 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-8-crc"] Dec 03 06:47:48 crc kubenswrapper[4475]: W1203 06:47:48.158211 4475 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-pod9c73b77a_482f_4cfa_b3ed_149a5ed4f2d8.slice/crio-c0758fbdef702a5f240a5a6b8cbb7b521bbd0c2d97a5c1d437ea10a0e4072009 WatchSource:0}: Error finding container c0758fbdef702a5f240a5a6b8cbb7b521bbd0c2d97a5c1d437ea10a0e4072009: Status 404 returned error can't find the container with id c0758fbdef702a5f240a5a6b8cbb7b521bbd0c2d97a5c1d437ea10a0e4072009 Dec 03 06:47:48 crc kubenswrapper[4475]: I1203 06:47:48.197438 4475 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pk6mb\" (UniqueName: \"kubernetes.io/projected/83c4eef5-5508-470d-8b7a-b7da9d4706d4-kube-api-access-pk6mb\") pod 
\"83c4eef5-5508-470d-8b7a-b7da9d4706d4\" (UID: \"83c4eef5-5508-470d-8b7a-b7da9d4706d4\") " Dec 03 06:47:48 crc kubenswrapper[4475]: I1203 06:47:48.197706 4475 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/83c4eef5-5508-470d-8b7a-b7da9d4706d4-config-volume\") pod \"83c4eef5-5508-470d-8b7a-b7da9d4706d4\" (UID: \"83c4eef5-5508-470d-8b7a-b7da9d4706d4\") " Dec 03 06:47:48 crc kubenswrapper[4475]: I1203 06:47:48.197833 4475 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/83c4eef5-5508-470d-8b7a-b7da9d4706d4-secret-volume\") pod \"83c4eef5-5508-470d-8b7a-b7da9d4706d4\" (UID: \"83c4eef5-5508-470d-8b7a-b7da9d4706d4\") " Dec 03 06:47:48 crc kubenswrapper[4475]: I1203 06:47:48.198207 4475 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/83c4eef5-5508-470d-8b7a-b7da9d4706d4-config-volume" (OuterVolumeSpecName: "config-volume") pod "83c4eef5-5508-470d-8b7a-b7da9d4706d4" (UID: "83c4eef5-5508-470d-8b7a-b7da9d4706d4"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 06:47:48 crc kubenswrapper[4475]: I1203 06:47:48.199536 4475 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/83c4eef5-5508-470d-8b7a-b7da9d4706d4-config-volume\") on node \"crc\" DevicePath \"\"" Dec 03 06:47:48 crc kubenswrapper[4475]: I1203 06:47:48.203042 4475 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/83c4eef5-5508-470d-8b7a-b7da9d4706d4-kube-api-access-pk6mb" (OuterVolumeSpecName: "kube-api-access-pk6mb") pod "83c4eef5-5508-470d-8b7a-b7da9d4706d4" (UID: "83c4eef5-5508-470d-8b7a-b7da9d4706d4"). InnerVolumeSpecName "kube-api-access-pk6mb". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 06:47:48 crc kubenswrapper[4475]: I1203 06:47:48.203098 4475 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/83c4eef5-5508-470d-8b7a-b7da9d4706d4-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "83c4eef5-5508-470d-8b7a-b7da9d4706d4" (UID: "83c4eef5-5508-470d-8b7a-b7da9d4706d4"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 06:47:48 crc kubenswrapper[4475]: I1203 06:47:48.301347 4475 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/83c4eef5-5508-470d-8b7a-b7da9d4706d4-secret-volume\") on node \"crc\" DevicePath \"\"" Dec 03 06:47:48 crc kubenswrapper[4475]: I1203 06:47:48.301374 4475 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pk6mb\" (UniqueName: \"kubernetes.io/projected/83c4eef5-5508-470d-8b7a-b7da9d4706d4-kube-api-access-pk6mb\") on node \"crc\" DevicePath \"\"" Dec 03 06:47:48 crc kubenswrapper[4475]: I1203 06:47:48.575171 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"9c73b77a-482f-4cfa-b3ed-149a5ed4f2d8","Type":"ContainerStarted","Data":"924f68770d1a802163547a4e2ef290c615cba23b861028a613b85265c5ca7c32"} Dec 03 06:47:48 crc kubenswrapper[4475]: I1203 06:47:48.575212 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"9c73b77a-482f-4cfa-b3ed-149a5ed4f2d8","Type":"ContainerStarted","Data":"c0758fbdef702a5f240a5a6b8cbb7b521bbd0c2d97a5c1d437ea10a0e4072009"} Dec 03 06:47:48 crc kubenswrapper[4475]: I1203 06:47:48.577070 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29412405-wwr7n" 
event={"ID":"83c4eef5-5508-470d-8b7a-b7da9d4706d4","Type":"ContainerDied","Data":"41072a08398a16116e8aeb3f8cae52459d8c4d629f86ee71a567460fc26aae85"} Dec 03 06:47:48 crc kubenswrapper[4475]: I1203 06:47:48.577084 4475 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29412405-wwr7n" Dec 03 06:47:48 crc kubenswrapper[4475]: I1203 06:47:48.577097 4475 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="41072a08398a16116e8aeb3f8cae52459d8c4d629f86ee71a567460fc26aae85" Dec 03 06:47:48 crc kubenswrapper[4475]: I1203 06:47:48.578530 4475 generic.go:334] "Generic (PLEG): container finished" podID="6ec591ee-4ce6-4e39-a38e-6f3d9dda0da5" containerID="8fbd55fafe187e3174fd74b26fed1bd1e87930f0faa2dbd2cfd66bcf9d2bf939" exitCode=0 Dec 03 06:47:48 crc kubenswrapper[4475]: I1203 06:47:48.578563 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-2lbjz" event={"ID":"6ec591ee-4ce6-4e39-a38e-6f3d9dda0da5","Type":"ContainerDied","Data":"8fbd55fafe187e3174fd74b26fed1bd1e87930f0faa2dbd2cfd66bcf9d2bf939"} Dec 03 06:47:48 crc kubenswrapper[4475]: I1203 06:47:48.578584 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-2lbjz" event={"ID":"6ec591ee-4ce6-4e39-a38e-6f3d9dda0da5","Type":"ContainerStarted","Data":"19f0fc9d4d8fe91ce3cbde7e5750d2bce814ccb8e0f958f52b8773d8f79a93dc"} Dec 03 06:47:48 crc kubenswrapper[4475]: I1203 06:47:48.590904 4475 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/revision-pruner-8-crc" podStartSLOduration=5.590885836 podStartE2EDuration="5.590885836s" podCreationTimestamp="2025-12-03 06:47:43 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 06:47:48.584504728 +0000 UTC m=+153.389403062" 
watchObservedRunningTime="2025-12-03 06:47:48.590885836 +0000 UTC m=+153.395784190" Dec 03 06:47:49 crc kubenswrapper[4475]: I1203 06:47:49.587118 4475 generic.go:334] "Generic (PLEG): container finished" podID="9c73b77a-482f-4cfa-b3ed-149a5ed4f2d8" containerID="924f68770d1a802163547a4e2ef290c615cba23b861028a613b85265c5ca7c32" exitCode=0 Dec 03 06:47:49 crc kubenswrapper[4475]: I1203 06:47:49.587192 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"9c73b77a-482f-4cfa-b3ed-149a5ed4f2d8","Type":"ContainerDied","Data":"924f68770d1a802163547a4e2ef290c615cba23b861028a613b85265c5ca7c32"} Dec 03 06:47:50 crc kubenswrapper[4475]: I1203 06:47:50.312569 4475 patch_prober.go:28] interesting pod/console-f9d7485db-dbxhk container/console namespace/openshift-console: Startup probe status=failure output="Get \"https://10.217.0.6:8443/health\": dial tcp 10.217.0.6:8443: connect: connection refused" start-of-body= Dec 03 06:47:50 crc kubenswrapper[4475]: I1203 06:47:50.312662 4475 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-console/console-f9d7485db-dbxhk" podUID="09928a8e-a70b-4916-9ae2-4dbe952aa514" containerName="console" probeResult="failure" output="Get \"https://10.217.0.6:8443/health\": dial tcp 10.217.0.6:8443: connect: connection refused" Dec 03 06:47:51 crc kubenswrapper[4475]: I1203 06:47:51.389661 4475 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/downloads-7954f5f757-dbjfd" Dec 03 06:47:55 crc kubenswrapper[4475]: I1203 06:47:55.991222 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/7e9dd470-572a-4396-9be7-1a37e3c48977-metrics-certs\") pod \"network-metrics-daemon-hq2rn\" (UID: \"7e9dd470-572a-4396-9be7-1a37e3c48977\") " pod="openshift-multus/network-metrics-daemon-hq2rn" Dec 03 06:47:55 crc kubenswrapper[4475]: I1203 06:47:55.995491 4475 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/7e9dd470-572a-4396-9be7-1a37e3c48977-metrics-certs\") pod \"network-metrics-daemon-hq2rn\" (UID: \"7e9dd470-572a-4396-9be7-1a37e3c48977\") " pod="openshift-multus/network-metrics-daemon-hq2rn" Dec 03 06:47:55 crc kubenswrapper[4475]: I1203 06:47:55.998443 4475 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-hq2rn" Dec 03 06:47:56 crc kubenswrapper[4475]: I1203 06:47:56.545971 4475 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Dec 03 06:47:56 crc kubenswrapper[4475]: I1203 06:47:56.618566 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"9c73b77a-482f-4cfa-b3ed-149a5ed4f2d8","Type":"ContainerDied","Data":"c0758fbdef702a5f240a5a6b8cbb7b521bbd0c2d97a5c1d437ea10a0e4072009"} Dec 03 06:47:56 crc kubenswrapper[4475]: I1203 06:47:56.618598 4475 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c0758fbdef702a5f240a5a6b8cbb7b521bbd0c2d97a5c1d437ea10a0e4072009" Dec 03 06:47:56 crc kubenswrapper[4475]: I1203 06:47:56.618598 4475 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Dec 03 06:47:56 crc kubenswrapper[4475]: I1203 06:47:56.701429 4475 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/9c73b77a-482f-4cfa-b3ed-149a5ed4f2d8-kubelet-dir\") pod \"9c73b77a-482f-4cfa-b3ed-149a5ed4f2d8\" (UID: \"9c73b77a-482f-4cfa-b3ed-149a5ed4f2d8\") " Dec 03 06:47:56 crc kubenswrapper[4475]: I1203 06:47:56.701676 4475 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/9c73b77a-482f-4cfa-b3ed-149a5ed4f2d8-kube-api-access\") pod \"9c73b77a-482f-4cfa-b3ed-149a5ed4f2d8\" (UID: \"9c73b77a-482f-4cfa-b3ed-149a5ed4f2d8\") " Dec 03 06:47:56 crc kubenswrapper[4475]: I1203 06:47:56.701531 4475 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/9c73b77a-482f-4cfa-b3ed-149a5ed4f2d8-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "9c73b77a-482f-4cfa-b3ed-149a5ed4f2d8" (UID: "9c73b77a-482f-4cfa-b3ed-149a5ed4f2d8"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 03 06:47:56 crc kubenswrapper[4475]: I1203 06:47:56.701918 4475 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/9c73b77a-482f-4cfa-b3ed-149a5ed4f2d8-kubelet-dir\") on node \"crc\" DevicePath \"\"" Dec 03 06:47:56 crc kubenswrapper[4475]: I1203 06:47:56.707131 4475 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9c73b77a-482f-4cfa-b3ed-149a5ed4f2d8-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "9c73b77a-482f-4cfa-b3ed-149a5ed4f2d8" (UID: "9c73b77a-482f-4cfa-b3ed-149a5ed4f2d8"). InnerVolumeSpecName "kube-api-access". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 06:47:56 crc kubenswrapper[4475]: I1203 06:47:56.803508 4475 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/9c73b77a-482f-4cfa-b3ed-149a5ed4f2d8-kube-api-access\") on node \"crc\" DevicePath \"\"" Dec 03 06:47:58 crc kubenswrapper[4475]: I1203 06:47:58.077228 4475 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-image-registry/image-registry-697d97f7c8-dcjv5" Dec 03 06:47:58 crc kubenswrapper[4475]: I1203 06:47:58.933559 4475 patch_prober.go:28] interesting pod/machine-config-daemon-tjbzg container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 03 06:47:58 crc kubenswrapper[4475]: I1203 06:47:58.933610 4475 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-tjbzg" podUID="91aee7be-4a52-4598-803f-2deebe0674de" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 03 06:48:00 crc kubenswrapper[4475]: I1203 06:48:00.315215 4475 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-f9d7485db-dbxhk" Dec 03 06:48:00 crc kubenswrapper[4475]: I1203 06:48:00.320350 4475 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-f9d7485db-dbxhk" Dec 03 06:48:01 crc kubenswrapper[4475]: I1203 06:48:01.946802 4475 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-hq2rn"] Dec 03 06:48:02 crc kubenswrapper[4475]: I1203 06:48:02.647681 4475 generic.go:334] "Generic (PLEG): container finished" podID="5d1dbc1b-9217-4212-bbde-d79dd2ec15f6" 
containerID="5594cd78f198d426469940e1aad26bfe7a94ebcce6b5b719d156c88379107ebd" exitCode=0 Dec 03 06:48:02 crc kubenswrapper[4475]: I1203 06:48:02.647751 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-kp6xz" event={"ID":"5d1dbc1b-9217-4212-bbde-d79dd2ec15f6","Type":"ContainerDied","Data":"5594cd78f198d426469940e1aad26bfe7a94ebcce6b5b719d156c88379107ebd"} Dec 03 06:48:02 crc kubenswrapper[4475]: I1203 06:48:02.649573 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-hq2rn" event={"ID":"7e9dd470-572a-4396-9be7-1a37e3c48977","Type":"ContainerStarted","Data":"5ae0ebabcd63aa19a00119c11998c7f469fa6cb700af37782f4cdfebe835ebb5"} Dec 03 06:48:02 crc kubenswrapper[4475]: I1203 06:48:02.649615 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-hq2rn" event={"ID":"7e9dd470-572a-4396-9be7-1a37e3c48977","Type":"ContainerStarted","Data":"8d40eead47ae8af3ca5b944bc49c12c40e636ad536c618ba623f170d4d61964b"} Dec 03 06:48:02 crc kubenswrapper[4475]: I1203 06:48:02.649626 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-hq2rn" event={"ID":"7e9dd470-572a-4396-9be7-1a37e3c48977","Type":"ContainerStarted","Data":"c87fc5bd18d6101e7e8955070eca83c3acec430e5f92d12ae19401c2ec59a285"} Dec 03 06:48:02 crc kubenswrapper[4475]: I1203 06:48:02.651664 4475 generic.go:334] "Generic (PLEG): container finished" podID="a8d946f2-23cd-4863-8105-06d4ea2e1205" containerID="0b59791d280c73ec1e2d0f64516474311f3135d3748673906d325610e352bbeb" exitCode=0 Dec 03 06:48:02 crc kubenswrapper[4475]: I1203 06:48:02.651701 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-s48nx" event={"ID":"a8d946f2-23cd-4863-8105-06d4ea2e1205","Type":"ContainerDied","Data":"0b59791d280c73ec1e2d0f64516474311f3135d3748673906d325610e352bbeb"} Dec 03 06:48:02 crc kubenswrapper[4475]: 
I1203 06:48:02.656267 4475 generic.go:334] "Generic (PLEG): container finished" podID="6ec591ee-4ce6-4e39-a38e-6f3d9dda0da5" containerID="6f66b0d47611424cad29587d4f111520801a1cac0b3352c2099eead7a78c35f2" exitCode=0 Dec 03 06:48:02 crc kubenswrapper[4475]: I1203 06:48:02.656333 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-2lbjz" event={"ID":"6ec591ee-4ce6-4e39-a38e-6f3d9dda0da5","Type":"ContainerDied","Data":"6f66b0d47611424cad29587d4f111520801a1cac0b3352c2099eead7a78c35f2"} Dec 03 06:48:02 crc kubenswrapper[4475]: I1203 06:48:02.663001 4475 generic.go:334] "Generic (PLEG): container finished" podID="ece5cd8d-8d3a-48ab-b2dc-194339fb1ff2" containerID="ff0efc5306a44c6d5fe6fd24034d75680d489ca6a41761a6f4233651b7558ac5" exitCode=0 Dec 03 06:48:02 crc kubenswrapper[4475]: I1203 06:48:02.663231 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-q8f5d" event={"ID":"ece5cd8d-8d3a-48ab-b2dc-194339fb1ff2","Type":"ContainerDied","Data":"ff0efc5306a44c6d5fe6fd24034d75680d489ca6a41761a6f4233651b7558ac5"} Dec 03 06:48:02 crc kubenswrapper[4475]: I1203 06:48:02.666731 4475 generic.go:334] "Generic (PLEG): container finished" podID="c6ab24e6-93f2-46dd-aace-3de3344bd9f1" containerID="bc032f563674d26479c49ddf885f413a7145e13c0cd90e23d98184c5000db102" exitCode=0 Dec 03 06:48:02 crc kubenswrapper[4475]: I1203 06:48:02.666786 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-xsp8w" event={"ID":"c6ab24e6-93f2-46dd-aace-3de3344bd9f1","Type":"ContainerDied","Data":"bc032f563674d26479c49ddf885f413a7145e13c0cd90e23d98184c5000db102"} Dec 03 06:48:02 crc kubenswrapper[4475]: I1203 06:48:02.675595 4475 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/network-metrics-daemon-hq2rn" podStartSLOduration=148.67558371 podStartE2EDuration="2m28.67558371s" podCreationTimestamp="2025-12-03 06:45:34 +0000 UTC" 
firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 06:48:02.670661267 +0000 UTC m=+167.475559601" watchObservedRunningTime="2025-12-03 06:48:02.67558371 +0000 UTC m=+167.480482045" Dec 03 06:48:02 crc kubenswrapper[4475]: I1203 06:48:02.675721 4475 generic.go:334] "Generic (PLEG): container finished" podID="30be3012-ac26-4a64-b650-66174f25549a" containerID="9ad61fff554a0d681e164ec5f089f8c1fc3b00b84c4dc35c0bfc134a4c03a1c1" exitCode=0 Dec 03 06:48:02 crc kubenswrapper[4475]: I1203 06:48:02.675767 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-qkcc8" event={"ID":"30be3012-ac26-4a64-b650-66174f25549a","Type":"ContainerDied","Data":"9ad61fff554a0d681e164ec5f089f8c1fc3b00b84c4dc35c0bfc134a4c03a1c1"} Dec 03 06:48:02 crc kubenswrapper[4475]: I1203 06:48:02.681592 4475 generic.go:334] "Generic (PLEG): container finished" podID="d75bb35d-29b7-4994-9bc6-756f2950d3fd" containerID="df9e540befd0b1960aa998db8253a8f0e2ded361e5e3177531a13be09d8b6a49" exitCode=0 Dec 03 06:48:02 crc kubenswrapper[4475]: I1203 06:48:02.681643 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-tnqbz" event={"ID":"d75bb35d-29b7-4994-9bc6-756f2950d3fd","Type":"ContainerDied","Data":"df9e540befd0b1960aa998db8253a8f0e2ded361e5e3177531a13be09d8b6a49"} Dec 03 06:48:02 crc kubenswrapper[4475]: I1203 06:48:02.689614 4475 generic.go:334] "Generic (PLEG): container finished" podID="f4657474-771f-4f94-a2e3-a8262e7ce1b6" containerID="5a7a6e30abfd7603cddf3f7637a4115b4ea84db98b903d9e3e4c45704d0221c5" exitCode=0 Dec 03 06:48:02 crc kubenswrapper[4475]: I1203 06:48:02.689639 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-p2nkg" event={"ID":"f4657474-771f-4f94-a2e3-a8262e7ce1b6","Type":"ContainerDied","Data":"5a7a6e30abfd7603cddf3f7637a4115b4ea84db98b903d9e3e4c45704d0221c5"} Dec 03 
06:48:03 crc kubenswrapper[4475]: I1203 06:48:03.696081 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-xsp8w" event={"ID":"c6ab24e6-93f2-46dd-aace-3de3344bd9f1","Type":"ContainerStarted","Data":"7bd9ce7df11ad541da68e53078663b52825e0d135eba8b3beba4f370abc449e8"} Dec 03 06:48:03 crc kubenswrapper[4475]: I1203 06:48:03.698441 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-qkcc8" event={"ID":"30be3012-ac26-4a64-b650-66174f25549a","Type":"ContainerStarted","Data":"16f146b802ca1c7d3fd614b2515a8ab000b9cdd7dc7d787e5218a38085f7a002"} Dec 03 06:48:03 crc kubenswrapper[4475]: I1203 06:48:03.700489 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-tnqbz" event={"ID":"d75bb35d-29b7-4994-9bc6-756f2950d3fd","Type":"ContainerStarted","Data":"3e244226b044d928f00d8f988546d19f2c95eb36fe6bbf9807f072f51a9fde21"} Dec 03 06:48:03 crc kubenswrapper[4475]: I1203 06:48:03.702196 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-p2nkg" event={"ID":"f4657474-771f-4f94-a2e3-a8262e7ce1b6","Type":"ContainerStarted","Data":"ad030b1cea4b9f6b7469b77419887176b0febedb4ba941505623b42ec24cf00b"} Dec 03 06:48:03 crc kubenswrapper[4475]: I1203 06:48:03.703552 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-s48nx" event={"ID":"a8d946f2-23cd-4863-8105-06d4ea2e1205","Type":"ContainerStarted","Data":"3e69a7eed1b20726d97f39a2d9ccbe6274cf3ef468240661860e77916f251718"} Dec 03 06:48:03 crc kubenswrapper[4475]: I1203 06:48:03.705077 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-kp6xz" event={"ID":"5d1dbc1b-9217-4212-bbde-d79dd2ec15f6","Type":"ContainerStarted","Data":"d4d46ae2f3ca7408bd658103f4eabc1b33cfae316503192936fafad488c70128"} Dec 03 06:48:03 crc kubenswrapper[4475]: I1203 06:48:03.706798 
4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-2lbjz" event={"ID":"6ec591ee-4ce6-4e39-a38e-6f3d9dda0da5","Type":"ContainerStarted","Data":"ea60e28296136ce161a463dcb50340c6783ae15c4180e5fae205c341c55bc8ce"} Dec 03 06:48:03 crc kubenswrapper[4475]: I1203 06:48:03.710234 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-q8f5d" event={"ID":"ece5cd8d-8d3a-48ab-b2dc-194339fb1ff2","Type":"ContainerStarted","Data":"68b043e0c00f78036582cc13706337b074ca4719eb6d07684ef40e9184cba405"} Dec 03 06:48:03 crc kubenswrapper[4475]: I1203 06:48:03.715174 4475 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-xsp8w" podStartSLOduration=1.7759269199999999 podStartE2EDuration="26.715165842s" podCreationTimestamp="2025-12-03 06:47:37 +0000 UTC" firstStartedPulling="2025-12-03 06:47:38.308472303 +0000 UTC m=+143.113370638" lastFinishedPulling="2025-12-03 06:48:03.247711225 +0000 UTC m=+168.052609560" observedRunningTime="2025-12-03 06:48:03.713074815 +0000 UTC m=+168.517973149" watchObservedRunningTime="2025-12-03 06:48:03.715165842 +0000 UTC m=+168.520064177" Dec 03 06:48:03 crc kubenswrapper[4475]: I1203 06:48:03.730191 4475 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-p2nkg" podStartSLOduration=2.871539675 podStartE2EDuration="24.730182384s" podCreationTimestamp="2025-12-03 06:47:39 +0000 UTC" firstStartedPulling="2025-12-03 06:47:41.512624419 +0000 UTC m=+146.317522753" lastFinishedPulling="2025-12-03 06:48:03.371267128 +0000 UTC m=+168.176165462" observedRunningTime="2025-12-03 06:48:03.727884017 +0000 UTC m=+168.532782351" watchObservedRunningTime="2025-12-03 06:48:03.730182384 +0000 UTC m=+168.535080718" Dec 03 06:48:03 crc kubenswrapper[4475]: I1203 06:48:03.745082 4475 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openshift-marketplace/redhat-operators-tnqbz" podStartSLOduration=8.139021775 podStartE2EDuration="23.745073199s" podCreationTimestamp="2025-12-03 06:47:40 +0000 UTC" firstStartedPulling="2025-12-03 06:47:47.567667419 +0000 UTC m=+152.372565744" lastFinishedPulling="2025-12-03 06:48:03.173718834 +0000 UTC m=+167.978617168" observedRunningTime="2025-12-03 06:48:03.744363439 +0000 UTC m=+168.549261773" watchObservedRunningTime="2025-12-03 06:48:03.745073199 +0000 UTC m=+168.549971533" Dec 03 06:48:03 crc kubenswrapper[4475]: I1203 06:48:03.760825 4475 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-kp6xz" podStartSLOduration=2.902955727 podStartE2EDuration="26.760818035s" podCreationTimestamp="2025-12-03 06:47:37 +0000 UTC" firstStartedPulling="2025-12-03 06:47:39.316300699 +0000 UTC m=+144.121199033" lastFinishedPulling="2025-12-03 06:48:03.174163006 +0000 UTC m=+167.979061341" observedRunningTime="2025-12-03 06:48:03.758802968 +0000 UTC m=+168.563701303" watchObservedRunningTime="2025-12-03 06:48:03.760818035 +0000 UTC m=+168.565716369" Dec 03 06:48:03 crc kubenswrapper[4475]: I1203 06:48:03.777123 4475 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-s48nx" podStartSLOduration=2.984230144 podStartE2EDuration="26.777107772s" podCreationTimestamp="2025-12-03 06:47:37 +0000 UTC" firstStartedPulling="2025-12-03 06:47:39.348856348 +0000 UTC m=+144.153754681" lastFinishedPulling="2025-12-03 06:48:03.141733985 +0000 UTC m=+167.946632309" observedRunningTime="2025-12-03 06:48:03.775998784 +0000 UTC m=+168.580897118" watchObservedRunningTime="2025-12-03 06:48:03.777107772 +0000 UTC m=+168.582006105" Dec 03 06:48:03 crc kubenswrapper[4475]: I1203 06:48:03.819837 4475 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-2lbjz" podStartSLOduration=8.193335734 
podStartE2EDuration="22.819821257s" podCreationTimestamp="2025-12-03 06:47:41 +0000 UTC" firstStartedPulling="2025-12-03 06:47:48.581069761 +0000 UTC m=+153.385968095" lastFinishedPulling="2025-12-03 06:48:03.207555283 +0000 UTC m=+168.012453618" observedRunningTime="2025-12-03 06:48:03.819020316 +0000 UTC m=+168.623918650" watchObservedRunningTime="2025-12-03 06:48:03.819821257 +0000 UTC m=+168.624719591" Dec 03 06:48:03 crc kubenswrapper[4475]: I1203 06:48:03.821257 4475 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-qkcc8" podStartSLOduration=2.923977673 podStartE2EDuration="26.821251086s" podCreationTimestamp="2025-12-03 06:47:37 +0000 UTC" firstStartedPulling="2025-12-03 06:47:39.3408797 +0000 UTC m=+144.145778034" lastFinishedPulling="2025-12-03 06:48:03.238153113 +0000 UTC m=+168.043051447" observedRunningTime="2025-12-03 06:48:03.800260049 +0000 UTC m=+168.605158383" watchObservedRunningTime="2025-12-03 06:48:03.821251086 +0000 UTC m=+168.626149420" Dec 03 06:48:07 crc kubenswrapper[4475]: I1203 06:48:07.592198 4475 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-xsp8w" Dec 03 06:48:07 crc kubenswrapper[4475]: I1203 06:48:07.592410 4475 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-xsp8w" Dec 03 06:48:07 crc kubenswrapper[4475]: I1203 06:48:07.746898 4475 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-xsp8w" Dec 03 06:48:07 crc kubenswrapper[4475]: I1203 06:48:07.760403 4475 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-q8f5d" podStartSLOduration=5.9617213079999996 podStartE2EDuration="28.760390985s" podCreationTimestamp="2025-12-03 06:47:39 +0000 UTC" firstStartedPulling="2025-12-03 06:47:40.37212076 +0000 UTC 
m=+145.177019094" lastFinishedPulling="2025-12-03 06:48:03.170790436 +0000 UTC m=+167.975688771" observedRunningTime="2025-12-03 06:48:03.841003432 +0000 UTC m=+168.645901765" watchObservedRunningTime="2025-12-03 06:48:07.760390985 +0000 UTC m=+172.565289319" Dec 03 06:48:07 crc kubenswrapper[4475]: I1203 06:48:07.807491 4475 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-qkcc8" Dec 03 06:48:07 crc kubenswrapper[4475]: I1203 06:48:07.807538 4475 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-qkcc8" Dec 03 06:48:07 crc kubenswrapper[4475]: I1203 06:48:07.832230 4475 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-qkcc8" Dec 03 06:48:08 crc kubenswrapper[4475]: I1203 06:48:08.024708 4475 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-kp6xz" Dec 03 06:48:08 crc kubenswrapper[4475]: I1203 06:48:08.024752 4475 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-kp6xz" Dec 03 06:48:08 crc kubenswrapper[4475]: I1203 06:48:08.050579 4475 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-kp6xz" Dec 03 06:48:08 crc kubenswrapper[4475]: I1203 06:48:08.186906 4475 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-s48nx" Dec 03 06:48:08 crc kubenswrapper[4475]: I1203 06:48:08.187589 4475 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-s48nx" Dec 03 06:48:08 crc kubenswrapper[4475]: I1203 06:48:08.212736 4475 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-s48nx" Dec 03 06:48:08 crc 
kubenswrapper[4475]: I1203 06:48:08.756224 4475 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-qkcc8" Dec 03 06:48:08 crc kubenswrapper[4475]: I1203 06:48:08.756306 4475 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-kp6xz" Dec 03 06:48:08 crc kubenswrapper[4475]: I1203 06:48:08.756904 4475 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-s48nx" Dec 03 06:48:09 crc kubenswrapper[4475]: I1203 06:48:09.600044 4475 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-q8f5d" Dec 03 06:48:09 crc kubenswrapper[4475]: I1203 06:48:09.600255 4475 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-q8f5d" Dec 03 06:48:09 crc kubenswrapper[4475]: I1203 06:48:09.626433 4475 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-q8f5d" Dec 03 06:48:09 crc kubenswrapper[4475]: I1203 06:48:09.668491 4475 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-kp6xz"] Dec 03 06:48:09 crc kubenswrapper[4475]: I1203 06:48:09.759009 4475 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-q8f5d" Dec 03 06:48:10 crc kubenswrapper[4475]: I1203 06:48:10.027569 4475 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-p2nkg" Dec 03 06:48:10 crc kubenswrapper[4475]: I1203 06:48:10.027608 4475 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-p2nkg" Dec 03 06:48:10 crc kubenswrapper[4475]: I1203 06:48:10.054551 4475 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" 
pod="openshift-marketplace/redhat-marketplace-p2nkg" Dec 03 06:48:10 crc kubenswrapper[4475]: I1203 06:48:10.267679 4475 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-s48nx"] Dec 03 06:48:10 crc kubenswrapper[4475]: I1203 06:48:10.743018 4475 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-kp6xz" podUID="5d1dbc1b-9217-4212-bbde-d79dd2ec15f6" containerName="registry-server" containerID="cri-o://d4d46ae2f3ca7408bd658103f4eabc1b33cfae316503192936fafad488c70128" gracePeriod=2 Dec 03 06:48:10 crc kubenswrapper[4475]: I1203 06:48:10.743850 4475 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-s48nx" podUID="a8d946f2-23cd-4863-8105-06d4ea2e1205" containerName="registry-server" containerID="cri-o://3e69a7eed1b20726d97f39a2d9ccbe6274cf3ef468240661860e77916f251718" gracePeriod=2 Dec 03 06:48:10 crc kubenswrapper[4475]: I1203 06:48:10.778928 4475 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-p2nkg" Dec 03 06:48:10 crc kubenswrapper[4475]: I1203 06:48:10.991052 4475 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-tnqbz" Dec 03 06:48:10 crc kubenswrapper[4475]: I1203 06:48:10.991084 4475 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-tnqbz" Dec 03 06:48:11 crc kubenswrapper[4475]: I1203 06:48:11.029665 4475 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-tnqbz" Dec 03 06:48:11 crc kubenswrapper[4475]: I1203 06:48:11.166699 4475 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-kp6xz" Dec 03 06:48:11 crc kubenswrapper[4475]: I1203 06:48:11.201813 4475 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-s48nx" Dec 03 06:48:11 crc kubenswrapper[4475]: I1203 06:48:11.266705 4475 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5d1dbc1b-9217-4212-bbde-d79dd2ec15f6-catalog-content\") pod \"5d1dbc1b-9217-4212-bbde-d79dd2ec15f6\" (UID: \"5d1dbc1b-9217-4212-bbde-d79dd2ec15f6\") " Dec 03 06:48:11 crc kubenswrapper[4475]: I1203 06:48:11.266800 4475 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5d1dbc1b-9217-4212-bbde-d79dd2ec15f6-utilities\") pod \"5d1dbc1b-9217-4212-bbde-d79dd2ec15f6\" (UID: \"5d1dbc1b-9217-4212-bbde-d79dd2ec15f6\") " Dec 03 06:48:11 crc kubenswrapper[4475]: I1203 06:48:11.266876 4475 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-q6nql\" (UniqueName: \"kubernetes.io/projected/5d1dbc1b-9217-4212-bbde-d79dd2ec15f6-kube-api-access-q6nql\") pod \"5d1dbc1b-9217-4212-bbde-d79dd2ec15f6\" (UID: \"5d1dbc1b-9217-4212-bbde-d79dd2ec15f6\") " Dec 03 06:48:11 crc kubenswrapper[4475]: I1203 06:48:11.267293 4475 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5d1dbc1b-9217-4212-bbde-d79dd2ec15f6-utilities" (OuterVolumeSpecName: "utilities") pod "5d1dbc1b-9217-4212-bbde-d79dd2ec15f6" (UID: "5d1dbc1b-9217-4212-bbde-d79dd2ec15f6"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 06:48:11 crc kubenswrapper[4475]: I1203 06:48:11.267793 4475 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5d1dbc1b-9217-4212-bbde-d79dd2ec15f6-utilities\") on node \"crc\" DevicePath \"\"" Dec 03 06:48:11 crc kubenswrapper[4475]: I1203 06:48:11.270889 4475 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5d1dbc1b-9217-4212-bbde-d79dd2ec15f6-kube-api-access-q6nql" (OuterVolumeSpecName: "kube-api-access-q6nql") pod "5d1dbc1b-9217-4212-bbde-d79dd2ec15f6" (UID: "5d1dbc1b-9217-4212-bbde-d79dd2ec15f6"). InnerVolumeSpecName "kube-api-access-q6nql". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 06:48:11 crc kubenswrapper[4475]: I1203 06:48:11.301676 4475 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5d1dbc1b-9217-4212-bbde-d79dd2ec15f6-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "5d1dbc1b-9217-4212-bbde-d79dd2ec15f6" (UID: "5d1dbc1b-9217-4212-bbde-d79dd2ec15f6"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 06:48:11 crc kubenswrapper[4475]: I1203 06:48:11.368145 4475 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a8d946f2-23cd-4863-8105-06d4ea2e1205-catalog-content\") pod \"a8d946f2-23cd-4863-8105-06d4ea2e1205\" (UID: \"a8d946f2-23cd-4863-8105-06d4ea2e1205\") " Dec 03 06:48:11 crc kubenswrapper[4475]: I1203 06:48:11.368220 4475 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-s6wrm\" (UniqueName: \"kubernetes.io/projected/a8d946f2-23cd-4863-8105-06d4ea2e1205-kube-api-access-s6wrm\") pod \"a8d946f2-23cd-4863-8105-06d4ea2e1205\" (UID: \"a8d946f2-23cd-4863-8105-06d4ea2e1205\") " Dec 03 06:48:11 crc kubenswrapper[4475]: I1203 06:48:11.368255 4475 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a8d946f2-23cd-4863-8105-06d4ea2e1205-utilities\") pod \"a8d946f2-23cd-4863-8105-06d4ea2e1205\" (UID: \"a8d946f2-23cd-4863-8105-06d4ea2e1205\") " Dec 03 06:48:11 crc kubenswrapper[4475]: I1203 06:48:11.368505 4475 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5d1dbc1b-9217-4212-bbde-d79dd2ec15f6-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 03 06:48:11 crc kubenswrapper[4475]: I1203 06:48:11.368521 4475 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-q6nql\" (UniqueName: \"kubernetes.io/projected/5d1dbc1b-9217-4212-bbde-d79dd2ec15f6-kube-api-access-q6nql\") on node \"crc\" DevicePath \"\"" Dec 03 06:48:11 crc kubenswrapper[4475]: I1203 06:48:11.368843 4475 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a8d946f2-23cd-4863-8105-06d4ea2e1205-utilities" (OuterVolumeSpecName: "utilities") pod "a8d946f2-23cd-4863-8105-06d4ea2e1205" (UID: 
"a8d946f2-23cd-4863-8105-06d4ea2e1205"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 06:48:11 crc kubenswrapper[4475]: I1203 06:48:11.370371 4475 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a8d946f2-23cd-4863-8105-06d4ea2e1205-kube-api-access-s6wrm" (OuterVolumeSpecName: "kube-api-access-s6wrm") pod "a8d946f2-23cd-4863-8105-06d4ea2e1205" (UID: "a8d946f2-23cd-4863-8105-06d4ea2e1205"). InnerVolumeSpecName "kube-api-access-s6wrm". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 06:48:11 crc kubenswrapper[4475]: I1203 06:48:11.402316 4475 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-2lbjz" Dec 03 06:48:11 crc kubenswrapper[4475]: I1203 06:48:11.402944 4475 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-2lbjz" Dec 03 06:48:11 crc kubenswrapper[4475]: I1203 06:48:11.405417 4475 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a8d946f2-23cd-4863-8105-06d4ea2e1205-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "a8d946f2-23cd-4863-8105-06d4ea2e1205" (UID: "a8d946f2-23cd-4863-8105-06d4ea2e1205"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 06:48:11 crc kubenswrapper[4475]: I1203 06:48:11.429334 4475 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-2lbjz" Dec 03 06:48:11 crc kubenswrapper[4475]: I1203 06:48:11.470167 4475 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-s6wrm\" (UniqueName: \"kubernetes.io/projected/a8d946f2-23cd-4863-8105-06d4ea2e1205-kube-api-access-s6wrm\") on node \"crc\" DevicePath \"\"" Dec 03 06:48:11 crc kubenswrapper[4475]: I1203 06:48:11.470189 4475 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a8d946f2-23cd-4863-8105-06d4ea2e1205-utilities\") on node \"crc\" DevicePath \"\"" Dec 03 06:48:11 crc kubenswrapper[4475]: I1203 06:48:11.470199 4475 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a8d946f2-23cd-4863-8105-06d4ea2e1205-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 03 06:48:11 crc kubenswrapper[4475]: I1203 06:48:11.520626 4475 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-6bm4d" Dec 03 06:48:11 crc kubenswrapper[4475]: I1203 06:48:11.749166 4475 generic.go:334] "Generic (PLEG): container finished" podID="5d1dbc1b-9217-4212-bbde-d79dd2ec15f6" containerID="d4d46ae2f3ca7408bd658103f4eabc1b33cfae316503192936fafad488c70128" exitCode=0 Dec 03 06:48:11 crc kubenswrapper[4475]: I1203 06:48:11.749211 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-kp6xz" event={"ID":"5d1dbc1b-9217-4212-bbde-d79dd2ec15f6","Type":"ContainerDied","Data":"d4d46ae2f3ca7408bd658103f4eabc1b33cfae316503192936fafad488c70128"} Dec 03 06:48:11 crc kubenswrapper[4475]: I1203 06:48:11.749233 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/community-operators-kp6xz" event={"ID":"5d1dbc1b-9217-4212-bbde-d79dd2ec15f6","Type":"ContainerDied","Data":"6f6a294675626cb6bb2bca80f80f6fc943d11e30332b5d14924fd8de4ade3307"} Dec 03 06:48:11 crc kubenswrapper[4475]: I1203 06:48:11.749248 4475 scope.go:117] "RemoveContainer" containerID="d4d46ae2f3ca7408bd658103f4eabc1b33cfae316503192936fafad488c70128" Dec 03 06:48:11 crc kubenswrapper[4475]: I1203 06:48:11.749324 4475 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-kp6xz" Dec 03 06:48:11 crc kubenswrapper[4475]: I1203 06:48:11.752674 4475 generic.go:334] "Generic (PLEG): container finished" podID="a8d946f2-23cd-4863-8105-06d4ea2e1205" containerID="3e69a7eed1b20726d97f39a2d9ccbe6274cf3ef468240661860e77916f251718" exitCode=0 Dec 03 06:48:11 crc kubenswrapper[4475]: I1203 06:48:11.752725 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-s48nx" event={"ID":"a8d946f2-23cd-4863-8105-06d4ea2e1205","Type":"ContainerDied","Data":"3e69a7eed1b20726d97f39a2d9ccbe6274cf3ef468240661860e77916f251718"} Dec 03 06:48:11 crc kubenswrapper[4475]: I1203 06:48:11.752757 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-s48nx" event={"ID":"a8d946f2-23cd-4863-8105-06d4ea2e1205","Type":"ContainerDied","Data":"f663600ec7a21d6b713e56b5206e7fb5bd7092e3e543192c6a705691f4c264d9"} Dec 03 06:48:11 crc kubenswrapper[4475]: I1203 06:48:11.752757 4475 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-s48nx" Dec 03 06:48:11 crc kubenswrapper[4475]: I1203 06:48:11.764265 4475 scope.go:117] "RemoveContainer" containerID="5594cd78f198d426469940e1aad26bfe7a94ebcce6b5b719d156c88379107ebd" Dec 03 06:48:11 crc kubenswrapper[4475]: I1203 06:48:11.767429 4475 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-kp6xz"] Dec 03 06:48:11 crc kubenswrapper[4475]: I1203 06:48:11.771405 4475 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-kp6xz"] Dec 03 06:48:11 crc kubenswrapper[4475]: I1203 06:48:11.777063 4475 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-s48nx"] Dec 03 06:48:11 crc kubenswrapper[4475]: I1203 06:48:11.778991 4475 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-s48nx"] Dec 03 06:48:11 crc kubenswrapper[4475]: I1203 06:48:11.779537 4475 scope.go:117] "RemoveContainer" containerID="50ab3f7abf297c2f8c5a9a123c53490f7f5f222ed1651d7e49a83a35e17553ee" Dec 03 06:48:11 crc kubenswrapper[4475]: I1203 06:48:11.786867 4475 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-2lbjz" Dec 03 06:48:11 crc kubenswrapper[4475]: I1203 06:48:11.789645 4475 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-tnqbz" Dec 03 06:48:11 crc kubenswrapper[4475]: I1203 06:48:11.793655 4475 scope.go:117] "RemoveContainer" containerID="d4d46ae2f3ca7408bd658103f4eabc1b33cfae316503192936fafad488c70128" Dec 03 06:48:11 crc kubenswrapper[4475]: E1203 06:48:11.794100 4475 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d4d46ae2f3ca7408bd658103f4eabc1b33cfae316503192936fafad488c70128\": container with ID starting with 
d4d46ae2f3ca7408bd658103f4eabc1b33cfae316503192936fafad488c70128 not found: ID does not exist" containerID="d4d46ae2f3ca7408bd658103f4eabc1b33cfae316503192936fafad488c70128" Dec 03 06:48:11 crc kubenswrapper[4475]: I1203 06:48:11.794137 4475 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d4d46ae2f3ca7408bd658103f4eabc1b33cfae316503192936fafad488c70128"} err="failed to get container status \"d4d46ae2f3ca7408bd658103f4eabc1b33cfae316503192936fafad488c70128\": rpc error: code = NotFound desc = could not find container \"d4d46ae2f3ca7408bd658103f4eabc1b33cfae316503192936fafad488c70128\": container with ID starting with d4d46ae2f3ca7408bd658103f4eabc1b33cfae316503192936fafad488c70128 not found: ID does not exist" Dec 03 06:48:11 crc kubenswrapper[4475]: I1203 06:48:11.794176 4475 scope.go:117] "RemoveContainer" containerID="5594cd78f198d426469940e1aad26bfe7a94ebcce6b5b719d156c88379107ebd" Dec 03 06:48:11 crc kubenswrapper[4475]: E1203 06:48:11.794896 4475 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5594cd78f198d426469940e1aad26bfe7a94ebcce6b5b719d156c88379107ebd\": container with ID starting with 5594cd78f198d426469940e1aad26bfe7a94ebcce6b5b719d156c88379107ebd not found: ID does not exist" containerID="5594cd78f198d426469940e1aad26bfe7a94ebcce6b5b719d156c88379107ebd" Dec 03 06:48:11 crc kubenswrapper[4475]: I1203 06:48:11.794924 4475 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5594cd78f198d426469940e1aad26bfe7a94ebcce6b5b719d156c88379107ebd"} err="failed to get container status \"5594cd78f198d426469940e1aad26bfe7a94ebcce6b5b719d156c88379107ebd\": rpc error: code = NotFound desc = could not find container \"5594cd78f198d426469940e1aad26bfe7a94ebcce6b5b719d156c88379107ebd\": container with ID starting with 5594cd78f198d426469940e1aad26bfe7a94ebcce6b5b719d156c88379107ebd not found: ID does not 
exist" Dec 03 06:48:11 crc kubenswrapper[4475]: I1203 06:48:11.794942 4475 scope.go:117] "RemoveContainer" containerID="50ab3f7abf297c2f8c5a9a123c53490f7f5f222ed1651d7e49a83a35e17553ee" Dec 03 06:48:11 crc kubenswrapper[4475]: E1203 06:48:11.795143 4475 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"50ab3f7abf297c2f8c5a9a123c53490f7f5f222ed1651d7e49a83a35e17553ee\": container with ID starting with 50ab3f7abf297c2f8c5a9a123c53490f7f5f222ed1651d7e49a83a35e17553ee not found: ID does not exist" containerID="50ab3f7abf297c2f8c5a9a123c53490f7f5f222ed1651d7e49a83a35e17553ee" Dec 03 06:48:11 crc kubenswrapper[4475]: I1203 06:48:11.795161 4475 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"50ab3f7abf297c2f8c5a9a123c53490f7f5f222ed1651d7e49a83a35e17553ee"} err="failed to get container status \"50ab3f7abf297c2f8c5a9a123c53490f7f5f222ed1651d7e49a83a35e17553ee\": rpc error: code = NotFound desc = could not find container \"50ab3f7abf297c2f8c5a9a123c53490f7f5f222ed1651d7e49a83a35e17553ee\": container with ID starting with 50ab3f7abf297c2f8c5a9a123c53490f7f5f222ed1651d7e49a83a35e17553ee not found: ID does not exist" Dec 03 06:48:11 crc kubenswrapper[4475]: I1203 06:48:11.795173 4475 scope.go:117] "RemoveContainer" containerID="3e69a7eed1b20726d97f39a2d9ccbe6274cf3ef468240661860e77916f251718" Dec 03 06:48:11 crc kubenswrapper[4475]: I1203 06:48:11.813664 4475 scope.go:117] "RemoveContainer" containerID="0b59791d280c73ec1e2d0f64516474311f3135d3748673906d325610e352bbeb" Dec 03 06:48:11 crc kubenswrapper[4475]: I1203 06:48:11.826080 4475 scope.go:117] "RemoveContainer" containerID="b983aabab7a5dfdec894132e31e79db5979955d1fb44d5f79ddc5d59490ef380" Dec 03 06:48:11 crc kubenswrapper[4475]: I1203 06:48:11.840221 4475 scope.go:117] "RemoveContainer" containerID="3e69a7eed1b20726d97f39a2d9ccbe6274cf3ef468240661860e77916f251718" Dec 03 06:48:11 crc 
kubenswrapper[4475]: E1203 06:48:11.840592 4475 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3e69a7eed1b20726d97f39a2d9ccbe6274cf3ef468240661860e77916f251718\": container with ID starting with 3e69a7eed1b20726d97f39a2d9ccbe6274cf3ef468240661860e77916f251718 not found: ID does not exist" containerID="3e69a7eed1b20726d97f39a2d9ccbe6274cf3ef468240661860e77916f251718" Dec 03 06:48:11 crc kubenswrapper[4475]: I1203 06:48:11.840616 4475 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3e69a7eed1b20726d97f39a2d9ccbe6274cf3ef468240661860e77916f251718"} err="failed to get container status \"3e69a7eed1b20726d97f39a2d9ccbe6274cf3ef468240661860e77916f251718\": rpc error: code = NotFound desc = could not find container \"3e69a7eed1b20726d97f39a2d9ccbe6274cf3ef468240661860e77916f251718\": container with ID starting with 3e69a7eed1b20726d97f39a2d9ccbe6274cf3ef468240661860e77916f251718 not found: ID does not exist" Dec 03 06:48:11 crc kubenswrapper[4475]: I1203 06:48:11.840632 4475 scope.go:117] "RemoveContainer" containerID="0b59791d280c73ec1e2d0f64516474311f3135d3748673906d325610e352bbeb" Dec 03 06:48:11 crc kubenswrapper[4475]: E1203 06:48:11.840879 4475 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0b59791d280c73ec1e2d0f64516474311f3135d3748673906d325610e352bbeb\": container with ID starting with 0b59791d280c73ec1e2d0f64516474311f3135d3748673906d325610e352bbeb not found: ID does not exist" containerID="0b59791d280c73ec1e2d0f64516474311f3135d3748673906d325610e352bbeb" Dec 03 06:48:11 crc kubenswrapper[4475]: I1203 06:48:11.840922 4475 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0b59791d280c73ec1e2d0f64516474311f3135d3748673906d325610e352bbeb"} err="failed to get container status 
\"0b59791d280c73ec1e2d0f64516474311f3135d3748673906d325610e352bbeb\": rpc error: code = NotFound desc = could not find container \"0b59791d280c73ec1e2d0f64516474311f3135d3748673906d325610e352bbeb\": container with ID starting with 0b59791d280c73ec1e2d0f64516474311f3135d3748673906d325610e352bbeb not found: ID does not exist" Dec 03 06:48:11 crc kubenswrapper[4475]: I1203 06:48:11.840945 4475 scope.go:117] "RemoveContainer" containerID="b983aabab7a5dfdec894132e31e79db5979955d1fb44d5f79ddc5d59490ef380" Dec 03 06:48:11 crc kubenswrapper[4475]: E1203 06:48:11.841251 4475 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b983aabab7a5dfdec894132e31e79db5979955d1fb44d5f79ddc5d59490ef380\": container with ID starting with b983aabab7a5dfdec894132e31e79db5979955d1fb44d5f79ddc5d59490ef380 not found: ID does not exist" containerID="b983aabab7a5dfdec894132e31e79db5979955d1fb44d5f79ddc5d59490ef380" Dec 03 06:48:11 crc kubenswrapper[4475]: I1203 06:48:11.841303 4475 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b983aabab7a5dfdec894132e31e79db5979955d1fb44d5f79ddc5d59490ef380"} err="failed to get container status \"b983aabab7a5dfdec894132e31e79db5979955d1fb44d5f79ddc5d59490ef380\": rpc error: code = NotFound desc = could not find container \"b983aabab7a5dfdec894132e31e79db5979955d1fb44d5f79ddc5d59490ef380\": container with ID starting with b983aabab7a5dfdec894132e31e79db5979955d1fb44d5f79ddc5d59490ef380 not found: ID does not exist" Dec 03 06:48:12 crc kubenswrapper[4475]: I1203 06:48:12.066776 4475 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-p2nkg"] Dec 03 06:48:12 crc kubenswrapper[4475]: I1203 06:48:12.758821 4475 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-p2nkg" podUID="f4657474-771f-4f94-a2e3-a8262e7ce1b6" 
containerName="registry-server" containerID="cri-o://ad030b1cea4b9f6b7469b77419887176b0febedb4ba941505623b42ec24cf00b" gracePeriod=2 Dec 03 06:48:13 crc kubenswrapper[4475]: I1203 06:48:13.147406 4475 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-p2nkg" Dec 03 06:48:13 crc kubenswrapper[4475]: I1203 06:48:13.191089 4475 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-m8k2w\" (UniqueName: \"kubernetes.io/projected/f4657474-771f-4f94-a2e3-a8262e7ce1b6-kube-api-access-m8k2w\") pod \"f4657474-771f-4f94-a2e3-a8262e7ce1b6\" (UID: \"f4657474-771f-4f94-a2e3-a8262e7ce1b6\") " Dec 03 06:48:13 crc kubenswrapper[4475]: I1203 06:48:13.191138 4475 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f4657474-771f-4f94-a2e3-a8262e7ce1b6-utilities\") pod \"f4657474-771f-4f94-a2e3-a8262e7ce1b6\" (UID: \"f4657474-771f-4f94-a2e3-a8262e7ce1b6\") " Dec 03 06:48:13 crc kubenswrapper[4475]: I1203 06:48:13.191158 4475 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f4657474-771f-4f94-a2e3-a8262e7ce1b6-catalog-content\") pod \"f4657474-771f-4f94-a2e3-a8262e7ce1b6\" (UID: \"f4657474-771f-4f94-a2e3-a8262e7ce1b6\") " Dec 03 06:48:13 crc kubenswrapper[4475]: I1203 06:48:13.191829 4475 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f4657474-771f-4f94-a2e3-a8262e7ce1b6-utilities" (OuterVolumeSpecName: "utilities") pod "f4657474-771f-4f94-a2e3-a8262e7ce1b6" (UID: "f4657474-771f-4f94-a2e3-a8262e7ce1b6"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 06:48:13 crc kubenswrapper[4475]: I1203 06:48:13.195033 4475 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f4657474-771f-4f94-a2e3-a8262e7ce1b6-kube-api-access-m8k2w" (OuterVolumeSpecName: "kube-api-access-m8k2w") pod "f4657474-771f-4f94-a2e3-a8262e7ce1b6" (UID: "f4657474-771f-4f94-a2e3-a8262e7ce1b6"). InnerVolumeSpecName "kube-api-access-m8k2w". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 06:48:13 crc kubenswrapper[4475]: I1203 06:48:13.204348 4475 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f4657474-771f-4f94-a2e3-a8262e7ce1b6-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "f4657474-771f-4f94-a2e3-a8262e7ce1b6" (UID: "f4657474-771f-4f94-a2e3-a8262e7ce1b6"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 06:48:13 crc kubenswrapper[4475]: I1203 06:48:13.291991 4475 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f4657474-771f-4f94-a2e3-a8262e7ce1b6-utilities\") on node \"crc\" DevicePath \"\"" Dec 03 06:48:13 crc kubenswrapper[4475]: I1203 06:48:13.292018 4475 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f4657474-771f-4f94-a2e3-a8262e7ce1b6-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 03 06:48:13 crc kubenswrapper[4475]: I1203 06:48:13.292031 4475 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-m8k2w\" (UniqueName: \"kubernetes.io/projected/f4657474-771f-4f94-a2e3-a8262e7ce1b6-kube-api-access-m8k2w\") on node \"crc\" DevicePath \"\"" Dec 03 06:48:13 crc kubenswrapper[4475]: I1203 06:48:13.503855 4475 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5d1dbc1b-9217-4212-bbde-d79dd2ec15f6" 
path="/var/lib/kubelet/pods/5d1dbc1b-9217-4212-bbde-d79dd2ec15f6/volumes" Dec 03 06:48:13 crc kubenswrapper[4475]: I1203 06:48:13.504461 4475 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a8d946f2-23cd-4863-8105-06d4ea2e1205" path="/var/lib/kubelet/pods/a8d946f2-23cd-4863-8105-06d4ea2e1205/volumes" Dec 03 06:48:13 crc kubenswrapper[4475]: I1203 06:48:13.763544 4475 generic.go:334] "Generic (PLEG): container finished" podID="f4657474-771f-4f94-a2e3-a8262e7ce1b6" containerID="ad030b1cea4b9f6b7469b77419887176b0febedb4ba941505623b42ec24cf00b" exitCode=0 Dec 03 06:48:13 crc kubenswrapper[4475]: I1203 06:48:13.763577 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-p2nkg" event={"ID":"f4657474-771f-4f94-a2e3-a8262e7ce1b6","Type":"ContainerDied","Data":"ad030b1cea4b9f6b7469b77419887176b0febedb4ba941505623b42ec24cf00b"} Dec 03 06:48:13 crc kubenswrapper[4475]: I1203 06:48:13.763616 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-p2nkg" event={"ID":"f4657474-771f-4f94-a2e3-a8262e7ce1b6","Type":"ContainerDied","Data":"cfbd8717518c0d98cc36004a7cda9fa242cec8aae6093819788064dc9897ff96"} Dec 03 06:48:13 crc kubenswrapper[4475]: I1203 06:48:13.763631 4475 scope.go:117] "RemoveContainer" containerID="ad030b1cea4b9f6b7469b77419887176b0febedb4ba941505623b42ec24cf00b" Dec 03 06:48:13 crc kubenswrapper[4475]: I1203 06:48:13.763936 4475 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-p2nkg" Dec 03 06:48:13 crc kubenswrapper[4475]: I1203 06:48:13.778586 4475 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-p2nkg"] Dec 03 06:48:13 crc kubenswrapper[4475]: I1203 06:48:13.781014 4475 scope.go:117] "RemoveContainer" containerID="5a7a6e30abfd7603cddf3f7637a4115b4ea84db98b903d9e3e4c45704d0221c5" Dec 03 06:48:13 crc kubenswrapper[4475]: I1203 06:48:13.784061 4475 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-p2nkg"] Dec 03 06:48:13 crc kubenswrapper[4475]: I1203 06:48:13.796418 4475 scope.go:117] "RemoveContainer" containerID="2bb385c5810b96d703e5a99ba00b92d4afb3177e2a3c59740ee915f8d74b9f34" Dec 03 06:48:13 crc kubenswrapper[4475]: I1203 06:48:13.809306 4475 scope.go:117] "RemoveContainer" containerID="ad030b1cea4b9f6b7469b77419887176b0febedb4ba941505623b42ec24cf00b" Dec 03 06:48:13 crc kubenswrapper[4475]: E1203 06:48:13.809625 4475 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ad030b1cea4b9f6b7469b77419887176b0febedb4ba941505623b42ec24cf00b\": container with ID starting with ad030b1cea4b9f6b7469b77419887176b0febedb4ba941505623b42ec24cf00b not found: ID does not exist" containerID="ad030b1cea4b9f6b7469b77419887176b0febedb4ba941505623b42ec24cf00b" Dec 03 06:48:13 crc kubenswrapper[4475]: I1203 06:48:13.809663 4475 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ad030b1cea4b9f6b7469b77419887176b0febedb4ba941505623b42ec24cf00b"} err="failed to get container status \"ad030b1cea4b9f6b7469b77419887176b0febedb4ba941505623b42ec24cf00b\": rpc error: code = NotFound desc = could not find container \"ad030b1cea4b9f6b7469b77419887176b0febedb4ba941505623b42ec24cf00b\": container with ID starting with ad030b1cea4b9f6b7469b77419887176b0febedb4ba941505623b42ec24cf00b not found: 
ID does not exist" Dec 03 06:48:13 crc kubenswrapper[4475]: I1203 06:48:13.809700 4475 scope.go:117] "RemoveContainer" containerID="5a7a6e30abfd7603cddf3f7637a4115b4ea84db98b903d9e3e4c45704d0221c5" Dec 03 06:48:13 crc kubenswrapper[4475]: E1203 06:48:13.809900 4475 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5a7a6e30abfd7603cddf3f7637a4115b4ea84db98b903d9e3e4c45704d0221c5\": container with ID starting with 5a7a6e30abfd7603cddf3f7637a4115b4ea84db98b903d9e3e4c45704d0221c5 not found: ID does not exist" containerID="5a7a6e30abfd7603cddf3f7637a4115b4ea84db98b903d9e3e4c45704d0221c5" Dec 03 06:48:13 crc kubenswrapper[4475]: I1203 06:48:13.809923 4475 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5a7a6e30abfd7603cddf3f7637a4115b4ea84db98b903d9e3e4c45704d0221c5"} err="failed to get container status \"5a7a6e30abfd7603cddf3f7637a4115b4ea84db98b903d9e3e4c45704d0221c5\": rpc error: code = NotFound desc = could not find container \"5a7a6e30abfd7603cddf3f7637a4115b4ea84db98b903d9e3e4c45704d0221c5\": container with ID starting with 5a7a6e30abfd7603cddf3f7637a4115b4ea84db98b903d9e3e4c45704d0221c5 not found: ID does not exist" Dec 03 06:48:13 crc kubenswrapper[4475]: I1203 06:48:13.809937 4475 scope.go:117] "RemoveContainer" containerID="2bb385c5810b96d703e5a99ba00b92d4afb3177e2a3c59740ee915f8d74b9f34" Dec 03 06:48:13 crc kubenswrapper[4475]: E1203 06:48:13.810134 4475 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2bb385c5810b96d703e5a99ba00b92d4afb3177e2a3c59740ee915f8d74b9f34\": container with ID starting with 2bb385c5810b96d703e5a99ba00b92d4afb3177e2a3c59740ee915f8d74b9f34 not found: ID does not exist" containerID="2bb385c5810b96d703e5a99ba00b92d4afb3177e2a3c59740ee915f8d74b9f34" Dec 03 06:48:13 crc kubenswrapper[4475]: I1203 06:48:13.810155 4475 pod_container_deletor.go:53] 
"DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2bb385c5810b96d703e5a99ba00b92d4afb3177e2a3c59740ee915f8d74b9f34"} err="failed to get container status \"2bb385c5810b96d703e5a99ba00b92d4afb3177e2a3c59740ee915f8d74b9f34\": rpc error: code = NotFound desc = could not find container \"2bb385c5810b96d703e5a99ba00b92d4afb3177e2a3c59740ee915f8d74b9f34\": container with ID starting with 2bb385c5810b96d703e5a99ba00b92d4afb3177e2a3c59740ee915f8d74b9f34 not found: ID does not exist" Dec 03 06:48:14 crc kubenswrapper[4475]: I1203 06:48:14.669481 4475 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-2lbjz"] Dec 03 06:48:14 crc kubenswrapper[4475]: I1203 06:48:14.768294 4475 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-2lbjz" podUID="6ec591ee-4ce6-4e39-a38e-6f3d9dda0da5" containerName="registry-server" containerID="cri-o://ea60e28296136ce161a463dcb50340c6783ae15c4180e5fae205c341c55bc8ce" gracePeriod=2 Dec 03 06:48:15 crc kubenswrapper[4475]: I1203 06:48:15.284634 4475 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-2lbjz" Dec 03 06:48:15 crc kubenswrapper[4475]: I1203 06:48:15.413972 4475 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6ec591ee-4ce6-4e39-a38e-6f3d9dda0da5-utilities\") pod \"6ec591ee-4ce6-4e39-a38e-6f3d9dda0da5\" (UID: \"6ec591ee-4ce6-4e39-a38e-6f3d9dda0da5\") " Dec 03 06:48:15 crc kubenswrapper[4475]: I1203 06:48:15.414060 4475 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ptvfh\" (UniqueName: \"kubernetes.io/projected/6ec591ee-4ce6-4e39-a38e-6f3d9dda0da5-kube-api-access-ptvfh\") pod \"6ec591ee-4ce6-4e39-a38e-6f3d9dda0da5\" (UID: \"6ec591ee-4ce6-4e39-a38e-6f3d9dda0da5\") " Dec 03 06:48:15 crc kubenswrapper[4475]: I1203 06:48:15.414140 4475 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6ec591ee-4ce6-4e39-a38e-6f3d9dda0da5-catalog-content\") pod \"6ec591ee-4ce6-4e39-a38e-6f3d9dda0da5\" (UID: \"6ec591ee-4ce6-4e39-a38e-6f3d9dda0da5\") " Dec 03 06:48:15 crc kubenswrapper[4475]: I1203 06:48:15.414511 4475 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6ec591ee-4ce6-4e39-a38e-6f3d9dda0da5-utilities" (OuterVolumeSpecName: "utilities") pod "6ec591ee-4ce6-4e39-a38e-6f3d9dda0da5" (UID: "6ec591ee-4ce6-4e39-a38e-6f3d9dda0da5"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 06:48:15 crc kubenswrapper[4475]: I1203 06:48:15.420543 4475 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6ec591ee-4ce6-4e39-a38e-6f3d9dda0da5-kube-api-access-ptvfh" (OuterVolumeSpecName: "kube-api-access-ptvfh") pod "6ec591ee-4ce6-4e39-a38e-6f3d9dda0da5" (UID: "6ec591ee-4ce6-4e39-a38e-6f3d9dda0da5"). InnerVolumeSpecName "kube-api-access-ptvfh". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 06:48:15 crc kubenswrapper[4475]: I1203 06:48:15.512356 4475 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f4657474-771f-4f94-a2e3-a8262e7ce1b6" path="/var/lib/kubelet/pods/f4657474-771f-4f94-a2e3-a8262e7ce1b6/volumes" Dec 03 06:48:15 crc kubenswrapper[4475]: I1203 06:48:15.515277 4475 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6ec591ee-4ce6-4e39-a38e-6f3d9dda0da5-utilities\") on node \"crc\" DevicePath \"\"" Dec 03 06:48:15 crc kubenswrapper[4475]: I1203 06:48:15.515300 4475 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ptvfh\" (UniqueName: \"kubernetes.io/projected/6ec591ee-4ce6-4e39-a38e-6f3d9dda0da5-kube-api-access-ptvfh\") on node \"crc\" DevicePath \"\"" Dec 03 06:48:15 crc kubenswrapper[4475]: I1203 06:48:15.516934 4475 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6ec591ee-4ce6-4e39-a38e-6f3d9dda0da5-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "6ec591ee-4ce6-4e39-a38e-6f3d9dda0da5" (UID: "6ec591ee-4ce6-4e39-a38e-6f3d9dda0da5"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 06:48:15 crc kubenswrapper[4475]: I1203 06:48:15.616170 4475 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6ec591ee-4ce6-4e39-a38e-6f3d9dda0da5-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 03 06:48:15 crc kubenswrapper[4475]: I1203 06:48:15.782305 4475 generic.go:334] "Generic (PLEG): container finished" podID="6ec591ee-4ce6-4e39-a38e-6f3d9dda0da5" containerID="ea60e28296136ce161a463dcb50340c6783ae15c4180e5fae205c341c55bc8ce" exitCode=0 Dec 03 06:48:15 crc kubenswrapper[4475]: I1203 06:48:15.782401 4475 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-2lbjz" Dec 03 06:48:15 crc kubenswrapper[4475]: I1203 06:48:15.782392 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-2lbjz" event={"ID":"6ec591ee-4ce6-4e39-a38e-6f3d9dda0da5","Type":"ContainerDied","Data":"ea60e28296136ce161a463dcb50340c6783ae15c4180e5fae205c341c55bc8ce"} Dec 03 06:48:15 crc kubenswrapper[4475]: I1203 06:48:15.783189 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-2lbjz" event={"ID":"6ec591ee-4ce6-4e39-a38e-6f3d9dda0da5","Type":"ContainerDied","Data":"19f0fc9d4d8fe91ce3cbde7e5750d2bce814ccb8e0f958f52b8773d8f79a93dc"} Dec 03 06:48:15 crc kubenswrapper[4475]: I1203 06:48:15.783210 4475 scope.go:117] "RemoveContainer" containerID="ea60e28296136ce161a463dcb50340c6783ae15c4180e5fae205c341c55bc8ce" Dec 03 06:48:15 crc kubenswrapper[4475]: I1203 06:48:15.800515 4475 scope.go:117] "RemoveContainer" containerID="6f66b0d47611424cad29587d4f111520801a1cac0b3352c2099eead7a78c35f2" Dec 03 06:48:15 crc kubenswrapper[4475]: I1203 06:48:15.806647 4475 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-2lbjz"] Dec 03 06:48:15 crc kubenswrapper[4475]: I1203 06:48:15.813355 4475 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-2lbjz"] Dec 03 06:48:15 crc kubenswrapper[4475]: I1203 06:48:15.820436 4475 scope.go:117] "RemoveContainer" containerID="8fbd55fafe187e3174fd74b26fed1bd1e87930f0faa2dbd2cfd66bcf9d2bf939" Dec 03 06:48:15 crc kubenswrapper[4475]: I1203 06:48:15.830116 4475 scope.go:117] "RemoveContainer" containerID="ea60e28296136ce161a463dcb50340c6783ae15c4180e5fae205c341c55bc8ce" Dec 03 06:48:15 crc kubenswrapper[4475]: E1203 06:48:15.830439 4475 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"ea60e28296136ce161a463dcb50340c6783ae15c4180e5fae205c341c55bc8ce\": container with ID starting with ea60e28296136ce161a463dcb50340c6783ae15c4180e5fae205c341c55bc8ce not found: ID does not exist" containerID="ea60e28296136ce161a463dcb50340c6783ae15c4180e5fae205c341c55bc8ce" Dec 03 06:48:15 crc kubenswrapper[4475]: I1203 06:48:15.830624 4475 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ea60e28296136ce161a463dcb50340c6783ae15c4180e5fae205c341c55bc8ce"} err="failed to get container status \"ea60e28296136ce161a463dcb50340c6783ae15c4180e5fae205c341c55bc8ce\": rpc error: code = NotFound desc = could not find container \"ea60e28296136ce161a463dcb50340c6783ae15c4180e5fae205c341c55bc8ce\": container with ID starting with ea60e28296136ce161a463dcb50340c6783ae15c4180e5fae205c341c55bc8ce not found: ID does not exist" Dec 03 06:48:15 crc kubenswrapper[4475]: I1203 06:48:15.830942 4475 scope.go:117] "RemoveContainer" containerID="6f66b0d47611424cad29587d4f111520801a1cac0b3352c2099eead7a78c35f2" Dec 03 06:48:15 crc kubenswrapper[4475]: E1203 06:48:15.831424 4475 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6f66b0d47611424cad29587d4f111520801a1cac0b3352c2099eead7a78c35f2\": container with ID starting with 6f66b0d47611424cad29587d4f111520801a1cac0b3352c2099eead7a78c35f2 not found: ID does not exist" containerID="6f66b0d47611424cad29587d4f111520801a1cac0b3352c2099eead7a78c35f2" Dec 03 06:48:15 crc kubenswrapper[4475]: I1203 06:48:15.831467 4475 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6f66b0d47611424cad29587d4f111520801a1cac0b3352c2099eead7a78c35f2"} err="failed to get container status \"6f66b0d47611424cad29587d4f111520801a1cac0b3352c2099eead7a78c35f2\": rpc error: code = NotFound desc = could not find container \"6f66b0d47611424cad29587d4f111520801a1cac0b3352c2099eead7a78c35f2\": container with ID 
starting with 6f66b0d47611424cad29587d4f111520801a1cac0b3352c2099eead7a78c35f2 not found: ID does not exist" Dec 03 06:48:15 crc kubenswrapper[4475]: I1203 06:48:15.831486 4475 scope.go:117] "RemoveContainer" containerID="8fbd55fafe187e3174fd74b26fed1bd1e87930f0faa2dbd2cfd66bcf9d2bf939" Dec 03 06:48:15 crc kubenswrapper[4475]: E1203 06:48:15.831827 4475 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8fbd55fafe187e3174fd74b26fed1bd1e87930f0faa2dbd2cfd66bcf9d2bf939\": container with ID starting with 8fbd55fafe187e3174fd74b26fed1bd1e87930f0faa2dbd2cfd66bcf9d2bf939 not found: ID does not exist" containerID="8fbd55fafe187e3174fd74b26fed1bd1e87930f0faa2dbd2cfd66bcf9d2bf939" Dec 03 06:48:15 crc kubenswrapper[4475]: I1203 06:48:15.831910 4475 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8fbd55fafe187e3174fd74b26fed1bd1e87930f0faa2dbd2cfd66bcf9d2bf939"} err="failed to get container status \"8fbd55fafe187e3174fd74b26fed1bd1e87930f0faa2dbd2cfd66bcf9d2bf939\": rpc error: code = NotFound desc = could not find container \"8fbd55fafe187e3174fd74b26fed1bd1e87930f0faa2dbd2cfd66bcf9d2bf939\": container with ID starting with 8fbd55fafe187e3174fd74b26fed1bd1e87930f0faa2dbd2cfd66bcf9d2bf939 not found: ID does not exist" Dec 03 06:48:17 crc kubenswrapper[4475]: I1203 06:48:17.496424 4475 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6ec591ee-4ce6-4e39-a38e-6f3d9dda0da5" path="/var/lib/kubelet/pods/6ec591ee-4ce6-4e39-a38e-6f3d9dda0da5/volumes" Dec 03 06:48:17 crc kubenswrapper[4475]: I1203 06:48:17.617718 4475 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-xsp8w" Dec 03 06:48:19 crc kubenswrapper[4475]: I1203 06:48:19.390819 4475 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/revision-pruner-9-crc"] Dec 03 06:48:19 crc kubenswrapper[4475]: 
E1203 06:48:19.391652 4475 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a8d946f2-23cd-4863-8105-06d4ea2e1205" containerName="registry-server" Dec 03 06:48:19 crc kubenswrapper[4475]: I1203 06:48:19.391728 4475 state_mem.go:107] "Deleted CPUSet assignment" podUID="a8d946f2-23cd-4863-8105-06d4ea2e1205" containerName="registry-server" Dec 03 06:48:19 crc kubenswrapper[4475]: E1203 06:48:19.391786 4475 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4657474-771f-4f94-a2e3-a8262e7ce1b6" containerName="extract-utilities" Dec 03 06:48:19 crc kubenswrapper[4475]: I1203 06:48:19.391838 4475 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4657474-771f-4f94-a2e3-a8262e7ce1b6" containerName="extract-utilities" Dec 03 06:48:19 crc kubenswrapper[4475]: E1203 06:48:19.391891 4475 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a8d946f2-23cd-4863-8105-06d4ea2e1205" containerName="extract-utilities" Dec 03 06:48:19 crc kubenswrapper[4475]: I1203 06:48:19.391937 4475 state_mem.go:107] "Deleted CPUSet assignment" podUID="a8d946f2-23cd-4863-8105-06d4ea2e1205" containerName="extract-utilities" Dec 03 06:48:19 crc kubenswrapper[4475]: E1203 06:48:19.391983 4475 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6ec591ee-4ce6-4e39-a38e-6f3d9dda0da5" containerName="extract-utilities" Dec 03 06:48:19 crc kubenswrapper[4475]: I1203 06:48:19.392027 4475 state_mem.go:107] "Deleted CPUSet assignment" podUID="6ec591ee-4ce6-4e39-a38e-6f3d9dda0da5" containerName="extract-utilities" Dec 03 06:48:19 crc kubenswrapper[4475]: E1203 06:48:19.392095 4475 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6ec591ee-4ce6-4e39-a38e-6f3d9dda0da5" containerName="registry-server" Dec 03 06:48:19 crc kubenswrapper[4475]: I1203 06:48:19.392139 4475 state_mem.go:107] "Deleted CPUSet assignment" podUID="6ec591ee-4ce6-4e39-a38e-6f3d9dda0da5" containerName="registry-server" Dec 03 06:48:19 crc kubenswrapper[4475]: 
E1203 06:48:19.392192 4475 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5d1dbc1b-9217-4212-bbde-d79dd2ec15f6" containerName="extract-content" Dec 03 06:48:19 crc kubenswrapper[4475]: I1203 06:48:19.392242 4475 state_mem.go:107] "Deleted CPUSet assignment" podUID="5d1dbc1b-9217-4212-bbde-d79dd2ec15f6" containerName="extract-content" Dec 03 06:48:19 crc kubenswrapper[4475]: E1203 06:48:19.392292 4475 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5d1dbc1b-9217-4212-bbde-d79dd2ec15f6" containerName="registry-server" Dec 03 06:48:19 crc kubenswrapper[4475]: I1203 06:48:19.392335 4475 state_mem.go:107] "Deleted CPUSet assignment" podUID="5d1dbc1b-9217-4212-bbde-d79dd2ec15f6" containerName="registry-server" Dec 03 06:48:19 crc kubenswrapper[4475]: E1203 06:48:19.392385 4475 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4657474-771f-4f94-a2e3-a8262e7ce1b6" containerName="extract-content" Dec 03 06:48:19 crc kubenswrapper[4475]: I1203 06:48:19.392434 4475 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4657474-771f-4f94-a2e3-a8262e7ce1b6" containerName="extract-content" Dec 03 06:48:19 crc kubenswrapper[4475]: E1203 06:48:19.392527 4475 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6ec591ee-4ce6-4e39-a38e-6f3d9dda0da5" containerName="extract-content" Dec 03 06:48:19 crc kubenswrapper[4475]: I1203 06:48:19.392577 4475 state_mem.go:107] "Deleted CPUSet assignment" podUID="6ec591ee-4ce6-4e39-a38e-6f3d9dda0da5" containerName="extract-content" Dec 03 06:48:19 crc kubenswrapper[4475]: E1203 06:48:19.392635 4475 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5d1dbc1b-9217-4212-bbde-d79dd2ec15f6" containerName="extract-utilities" Dec 03 06:48:19 crc kubenswrapper[4475]: I1203 06:48:19.392678 4475 state_mem.go:107] "Deleted CPUSet assignment" podUID="5d1dbc1b-9217-4212-bbde-d79dd2ec15f6" containerName="extract-utilities" Dec 03 06:48:19 crc kubenswrapper[4475]: E1203 
06:48:19.392731 4475 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4657474-771f-4f94-a2e3-a8262e7ce1b6" containerName="registry-server" Dec 03 06:48:19 crc kubenswrapper[4475]: I1203 06:48:19.392775 4475 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4657474-771f-4f94-a2e3-a8262e7ce1b6" containerName="registry-server" Dec 03 06:48:19 crc kubenswrapper[4475]: E1203 06:48:19.392825 4475 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="83c4eef5-5508-470d-8b7a-b7da9d4706d4" containerName="collect-profiles" Dec 03 06:48:19 crc kubenswrapper[4475]: I1203 06:48:19.392877 4475 state_mem.go:107] "Deleted CPUSet assignment" podUID="83c4eef5-5508-470d-8b7a-b7da9d4706d4" containerName="collect-profiles" Dec 03 06:48:19 crc kubenswrapper[4475]: E1203 06:48:19.392928 4475 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9c73b77a-482f-4cfa-b3ed-149a5ed4f2d8" containerName="pruner" Dec 03 06:48:19 crc kubenswrapper[4475]: I1203 06:48:19.392977 4475 state_mem.go:107] "Deleted CPUSet assignment" podUID="9c73b77a-482f-4cfa-b3ed-149a5ed4f2d8" containerName="pruner" Dec 03 06:48:19 crc kubenswrapper[4475]: E1203 06:48:19.393026 4475 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a8d946f2-23cd-4863-8105-06d4ea2e1205" containerName="extract-content" Dec 03 06:48:19 crc kubenswrapper[4475]: I1203 06:48:19.393078 4475 state_mem.go:107] "Deleted CPUSet assignment" podUID="a8d946f2-23cd-4863-8105-06d4ea2e1205" containerName="extract-content" Dec 03 06:48:19 crc kubenswrapper[4475]: I1203 06:48:19.393212 4475 memory_manager.go:354] "RemoveStaleState removing state" podUID="5d1dbc1b-9217-4212-bbde-d79dd2ec15f6" containerName="registry-server" Dec 03 06:48:19 crc kubenswrapper[4475]: I1203 06:48:19.393277 4475 memory_manager.go:354] "RemoveStaleState removing state" podUID="83c4eef5-5508-470d-8b7a-b7da9d4706d4" containerName="collect-profiles" Dec 03 06:48:19 crc kubenswrapper[4475]: I1203 06:48:19.393330 4475 
memory_manager.go:354] "RemoveStaleState removing state" podUID="f4657474-771f-4f94-a2e3-a8262e7ce1b6" containerName="registry-server" Dec 03 06:48:19 crc kubenswrapper[4475]: I1203 06:48:19.393386 4475 memory_manager.go:354] "RemoveStaleState removing state" podUID="9c73b77a-482f-4cfa-b3ed-149a5ed4f2d8" containerName="pruner" Dec 03 06:48:19 crc kubenswrapper[4475]: I1203 06:48:19.393438 4475 memory_manager.go:354] "RemoveStaleState removing state" podUID="6ec591ee-4ce6-4e39-a38e-6f3d9dda0da5" containerName="registry-server" Dec 03 06:48:19 crc kubenswrapper[4475]: I1203 06:48:19.393523 4475 memory_manager.go:354] "RemoveStaleState removing state" podUID="a8d946f2-23cd-4863-8105-06d4ea2e1205" containerName="registry-server" Dec 03 06:48:19 crc kubenswrapper[4475]: I1203 06:48:19.393864 4475 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-9-crc" Dec 03 06:48:19 crc kubenswrapper[4475]: I1203 06:48:19.395766 4475 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver"/"installer-sa-dockercfg-5pr6n" Dec 03 06:48:19 crc kubenswrapper[4475]: I1203 06:48:19.397081 4475 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver"/"kube-root-ca.crt" Dec 03 06:48:19 crc kubenswrapper[4475]: I1203 06:48:19.404766 4475 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-9-crc"] Dec 03 06:48:19 crc kubenswrapper[4475]: I1203 06:48:19.424039 4475 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-6r542"] Dec 03 06:48:19 crc kubenswrapper[4475]: I1203 06:48:19.459479 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/2d920241-43ec-44a6-8221-29b43924aab0-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"2d920241-43ec-44a6-8221-29b43924aab0\") " 
pod="openshift-kube-apiserver/revision-pruner-9-crc" Dec 03 06:48:19 crc kubenswrapper[4475]: I1203 06:48:19.459629 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/2d920241-43ec-44a6-8221-29b43924aab0-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"2d920241-43ec-44a6-8221-29b43924aab0\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Dec 03 06:48:19 crc kubenswrapper[4475]: I1203 06:48:19.560265 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/2d920241-43ec-44a6-8221-29b43924aab0-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"2d920241-43ec-44a6-8221-29b43924aab0\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Dec 03 06:48:19 crc kubenswrapper[4475]: I1203 06:48:19.560317 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/2d920241-43ec-44a6-8221-29b43924aab0-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"2d920241-43ec-44a6-8221-29b43924aab0\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Dec 03 06:48:19 crc kubenswrapper[4475]: I1203 06:48:19.560793 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/2d920241-43ec-44a6-8221-29b43924aab0-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"2d920241-43ec-44a6-8221-29b43924aab0\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Dec 03 06:48:19 crc kubenswrapper[4475]: I1203 06:48:19.575665 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/2d920241-43ec-44a6-8221-29b43924aab0-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"2d920241-43ec-44a6-8221-29b43924aab0\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" 
Dec 03 06:48:19 crc kubenswrapper[4475]: I1203 06:48:19.705841 4475 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-9-crc" Dec 03 06:48:20 crc kubenswrapper[4475]: I1203 06:48:20.026229 4475 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-9-crc"] Dec 03 06:48:20 crc kubenswrapper[4475]: W1203 06:48:20.031840 4475 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-pod2d920241_43ec_44a6_8221_29b43924aab0.slice/crio-3383db6ce113ec9330945c3e7523400ffda7cb5fa39080ea7b1de1ee9267dcab WatchSource:0}: Error finding container 3383db6ce113ec9330945c3e7523400ffda7cb5fa39080ea7b1de1ee9267dcab: Status 404 returned error can't find the container with id 3383db6ce113ec9330945c3e7523400ffda7cb5fa39080ea7b1de1ee9267dcab Dec 03 06:48:20 crc kubenswrapper[4475]: I1203 06:48:20.314913 4475 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 03 06:48:20 crc kubenswrapper[4475]: I1203 06:48:20.803832 4475 generic.go:334] "Generic (PLEG): container finished" podID="2d920241-43ec-44a6-8221-29b43924aab0" containerID="5afca8061d571c040ebafbf884f46101056d4eebdf019e9f02a63a789578b82f" exitCode=0 Dec 03 06:48:20 crc kubenswrapper[4475]: I1203 06:48:20.803862 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-9-crc" event={"ID":"2d920241-43ec-44a6-8221-29b43924aab0","Type":"ContainerDied","Data":"5afca8061d571c040ebafbf884f46101056d4eebdf019e9f02a63a789578b82f"} Dec 03 06:48:20 crc kubenswrapper[4475]: I1203 06:48:20.803884 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-9-crc" event={"ID":"2d920241-43ec-44a6-8221-29b43924aab0","Type":"ContainerStarted","Data":"3383db6ce113ec9330945c3e7523400ffda7cb5fa39080ea7b1de1ee9267dcab"} Dec 03 06:48:22 crc 
kubenswrapper[4475]: I1203 06:48:22.041496 4475 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-9-crc" Dec 03 06:48:22 crc kubenswrapper[4475]: I1203 06:48:22.088213 4475 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/2d920241-43ec-44a6-8221-29b43924aab0-kube-api-access\") pod \"2d920241-43ec-44a6-8221-29b43924aab0\" (UID: \"2d920241-43ec-44a6-8221-29b43924aab0\") " Dec 03 06:48:22 crc kubenswrapper[4475]: I1203 06:48:22.088266 4475 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/2d920241-43ec-44a6-8221-29b43924aab0-kubelet-dir\") pod \"2d920241-43ec-44a6-8221-29b43924aab0\" (UID: \"2d920241-43ec-44a6-8221-29b43924aab0\") " Dec 03 06:48:22 crc kubenswrapper[4475]: I1203 06:48:22.088425 4475 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/2d920241-43ec-44a6-8221-29b43924aab0-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "2d920241-43ec-44a6-8221-29b43924aab0" (UID: "2d920241-43ec-44a6-8221-29b43924aab0"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 03 06:48:22 crc kubenswrapper[4475]: I1203 06:48:22.088587 4475 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/2d920241-43ec-44a6-8221-29b43924aab0-kubelet-dir\") on node \"crc\" DevicePath \"\"" Dec 03 06:48:22 crc kubenswrapper[4475]: I1203 06:48:22.092246 4475 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2d920241-43ec-44a6-8221-29b43924aab0-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "2d920241-43ec-44a6-8221-29b43924aab0" (UID: "2d920241-43ec-44a6-8221-29b43924aab0"). InnerVolumeSpecName "kube-api-access". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 06:48:22 crc kubenswrapper[4475]: I1203 06:48:22.189424 4475 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/2d920241-43ec-44a6-8221-29b43924aab0-kube-api-access\") on node \"crc\" DevicePath \"\"" Dec 03 06:48:22 crc kubenswrapper[4475]: I1203 06:48:22.811697 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-9-crc" event={"ID":"2d920241-43ec-44a6-8221-29b43924aab0","Type":"ContainerDied","Data":"3383db6ce113ec9330945c3e7523400ffda7cb5fa39080ea7b1de1ee9267dcab"} Dec 03 06:48:22 crc kubenswrapper[4475]: I1203 06:48:22.811738 4475 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="3383db6ce113ec9330945c3e7523400ffda7cb5fa39080ea7b1de1ee9267dcab" Dec 03 06:48:22 crc kubenswrapper[4475]: I1203 06:48:22.811751 4475 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-9-crc" Dec 03 06:48:25 crc kubenswrapper[4475]: I1203 06:48:25.590009 4475 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/installer-9-crc"] Dec 03 06:48:25 crc kubenswrapper[4475]: E1203 06:48:25.590299 4475 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2d920241-43ec-44a6-8221-29b43924aab0" containerName="pruner" Dec 03 06:48:25 crc kubenswrapper[4475]: I1203 06:48:25.590310 4475 state_mem.go:107] "Deleted CPUSet assignment" podUID="2d920241-43ec-44a6-8221-29b43924aab0" containerName="pruner" Dec 03 06:48:25 crc kubenswrapper[4475]: I1203 06:48:25.590397 4475 memory_manager.go:354] "RemoveStaleState removing state" podUID="2d920241-43ec-44a6-8221-29b43924aab0" containerName="pruner" Dec 03 06:48:25 crc kubenswrapper[4475]: I1203 06:48:25.590684 4475 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/installer-9-crc" Dec 03 06:48:25 crc kubenswrapper[4475]: I1203 06:48:25.593941 4475 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver"/"kube-root-ca.crt" Dec 03 06:48:25 crc kubenswrapper[4475]: I1203 06:48:25.594151 4475 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver"/"installer-sa-dockercfg-5pr6n" Dec 03 06:48:25 crc kubenswrapper[4475]: I1203 06:48:25.601978 4475 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/installer-9-crc"] Dec 03 06:48:25 crc kubenswrapper[4475]: I1203 06:48:25.620350 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/333f027c-627f-4107-93a2-f522a583a5ed-kube-api-access\") pod \"installer-9-crc\" (UID: \"333f027c-627f-4107-93a2-f522a583a5ed\") " pod="openshift-kube-apiserver/installer-9-crc" Dec 03 06:48:25 crc kubenswrapper[4475]: I1203 06:48:25.620400 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/333f027c-627f-4107-93a2-f522a583a5ed-var-lock\") pod \"installer-9-crc\" (UID: \"333f027c-627f-4107-93a2-f522a583a5ed\") " pod="openshift-kube-apiserver/installer-9-crc" Dec 03 06:48:25 crc kubenswrapper[4475]: I1203 06:48:25.620427 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/333f027c-627f-4107-93a2-f522a583a5ed-kubelet-dir\") pod \"installer-9-crc\" (UID: \"333f027c-627f-4107-93a2-f522a583a5ed\") " pod="openshift-kube-apiserver/installer-9-crc" Dec 03 06:48:25 crc kubenswrapper[4475]: I1203 06:48:25.720932 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: 
\"kubernetes.io/projected/333f027c-627f-4107-93a2-f522a583a5ed-kube-api-access\") pod \"installer-9-crc\" (UID: \"333f027c-627f-4107-93a2-f522a583a5ed\") " pod="openshift-kube-apiserver/installer-9-crc" Dec 03 06:48:25 crc kubenswrapper[4475]: I1203 06:48:25.720978 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/333f027c-627f-4107-93a2-f522a583a5ed-var-lock\") pod \"installer-9-crc\" (UID: \"333f027c-627f-4107-93a2-f522a583a5ed\") " pod="openshift-kube-apiserver/installer-9-crc" Dec 03 06:48:25 crc kubenswrapper[4475]: I1203 06:48:25.721004 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/333f027c-627f-4107-93a2-f522a583a5ed-kubelet-dir\") pod \"installer-9-crc\" (UID: \"333f027c-627f-4107-93a2-f522a583a5ed\") " pod="openshift-kube-apiserver/installer-9-crc" Dec 03 06:48:25 crc kubenswrapper[4475]: I1203 06:48:25.721067 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/333f027c-627f-4107-93a2-f522a583a5ed-kubelet-dir\") pod \"installer-9-crc\" (UID: \"333f027c-627f-4107-93a2-f522a583a5ed\") " pod="openshift-kube-apiserver/installer-9-crc" Dec 03 06:48:25 crc kubenswrapper[4475]: I1203 06:48:25.721086 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/333f027c-627f-4107-93a2-f522a583a5ed-var-lock\") pod \"installer-9-crc\" (UID: \"333f027c-627f-4107-93a2-f522a583a5ed\") " pod="openshift-kube-apiserver/installer-9-crc" Dec 03 06:48:25 crc kubenswrapper[4475]: I1203 06:48:25.735305 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/333f027c-627f-4107-93a2-f522a583a5ed-kube-api-access\") pod \"installer-9-crc\" (UID: \"333f027c-627f-4107-93a2-f522a583a5ed\") " 
pod="openshift-kube-apiserver/installer-9-crc" Dec 03 06:48:25 crc kubenswrapper[4475]: I1203 06:48:25.905510 4475 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/installer-9-crc" Dec 03 06:48:26 crc kubenswrapper[4475]: I1203 06:48:26.230035 4475 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/installer-9-crc"] Dec 03 06:48:26 crc kubenswrapper[4475]: I1203 06:48:26.830721 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-9-crc" event={"ID":"333f027c-627f-4107-93a2-f522a583a5ed","Type":"ContainerStarted","Data":"2af3c7a09ed5c49c67d4fdc0b8d8a5e2eb11d1683871ab83696f0c6b3a36e157"} Dec 03 06:48:26 crc kubenswrapper[4475]: I1203 06:48:26.830899 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-9-crc" event={"ID":"333f027c-627f-4107-93a2-f522a583a5ed","Type":"ContainerStarted","Data":"ad69205d75c1fed21d9ade1b846da0d22c2cfe3d8c8f5fd3bdf6815f06f39914"} Dec 03 06:48:26 crc kubenswrapper[4475]: I1203 06:48:26.842232 4475 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/installer-9-crc" podStartSLOduration=1.842222967 podStartE2EDuration="1.842222967s" podCreationTimestamp="2025-12-03 06:48:25 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 06:48:26.841160044 +0000 UTC m=+191.646058378" watchObservedRunningTime="2025-12-03 06:48:26.842222967 +0000 UTC m=+191.647121291" Dec 03 06:48:28 crc kubenswrapper[4475]: I1203 06:48:28.933866 4475 patch_prober.go:28] interesting pod/machine-config-daemon-tjbzg container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 03 06:48:28 crc kubenswrapper[4475]: 
I1203 06:48:28.934084 4475 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-tjbzg" podUID="91aee7be-4a52-4598-803f-2deebe0674de" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 03 06:48:44 crc kubenswrapper[4475]: I1203 06:48:44.449767 4475 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-authentication/oauth-openshift-558db77b4-6r542" podUID="d52a94b2-a290-48af-b060-5f3662029280" containerName="oauth-openshift" containerID="cri-o://cef3dd2f3d6520eb005a3b365aeea38701ccdbe3533be1f4a38acb48d13996b0" gracePeriod=15 Dec 03 06:48:44 crc kubenswrapper[4475]: I1203 06:48:44.734160 4475 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-6r542" Dec 03 06:48:44 crc kubenswrapper[4475]: I1203 06:48:44.755169 4475 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-authentication/oauth-openshift-8475d84ddf-v62mp"] Dec 03 06:48:44 crc kubenswrapper[4475]: E1203 06:48:44.755338 4475 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d52a94b2-a290-48af-b060-5f3662029280" containerName="oauth-openshift" Dec 03 06:48:44 crc kubenswrapper[4475]: I1203 06:48:44.755368 4475 state_mem.go:107] "Deleted CPUSet assignment" podUID="d52a94b2-a290-48af-b060-5f3662029280" containerName="oauth-openshift" Dec 03 06:48:44 crc kubenswrapper[4475]: I1203 06:48:44.755488 4475 memory_manager.go:354] "RemoveStaleState removing state" podUID="d52a94b2-a290-48af-b060-5f3662029280" containerName="oauth-openshift" Dec 03 06:48:44 crc kubenswrapper[4475]: I1203 06:48:44.755783 4475 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-authentication/oauth-openshift-8475d84ddf-v62mp" Dec 03 06:48:44 crc kubenswrapper[4475]: I1203 06:48:44.770066 4475 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-8475d84ddf-v62mp"] Dec 03 06:48:44 crc kubenswrapper[4475]: I1203 06:48:44.798951 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/d0b9083d-0ce3-4310-93a3-091aea4e519c-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-8475d84ddf-v62mp\" (UID: \"d0b9083d-0ce3-4310-93a3-091aea4e519c\") " pod="openshift-authentication/oauth-openshift-8475d84ddf-v62mp" Dec 03 06:48:44 crc kubenswrapper[4475]: I1203 06:48:44.799107 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/d0b9083d-0ce3-4310-93a3-091aea4e519c-audit-policies\") pod \"oauth-openshift-8475d84ddf-v62mp\" (UID: \"d0b9083d-0ce3-4310-93a3-091aea4e519c\") " pod="openshift-authentication/oauth-openshift-8475d84ddf-v62mp" Dec 03 06:48:44 crc kubenswrapper[4475]: I1203 06:48:44.799221 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/d0b9083d-0ce3-4310-93a3-091aea4e519c-audit-dir\") pod \"oauth-openshift-8475d84ddf-v62mp\" (UID: \"d0b9083d-0ce3-4310-93a3-091aea4e519c\") " pod="openshift-authentication/oauth-openshift-8475d84ddf-v62mp" Dec 03 06:48:44 crc kubenswrapper[4475]: I1203 06:48:44.799325 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/d0b9083d-0ce3-4310-93a3-091aea4e519c-v4-0-config-system-session\") pod \"oauth-openshift-8475d84ddf-v62mp\" (UID: \"d0b9083d-0ce3-4310-93a3-091aea4e519c\") " 
pod="openshift-authentication/oauth-openshift-8475d84ddf-v62mp" Dec 03 06:48:44 crc kubenswrapper[4475]: I1203 06:48:44.799435 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/d0b9083d-0ce3-4310-93a3-091aea4e519c-v4-0-config-system-serving-cert\") pod \"oauth-openshift-8475d84ddf-v62mp\" (UID: \"d0b9083d-0ce3-4310-93a3-091aea4e519c\") " pod="openshift-authentication/oauth-openshift-8475d84ddf-v62mp" Dec 03 06:48:44 crc kubenswrapper[4475]: I1203 06:48:44.799561 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/d0b9083d-0ce3-4310-93a3-091aea4e519c-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-8475d84ddf-v62mp\" (UID: \"d0b9083d-0ce3-4310-93a3-091aea4e519c\") " pod="openshift-authentication/oauth-openshift-8475d84ddf-v62mp" Dec 03 06:48:44 crc kubenswrapper[4475]: I1203 06:48:44.799680 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7wpnj\" (UniqueName: \"kubernetes.io/projected/d0b9083d-0ce3-4310-93a3-091aea4e519c-kube-api-access-7wpnj\") pod \"oauth-openshift-8475d84ddf-v62mp\" (UID: \"d0b9083d-0ce3-4310-93a3-091aea4e519c\") " pod="openshift-authentication/oauth-openshift-8475d84ddf-v62mp" Dec 03 06:48:44 crc kubenswrapper[4475]: I1203 06:48:44.799775 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/d0b9083d-0ce3-4310-93a3-091aea4e519c-v4-0-config-system-service-ca\") pod \"oauth-openshift-8475d84ddf-v62mp\" (UID: \"d0b9083d-0ce3-4310-93a3-091aea4e519c\") " pod="openshift-authentication/oauth-openshift-8475d84ddf-v62mp" Dec 03 06:48:44 crc kubenswrapper[4475]: I1203 06:48:44.799867 4475 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d0b9083d-0ce3-4310-93a3-091aea4e519c-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-8475d84ddf-v62mp\" (UID: \"d0b9083d-0ce3-4310-93a3-091aea4e519c\") " pod="openshift-authentication/oauth-openshift-8475d84ddf-v62mp" Dec 03 06:48:44 crc kubenswrapper[4475]: I1203 06:48:44.799971 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/d0b9083d-0ce3-4310-93a3-091aea4e519c-v4-0-config-system-router-certs\") pod \"oauth-openshift-8475d84ddf-v62mp\" (UID: \"d0b9083d-0ce3-4310-93a3-091aea4e519c\") " pod="openshift-authentication/oauth-openshift-8475d84ddf-v62mp" Dec 03 06:48:44 crc kubenswrapper[4475]: I1203 06:48:44.800072 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/d0b9083d-0ce3-4310-93a3-091aea4e519c-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-8475d84ddf-v62mp\" (UID: \"d0b9083d-0ce3-4310-93a3-091aea4e519c\") " pod="openshift-authentication/oauth-openshift-8475d84ddf-v62mp" Dec 03 06:48:44 crc kubenswrapper[4475]: I1203 06:48:44.800168 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/d0b9083d-0ce3-4310-93a3-091aea4e519c-v4-0-config-system-cliconfig\") pod \"oauth-openshift-8475d84ddf-v62mp\" (UID: \"d0b9083d-0ce3-4310-93a3-091aea4e519c\") " pod="openshift-authentication/oauth-openshift-8475d84ddf-v62mp" Dec 03 06:48:44 crc kubenswrapper[4475]: I1203 06:48:44.800258 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/d0b9083d-0ce3-4310-93a3-091aea4e519c-v4-0-config-user-template-error\") pod \"oauth-openshift-8475d84ddf-v62mp\" (UID: \"d0b9083d-0ce3-4310-93a3-091aea4e519c\") " pod="openshift-authentication/oauth-openshift-8475d84ddf-v62mp" Dec 03 06:48:44 crc kubenswrapper[4475]: I1203 06:48:44.800367 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/d0b9083d-0ce3-4310-93a3-091aea4e519c-v4-0-config-user-template-login\") pod \"oauth-openshift-8475d84ddf-v62mp\" (UID: \"d0b9083d-0ce3-4310-93a3-091aea4e519c\") " pod="openshift-authentication/oauth-openshift-8475d84ddf-v62mp" Dec 03 06:48:44 crc kubenswrapper[4475]: I1203 06:48:44.891433 4475 generic.go:334] "Generic (PLEG): container finished" podID="d52a94b2-a290-48af-b060-5f3662029280" containerID="cef3dd2f3d6520eb005a3b365aeea38701ccdbe3533be1f4a38acb48d13996b0" exitCode=0 Dec 03 06:48:44 crc kubenswrapper[4475]: I1203 06:48:44.891496 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-6r542" event={"ID":"d52a94b2-a290-48af-b060-5f3662029280","Type":"ContainerDied","Data":"cef3dd2f3d6520eb005a3b365aeea38701ccdbe3533be1f4a38acb48d13996b0"} Dec 03 06:48:44 crc kubenswrapper[4475]: I1203 06:48:44.891521 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-6r542" event={"ID":"d52a94b2-a290-48af-b060-5f3662029280","Type":"ContainerDied","Data":"63edc3f11007b4cf7c102de13b7c35f159d6d6a8f90da408e1e05ab6353abcb1"} Dec 03 06:48:44 crc kubenswrapper[4475]: I1203 06:48:44.891536 4475 scope.go:117] "RemoveContainer" containerID="cef3dd2f3d6520eb005a3b365aeea38701ccdbe3533be1f4a38acb48d13996b0" Dec 03 06:48:44 crc kubenswrapper[4475]: I1203 06:48:44.891621 4475 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-6r542" Dec 03 06:48:44 crc kubenswrapper[4475]: I1203 06:48:44.901432 4475 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/d52a94b2-a290-48af-b060-5f3662029280-v4-0-config-user-template-provider-selection\") pod \"d52a94b2-a290-48af-b060-5f3662029280\" (UID: \"d52a94b2-a290-48af-b060-5f3662029280\") " Dec 03 06:48:44 crc kubenswrapper[4475]: I1203 06:48:44.901486 4475 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/d52a94b2-a290-48af-b060-5f3662029280-v4-0-config-system-cliconfig\") pod \"d52a94b2-a290-48af-b060-5f3662029280\" (UID: \"d52a94b2-a290-48af-b060-5f3662029280\") " Dec 03 06:48:44 crc kubenswrapper[4475]: I1203 06:48:44.901508 4475 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/d52a94b2-a290-48af-b060-5f3662029280-audit-policies\") pod \"d52a94b2-a290-48af-b060-5f3662029280\" (UID: \"d52a94b2-a290-48af-b060-5f3662029280\") " Dec 03 06:48:44 crc kubenswrapper[4475]: I1203 06:48:44.901530 4475 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/d52a94b2-a290-48af-b060-5f3662029280-v4-0-config-user-idp-0-file-data\") pod \"d52a94b2-a290-48af-b060-5f3662029280\" (UID: \"d52a94b2-a290-48af-b060-5f3662029280\") " Dec 03 06:48:44 crc kubenswrapper[4475]: I1203 06:48:44.901547 4475 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rx6ft\" (UniqueName: \"kubernetes.io/projected/d52a94b2-a290-48af-b060-5f3662029280-kube-api-access-rx6ft\") pod \"d52a94b2-a290-48af-b060-5f3662029280\" (UID: \"d52a94b2-a290-48af-b060-5f3662029280\") " Dec 03 
06:48:44 crc kubenswrapper[4475]: I1203 06:48:44.901563 4475 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/d52a94b2-a290-48af-b060-5f3662029280-v4-0-config-system-router-certs\") pod \"d52a94b2-a290-48af-b060-5f3662029280\" (UID: \"d52a94b2-a290-48af-b060-5f3662029280\") " Dec 03 06:48:44 crc kubenswrapper[4475]: I1203 06:48:44.901584 4475 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/d52a94b2-a290-48af-b060-5f3662029280-v4-0-config-system-serving-cert\") pod \"d52a94b2-a290-48af-b060-5f3662029280\" (UID: \"d52a94b2-a290-48af-b060-5f3662029280\") " Dec 03 06:48:44 crc kubenswrapper[4475]: I1203 06:48:44.901599 4475 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/d52a94b2-a290-48af-b060-5f3662029280-v4-0-config-user-template-error\") pod \"d52a94b2-a290-48af-b060-5f3662029280\" (UID: \"d52a94b2-a290-48af-b060-5f3662029280\") " Dec 03 06:48:44 crc kubenswrapper[4475]: I1203 06:48:44.901613 4475 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/d52a94b2-a290-48af-b060-5f3662029280-audit-dir\") pod \"d52a94b2-a290-48af-b060-5f3662029280\" (UID: \"d52a94b2-a290-48af-b060-5f3662029280\") " Dec 03 06:48:44 crc kubenswrapper[4475]: I1203 06:48:44.901634 4475 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/d52a94b2-a290-48af-b060-5f3662029280-v4-0-config-system-session\") pod \"d52a94b2-a290-48af-b060-5f3662029280\" (UID: \"d52a94b2-a290-48af-b060-5f3662029280\") " Dec 03 06:48:44 crc kubenswrapper[4475]: I1203 06:48:44.901649 4475 reconciler_common.go:159] "operationExecutor.UnmountVolume 
started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/d52a94b2-a290-48af-b060-5f3662029280-v4-0-config-user-template-login\") pod \"d52a94b2-a290-48af-b060-5f3662029280\" (UID: \"d52a94b2-a290-48af-b060-5f3662029280\") " Dec 03 06:48:44 crc kubenswrapper[4475]: I1203 06:48:44.901674 4475 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d52a94b2-a290-48af-b060-5f3662029280-v4-0-config-system-trusted-ca-bundle\") pod \"d52a94b2-a290-48af-b060-5f3662029280\" (UID: \"d52a94b2-a290-48af-b060-5f3662029280\") " Dec 03 06:48:44 crc kubenswrapper[4475]: I1203 06:48:44.901697 4475 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/d52a94b2-a290-48af-b060-5f3662029280-v4-0-config-system-service-ca\") pod \"d52a94b2-a290-48af-b060-5f3662029280\" (UID: \"d52a94b2-a290-48af-b060-5f3662029280\") " Dec 03 06:48:44 crc kubenswrapper[4475]: I1203 06:48:44.901715 4475 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/d52a94b2-a290-48af-b060-5f3662029280-v4-0-config-system-ocp-branding-template\") pod \"d52a94b2-a290-48af-b060-5f3662029280\" (UID: \"d52a94b2-a290-48af-b060-5f3662029280\") " Dec 03 06:48:44 crc kubenswrapper[4475]: I1203 06:48:44.901813 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/d0b9083d-0ce3-4310-93a3-091aea4e519c-v4-0-config-system-router-certs\") pod \"oauth-openshift-8475d84ddf-v62mp\" (UID: \"d0b9083d-0ce3-4310-93a3-091aea4e519c\") " pod="openshift-authentication/oauth-openshift-8475d84ddf-v62mp" Dec 03 06:48:44 crc kubenswrapper[4475]: I1203 06:48:44.901856 4475 operation_generator.go:803] 
UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/d52a94b2-a290-48af-b060-5f3662029280-audit-dir" (OuterVolumeSpecName: "audit-dir") pod "d52a94b2-a290-48af-b060-5f3662029280" (UID: "d52a94b2-a290-48af-b060-5f3662029280"). InnerVolumeSpecName "audit-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 03 06:48:44 crc kubenswrapper[4475]: I1203 06:48:44.902142 4475 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d52a94b2-a290-48af-b060-5f3662029280-audit-policies" (OuterVolumeSpecName: "audit-policies") pod "d52a94b2-a290-48af-b060-5f3662029280" (UID: "d52a94b2-a290-48af-b060-5f3662029280"). InnerVolumeSpecName "audit-policies". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 06:48:44 crc kubenswrapper[4475]: I1203 06:48:44.902321 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/d0b9083d-0ce3-4310-93a3-091aea4e519c-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-8475d84ddf-v62mp\" (UID: \"d0b9083d-0ce3-4310-93a3-091aea4e519c\") " pod="openshift-authentication/oauth-openshift-8475d84ddf-v62mp" Dec 03 06:48:44 crc kubenswrapper[4475]: I1203 06:48:44.902362 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/d0b9083d-0ce3-4310-93a3-091aea4e519c-v4-0-config-system-cliconfig\") pod \"oauth-openshift-8475d84ddf-v62mp\" (UID: \"d0b9083d-0ce3-4310-93a3-091aea4e519c\") " pod="openshift-authentication/oauth-openshift-8475d84ddf-v62mp" Dec 03 06:48:44 crc kubenswrapper[4475]: I1203 06:48:44.902381 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/d0b9083d-0ce3-4310-93a3-091aea4e519c-v4-0-config-user-template-error\") pod 
\"oauth-openshift-8475d84ddf-v62mp\" (UID: \"d0b9083d-0ce3-4310-93a3-091aea4e519c\") " pod="openshift-authentication/oauth-openshift-8475d84ddf-v62mp" Dec 03 06:48:44 crc kubenswrapper[4475]: I1203 06:48:44.902404 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/d0b9083d-0ce3-4310-93a3-091aea4e519c-v4-0-config-user-template-login\") pod \"oauth-openshift-8475d84ddf-v62mp\" (UID: \"d0b9083d-0ce3-4310-93a3-091aea4e519c\") " pod="openshift-authentication/oauth-openshift-8475d84ddf-v62mp" Dec 03 06:48:44 crc kubenswrapper[4475]: I1203 06:48:44.902434 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/d0b9083d-0ce3-4310-93a3-091aea4e519c-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-8475d84ddf-v62mp\" (UID: \"d0b9083d-0ce3-4310-93a3-091aea4e519c\") " pod="openshift-authentication/oauth-openshift-8475d84ddf-v62mp" Dec 03 06:48:44 crc kubenswrapper[4475]: I1203 06:48:44.902511 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/d0b9083d-0ce3-4310-93a3-091aea4e519c-audit-policies\") pod \"oauth-openshift-8475d84ddf-v62mp\" (UID: \"d0b9083d-0ce3-4310-93a3-091aea4e519c\") " pod="openshift-authentication/oauth-openshift-8475d84ddf-v62mp" Dec 03 06:48:44 crc kubenswrapper[4475]: I1203 06:48:44.902531 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/d0b9083d-0ce3-4310-93a3-091aea4e519c-audit-dir\") pod \"oauth-openshift-8475d84ddf-v62mp\" (UID: \"d0b9083d-0ce3-4310-93a3-091aea4e519c\") " pod="openshift-authentication/oauth-openshift-8475d84ddf-v62mp" Dec 03 06:48:44 crc kubenswrapper[4475]: I1203 06:48:44.902546 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/d0b9083d-0ce3-4310-93a3-091aea4e519c-v4-0-config-system-session\") pod \"oauth-openshift-8475d84ddf-v62mp\" (UID: \"d0b9083d-0ce3-4310-93a3-091aea4e519c\") " pod="openshift-authentication/oauth-openshift-8475d84ddf-v62mp" Dec 03 06:48:44 crc kubenswrapper[4475]: I1203 06:48:44.902564 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/d0b9083d-0ce3-4310-93a3-091aea4e519c-v4-0-config-system-serving-cert\") pod \"oauth-openshift-8475d84ddf-v62mp\" (UID: \"d0b9083d-0ce3-4310-93a3-091aea4e519c\") " pod="openshift-authentication/oauth-openshift-8475d84ddf-v62mp" Dec 03 06:48:44 crc kubenswrapper[4475]: I1203 06:48:44.902578 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/d0b9083d-0ce3-4310-93a3-091aea4e519c-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-8475d84ddf-v62mp\" (UID: \"d0b9083d-0ce3-4310-93a3-091aea4e519c\") " pod="openshift-authentication/oauth-openshift-8475d84ddf-v62mp" Dec 03 06:48:44 crc kubenswrapper[4475]: I1203 06:48:44.902600 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7wpnj\" (UniqueName: \"kubernetes.io/projected/d0b9083d-0ce3-4310-93a3-091aea4e519c-kube-api-access-7wpnj\") pod \"oauth-openshift-8475d84ddf-v62mp\" (UID: \"d0b9083d-0ce3-4310-93a3-091aea4e519c\") " pod="openshift-authentication/oauth-openshift-8475d84ddf-v62mp" Dec 03 06:48:44 crc kubenswrapper[4475]: I1203 06:48:44.902618 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/d0b9083d-0ce3-4310-93a3-091aea4e519c-v4-0-config-system-service-ca\") pod \"oauth-openshift-8475d84ddf-v62mp\" (UID: \"d0b9083d-0ce3-4310-93a3-091aea4e519c\") " 
pod="openshift-authentication/oauth-openshift-8475d84ddf-v62mp" Dec 03 06:48:44 crc kubenswrapper[4475]: I1203 06:48:44.902634 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d0b9083d-0ce3-4310-93a3-091aea4e519c-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-8475d84ddf-v62mp\" (UID: \"d0b9083d-0ce3-4310-93a3-091aea4e519c\") " pod="openshift-authentication/oauth-openshift-8475d84ddf-v62mp" Dec 03 06:48:44 crc kubenswrapper[4475]: I1203 06:48:44.902791 4475 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d52a94b2-a290-48af-b060-5f3662029280-v4-0-config-system-trusted-ca-bundle" (OuterVolumeSpecName: "v4-0-config-system-trusted-ca-bundle") pod "d52a94b2-a290-48af-b060-5f3662029280" (UID: "d52a94b2-a290-48af-b060-5f3662029280"). InnerVolumeSpecName "v4-0-config-system-trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 06:48:44 crc kubenswrapper[4475]: I1203 06:48:44.902777 4475 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d52a94b2-a290-48af-b060-5f3662029280-v4-0-config-system-cliconfig" (OuterVolumeSpecName: "v4-0-config-system-cliconfig") pod "d52a94b2-a290-48af-b060-5f3662029280" (UID: "d52a94b2-a290-48af-b060-5f3662029280"). InnerVolumeSpecName "v4-0-config-system-cliconfig". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 06:48:44 crc kubenswrapper[4475]: I1203 06:48:44.903669 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d0b9083d-0ce3-4310-93a3-091aea4e519c-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-8475d84ddf-v62mp\" (UID: \"d0b9083d-0ce3-4310-93a3-091aea4e519c\") " pod="openshift-authentication/oauth-openshift-8475d84ddf-v62mp" Dec 03 06:48:44 crc kubenswrapper[4475]: I1203 06:48:44.904010 4475 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d52a94b2-a290-48af-b060-5f3662029280-v4-0-config-system-service-ca" (OuterVolumeSpecName: "v4-0-config-system-service-ca") pod "d52a94b2-a290-48af-b060-5f3662029280" (UID: "d52a94b2-a290-48af-b060-5f3662029280"). InnerVolumeSpecName "v4-0-config-system-service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 06:48:44 crc kubenswrapper[4475]: I1203 06:48:44.905684 4475 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d52a94b2-a290-48af-b060-5f3662029280-v4-0-config-user-idp-0-file-data" (OuterVolumeSpecName: "v4-0-config-user-idp-0-file-data") pod "d52a94b2-a290-48af-b060-5f3662029280" (UID: "d52a94b2-a290-48af-b060-5f3662029280"). InnerVolumeSpecName "v4-0-config-user-idp-0-file-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 06:48:44 crc kubenswrapper[4475]: I1203 06:48:44.905783 4475 scope.go:117] "RemoveContainer" containerID="cef3dd2f3d6520eb005a3b365aeea38701ccdbe3533be1f4a38acb48d13996b0" Dec 03 06:48:44 crc kubenswrapper[4475]: I1203 06:48:44.905941 4475 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d52a94b2-a290-48af-b060-5f3662029280-v4-0-config-user-template-provider-selection" (OuterVolumeSpecName: "v4-0-config-user-template-provider-selection") pod "d52a94b2-a290-48af-b060-5f3662029280" (UID: "d52a94b2-a290-48af-b060-5f3662029280"). InnerVolumeSpecName "v4-0-config-user-template-provider-selection". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 06:48:44 crc kubenswrapper[4475]: E1203 06:48:44.906126 4475 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"cef3dd2f3d6520eb005a3b365aeea38701ccdbe3533be1f4a38acb48d13996b0\": container with ID starting with cef3dd2f3d6520eb005a3b365aeea38701ccdbe3533be1f4a38acb48d13996b0 not found: ID does not exist" containerID="cef3dd2f3d6520eb005a3b365aeea38701ccdbe3533be1f4a38acb48d13996b0" Dec 03 06:48:44 crc kubenswrapper[4475]: I1203 06:48:44.906268 4475 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cef3dd2f3d6520eb005a3b365aeea38701ccdbe3533be1f4a38acb48d13996b0"} err="failed to get container status \"cef3dd2f3d6520eb005a3b365aeea38701ccdbe3533be1f4a38acb48d13996b0\": rpc error: code = NotFound desc = could not find container \"cef3dd2f3d6520eb005a3b365aeea38701ccdbe3533be1f4a38acb48d13996b0\": container with ID starting with cef3dd2f3d6520eb005a3b365aeea38701ccdbe3533be1f4a38acb48d13996b0 not found: ID does not exist" Dec 03 06:48:44 crc kubenswrapper[4475]: I1203 06:48:44.906187 4475 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/secret/d52a94b2-a290-48af-b060-5f3662029280-v4-0-config-user-template-error" (OuterVolumeSpecName: "v4-0-config-user-template-error") pod "d52a94b2-a290-48af-b060-5f3662029280" (UID: "d52a94b2-a290-48af-b060-5f3662029280"). InnerVolumeSpecName "v4-0-config-user-template-error". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 06:48:44 crc kubenswrapper[4475]: I1203 06:48:44.906440 4475 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d52a94b2-a290-48af-b060-5f3662029280-v4-0-config-system-router-certs" (OuterVolumeSpecName: "v4-0-config-system-router-certs") pod "d52a94b2-a290-48af-b060-5f3662029280" (UID: "d52a94b2-a290-48af-b060-5f3662029280"). InnerVolumeSpecName "v4-0-config-system-router-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 06:48:44 crc kubenswrapper[4475]: I1203 06:48:44.906957 4475 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d52a94b2-a290-48af-b060-5f3662029280-v4-0-config-system-ocp-branding-template" (OuterVolumeSpecName: "v4-0-config-system-ocp-branding-template") pod "d52a94b2-a290-48af-b060-5f3662029280" (UID: "d52a94b2-a290-48af-b060-5f3662029280"). InnerVolumeSpecName "v4-0-config-system-ocp-branding-template". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 06:48:44 crc kubenswrapper[4475]: I1203 06:48:44.907507 4475 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d52a94b2-a290-48af-b060-5f3662029280-v4-0-config-system-session" (OuterVolumeSpecName: "v4-0-config-system-session") pod "d52a94b2-a290-48af-b060-5f3662029280" (UID: "d52a94b2-a290-48af-b060-5f3662029280"). InnerVolumeSpecName "v4-0-config-system-session". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 06:48:44 crc kubenswrapper[4475]: I1203 06:48:44.907764 4475 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d52a94b2-a290-48af-b060-5f3662029280-v4-0-config-system-serving-cert" (OuterVolumeSpecName: "v4-0-config-system-serving-cert") pod "d52a94b2-a290-48af-b060-5f3662029280" (UID: "d52a94b2-a290-48af-b060-5f3662029280"). InnerVolumeSpecName "v4-0-config-system-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 06:48:44 crc kubenswrapper[4475]: I1203 06:48:44.907817 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/d0b9083d-0ce3-4310-93a3-091aea4e519c-audit-dir\") pod \"oauth-openshift-8475d84ddf-v62mp\" (UID: \"d0b9083d-0ce3-4310-93a3-091aea4e519c\") " pod="openshift-authentication/oauth-openshift-8475d84ddf-v62mp" Dec 03 06:48:44 crc kubenswrapper[4475]: I1203 06:48:44.908155 4475 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d52a94b2-a290-48af-b060-5f3662029280-kube-api-access-rx6ft" (OuterVolumeSpecName: "kube-api-access-rx6ft") pod "d52a94b2-a290-48af-b060-5f3662029280" (UID: "d52a94b2-a290-48af-b060-5f3662029280"). InnerVolumeSpecName "kube-api-access-rx6ft". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 06:48:44 crc kubenswrapper[4475]: I1203 06:48:44.908211 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/d0b9083d-0ce3-4310-93a3-091aea4e519c-audit-policies\") pod \"oauth-openshift-8475d84ddf-v62mp\" (UID: \"d0b9083d-0ce3-4310-93a3-091aea4e519c\") " pod="openshift-authentication/oauth-openshift-8475d84ddf-v62mp" Dec 03 06:48:44 crc kubenswrapper[4475]: I1203 06:48:44.908382 4475 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d52a94b2-a290-48af-b060-5f3662029280-v4-0-config-user-template-login" (OuterVolumeSpecName: "v4-0-config-user-template-login") pod "d52a94b2-a290-48af-b060-5f3662029280" (UID: "d52a94b2-a290-48af-b060-5f3662029280"). InnerVolumeSpecName "v4-0-config-user-template-login". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 06:48:44 crc kubenswrapper[4475]: I1203 06:48:44.908937 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/d0b9083d-0ce3-4310-93a3-091aea4e519c-v4-0-config-system-serving-cert\") pod \"oauth-openshift-8475d84ddf-v62mp\" (UID: \"d0b9083d-0ce3-4310-93a3-091aea4e519c\") " pod="openshift-authentication/oauth-openshift-8475d84ddf-v62mp" Dec 03 06:48:44 crc kubenswrapper[4475]: I1203 06:48:44.909280 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/d0b9083d-0ce3-4310-93a3-091aea4e519c-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-8475d84ddf-v62mp\" (UID: \"d0b9083d-0ce3-4310-93a3-091aea4e519c\") " pod="openshift-authentication/oauth-openshift-8475d84ddf-v62mp" Dec 03 06:48:44 crc kubenswrapper[4475]: I1203 06:48:44.909471 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-service-ca\" 
(UniqueName: \"kubernetes.io/configmap/d0b9083d-0ce3-4310-93a3-091aea4e519c-v4-0-config-system-service-ca\") pod \"oauth-openshift-8475d84ddf-v62mp\" (UID: \"d0b9083d-0ce3-4310-93a3-091aea4e519c\") " pod="openshift-authentication/oauth-openshift-8475d84ddf-v62mp" Dec 03 06:48:44 crc kubenswrapper[4475]: I1203 06:48:44.909911 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/d0b9083d-0ce3-4310-93a3-091aea4e519c-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-8475d84ddf-v62mp\" (UID: \"d0b9083d-0ce3-4310-93a3-091aea4e519c\") " pod="openshift-authentication/oauth-openshift-8475d84ddf-v62mp" Dec 03 06:48:44 crc kubenswrapper[4475]: I1203 06:48:44.910276 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/d0b9083d-0ce3-4310-93a3-091aea4e519c-v4-0-config-system-router-certs\") pod \"oauth-openshift-8475d84ddf-v62mp\" (UID: \"d0b9083d-0ce3-4310-93a3-091aea4e519c\") " pod="openshift-authentication/oauth-openshift-8475d84ddf-v62mp" Dec 03 06:48:44 crc kubenswrapper[4475]: I1203 06:48:44.910286 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/d0b9083d-0ce3-4310-93a3-091aea4e519c-v4-0-config-system-cliconfig\") pod \"oauth-openshift-8475d84ddf-v62mp\" (UID: \"d0b9083d-0ce3-4310-93a3-091aea4e519c\") " pod="openshift-authentication/oauth-openshift-8475d84ddf-v62mp" Dec 03 06:48:44 crc kubenswrapper[4475]: I1203 06:48:44.911939 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/d0b9083d-0ce3-4310-93a3-091aea4e519c-v4-0-config-system-session\") pod \"oauth-openshift-8475d84ddf-v62mp\" (UID: \"d0b9083d-0ce3-4310-93a3-091aea4e519c\") " 
pod="openshift-authentication/oauth-openshift-8475d84ddf-v62mp" Dec 03 06:48:44 crc kubenswrapper[4475]: I1203 06:48:44.913795 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/d0b9083d-0ce3-4310-93a3-091aea4e519c-v4-0-config-user-template-login\") pod \"oauth-openshift-8475d84ddf-v62mp\" (UID: \"d0b9083d-0ce3-4310-93a3-091aea4e519c\") " pod="openshift-authentication/oauth-openshift-8475d84ddf-v62mp" Dec 03 06:48:44 crc kubenswrapper[4475]: I1203 06:48:44.915426 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/d0b9083d-0ce3-4310-93a3-091aea4e519c-v4-0-config-user-template-error\") pod \"oauth-openshift-8475d84ddf-v62mp\" (UID: \"d0b9083d-0ce3-4310-93a3-091aea4e519c\") " pod="openshift-authentication/oauth-openshift-8475d84ddf-v62mp" Dec 03 06:48:44 crc kubenswrapper[4475]: I1203 06:48:44.915819 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/d0b9083d-0ce3-4310-93a3-091aea4e519c-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-8475d84ddf-v62mp\" (UID: \"d0b9083d-0ce3-4310-93a3-091aea4e519c\") " pod="openshift-authentication/oauth-openshift-8475d84ddf-v62mp" Dec 03 06:48:44 crc kubenswrapper[4475]: I1203 06:48:44.920965 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7wpnj\" (UniqueName: \"kubernetes.io/projected/d0b9083d-0ce3-4310-93a3-091aea4e519c-kube-api-access-7wpnj\") pod \"oauth-openshift-8475d84ddf-v62mp\" (UID: \"d0b9083d-0ce3-4310-93a3-091aea4e519c\") " pod="openshift-authentication/oauth-openshift-8475d84ddf-v62mp" Dec 03 06:48:45 crc kubenswrapper[4475]: I1203 06:48:45.003493 4475 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: 
\"kubernetes.io/secret/d52a94b2-a290-48af-b060-5f3662029280-v4-0-config-user-template-provider-selection\") on node \"crc\" DevicePath \"\"" Dec 03 06:48:45 crc kubenswrapper[4475]: I1203 06:48:45.003525 4475 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/d52a94b2-a290-48af-b060-5f3662029280-v4-0-config-system-cliconfig\") on node \"crc\" DevicePath \"\"" Dec 03 06:48:45 crc kubenswrapper[4475]: I1203 06:48:45.003536 4475 reconciler_common.go:293] "Volume detached for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/d52a94b2-a290-48af-b060-5f3662029280-audit-policies\") on node \"crc\" DevicePath \"\"" Dec 03 06:48:45 crc kubenswrapper[4475]: I1203 06:48:45.003546 4475 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/d52a94b2-a290-48af-b060-5f3662029280-v4-0-config-user-idp-0-file-data\") on node \"crc\" DevicePath \"\"" Dec 03 06:48:45 crc kubenswrapper[4475]: I1203 06:48:45.003555 4475 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rx6ft\" (UniqueName: \"kubernetes.io/projected/d52a94b2-a290-48af-b060-5f3662029280-kube-api-access-rx6ft\") on node \"crc\" DevicePath \"\"" Dec 03 06:48:45 crc kubenswrapper[4475]: I1203 06:48:45.003564 4475 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/d52a94b2-a290-48af-b060-5f3662029280-v4-0-config-system-router-certs\") on node \"crc\" DevicePath \"\"" Dec 03 06:48:45 crc kubenswrapper[4475]: I1203 06:48:45.003572 4475 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/d52a94b2-a290-48af-b060-5f3662029280-v4-0-config-system-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 03 06:48:45 crc kubenswrapper[4475]: I1203 06:48:45.003582 4475 reconciler_common.go:293] "Volume detached 
for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/d52a94b2-a290-48af-b060-5f3662029280-v4-0-config-user-template-error\") on node \"crc\" DevicePath \"\"" Dec 03 06:48:45 crc kubenswrapper[4475]: I1203 06:48:45.003591 4475 reconciler_common.go:293] "Volume detached for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/d52a94b2-a290-48af-b060-5f3662029280-audit-dir\") on node \"crc\" DevicePath \"\"" Dec 03 06:48:45 crc kubenswrapper[4475]: I1203 06:48:45.003598 4475 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/d52a94b2-a290-48af-b060-5f3662029280-v4-0-config-system-session\") on node \"crc\" DevicePath \"\"" Dec 03 06:48:45 crc kubenswrapper[4475]: I1203 06:48:45.003606 4475 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/d52a94b2-a290-48af-b060-5f3662029280-v4-0-config-user-template-login\") on node \"crc\" DevicePath \"\"" Dec 03 06:48:45 crc kubenswrapper[4475]: I1203 06:48:45.003614 4475 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d52a94b2-a290-48af-b060-5f3662029280-v4-0-config-system-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 03 06:48:45 crc kubenswrapper[4475]: I1203 06:48:45.003626 4475 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/d52a94b2-a290-48af-b060-5f3662029280-v4-0-config-system-service-ca\") on node \"crc\" DevicePath \"\"" Dec 03 06:48:45 crc kubenswrapper[4475]: I1203 06:48:45.003634 4475 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/d52a94b2-a290-48af-b060-5f3662029280-v4-0-config-system-ocp-branding-template\") on node \"crc\" DevicePath \"\"" Dec 03 06:48:45 crc 
kubenswrapper[4475]: I1203 06:48:45.078824 4475 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-8475d84ddf-v62mp" Dec 03 06:48:45 crc kubenswrapper[4475]: I1203 06:48:45.213931 4475 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-6r542"] Dec 03 06:48:45 crc kubenswrapper[4475]: I1203 06:48:45.217689 4475 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-6r542"] Dec 03 06:48:45 crc kubenswrapper[4475]: I1203 06:48:45.402986 4475 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-8475d84ddf-v62mp"] Dec 03 06:48:45 crc kubenswrapper[4475]: I1203 06:48:45.496925 4475 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d52a94b2-a290-48af-b060-5f3662029280" path="/var/lib/kubelet/pods/d52a94b2-a290-48af-b060-5f3662029280/volumes" Dec 03 06:48:45 crc kubenswrapper[4475]: I1203 06:48:45.896812 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-8475d84ddf-v62mp" event={"ID":"d0b9083d-0ce3-4310-93a3-091aea4e519c","Type":"ContainerStarted","Data":"1896a308b563bb4328b5ed3359f7b72733c133efcf60ddc0bb08d7444e4e7112"} Dec 03 06:48:45 crc kubenswrapper[4475]: I1203 06:48:45.896846 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-8475d84ddf-v62mp" event={"ID":"d0b9083d-0ce3-4310-93a3-091aea4e519c","Type":"ContainerStarted","Data":"79e51c22a0e7c0041eca26136ce45fcb4ec67f79c15838c3fc2dbd98c048170d"} Dec 03 06:48:45 crc kubenswrapper[4475]: I1203 06:48:45.912728 4475 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-authentication/oauth-openshift-8475d84ddf-v62mp" podStartSLOduration=26.912713938 podStartE2EDuration="26.912713938s" podCreationTimestamp="2025-12-03 06:48:19 +0000 UTC" 
firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 06:48:45.909928834 +0000 UTC m=+210.714827169" watchObservedRunningTime="2025-12-03 06:48:45.912713938 +0000 UTC m=+210.717612272" Dec 03 06:48:46 crc kubenswrapper[4475]: I1203 06:48:46.901204 4475 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-authentication/oauth-openshift-8475d84ddf-v62mp" Dec 03 06:48:46 crc kubenswrapper[4475]: I1203 06:48:46.905233 4475 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-authentication/oauth-openshift-8475d84ddf-v62mp" Dec 03 06:48:54 crc kubenswrapper[4475]: I1203 06:48:54.486769 4475 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-qkcc8"] Dec 03 06:48:54 crc kubenswrapper[4475]: I1203 06:48:54.487333 4475 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-qkcc8" podUID="30be3012-ac26-4a64-b650-66174f25549a" containerName="registry-server" containerID="cri-o://16f146b802ca1c7d3fd614b2515a8ab000b9cdd7dc7d787e5218a38085f7a002" gracePeriod=30 Dec 03 06:48:54 crc kubenswrapper[4475]: I1203 06:48:54.500131 4475 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-xsp8w"] Dec 03 06:48:54 crc kubenswrapper[4475]: I1203 06:48:54.500412 4475 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-xsp8w" podUID="c6ab24e6-93f2-46dd-aace-3de3344bd9f1" containerName="registry-server" containerID="cri-o://7bd9ce7df11ad541da68e53078663b52825e0d135eba8b3beba4f370abc449e8" gracePeriod=30 Dec 03 06:48:54 crc kubenswrapper[4475]: I1203 06:48:54.506203 4475 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-kfvwc"] Dec 03 06:48:54 crc kubenswrapper[4475]: I1203 
06:48:54.506364 4475 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/marketplace-operator-79b997595-kfvwc" podUID="7f6163e8-ce0d-481b-8483-4b9e04d381e6" containerName="marketplace-operator" containerID="cri-o://2bfaf119407f4b0f1539a6e9269b77d93580b274d0cc224ede9a1a17eb354cdc" gracePeriod=30 Dec 03 06:48:54 crc kubenswrapper[4475]: I1203 06:48:54.515850 4475 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-q8f5d"] Dec 03 06:48:54 crc kubenswrapper[4475]: I1203 06:48:54.516044 4475 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-q8f5d" podUID="ece5cd8d-8d3a-48ab-b2dc-194339fb1ff2" containerName="registry-server" containerID="cri-o://68b043e0c00f78036582cc13706337b074ca4719eb6d07684ef40e9184cba405" gracePeriod=30 Dec 03 06:48:54 crc kubenswrapper[4475]: I1203 06:48:54.526856 4475 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-p8xfd"] Dec 03 06:48:54 crc kubenswrapper[4475]: I1203 06:48:54.527426 4475 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-p8xfd" Dec 03 06:48:54 crc kubenswrapper[4475]: I1203 06:48:54.528441 4475 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-tnqbz"] Dec 03 06:48:54 crc kubenswrapper[4475]: I1203 06:48:54.528632 4475 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-tnqbz" podUID="d75bb35d-29b7-4994-9bc6-756f2950d3fd" containerName="registry-server" containerID="cri-o://3e244226b044d928f00d8f988546d19f2c95eb36fe6bbf9807f072f51a9fde21" gracePeriod=30 Dec 03 06:48:54 crc kubenswrapper[4475]: I1203 06:48:54.537543 4475 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-p8xfd"] Dec 03 06:48:54 crc kubenswrapper[4475]: I1203 06:48:54.705112 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/72e92f7d-5ec5-43d3-b81a-df695e7adbde-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-p8xfd\" (UID: \"72e92f7d-5ec5-43d3-b81a-df695e7adbde\") " pod="openshift-marketplace/marketplace-operator-79b997595-p8xfd" Dec 03 06:48:54 crc kubenswrapper[4475]: I1203 06:48:54.705304 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/72e92f7d-5ec5-43d3-b81a-df695e7adbde-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-p8xfd\" (UID: \"72e92f7d-5ec5-43d3-b81a-df695e7adbde\") " pod="openshift-marketplace/marketplace-operator-79b997595-p8xfd" Dec 03 06:48:54 crc kubenswrapper[4475]: I1203 06:48:54.705342 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-84r7h\" (UniqueName: 
\"kubernetes.io/projected/72e92f7d-5ec5-43d3-b81a-df695e7adbde-kube-api-access-84r7h\") pod \"marketplace-operator-79b997595-p8xfd\" (UID: \"72e92f7d-5ec5-43d3-b81a-df695e7adbde\") " pod="openshift-marketplace/marketplace-operator-79b997595-p8xfd" Dec 03 06:48:54 crc kubenswrapper[4475]: I1203 06:48:54.805741 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/72e92f7d-5ec5-43d3-b81a-df695e7adbde-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-p8xfd\" (UID: \"72e92f7d-5ec5-43d3-b81a-df695e7adbde\") " pod="openshift-marketplace/marketplace-operator-79b997595-p8xfd" Dec 03 06:48:54 crc kubenswrapper[4475]: I1203 06:48:54.805796 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/72e92f7d-5ec5-43d3-b81a-df695e7adbde-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-p8xfd\" (UID: \"72e92f7d-5ec5-43d3-b81a-df695e7adbde\") " pod="openshift-marketplace/marketplace-operator-79b997595-p8xfd" Dec 03 06:48:54 crc kubenswrapper[4475]: I1203 06:48:54.805822 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-84r7h\" (UniqueName: \"kubernetes.io/projected/72e92f7d-5ec5-43d3-b81a-df695e7adbde-kube-api-access-84r7h\") pod \"marketplace-operator-79b997595-p8xfd\" (UID: \"72e92f7d-5ec5-43d3-b81a-df695e7adbde\") " pod="openshift-marketplace/marketplace-operator-79b997595-p8xfd" Dec 03 06:48:54 crc kubenswrapper[4475]: I1203 06:48:54.807437 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/72e92f7d-5ec5-43d3-b81a-df695e7adbde-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-p8xfd\" (UID: \"72e92f7d-5ec5-43d3-b81a-df695e7adbde\") " pod="openshift-marketplace/marketplace-operator-79b997595-p8xfd" Dec 03 06:48:54 crc 
kubenswrapper[4475]: I1203 06:48:54.812348 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/72e92f7d-5ec5-43d3-b81a-df695e7adbde-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-p8xfd\" (UID: \"72e92f7d-5ec5-43d3-b81a-df695e7adbde\") " pod="openshift-marketplace/marketplace-operator-79b997595-p8xfd" Dec 03 06:48:54 crc kubenswrapper[4475]: I1203 06:48:54.819566 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-84r7h\" (UniqueName: \"kubernetes.io/projected/72e92f7d-5ec5-43d3-b81a-df695e7adbde-kube-api-access-84r7h\") pod \"marketplace-operator-79b997595-p8xfd\" (UID: \"72e92f7d-5ec5-43d3-b81a-df695e7adbde\") " pod="openshift-marketplace/marketplace-operator-79b997595-p8xfd" Dec 03 06:48:54 crc kubenswrapper[4475]: I1203 06:48:54.838675 4475 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-p8xfd" Dec 03 06:48:54 crc kubenswrapper[4475]: I1203 06:48:54.848145 4475 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-qkcc8" Dec 03 06:48:54 crc kubenswrapper[4475]: I1203 06:48:54.881105 4475 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-kfvwc" Dec 03 06:48:54 crc kubenswrapper[4475]: I1203 06:48:54.901118 4475 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-xsp8w" Dec 03 06:48:54 crc kubenswrapper[4475]: I1203 06:48:54.910655 4475 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c6ab24e6-93f2-46dd-aace-3de3344bd9f1-catalog-content\") pod \"c6ab24e6-93f2-46dd-aace-3de3344bd9f1\" (UID: \"c6ab24e6-93f2-46dd-aace-3de3344bd9f1\") " Dec 03 06:48:54 crc kubenswrapper[4475]: I1203 06:48:54.910700 4475 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c6ab24e6-93f2-46dd-aace-3de3344bd9f1-utilities\") pod \"c6ab24e6-93f2-46dd-aace-3de3344bd9f1\" (UID: \"c6ab24e6-93f2-46dd-aace-3de3344bd9f1\") " Dec 03 06:48:54 crc kubenswrapper[4475]: I1203 06:48:54.910720 4475 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/30be3012-ac26-4a64-b650-66174f25549a-utilities\") pod \"30be3012-ac26-4a64-b650-66174f25549a\" (UID: \"30be3012-ac26-4a64-b650-66174f25549a\") " Dec 03 06:48:54 crc kubenswrapper[4475]: I1203 06:48:54.910741 4475 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-s67lb\" (UniqueName: \"kubernetes.io/projected/30be3012-ac26-4a64-b650-66174f25549a-kube-api-access-s67lb\") pod \"30be3012-ac26-4a64-b650-66174f25549a\" (UID: \"30be3012-ac26-4a64-b650-66174f25549a\") " Dec 03 06:48:54 crc kubenswrapper[4475]: I1203 06:48:54.910800 4475 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xrvnc\" (UniqueName: \"kubernetes.io/projected/7f6163e8-ce0d-481b-8483-4b9e04d381e6-kube-api-access-xrvnc\") pod \"7f6163e8-ce0d-481b-8483-4b9e04d381e6\" (UID: \"7f6163e8-ce0d-481b-8483-4b9e04d381e6\") " Dec 03 06:48:54 crc kubenswrapper[4475]: I1203 06:48:54.910821 4475 reconciler_common.go:159] "operationExecutor.UnmountVolume started for 
volume \"kube-api-access-t46l5\" (UniqueName: \"kubernetes.io/projected/c6ab24e6-93f2-46dd-aace-3de3344bd9f1-kube-api-access-t46l5\") pod \"c6ab24e6-93f2-46dd-aace-3de3344bd9f1\" (UID: \"c6ab24e6-93f2-46dd-aace-3de3344bd9f1\") " Dec 03 06:48:54 crc kubenswrapper[4475]: I1203 06:48:54.910841 4475 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/30be3012-ac26-4a64-b650-66174f25549a-catalog-content\") pod \"30be3012-ac26-4a64-b650-66174f25549a\" (UID: \"30be3012-ac26-4a64-b650-66174f25549a\") " Dec 03 06:48:54 crc kubenswrapper[4475]: I1203 06:48:54.910885 4475 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/7f6163e8-ce0d-481b-8483-4b9e04d381e6-marketplace-operator-metrics\") pod \"7f6163e8-ce0d-481b-8483-4b9e04d381e6\" (UID: \"7f6163e8-ce0d-481b-8483-4b9e04d381e6\") " Dec 03 06:48:54 crc kubenswrapper[4475]: I1203 06:48:54.910909 4475 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/7f6163e8-ce0d-481b-8483-4b9e04d381e6-marketplace-trusted-ca\") pod \"7f6163e8-ce0d-481b-8483-4b9e04d381e6\" (UID: \"7f6163e8-ce0d-481b-8483-4b9e04d381e6\") " Dec 03 06:48:54 crc kubenswrapper[4475]: I1203 06:48:54.912099 4475 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7f6163e8-ce0d-481b-8483-4b9e04d381e6-marketplace-trusted-ca" (OuterVolumeSpecName: "marketplace-trusted-ca") pod "7f6163e8-ce0d-481b-8483-4b9e04d381e6" (UID: "7f6163e8-ce0d-481b-8483-4b9e04d381e6"). InnerVolumeSpecName "marketplace-trusted-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 06:48:54 crc kubenswrapper[4475]: I1203 06:48:54.912823 4475 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c6ab24e6-93f2-46dd-aace-3de3344bd9f1-utilities" (OuterVolumeSpecName: "utilities") pod "c6ab24e6-93f2-46dd-aace-3de3344bd9f1" (UID: "c6ab24e6-93f2-46dd-aace-3de3344bd9f1"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 06:48:54 crc kubenswrapper[4475]: I1203 06:48:54.915491 4475 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/30be3012-ac26-4a64-b650-66174f25549a-kube-api-access-s67lb" (OuterVolumeSpecName: "kube-api-access-s67lb") pod "30be3012-ac26-4a64-b650-66174f25549a" (UID: "30be3012-ac26-4a64-b650-66174f25549a"). InnerVolumeSpecName "kube-api-access-s67lb". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 06:48:54 crc kubenswrapper[4475]: I1203 06:48:54.916046 4475 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7f6163e8-ce0d-481b-8483-4b9e04d381e6-kube-api-access-xrvnc" (OuterVolumeSpecName: "kube-api-access-xrvnc") pod "7f6163e8-ce0d-481b-8483-4b9e04d381e6" (UID: "7f6163e8-ce0d-481b-8483-4b9e04d381e6"). InnerVolumeSpecName "kube-api-access-xrvnc". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 06:48:54 crc kubenswrapper[4475]: I1203 06:48:54.917183 4475 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/30be3012-ac26-4a64-b650-66174f25549a-utilities" (OuterVolumeSpecName: "utilities") pod "30be3012-ac26-4a64-b650-66174f25549a" (UID: "30be3012-ac26-4a64-b650-66174f25549a"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 06:48:54 crc kubenswrapper[4475]: I1203 06:48:54.917727 4475 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c6ab24e6-93f2-46dd-aace-3de3344bd9f1-kube-api-access-t46l5" (OuterVolumeSpecName: "kube-api-access-t46l5") pod "c6ab24e6-93f2-46dd-aace-3de3344bd9f1" (UID: "c6ab24e6-93f2-46dd-aace-3de3344bd9f1"). InnerVolumeSpecName "kube-api-access-t46l5". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 06:48:54 crc kubenswrapper[4475]: I1203 06:48:54.922420 4475 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7f6163e8-ce0d-481b-8483-4b9e04d381e6-marketplace-operator-metrics" (OuterVolumeSpecName: "marketplace-operator-metrics") pod "7f6163e8-ce0d-481b-8483-4b9e04d381e6" (UID: "7f6163e8-ce0d-481b-8483-4b9e04d381e6"). InnerVolumeSpecName "marketplace-operator-metrics". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 06:48:54 crc kubenswrapper[4475]: I1203 06:48:54.948297 4475 generic.go:334] "Generic (PLEG): container finished" podID="7f6163e8-ce0d-481b-8483-4b9e04d381e6" containerID="2bfaf119407f4b0f1539a6e9269b77d93580b274d0cc224ede9a1a17eb354cdc" exitCode=0 Dec 03 06:48:54 crc kubenswrapper[4475]: I1203 06:48:54.949170 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-kfvwc" event={"ID":"7f6163e8-ce0d-481b-8483-4b9e04d381e6","Type":"ContainerDied","Data":"2bfaf119407f4b0f1539a6e9269b77d93580b274d0cc224ede9a1a17eb354cdc"} Dec 03 06:48:54 crc kubenswrapper[4475]: I1203 06:48:54.949200 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-kfvwc" event={"ID":"7f6163e8-ce0d-481b-8483-4b9e04d381e6","Type":"ContainerDied","Data":"ca2780568ea913491e5769586ae8e06b6ef8522ad3cde3c33c744bccf05aedb7"} Dec 03 06:48:54 crc kubenswrapper[4475]: I1203 06:48:54.949766 4475 util.go:48] 
"No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-kfvwc" Dec 03 06:48:54 crc kubenswrapper[4475]: I1203 06:48:54.949217 4475 scope.go:117] "RemoveContainer" containerID="2bfaf119407f4b0f1539a6e9269b77d93580b274d0cc224ede9a1a17eb354cdc" Dec 03 06:48:54 crc kubenswrapper[4475]: I1203 06:48:54.959182 4475 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/30be3012-ac26-4a64-b650-66174f25549a-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "30be3012-ac26-4a64-b650-66174f25549a" (UID: "30be3012-ac26-4a64-b650-66174f25549a"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 06:48:54 crc kubenswrapper[4475]: I1203 06:48:54.977580 4475 generic.go:334] "Generic (PLEG): container finished" podID="ece5cd8d-8d3a-48ab-b2dc-194339fb1ff2" containerID="68b043e0c00f78036582cc13706337b074ca4719eb6d07684ef40e9184cba405" exitCode=0 Dec 03 06:48:54 crc kubenswrapper[4475]: I1203 06:48:54.977753 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-q8f5d" event={"ID":"ece5cd8d-8d3a-48ab-b2dc-194339fb1ff2","Type":"ContainerDied","Data":"68b043e0c00f78036582cc13706337b074ca4719eb6d07684ef40e9184cba405"} Dec 03 06:48:54 crc kubenswrapper[4475]: I1203 06:48:54.983591 4475 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-q8f5d" Dec 03 06:48:54 crc kubenswrapper[4475]: I1203 06:48:54.984701 4475 generic.go:334] "Generic (PLEG): container finished" podID="c6ab24e6-93f2-46dd-aace-3de3344bd9f1" containerID="7bd9ce7df11ad541da68e53078663b52825e0d135eba8b3beba4f370abc449e8" exitCode=0 Dec 03 06:48:54 crc kubenswrapper[4475]: I1203 06:48:54.984751 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-xsp8w" event={"ID":"c6ab24e6-93f2-46dd-aace-3de3344bd9f1","Type":"ContainerDied","Data":"7bd9ce7df11ad541da68e53078663b52825e0d135eba8b3beba4f370abc449e8"} Dec 03 06:48:54 crc kubenswrapper[4475]: I1203 06:48:54.984789 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-xsp8w" event={"ID":"c6ab24e6-93f2-46dd-aace-3de3344bd9f1","Type":"ContainerDied","Data":"3c41e97b4318fff415633c9ff33212f4dbac5a35e477860297daac166242b261"} Dec 03 06:48:54 crc kubenswrapper[4475]: I1203 06:48:54.984815 4475 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-xsp8w" Dec 03 06:48:54 crc kubenswrapper[4475]: I1203 06:48:54.987632 4475 scope.go:117] "RemoveContainer" containerID="2bfaf119407f4b0f1539a6e9269b77d93580b274d0cc224ede9a1a17eb354cdc" Dec 03 06:48:54 crc kubenswrapper[4475]: E1203 06:48:54.988803 4475 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2bfaf119407f4b0f1539a6e9269b77d93580b274d0cc224ede9a1a17eb354cdc\": container with ID starting with 2bfaf119407f4b0f1539a6e9269b77d93580b274d0cc224ede9a1a17eb354cdc not found: ID does not exist" containerID="2bfaf119407f4b0f1539a6e9269b77d93580b274d0cc224ede9a1a17eb354cdc" Dec 03 06:48:54 crc kubenswrapper[4475]: I1203 06:48:54.988825 4475 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2bfaf119407f4b0f1539a6e9269b77d93580b274d0cc224ede9a1a17eb354cdc"} err="failed to get container status \"2bfaf119407f4b0f1539a6e9269b77d93580b274d0cc224ede9a1a17eb354cdc\": rpc error: code = NotFound desc = could not find container \"2bfaf119407f4b0f1539a6e9269b77d93580b274d0cc224ede9a1a17eb354cdc\": container with ID starting with 2bfaf119407f4b0f1539a6e9269b77d93580b274d0cc224ede9a1a17eb354cdc not found: ID does not exist" Dec 03 06:48:54 crc kubenswrapper[4475]: I1203 06:48:54.988842 4475 scope.go:117] "RemoveContainer" containerID="7bd9ce7df11ad541da68e53078663b52825e0d135eba8b3beba4f370abc449e8" Dec 03 06:48:54 crc kubenswrapper[4475]: I1203 06:48:54.989346 4475 generic.go:334] "Generic (PLEG): container finished" podID="30be3012-ac26-4a64-b650-66174f25549a" containerID="16f146b802ca1c7d3fd614b2515a8ab000b9cdd7dc7d787e5218a38085f7a002" exitCode=0 Dec 03 06:48:54 crc kubenswrapper[4475]: I1203 06:48:54.989388 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-qkcc8" 
event={"ID":"30be3012-ac26-4a64-b650-66174f25549a","Type":"ContainerDied","Data":"16f146b802ca1c7d3fd614b2515a8ab000b9cdd7dc7d787e5218a38085f7a002"} Dec 03 06:48:54 crc kubenswrapper[4475]: I1203 06:48:54.989408 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-qkcc8" event={"ID":"30be3012-ac26-4a64-b650-66174f25549a","Type":"ContainerDied","Data":"eaeb08705422ed6be4da4d92c90bd6169de7c10bad17b34b840e5ea35063ef32"} Dec 03 06:48:54 crc kubenswrapper[4475]: I1203 06:48:54.989754 4475 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-qkcc8" Dec 03 06:48:54 crc kubenswrapper[4475]: I1203 06:48:54.991177 4475 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c6ab24e6-93f2-46dd-aace-3de3344bd9f1-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "c6ab24e6-93f2-46dd-aace-3de3344bd9f1" (UID: "c6ab24e6-93f2-46dd-aace-3de3344bd9f1"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 06:48:55 crc kubenswrapper[4475]: I1203 06:48:55.004762 4475 generic.go:334] "Generic (PLEG): container finished" podID="d75bb35d-29b7-4994-9bc6-756f2950d3fd" containerID="3e244226b044d928f00d8f988546d19f2c95eb36fe6bbf9807f072f51a9fde21" exitCode=0 Dec 03 06:48:55 crc kubenswrapper[4475]: I1203 06:48:55.004815 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-tnqbz" event={"ID":"d75bb35d-29b7-4994-9bc6-756f2950d3fd","Type":"ContainerDied","Data":"3e244226b044d928f00d8f988546d19f2c95eb36fe6bbf9807f072f51a9fde21"} Dec 03 06:48:55 crc kubenswrapper[4475]: I1203 06:48:55.013864 4475 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ece5cd8d-8d3a-48ab-b2dc-194339fb1ff2-catalog-content\") pod \"ece5cd8d-8d3a-48ab-b2dc-194339fb1ff2\" (UID: \"ece5cd8d-8d3a-48ab-b2dc-194339fb1ff2\") " Dec 03 06:48:55 crc kubenswrapper[4475]: I1203 06:48:55.014187 4475 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ece5cd8d-8d3a-48ab-b2dc-194339fb1ff2-utilities\") pod \"ece5cd8d-8d3a-48ab-b2dc-194339fb1ff2\" (UID: \"ece5cd8d-8d3a-48ab-b2dc-194339fb1ff2\") " Dec 03 06:48:55 crc kubenswrapper[4475]: I1203 06:48:55.014232 4475 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-q8fx4\" (UniqueName: \"kubernetes.io/projected/ece5cd8d-8d3a-48ab-b2dc-194339fb1ff2-kube-api-access-q8fx4\") pod \"ece5cd8d-8d3a-48ab-b2dc-194339fb1ff2\" (UID: \"ece5cd8d-8d3a-48ab-b2dc-194339fb1ff2\") " Dec 03 06:48:55 crc kubenswrapper[4475]: I1203 06:48:55.014755 4475 reconciler_common.go:293] "Volume detached for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/7f6163e8-ce0d-481b-8483-4b9e04d381e6-marketplace-trusted-ca\") on node \"crc\" DevicePath \"\"" Dec 03 06:48:55 
crc kubenswrapper[4475]: I1203 06:48:55.014769 4475 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c6ab24e6-93f2-46dd-aace-3de3344bd9f1-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 03 06:48:55 crc kubenswrapper[4475]: I1203 06:48:55.014778 4475 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c6ab24e6-93f2-46dd-aace-3de3344bd9f1-utilities\") on node \"crc\" DevicePath \"\"" Dec 03 06:48:55 crc kubenswrapper[4475]: I1203 06:48:55.014785 4475 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/30be3012-ac26-4a64-b650-66174f25549a-utilities\") on node \"crc\" DevicePath \"\"" Dec 03 06:48:55 crc kubenswrapper[4475]: I1203 06:48:55.014794 4475 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-s67lb\" (UniqueName: \"kubernetes.io/projected/30be3012-ac26-4a64-b650-66174f25549a-kube-api-access-s67lb\") on node \"crc\" DevicePath \"\"" Dec 03 06:48:55 crc kubenswrapper[4475]: I1203 06:48:55.014813 4475 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xrvnc\" (UniqueName: \"kubernetes.io/projected/7f6163e8-ce0d-481b-8483-4b9e04d381e6-kube-api-access-xrvnc\") on node \"crc\" DevicePath \"\"" Dec 03 06:48:55 crc kubenswrapper[4475]: I1203 06:48:55.014822 4475 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-t46l5\" (UniqueName: \"kubernetes.io/projected/c6ab24e6-93f2-46dd-aace-3de3344bd9f1-kube-api-access-t46l5\") on node \"crc\" DevicePath \"\"" Dec 03 06:48:55 crc kubenswrapper[4475]: I1203 06:48:55.014829 4475 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/30be3012-ac26-4a64-b650-66174f25549a-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 03 06:48:55 crc kubenswrapper[4475]: I1203 06:48:55.014836 4475 reconciler_common.go:293] "Volume detached 
for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/7f6163e8-ce0d-481b-8483-4b9e04d381e6-marketplace-operator-metrics\") on node \"crc\" DevicePath \"\"" Dec 03 06:48:55 crc kubenswrapper[4475]: I1203 06:48:55.018017 4475 scope.go:117] "RemoveContainer" containerID="bc032f563674d26479c49ddf885f413a7145e13c0cd90e23d98184c5000db102" Dec 03 06:48:55 crc kubenswrapper[4475]: I1203 06:48:55.018547 4475 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-tnqbz" Dec 03 06:48:55 crc kubenswrapper[4475]: I1203 06:48:55.018779 4475 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-kfvwc"] Dec 03 06:48:55 crc kubenswrapper[4475]: I1203 06:48:55.021219 4475 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ece5cd8d-8d3a-48ab-b2dc-194339fb1ff2-kube-api-access-q8fx4" (OuterVolumeSpecName: "kube-api-access-q8fx4") pod "ece5cd8d-8d3a-48ab-b2dc-194339fb1ff2" (UID: "ece5cd8d-8d3a-48ab-b2dc-194339fb1ff2"). InnerVolumeSpecName "kube-api-access-q8fx4". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 06:48:55 crc kubenswrapper[4475]: I1203 06:48:55.022059 4475 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-kfvwc"] Dec 03 06:48:55 crc kubenswrapper[4475]: I1203 06:48:55.023547 4475 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-qkcc8"] Dec 03 06:48:55 crc kubenswrapper[4475]: I1203 06:48:55.023985 4475 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ece5cd8d-8d3a-48ab-b2dc-194339fb1ff2-utilities" (OuterVolumeSpecName: "utilities") pod "ece5cd8d-8d3a-48ab-b2dc-194339fb1ff2" (UID: "ece5cd8d-8d3a-48ab-b2dc-194339fb1ff2"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 06:48:55 crc kubenswrapper[4475]: I1203 06:48:55.028019 4475 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-qkcc8"] Dec 03 06:48:55 crc kubenswrapper[4475]: I1203 06:48:55.043779 4475 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ece5cd8d-8d3a-48ab-b2dc-194339fb1ff2-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "ece5cd8d-8d3a-48ab-b2dc-194339fb1ff2" (UID: "ece5cd8d-8d3a-48ab-b2dc-194339fb1ff2"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 06:48:55 crc kubenswrapper[4475]: I1203 06:48:55.049507 4475 scope.go:117] "RemoveContainer" containerID="83c685a6bd0714461f311975cbf6222bb99444a2045f4d57a89aeffd226bcac0" Dec 03 06:48:55 crc kubenswrapper[4475]: I1203 06:48:55.059294 4475 scope.go:117] "RemoveContainer" containerID="7bd9ce7df11ad541da68e53078663b52825e0d135eba8b3beba4f370abc449e8" Dec 03 06:48:55 crc kubenswrapper[4475]: E1203 06:48:55.059646 4475 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7bd9ce7df11ad541da68e53078663b52825e0d135eba8b3beba4f370abc449e8\": container with ID starting with 7bd9ce7df11ad541da68e53078663b52825e0d135eba8b3beba4f370abc449e8 not found: ID does not exist" containerID="7bd9ce7df11ad541da68e53078663b52825e0d135eba8b3beba4f370abc449e8" Dec 03 06:48:55 crc kubenswrapper[4475]: I1203 06:48:55.059678 4475 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7bd9ce7df11ad541da68e53078663b52825e0d135eba8b3beba4f370abc449e8"} err="failed to get container status \"7bd9ce7df11ad541da68e53078663b52825e0d135eba8b3beba4f370abc449e8\": rpc error: code = NotFound desc = could not find container \"7bd9ce7df11ad541da68e53078663b52825e0d135eba8b3beba4f370abc449e8\": container with ID starting with 
7bd9ce7df11ad541da68e53078663b52825e0d135eba8b3beba4f370abc449e8 not found: ID does not exist" Dec 03 06:48:55 crc kubenswrapper[4475]: I1203 06:48:55.059700 4475 scope.go:117] "RemoveContainer" containerID="bc032f563674d26479c49ddf885f413a7145e13c0cd90e23d98184c5000db102" Dec 03 06:48:55 crc kubenswrapper[4475]: E1203 06:48:55.059914 4475 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"bc032f563674d26479c49ddf885f413a7145e13c0cd90e23d98184c5000db102\": container with ID starting with bc032f563674d26479c49ddf885f413a7145e13c0cd90e23d98184c5000db102 not found: ID does not exist" containerID="bc032f563674d26479c49ddf885f413a7145e13c0cd90e23d98184c5000db102" Dec 03 06:48:55 crc kubenswrapper[4475]: I1203 06:48:55.059936 4475 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bc032f563674d26479c49ddf885f413a7145e13c0cd90e23d98184c5000db102"} err="failed to get container status \"bc032f563674d26479c49ddf885f413a7145e13c0cd90e23d98184c5000db102\": rpc error: code = NotFound desc = could not find container \"bc032f563674d26479c49ddf885f413a7145e13c0cd90e23d98184c5000db102\": container with ID starting with bc032f563674d26479c49ddf885f413a7145e13c0cd90e23d98184c5000db102 not found: ID does not exist" Dec 03 06:48:55 crc kubenswrapper[4475]: I1203 06:48:55.059953 4475 scope.go:117] "RemoveContainer" containerID="83c685a6bd0714461f311975cbf6222bb99444a2045f4d57a89aeffd226bcac0" Dec 03 06:48:55 crc kubenswrapper[4475]: E1203 06:48:55.060260 4475 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"83c685a6bd0714461f311975cbf6222bb99444a2045f4d57a89aeffd226bcac0\": container with ID starting with 83c685a6bd0714461f311975cbf6222bb99444a2045f4d57a89aeffd226bcac0 not found: ID does not exist" containerID="83c685a6bd0714461f311975cbf6222bb99444a2045f4d57a89aeffd226bcac0" Dec 03 06:48:55 crc 
kubenswrapper[4475]: I1203 06:48:55.060337 4475 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"83c685a6bd0714461f311975cbf6222bb99444a2045f4d57a89aeffd226bcac0"} err="failed to get container status \"83c685a6bd0714461f311975cbf6222bb99444a2045f4d57a89aeffd226bcac0\": rpc error: code = NotFound desc = could not find container \"83c685a6bd0714461f311975cbf6222bb99444a2045f4d57a89aeffd226bcac0\": container with ID starting with 83c685a6bd0714461f311975cbf6222bb99444a2045f4d57a89aeffd226bcac0 not found: ID does not exist" Dec 03 06:48:55 crc kubenswrapper[4475]: I1203 06:48:55.060381 4475 scope.go:117] "RemoveContainer" containerID="16f146b802ca1c7d3fd614b2515a8ab000b9cdd7dc7d787e5218a38085f7a002" Dec 03 06:48:55 crc kubenswrapper[4475]: I1203 06:48:55.072588 4475 scope.go:117] "RemoveContainer" containerID="9ad61fff554a0d681e164ec5f089f8c1fc3b00b84c4dc35c0bfc134a4c03a1c1" Dec 03 06:48:55 crc kubenswrapper[4475]: I1203 06:48:55.085373 4475 scope.go:117] "RemoveContainer" containerID="6d2e38a2d61a5d81e356cfccb9f6914d9e1af1fb59d1da8fe547615a267d12af" Dec 03 06:48:55 crc kubenswrapper[4475]: I1203 06:48:55.096410 4475 scope.go:117] "RemoveContainer" containerID="16f146b802ca1c7d3fd614b2515a8ab000b9cdd7dc7d787e5218a38085f7a002" Dec 03 06:48:55 crc kubenswrapper[4475]: E1203 06:48:55.096727 4475 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"16f146b802ca1c7d3fd614b2515a8ab000b9cdd7dc7d787e5218a38085f7a002\": container with ID starting with 16f146b802ca1c7d3fd614b2515a8ab000b9cdd7dc7d787e5218a38085f7a002 not found: ID does not exist" containerID="16f146b802ca1c7d3fd614b2515a8ab000b9cdd7dc7d787e5218a38085f7a002" Dec 03 06:48:55 crc kubenswrapper[4475]: I1203 06:48:55.096751 4475 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"16f146b802ca1c7d3fd614b2515a8ab000b9cdd7dc7d787e5218a38085f7a002"} err="failed to 
get container status \"16f146b802ca1c7d3fd614b2515a8ab000b9cdd7dc7d787e5218a38085f7a002\": rpc error: code = NotFound desc = could not find container \"16f146b802ca1c7d3fd614b2515a8ab000b9cdd7dc7d787e5218a38085f7a002\": container with ID starting with 16f146b802ca1c7d3fd614b2515a8ab000b9cdd7dc7d787e5218a38085f7a002 not found: ID does not exist" Dec 03 06:48:55 crc kubenswrapper[4475]: I1203 06:48:55.096769 4475 scope.go:117] "RemoveContainer" containerID="9ad61fff554a0d681e164ec5f089f8c1fc3b00b84c4dc35c0bfc134a4c03a1c1" Dec 03 06:48:55 crc kubenswrapper[4475]: E1203 06:48:55.096964 4475 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9ad61fff554a0d681e164ec5f089f8c1fc3b00b84c4dc35c0bfc134a4c03a1c1\": container with ID starting with 9ad61fff554a0d681e164ec5f089f8c1fc3b00b84c4dc35c0bfc134a4c03a1c1 not found: ID does not exist" containerID="9ad61fff554a0d681e164ec5f089f8c1fc3b00b84c4dc35c0bfc134a4c03a1c1" Dec 03 06:48:55 crc kubenswrapper[4475]: I1203 06:48:55.096984 4475 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9ad61fff554a0d681e164ec5f089f8c1fc3b00b84c4dc35c0bfc134a4c03a1c1"} err="failed to get container status \"9ad61fff554a0d681e164ec5f089f8c1fc3b00b84c4dc35c0bfc134a4c03a1c1\": rpc error: code = NotFound desc = could not find container \"9ad61fff554a0d681e164ec5f089f8c1fc3b00b84c4dc35c0bfc134a4c03a1c1\": container with ID starting with 9ad61fff554a0d681e164ec5f089f8c1fc3b00b84c4dc35c0bfc134a4c03a1c1 not found: ID does not exist" Dec 03 06:48:55 crc kubenswrapper[4475]: I1203 06:48:55.096997 4475 scope.go:117] "RemoveContainer" containerID="6d2e38a2d61a5d81e356cfccb9f6914d9e1af1fb59d1da8fe547615a267d12af" Dec 03 06:48:55 crc kubenswrapper[4475]: E1203 06:48:55.097152 4475 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"6d2e38a2d61a5d81e356cfccb9f6914d9e1af1fb59d1da8fe547615a267d12af\": container with ID starting with 6d2e38a2d61a5d81e356cfccb9f6914d9e1af1fb59d1da8fe547615a267d12af not found: ID does not exist" containerID="6d2e38a2d61a5d81e356cfccb9f6914d9e1af1fb59d1da8fe547615a267d12af" Dec 03 06:48:55 crc kubenswrapper[4475]: I1203 06:48:55.097171 4475 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6d2e38a2d61a5d81e356cfccb9f6914d9e1af1fb59d1da8fe547615a267d12af"} err="failed to get container status \"6d2e38a2d61a5d81e356cfccb9f6914d9e1af1fb59d1da8fe547615a267d12af\": rpc error: code = NotFound desc = could not find container \"6d2e38a2d61a5d81e356cfccb9f6914d9e1af1fb59d1da8fe547615a267d12af\": container with ID starting with 6d2e38a2d61a5d81e356cfccb9f6914d9e1af1fb59d1da8fe547615a267d12af not found: ID does not exist" Dec 03 06:48:55 crc kubenswrapper[4475]: I1203 06:48:55.115096 4475 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d75bb35d-29b7-4994-9bc6-756f2950d3fd-utilities\") pod \"d75bb35d-29b7-4994-9bc6-756f2950d3fd\" (UID: \"d75bb35d-29b7-4994-9bc6-756f2950d3fd\") " Dec 03 06:48:55 crc kubenswrapper[4475]: I1203 06:48:55.115197 4475 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d75bb35d-29b7-4994-9bc6-756f2950d3fd-catalog-content\") pod \"d75bb35d-29b7-4994-9bc6-756f2950d3fd\" (UID: \"d75bb35d-29b7-4994-9bc6-756f2950d3fd\") " Dec 03 06:48:55 crc kubenswrapper[4475]: I1203 06:48:55.115219 4475 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mhvzm\" (UniqueName: \"kubernetes.io/projected/d75bb35d-29b7-4994-9bc6-756f2950d3fd-kube-api-access-mhvzm\") pod \"d75bb35d-29b7-4994-9bc6-756f2950d3fd\" (UID: \"d75bb35d-29b7-4994-9bc6-756f2950d3fd\") " Dec 03 06:48:55 crc kubenswrapper[4475]: I1203 06:48:55.115656 
4475 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d75bb35d-29b7-4994-9bc6-756f2950d3fd-utilities" (OuterVolumeSpecName: "utilities") pod "d75bb35d-29b7-4994-9bc6-756f2950d3fd" (UID: "d75bb35d-29b7-4994-9bc6-756f2950d3fd"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 06:48:55 crc kubenswrapper[4475]: I1203 06:48:55.124404 4475 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ece5cd8d-8d3a-48ab-b2dc-194339fb1ff2-utilities\") on node \"crc\" DevicePath \"\"" Dec 03 06:48:55 crc kubenswrapper[4475]: I1203 06:48:55.124434 4475 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-q8fx4\" (UniqueName: \"kubernetes.io/projected/ece5cd8d-8d3a-48ab-b2dc-194339fb1ff2-kube-api-access-q8fx4\") on node \"crc\" DevicePath \"\"" Dec 03 06:48:55 crc kubenswrapper[4475]: I1203 06:48:55.124445 4475 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ece5cd8d-8d3a-48ab-b2dc-194339fb1ff2-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 03 06:48:55 crc kubenswrapper[4475]: I1203 06:48:55.124466 4475 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d75bb35d-29b7-4994-9bc6-756f2950d3fd-utilities\") on node \"crc\" DevicePath \"\"" Dec 03 06:48:55 crc kubenswrapper[4475]: I1203 06:48:55.127588 4475 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d75bb35d-29b7-4994-9bc6-756f2950d3fd-kube-api-access-mhvzm" (OuterVolumeSpecName: "kube-api-access-mhvzm") pod "d75bb35d-29b7-4994-9bc6-756f2950d3fd" (UID: "d75bb35d-29b7-4994-9bc6-756f2950d3fd"). InnerVolumeSpecName "kube-api-access-mhvzm". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 06:48:55 crc kubenswrapper[4475]: I1203 06:48:55.197711 4475 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d75bb35d-29b7-4994-9bc6-756f2950d3fd-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "d75bb35d-29b7-4994-9bc6-756f2950d3fd" (UID: "d75bb35d-29b7-4994-9bc6-756f2950d3fd"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 06:48:55 crc kubenswrapper[4475]: I1203 06:48:55.225529 4475 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d75bb35d-29b7-4994-9bc6-756f2950d3fd-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 03 06:48:55 crc kubenswrapper[4475]: I1203 06:48:55.225557 4475 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mhvzm\" (UniqueName: \"kubernetes.io/projected/d75bb35d-29b7-4994-9bc6-756f2950d3fd-kube-api-access-mhvzm\") on node \"crc\" DevicePath \"\"" Dec 03 06:48:55 crc kubenswrapper[4475]: I1203 06:48:55.273522 4475 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-p8xfd"] Dec 03 06:48:55 crc kubenswrapper[4475]: I1203 06:48:55.334473 4475 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-xsp8w"] Dec 03 06:48:55 crc kubenswrapper[4475]: I1203 06:48:55.336956 4475 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-xsp8w"] Dec 03 06:48:55 crc kubenswrapper[4475]: I1203 06:48:55.495716 4475 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="30be3012-ac26-4a64-b650-66174f25549a" path="/var/lib/kubelet/pods/30be3012-ac26-4a64-b650-66174f25549a/volumes" Dec 03 06:48:55 crc kubenswrapper[4475]: I1203 06:48:55.496967 4475 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" 
podUID="7f6163e8-ce0d-481b-8483-4b9e04d381e6" path="/var/lib/kubelet/pods/7f6163e8-ce0d-481b-8483-4b9e04d381e6/volumes" Dec 03 06:48:55 crc kubenswrapper[4475]: I1203 06:48:55.497441 4475 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c6ab24e6-93f2-46dd-aace-3de3344bd9f1" path="/var/lib/kubelet/pods/c6ab24e6-93f2-46dd-aace-3de3344bd9f1/volumes" Dec 03 06:48:56 crc kubenswrapper[4475]: I1203 06:48:56.010076 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-q8f5d" event={"ID":"ece5cd8d-8d3a-48ab-b2dc-194339fb1ff2","Type":"ContainerDied","Data":"29ac8bcf2e100c2987b9604e671bfd38448d01af63b6105a0a6129c9df666a2d"} Dec 03 06:48:56 crc kubenswrapper[4475]: I1203 06:48:56.010112 4475 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-q8f5d" Dec 03 06:48:56 crc kubenswrapper[4475]: I1203 06:48:56.010338 4475 scope.go:117] "RemoveContainer" containerID="68b043e0c00f78036582cc13706337b074ca4719eb6d07684ef40e9184cba405" Dec 03 06:48:56 crc kubenswrapper[4475]: I1203 06:48:56.020776 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-tnqbz" event={"ID":"d75bb35d-29b7-4994-9bc6-756f2950d3fd","Type":"ContainerDied","Data":"382354460bf86b5d7d60696354b8158e09287df82f5b8cd328c22b074ac5530d"} Dec 03 06:48:56 crc kubenswrapper[4475]: I1203 06:48:56.020864 4475 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-tnqbz" Dec 03 06:48:56 crc kubenswrapper[4475]: I1203 06:48:56.021573 4475 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-q8f5d"] Dec 03 06:48:56 crc kubenswrapper[4475]: I1203 06:48:56.024115 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-p8xfd" event={"ID":"72e92f7d-5ec5-43d3-b81a-df695e7adbde","Type":"ContainerStarted","Data":"216de12c7d2ed60c141daa47e3d53dd18eeab2fa902262293df735fb129b0ac8"} Dec 03 06:48:56 crc kubenswrapper[4475]: I1203 06:48:56.024149 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-p8xfd" event={"ID":"72e92f7d-5ec5-43d3-b81a-df695e7adbde","Type":"ContainerStarted","Data":"0d32d6baa27e7ea50575b4c1a1bfcbdf6d0812b4eec6386f270fd5858d11fad1"} Dec 03 06:48:56 crc kubenswrapper[4475]: I1203 06:48:56.024372 4475 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/marketplace-operator-79b997595-p8xfd" Dec 03 06:48:56 crc kubenswrapper[4475]: I1203 06:48:56.025100 4475 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-q8f5d"] Dec 03 06:48:56 crc kubenswrapper[4475]: I1203 06:48:56.031349 4475 scope.go:117] "RemoveContainer" containerID="ff0efc5306a44c6d5fe6fd24034d75680d489ca6a41761a6f4233651b7558ac5" Dec 03 06:48:56 crc kubenswrapper[4475]: I1203 06:48:56.033610 4475 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/marketplace-operator-79b997595-p8xfd" Dec 03 06:48:56 crc kubenswrapper[4475]: I1203 06:48:56.037413 4475 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/marketplace-operator-79b997595-p8xfd" podStartSLOduration=2.037405288 podStartE2EDuration="2.037405288s" podCreationTimestamp="2025-12-03 06:48:54 +0000 UTC" 
firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 06:48:56.035978774 +0000 UTC m=+220.840877107" watchObservedRunningTime="2025-12-03 06:48:56.037405288 +0000 UTC m=+220.842303622" Dec 03 06:48:56 crc kubenswrapper[4475]: I1203 06:48:56.053897 4475 scope.go:117] "RemoveContainer" containerID="1359b6dad0c2a395d812df37d975820ee838952a1570715f61f712b7761e7a9e" Dec 03 06:48:56 crc kubenswrapper[4475]: I1203 06:48:56.064588 4475 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-tnqbz"] Dec 03 06:48:56 crc kubenswrapper[4475]: I1203 06:48:56.070497 4475 scope.go:117] "RemoveContainer" containerID="3e244226b044d928f00d8f988546d19f2c95eb36fe6bbf9807f072f51a9fde21" Dec 03 06:48:56 crc kubenswrapper[4475]: I1203 06:48:56.072645 4475 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-tnqbz"] Dec 03 06:48:56 crc kubenswrapper[4475]: I1203 06:48:56.081342 4475 scope.go:117] "RemoveContainer" containerID="df9e540befd0b1960aa998db8253a8f0e2ded361e5e3177531a13be09d8b6a49" Dec 03 06:48:56 crc kubenswrapper[4475]: I1203 06:48:56.093713 4475 scope.go:117] "RemoveContainer" containerID="829b0bb16a1b8e69d92eb455d12d02cc38946b63e5109556b2dbc8d04737bdff" Dec 03 06:48:56 crc kubenswrapper[4475]: I1203 06:48:56.703341 4475 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-8kjrv"] Dec 03 06:48:56 crc kubenswrapper[4475]: E1203 06:48:56.703512 4475 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d75bb35d-29b7-4994-9bc6-756f2950d3fd" containerName="extract-utilities" Dec 03 06:48:56 crc kubenswrapper[4475]: I1203 06:48:56.703523 4475 state_mem.go:107] "Deleted CPUSet assignment" podUID="d75bb35d-29b7-4994-9bc6-756f2950d3fd" containerName="extract-utilities" Dec 03 06:48:56 crc kubenswrapper[4475]: E1203 06:48:56.703532 4475 cpu_manager.go:410] "RemoveStaleState: 
removing container" podUID="30be3012-ac26-4a64-b650-66174f25549a" containerName="extract-utilities" Dec 03 06:48:56 crc kubenswrapper[4475]: I1203 06:48:56.703537 4475 state_mem.go:107] "Deleted CPUSet assignment" podUID="30be3012-ac26-4a64-b650-66174f25549a" containerName="extract-utilities" Dec 03 06:48:56 crc kubenswrapper[4475]: E1203 06:48:56.703545 4475 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="30be3012-ac26-4a64-b650-66174f25549a" containerName="registry-server" Dec 03 06:48:56 crc kubenswrapper[4475]: I1203 06:48:56.703552 4475 state_mem.go:107] "Deleted CPUSet assignment" podUID="30be3012-ac26-4a64-b650-66174f25549a" containerName="registry-server" Dec 03 06:48:56 crc kubenswrapper[4475]: E1203 06:48:56.703559 4475 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c6ab24e6-93f2-46dd-aace-3de3344bd9f1" containerName="extract-utilities" Dec 03 06:48:56 crc kubenswrapper[4475]: I1203 06:48:56.703565 4475 state_mem.go:107] "Deleted CPUSet assignment" podUID="c6ab24e6-93f2-46dd-aace-3de3344bd9f1" containerName="extract-utilities" Dec 03 06:48:56 crc kubenswrapper[4475]: E1203 06:48:56.703573 4475 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d75bb35d-29b7-4994-9bc6-756f2950d3fd" containerName="registry-server" Dec 03 06:48:56 crc kubenswrapper[4475]: I1203 06:48:56.703578 4475 state_mem.go:107] "Deleted CPUSet assignment" podUID="d75bb35d-29b7-4994-9bc6-756f2950d3fd" containerName="registry-server" Dec 03 06:48:56 crc kubenswrapper[4475]: E1203 06:48:56.703586 4475 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c6ab24e6-93f2-46dd-aace-3de3344bd9f1" containerName="extract-content" Dec 03 06:48:56 crc kubenswrapper[4475]: I1203 06:48:56.703591 4475 state_mem.go:107] "Deleted CPUSet assignment" podUID="c6ab24e6-93f2-46dd-aace-3de3344bd9f1" containerName="extract-content" Dec 03 06:48:56 crc kubenswrapper[4475]: E1203 06:48:56.703599 4475 cpu_manager.go:410] "RemoveStaleState: 
removing container" podUID="ece5cd8d-8d3a-48ab-b2dc-194339fb1ff2" containerName="extract-content" Dec 03 06:48:56 crc kubenswrapper[4475]: I1203 06:48:56.703604 4475 state_mem.go:107] "Deleted CPUSet assignment" podUID="ece5cd8d-8d3a-48ab-b2dc-194339fb1ff2" containerName="extract-content" Dec 03 06:48:56 crc kubenswrapper[4475]: E1203 06:48:56.703613 4475 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d75bb35d-29b7-4994-9bc6-756f2950d3fd" containerName="extract-content" Dec 03 06:48:56 crc kubenswrapper[4475]: I1203 06:48:56.703618 4475 state_mem.go:107] "Deleted CPUSet assignment" podUID="d75bb35d-29b7-4994-9bc6-756f2950d3fd" containerName="extract-content" Dec 03 06:48:56 crc kubenswrapper[4475]: E1203 06:48:56.703623 4475 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ece5cd8d-8d3a-48ab-b2dc-194339fb1ff2" containerName="registry-server" Dec 03 06:48:56 crc kubenswrapper[4475]: I1203 06:48:56.703628 4475 state_mem.go:107] "Deleted CPUSet assignment" podUID="ece5cd8d-8d3a-48ab-b2dc-194339fb1ff2" containerName="registry-server" Dec 03 06:48:56 crc kubenswrapper[4475]: E1203 06:48:56.703635 4475 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ece5cd8d-8d3a-48ab-b2dc-194339fb1ff2" containerName="extract-utilities" Dec 03 06:48:56 crc kubenswrapper[4475]: I1203 06:48:56.703640 4475 state_mem.go:107] "Deleted CPUSet assignment" podUID="ece5cd8d-8d3a-48ab-b2dc-194339fb1ff2" containerName="extract-utilities" Dec 03 06:48:56 crc kubenswrapper[4475]: E1203 06:48:56.703648 4475 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7f6163e8-ce0d-481b-8483-4b9e04d381e6" containerName="marketplace-operator" Dec 03 06:48:56 crc kubenswrapper[4475]: I1203 06:48:56.703654 4475 state_mem.go:107] "Deleted CPUSet assignment" podUID="7f6163e8-ce0d-481b-8483-4b9e04d381e6" containerName="marketplace-operator" Dec 03 06:48:56 crc kubenswrapper[4475]: E1203 06:48:56.703660 4475 cpu_manager.go:410] "RemoveStaleState: 
removing container" podUID="30be3012-ac26-4a64-b650-66174f25549a" containerName="extract-content" Dec 03 06:48:56 crc kubenswrapper[4475]: I1203 06:48:56.703665 4475 state_mem.go:107] "Deleted CPUSet assignment" podUID="30be3012-ac26-4a64-b650-66174f25549a" containerName="extract-content" Dec 03 06:48:56 crc kubenswrapper[4475]: E1203 06:48:56.703671 4475 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c6ab24e6-93f2-46dd-aace-3de3344bd9f1" containerName="registry-server" Dec 03 06:48:56 crc kubenswrapper[4475]: I1203 06:48:56.703676 4475 state_mem.go:107] "Deleted CPUSet assignment" podUID="c6ab24e6-93f2-46dd-aace-3de3344bd9f1" containerName="registry-server" Dec 03 06:48:56 crc kubenswrapper[4475]: I1203 06:48:56.703742 4475 memory_manager.go:354] "RemoveStaleState removing state" podUID="d75bb35d-29b7-4994-9bc6-756f2950d3fd" containerName="registry-server" Dec 03 06:48:56 crc kubenswrapper[4475]: I1203 06:48:56.703762 4475 memory_manager.go:354] "RemoveStaleState removing state" podUID="ece5cd8d-8d3a-48ab-b2dc-194339fb1ff2" containerName="registry-server" Dec 03 06:48:56 crc kubenswrapper[4475]: I1203 06:48:56.703769 4475 memory_manager.go:354] "RemoveStaleState removing state" podUID="30be3012-ac26-4a64-b650-66174f25549a" containerName="registry-server" Dec 03 06:48:56 crc kubenswrapper[4475]: I1203 06:48:56.703778 4475 memory_manager.go:354] "RemoveStaleState removing state" podUID="c6ab24e6-93f2-46dd-aace-3de3344bd9f1" containerName="registry-server" Dec 03 06:48:56 crc kubenswrapper[4475]: I1203 06:48:56.703786 4475 memory_manager.go:354] "RemoveStaleState removing state" podUID="7f6163e8-ce0d-481b-8483-4b9e04d381e6" containerName="marketplace-operator" Dec 03 06:48:56 crc kubenswrapper[4475]: I1203 06:48:56.704313 4475 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-8kjrv" Dec 03 06:48:56 crc kubenswrapper[4475]: I1203 06:48:56.709018 4475 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-8kjrv"] Dec 03 06:48:56 crc kubenswrapper[4475]: I1203 06:48:56.709153 4475 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-marketplace-dockercfg-x2ctb" Dec 03 06:48:56 crc kubenswrapper[4475]: I1203 06:48:56.742054 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bfbe13ae-894d-43d5-b757-f4e8f61ab84f-catalog-content\") pod \"redhat-marketplace-8kjrv\" (UID: \"bfbe13ae-894d-43d5-b757-f4e8f61ab84f\") " pod="openshift-marketplace/redhat-marketplace-8kjrv" Dec 03 06:48:56 crc kubenswrapper[4475]: I1203 06:48:56.742098 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5fps4\" (UniqueName: \"kubernetes.io/projected/bfbe13ae-894d-43d5-b757-f4e8f61ab84f-kube-api-access-5fps4\") pod \"redhat-marketplace-8kjrv\" (UID: \"bfbe13ae-894d-43d5-b757-f4e8f61ab84f\") " pod="openshift-marketplace/redhat-marketplace-8kjrv" Dec 03 06:48:56 crc kubenswrapper[4475]: I1203 06:48:56.742120 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bfbe13ae-894d-43d5-b757-f4e8f61ab84f-utilities\") pod \"redhat-marketplace-8kjrv\" (UID: \"bfbe13ae-894d-43d5-b757-f4e8f61ab84f\") " pod="openshift-marketplace/redhat-marketplace-8kjrv" Dec 03 06:48:56 crc kubenswrapper[4475]: I1203 06:48:56.843259 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5fps4\" (UniqueName: \"kubernetes.io/projected/bfbe13ae-894d-43d5-b757-f4e8f61ab84f-kube-api-access-5fps4\") pod \"redhat-marketplace-8kjrv\" (UID: 
\"bfbe13ae-894d-43d5-b757-f4e8f61ab84f\") " pod="openshift-marketplace/redhat-marketplace-8kjrv" Dec 03 06:48:56 crc kubenswrapper[4475]: I1203 06:48:56.843533 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bfbe13ae-894d-43d5-b757-f4e8f61ab84f-utilities\") pod \"redhat-marketplace-8kjrv\" (UID: \"bfbe13ae-894d-43d5-b757-f4e8f61ab84f\") " pod="openshift-marketplace/redhat-marketplace-8kjrv" Dec 03 06:48:56 crc kubenswrapper[4475]: I1203 06:48:56.843711 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bfbe13ae-894d-43d5-b757-f4e8f61ab84f-catalog-content\") pod \"redhat-marketplace-8kjrv\" (UID: \"bfbe13ae-894d-43d5-b757-f4e8f61ab84f\") " pod="openshift-marketplace/redhat-marketplace-8kjrv" Dec 03 06:48:56 crc kubenswrapper[4475]: I1203 06:48:56.843850 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bfbe13ae-894d-43d5-b757-f4e8f61ab84f-utilities\") pod \"redhat-marketplace-8kjrv\" (UID: \"bfbe13ae-894d-43d5-b757-f4e8f61ab84f\") " pod="openshift-marketplace/redhat-marketplace-8kjrv" Dec 03 06:48:56 crc kubenswrapper[4475]: I1203 06:48:56.844051 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bfbe13ae-894d-43d5-b757-f4e8f61ab84f-catalog-content\") pod \"redhat-marketplace-8kjrv\" (UID: \"bfbe13ae-894d-43d5-b757-f4e8f61ab84f\") " pod="openshift-marketplace/redhat-marketplace-8kjrv" Dec 03 06:48:56 crc kubenswrapper[4475]: I1203 06:48:56.862022 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5fps4\" (UniqueName: \"kubernetes.io/projected/bfbe13ae-894d-43d5-b757-f4e8f61ab84f-kube-api-access-5fps4\") pod \"redhat-marketplace-8kjrv\" (UID: \"bfbe13ae-894d-43d5-b757-f4e8f61ab84f\") " 
pod="openshift-marketplace/redhat-marketplace-8kjrv" Dec 03 06:48:56 crc kubenswrapper[4475]: I1203 06:48:56.903571 4475 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-s8xvs"] Dec 03 06:48:56 crc kubenswrapper[4475]: I1203 06:48:56.904494 4475 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-s8xvs" Dec 03 06:48:56 crc kubenswrapper[4475]: I1203 06:48:56.907968 4475 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-operators-dockercfg-ct8rh" Dec 03 06:48:56 crc kubenswrapper[4475]: I1203 06:48:56.909308 4475 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-s8xvs"] Dec 03 06:48:56 crc kubenswrapper[4475]: I1203 06:48:56.944564 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/71b72b6a-e6f1-4e6d-bf25-0e8427834dd8-utilities\") pod \"redhat-operators-s8xvs\" (UID: \"71b72b6a-e6f1-4e6d-bf25-0e8427834dd8\") " pod="openshift-marketplace/redhat-operators-s8xvs" Dec 03 06:48:56 crc kubenswrapper[4475]: I1203 06:48:56.944811 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/71b72b6a-e6f1-4e6d-bf25-0e8427834dd8-catalog-content\") pod \"redhat-operators-s8xvs\" (UID: \"71b72b6a-e6f1-4e6d-bf25-0e8427834dd8\") " pod="openshift-marketplace/redhat-operators-s8xvs" Dec 03 06:48:56 crc kubenswrapper[4475]: I1203 06:48:56.944866 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gf488\" (UniqueName: \"kubernetes.io/projected/71b72b6a-e6f1-4e6d-bf25-0e8427834dd8-kube-api-access-gf488\") pod \"redhat-operators-s8xvs\" (UID: \"71b72b6a-e6f1-4e6d-bf25-0e8427834dd8\") " pod="openshift-marketplace/redhat-operators-s8xvs" 
Dec 03 06:48:57 crc kubenswrapper[4475]: I1203 06:48:57.016472 4475 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-8kjrv" Dec 03 06:48:57 crc kubenswrapper[4475]: I1203 06:48:57.045517 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/71b72b6a-e6f1-4e6d-bf25-0e8427834dd8-utilities\") pod \"redhat-operators-s8xvs\" (UID: \"71b72b6a-e6f1-4e6d-bf25-0e8427834dd8\") " pod="openshift-marketplace/redhat-operators-s8xvs" Dec 03 06:48:57 crc kubenswrapper[4475]: I1203 06:48:57.045597 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/71b72b6a-e6f1-4e6d-bf25-0e8427834dd8-catalog-content\") pod \"redhat-operators-s8xvs\" (UID: \"71b72b6a-e6f1-4e6d-bf25-0e8427834dd8\") " pod="openshift-marketplace/redhat-operators-s8xvs" Dec 03 06:48:57 crc kubenswrapper[4475]: I1203 06:48:57.045654 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gf488\" (UniqueName: \"kubernetes.io/projected/71b72b6a-e6f1-4e6d-bf25-0e8427834dd8-kube-api-access-gf488\") pod \"redhat-operators-s8xvs\" (UID: \"71b72b6a-e6f1-4e6d-bf25-0e8427834dd8\") " pod="openshift-marketplace/redhat-operators-s8xvs" Dec 03 06:48:57 crc kubenswrapper[4475]: I1203 06:48:57.046143 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/71b72b6a-e6f1-4e6d-bf25-0e8427834dd8-utilities\") pod \"redhat-operators-s8xvs\" (UID: \"71b72b6a-e6f1-4e6d-bf25-0e8427834dd8\") " pod="openshift-marketplace/redhat-operators-s8xvs" Dec 03 06:48:57 crc kubenswrapper[4475]: I1203 06:48:57.046433 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/71b72b6a-e6f1-4e6d-bf25-0e8427834dd8-catalog-content\") pod 
\"redhat-operators-s8xvs\" (UID: \"71b72b6a-e6f1-4e6d-bf25-0e8427834dd8\") " pod="openshift-marketplace/redhat-operators-s8xvs" Dec 03 06:48:57 crc kubenswrapper[4475]: I1203 06:48:57.060439 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gf488\" (UniqueName: \"kubernetes.io/projected/71b72b6a-e6f1-4e6d-bf25-0e8427834dd8-kube-api-access-gf488\") pod \"redhat-operators-s8xvs\" (UID: \"71b72b6a-e6f1-4e6d-bf25-0e8427834dd8\") " pod="openshift-marketplace/redhat-operators-s8xvs" Dec 03 06:48:57 crc kubenswrapper[4475]: I1203 06:48:57.216242 4475 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-s8xvs" Dec 03 06:48:57 crc kubenswrapper[4475]: I1203 06:48:57.360821 4475 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-8kjrv"] Dec 03 06:48:57 crc kubenswrapper[4475]: W1203 06:48:57.369166 4475 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podbfbe13ae_894d_43d5_b757_f4e8f61ab84f.slice/crio-43f6da78e3034ba6430ee7b606cc0f9b99caebca3f9edc539be38b07c84dbf85 WatchSource:0}: Error finding container 43f6da78e3034ba6430ee7b606cc0f9b99caebca3f9edc539be38b07c84dbf85: Status 404 returned error can't find the container with id 43f6da78e3034ba6430ee7b606cc0f9b99caebca3f9edc539be38b07c84dbf85 Dec 03 06:48:57 crc kubenswrapper[4475]: I1203 06:48:57.495279 4475 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d75bb35d-29b7-4994-9bc6-756f2950d3fd" path="/var/lib/kubelet/pods/d75bb35d-29b7-4994-9bc6-756f2950d3fd/volumes" Dec 03 06:48:57 crc kubenswrapper[4475]: I1203 06:48:57.495927 4475 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ece5cd8d-8d3a-48ab-b2dc-194339fb1ff2" path="/var/lib/kubelet/pods/ece5cd8d-8d3a-48ab-b2dc-194339fb1ff2/volumes" Dec 03 06:48:57 crc kubenswrapper[4475]: I1203 06:48:57.536383 4475 
kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-s8xvs"] Dec 03 06:48:57 crc kubenswrapper[4475]: W1203 06:48:57.541245 4475 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod71b72b6a_e6f1_4e6d_bf25_0e8427834dd8.slice/crio-3d6e36903cb67a163f46beb7d774439baa51065e8c46ed800608bd5fdb3f303c WatchSource:0}: Error finding container 3d6e36903cb67a163f46beb7d774439baa51065e8c46ed800608bd5fdb3f303c: Status 404 returned error can't find the container with id 3d6e36903cb67a163f46beb7d774439baa51065e8c46ed800608bd5fdb3f303c Dec 03 06:48:58 crc kubenswrapper[4475]: I1203 06:48:58.034367 4475 generic.go:334] "Generic (PLEG): container finished" podID="71b72b6a-e6f1-4e6d-bf25-0e8427834dd8" containerID="27421bb2021d5b6186127d6a076ced9ee51f685470a604f093b3bbfc5dee027e" exitCode=0 Dec 03 06:48:58 crc kubenswrapper[4475]: I1203 06:48:58.034417 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-s8xvs" event={"ID":"71b72b6a-e6f1-4e6d-bf25-0e8427834dd8","Type":"ContainerDied","Data":"27421bb2021d5b6186127d6a076ced9ee51f685470a604f093b3bbfc5dee027e"} Dec 03 06:48:58 crc kubenswrapper[4475]: I1203 06:48:58.034440 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-s8xvs" event={"ID":"71b72b6a-e6f1-4e6d-bf25-0e8427834dd8","Type":"ContainerStarted","Data":"3d6e36903cb67a163f46beb7d774439baa51065e8c46ed800608bd5fdb3f303c"} Dec 03 06:48:58 crc kubenswrapper[4475]: I1203 06:48:58.038198 4475 generic.go:334] "Generic (PLEG): container finished" podID="bfbe13ae-894d-43d5-b757-f4e8f61ab84f" containerID="690929d73dc5b214f3ddf9dd09a9fdf12080c54b0c146cdbe4969f035803278f" exitCode=0 Dec 03 06:48:58 crc kubenswrapper[4475]: I1203 06:48:58.038344 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-8kjrv" 
event={"ID":"bfbe13ae-894d-43d5-b757-f4e8f61ab84f","Type":"ContainerDied","Data":"690929d73dc5b214f3ddf9dd09a9fdf12080c54b0c146cdbe4969f035803278f"} Dec 03 06:48:58 crc kubenswrapper[4475]: I1203 06:48:58.038375 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-8kjrv" event={"ID":"bfbe13ae-894d-43d5-b757-f4e8f61ab84f","Type":"ContainerStarted","Data":"43f6da78e3034ba6430ee7b606cc0f9b99caebca3f9edc539be38b07c84dbf85"} Dec 03 06:48:58 crc kubenswrapper[4475]: I1203 06:48:58.933538 4475 patch_prober.go:28] interesting pod/machine-config-daemon-tjbzg container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 03 06:48:58 crc kubenswrapper[4475]: I1203 06:48:58.933754 4475 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-tjbzg" podUID="91aee7be-4a52-4598-803f-2deebe0674de" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 03 06:48:58 crc kubenswrapper[4475]: I1203 06:48:58.933806 4475 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-tjbzg" Dec 03 06:48:58 crc kubenswrapper[4475]: I1203 06:48:58.934228 4475 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"159d103ae2d5d19ea94c57a59b534773f0e32f4cb379a412b63ca743e221096e"} pod="openshift-machine-config-operator/machine-config-daemon-tjbzg" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Dec 03 06:48:58 crc kubenswrapper[4475]: I1203 06:48:58.934282 4475 kuberuntime_container.go:808] "Killing container with a grace period" 
pod="openshift-machine-config-operator/machine-config-daemon-tjbzg" podUID="91aee7be-4a52-4598-803f-2deebe0674de" containerName="machine-config-daemon" containerID="cri-o://159d103ae2d5d19ea94c57a59b534773f0e32f4cb379a412b63ca743e221096e" gracePeriod=600 Dec 03 06:48:59 crc kubenswrapper[4475]: I1203 06:48:59.043977 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-s8xvs" event={"ID":"71b72b6a-e6f1-4e6d-bf25-0e8427834dd8","Type":"ContainerStarted","Data":"c18b362227217ac737ca05ebbf96ea167673f72e92ff02680e73d1ddcdd0649b"} Dec 03 06:48:59 crc kubenswrapper[4475]: I1203 06:48:59.045255 4475 generic.go:334] "Generic (PLEG): container finished" podID="bfbe13ae-894d-43d5-b757-f4e8f61ab84f" containerID="58f0caf84d04322ec43a77d3af7be30f2a48be0c312be7925c50485c2eda0a0a" exitCode=0 Dec 03 06:48:59 crc kubenswrapper[4475]: I1203 06:48:59.045304 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-8kjrv" event={"ID":"bfbe13ae-894d-43d5-b757-f4e8f61ab84f","Type":"ContainerDied","Data":"58f0caf84d04322ec43a77d3af7be30f2a48be0c312be7925c50485c2eda0a0a"} Dec 03 06:48:59 crc kubenswrapper[4475]: I1203 06:48:59.100355 4475 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-gdfwt"] Dec 03 06:48:59 crc kubenswrapper[4475]: I1203 06:48:59.101188 4475 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-gdfwt" Dec 03 06:48:59 crc kubenswrapper[4475]: I1203 06:48:59.103616 4475 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"certified-operators-dockercfg-4rs5g" Dec 03 06:48:59 crc kubenswrapper[4475]: I1203 06:48:59.144242 4475 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-gdfwt"] Dec 03 06:48:59 crc kubenswrapper[4475]: I1203 06:48:59.171046 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/883b2dc3-6da2-4b66-b7e7-d8f7da0f6057-catalog-content\") pod \"certified-operators-gdfwt\" (UID: \"883b2dc3-6da2-4b66-b7e7-d8f7da0f6057\") " pod="openshift-marketplace/certified-operators-gdfwt" Dec 03 06:48:59 crc kubenswrapper[4475]: I1203 06:48:59.171089 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/883b2dc3-6da2-4b66-b7e7-d8f7da0f6057-utilities\") pod \"certified-operators-gdfwt\" (UID: \"883b2dc3-6da2-4b66-b7e7-d8f7da0f6057\") " pod="openshift-marketplace/certified-operators-gdfwt" Dec 03 06:48:59 crc kubenswrapper[4475]: I1203 06:48:59.171132 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lb6wh\" (UniqueName: \"kubernetes.io/projected/883b2dc3-6da2-4b66-b7e7-d8f7da0f6057-kube-api-access-lb6wh\") pod \"certified-operators-gdfwt\" (UID: \"883b2dc3-6da2-4b66-b7e7-d8f7da0f6057\") " pod="openshift-marketplace/certified-operators-gdfwt" Dec 03 06:48:59 crc kubenswrapper[4475]: I1203 06:48:59.272279 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lb6wh\" (UniqueName: \"kubernetes.io/projected/883b2dc3-6da2-4b66-b7e7-d8f7da0f6057-kube-api-access-lb6wh\") pod \"certified-operators-gdfwt\" 
(UID: \"883b2dc3-6da2-4b66-b7e7-d8f7da0f6057\") " pod="openshift-marketplace/certified-operators-gdfwt" Dec 03 06:48:59 crc kubenswrapper[4475]: I1203 06:48:59.272659 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/883b2dc3-6da2-4b66-b7e7-d8f7da0f6057-catalog-content\") pod \"certified-operators-gdfwt\" (UID: \"883b2dc3-6da2-4b66-b7e7-d8f7da0f6057\") " pod="openshift-marketplace/certified-operators-gdfwt" Dec 03 06:48:59 crc kubenswrapper[4475]: I1203 06:48:59.272707 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/883b2dc3-6da2-4b66-b7e7-d8f7da0f6057-utilities\") pod \"certified-operators-gdfwt\" (UID: \"883b2dc3-6da2-4b66-b7e7-d8f7da0f6057\") " pod="openshift-marketplace/certified-operators-gdfwt" Dec 03 06:48:59 crc kubenswrapper[4475]: I1203 06:48:59.273069 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/883b2dc3-6da2-4b66-b7e7-d8f7da0f6057-utilities\") pod \"certified-operators-gdfwt\" (UID: \"883b2dc3-6da2-4b66-b7e7-d8f7da0f6057\") " pod="openshift-marketplace/certified-operators-gdfwt" Dec 03 06:48:59 crc kubenswrapper[4475]: I1203 06:48:59.273297 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/883b2dc3-6da2-4b66-b7e7-d8f7da0f6057-catalog-content\") pod \"certified-operators-gdfwt\" (UID: \"883b2dc3-6da2-4b66-b7e7-d8f7da0f6057\") " pod="openshift-marketplace/certified-operators-gdfwt" Dec 03 06:48:59 crc kubenswrapper[4475]: I1203 06:48:59.288731 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lb6wh\" (UniqueName: \"kubernetes.io/projected/883b2dc3-6da2-4b66-b7e7-d8f7da0f6057-kube-api-access-lb6wh\") pod \"certified-operators-gdfwt\" (UID: \"883b2dc3-6da2-4b66-b7e7-d8f7da0f6057\") " 
pod="openshift-marketplace/certified-operators-gdfwt" Dec 03 06:48:59 crc kubenswrapper[4475]: I1203 06:48:59.302475 4475 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-s8gdr"] Dec 03 06:48:59 crc kubenswrapper[4475]: I1203 06:48:59.303395 4475 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-s8gdr" Dec 03 06:48:59 crc kubenswrapper[4475]: I1203 06:48:59.306199 4475 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"community-operators-dockercfg-dmngl" Dec 03 06:48:59 crc kubenswrapper[4475]: I1203 06:48:59.315134 4475 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-s8gdr"] Dec 03 06:48:59 crc kubenswrapper[4475]: I1203 06:48:59.373737 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hbm9x\" (UniqueName: \"kubernetes.io/projected/495a50fc-19f4-49e9-a195-196e75ebf30f-kube-api-access-hbm9x\") pod \"community-operators-s8gdr\" (UID: \"495a50fc-19f4-49e9-a195-196e75ebf30f\") " pod="openshift-marketplace/community-operators-s8gdr" Dec 03 06:48:59 crc kubenswrapper[4475]: I1203 06:48:59.373953 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/495a50fc-19f4-49e9-a195-196e75ebf30f-catalog-content\") pod \"community-operators-s8gdr\" (UID: \"495a50fc-19f4-49e9-a195-196e75ebf30f\") " pod="openshift-marketplace/community-operators-s8gdr" Dec 03 06:48:59 crc kubenswrapper[4475]: I1203 06:48:59.373986 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/495a50fc-19f4-49e9-a195-196e75ebf30f-utilities\") pod \"community-operators-s8gdr\" (UID: \"495a50fc-19f4-49e9-a195-196e75ebf30f\") " 
pod="openshift-marketplace/community-operators-s8gdr" Dec 03 06:48:59 crc kubenswrapper[4475]: I1203 06:48:59.419784 4475 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-gdfwt" Dec 03 06:48:59 crc kubenswrapper[4475]: I1203 06:48:59.475275 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hbm9x\" (UniqueName: \"kubernetes.io/projected/495a50fc-19f4-49e9-a195-196e75ebf30f-kube-api-access-hbm9x\") pod \"community-operators-s8gdr\" (UID: \"495a50fc-19f4-49e9-a195-196e75ebf30f\") " pod="openshift-marketplace/community-operators-s8gdr" Dec 03 06:48:59 crc kubenswrapper[4475]: I1203 06:48:59.475341 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/495a50fc-19f4-49e9-a195-196e75ebf30f-catalog-content\") pod \"community-operators-s8gdr\" (UID: \"495a50fc-19f4-49e9-a195-196e75ebf30f\") " pod="openshift-marketplace/community-operators-s8gdr" Dec 03 06:48:59 crc kubenswrapper[4475]: I1203 06:48:59.475368 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/495a50fc-19f4-49e9-a195-196e75ebf30f-utilities\") pod \"community-operators-s8gdr\" (UID: \"495a50fc-19f4-49e9-a195-196e75ebf30f\") " pod="openshift-marketplace/community-operators-s8gdr" Dec 03 06:48:59 crc kubenswrapper[4475]: I1203 06:48:59.475858 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/495a50fc-19f4-49e9-a195-196e75ebf30f-utilities\") pod \"community-operators-s8gdr\" (UID: \"495a50fc-19f4-49e9-a195-196e75ebf30f\") " pod="openshift-marketplace/community-operators-s8gdr" Dec 03 06:48:59 crc kubenswrapper[4475]: I1203 06:48:59.476270 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: 
\"kubernetes.io/empty-dir/495a50fc-19f4-49e9-a195-196e75ebf30f-catalog-content\") pod \"community-operators-s8gdr\" (UID: \"495a50fc-19f4-49e9-a195-196e75ebf30f\") " pod="openshift-marketplace/community-operators-s8gdr" Dec 03 06:48:59 crc kubenswrapper[4475]: I1203 06:48:59.490437 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hbm9x\" (UniqueName: \"kubernetes.io/projected/495a50fc-19f4-49e9-a195-196e75ebf30f-kube-api-access-hbm9x\") pod \"community-operators-s8gdr\" (UID: \"495a50fc-19f4-49e9-a195-196e75ebf30f\") " pod="openshift-marketplace/community-operators-s8gdr" Dec 03 06:48:59 crc kubenswrapper[4475]: I1203 06:48:59.614433 4475 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-s8gdr" Dec 03 06:48:59 crc kubenswrapper[4475]: I1203 06:48:59.767801 4475 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-gdfwt"] Dec 03 06:48:59 crc kubenswrapper[4475]: I1203 06:48:59.788312 4475 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-s8gdr"] Dec 03 06:48:59 crc kubenswrapper[4475]: W1203 06:48:59.795371 4475 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod495a50fc_19f4_49e9_a195_196e75ebf30f.slice/crio-a9ec87a32bf6edb894551fb5db41ba1323b458c07b1f91d23a445254f6310291 WatchSource:0}: Error finding container a9ec87a32bf6edb894551fb5db41ba1323b458c07b1f91d23a445254f6310291: Status 404 returned error can't find the container with id a9ec87a32bf6edb894551fb5db41ba1323b458c07b1f91d23a445254f6310291 Dec 03 06:49:00 crc kubenswrapper[4475]: I1203 06:49:00.051424 4475 generic.go:334] "Generic (PLEG): container finished" podID="883b2dc3-6da2-4b66-b7e7-d8f7da0f6057" containerID="76398f13e878594524d44423553f9c1cdc5cd421daf181f5b4f467b4f440350e" exitCode=0 Dec 03 06:49:00 crc kubenswrapper[4475]: 
I1203 06:49:00.051504 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-gdfwt" event={"ID":"883b2dc3-6da2-4b66-b7e7-d8f7da0f6057","Type":"ContainerDied","Data":"76398f13e878594524d44423553f9c1cdc5cd421daf181f5b4f467b4f440350e"} Dec 03 06:49:00 crc kubenswrapper[4475]: I1203 06:49:00.051815 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-gdfwt" event={"ID":"883b2dc3-6da2-4b66-b7e7-d8f7da0f6057","Type":"ContainerStarted","Data":"1809e8178c41f89736e0c5e1fdd5d22819ca8417375a1b0db967423ded4c63fe"} Dec 03 06:49:00 crc kubenswrapper[4475]: I1203 06:49:00.054012 4475 generic.go:334] "Generic (PLEG): container finished" podID="71b72b6a-e6f1-4e6d-bf25-0e8427834dd8" containerID="c18b362227217ac737ca05ebbf96ea167673f72e92ff02680e73d1ddcdd0649b" exitCode=0 Dec 03 06:49:00 crc kubenswrapper[4475]: I1203 06:49:00.054070 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-s8xvs" event={"ID":"71b72b6a-e6f1-4e6d-bf25-0e8427834dd8","Type":"ContainerDied","Data":"c18b362227217ac737ca05ebbf96ea167673f72e92ff02680e73d1ddcdd0649b"} Dec 03 06:49:00 crc kubenswrapper[4475]: I1203 06:49:00.058160 4475 generic.go:334] "Generic (PLEG): container finished" podID="91aee7be-4a52-4598-803f-2deebe0674de" containerID="159d103ae2d5d19ea94c57a59b534773f0e32f4cb379a412b63ca743e221096e" exitCode=0 Dec 03 06:49:00 crc kubenswrapper[4475]: I1203 06:49:00.058203 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-tjbzg" event={"ID":"91aee7be-4a52-4598-803f-2deebe0674de","Type":"ContainerDied","Data":"159d103ae2d5d19ea94c57a59b534773f0e32f4cb379a412b63ca743e221096e"} Dec 03 06:49:00 crc kubenswrapper[4475]: I1203 06:49:00.058222 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-tjbzg" 
event={"ID":"91aee7be-4a52-4598-803f-2deebe0674de","Type":"ContainerStarted","Data":"6de2d401c62c0b82b84c560e7fbdf0f3aa849cd94b4d5542285bedcc76efb375"} Dec 03 06:49:00 crc kubenswrapper[4475]: I1203 06:49:00.059969 4475 generic.go:334] "Generic (PLEG): container finished" podID="495a50fc-19f4-49e9-a195-196e75ebf30f" containerID="f98b4cc7559ab2a2c445d6d08920b8953c8aca7c315446bbb70bb55a34616a01" exitCode=0 Dec 03 06:49:00 crc kubenswrapper[4475]: I1203 06:49:00.060009 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-s8gdr" event={"ID":"495a50fc-19f4-49e9-a195-196e75ebf30f","Type":"ContainerDied","Data":"f98b4cc7559ab2a2c445d6d08920b8953c8aca7c315446bbb70bb55a34616a01"} Dec 03 06:49:00 crc kubenswrapper[4475]: I1203 06:49:00.060025 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-s8gdr" event={"ID":"495a50fc-19f4-49e9-a195-196e75ebf30f","Type":"ContainerStarted","Data":"a9ec87a32bf6edb894551fb5db41ba1323b458c07b1f91d23a445254f6310291"} Dec 03 06:49:00 crc kubenswrapper[4475]: I1203 06:49:00.066648 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-8kjrv" event={"ID":"bfbe13ae-894d-43d5-b757-f4e8f61ab84f","Type":"ContainerStarted","Data":"34302757fcb426118f46fbf96e053c315c5e41b575cf29ec576f3e9aa944ea7a"} Dec 03 06:49:00 crc kubenswrapper[4475]: I1203 06:49:00.103326 4475 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-8kjrv" podStartSLOduration=2.567109835 podStartE2EDuration="4.103305s" podCreationTimestamp="2025-12-03 06:48:56 +0000 UTC" firstStartedPulling="2025-12-03 06:48:58.039522948 +0000 UTC m=+222.844421281" lastFinishedPulling="2025-12-03 06:48:59.575718113 +0000 UTC m=+224.380616446" observedRunningTime="2025-12-03 06:49:00.102838536 +0000 UTC m=+224.907736869" watchObservedRunningTime="2025-12-03 06:49:00.103305 +0000 UTC m=+224.908203334" 
Dec 03 06:49:01 crc kubenswrapper[4475]: I1203 06:49:01.072077 4475 generic.go:334] "Generic (PLEG): container finished" podID="883b2dc3-6da2-4b66-b7e7-d8f7da0f6057" containerID="bbce2bda2f220b0d5016adce6fbfe8a2362648a0dca7c52d02760f6db39096eb" exitCode=0 Dec 03 06:49:01 crc kubenswrapper[4475]: I1203 06:49:01.072273 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-gdfwt" event={"ID":"883b2dc3-6da2-4b66-b7e7-d8f7da0f6057","Type":"ContainerDied","Data":"bbce2bda2f220b0d5016adce6fbfe8a2362648a0dca7c52d02760f6db39096eb"} Dec 03 06:49:01 crc kubenswrapper[4475]: I1203 06:49:01.076879 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-s8xvs" event={"ID":"71b72b6a-e6f1-4e6d-bf25-0e8427834dd8","Type":"ContainerStarted","Data":"8162322bcb1178440e6b3d09e041a953f23ecb00a981ddec0b525e590242227c"} Dec 03 06:49:01 crc kubenswrapper[4475]: I1203 06:49:01.078657 4475 generic.go:334] "Generic (PLEG): container finished" podID="495a50fc-19f4-49e9-a195-196e75ebf30f" containerID="013de5e5f407ac12eca079d900b3f766064f6cc13fd70080b6c285f545d161d1" exitCode=0 Dec 03 06:49:01 crc kubenswrapper[4475]: I1203 06:49:01.079526 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-s8gdr" event={"ID":"495a50fc-19f4-49e9-a195-196e75ebf30f","Type":"ContainerDied","Data":"013de5e5f407ac12eca079d900b3f766064f6cc13fd70080b6c285f545d161d1"} Dec 03 06:49:01 crc kubenswrapper[4475]: I1203 06:49:01.116600 4475 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-s8xvs" podStartSLOduration=2.549899109 podStartE2EDuration="5.116585449s" podCreationTimestamp="2025-12-03 06:48:56 +0000 UTC" firstStartedPulling="2025-12-03 06:48:58.035608156 +0000 UTC m=+222.840506490" lastFinishedPulling="2025-12-03 06:49:00.602294496 +0000 UTC m=+225.407192830" observedRunningTime="2025-12-03 06:49:01.111684678 +0000 
UTC m=+225.916583013" watchObservedRunningTime="2025-12-03 06:49:01.116585449 +0000 UTC m=+225.921483784" Dec 03 06:49:03 crc kubenswrapper[4475]: I1203 06:49:03.090414 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-s8gdr" event={"ID":"495a50fc-19f4-49e9-a195-196e75ebf30f","Type":"ContainerStarted","Data":"f0a1cb37c3c99a18ef363d8ef0cf1de9f693eaa36913ec74adcf9d6633ab9ab5"} Dec 03 06:49:03 crc kubenswrapper[4475]: I1203 06:49:03.092171 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-gdfwt" event={"ID":"883b2dc3-6da2-4b66-b7e7-d8f7da0f6057","Type":"ContainerStarted","Data":"6d7a30030a316444407a9c792ae2b81316fc2d6f2c38f2f26fc4f915e135e6a4"} Dec 03 06:49:03 crc kubenswrapper[4475]: I1203 06:49:03.104794 4475 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-s8gdr" podStartSLOduration=2.6149283739999998 podStartE2EDuration="4.104781931s" podCreationTimestamp="2025-12-03 06:48:59 +0000 UTC" firstStartedPulling="2025-12-03 06:49:00.060770011 +0000 UTC m=+224.865668346" lastFinishedPulling="2025-12-03 06:49:01.550623569 +0000 UTC m=+226.355521903" observedRunningTime="2025-12-03 06:49:03.102730163 +0000 UTC m=+227.907628497" watchObservedRunningTime="2025-12-03 06:49:03.104781931 +0000 UTC m=+227.909680266" Dec 03 06:49:03 crc kubenswrapper[4475]: I1203 06:49:03.118681 4475 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-gdfwt" podStartSLOduration=2.582974869 podStartE2EDuration="4.118668554s" podCreationTimestamp="2025-12-03 06:48:59 +0000 UTC" firstStartedPulling="2025-12-03 06:49:00.052729833 +0000 UTC m=+224.857628167" lastFinishedPulling="2025-12-03 06:49:01.588423527 +0000 UTC m=+226.393321852" observedRunningTime="2025-12-03 06:49:03.118498365 +0000 UTC m=+227.923396699" watchObservedRunningTime="2025-12-03 06:49:03.118668554 
+0000 UTC m=+227.923566888" Dec 03 06:49:03 crc kubenswrapper[4475]: I1203 06:49:03.821199 4475 kubelet.go:2421] "SyncLoop ADD" source="file" pods=["openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"] Dec 03 06:49:03 crc kubenswrapper[4475]: I1203 06:49:03.821755 4475 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Dec 03 06:49:03 crc kubenswrapper[4475]: I1203 06:49:03.822320 4475 kubelet.go:2431] "SyncLoop REMOVE" source="file" pods=["openshift-kube-apiserver/kube-apiserver-crc"] Dec 03 06:49:03 crc kubenswrapper[4475]: I1203 06:49:03.822508 4475 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" containerID="cri-o://822cdbfb2e81d80c5de0253daa42f2a5c89e9cd0eb8a5c3cf620780d17f9a6d0" gracePeriod=15 Dec 03 06:49:03 crc kubenswrapper[4475]: I1203 06:49:03.822618 4475 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" containerID="cri-o://7fc0ee9e5a408a0a9e701afaf1db7bc3f58fd1830044730e9c680664642b5e4e" gracePeriod=15 Dec 03 06:49:03 crc kubenswrapper[4475]: I1203 06:49:03.822653 4475 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-insecure-readyz" containerID="cri-o://1dd8bd42f01469966b55416fc8af1dd71d341c774263bb3a56190af4cd9e7daa" gracePeriod=15 Dec 03 06:49:03 crc kubenswrapper[4475]: I1203 06:49:03.822683 4475 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-regeneration-controller" 
containerID="cri-o://d66a9136874b2e25c94cd291aa6d7f4694ac409f16766fd69c8aab8068a441fb" gracePeriod=15 Dec 03 06:49:03 crc kubenswrapper[4475]: I1203 06:49:03.822709 4475 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-syncer" containerID="cri-o://e40c4f29925f494c0f5f01e2ecbcd2e4db2a5f3911a55a874c6d0006f01982de" gracePeriod=15 Dec 03 06:49:03 crc kubenswrapper[4475]: I1203 06:49:03.823242 4475 kubelet.go:2421] "SyncLoop ADD" source="file" pods=["openshift-kube-apiserver/kube-apiserver-crc"] Dec 03 06:49:03 crc kubenswrapper[4475]: E1203 06:49:03.823439 4475 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Dec 03 06:49:03 crc kubenswrapper[4475]: I1203 06:49:03.823464 4475 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Dec 03 06:49:03 crc kubenswrapper[4475]: E1203 06:49:03.823475 4475 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-syncer" Dec 03 06:49:03 crc kubenswrapper[4475]: I1203 06:49:03.823481 4475 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-syncer" Dec 03 06:49:03 crc kubenswrapper[4475]: E1203 06:49:03.823489 4475 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-regeneration-controller" Dec 03 06:49:03 crc kubenswrapper[4475]: I1203 06:49:03.823495 4475 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-regeneration-controller" Dec 03 06:49:03 crc kubenswrapper[4475]: E1203 06:49:03.823500 4475 
cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="setup" Dec 03 06:49:03 crc kubenswrapper[4475]: I1203 06:49:03.823505 4475 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="setup" Dec 03 06:49:03 crc kubenswrapper[4475]: E1203 06:49:03.823514 4475 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-insecure-readyz" Dec 03 06:49:03 crc kubenswrapper[4475]: I1203 06:49:03.823520 4475 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-insecure-readyz" Dec 03 06:49:03 crc kubenswrapper[4475]: E1203 06:49:03.823527 4475 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" Dec 03 06:49:03 crc kubenswrapper[4475]: I1203 06:49:03.823533 4475 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" Dec 03 06:49:03 crc kubenswrapper[4475]: I1203 06:49:03.823615 4475 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-regeneration-controller" Dec 03 06:49:03 crc kubenswrapper[4475]: I1203 06:49:03.823623 4475 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Dec 03 06:49:03 crc kubenswrapper[4475]: I1203 06:49:03.823631 4475 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" Dec 03 06:49:03 crc kubenswrapper[4475]: I1203 06:49:03.823637 4475 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-insecure-readyz" Dec 03 06:49:03 crc 
kubenswrapper[4475]: I1203 06:49:03.823648 4475 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-syncer" Dec 03 06:49:03 crc kubenswrapper[4475]: I1203 06:49:03.922918 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Dec 03 06:49:03 crc kubenswrapper[4475]: I1203 06:49:03.922980 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Dec 03 06:49:03 crc kubenswrapper[4475]: I1203 06:49:03.922997 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 03 06:49:03 crc kubenswrapper[4475]: I1203 06:49:03.923044 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Dec 03 06:49:03 crc kubenswrapper[4475]: I1203 06:49:03.923061 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Dec 03 06:49:03 crc kubenswrapper[4475]: I1203 06:49:03.923111 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 03 06:49:03 crc kubenswrapper[4475]: I1203 06:49:03.923127 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Dec 03 06:49:03 crc kubenswrapper[4475]: I1203 06:49:03.923141 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 03 06:49:04 crc kubenswrapper[4475]: I1203 06:49:04.024346 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 03 06:49:04 crc kubenswrapper[4475]: I1203 06:49:04.024598 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"manifests\" (UniqueName: 
\"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Dec 03 06:49:04 crc kubenswrapper[4475]: I1203 06:49:04.024460 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 03 06:49:04 crc kubenswrapper[4475]: I1203 06:49:04.024620 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 03 06:49:04 crc kubenswrapper[4475]: I1203 06:49:04.024667 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Dec 03 06:49:04 crc kubenswrapper[4475]: I1203 06:49:04.024690 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Dec 03 06:49:04 crc kubenswrapper[4475]: I1203 06:49:04.024727 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-resource-dir\") pod 
\"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 03 06:49:04 crc kubenswrapper[4475]: I1203 06:49:04.024760 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Dec 03 06:49:04 crc kubenswrapper[4475]: I1203 06:49:04.024765 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Dec 03 06:49:04 crc kubenswrapper[4475]: I1203 06:49:04.024740 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Dec 03 06:49:04 crc kubenswrapper[4475]: I1203 06:49:04.024807 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 03 06:49:04 crc kubenswrapper[4475]: I1203 06:49:04.024849 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " 
pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Dec 03 06:49:04 crc kubenswrapper[4475]: I1203 06:49:04.024871 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Dec 03 06:49:04 crc kubenswrapper[4475]: I1203 06:49:04.024909 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 03 06:49:04 crc kubenswrapper[4475]: I1203 06:49:04.024922 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Dec 03 06:49:04 crc kubenswrapper[4475]: I1203 06:49:04.024929 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Dec 03 06:49:04 crc kubenswrapper[4475]: I1203 06:49:04.098940 4475 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-cert-syncer/0.log" Dec 03 06:49:04 crc kubenswrapper[4475]: I1203 06:49:04.099535 4475 generic.go:334] "Generic (PLEG): container finished" 
podID="f4b27818a5e8e43d0dc095d08835c792" containerID="7fc0ee9e5a408a0a9e701afaf1db7bc3f58fd1830044730e9c680664642b5e4e" exitCode=0 Dec 03 06:49:04 crc kubenswrapper[4475]: I1203 06:49:04.099557 4475 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="1dd8bd42f01469966b55416fc8af1dd71d341c774263bb3a56190af4cd9e7daa" exitCode=0 Dec 03 06:49:04 crc kubenswrapper[4475]: I1203 06:49:04.099564 4475 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="d66a9136874b2e25c94cd291aa6d7f4694ac409f16766fd69c8aab8068a441fb" exitCode=0 Dec 03 06:49:04 crc kubenswrapper[4475]: I1203 06:49:04.099572 4475 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="e40c4f29925f494c0f5f01e2ecbcd2e4db2a5f3911a55a874c6d0006f01982de" exitCode=2 Dec 03 06:49:04 crc kubenswrapper[4475]: I1203 06:49:04.100991 4475 generic.go:334] "Generic (PLEG): container finished" podID="333f027c-627f-4107-93a2-f522a583a5ed" containerID="2af3c7a09ed5c49c67d4fdc0b8d8a5e2eb11d1683871ab83696f0c6b3a36e157" exitCode=0 Dec 03 06:49:04 crc kubenswrapper[4475]: I1203 06:49:04.101077 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-9-crc" event={"ID":"333f027c-627f-4107-93a2-f522a583a5ed","Type":"ContainerDied","Data":"2af3c7a09ed5c49c67d4fdc0b8d8a5e2eb11d1683871ab83696f0c6b3a36e157"} Dec 03 06:49:04 crc kubenswrapper[4475]: I1203 06:49:04.102148 4475 status_manager.go:851] "Failed to get status for pod" podUID="333f027c-627f-4107-93a2-f522a583a5ed" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 192.168.25.177:6443: connect: connection refused" Dec 03 06:49:04 crc kubenswrapper[4475]: I1203 06:49:04.102375 4475 status_manager.go:851] "Failed to get status for pod" 
podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 192.168.25.177:6443: connect: connection refused" Dec 03 06:49:05 crc kubenswrapper[4475]: I1203 06:49:05.280290 4475 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/installer-9-crc" Dec 03 06:49:05 crc kubenswrapper[4475]: I1203 06:49:05.280780 4475 status_manager.go:851] "Failed to get status for pod" podUID="333f027c-627f-4107-93a2-f522a583a5ed" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 192.168.25.177:6443: connect: connection refused" Dec 03 06:49:05 crc kubenswrapper[4475]: I1203 06:49:05.280950 4475 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 192.168.25.177:6443: connect: connection refused" Dec 03 06:49:05 crc kubenswrapper[4475]: I1203 06:49:05.339669 4475 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/333f027c-627f-4107-93a2-f522a583a5ed-var-lock\") pod \"333f027c-627f-4107-93a2-f522a583a5ed\" (UID: \"333f027c-627f-4107-93a2-f522a583a5ed\") " Dec 03 06:49:05 crc kubenswrapper[4475]: I1203 06:49:05.339712 4475 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/333f027c-627f-4107-93a2-f522a583a5ed-kubelet-dir\") pod \"333f027c-627f-4107-93a2-f522a583a5ed\" (UID: \"333f027c-627f-4107-93a2-f522a583a5ed\") " Dec 03 06:49:05 crc kubenswrapper[4475]: I1203 06:49:05.339735 4475 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/333f027c-627f-4107-93a2-f522a583a5ed-kube-api-access\") pod \"333f027c-627f-4107-93a2-f522a583a5ed\" (UID: \"333f027c-627f-4107-93a2-f522a583a5ed\") " Dec 03 06:49:05 crc kubenswrapper[4475]: I1203 06:49:05.339787 4475 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/333f027c-627f-4107-93a2-f522a583a5ed-var-lock" (OuterVolumeSpecName: "var-lock") pod "333f027c-627f-4107-93a2-f522a583a5ed" (UID: "333f027c-627f-4107-93a2-f522a583a5ed"). InnerVolumeSpecName "var-lock". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 03 06:49:05 crc kubenswrapper[4475]: I1203 06:49:05.339812 4475 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/333f027c-627f-4107-93a2-f522a583a5ed-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "333f027c-627f-4107-93a2-f522a583a5ed" (UID: "333f027c-627f-4107-93a2-f522a583a5ed"). InnerVolumeSpecName "kubelet-dir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 03 06:49:05 crc kubenswrapper[4475]: I1203 06:49:05.340030 4475 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/333f027c-627f-4107-93a2-f522a583a5ed-kubelet-dir\") on node \"crc\" DevicePath \"\"" Dec 03 06:49:05 crc kubenswrapper[4475]: I1203 06:49:05.340051 4475 reconciler_common.go:293] "Volume detached for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/333f027c-627f-4107-93a2-f522a583a5ed-var-lock\") on node \"crc\" DevicePath \"\"" Dec 03 06:49:05 crc kubenswrapper[4475]: I1203 06:49:05.344547 4475 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/333f027c-627f-4107-93a2-f522a583a5ed-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "333f027c-627f-4107-93a2-f522a583a5ed" (UID: "333f027c-627f-4107-93a2-f522a583a5ed"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 06:49:05 crc kubenswrapper[4475]: I1203 06:49:05.441648 4475 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/333f027c-627f-4107-93a2-f522a583a5ed-kube-api-access\") on node \"crc\" DevicePath \"\"" Dec 03 06:49:05 crc kubenswrapper[4475]: I1203 06:49:05.493198 4475 status_manager.go:851] "Failed to get status for pod" podUID="333f027c-627f-4107-93a2-f522a583a5ed" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 192.168.25.177:6443: connect: connection refused" Dec 03 06:49:05 crc kubenswrapper[4475]: I1203 06:49:05.493984 4475 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get 
\"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 192.168.25.177:6443: connect: connection refused" Dec 03 06:49:06 crc kubenswrapper[4475]: I1203 06:49:06.110326 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-9-crc" event={"ID":"333f027c-627f-4107-93a2-f522a583a5ed","Type":"ContainerDied","Data":"ad69205d75c1fed21d9ade1b846da0d22c2cfe3d8c8f5fd3bdf6815f06f39914"} Dec 03 06:49:06 crc kubenswrapper[4475]: I1203 06:49:06.110659 4475 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ad69205d75c1fed21d9ade1b846da0d22c2cfe3d8c8f5fd3bdf6815f06f39914" Dec 03 06:49:06 crc kubenswrapper[4475]: I1203 06:49:06.110364 4475 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/installer-9-crc" Dec 03 06:49:06 crc kubenswrapper[4475]: I1203 06:49:06.113272 4475 status_manager.go:851] "Failed to get status for pod" podUID="333f027c-627f-4107-93a2-f522a583a5ed" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 192.168.25.177:6443: connect: connection refused" Dec 03 06:49:07 crc kubenswrapper[4475]: I1203 06:49:07.017521 4475 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-8kjrv" Dec 03 06:49:07 crc kubenswrapper[4475]: I1203 06:49:07.018230 4475 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-8kjrv" Dec 03 06:49:07 crc kubenswrapper[4475]: I1203 06:49:07.045555 4475 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-8kjrv" Dec 03 06:49:07 crc kubenswrapper[4475]: I1203 06:49:07.045862 4475 status_manager.go:851] "Failed to get status for pod" 
podUID="bfbe13ae-894d-43d5-b757-f4e8f61ab84f" pod="openshift-marketplace/redhat-marketplace-8kjrv" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-8kjrv\": dial tcp 192.168.25.177:6443: connect: connection refused" Dec 03 06:49:07 crc kubenswrapper[4475]: I1203 06:49:07.046122 4475 status_manager.go:851] "Failed to get status for pod" podUID="333f027c-627f-4107-93a2-f522a583a5ed" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 192.168.25.177:6443: connect: connection refused" Dec 03 06:49:07 crc kubenswrapper[4475]: I1203 06:49:07.117940 4475 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-cert-syncer/0.log" Dec 03 06:49:07 crc kubenswrapper[4475]: I1203 06:49:07.118792 4475 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="822cdbfb2e81d80c5de0253daa42f2a5c89e9cd0eb8a5c3cf620780d17f9a6d0" exitCode=0 Dec 03 06:49:07 crc kubenswrapper[4475]: I1203 06:49:07.145684 4475 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-8kjrv" Dec 03 06:49:07 crc kubenswrapper[4475]: I1203 06:49:07.146034 4475 status_manager.go:851] "Failed to get status for pod" podUID="333f027c-627f-4107-93a2-f522a583a5ed" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 192.168.25.177:6443: connect: connection refused" Dec 03 06:49:07 crc kubenswrapper[4475]: I1203 06:49:07.146318 4475 status_manager.go:851] "Failed to get status for pod" podUID="bfbe13ae-894d-43d5-b757-f4e8f61ab84f" pod="openshift-marketplace/redhat-marketplace-8kjrv" err="Get 
\"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-8kjrv\": dial tcp 192.168.25.177:6443: connect: connection refused" Dec 03 06:49:07 crc kubenswrapper[4475]: I1203 06:49:07.216905 4475 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-s8xvs" Dec 03 06:49:07 crc kubenswrapper[4475]: I1203 06:49:07.216937 4475 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-s8xvs" Dec 03 06:49:07 crc kubenswrapper[4475]: I1203 06:49:07.242505 4475 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-s8xvs" Dec 03 06:49:07 crc kubenswrapper[4475]: I1203 06:49:07.242880 4475 status_manager.go:851] "Failed to get status for pod" podUID="71b72b6a-e6f1-4e6d-bf25-0e8427834dd8" pod="openshift-marketplace/redhat-operators-s8xvs" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-s8xvs\": dial tcp 192.168.25.177:6443: connect: connection refused" Dec 03 06:49:07 crc kubenswrapper[4475]: I1203 06:49:07.243119 4475 status_manager.go:851] "Failed to get status for pod" podUID="333f027c-627f-4107-93a2-f522a583a5ed" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 192.168.25.177:6443: connect: connection refused" Dec 03 06:49:07 crc kubenswrapper[4475]: I1203 06:49:07.243314 4475 status_manager.go:851] "Failed to get status for pod" podUID="bfbe13ae-894d-43d5-b757-f4e8f61ab84f" pod="openshift-marketplace/redhat-marketplace-8kjrv" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-8kjrv\": dial tcp 192.168.25.177:6443: connect: connection refused" Dec 03 06:49:07 crc kubenswrapper[4475]: I1203 06:49:07.277177 4475 log.go:25] "Finished parsing 
log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-cert-syncer/0.log" Dec 03 06:49:07 crc kubenswrapper[4475]: I1203 06:49:07.278238 4475 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 03 06:49:07 crc kubenswrapper[4475]: I1203 06:49:07.278879 4475 status_manager.go:851] "Failed to get status for pod" podUID="71b72b6a-e6f1-4e6d-bf25-0e8427834dd8" pod="openshift-marketplace/redhat-operators-s8xvs" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-s8xvs\": dial tcp 192.168.25.177:6443: connect: connection refused" Dec 03 06:49:07 crc kubenswrapper[4475]: I1203 06:49:07.279208 4475 status_manager.go:851] "Failed to get status for pod" podUID="333f027c-627f-4107-93a2-f522a583a5ed" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 192.168.25.177:6443: connect: connection refused" Dec 03 06:49:07 crc kubenswrapper[4475]: I1203 06:49:07.279424 4475 status_manager.go:851] "Failed to get status for pod" podUID="bfbe13ae-894d-43d5-b757-f4e8f61ab84f" pod="openshift-marketplace/redhat-marketplace-8kjrv" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-8kjrv\": dial tcp 192.168.25.177:6443: connect: connection refused" Dec 03 06:49:07 crc kubenswrapper[4475]: I1203 06:49:07.279638 4475 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 192.168.25.177:6443: connect: connection refused" Dec 03 06:49:07 crc kubenswrapper[4475]: I1203 06:49:07.362206 4475 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"f4b27818a5e8e43d0dc095d08835c792\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " Dec 03 06:49:07 crc kubenswrapper[4475]: I1203 06:49:07.362260 4475 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"f4b27818a5e8e43d0dc095d08835c792\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " Dec 03 06:49:07 crc kubenswrapper[4475]: I1203 06:49:07.362294 4475 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"f4b27818a5e8e43d0dc095d08835c792\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " Dec 03 06:49:07 crc kubenswrapper[4475]: I1203 06:49:07.362313 4475 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir" (OuterVolumeSpecName: "resource-dir") pod "f4b27818a5e8e43d0dc095d08835c792" (UID: "f4b27818a5e8e43d0dc095d08835c792"). InnerVolumeSpecName "resource-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 03 06:49:07 crc kubenswrapper[4475]: I1203 06:49:07.362354 4475 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir" (OuterVolumeSpecName: "audit-dir") pod "f4b27818a5e8e43d0dc095d08835c792" (UID: "f4b27818a5e8e43d0dc095d08835c792"). InnerVolumeSpecName "audit-dir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 03 06:49:07 crc kubenswrapper[4475]: I1203 06:49:07.362434 4475 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir" (OuterVolumeSpecName: "cert-dir") pod "f4b27818a5e8e43d0dc095d08835c792" (UID: "f4b27818a5e8e43d0dc095d08835c792"). InnerVolumeSpecName "cert-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 03 06:49:07 crc kubenswrapper[4475]: I1203 06:49:07.362656 4475 reconciler_common.go:293] "Volume detached for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") on node \"crc\" DevicePath \"\"" Dec 03 06:49:07 crc kubenswrapper[4475]: I1203 06:49:07.362675 4475 reconciler_common.go:293] "Volume detached for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") on node \"crc\" DevicePath \"\"" Dec 03 06:49:07 crc kubenswrapper[4475]: I1203 06:49:07.362684 4475 reconciler_common.go:293] "Volume detached for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") on node \"crc\" DevicePath \"\"" Dec 03 06:49:07 crc kubenswrapper[4475]: I1203 06:49:07.495648 4475 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f4b27818a5e8e43d0dc095d08835c792" path="/var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/volumes" Dec 03 06:49:08 crc kubenswrapper[4475]: I1203 06:49:08.124007 4475 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-cert-syncer/0.log" Dec 03 06:49:08 crc kubenswrapper[4475]: I1203 06:49:08.124966 4475 scope.go:117] "RemoveContainer" containerID="7fc0ee9e5a408a0a9e701afaf1db7bc3f58fd1830044730e9c680664642b5e4e" Dec 03 06:49:08 crc kubenswrapper[4475]: I1203 06:49:08.125876 4475 util.go:48] "No ready sandbox for pod 
can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 03 06:49:08 crc kubenswrapper[4475]: I1203 06:49:08.126436 4475 status_manager.go:851] "Failed to get status for pod" podUID="333f027c-627f-4107-93a2-f522a583a5ed" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 192.168.25.177:6443: connect: connection refused" Dec 03 06:49:08 crc kubenswrapper[4475]: I1203 06:49:08.126870 4475 status_manager.go:851] "Failed to get status for pod" podUID="bfbe13ae-894d-43d5-b757-f4e8f61ab84f" pod="openshift-marketplace/redhat-marketplace-8kjrv" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-8kjrv\": dial tcp 192.168.25.177:6443: connect: connection refused" Dec 03 06:49:08 crc kubenswrapper[4475]: I1203 06:49:08.127106 4475 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 192.168.25.177:6443: connect: connection refused" Dec 03 06:49:08 crc kubenswrapper[4475]: I1203 06:49:08.127351 4475 status_manager.go:851] "Failed to get status for pod" podUID="71b72b6a-e6f1-4e6d-bf25-0e8427834dd8" pod="openshift-marketplace/redhat-operators-s8xvs" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-s8xvs\": dial tcp 192.168.25.177:6443: connect: connection refused" Dec 03 06:49:08 crc kubenswrapper[4475]: I1203 06:49:08.128835 4475 status_manager.go:851] "Failed to get status for pod" podUID="bfbe13ae-894d-43d5-b757-f4e8f61ab84f" pod="openshift-marketplace/redhat-marketplace-8kjrv" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-8kjrv\": dial tcp 
192.168.25.177:6443: connect: connection refused" Dec 03 06:49:08 crc kubenswrapper[4475]: I1203 06:49:08.129030 4475 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 192.168.25.177:6443: connect: connection refused" Dec 03 06:49:08 crc kubenswrapper[4475]: I1203 06:49:08.129233 4475 status_manager.go:851] "Failed to get status for pod" podUID="71b72b6a-e6f1-4e6d-bf25-0e8427834dd8" pod="openshift-marketplace/redhat-operators-s8xvs" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-s8xvs\": dial tcp 192.168.25.177:6443: connect: connection refused" Dec 03 06:49:08 crc kubenswrapper[4475]: I1203 06:49:08.129666 4475 status_manager.go:851] "Failed to get status for pod" podUID="333f027c-627f-4107-93a2-f522a583a5ed" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 192.168.25.177:6443: connect: connection refused" Dec 03 06:49:08 crc kubenswrapper[4475]: I1203 06:49:08.136652 4475 scope.go:117] "RemoveContainer" containerID="1dd8bd42f01469966b55416fc8af1dd71d341c774263bb3a56190af4cd9e7daa" Dec 03 06:49:08 crc kubenswrapper[4475]: I1203 06:49:08.145198 4475 scope.go:117] "RemoveContainer" containerID="d66a9136874b2e25c94cd291aa6d7f4694ac409f16766fd69c8aab8068a441fb" Dec 03 06:49:08 crc kubenswrapper[4475]: I1203 06:49:08.152929 4475 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-s8xvs" Dec 03 06:49:08 crc kubenswrapper[4475]: I1203 06:49:08.153378 4475 status_manager.go:851] "Failed to get status for pod" podUID="71b72b6a-e6f1-4e6d-bf25-0e8427834dd8" pod="openshift-marketplace/redhat-operators-s8xvs" err="Get 
\"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-s8xvs\": dial tcp 192.168.25.177:6443: connect: connection refused" Dec 03 06:49:08 crc kubenswrapper[4475]: I1203 06:49:08.154025 4475 status_manager.go:851] "Failed to get status for pod" podUID="333f027c-627f-4107-93a2-f522a583a5ed" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 192.168.25.177:6443: connect: connection refused" Dec 03 06:49:08 crc kubenswrapper[4475]: I1203 06:49:08.154238 4475 status_manager.go:851] "Failed to get status for pod" podUID="bfbe13ae-894d-43d5-b757-f4e8f61ab84f" pod="openshift-marketplace/redhat-marketplace-8kjrv" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-8kjrv\": dial tcp 192.168.25.177:6443: connect: connection refused" Dec 03 06:49:08 crc kubenswrapper[4475]: I1203 06:49:08.154440 4475 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 192.168.25.177:6443: connect: connection refused" Dec 03 06:49:08 crc kubenswrapper[4475]: I1203 06:49:08.154463 4475 scope.go:117] "RemoveContainer" containerID="e40c4f29925f494c0f5f01e2ecbcd2e4db2a5f3911a55a874c6d0006f01982de" Dec 03 06:49:08 crc kubenswrapper[4475]: I1203 06:49:08.163153 4475 scope.go:117] "RemoveContainer" containerID="822cdbfb2e81d80c5de0253daa42f2a5c89e9cd0eb8a5c3cf620780d17f9a6d0" Dec 03 06:49:08 crc kubenswrapper[4475]: I1203 06:49:08.173408 4475 scope.go:117] "RemoveContainer" containerID="5da1155d7b5e933e5db3acc4c1a3fa1b3b90fd79289641f9a3d1290956128628" Dec 03 06:49:08 crc kubenswrapper[4475]: E1203 06:49:08.560845 4475 desired_state_of_world_populator.go:312] "Error processing 
volume" err="error processing PVC openshift-image-registry/crc-image-registry-storage: failed to fetch PVC from API server: Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-image-registry/persistentvolumeclaims/crc-image-registry-storage\": dial tcp 192.168.25.177:6443: connect: connection refused" pod="openshift-image-registry/image-registry-697d97f7c8-dcjv5" volumeName="registry-storage" Dec 03 06:49:08 crc kubenswrapper[4475]: E1203 06:49:08.847485 4475 kubelet.go:1929] "Failed creating a mirror pod for" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods\": dial tcp 192.168.25.177:6443: connect: connection refused" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Dec 03 06:49:08 crc kubenswrapper[4475]: I1203 06:49:08.847848 4475 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Dec 03 06:49:08 crc kubenswrapper[4475]: W1203 06:49:08.861948 4475 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf85e55b1a89d02b0cb034b1ea31ed45a.slice/crio-6bcffc242d19d4fe1375949af49cabe4613962e74ac9b58c09615934b8798ce0 WatchSource:0}: Error finding container 6bcffc242d19d4fe1375949af49cabe4613962e74ac9b58c09615934b8798ce0: Status 404 returned error can't find the container with id 6bcffc242d19d4fe1375949af49cabe4613962e74ac9b58c09615934b8798ce0 Dec 03 06:49:08 crc kubenswrapper[4475]: E1203 06:49:08.864786 4475 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/events\": dial tcp 192.168.25.177:6443: connect: connection refused" event="&Event{ObjectMeta:{kube-apiserver-startup-monitor-crc.187da1d132606a76 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-startup-monitor-crc,UID:f85e55b1a89d02b0cb034b1ea31ed45a,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{startup-monitor},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2025-12-03 06:49:08.864518774 +0000 UTC m=+233.669417108,LastTimestamp:2025-12-03 06:49:08.864518774 +0000 UTC m=+233.669417108,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Dec 03 06:49:09 crc kubenswrapper[4475]: I1203 06:49:09.131021 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" event={"ID":"f85e55b1a89d02b0cb034b1ea31ed45a","Type":"ContainerStarted","Data":"25b786b283d45273e7bef30e6323acbe75160cb93bdc1a7be492799d461604cf"} Dec 03 06:49:09 crc kubenswrapper[4475]: I1203 06:49:09.131212 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" event={"ID":"f85e55b1a89d02b0cb034b1ea31ed45a","Type":"ContainerStarted","Data":"6bcffc242d19d4fe1375949af49cabe4613962e74ac9b58c09615934b8798ce0"} Dec 03 06:49:09 crc kubenswrapper[4475]: E1203 06:49:09.131695 4475 kubelet.go:1929] "Failed creating a mirror pod for" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods\": dial tcp 192.168.25.177:6443: connect: connection refused" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Dec 03 06:49:09 crc kubenswrapper[4475]: I1203 06:49:09.131756 4475 status_manager.go:851] "Failed to get status for pod" podUID="71b72b6a-e6f1-4e6d-bf25-0e8427834dd8" pod="openshift-marketplace/redhat-operators-s8xvs" 
err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-s8xvs\": dial tcp 192.168.25.177:6443: connect: connection refused" Dec 03 06:49:09 crc kubenswrapper[4475]: I1203 06:49:09.132145 4475 status_manager.go:851] "Failed to get status for pod" podUID="333f027c-627f-4107-93a2-f522a583a5ed" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 192.168.25.177:6443: connect: connection refused" Dec 03 06:49:09 crc kubenswrapper[4475]: I1203 06:49:09.132338 4475 status_manager.go:851] "Failed to get status for pod" podUID="bfbe13ae-894d-43d5-b757-f4e8f61ab84f" pod="openshift-marketplace/redhat-marketplace-8kjrv" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-8kjrv\": dial tcp 192.168.25.177:6443: connect: connection refused" Dec 03 06:49:09 crc kubenswrapper[4475]: I1203 06:49:09.132582 4475 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 192.168.25.177:6443: connect: connection refused" Dec 03 06:49:09 crc kubenswrapper[4475]: E1203 06:49:09.364289 4475 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 192.168.25.177:6443: connect: connection refused" Dec 03 06:49:09 crc kubenswrapper[4475]: E1203 06:49:09.364561 4475 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 192.168.25.177:6443: connect: connection refused" Dec 03 06:49:09 crc kubenswrapper[4475]: E1203 06:49:09.364844 
4475 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 192.168.25.177:6443: connect: connection refused" Dec 03 06:49:09 crc kubenswrapper[4475]: E1203 06:49:09.365071 4475 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 192.168.25.177:6443: connect: connection refused" Dec 03 06:49:09 crc kubenswrapper[4475]: E1203 06:49:09.365310 4475 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 192.168.25.177:6443: connect: connection refused" Dec 03 06:49:09 crc kubenswrapper[4475]: I1203 06:49:09.365338 4475 controller.go:115] "failed to update lease using latest lease, fallback to ensure lease" err="failed 5 attempts to update lease" Dec 03 06:49:09 crc kubenswrapper[4475]: E1203 06:49:09.365574 4475 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 192.168.25.177:6443: connect: connection refused" interval="200ms" Dec 03 06:49:09 crc kubenswrapper[4475]: I1203 06:49:09.420269 4475 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-gdfwt" Dec 03 06:49:09 crc kubenswrapper[4475]: I1203 06:49:09.420302 4475 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-gdfwt" Dec 03 06:49:09 crc kubenswrapper[4475]: I1203 06:49:09.447522 4475 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-gdfwt" Dec 03 06:49:09 crc kubenswrapper[4475]: I1203 06:49:09.447828 4475 
status_manager.go:851] "Failed to get status for pod" podUID="883b2dc3-6da2-4b66-b7e7-d8f7da0f6057" pod="openshift-marketplace/certified-operators-gdfwt" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-gdfwt\": dial tcp 192.168.25.177:6443: connect: connection refused" Dec 03 06:49:09 crc kubenswrapper[4475]: I1203 06:49:09.448112 4475 status_manager.go:851] "Failed to get status for pod" podUID="71b72b6a-e6f1-4e6d-bf25-0e8427834dd8" pod="openshift-marketplace/redhat-operators-s8xvs" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-s8xvs\": dial tcp 192.168.25.177:6443: connect: connection refused" Dec 03 06:49:09 crc kubenswrapper[4475]: I1203 06:49:09.449277 4475 status_manager.go:851] "Failed to get status for pod" podUID="333f027c-627f-4107-93a2-f522a583a5ed" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 192.168.25.177:6443: connect: connection refused" Dec 03 06:49:09 crc kubenswrapper[4475]: I1203 06:49:09.449447 4475 status_manager.go:851] "Failed to get status for pod" podUID="bfbe13ae-894d-43d5-b757-f4e8f61ab84f" pod="openshift-marketplace/redhat-marketplace-8kjrv" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-8kjrv\": dial tcp 192.168.25.177:6443: connect: connection refused" Dec 03 06:49:09 crc kubenswrapper[4475]: I1203 06:49:09.449610 4475 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 192.168.25.177:6443: connect: connection refused" Dec 03 06:49:09 crc kubenswrapper[4475]: E1203 06:49:09.566768 4475 controller.go:145] "Failed to ensure 
lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 192.168.25.177:6443: connect: connection refused" interval="400ms" Dec 03 06:49:09 crc kubenswrapper[4475]: I1203 06:49:09.615373 4475 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-s8gdr" Dec 03 06:49:09 crc kubenswrapper[4475]: I1203 06:49:09.615408 4475 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-s8gdr" Dec 03 06:49:09 crc kubenswrapper[4475]: I1203 06:49:09.640792 4475 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-s8gdr" Dec 03 06:49:09 crc kubenswrapper[4475]: I1203 06:49:09.641069 4475 status_manager.go:851] "Failed to get status for pod" podUID="71b72b6a-e6f1-4e6d-bf25-0e8427834dd8" pod="openshift-marketplace/redhat-operators-s8xvs" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-s8xvs\": dial tcp 192.168.25.177:6443: connect: connection refused" Dec 03 06:49:09 crc kubenswrapper[4475]: I1203 06:49:09.642091 4475 status_manager.go:851] "Failed to get status for pod" podUID="333f027c-627f-4107-93a2-f522a583a5ed" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 192.168.25.177:6443: connect: connection refused" Dec 03 06:49:09 crc kubenswrapper[4475]: I1203 06:49:09.642382 4475 status_manager.go:851] "Failed to get status for pod" podUID="bfbe13ae-894d-43d5-b757-f4e8f61ab84f" pod="openshift-marketplace/redhat-marketplace-8kjrv" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-8kjrv\": dial tcp 192.168.25.177:6443: connect: connection refused" Dec 03 06:49:09 crc 
kubenswrapper[4475]: I1203 06:49:09.643180 4475 status_manager.go:851] "Failed to get status for pod" podUID="495a50fc-19f4-49e9-a195-196e75ebf30f" pod="openshift-marketplace/community-operators-s8gdr" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-s8gdr\": dial tcp 192.168.25.177:6443: connect: connection refused" Dec 03 06:49:09 crc kubenswrapper[4475]: I1203 06:49:09.643504 4475 status_manager.go:851] "Failed to get status for pod" podUID="883b2dc3-6da2-4b66-b7e7-d8f7da0f6057" pod="openshift-marketplace/certified-operators-gdfwt" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-gdfwt\": dial tcp 192.168.25.177:6443: connect: connection refused" Dec 03 06:49:09 crc kubenswrapper[4475]: E1203 06:49:09.967682 4475 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 192.168.25.177:6443: connect: connection refused" interval="800ms" Dec 03 06:49:10 crc kubenswrapper[4475]: I1203 06:49:10.158220 4475 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-s8gdr" Dec 03 06:49:10 crc kubenswrapper[4475]: I1203 06:49:10.158568 4475 status_manager.go:851] "Failed to get status for pod" podUID="71b72b6a-e6f1-4e6d-bf25-0e8427834dd8" pod="openshift-marketplace/redhat-operators-s8xvs" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-s8xvs\": dial tcp 192.168.25.177:6443: connect: connection refused" Dec 03 06:49:10 crc kubenswrapper[4475]: I1203 06:49:10.158738 4475 status_manager.go:851] "Failed to get status for pod" podUID="333f027c-627f-4107-93a2-f522a583a5ed" pod="openshift-kube-apiserver/installer-9-crc" err="Get 
\"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 192.168.25.177:6443: connect: connection refused" Dec 03 06:49:10 crc kubenswrapper[4475]: I1203 06:49:10.158907 4475 status_manager.go:851] "Failed to get status for pod" podUID="bfbe13ae-894d-43d5-b757-f4e8f61ab84f" pod="openshift-marketplace/redhat-marketplace-8kjrv" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-8kjrv\": dial tcp 192.168.25.177:6443: connect: connection refused" Dec 03 06:49:10 crc kubenswrapper[4475]: I1203 06:49:10.159065 4475 status_manager.go:851] "Failed to get status for pod" podUID="495a50fc-19f4-49e9-a195-196e75ebf30f" pod="openshift-marketplace/community-operators-s8gdr" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-s8gdr\": dial tcp 192.168.25.177:6443: connect: connection refused" Dec 03 06:49:10 crc kubenswrapper[4475]: I1203 06:49:10.159219 4475 status_manager.go:851] "Failed to get status for pod" podUID="883b2dc3-6da2-4b66-b7e7-d8f7da0f6057" pod="openshift-marketplace/certified-operators-gdfwt" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-gdfwt\": dial tcp 192.168.25.177:6443: connect: connection refused" Dec 03 06:49:10 crc kubenswrapper[4475]: I1203 06:49:10.159636 4475 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-gdfwt" Dec 03 06:49:10 crc kubenswrapper[4475]: I1203 06:49:10.159911 4475 status_manager.go:851] "Failed to get status for pod" podUID="71b72b6a-e6f1-4e6d-bf25-0e8427834dd8" pod="openshift-marketplace/redhat-operators-s8xvs" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-s8xvs\": dial tcp 192.168.25.177:6443: connect: connection refused" Dec 03 06:49:10 crc kubenswrapper[4475]: I1203 
06:49:10.160068 4475 status_manager.go:851] "Failed to get status for pod" podUID="333f027c-627f-4107-93a2-f522a583a5ed" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 192.168.25.177:6443: connect: connection refused" Dec 03 06:49:10 crc kubenswrapper[4475]: I1203 06:49:10.160230 4475 status_manager.go:851] "Failed to get status for pod" podUID="bfbe13ae-894d-43d5-b757-f4e8f61ab84f" pod="openshift-marketplace/redhat-marketplace-8kjrv" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-8kjrv\": dial tcp 192.168.25.177:6443: connect: connection refused" Dec 03 06:49:10 crc kubenswrapper[4475]: I1203 06:49:10.160399 4475 status_manager.go:851] "Failed to get status for pod" podUID="495a50fc-19f4-49e9-a195-196e75ebf30f" pod="openshift-marketplace/community-operators-s8gdr" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-s8gdr\": dial tcp 192.168.25.177:6443: connect: connection refused" Dec 03 06:49:10 crc kubenswrapper[4475]: I1203 06:49:10.160588 4475 status_manager.go:851] "Failed to get status for pod" podUID="883b2dc3-6da2-4b66-b7e7-d8f7da0f6057" pod="openshift-marketplace/certified-operators-gdfwt" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-gdfwt\": dial tcp 192.168.25.177:6443: connect: connection refused" Dec 03 06:49:10 crc kubenswrapper[4475]: E1203 06:49:10.768614 4475 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 192.168.25.177:6443: connect: connection refused" interval="1.6s" Dec 03 06:49:12 crc kubenswrapper[4475]: E1203 06:49:12.369548 4475 controller.go:145] "Failed to ensure lease exists, will retry" 
err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 192.168.25.177:6443: connect: connection refused" interval="3.2s" Dec 03 06:49:14 crc kubenswrapper[4475]: I1203 06:49:14.490702 4475 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 03 06:49:14 crc kubenswrapper[4475]: I1203 06:49:14.491783 4475 status_manager.go:851] "Failed to get status for pod" podUID="71b72b6a-e6f1-4e6d-bf25-0e8427834dd8" pod="openshift-marketplace/redhat-operators-s8xvs" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-s8xvs\": dial tcp 192.168.25.177:6443: connect: connection refused" Dec 03 06:49:14 crc kubenswrapper[4475]: I1203 06:49:14.492035 4475 status_manager.go:851] "Failed to get status for pod" podUID="333f027c-627f-4107-93a2-f522a583a5ed" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 192.168.25.177:6443: connect: connection refused" Dec 03 06:49:14 crc kubenswrapper[4475]: I1203 06:49:14.492672 4475 status_manager.go:851] "Failed to get status for pod" podUID="bfbe13ae-894d-43d5-b757-f4e8f61ab84f" pod="openshift-marketplace/redhat-marketplace-8kjrv" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-8kjrv\": dial tcp 192.168.25.177:6443: connect: connection refused" Dec 03 06:49:14 crc kubenswrapper[4475]: I1203 06:49:14.492909 4475 status_manager.go:851] "Failed to get status for pod" podUID="495a50fc-19f4-49e9-a195-196e75ebf30f" pod="openshift-marketplace/community-operators-s8gdr" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-s8gdr\": dial tcp 192.168.25.177:6443: connect: connection refused" Dec 03 06:49:14 crc 
kubenswrapper[4475]: I1203 06:49:14.493206 4475 status_manager.go:851] "Failed to get status for pod" podUID="883b2dc3-6da2-4b66-b7e7-d8f7da0f6057" pod="openshift-marketplace/certified-operators-gdfwt" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-gdfwt\": dial tcp 192.168.25.177:6443: connect: connection refused" Dec 03 06:49:14 crc kubenswrapper[4475]: I1203 06:49:14.500910 4475 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="57d2f580-9528-4200-b0a4-797fed1ae972" Dec 03 06:49:14 crc kubenswrapper[4475]: I1203 06:49:14.500930 4475 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="57d2f580-9528-4200-b0a4-797fed1ae972" Dec 03 06:49:14 crc kubenswrapper[4475]: E1203 06:49:14.501175 4475 mirror_client.go:138] "Failed deleting a mirror pod" err="Delete \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 192.168.25.177:6443: connect: connection refused" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 03 06:49:14 crc kubenswrapper[4475]: I1203 06:49:14.501611 4475 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 03 06:49:15 crc kubenswrapper[4475]: I1203 06:49:15.150944 4475 generic.go:334] "Generic (PLEG): container finished" podID="71bb4a3aecc4ba5b26c4b7318770ce13" containerID="fd61a62edaa4314904f88819a345561b8d74163e987260ae76679a12c1881a43" exitCode=0 Dec 03 06:49:15 crc kubenswrapper[4475]: I1203 06:49:15.151052 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerDied","Data":"fd61a62edaa4314904f88819a345561b8d74163e987260ae76679a12c1881a43"} Dec 03 06:49:15 crc kubenswrapper[4475]: I1203 06:49:15.151136 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"87630ae66e3d0cd072e5bcb834353f807256983658f258c0a401463bb70470de"} Dec 03 06:49:15 crc kubenswrapper[4475]: I1203 06:49:15.151827 4475 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="57d2f580-9528-4200-b0a4-797fed1ae972" Dec 03 06:49:15 crc kubenswrapper[4475]: I1203 06:49:15.151849 4475 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="57d2f580-9528-4200-b0a4-797fed1ae972" Dec 03 06:49:15 crc kubenswrapper[4475]: E1203 06:49:15.152063 4475 mirror_client.go:138] "Failed deleting a mirror pod" err="Delete \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 192.168.25.177:6443: connect: connection refused" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 03 06:49:15 crc kubenswrapper[4475]: I1203 06:49:15.152084 4475 status_manager.go:851] "Failed to get status for pod" podUID="883b2dc3-6da2-4b66-b7e7-d8f7da0f6057" pod="openshift-marketplace/certified-operators-gdfwt" err="Get 
\"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-gdfwt\": dial tcp 192.168.25.177:6443: connect: connection refused" Dec 03 06:49:15 crc kubenswrapper[4475]: I1203 06:49:15.152298 4475 status_manager.go:851] "Failed to get status for pod" podUID="71b72b6a-e6f1-4e6d-bf25-0e8427834dd8" pod="openshift-marketplace/redhat-operators-s8xvs" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-s8xvs\": dial tcp 192.168.25.177:6443: connect: connection refused" Dec 03 06:49:15 crc kubenswrapper[4475]: I1203 06:49:15.152940 4475 status_manager.go:851] "Failed to get status for pod" podUID="333f027c-627f-4107-93a2-f522a583a5ed" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 192.168.25.177:6443: connect: connection refused" Dec 03 06:49:15 crc kubenswrapper[4475]: I1203 06:49:15.153113 4475 status_manager.go:851] "Failed to get status for pod" podUID="bfbe13ae-894d-43d5-b757-f4e8f61ab84f" pod="openshift-marketplace/redhat-marketplace-8kjrv" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-8kjrv\": dial tcp 192.168.25.177:6443: connect: connection refused" Dec 03 06:49:15 crc kubenswrapper[4475]: I1203 06:49:15.153280 4475 status_manager.go:851] "Failed to get status for pod" podUID="495a50fc-19f4-49e9-a195-196e75ebf30f" pod="openshift-marketplace/community-operators-s8gdr" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-s8gdr\": dial tcp 192.168.25.177:6443: connect: connection refused" Dec 03 06:49:15 crc kubenswrapper[4475]: I1203 06:49:15.946115 4475 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/kube-controller-manager namespace/openshift-kube-controller-manager: Readiness probe status=failure output="Get 
\"https://192.168.126.11:10257/healthz\": dial tcp 192.168.126.11:10257: connect: connection refused" start-of-body= Dec 03 06:49:15 crc kubenswrapper[4475]: I1203 06:49:15.946505 4475 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="kube-controller-manager" probeResult="failure" output="Get \"https://192.168.126.11:10257/healthz\": dial tcp 192.168.126.11:10257: connect: connection refused" Dec 03 06:49:16 crc kubenswrapper[4475]: I1203 06:49:16.156982 4475 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/kube-controller-manager/0.log" Dec 03 06:49:16 crc kubenswrapper[4475]: I1203 06:49:16.157021 4475 generic.go:334] "Generic (PLEG): container finished" podID="f614b9022728cf315e60c057852e563e" containerID="fe3e0d5fed18fddd7a1174f7a9f12290ce318e9a0de40fe432c79f6f2e24a608" exitCode=1 Dec 03 06:49:16 crc kubenswrapper[4475]: I1203 06:49:16.157074 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerDied","Data":"fe3e0d5fed18fddd7a1174f7a9f12290ce318e9a0de40fe432c79f6f2e24a608"} Dec 03 06:49:16 crc kubenswrapper[4475]: I1203 06:49:16.157404 4475 scope.go:117] "RemoveContainer" containerID="fe3e0d5fed18fddd7a1174f7a9f12290ce318e9a0de40fe432c79f6f2e24a608" Dec 03 06:49:16 crc kubenswrapper[4475]: I1203 06:49:16.163001 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"dae399349850791e2f5f3a1af3577edcb0e52b4a1b93b0ee78d6e87a64aae96a"} Dec 03 06:49:16 crc kubenswrapper[4475]: I1203 06:49:16.163026 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"435a9b258c86ef0e503fa1bc4986bd2056d084e6c2702a702c809e11678a7e77"} Dec 03 06:49:16 crc kubenswrapper[4475]: I1203 06:49:16.163036 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"ae3ca19ed3f39e3468f658b94463a5fcde4ade2dbb1ed367c501202a4ea8cef4"} Dec 03 06:49:16 crc kubenswrapper[4475]: I1203 06:49:16.163045 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"89f70b3434be817e2aa502cc0c40e0583ad6d5df335efbb22d41762883885968"} Dec 03 06:49:16 crc kubenswrapper[4475]: I1203 06:49:16.163054 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"e96ffe18f79b4bca4200cc9b9fc079b381ff31c8c7b06508270f2e3e925257d0"} Dec 03 06:49:16 crc kubenswrapper[4475]: I1203 06:49:16.163214 4475 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 03 06:49:16 crc kubenswrapper[4475]: I1203 06:49:16.163234 4475 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="57d2f580-9528-4200-b0a4-797fed1ae972" Dec 03 06:49:16 crc kubenswrapper[4475]: I1203 06:49:16.163245 4475 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="57d2f580-9528-4200-b0a4-797fed1ae972" Dec 03 06:49:16 crc kubenswrapper[4475]: I1203 06:49:16.547718 4475 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 03 06:49:17 crc kubenswrapper[4475]: I1203 
06:49:17.169315 4475 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/kube-controller-manager/0.log" Dec 03 06:49:17 crc kubenswrapper[4475]: I1203 06:49:17.169542 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"1c147e67359b26b31df5358c2a5ee54ae08a84e82825fadcf21b5f0d43577c32"} Dec 03 06:49:19 crc kubenswrapper[4475]: I1203 06:49:19.502422 4475 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 03 06:49:19 crc kubenswrapper[4475]: I1203 06:49:19.502471 4475 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 03 06:49:19 crc kubenswrapper[4475]: I1203 06:49:19.506510 4475 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 03 06:49:21 crc kubenswrapper[4475]: I1203 06:49:21.441655 4475 kubelet.go:1914] "Deleted mirror pod because it is outdated" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 03 06:49:21 crc kubenswrapper[4475]: I1203 06:49:21.511957 4475 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openshift-kube-apiserver/kube-apiserver-crc" oldPodUID="71bb4a3aecc4ba5b26c4b7318770ce13" podUID="e7f6b662-e279-406b-ae4f-4478a19909c7" Dec 03 06:49:22 crc kubenswrapper[4475]: I1203 06:49:22.188630 4475 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="57d2f580-9528-4200-b0a4-797fed1ae972" Dec 03 06:49:22 crc kubenswrapper[4475]: I1203 06:49:22.188657 4475 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="57d2f580-9528-4200-b0a4-797fed1ae972" Dec 03 06:49:22 
crc kubenswrapper[4475]: I1203 06:49:22.190773 4475 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openshift-kube-apiserver/kube-apiserver-crc" oldPodUID="71bb4a3aecc4ba5b26c4b7318770ce13" podUID="e7f6b662-e279-406b-ae4f-4478a19909c7" Dec 03 06:49:22 crc kubenswrapper[4475]: I1203 06:49:22.191637 4475 status_manager.go:308] "Container readiness changed before pod has synced" pod="openshift-kube-apiserver/kube-apiserver-crc" containerID="cri-o://e96ffe18f79b4bca4200cc9b9fc079b381ff31c8c7b06508270f2e3e925257d0" Dec 03 06:49:22 crc kubenswrapper[4475]: I1203 06:49:22.191656 4475 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 03 06:49:23 crc kubenswrapper[4475]: I1203 06:49:23.191797 4475 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="57d2f580-9528-4200-b0a4-797fed1ae972" Dec 03 06:49:23 crc kubenswrapper[4475]: I1203 06:49:23.191818 4475 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="57d2f580-9528-4200-b0a4-797fed1ae972" Dec 03 06:49:23 crc kubenswrapper[4475]: I1203 06:49:23.194137 4475 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openshift-kube-apiserver/kube-apiserver-crc" oldPodUID="71bb4a3aecc4ba5b26c4b7318770ce13" podUID="e7f6b662-e279-406b-ae4f-4478a19909c7" Dec 03 06:49:25 crc kubenswrapper[4475]: I1203 06:49:25.946220 4475 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 03 06:49:26 crc kubenswrapper[4475]: I1203 06:49:26.547642 4475 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 03 06:49:26 crc kubenswrapper[4475]: I1203 06:49:26.550229 4475 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" 
pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 03 06:49:27 crc kubenswrapper[4475]: I1203 06:49:27.210649 4475 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 03 06:49:27 crc kubenswrapper[4475]: I1203 06:49:27.962414 4475 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Dec 03 06:49:28 crc kubenswrapper[4475]: I1203 06:49:28.447159 4475 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-script-lib" Dec 03 06:49:28 crc kubenswrapper[4475]: I1203 06:49:28.509746 4475 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-dockercfg-x57mr" Dec 03 06:49:28 crc kubenswrapper[4475]: I1203 06:49:28.824969 4475 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"machine-approver-config" Dec 03 06:49:29 crc kubenswrapper[4475]: I1203 06:49:29.136725 4475 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Dec 03 06:49:29 crc kubenswrapper[4475]: I1203 06:49:29.181341 4475 reflector.go:368] Caches populated for *v1.CSIDriver from k8s.io/client-go/informers/factory.go:160 Dec 03 06:49:29 crc kubenswrapper[4475]: I1203 06:49:29.416475 4475 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"openshift-service-ca.crt" Dec 03 06:49:29 crc kubenswrapper[4475]: I1203 06:49:29.549130 4475 reflector.go:368] Caches populated for *v1.Secret from object-"hostpath-provisioner"/"csi-hostpath-provisioner-sa-dockercfg-qd74k" Dec 03 06:49:29 crc kubenswrapper[4475]: I1203 06:49:29.707300 4475 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-metrics-certs-default" Dec 03 06:49:29 crc 
kubenswrapper[4475]: I1203 06:49:29.797800 4475 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca" Dec 03 06:49:30 crc kubenswrapper[4475]: I1203 06:49:30.093828 4475 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-login" Dec 03 06:49:30 crc kubenswrapper[4475]: I1203 06:49:30.306403 4475 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"config-operator-serving-cert" Dec 03 06:49:30 crc kubenswrapper[4475]: I1203 06:49:30.351469 4475 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-trusted-ca-bundle" Dec 03 06:49:30 crc kubenswrapper[4475]: I1203 06:49:30.680425 4475 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-root-ca.crt" Dec 03 06:49:31 crc kubenswrapper[4475]: I1203 06:49:31.182788 4475 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-serving-cert" Dec 03 06:49:31 crc kubenswrapper[4475]: I1203 06:49:31.481798 4475 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"etcd-serving-ca" Dec 03 06:49:31 crc kubenswrapper[4475]: I1203 06:49:31.522146 4475 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"kube-root-ca.crt" Dec 03 06:49:31 crc kubenswrapper[4475]: I1203 06:49:31.697801 4475 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"kube-root-ca.crt" Dec 03 06:49:31 crc kubenswrapper[4475]: I1203 06:49:31.737552 4475 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"machine-config-operator-images" Dec 03 06:49:31 crc kubenswrapper[4475]: I1203 06:49:31.891659 4475 reflector.go:368] Caches populated for *v1.Secret 
from object-"openshift-authentication-operator"/"serving-cert" Dec 03 06:49:31 crc kubenswrapper[4475]: I1203 06:49:31.891864 4475 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"trusted-ca-bundle" Dec 03 06:49:31 crc kubenswrapper[4475]: I1203 06:49:31.984069 4475 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"kube-storage-version-migrator-operator-dockercfg-2bh8d" Dec 03 06:49:32 crc kubenswrapper[4475]: I1203 06:49:32.031634 4475 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"cluster-version-operator-serving-cert" Dec 03 06:49:32 crc kubenswrapper[4475]: I1203 06:49:32.074542 4475 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-root-ca.crt" Dec 03 06:49:32 crc kubenswrapper[4475]: I1203 06:49:32.211168 4475 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-serving-cert" Dec 03 06:49:32 crc kubenswrapper[4475]: I1203 06:49:32.345731 4475 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-sa-dockercfg-nl2j4" Dec 03 06:49:32 crc kubenswrapper[4475]: I1203 06:49:32.553794 4475 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"kube-root-ca.crt" Dec 03 06:49:32 crc kubenswrapper[4475]: I1203 06:49:32.606817 4475 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"package-server-manager-serving-cert" Dec 03 06:49:32 crc kubenswrapper[4475]: I1203 06:49:32.807575 4475 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"env-overrides" Dec 03 06:49:32 crc kubenswrapper[4475]: I1203 06:49:32.861035 4475 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"service-ca" Dec 
03 06:49:33 crc kubenswrapper[4475]: I1203 06:49:33.088817 4475 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"node-bootstrapper-token" Dec 03 06:49:33 crc kubenswrapper[4475]: I1203 06:49:33.157538 4475 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"openshift-service-ca.crt" Dec 03 06:49:33 crc kubenswrapper[4475]: I1203 06:49:33.175230 4475 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"serving-cert" Dec 03 06:49:33 crc kubenswrapper[4475]: I1203 06:49:33.400281 4475 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"kube-root-ca.crt" Dec 03 06:49:33 crc kubenswrapper[4475]: I1203 06:49:33.480532 4475 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-control-plane-dockercfg-gs7dd" Dec 03 06:49:33 crc kubenswrapper[4475]: I1203 06:49:33.650415 4475 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-rbac-proxy" Dec 03 06:49:33 crc kubenswrapper[4475]: I1203 06:49:33.686824 4475 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator"/"kube-storage-version-migrator-sa-dockercfg-5xfcg" Dec 03 06:49:33 crc kubenswrapper[4475]: I1203 06:49:33.726892 4475 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"openshift-service-ca.crt" Dec 03 06:49:33 crc kubenswrapper[4475]: I1203 06:49:33.819423 4475 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"kube-root-ca.crt" Dec 03 06:49:33 crc kubenswrapper[4475]: I1203 06:49:33.854493 4475 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"kube-root-ca.crt" Dec 03 06:49:34 crc kubenswrapper[4475]: I1203 06:49:34.065784 4475 reflector.go:368] Caches populated for *v1.Secret 
from object-"openshift-kube-scheduler-operator"/"kube-scheduler-operator-serving-cert" Dec 03 06:49:34 crc kubenswrapper[4475]: I1203 06:49:34.074390 4475 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"openshift-service-ca.crt" Dec 03 06:49:34 crc kubenswrapper[4475]: I1203 06:49:34.666375 4475 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-dockercfg-qx5rd" Dec 03 06:49:34 crc kubenswrapper[4475]: I1203 06:49:34.669805 4475 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"kube-root-ca.crt" Dec 03 06:49:34 crc kubenswrapper[4475]: I1203 06:49:34.948974 4475 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"kube-root-ca.crt" Dec 03 06:49:35 crc kubenswrapper[4475]: I1203 06:49:35.414589 4475 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"env-overrides" Dec 03 06:49:35 crc kubenswrapper[4475]: I1203 06:49:35.565587 4475 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"openshift-service-ca.crt" Dec 03 06:49:35 crc kubenswrapper[4475]: I1203 06:49:35.799031 4475 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"cluster-image-registry-operator-dockercfg-m4qtx" Dec 03 06:49:35 crc kubenswrapper[4475]: I1203 06:49:35.810654 4475 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ancillary-tools-dockercfg-vnmsz" Dec 03 06:49:35 crc kubenswrapper[4475]: I1203 06:49:35.941811 4475 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"registry-dockercfg-kzzsd" Dec 03 06:49:35 crc kubenswrapper[4475]: I1203 06:49:35.955783 4475 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-etcd-operator"/"etcd-operator-dockercfg-r9srn" Dec 03 06:49:35 crc kubenswrapper[4475]: I1203 06:49:35.956784 4475 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mco-proxy-tls" Dec 03 06:49:35 crc kubenswrapper[4475]: I1203 06:49:35.958789 4475 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"service-ca-bundle" Dec 03 06:49:36 crc kubenswrapper[4475]: I1203 06:49:36.087046 4475 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-dockercfg-zdk86" Dec 03 06:49:36 crc kubenswrapper[4475]: I1203 06:49:36.139296 4475 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"console-operator-dockercfg-4xjcr" Dec 03 06:49:36 crc kubenswrapper[4475]: I1203 06:49:36.201286 4475 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-service-ca-bundle" Dec 03 06:49:36 crc kubenswrapper[4475]: I1203 06:49:36.204942 4475 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-serving-cert" Dec 03 06:49:36 crc kubenswrapper[4475]: I1203 06:49:36.262393 4475 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"openshift-service-ca.crt" Dec 03 06:49:36 crc kubenswrapper[4475]: I1203 06:49:36.341004 4475 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"default-cni-sysctl-allowlist" Dec 03 06:49:36 crc kubenswrapper[4475]: I1203 06:49:36.344196 4475 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-ca-bundle" Dec 03 06:49:36 crc kubenswrapper[4475]: I1203 06:49:36.385695 4475 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"oauth-openshift-dockercfg-znhcc" Dec 03 06:49:36 crc kubenswrapper[4475]: 
I1203 06:49:36.459243 4475 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-serving-cert" Dec 03 06:49:36 crc kubenswrapper[4475]: I1203 06:49:36.467038 4475 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-sa-dockercfg-d427c" Dec 03 06:49:36 crc kubenswrapper[4475]: I1203 06:49:36.550624 4475 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"openshift-service-ca.crt" Dec 03 06:49:36 crc kubenswrapper[4475]: I1203 06:49:36.736941 4475 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"trusted-ca" Dec 03 06:49:36 crc kubenswrapper[4475]: I1203 06:49:36.873778 4475 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"kube-root-ca.crt" Dec 03 06:49:36 crc kubenswrapper[4475]: I1203 06:49:36.937168 4475 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"openshift-service-ca.crt" Dec 03 06:49:37 crc kubenswrapper[4475]: I1203 06:49:37.003618 4475 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"openshift-service-ca.crt" Dec 03 06:49:37 crc kubenswrapper[4475]: I1203 06:49:37.190714 4475 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"kube-root-ca.crt" Dec 03 06:49:37 crc kubenswrapper[4475]: I1203 06:49:37.219635 4475 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-daemon-dockercfg-r5tcq" Dec 03 06:49:37 crc kubenswrapper[4475]: I1203 06:49:37.299767 4475 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-dockercfg-k9rxt" Dec 03 06:49:37 crc kubenswrapper[4475]: I1203 06:49:37.405649 4475 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-console-operator"/"console-operator-config" Dec 03 06:49:37 crc kubenswrapper[4475]: I1203 06:49:37.514990 4475 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"authentication-operator-dockercfg-mz9bj" Dec 03 06:49:37 crc kubenswrapper[4475]: I1203 06:49:37.641086 4475 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert" Dec 03 06:49:37 crc kubenswrapper[4475]: I1203 06:49:37.706432 4475 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"community-operators-dockercfg-dmngl" Dec 03 06:49:37 crc kubenswrapper[4475]: I1203 06:49:37.814504 4475 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-node-metrics-cert" Dec 03 06:49:37 crc kubenswrapper[4475]: I1203 06:49:37.904468 4475 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"metrics-tls" Dec 03 06:49:38 crc kubenswrapper[4475]: I1203 06:49:38.434907 4475 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"kube-root-ca.crt" Dec 03 06:49:38 crc kubenswrapper[4475]: I1203 06:49:38.444272 4475 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"kube-root-ca.crt" Dec 03 06:49:38 crc kubenswrapper[4475]: I1203 06:49:38.512758 4475 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"default-dockercfg-gxtc4" Dec 03 06:49:38 crc kubenswrapper[4475]: I1203 06:49:38.513207 4475 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"serving-cert" Dec 03 06:49:38 crc kubenswrapper[4475]: I1203 06:49:38.552611 4475 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"trusted-ca-bundle" Dec 03 06:49:38 crc kubenswrapper[4475]: I1203 06:49:38.605046 4475 reflector.go:368] Caches populated for 
*v1.Secret from object-"openshift-marketplace"/"marketplace-operator-metrics" Dec 03 06:49:38 crc kubenswrapper[4475]: I1203 06:49:38.623444 4475 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"catalog-operator-serving-cert" Dec 03 06:49:38 crc kubenswrapper[4475]: I1203 06:49:38.648466 4475 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"oauth-serving-cert" Dec 03 06:49:38 crc kubenswrapper[4475]: I1203 06:49:38.686870 4475 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"image-import-ca" Dec 03 06:49:38 crc kubenswrapper[4475]: I1203 06:49:38.739024 4475 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-config" Dec 03 06:49:38 crc kubenswrapper[4475]: I1203 06:49:38.774506 4475 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-tls" Dec 03 06:49:38 crc kubenswrapper[4475]: I1203 06:49:38.779163 4475 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"kube-root-ca.crt" Dec 03 06:49:38 crc kubenswrapper[4475]: I1203 06:49:38.781938 4475 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"kube-root-ca.crt" Dec 03 06:49:38 crc kubenswrapper[4475]: I1203 06:49:38.788577 4475 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"openshift-service-ca.crt" Dec 03 06:49:38 crc kubenswrapper[4475]: I1203 06:49:38.857346 4475 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-dockercfg-vw8fw" Dec 03 06:49:38 crc kubenswrapper[4475]: I1203 06:49:38.907813 4475 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-root-ca.crt" Dec 03 06:49:38 crc 
kubenswrapper[4475]: I1203 06:49:38.968768 4475 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"canary-serving-cert" Dec 03 06:49:38 crc kubenswrapper[4475]: I1203 06:49:38.974162 4475 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-oauth-config" Dec 03 06:49:39 crc kubenswrapper[4475]: I1203 06:49:39.002285 4475 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-operator-tls" Dec 03 06:49:39 crc kubenswrapper[4475]: I1203 06:49:39.005498 4475 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-dockercfg-mfbb7" Dec 03 06:49:39 crc kubenswrapper[4475]: I1203 06:49:39.129473 4475 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Dec 03 06:49:39 crc kubenswrapper[4475]: I1203 06:49:39.135384 4475 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Dec 03 06:49:39 crc kubenswrapper[4475]: I1203 06:49:39.142339 4475 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-console"/"networking-console-plugin" Dec 03 06:49:39 crc kubenswrapper[4475]: I1203 06:49:39.177778 4475 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-dockercfg-5nsgg" Dec 03 06:49:39 crc kubenswrapper[4475]: I1203 06:49:39.375330 4475 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"ovnkube-identity-cm" Dec 03 06:49:39 crc kubenswrapper[4475]: I1203 06:49:39.413059 4475 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mcc-proxy-tls" Dec 03 06:49:39 crc kubenswrapper[4475]: I1203 06:49:39.415014 4475 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-console-operator"/"trusted-ca" Dec 03 06:49:39 crc kubenswrapper[4475]: I1203 06:49:39.639845 4475 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-router-certs" Dec 03 06:49:39 crc kubenswrapper[4475]: I1203 06:49:39.737755 4475 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"openshift-service-ca.crt" Dec 03 06:49:39 crc kubenswrapper[4475]: I1203 06:49:39.891030 4475 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"service-ca-operator-config" Dec 03 06:49:39 crc kubenswrapper[4475]: I1203 06:49:39.926235 4475 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"cluster-samples-operator-dockercfg-xpp9w" Dec 03 06:49:39 crc kubenswrapper[4475]: I1203 06:49:39.946546 4475 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"config" Dec 03 06:49:40 crc kubenswrapper[4475]: I1203 06:49:40.020267 4475 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"samples-operator-tls" Dec 03 06:49:40 crc kubenswrapper[4475]: I1203 06:49:40.037931 4475 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"machine-api-operator-images" Dec 03 06:49:40 crc kubenswrapper[4475]: I1203 06:49:40.114441 4475 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-cliconfig" Dec 03 06:49:40 crc kubenswrapper[4475]: I1203 06:49:40.146873 4475 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Dec 03 06:49:40 crc kubenswrapper[4475]: I1203 06:49:40.291410 4475 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"openshift-service-ca.crt" Dec 03 06:49:40 crc 
kubenswrapper[4475]: I1203 06:49:40.318398 4475 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"openshift-service-ca.crt" Dec 03 06:49:40 crc kubenswrapper[4475]: I1203 06:49:40.345162 4475 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-config" Dec 03 06:49:40 crc kubenswrapper[4475]: I1203 06:49:40.362462 4475 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2" Dec 03 06:49:40 crc kubenswrapper[4475]: I1203 06:49:40.389079 4475 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"openshift-service-ca.crt" Dec 03 06:49:40 crc kubenswrapper[4475]: I1203 06:49:40.414334 4475 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-operator-dockercfg-98p87" Dec 03 06:49:40 crc kubenswrapper[4475]: I1203 06:49:40.420172 4475 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"openshift-service-ca.crt" Dec 03 06:49:40 crc kubenswrapper[4475]: I1203 06:49:40.479484 4475 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"openshift-service-ca.crt" Dec 03 06:49:40 crc kubenswrapper[4475]: I1203 06:49:40.601009 4475 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-rbac-proxy" Dec 03 06:49:40 crc kubenswrapper[4475]: I1203 06:49:40.614904 4475 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-service-ca" Dec 03 06:49:40 crc kubenswrapper[4475]: I1203 06:49:40.623007 4475 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"kube-root-ca.crt" Dec 03 06:49:40 crc kubenswrapper[4475]: I1203 06:49:40.670707 4475 reflector.go:368] Caches populated for 
*v1.ConfigMap from object-"openshift-network-operator"/"iptables-alerter-script" Dec 03 06:49:40 crc kubenswrapper[4475]: I1203 06:49:40.705369 4475 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"audit-1" Dec 03 06:49:40 crc kubenswrapper[4475]: I1203 06:49:40.716555 4475 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"kube-root-ca.crt" Dec 03 06:49:41 crc kubenswrapper[4475]: I1203 06:49:41.125854 4475 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"kube-root-ca.crt" Dec 03 06:49:41 crc kubenswrapper[4475]: I1203 06:49:41.154704 4475 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Dec 03 06:49:41 crc kubenswrapper[4475]: I1203 06:49:41.212440 4475 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"encryption-config-1" Dec 03 06:49:41 crc kubenswrapper[4475]: I1203 06:49:41.250148 4475 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"kube-root-ca.crt" Dec 03 06:49:41 crc kubenswrapper[4475]: I1203 06:49:41.506626 4475 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-error" Dec 03 06:49:41 crc kubenswrapper[4475]: I1203 06:49:41.566819 4475 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"ingress-operator-dockercfg-7lnqk" Dec 03 06:49:41 crc kubenswrapper[4475]: I1203 06:49:41.585054 4475 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"node-ca-dockercfg-4777p" Dec 03 06:49:41 crc kubenswrapper[4475]: I1203 06:49:41.620933 4475 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"trusted-ca-bundle" Dec 03 06:49:41 crc kubenswrapper[4475]: I1203 06:49:41.686376 4475 reflector.go:368] Caches 
populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"openshift-service-ca.crt" Dec 03 06:49:41 crc kubenswrapper[4475]: I1203 06:49:41.726658 4475 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-config" Dec 03 06:49:41 crc kubenswrapper[4475]: I1203 06:49:41.779911 4475 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"kube-root-ca.crt" Dec 03 06:49:41 crc kubenswrapper[4475]: I1203 06:49:41.839250 4475 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"openshift-service-ca.crt" Dec 03 06:49:41 crc kubenswrapper[4475]: I1203 06:49:41.879003 4475 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"kube-root-ca.crt" Dec 03 06:49:41 crc kubenswrapper[4475]: I1203 06:49:41.885571 4475 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"node-resolver-dockercfg-kz9s7" Dec 03 06:49:41 crc kubenswrapper[4475]: I1203 06:49:41.935599 4475 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-stats-default" Dec 03 06:49:42 crc kubenswrapper[4475]: I1203 06:49:42.018028 4475 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-operators-dockercfg-ct8rh" Dec 03 06:49:42 crc kubenswrapper[4475]: I1203 06:49:42.031603 4475 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"openshift-apiserver-sa-dockercfg-djjff" Dec 03 06:49:42 crc kubenswrapper[4475]: I1203 06:49:42.067627 4475 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"serving-cert" Dec 03 06:49:42 crc kubenswrapper[4475]: I1203 06:49:42.205030 4475 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"encryption-config-1" Dec 03 06:49:42 crc kubenswrapper[4475]: 
I1203 06:49:42.218819 4475 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-client" Dec 03 06:49:42 crc kubenswrapper[4475]: I1203 06:49:42.315507 4475 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-config" Dec 03 06:49:42 crc kubenswrapper[4475]: I1203 06:49:42.380975 4475 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-root-ca.crt" Dec 03 06:49:42 crc kubenswrapper[4475]: I1203 06:49:42.401054 4475 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"openshift-service-ca.crt" Dec 03 06:49:42 crc kubenswrapper[4475]: I1203 06:49:42.449835 4475 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"packageserver-service-cert" Dec 03 06:49:42 crc kubenswrapper[4475]: I1203 06:49:42.477821 4475 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-node-identity"/"network-node-identity-cert" Dec 03 06:49:42 crc kubenswrapper[4475]: I1203 06:49:42.657949 4475 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ac-dockercfg-9lkdf" Dec 03 06:49:42 crc kubenswrapper[4475]: I1203 06:49:42.666530 4475 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"audit-1" Dec 03 06:49:42 crc kubenswrapper[4475]: I1203 06:49:42.682997 4475 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"openshift-service-ca.crt" Dec 03 06:49:42 crc kubenswrapper[4475]: I1203 06:49:42.685334 4475 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-provider-selection" Dec 03 06:49:42 crc kubenswrapper[4475]: I1203 06:49:42.706177 4475 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-cluster-samples-operator"/"openshift-service-ca.crt" Dec 03 06:49:42 crc kubenswrapper[4475]: I1203 06:49:42.841339 4475 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"dns-default" Dec 03 06:49:42 crc kubenswrapper[4475]: I1203 06:49:42.863065 4475 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"openshift-service-ca.crt" Dec 03 06:49:42 crc kubenswrapper[4475]: I1203 06:49:42.940448 4475 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-marketplace-dockercfg-x2ctb" Dec 03 06:49:43 crc kubenswrapper[4475]: I1203 06:49:43.153899 4475 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-session" Dec 03 06:49:43 crc kubenswrapper[4475]: I1203 06:49:43.286847 4475 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"oauth-apiserver-sa-dockercfg-6r2bq" Dec 03 06:49:43 crc kubenswrapper[4475]: I1203 06:49:43.289951 4475 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-default-metrics-tls" Dec 03 06:49:43 crc kubenswrapper[4475]: I1203 06:49:43.328898 4475 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config" Dec 03 06:49:43 crc kubenswrapper[4475]: I1203 06:49:43.364471 4475 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"signing-cabundle" Dec 03 06:49:43 crc kubenswrapper[4475]: I1203 06:49:43.577846 4475 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-operator-config" Dec 03 06:49:43 crc kubenswrapper[4475]: I1203 06:49:43.701685 4475 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c" Dec 03 06:49:43 crc kubenswrapper[4475]: I1203 06:49:43.899143 4475 reflector.go:368] Caches 
populated for *v1.Secret from object-"openshift-service-ca"/"service-ca-dockercfg-pn86c" Dec 03 06:49:43 crc kubenswrapper[4475]: I1203 06:49:43.910921 4475 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"authentication-operator-config" Dec 03 06:49:43 crc kubenswrapper[4475]: I1203 06:49:43.916539 4475 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-dockercfg-qt55r" Dec 03 06:49:43 crc kubenswrapper[4475]: I1203 06:49:43.960516 4475 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-serving-cert" Dec 03 06:49:44 crc kubenswrapper[4475]: I1203 06:49:44.068856 4475 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"openshift-service-ca.crt" Dec 03 06:49:44 crc kubenswrapper[4475]: I1203 06:49:44.093097 4475 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-dockercfg-f62pw" Dec 03 06:49:44 crc kubenswrapper[4475]: I1203 06:49:44.135063 4475 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"default-dockercfg-2q5b6" Dec 03 06:49:44 crc kubenswrapper[4475]: I1203 06:49:44.139205 4475 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"kube-root-ca.crt" Dec 03 06:49:44 crc kubenswrapper[4475]: I1203 06:49:44.143921 4475 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-config" Dec 03 06:49:44 crc kubenswrapper[4475]: I1203 06:49:44.265719 4475 reflector.go:368] Caches populated for *v1.Pod from pkg/kubelet/config/apiserver.go:66 Dec 03 06:49:44 crc kubenswrapper[4475]: I1203 06:49:44.269159 4475 kubelet.go:2431] "SyncLoop REMOVE" source="api" 
pods=["openshift-kube-apiserver/kube-apiserver-crc"] Dec 03 06:49:44 crc kubenswrapper[4475]: I1203 06:49:44.269198 4475 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/kube-apiserver-crc"] Dec 03 06:49:44 crc kubenswrapper[4475]: I1203 06:49:44.269423 4475 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="57d2f580-9528-4200-b0a4-797fed1ae972" Dec 03 06:49:44 crc kubenswrapper[4475]: I1203 06:49:44.269439 4475 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="57d2f580-9528-4200-b0a4-797fed1ae972" Dec 03 06:49:44 crc kubenswrapper[4475]: I1203 06:49:44.272416 4475 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 03 06:49:44 crc kubenswrapper[4475]: I1203 06:49:44.281054 4475 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/kube-apiserver-crc" podStartSLOduration=23.281045092 podStartE2EDuration="23.281045092s" podCreationTimestamp="2025-12-03 06:49:21 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 06:49:44.279270793 +0000 UTC m=+269.084169127" watchObservedRunningTime="2025-12-03 06:49:44.281045092 +0000 UTC m=+269.085943426" Dec 03 06:49:44 crc kubenswrapper[4475]: I1203 06:49:44.337770 4475 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"default-dockercfg-2llfx" Dec 03 06:49:44 crc kubenswrapper[4475]: I1203 06:49:44.338213 4475 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"config" Dec 03 06:49:44 crc kubenswrapper[4475]: I1203 06:49:44.413125 4475 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"installation-pull-secrets" Dec 03 06:49:44 crc kubenswrapper[4475]: I1203 06:49:44.505334 4475 
reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"openshift-service-ca.crt" Dec 03 06:49:44 crc kubenswrapper[4475]: I1203 06:49:44.508660 4475 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"openshift-config-operator-dockercfg-7pc5z" Dec 03 06:49:44 crc kubenswrapper[4475]: I1203 06:49:44.766605 4475 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serviceaccount-dockercfg-rq7zk" Dec 03 06:49:44 crc kubenswrapper[4475]: I1203 06:49:44.811105 4475 reflector.go:368] Caches populated for *v1.Node from k8s.io/client-go/informers/factory.go:160 Dec 03 06:49:44 crc kubenswrapper[4475]: I1203 06:49:44.972056 4475 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"serving-cert" Dec 03 06:49:44 crc kubenswrapper[4475]: I1203 06:49:44.986435 4475 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-node-dockercfg-pwtwl" Dec 03 06:49:45 crc kubenswrapper[4475]: I1203 06:49:45.005264 4475 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"trusted-ca-bundle" Dec 03 06:49:45 crc kubenswrapper[4475]: I1203 06:49:45.005851 4475 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Dec 03 06:49:45 crc kubenswrapper[4475]: I1203 06:49:45.008154 4475 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"trusted-ca" Dec 03 06:49:45 crc kubenswrapper[4475]: I1203 06:49:45.044147 4475 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"openshift-service-ca.crt" Dec 03 06:49:45 crc kubenswrapper[4475]: I1203 06:49:45.199938 4475 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"etcd-client" Dec 
03 06:49:45 crc kubenswrapper[4475]: I1203 06:49:45.311826 4475 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"marketplace-trusted-ca" Dec 03 06:49:45 crc kubenswrapper[4475]: I1203 06:49:45.325736 4475 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"serving-cert" Dec 03 06:49:45 crc kubenswrapper[4475]: I1203 06:49:45.360219 4475 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"kube-root-ca.crt" Dec 03 06:49:45 crc kubenswrapper[4475]: I1203 06:49:45.372699 4475 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-dockercfg-xtcjv" Dec 03 06:49:45 crc kubenswrapper[4475]: I1203 06:49:45.545137 4475 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-tls" Dec 03 06:49:45 crc kubenswrapper[4475]: I1203 06:49:45.623341 4475 reflector.go:368] Caches populated for *v1.Service from k8s.io/client-go/informers/factory.go:160 Dec 03 06:49:45 crc kubenswrapper[4475]: I1203 06:49:45.859097 4475 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"service-ca-bundle" Dec 03 06:49:45 crc kubenswrapper[4475]: I1203 06:49:45.897136 4475 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"openshift-service-ca.crt" Dec 03 06:49:45 crc kubenswrapper[4475]: I1203 06:49:45.913437 4475 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-tls" Dec 03 06:49:46 crc kubenswrapper[4475]: I1203 06:49:46.027678 4475 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"openshift-service-ca.crt" Dec 03 06:49:46 crc kubenswrapper[4475]: I1203 06:49:46.036748 4475 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"kube-root-ca.crt" 
Dec 03 06:49:46 crc kubenswrapper[4475]: I1203 06:49:46.102245 4475 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"pprof-cert" Dec 03 06:49:46 crc kubenswrapper[4475]: I1203 06:49:46.213489 4475 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"dns-operator-dockercfg-9mqw5" Dec 03 06:49:46 crc kubenswrapper[4475]: I1203 06:49:46.497096 4475 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"proxy-tls" Dec 03 06:49:46 crc kubenswrapper[4475]: I1203 06:49:46.584952 4475 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-control-plane-metrics-cert" Dec 03 06:49:46 crc kubenswrapper[4475]: I1203 06:49:46.592938 4475 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"signing-key" Dec 03 06:49:46 crc kubenswrapper[4475]: I1203 06:49:46.608489 4475 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"openshift-service-ca.crt" Dec 03 06:49:46 crc kubenswrapper[4475]: I1203 06:49:46.733340 4475 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serving-cert" Dec 03 06:49:46 crc kubenswrapper[4475]: I1203 06:49:46.839294 4475 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Dec 03 06:49:46 crc kubenswrapper[4475]: I1203 06:49:46.912940 4475 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"kube-root-ca.crt" Dec 03 06:49:46 crc kubenswrapper[4475]: I1203 06:49:46.916668 4475 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-root-ca.crt" Dec 03 06:49:47 crc kubenswrapper[4475]: I1203 06:49:47.093466 4475 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-dockercfg-jwfmh" Dec 03 
06:49:47 crc kubenswrapper[4475]: I1203 06:49:47.182964 4475 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"audit" Dec 03 06:49:47 crc kubenswrapper[4475]: I1203 06:49:47.199016 4475 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"etcd-client" Dec 03 06:49:47 crc kubenswrapper[4475]: I1203 06:49:47.445187 4475 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"kube-root-ca.crt" Dec 03 06:49:47 crc kubenswrapper[4475]: I1203 06:49:47.468994 4475 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-service-ca.crt" Dec 03 06:49:47 crc kubenswrapper[4475]: I1203 06:49:47.491875 4475 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"openshift-service-ca.crt" Dec 03 06:49:47 crc kubenswrapper[4475]: I1203 06:49:47.492077 4475 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"certified-operators-dockercfg-4rs5g" Dec 03 06:49:47 crc kubenswrapper[4475]: I1203 06:49:47.524841 4475 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"console-config" Dec 03 06:49:47 crc kubenswrapper[4475]: I1203 06:49:47.543748 4475 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"etcd-serving-ca" Dec 03 06:49:47 crc kubenswrapper[4475]: I1203 06:49:47.693303 4475 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"openshift-service-ca.crt" Dec 03 06:49:47 crc kubenswrapper[4475]: I1203 06:49:47.725406 4475 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"kube-root-ca.crt" Dec 03 06:49:47 crc kubenswrapper[4475]: I1203 06:49:47.880231 4475 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"kube-root-ca.crt" Dec 03 
06:49:48 crc kubenswrapper[4475]: I1203 06:49:48.021331 4475 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-tls" Dec 03 06:49:48 crc kubenswrapper[4475]: I1203 06:49:48.021969 4475 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"image-registry-certificates" Dec 03 06:49:48 crc kubenswrapper[4475]: I1203 06:49:48.093855 4475 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"kube-root-ca.crt" Dec 03 06:49:48 crc kubenswrapper[4475]: I1203 06:49:48.271388 4475 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-tls" Dec 03 06:49:48 crc kubenswrapper[4475]: I1203 06:49:48.300937 4475 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-operator"/"metrics-tls" Dec 03 06:49:48 crc kubenswrapper[4475]: I1203 06:49:48.385803 4475 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"metrics-tls" Dec 03 06:49:48 crc kubenswrapper[4475]: I1203 06:49:48.400687 4475 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"openshift-service-ca.crt" Dec 03 06:49:48 crc kubenswrapper[4475]: I1203 06:49:48.464640 4475 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-admission-controller-secret" Dec 03 06:49:48 crc kubenswrapper[4475]: I1203 06:49:48.496630 4475 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-certs-default" Dec 03 06:49:48 crc kubenswrapper[4475]: I1203 06:49:48.499283 4475 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"kube-root-ca.crt" Dec 03 06:49:48 crc kubenswrapper[4475]: I1203 06:49:48.734173 4475 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-authentication"/"v4-0-config-system-ocp-branding-template" Dec 03 06:49:48 crc kubenswrapper[4475]: I1203 06:49:48.939752 4475 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-controller-dockercfg-c2lfx" Dec 03 06:49:48 crc kubenswrapper[4475]: I1203 06:49:48.989271 4475 reflector.go:368] Caches populated for *v1.RuntimeClass from k8s.io/client-go/informers/factory.go:160 Dec 03 06:49:49 crc kubenswrapper[4475]: I1203 06:49:49.147786 4475 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"kube-root-ca.crt" Dec 03 06:49:49 crc kubenswrapper[4475]: I1203 06:49:49.183019 4475 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-service-ca.crt" Dec 03 06:49:49 crc kubenswrapper[4475]: I1203 06:49:49.196515 4475 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"service-ca-operator-dockercfg-rg9jl" Dec 03 06:49:49 crc kubenswrapper[4475]: I1203 06:49:49.227797 4475 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"cni-copy-resources" Dec 03 06:49:49 crc kubenswrapper[4475]: I1203 06:49:49.816858 4475 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-serving-cert" Dec 03 06:49:49 crc kubenswrapper[4475]: I1203 06:49:49.868153 4475 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-serving-cert" Dec 03 06:49:49 crc kubenswrapper[4475]: I1203 06:49:49.891098 4475 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-config" Dec 03 06:49:50 crc kubenswrapper[4475]: I1203 06:49:50.031084 4475 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"default-dockercfg-chnjx" Dec 03 06:49:50 crc kubenswrapper[4475]: 
I1203 06:49:50.048800 4475 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-secret" Dec 03 06:49:50 crc kubenswrapper[4475]: I1203 06:49:50.219930 4475 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-console"/"networking-console-plugin-cert" Dec 03 06:49:51 crc kubenswrapper[4475]: I1203 06:49:51.181179 4475 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"kube-root-ca.crt" Dec 03 06:49:51 crc kubenswrapper[4475]: I1203 06:49:51.732353 4475 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-idp-0-file-data" Dec 03 06:49:51 crc kubenswrapper[4475]: I1203 06:49:51.782767 4475 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-dockercfg-gkqpw" Dec 03 06:49:51 crc kubenswrapper[4475]: I1203 06:49:51.830106 4475 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-rbac-proxy" Dec 03 06:49:52 crc kubenswrapper[4475]: I1203 06:49:52.207768 4475 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"multus-daemon-config" Dec 03 06:49:54 crc kubenswrapper[4475]: I1203 06:49:54.181583 4475 kubelet.go:2431] "SyncLoop REMOVE" source="file" pods=["openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"] Dec 03 06:49:54 crc kubenswrapper[4475]: I1203 06:49:54.181934 4475 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" containerName="startup-monitor" containerID="cri-o://25b786b283d45273e7bef30e6323acbe75160cb93bdc1a7be492799d461604cf" gracePeriod=5 Dec 03 06:49:59 crc kubenswrapper[4475]: I1203 06:49:59.326550 4475 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-startup-monitor-crc_f85e55b1a89d02b0cb034b1ea31ed45a/startup-monitor/0.log" Dec 03 06:49:59 crc kubenswrapper[4475]: I1203 06:49:59.327204 4475 generic.go:334] "Generic (PLEG): container finished" podID="f85e55b1a89d02b0cb034b1ea31ed45a" containerID="25b786b283d45273e7bef30e6323acbe75160cb93bdc1a7be492799d461604cf" exitCode=137 Dec 03 06:49:59 crc kubenswrapper[4475]: I1203 06:49:59.726580 4475 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-startup-monitor-crc_f85e55b1a89d02b0cb034b1ea31ed45a/startup-monitor/0.log" Dec 03 06:49:59 crc kubenswrapper[4475]: I1203 06:49:59.726652 4475 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Dec 03 06:49:59 crc kubenswrapper[4475]: I1203 06:49:59.859336 4475 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " Dec 03 06:49:59 crc kubenswrapper[4475]: I1203 06:49:59.859378 4475 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " Dec 03 06:49:59 crc kubenswrapper[4475]: I1203 06:49:59.859406 4475 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " Dec 03 06:49:59 crc kubenswrapper[4475]: I1203 06:49:59.859420 4475 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " Dec 03 06:49:59 crc kubenswrapper[4475]: I1203 06:49:59.859443 4475 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " Dec 03 06:49:59 crc kubenswrapper[4475]: I1203 06:49:59.859493 4475 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log" (OuterVolumeSpecName: "var-log") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "var-log". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 03 06:49:59 crc kubenswrapper[4475]: I1203 06:49:59.859495 4475 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir" (OuterVolumeSpecName: "resource-dir") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "resource-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 03 06:49:59 crc kubenswrapper[4475]: I1203 06:49:59.859537 4475 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock" (OuterVolumeSpecName: "var-lock") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "var-lock". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 03 06:49:59 crc kubenswrapper[4475]: I1203 06:49:59.859605 4475 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests" (OuterVolumeSpecName: "manifests") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "manifests". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 03 06:49:59 crc kubenswrapper[4475]: I1203 06:49:59.860244 4475 reconciler_common.go:293] "Volume detached for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") on node \"crc\" DevicePath \"\"" Dec 03 06:49:59 crc kubenswrapper[4475]: I1203 06:49:59.860264 4475 reconciler_common.go:293] "Volume detached for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") on node \"crc\" DevicePath \"\"" Dec 03 06:49:59 crc kubenswrapper[4475]: I1203 06:49:59.860272 4475 reconciler_common.go:293] "Volume detached for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") on node \"crc\" DevicePath \"\"" Dec 03 06:49:59 crc kubenswrapper[4475]: I1203 06:49:59.860281 4475 reconciler_common.go:293] "Volume detached for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") on node \"crc\" DevicePath \"\"" Dec 03 06:49:59 crc kubenswrapper[4475]: I1203 06:49:59.864819 4475 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir" (OuterVolumeSpecName: "pod-resource-dir") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "pod-resource-dir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 03 06:49:59 crc kubenswrapper[4475]: I1203 06:49:59.960635 4475 reconciler_common.go:293] "Volume detached for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") on node \"crc\" DevicePath \"\"" Dec 03 06:50:00 crc kubenswrapper[4475]: I1203 06:50:00.331162 4475 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-startup-monitor-crc_f85e55b1a89d02b0cb034b1ea31ed45a/startup-monitor/0.log" Dec 03 06:50:00 crc kubenswrapper[4475]: I1203 06:50:00.331219 4475 scope.go:117] "RemoveContainer" containerID="25b786b283d45273e7bef30e6323acbe75160cb93bdc1a7be492799d461604cf" Dec 03 06:50:00 crc kubenswrapper[4475]: I1203 06:50:00.331261 4475 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Dec 03 06:50:01 crc kubenswrapper[4475]: I1203 06:50:01.495628 4475 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" path="/var/lib/kubelet/pods/f85e55b1a89d02b0cb034b1ea31ed45a/volumes" Dec 03 06:50:07 crc kubenswrapper[4475]: I1203 06:50:07.328731 4475 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-7kcnv"] Dec 03 06:50:07 crc kubenswrapper[4475]: I1203 06:50:07.329286 4475 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-controller-manager/controller-manager-879f6c89f-7kcnv" podUID="3263d9b9-b7e8-4758-a6a0-85749e84317a" containerName="controller-manager" containerID="cri-o://8cb7909070a58477b0f2d4a663a9e063f9a3f518c1e913b9d922836d450c50f7" gracePeriod=30 Dec 03 06:50:07 crc kubenswrapper[4475]: I1203 06:50:07.341011 4475 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-dlggp"] Dec 03 06:50:07 
crc kubenswrapper[4475]: I1203 06:50:07.341396 4475 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-dlggp" podUID="30615409-a282-4405-afab-4802d9c27a3a" containerName="route-controller-manager" containerID="cri-o://134729a711b54e662592ffc699417377c72db1ec72e91b49a3cb56219c1b0fa7" gracePeriod=30 Dec 03 06:50:07 crc kubenswrapper[4475]: I1203 06:50:07.621757 4475 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-7kcnv" Dec 03 06:50:07 crc kubenswrapper[4475]: I1203 06:50:07.654143 4475 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-dlggp" Dec 03 06:50:07 crc kubenswrapper[4475]: I1203 06:50:07.733910 4475 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/3263d9b9-b7e8-4758-a6a0-85749e84317a-serving-cert\") pod \"3263d9b9-b7e8-4758-a6a0-85749e84317a\" (UID: \"3263d9b9-b7e8-4758-a6a0-85749e84317a\") " Dec 03 06:50:07 crc kubenswrapper[4475]: I1203 06:50:07.733952 4475 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3263d9b9-b7e8-4758-a6a0-85749e84317a-config\") pod \"3263d9b9-b7e8-4758-a6a0-85749e84317a\" (UID: \"3263d9b9-b7e8-4758-a6a0-85749e84317a\") " Dec 03 06:50:07 crc kubenswrapper[4475]: I1203 06:50:07.734019 4475 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/3263d9b9-b7e8-4758-a6a0-85749e84317a-client-ca\") pod \"3263d9b9-b7e8-4758-a6a0-85749e84317a\" (UID: \"3263d9b9-b7e8-4758-a6a0-85749e84317a\") " Dec 03 06:50:07 crc kubenswrapper[4475]: I1203 06:50:07.734080 4475 reconciler_common.go:159] "operationExecutor.UnmountVolume 
started for volume \"kube-api-access-fvgf8\" (UniqueName: \"kubernetes.io/projected/3263d9b9-b7e8-4758-a6a0-85749e84317a-kube-api-access-fvgf8\") pod \"3263d9b9-b7e8-4758-a6a0-85749e84317a\" (UID: \"3263d9b9-b7e8-4758-a6a0-85749e84317a\") " Dec 03 06:50:07 crc kubenswrapper[4475]: I1203 06:50:07.734125 4475 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/3263d9b9-b7e8-4758-a6a0-85749e84317a-proxy-ca-bundles\") pod \"3263d9b9-b7e8-4758-a6a0-85749e84317a\" (UID: \"3263d9b9-b7e8-4758-a6a0-85749e84317a\") " Dec 03 06:50:07 crc kubenswrapper[4475]: I1203 06:50:07.734841 4475 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3263d9b9-b7e8-4758-a6a0-85749e84317a-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "3263d9b9-b7e8-4758-a6a0-85749e84317a" (UID: "3263d9b9-b7e8-4758-a6a0-85749e84317a"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 06:50:07 crc kubenswrapper[4475]: I1203 06:50:07.734854 4475 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3263d9b9-b7e8-4758-a6a0-85749e84317a-client-ca" (OuterVolumeSpecName: "client-ca") pod "3263d9b9-b7e8-4758-a6a0-85749e84317a" (UID: "3263d9b9-b7e8-4758-a6a0-85749e84317a"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 06:50:07 crc kubenswrapper[4475]: I1203 06:50:07.734895 4475 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3263d9b9-b7e8-4758-a6a0-85749e84317a-config" (OuterVolumeSpecName: "config") pod "3263d9b9-b7e8-4758-a6a0-85749e84317a" (UID: "3263d9b9-b7e8-4758-a6a0-85749e84317a"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 06:50:07 crc kubenswrapper[4475]: I1203 06:50:07.738337 4475 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3263d9b9-b7e8-4758-a6a0-85749e84317a-kube-api-access-fvgf8" (OuterVolumeSpecName: "kube-api-access-fvgf8") pod "3263d9b9-b7e8-4758-a6a0-85749e84317a" (UID: "3263d9b9-b7e8-4758-a6a0-85749e84317a"). InnerVolumeSpecName "kube-api-access-fvgf8". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 06:50:07 crc kubenswrapper[4475]: I1203 06:50:07.738361 4475 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3263d9b9-b7e8-4758-a6a0-85749e84317a-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "3263d9b9-b7e8-4758-a6a0-85749e84317a" (UID: "3263d9b9-b7e8-4758-a6a0-85749e84317a"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 06:50:07 crc kubenswrapper[4475]: I1203 06:50:07.835390 4475 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/30615409-a282-4405-afab-4802d9c27a3a-serving-cert\") pod \"30615409-a282-4405-afab-4802d9c27a3a\" (UID: \"30615409-a282-4405-afab-4802d9c27a3a\") " Dec 03 06:50:07 crc kubenswrapper[4475]: I1203 06:50:07.835424 4475 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/30615409-a282-4405-afab-4802d9c27a3a-client-ca\") pod \"30615409-a282-4405-afab-4802d9c27a3a\" (UID: \"30615409-a282-4405-afab-4802d9c27a3a\") " Dec 03 06:50:07 crc kubenswrapper[4475]: I1203 06:50:07.835517 4475 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/30615409-a282-4405-afab-4802d9c27a3a-config\") pod \"30615409-a282-4405-afab-4802d9c27a3a\" (UID: \"30615409-a282-4405-afab-4802d9c27a3a\") " Dec 03 06:50:07 crc 
kubenswrapper[4475]: I1203 06:50:07.835559 4475 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-h6g4x\" (UniqueName: \"kubernetes.io/projected/30615409-a282-4405-afab-4802d9c27a3a-kube-api-access-h6g4x\") pod \"30615409-a282-4405-afab-4802d9c27a3a\" (UID: \"30615409-a282-4405-afab-4802d9c27a3a\") " Dec 03 06:50:07 crc kubenswrapper[4475]: I1203 06:50:07.835764 4475 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/3263d9b9-b7e8-4758-a6a0-85749e84317a-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Dec 03 06:50:07 crc kubenswrapper[4475]: I1203 06:50:07.835780 4475 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/3263d9b9-b7e8-4758-a6a0-85749e84317a-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 03 06:50:07 crc kubenswrapper[4475]: I1203 06:50:07.835788 4475 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3263d9b9-b7e8-4758-a6a0-85749e84317a-config\") on node \"crc\" DevicePath \"\"" Dec 03 06:50:07 crc kubenswrapper[4475]: I1203 06:50:07.835795 4475 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/3263d9b9-b7e8-4758-a6a0-85749e84317a-client-ca\") on node \"crc\" DevicePath \"\"" Dec 03 06:50:07 crc kubenswrapper[4475]: I1203 06:50:07.835802 4475 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fvgf8\" (UniqueName: \"kubernetes.io/projected/3263d9b9-b7e8-4758-a6a0-85749e84317a-kube-api-access-fvgf8\") on node \"crc\" DevicePath \"\"" Dec 03 06:50:07 crc kubenswrapper[4475]: I1203 06:50:07.835886 4475 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/30615409-a282-4405-afab-4802d9c27a3a-client-ca" (OuterVolumeSpecName: "client-ca") pod "30615409-a282-4405-afab-4802d9c27a3a" (UID: 
"30615409-a282-4405-afab-4802d9c27a3a"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 06:50:07 crc kubenswrapper[4475]: I1203 06:50:07.836351 4475 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/30615409-a282-4405-afab-4802d9c27a3a-config" (OuterVolumeSpecName: "config") pod "30615409-a282-4405-afab-4802d9c27a3a" (UID: "30615409-a282-4405-afab-4802d9c27a3a"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 06:50:07 crc kubenswrapper[4475]: I1203 06:50:07.838273 4475 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/30615409-a282-4405-afab-4802d9c27a3a-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "30615409-a282-4405-afab-4802d9c27a3a" (UID: "30615409-a282-4405-afab-4802d9c27a3a"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 06:50:07 crc kubenswrapper[4475]: I1203 06:50:07.838279 4475 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/30615409-a282-4405-afab-4802d9c27a3a-kube-api-access-h6g4x" (OuterVolumeSpecName: "kube-api-access-h6g4x") pod "30615409-a282-4405-afab-4802d9c27a3a" (UID: "30615409-a282-4405-afab-4802d9c27a3a"). InnerVolumeSpecName "kube-api-access-h6g4x". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 06:50:07 crc kubenswrapper[4475]: I1203 06:50:07.937271 4475 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/30615409-a282-4405-afab-4802d9c27a3a-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 03 06:50:07 crc kubenswrapper[4475]: I1203 06:50:07.937300 4475 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/30615409-a282-4405-afab-4802d9c27a3a-client-ca\") on node \"crc\" DevicePath \"\"" Dec 03 06:50:07 crc kubenswrapper[4475]: I1203 06:50:07.937310 4475 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/30615409-a282-4405-afab-4802d9c27a3a-config\") on node \"crc\" DevicePath \"\"" Dec 03 06:50:07 crc kubenswrapper[4475]: I1203 06:50:07.937320 4475 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-h6g4x\" (UniqueName: \"kubernetes.io/projected/30615409-a282-4405-afab-4802d9c27a3a-kube-api-access-h6g4x\") on node \"crc\" DevicePath \"\"" Dec 03 06:50:08 crc kubenswrapper[4475]: I1203 06:50:08.359509 4475 generic.go:334] "Generic (PLEG): container finished" podID="3263d9b9-b7e8-4758-a6a0-85749e84317a" containerID="8cb7909070a58477b0f2d4a663a9e063f9a3f518c1e913b9d922836d450c50f7" exitCode=0 Dec 03 06:50:08 crc kubenswrapper[4475]: I1203 06:50:08.359582 4475 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-7kcnv" Dec 03 06:50:08 crc kubenswrapper[4475]: I1203 06:50:08.360199 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-7kcnv" event={"ID":"3263d9b9-b7e8-4758-a6a0-85749e84317a","Type":"ContainerDied","Data":"8cb7909070a58477b0f2d4a663a9e063f9a3f518c1e913b9d922836d450c50f7"} Dec 03 06:50:08 crc kubenswrapper[4475]: I1203 06:50:08.360237 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-7kcnv" event={"ID":"3263d9b9-b7e8-4758-a6a0-85749e84317a","Type":"ContainerDied","Data":"75518b836b3f0c095d60b2d9b2ceb070eec566ed5f4a41c5f46f1cea0043159e"} Dec 03 06:50:08 crc kubenswrapper[4475]: I1203 06:50:08.360253 4475 scope.go:117] "RemoveContainer" containerID="8cb7909070a58477b0f2d4a663a9e063f9a3f518c1e913b9d922836d450c50f7" Dec 03 06:50:08 crc kubenswrapper[4475]: I1203 06:50:08.361867 4475 generic.go:334] "Generic (PLEG): container finished" podID="30615409-a282-4405-afab-4802d9c27a3a" containerID="134729a711b54e662592ffc699417377c72db1ec72e91b49a3cb56219c1b0fa7" exitCode=0 Dec 03 06:50:08 crc kubenswrapper[4475]: I1203 06:50:08.361948 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-dlggp" event={"ID":"30615409-a282-4405-afab-4802d9c27a3a","Type":"ContainerDied","Data":"134729a711b54e662592ffc699417377c72db1ec72e91b49a3cb56219c1b0fa7"} Dec 03 06:50:08 crc kubenswrapper[4475]: I1203 06:50:08.362022 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-dlggp" event={"ID":"30615409-a282-4405-afab-4802d9c27a3a","Type":"ContainerDied","Data":"b6b4d8d85dc53a8ce1100b0f3116b2a699fbfdb38cc1793b0ab6ea1706d2ff62"} Dec 03 06:50:08 crc kubenswrapper[4475]: I1203 06:50:08.362106 4475 util.go:48] "No ready 
sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-dlggp" Dec 03 06:50:08 crc kubenswrapper[4475]: I1203 06:50:08.371109 4475 scope.go:117] "RemoveContainer" containerID="8cb7909070a58477b0f2d4a663a9e063f9a3f518c1e913b9d922836d450c50f7" Dec 03 06:50:08 crc kubenswrapper[4475]: E1203 06:50:08.371570 4475 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8cb7909070a58477b0f2d4a663a9e063f9a3f518c1e913b9d922836d450c50f7\": container with ID starting with 8cb7909070a58477b0f2d4a663a9e063f9a3f518c1e913b9d922836d450c50f7 not found: ID does not exist" containerID="8cb7909070a58477b0f2d4a663a9e063f9a3f518c1e913b9d922836d450c50f7" Dec 03 06:50:08 crc kubenswrapper[4475]: I1203 06:50:08.371675 4475 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8cb7909070a58477b0f2d4a663a9e063f9a3f518c1e913b9d922836d450c50f7"} err="failed to get container status \"8cb7909070a58477b0f2d4a663a9e063f9a3f518c1e913b9d922836d450c50f7\": rpc error: code = NotFound desc = could not find container \"8cb7909070a58477b0f2d4a663a9e063f9a3f518c1e913b9d922836d450c50f7\": container with ID starting with 8cb7909070a58477b0f2d4a663a9e063f9a3f518c1e913b9d922836d450c50f7 not found: ID does not exist" Dec 03 06:50:08 crc kubenswrapper[4475]: I1203 06:50:08.371761 4475 scope.go:117] "RemoveContainer" containerID="134729a711b54e662592ffc699417377c72db1ec72e91b49a3cb56219c1b0fa7" Dec 03 06:50:08 crc kubenswrapper[4475]: I1203 06:50:08.382239 4475 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-7kcnv"] Dec 03 06:50:08 crc kubenswrapper[4475]: I1203 06:50:08.382830 4475 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-7kcnv"] Dec 03 06:50:08 crc kubenswrapper[4475]: I1203 06:50:08.386117 4475 
scope.go:117] "RemoveContainer" containerID="134729a711b54e662592ffc699417377c72db1ec72e91b49a3cb56219c1b0fa7" Dec 03 06:50:08 crc kubenswrapper[4475]: E1203 06:50:08.386429 4475 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"134729a711b54e662592ffc699417377c72db1ec72e91b49a3cb56219c1b0fa7\": container with ID starting with 134729a711b54e662592ffc699417377c72db1ec72e91b49a3cb56219c1b0fa7 not found: ID does not exist" containerID="134729a711b54e662592ffc699417377c72db1ec72e91b49a3cb56219c1b0fa7" Dec 03 06:50:08 crc kubenswrapper[4475]: I1203 06:50:08.386466 4475 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"134729a711b54e662592ffc699417377c72db1ec72e91b49a3cb56219c1b0fa7"} err="failed to get container status \"134729a711b54e662592ffc699417377c72db1ec72e91b49a3cb56219c1b0fa7\": rpc error: code = NotFound desc = could not find container \"134729a711b54e662592ffc699417377c72db1ec72e91b49a3cb56219c1b0fa7\": container with ID starting with 134729a711b54e662592ffc699417377c72db1ec72e91b49a3cb56219c1b0fa7 not found: ID does not exist" Dec 03 06:50:08 crc kubenswrapper[4475]: I1203 06:50:08.387244 4475 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-dlggp"] Dec 03 06:50:08 crc kubenswrapper[4475]: I1203 06:50:08.389875 4475 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-dlggp"] Dec 03 06:50:08 crc kubenswrapper[4475]: I1203 06:50:08.715752 4475 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-77dd5df966-2dv7m"] Dec 03 06:50:08 crc kubenswrapper[4475]: E1203 06:50:08.716199 4475 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="30615409-a282-4405-afab-4802d9c27a3a" containerName="route-controller-manager" Dec 03 06:50:08 crc 
kubenswrapper[4475]: I1203 06:50:08.716306 4475 state_mem.go:107] "Deleted CPUSet assignment" podUID="30615409-a282-4405-afab-4802d9c27a3a" containerName="route-controller-manager" Dec 03 06:50:08 crc kubenswrapper[4475]: E1203 06:50:08.716398 4475 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3263d9b9-b7e8-4758-a6a0-85749e84317a" containerName="controller-manager" Dec 03 06:50:08 crc kubenswrapper[4475]: I1203 06:50:08.716480 4475 state_mem.go:107] "Deleted CPUSet assignment" podUID="3263d9b9-b7e8-4758-a6a0-85749e84317a" containerName="controller-manager" Dec 03 06:50:08 crc kubenswrapper[4475]: E1203 06:50:08.716565 4475 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" containerName="startup-monitor" Dec 03 06:50:08 crc kubenswrapper[4475]: I1203 06:50:08.716618 4475 state_mem.go:107] "Deleted CPUSet assignment" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" containerName="startup-monitor" Dec 03 06:50:08 crc kubenswrapper[4475]: E1203 06:50:08.716697 4475 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="333f027c-627f-4107-93a2-f522a583a5ed" containerName="installer" Dec 03 06:50:08 crc kubenswrapper[4475]: I1203 06:50:08.716773 4475 state_mem.go:107] "Deleted CPUSet assignment" podUID="333f027c-627f-4107-93a2-f522a583a5ed" containerName="installer" Dec 03 06:50:08 crc kubenswrapper[4475]: I1203 06:50:08.716921 4475 memory_manager.go:354] "RemoveStaleState removing state" podUID="333f027c-627f-4107-93a2-f522a583a5ed" containerName="installer" Dec 03 06:50:08 crc kubenswrapper[4475]: I1203 06:50:08.717001 4475 memory_manager.go:354] "RemoveStaleState removing state" podUID="30615409-a282-4405-afab-4802d9c27a3a" containerName="route-controller-manager" Dec 03 06:50:08 crc kubenswrapper[4475]: I1203 06:50:08.717066 4475 memory_manager.go:354] "RemoveStaleState removing state" podUID="3263d9b9-b7e8-4758-a6a0-85749e84317a" containerName="controller-manager" Dec 03 06:50:08 crc 
kubenswrapper[4475]: I1203 06:50:08.717138 4475 memory_manager.go:354] "RemoveStaleState removing state" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" containerName="startup-monitor" Dec 03 06:50:08 crc kubenswrapper[4475]: I1203 06:50:08.717507 4475 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-77dd5df966-2dv7m" Dec 03 06:50:08 crc kubenswrapper[4475]: I1203 06:50:08.717866 4475 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-d6c996dd6-4xw9d"] Dec 03 06:50:08 crc kubenswrapper[4475]: I1203 06:50:08.718417 4475 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-d6c996dd6-4xw9d" Dec 03 06:50:08 crc kubenswrapper[4475]: I1203 06:50:08.719244 4475 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Dec 03 06:50:08 crc kubenswrapper[4475]: I1203 06:50:08.719404 4475 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Dec 03 06:50:08 crc kubenswrapper[4475]: I1203 06:50:08.720504 4475 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c" Dec 03 06:50:08 crc kubenswrapper[4475]: I1203 06:50:08.720770 4475 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Dec 03 06:50:08 crc kubenswrapper[4475]: I1203 06:50:08.721097 4475 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Dec 03 06:50:08 crc kubenswrapper[4475]: I1203 06:50:08.721597 4475 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Dec 03 06:50:08 crc kubenswrapper[4475]: I1203 
06:50:08.721956 4475 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Dec 03 06:50:08 crc kubenswrapper[4475]: I1203 06:50:08.722123 4475 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca" Dec 03 06:50:08 crc kubenswrapper[4475]: I1203 06:50:08.722222 4475 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config" Dec 03 06:50:08 crc kubenswrapper[4475]: I1203 06:50:08.722722 4475 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Dec 03 06:50:08 crc kubenswrapper[4475]: I1203 06:50:08.723211 4475 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2" Dec 03 06:50:08 crc kubenswrapper[4475]: I1203 06:50:08.723235 4475 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert" Dec 03 06:50:08 crc kubenswrapper[4475]: I1203 06:50:08.727079 4475 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-d6c996dd6-4xw9d"] Dec 03 06:50:08 crc kubenswrapper[4475]: I1203 06:50:08.729305 4475 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-77dd5df966-2dv7m"] Dec 03 06:50:08 crc kubenswrapper[4475]: I1203 06:50:08.731236 4475 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Dec 03 06:50:08 crc kubenswrapper[4475]: I1203 06:50:08.845763 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7j5xc\" (UniqueName: \"kubernetes.io/projected/124cdb66-c71e-4efb-b1bc-6aa1978cdadf-kube-api-access-7j5xc\") pod \"route-controller-manager-d6c996dd6-4xw9d\" (UID: 
\"124cdb66-c71e-4efb-b1bc-6aa1978cdadf\") " pod="openshift-route-controller-manager/route-controller-manager-d6c996dd6-4xw9d" Dec 03 06:50:08 crc kubenswrapper[4475]: I1203 06:50:08.845803 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/124cdb66-c71e-4efb-b1bc-6aa1978cdadf-config\") pod \"route-controller-manager-d6c996dd6-4xw9d\" (UID: \"124cdb66-c71e-4efb-b1bc-6aa1978cdadf\") " pod="openshift-route-controller-manager/route-controller-manager-d6c996dd6-4xw9d" Dec 03 06:50:08 crc kubenswrapper[4475]: I1203 06:50:08.845825 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/ba0fab74-0340-4465-bbe1-66169f3c2363-client-ca\") pod \"controller-manager-77dd5df966-2dv7m\" (UID: \"ba0fab74-0340-4465-bbe1-66169f3c2363\") " pod="openshift-controller-manager/controller-manager-77dd5df966-2dv7m" Dec 03 06:50:08 crc kubenswrapper[4475]: I1203 06:50:08.845844 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ba0fab74-0340-4465-bbe1-66169f3c2363-serving-cert\") pod \"controller-manager-77dd5df966-2dv7m\" (UID: \"ba0fab74-0340-4465-bbe1-66169f3c2363\") " pod="openshift-controller-manager/controller-manager-77dd5df966-2dv7m" Dec 03 06:50:08 crc kubenswrapper[4475]: I1203 06:50:08.846012 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/124cdb66-c71e-4efb-b1bc-6aa1978cdadf-serving-cert\") pod \"route-controller-manager-d6c996dd6-4xw9d\" (UID: \"124cdb66-c71e-4efb-b1bc-6aa1978cdadf\") " pod="openshift-route-controller-manager/route-controller-manager-d6c996dd6-4xw9d" Dec 03 06:50:08 crc kubenswrapper[4475]: I1203 06:50:08.846131 4475 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/124cdb66-c71e-4efb-b1bc-6aa1978cdadf-client-ca\") pod \"route-controller-manager-d6c996dd6-4xw9d\" (UID: \"124cdb66-c71e-4efb-b1bc-6aa1978cdadf\") " pod="openshift-route-controller-manager/route-controller-manager-d6c996dd6-4xw9d" Dec 03 06:50:08 crc kubenswrapper[4475]: I1203 06:50:08.846167 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ba0fab74-0340-4465-bbe1-66169f3c2363-config\") pod \"controller-manager-77dd5df966-2dv7m\" (UID: \"ba0fab74-0340-4465-bbe1-66169f3c2363\") " pod="openshift-controller-manager/controller-manager-77dd5df966-2dv7m" Dec 03 06:50:08 crc kubenswrapper[4475]: I1203 06:50:08.846183 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/ba0fab74-0340-4465-bbe1-66169f3c2363-proxy-ca-bundles\") pod \"controller-manager-77dd5df966-2dv7m\" (UID: \"ba0fab74-0340-4465-bbe1-66169f3c2363\") " pod="openshift-controller-manager/controller-manager-77dd5df966-2dv7m" Dec 03 06:50:08 crc kubenswrapper[4475]: I1203 06:50:08.846199 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-crlff\" (UniqueName: \"kubernetes.io/projected/ba0fab74-0340-4465-bbe1-66169f3c2363-kube-api-access-crlff\") pod \"controller-manager-77dd5df966-2dv7m\" (UID: \"ba0fab74-0340-4465-bbe1-66169f3c2363\") " pod="openshift-controller-manager/controller-manager-77dd5df966-2dv7m" Dec 03 06:50:08 crc kubenswrapper[4475]: I1203 06:50:08.947181 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/124cdb66-c71e-4efb-b1bc-6aa1978cdadf-config\") pod \"route-controller-manager-d6c996dd6-4xw9d\" (UID: 
\"124cdb66-c71e-4efb-b1bc-6aa1978cdadf\") " pod="openshift-route-controller-manager/route-controller-manager-d6c996dd6-4xw9d" Dec 03 06:50:08 crc kubenswrapper[4475]: I1203 06:50:08.947231 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/ba0fab74-0340-4465-bbe1-66169f3c2363-client-ca\") pod \"controller-manager-77dd5df966-2dv7m\" (UID: \"ba0fab74-0340-4465-bbe1-66169f3c2363\") " pod="openshift-controller-manager/controller-manager-77dd5df966-2dv7m" Dec 03 06:50:08 crc kubenswrapper[4475]: I1203 06:50:08.947253 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ba0fab74-0340-4465-bbe1-66169f3c2363-serving-cert\") pod \"controller-manager-77dd5df966-2dv7m\" (UID: \"ba0fab74-0340-4465-bbe1-66169f3c2363\") " pod="openshift-controller-manager/controller-manager-77dd5df966-2dv7m" Dec 03 06:50:08 crc kubenswrapper[4475]: I1203 06:50:08.947275 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/124cdb66-c71e-4efb-b1bc-6aa1978cdadf-serving-cert\") pod \"route-controller-manager-d6c996dd6-4xw9d\" (UID: \"124cdb66-c71e-4efb-b1bc-6aa1978cdadf\") " pod="openshift-route-controller-manager/route-controller-manager-d6c996dd6-4xw9d" Dec 03 06:50:08 crc kubenswrapper[4475]: I1203 06:50:08.947307 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/124cdb66-c71e-4efb-b1bc-6aa1978cdadf-client-ca\") pod \"route-controller-manager-d6c996dd6-4xw9d\" (UID: \"124cdb66-c71e-4efb-b1bc-6aa1978cdadf\") " pod="openshift-route-controller-manager/route-controller-manager-d6c996dd6-4xw9d" Dec 03 06:50:08 crc kubenswrapper[4475]: I1203 06:50:08.947326 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/ba0fab74-0340-4465-bbe1-66169f3c2363-config\") pod \"controller-manager-77dd5df966-2dv7m\" (UID: \"ba0fab74-0340-4465-bbe1-66169f3c2363\") " pod="openshift-controller-manager/controller-manager-77dd5df966-2dv7m" Dec 03 06:50:08 crc kubenswrapper[4475]: I1203 06:50:08.947339 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/ba0fab74-0340-4465-bbe1-66169f3c2363-proxy-ca-bundles\") pod \"controller-manager-77dd5df966-2dv7m\" (UID: \"ba0fab74-0340-4465-bbe1-66169f3c2363\") " pod="openshift-controller-manager/controller-manager-77dd5df966-2dv7m" Dec 03 06:50:08 crc kubenswrapper[4475]: I1203 06:50:08.947354 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-crlff\" (UniqueName: \"kubernetes.io/projected/ba0fab74-0340-4465-bbe1-66169f3c2363-kube-api-access-crlff\") pod \"controller-manager-77dd5df966-2dv7m\" (UID: \"ba0fab74-0340-4465-bbe1-66169f3c2363\") " pod="openshift-controller-manager/controller-manager-77dd5df966-2dv7m" Dec 03 06:50:08 crc kubenswrapper[4475]: I1203 06:50:08.947384 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7j5xc\" (UniqueName: \"kubernetes.io/projected/124cdb66-c71e-4efb-b1bc-6aa1978cdadf-kube-api-access-7j5xc\") pod \"route-controller-manager-d6c996dd6-4xw9d\" (UID: \"124cdb66-c71e-4efb-b1bc-6aa1978cdadf\") " pod="openshift-route-controller-manager/route-controller-manager-d6c996dd6-4xw9d" Dec 03 06:50:08 crc kubenswrapper[4475]: I1203 06:50:08.948033 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/ba0fab74-0340-4465-bbe1-66169f3c2363-client-ca\") pod \"controller-manager-77dd5df966-2dv7m\" (UID: \"ba0fab74-0340-4465-bbe1-66169f3c2363\") " pod="openshift-controller-manager/controller-manager-77dd5df966-2dv7m" Dec 03 06:50:08 crc kubenswrapper[4475]: 
I1203 06:50:08.948055 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/124cdb66-c71e-4efb-b1bc-6aa1978cdadf-client-ca\") pod \"route-controller-manager-d6c996dd6-4xw9d\" (UID: \"124cdb66-c71e-4efb-b1bc-6aa1978cdadf\") " pod="openshift-route-controller-manager/route-controller-manager-d6c996dd6-4xw9d" Dec 03 06:50:08 crc kubenswrapper[4475]: I1203 06:50:08.948340 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/ba0fab74-0340-4465-bbe1-66169f3c2363-proxy-ca-bundles\") pod \"controller-manager-77dd5df966-2dv7m\" (UID: \"ba0fab74-0340-4465-bbe1-66169f3c2363\") " pod="openshift-controller-manager/controller-manager-77dd5df966-2dv7m" Dec 03 06:50:08 crc kubenswrapper[4475]: I1203 06:50:08.948669 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/124cdb66-c71e-4efb-b1bc-6aa1978cdadf-config\") pod \"route-controller-manager-d6c996dd6-4xw9d\" (UID: \"124cdb66-c71e-4efb-b1bc-6aa1978cdadf\") " pod="openshift-route-controller-manager/route-controller-manager-d6c996dd6-4xw9d" Dec 03 06:50:08 crc kubenswrapper[4475]: I1203 06:50:08.948990 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ba0fab74-0340-4465-bbe1-66169f3c2363-config\") pod \"controller-manager-77dd5df966-2dv7m\" (UID: \"ba0fab74-0340-4465-bbe1-66169f3c2363\") " pod="openshift-controller-manager/controller-manager-77dd5df966-2dv7m" Dec 03 06:50:08 crc kubenswrapper[4475]: I1203 06:50:08.949980 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ba0fab74-0340-4465-bbe1-66169f3c2363-serving-cert\") pod \"controller-manager-77dd5df966-2dv7m\" (UID: \"ba0fab74-0340-4465-bbe1-66169f3c2363\") " 
pod="openshift-controller-manager/controller-manager-77dd5df966-2dv7m" Dec 03 06:50:08 crc kubenswrapper[4475]: I1203 06:50:08.950761 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/124cdb66-c71e-4efb-b1bc-6aa1978cdadf-serving-cert\") pod \"route-controller-manager-d6c996dd6-4xw9d\" (UID: \"124cdb66-c71e-4efb-b1bc-6aa1978cdadf\") " pod="openshift-route-controller-manager/route-controller-manager-d6c996dd6-4xw9d" Dec 03 06:50:08 crc kubenswrapper[4475]: I1203 06:50:08.959768 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7j5xc\" (UniqueName: \"kubernetes.io/projected/124cdb66-c71e-4efb-b1bc-6aa1978cdadf-kube-api-access-7j5xc\") pod \"route-controller-manager-d6c996dd6-4xw9d\" (UID: \"124cdb66-c71e-4efb-b1bc-6aa1978cdadf\") " pod="openshift-route-controller-manager/route-controller-manager-d6c996dd6-4xw9d" Dec 03 06:50:08 crc kubenswrapper[4475]: I1203 06:50:08.962115 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-crlff\" (UniqueName: \"kubernetes.io/projected/ba0fab74-0340-4465-bbe1-66169f3c2363-kube-api-access-crlff\") pod \"controller-manager-77dd5df966-2dv7m\" (UID: \"ba0fab74-0340-4465-bbe1-66169f3c2363\") " pod="openshift-controller-manager/controller-manager-77dd5df966-2dv7m" Dec 03 06:50:09 crc kubenswrapper[4475]: I1203 06:50:09.029789 4475 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-77dd5df966-2dv7m" Dec 03 06:50:09 crc kubenswrapper[4475]: I1203 06:50:09.034658 4475 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-d6c996dd6-4xw9d" Dec 03 06:50:09 crc kubenswrapper[4475]: I1203 06:50:09.364213 4475 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-77dd5df966-2dv7m"] Dec 03 06:50:09 crc kubenswrapper[4475]: I1203 06:50:09.388696 4475 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-d6c996dd6-4xw9d"] Dec 03 06:50:09 crc kubenswrapper[4475]: W1203 06:50:09.402641 4475 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod124cdb66_c71e_4efb_b1bc_6aa1978cdadf.slice/crio-cc0874fad8c0fdc5fdbaa1710cabf187305992e0d0560a884f5d0cd9e8276c98 WatchSource:0}: Error finding container cc0874fad8c0fdc5fdbaa1710cabf187305992e0d0560a884f5d0cd9e8276c98: Status 404 returned error can't find the container with id cc0874fad8c0fdc5fdbaa1710cabf187305992e0d0560a884f5d0cd9e8276c98 Dec 03 06:50:09 crc kubenswrapper[4475]: I1203 06:50:09.498646 4475 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="30615409-a282-4405-afab-4802d9c27a3a" path="/var/lib/kubelet/pods/30615409-a282-4405-afab-4802d9c27a3a/volumes" Dec 03 06:50:09 crc kubenswrapper[4475]: I1203 06:50:09.499375 4475 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3263d9b9-b7e8-4758-a6a0-85749e84317a" path="/var/lib/kubelet/pods/3263d9b9-b7e8-4758-a6a0-85749e84317a/volumes" Dec 03 06:50:10 crc kubenswrapper[4475]: I1203 06:50:10.374433 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-77dd5df966-2dv7m" event={"ID":"ba0fab74-0340-4465-bbe1-66169f3c2363","Type":"ContainerStarted","Data":"a33c47b313f0fbfc9bcab9b08fbf5b41b175742e9cd21c5053f7d8ea6dd88f86"} Dec 03 06:50:10 crc kubenswrapper[4475]: I1203 06:50:10.374487 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-controller-manager/controller-manager-77dd5df966-2dv7m" event={"ID":"ba0fab74-0340-4465-bbe1-66169f3c2363","Type":"ContainerStarted","Data":"fe19d6bf5a8eb72b0a9057599be0047551b965cbd133f55638c95e61acc434b6"} Dec 03 06:50:10 crc kubenswrapper[4475]: I1203 06:50:10.374639 4475 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-77dd5df966-2dv7m" Dec 03 06:50:10 crc kubenswrapper[4475]: I1203 06:50:10.376552 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-d6c996dd6-4xw9d" event={"ID":"124cdb66-c71e-4efb-b1bc-6aa1978cdadf","Type":"ContainerStarted","Data":"1d1a4232baa78159d42beeecfa595674d63a408aea760884d0616620f3c86387"} Dec 03 06:50:10 crc kubenswrapper[4475]: I1203 06:50:10.376754 4475 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-d6c996dd6-4xw9d" Dec 03 06:50:10 crc kubenswrapper[4475]: I1203 06:50:10.376764 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-d6c996dd6-4xw9d" event={"ID":"124cdb66-c71e-4efb-b1bc-6aa1978cdadf","Type":"ContainerStarted","Data":"cc0874fad8c0fdc5fdbaa1710cabf187305992e0d0560a884f5d0cd9e8276c98"} Dec 03 06:50:10 crc kubenswrapper[4475]: I1203 06:50:10.378026 4475 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-77dd5df966-2dv7m" Dec 03 06:50:10 crc kubenswrapper[4475]: I1203 06:50:10.380806 4475 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-d6c996dd6-4xw9d" Dec 03 06:50:10 crc kubenswrapper[4475]: I1203 06:50:10.388932 4475 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-77dd5df966-2dv7m" 
podStartSLOduration=3.388919492 podStartE2EDuration="3.388919492s" podCreationTimestamp="2025-12-03 06:50:07 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 06:50:10.385732863 +0000 UTC m=+295.190631197" watchObservedRunningTime="2025-12-03 06:50:10.388919492 +0000 UTC m=+295.193817826" Dec 03 06:50:10 crc kubenswrapper[4475]: I1203 06:50:10.415157 4475 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-d6c996dd6-4xw9d" podStartSLOduration=3.415140881 podStartE2EDuration="3.415140881s" podCreationTimestamp="2025-12-03 06:50:07 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 06:50:10.414018506 +0000 UTC m=+295.218916840" watchObservedRunningTime="2025-12-03 06:50:10.415140881 +0000 UTC m=+295.220039215" Dec 03 06:50:11 crc kubenswrapper[4475]: I1203 06:50:11.579206 4475 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-77dd5df966-2dv7m"] Dec 03 06:50:11 crc kubenswrapper[4475]: I1203 06:50:11.591860 4475 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-d6c996dd6-4xw9d"] Dec 03 06:50:13 crc kubenswrapper[4475]: I1203 06:50:13.386747 4475 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-controller-manager/controller-manager-77dd5df966-2dv7m" podUID="ba0fab74-0340-4465-bbe1-66169f3c2363" containerName="controller-manager" containerID="cri-o://a33c47b313f0fbfc9bcab9b08fbf5b41b175742e9cd21c5053f7d8ea6dd88f86" gracePeriod=30 Dec 03 06:50:13 crc kubenswrapper[4475]: I1203 06:50:13.386840 4475 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-route-controller-manager/route-controller-manager-d6c996dd6-4xw9d" 
podUID="124cdb66-c71e-4efb-b1bc-6aa1978cdadf" containerName="route-controller-manager" containerID="cri-o://1d1a4232baa78159d42beeecfa595674d63a408aea760884d0616620f3c86387" gracePeriod=30 Dec 03 06:50:13 crc kubenswrapper[4475]: I1203 06:50:13.732196 4475 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-d6c996dd6-4xw9d" Dec 03 06:50:14 crc kubenswrapper[4475]: I1203 06:50:13.783307 4475 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-56c6954dc9-2mg6p"] Dec 03 06:50:14 crc kubenswrapper[4475]: E1203 06:50:13.783517 4475 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="124cdb66-c71e-4efb-b1bc-6aa1978cdadf" containerName="route-controller-manager" Dec 03 06:50:14 crc kubenswrapper[4475]: I1203 06:50:13.783529 4475 state_mem.go:107] "Deleted CPUSet assignment" podUID="124cdb66-c71e-4efb-b1bc-6aa1978cdadf" containerName="route-controller-manager" Dec 03 06:50:14 crc kubenswrapper[4475]: I1203 06:50:13.783649 4475 memory_manager.go:354] "RemoveStaleState removing state" podUID="124cdb66-c71e-4efb-b1bc-6aa1978cdadf" containerName="route-controller-manager" Dec 03 06:50:14 crc kubenswrapper[4475]: I1203 06:50:13.783994 4475 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-56c6954dc9-2mg6p" Dec 03 06:50:14 crc kubenswrapper[4475]: I1203 06:50:13.789577 4475 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-77dd5df966-2dv7m" Dec 03 06:50:14 crc kubenswrapper[4475]: I1203 06:50:13.797561 4475 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-56c6954dc9-2mg6p"] Dec 03 06:50:14 crc kubenswrapper[4475]: I1203 06:50:13.899923 4475 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/124cdb66-c71e-4efb-b1bc-6aa1978cdadf-config\") pod \"124cdb66-c71e-4efb-b1bc-6aa1978cdadf\" (UID: \"124cdb66-c71e-4efb-b1bc-6aa1978cdadf\") " Dec 03 06:50:14 crc kubenswrapper[4475]: I1203 06:50:13.900002 4475 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7j5xc\" (UniqueName: \"kubernetes.io/projected/124cdb66-c71e-4efb-b1bc-6aa1978cdadf-kube-api-access-7j5xc\") pod \"124cdb66-c71e-4efb-b1bc-6aa1978cdadf\" (UID: \"124cdb66-c71e-4efb-b1bc-6aa1978cdadf\") " Dec 03 06:50:14 crc kubenswrapper[4475]: I1203 06:50:13.900027 4475 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/ba0fab74-0340-4465-bbe1-66169f3c2363-client-ca\") pod \"ba0fab74-0340-4465-bbe1-66169f3c2363\" (UID: \"ba0fab74-0340-4465-bbe1-66169f3c2363\") " Dec 03 06:50:14 crc kubenswrapper[4475]: I1203 06:50:13.900049 4475 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-crlff\" (UniqueName: \"kubernetes.io/projected/ba0fab74-0340-4465-bbe1-66169f3c2363-kube-api-access-crlff\") pod \"ba0fab74-0340-4465-bbe1-66169f3c2363\" (UID: \"ba0fab74-0340-4465-bbe1-66169f3c2363\") " Dec 03 06:50:14 crc kubenswrapper[4475]: I1203 06:50:13.900089 4475 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ba0fab74-0340-4465-bbe1-66169f3c2363-serving-cert\") pod 
\"ba0fab74-0340-4465-bbe1-66169f3c2363\" (UID: \"ba0fab74-0340-4465-bbe1-66169f3c2363\") " Dec 03 06:50:14 crc kubenswrapper[4475]: I1203 06:50:13.900124 4475 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ba0fab74-0340-4465-bbe1-66169f3c2363-config\") pod \"ba0fab74-0340-4465-bbe1-66169f3c2363\" (UID: \"ba0fab74-0340-4465-bbe1-66169f3c2363\") " Dec 03 06:50:14 crc kubenswrapper[4475]: I1203 06:50:13.900156 4475 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/ba0fab74-0340-4465-bbe1-66169f3c2363-proxy-ca-bundles\") pod \"ba0fab74-0340-4465-bbe1-66169f3c2363\" (UID: \"ba0fab74-0340-4465-bbe1-66169f3c2363\") " Dec 03 06:50:14 crc kubenswrapper[4475]: I1203 06:50:13.900172 4475 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/124cdb66-c71e-4efb-b1bc-6aa1978cdadf-serving-cert\") pod \"124cdb66-c71e-4efb-b1bc-6aa1978cdadf\" (UID: \"124cdb66-c71e-4efb-b1bc-6aa1978cdadf\") " Dec 03 06:50:14 crc kubenswrapper[4475]: I1203 06:50:13.900200 4475 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/124cdb66-c71e-4efb-b1bc-6aa1978cdadf-client-ca\") pod \"124cdb66-c71e-4efb-b1bc-6aa1978cdadf\" (UID: \"124cdb66-c71e-4efb-b1bc-6aa1978cdadf\") " Dec 03 06:50:14 crc kubenswrapper[4475]: I1203 06:50:13.900316 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8f3c7792-82c8-436a-87fa-31c713406593-config\") pod \"route-controller-manager-56c6954dc9-2mg6p\" (UID: \"8f3c7792-82c8-436a-87fa-31c713406593\") " pod="openshift-route-controller-manager/route-controller-manager-56c6954dc9-2mg6p" Dec 03 06:50:14 crc kubenswrapper[4475]: I1203 06:50:13.900339 4475 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n8drl\" (UniqueName: \"kubernetes.io/projected/8f3c7792-82c8-436a-87fa-31c713406593-kube-api-access-n8drl\") pod \"route-controller-manager-56c6954dc9-2mg6p\" (UID: \"8f3c7792-82c8-436a-87fa-31c713406593\") " pod="openshift-route-controller-manager/route-controller-manager-56c6954dc9-2mg6p" Dec 03 06:50:14 crc kubenswrapper[4475]: I1203 06:50:13.900385 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8f3c7792-82c8-436a-87fa-31c713406593-serving-cert\") pod \"route-controller-manager-56c6954dc9-2mg6p\" (UID: \"8f3c7792-82c8-436a-87fa-31c713406593\") " pod="openshift-route-controller-manager/route-controller-manager-56c6954dc9-2mg6p" Dec 03 06:50:14 crc kubenswrapper[4475]: I1203 06:50:13.900408 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/8f3c7792-82c8-436a-87fa-31c713406593-client-ca\") pod \"route-controller-manager-56c6954dc9-2mg6p\" (UID: \"8f3c7792-82c8-436a-87fa-31c713406593\") " pod="openshift-route-controller-manager/route-controller-manager-56c6954dc9-2mg6p" Dec 03 06:50:14 crc kubenswrapper[4475]: I1203 06:50:13.901012 4475 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/124cdb66-c71e-4efb-b1bc-6aa1978cdadf-config" (OuterVolumeSpecName: "config") pod "124cdb66-c71e-4efb-b1bc-6aa1978cdadf" (UID: "124cdb66-c71e-4efb-b1bc-6aa1978cdadf"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 06:50:14 crc kubenswrapper[4475]: I1203 06:50:13.901330 4475 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ba0fab74-0340-4465-bbe1-66169f3c2363-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "ba0fab74-0340-4465-bbe1-66169f3c2363" (UID: "ba0fab74-0340-4465-bbe1-66169f3c2363"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 06:50:14 crc kubenswrapper[4475]: I1203 06:50:13.901557 4475 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ba0fab74-0340-4465-bbe1-66169f3c2363-client-ca" (OuterVolumeSpecName: "client-ca") pod "ba0fab74-0340-4465-bbe1-66169f3c2363" (UID: "ba0fab74-0340-4465-bbe1-66169f3c2363"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 06:50:14 crc kubenswrapper[4475]: I1203 06:50:13.902330 4475 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ba0fab74-0340-4465-bbe1-66169f3c2363-config" (OuterVolumeSpecName: "config") pod "ba0fab74-0340-4465-bbe1-66169f3c2363" (UID: "ba0fab74-0340-4465-bbe1-66169f3c2363"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 06:50:14 crc kubenswrapper[4475]: I1203 06:50:13.903164 4475 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/124cdb66-c71e-4efb-b1bc-6aa1978cdadf-client-ca" (OuterVolumeSpecName: "client-ca") pod "124cdb66-c71e-4efb-b1bc-6aa1978cdadf" (UID: "124cdb66-c71e-4efb-b1bc-6aa1978cdadf"). InnerVolumeSpecName "client-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 06:50:14 crc kubenswrapper[4475]: I1203 06:50:13.907999 4475 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/124cdb66-c71e-4efb-b1bc-6aa1978cdadf-kube-api-access-7j5xc" (OuterVolumeSpecName: "kube-api-access-7j5xc") pod "124cdb66-c71e-4efb-b1bc-6aa1978cdadf" (UID: "124cdb66-c71e-4efb-b1bc-6aa1978cdadf"). InnerVolumeSpecName "kube-api-access-7j5xc". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 06:50:14 crc kubenswrapper[4475]: I1203 06:50:13.908561 4475 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/124cdb66-c71e-4efb-b1bc-6aa1978cdadf-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "124cdb66-c71e-4efb-b1bc-6aa1978cdadf" (UID: "124cdb66-c71e-4efb-b1bc-6aa1978cdadf"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 06:50:14 crc kubenswrapper[4475]: I1203 06:50:13.908572 4475 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ba0fab74-0340-4465-bbe1-66169f3c2363-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "ba0fab74-0340-4465-bbe1-66169f3c2363" (UID: "ba0fab74-0340-4465-bbe1-66169f3c2363"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 06:50:14 crc kubenswrapper[4475]: I1203 06:50:13.908585 4475 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ba0fab74-0340-4465-bbe1-66169f3c2363-kube-api-access-crlff" (OuterVolumeSpecName: "kube-api-access-crlff") pod "ba0fab74-0340-4465-bbe1-66169f3c2363" (UID: "ba0fab74-0340-4465-bbe1-66169f3c2363"). InnerVolumeSpecName "kube-api-access-crlff". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 06:50:14 crc kubenswrapper[4475]: I1203 06:50:14.001689 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8f3c7792-82c8-436a-87fa-31c713406593-serving-cert\") pod \"route-controller-manager-56c6954dc9-2mg6p\" (UID: \"8f3c7792-82c8-436a-87fa-31c713406593\") " pod="openshift-route-controller-manager/route-controller-manager-56c6954dc9-2mg6p" Dec 03 06:50:14 crc kubenswrapper[4475]: I1203 06:50:14.002062 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/8f3c7792-82c8-436a-87fa-31c713406593-client-ca\") pod \"route-controller-manager-56c6954dc9-2mg6p\" (UID: \"8f3c7792-82c8-436a-87fa-31c713406593\") " pod="openshift-route-controller-manager/route-controller-manager-56c6954dc9-2mg6p" Dec 03 06:50:14 crc kubenswrapper[4475]: I1203 06:50:14.002107 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8f3c7792-82c8-436a-87fa-31c713406593-config\") pod \"route-controller-manager-56c6954dc9-2mg6p\" (UID: \"8f3c7792-82c8-436a-87fa-31c713406593\") " pod="openshift-route-controller-manager/route-controller-manager-56c6954dc9-2mg6p" Dec 03 06:50:14 crc kubenswrapper[4475]: I1203 06:50:14.002130 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-n8drl\" (UniqueName: \"kubernetes.io/projected/8f3c7792-82c8-436a-87fa-31c713406593-kube-api-access-n8drl\") pod \"route-controller-manager-56c6954dc9-2mg6p\" (UID: \"8f3c7792-82c8-436a-87fa-31c713406593\") " pod="openshift-route-controller-manager/route-controller-manager-56c6954dc9-2mg6p" Dec 03 06:50:14 crc kubenswrapper[4475]: I1203 06:50:14.002198 4475 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7j5xc\" (UniqueName: 
\"kubernetes.io/projected/124cdb66-c71e-4efb-b1bc-6aa1978cdadf-kube-api-access-7j5xc\") on node \"crc\" DevicePath \"\"" Dec 03 06:50:14 crc kubenswrapper[4475]: I1203 06:50:14.002209 4475 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/ba0fab74-0340-4465-bbe1-66169f3c2363-client-ca\") on node \"crc\" DevicePath \"\"" Dec 03 06:50:14 crc kubenswrapper[4475]: I1203 06:50:14.002219 4475 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-crlff\" (UniqueName: \"kubernetes.io/projected/ba0fab74-0340-4465-bbe1-66169f3c2363-kube-api-access-crlff\") on node \"crc\" DevicePath \"\"" Dec 03 06:50:14 crc kubenswrapper[4475]: I1203 06:50:14.002226 4475 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ba0fab74-0340-4465-bbe1-66169f3c2363-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 03 06:50:14 crc kubenswrapper[4475]: I1203 06:50:14.002235 4475 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ba0fab74-0340-4465-bbe1-66169f3c2363-config\") on node \"crc\" DevicePath \"\"" Dec 03 06:50:14 crc kubenswrapper[4475]: I1203 06:50:14.002243 4475 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/ba0fab74-0340-4465-bbe1-66169f3c2363-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Dec 03 06:50:14 crc kubenswrapper[4475]: I1203 06:50:14.002251 4475 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/124cdb66-c71e-4efb-b1bc-6aa1978cdadf-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 03 06:50:14 crc kubenswrapper[4475]: I1203 06:50:14.002258 4475 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/124cdb66-c71e-4efb-b1bc-6aa1978cdadf-client-ca\") on node \"crc\" DevicePath \"\"" Dec 03 06:50:14 crc 
kubenswrapper[4475]: I1203 06:50:14.002265 4475 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/124cdb66-c71e-4efb-b1bc-6aa1978cdadf-config\") on node \"crc\" DevicePath \"\"" Dec 03 06:50:14 crc kubenswrapper[4475]: I1203 06:50:14.002499 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/8f3c7792-82c8-436a-87fa-31c713406593-client-ca\") pod \"route-controller-manager-56c6954dc9-2mg6p\" (UID: \"8f3c7792-82c8-436a-87fa-31c713406593\") " pod="openshift-route-controller-manager/route-controller-manager-56c6954dc9-2mg6p" Dec 03 06:50:14 crc kubenswrapper[4475]: I1203 06:50:14.003196 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8f3c7792-82c8-436a-87fa-31c713406593-config\") pod \"route-controller-manager-56c6954dc9-2mg6p\" (UID: \"8f3c7792-82c8-436a-87fa-31c713406593\") " pod="openshift-route-controller-manager/route-controller-manager-56c6954dc9-2mg6p" Dec 03 06:50:14 crc kubenswrapper[4475]: I1203 06:50:14.013607 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8f3c7792-82c8-436a-87fa-31c713406593-serving-cert\") pod \"route-controller-manager-56c6954dc9-2mg6p\" (UID: \"8f3c7792-82c8-436a-87fa-31c713406593\") " pod="openshift-route-controller-manager/route-controller-manager-56c6954dc9-2mg6p" Dec 03 06:50:14 crc kubenswrapper[4475]: I1203 06:50:14.017495 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-n8drl\" (UniqueName: \"kubernetes.io/projected/8f3c7792-82c8-436a-87fa-31c713406593-kube-api-access-n8drl\") pod \"route-controller-manager-56c6954dc9-2mg6p\" (UID: \"8f3c7792-82c8-436a-87fa-31c713406593\") " pod="openshift-route-controller-manager/route-controller-manager-56c6954dc9-2mg6p" Dec 03 06:50:14 crc kubenswrapper[4475]: I1203 
06:50:14.100985 4475 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-56c6954dc9-2mg6p"
Dec 03 06:50:14 crc kubenswrapper[4475]: I1203 06:50:14.391189 4475 generic.go:334] "Generic (PLEG): container finished" podID="ba0fab74-0340-4465-bbe1-66169f3c2363" containerID="a33c47b313f0fbfc9bcab9b08fbf5b41b175742e9cd21c5053f7d8ea6dd88f86" exitCode=0
Dec 03 06:50:14 crc kubenswrapper[4475]: I1203 06:50:14.391425 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-77dd5df966-2dv7m" event={"ID":"ba0fab74-0340-4465-bbe1-66169f3c2363","Type":"ContainerDied","Data":"a33c47b313f0fbfc9bcab9b08fbf5b41b175742e9cd21c5053f7d8ea6dd88f86"}
Dec 03 06:50:14 crc kubenswrapper[4475]: I1203 06:50:14.391481 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-77dd5df966-2dv7m" event={"ID":"ba0fab74-0340-4465-bbe1-66169f3c2363","Type":"ContainerDied","Data":"fe19d6bf5a8eb72b0a9057599be0047551b965cbd133f55638c95e61acc434b6"}
Dec 03 06:50:14 crc kubenswrapper[4475]: I1203 06:50:14.391500 4475 scope.go:117] "RemoveContainer" containerID="a33c47b313f0fbfc9bcab9b08fbf5b41b175742e9cd21c5053f7d8ea6dd88f86"
Dec 03 06:50:14 crc kubenswrapper[4475]: I1203 06:50:14.391609 4475 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-77dd5df966-2dv7m"
Dec 03 06:50:14 crc kubenswrapper[4475]: I1203 06:50:14.395852 4475 generic.go:334] "Generic (PLEG): container finished" podID="124cdb66-c71e-4efb-b1bc-6aa1978cdadf" containerID="1d1a4232baa78159d42beeecfa595674d63a408aea760884d0616620f3c86387" exitCode=0
Dec 03 06:50:14 crc kubenswrapper[4475]: I1203 06:50:14.395887 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-d6c996dd6-4xw9d" event={"ID":"124cdb66-c71e-4efb-b1bc-6aa1978cdadf","Type":"ContainerDied","Data":"1d1a4232baa78159d42beeecfa595674d63a408aea760884d0616620f3c86387"}
Dec 03 06:50:14 crc kubenswrapper[4475]: I1203 06:50:14.395910 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-d6c996dd6-4xw9d" event={"ID":"124cdb66-c71e-4efb-b1bc-6aa1978cdadf","Type":"ContainerDied","Data":"cc0874fad8c0fdc5fdbaa1710cabf187305992e0d0560a884f5d0cd9e8276c98"}
Dec 03 06:50:14 crc kubenswrapper[4475]: I1203 06:50:14.395961 4475 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-d6c996dd6-4xw9d"
Dec 03 06:50:14 crc kubenswrapper[4475]: I1203 06:50:14.407130 4475 scope.go:117] "RemoveContainer" containerID="a33c47b313f0fbfc9bcab9b08fbf5b41b175742e9cd21c5053f7d8ea6dd88f86"
Dec 03 06:50:14 crc kubenswrapper[4475]: E1203 06:50:14.407426 4475 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a33c47b313f0fbfc9bcab9b08fbf5b41b175742e9cd21c5053f7d8ea6dd88f86\": container with ID starting with a33c47b313f0fbfc9bcab9b08fbf5b41b175742e9cd21c5053f7d8ea6dd88f86 not found: ID does not exist" containerID="a33c47b313f0fbfc9bcab9b08fbf5b41b175742e9cd21c5053f7d8ea6dd88f86"
Dec 03 06:50:14 crc kubenswrapper[4475]: I1203 06:50:14.407481 4475 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a33c47b313f0fbfc9bcab9b08fbf5b41b175742e9cd21c5053f7d8ea6dd88f86"} err="failed to get container status \"a33c47b313f0fbfc9bcab9b08fbf5b41b175742e9cd21c5053f7d8ea6dd88f86\": rpc error: code = NotFound desc = could not find container \"a33c47b313f0fbfc9bcab9b08fbf5b41b175742e9cd21c5053f7d8ea6dd88f86\": container with ID starting with a33c47b313f0fbfc9bcab9b08fbf5b41b175742e9cd21c5053f7d8ea6dd88f86 not found: ID does not exist"
Dec 03 06:50:14 crc kubenswrapper[4475]: I1203 06:50:14.407506 4475 scope.go:117] "RemoveContainer" containerID="1d1a4232baa78159d42beeecfa595674d63a408aea760884d0616620f3c86387"
Dec 03 06:50:14 crc kubenswrapper[4475]: I1203 06:50:14.416154 4475 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-77dd5df966-2dv7m"]
Dec 03 06:50:14 crc kubenswrapper[4475]: I1203 06:50:14.419850 4475 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-controller-manager/controller-manager-77dd5df966-2dv7m"]
Dec 03 06:50:14 crc kubenswrapper[4475]: I1203 06:50:14.422766 4475 scope.go:117] "RemoveContainer" containerID="1d1a4232baa78159d42beeecfa595674d63a408aea760884d0616620f3c86387"
Dec 03 06:50:14 crc kubenswrapper[4475]: E1203 06:50:14.423301 4475 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1d1a4232baa78159d42beeecfa595674d63a408aea760884d0616620f3c86387\": container with ID starting with 1d1a4232baa78159d42beeecfa595674d63a408aea760884d0616620f3c86387 not found: ID does not exist" containerID="1d1a4232baa78159d42beeecfa595674d63a408aea760884d0616620f3c86387"
Dec 03 06:50:14 crc kubenswrapper[4475]: I1203 06:50:14.423327 4475 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1d1a4232baa78159d42beeecfa595674d63a408aea760884d0616620f3c86387"} err="failed to get container status \"1d1a4232baa78159d42beeecfa595674d63a408aea760884d0616620f3c86387\": rpc error: code = NotFound desc = could not find container \"1d1a4232baa78159d42beeecfa595674d63a408aea760884d0616620f3c86387\": container with ID starting with 1d1a4232baa78159d42beeecfa595674d63a408aea760884d0616620f3c86387 not found: ID does not exist"
Dec 03 06:50:14 crc kubenswrapper[4475]: I1203 06:50:14.434403 4475 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-d6c996dd6-4xw9d"]
Dec 03 06:50:14 crc kubenswrapper[4475]: I1203 06:50:14.436799 4475 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-d6c996dd6-4xw9d"]
Dec 03 06:50:14 crc kubenswrapper[4475]: I1203 06:50:14.438999 4475 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-56c6954dc9-2mg6p"]
Dec 03 06:50:15 crc kubenswrapper[4475]: I1203 06:50:15.402067 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-56c6954dc9-2mg6p" event={"ID":"8f3c7792-82c8-436a-87fa-31c713406593","Type":"ContainerStarted","Data":"188edb471d7b8a65132f34e73b47013174671d4c17b9a3c630b8b1e95ed04dc1"}
Dec 03 06:50:15 crc kubenswrapper[4475]: I1203 06:50:15.402249 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-56c6954dc9-2mg6p" event={"ID":"8f3c7792-82c8-436a-87fa-31c713406593","Type":"ContainerStarted","Data":"448d1e1f5dc382ee71d295f47ede606d6a7d604262420939c86a3be7a0f0f684"}
Dec 03 06:50:15 crc kubenswrapper[4475]: I1203 06:50:15.402527 4475 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-56c6954dc9-2mg6p"
Dec 03 06:50:15 crc kubenswrapper[4475]: I1203 06:50:15.407547 4475 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-56c6954dc9-2mg6p"
Dec 03 06:50:15 crc kubenswrapper[4475]: I1203 06:50:15.436803 4475 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-56c6954dc9-2mg6p" podStartSLOduration=4.436770619 podStartE2EDuration="4.436770619s" podCreationTimestamp="2025-12-03 06:50:11 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 06:50:15.416845791 +0000 UTC m=+300.221744126" watchObservedRunningTime="2025-12-03 06:50:15.436770619 +0000 UTC m=+300.241668954"
Dec 03 06:50:15 crc kubenswrapper[4475]: I1203 06:50:15.515860 4475 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="124cdb66-c71e-4efb-b1bc-6aa1978cdadf" path="/var/lib/kubelet/pods/124cdb66-c71e-4efb-b1bc-6aa1978cdadf/volumes"
Dec 03 06:50:15 crc kubenswrapper[4475]: I1203 06:50:15.516643 4475 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ba0fab74-0340-4465-bbe1-66169f3c2363" path="/var/lib/kubelet/pods/ba0fab74-0340-4465-bbe1-66169f3c2363/volumes"
Dec 03 06:50:16 crc kubenswrapper[4475]: I1203 06:50:16.718997 4475 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-6889d7b855-cd685"]
Dec 03 06:50:16 crc kubenswrapper[4475]: E1203 06:50:16.719320 4475 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ba0fab74-0340-4465-bbe1-66169f3c2363" containerName="controller-manager"
Dec 03 06:50:16 crc kubenswrapper[4475]: I1203 06:50:16.719331 4475 state_mem.go:107] "Deleted CPUSet assignment" podUID="ba0fab74-0340-4465-bbe1-66169f3c2363" containerName="controller-manager"
Dec 03 06:50:16 crc kubenswrapper[4475]: I1203 06:50:16.719402 4475 memory_manager.go:354] "RemoveStaleState removing state" podUID="ba0fab74-0340-4465-bbe1-66169f3c2363" containerName="controller-manager"
Dec 03 06:50:16 crc kubenswrapper[4475]: I1203 06:50:16.719763 4475 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-6889d7b855-cd685"
Dec 03 06:50:16 crc kubenswrapper[4475]: I1203 06:50:16.722648 4475 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt"
Dec 03 06:50:16 crc kubenswrapper[4475]: I1203 06:50:16.725229 4475 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c"
Dec 03 06:50:16 crc kubenswrapper[4475]: I1203 06:50:16.725553 4475 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config"
Dec 03 06:50:16 crc kubenswrapper[4475]: I1203 06:50:16.725688 4475 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt"
Dec 03 06:50:16 crc kubenswrapper[4475]: I1203 06:50:16.725978 4475 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca"
Dec 03 06:50:16 crc kubenswrapper[4475]: I1203 06:50:16.726065 4475 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-6889d7b855-cd685"]
Dec 03 06:50:16 crc kubenswrapper[4475]: I1203 06:50:16.726149 4475 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert"
Dec 03 06:50:16 crc kubenswrapper[4475]: I1203 06:50:16.733218 4475 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca"
Dec 03 06:50:16 crc kubenswrapper[4475]: I1203 06:50:16.832508 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v6j86\" (UniqueName: \"kubernetes.io/projected/2d978508-2471-43a7-970a-cd1d28247f6d-kube-api-access-v6j86\") pod \"controller-manager-6889d7b855-cd685\" (UID: \"2d978508-2471-43a7-970a-cd1d28247f6d\") " pod="openshift-controller-manager/controller-manager-6889d7b855-cd685"
Dec 03 06:50:16 crc kubenswrapper[4475]: I1203 06:50:16.832552 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2d978508-2471-43a7-970a-cd1d28247f6d-config\") pod \"controller-manager-6889d7b855-cd685\" (UID: \"2d978508-2471-43a7-970a-cd1d28247f6d\") " pod="openshift-controller-manager/controller-manager-6889d7b855-cd685"
Dec 03 06:50:16 crc kubenswrapper[4475]: I1203 06:50:16.832633 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/2d978508-2471-43a7-970a-cd1d28247f6d-proxy-ca-bundles\") pod \"controller-manager-6889d7b855-cd685\" (UID: \"2d978508-2471-43a7-970a-cd1d28247f6d\") " pod="openshift-controller-manager/controller-manager-6889d7b855-cd685"
Dec 03 06:50:16 crc kubenswrapper[4475]: I1203 06:50:16.832700 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/2d978508-2471-43a7-970a-cd1d28247f6d-serving-cert\") pod \"controller-manager-6889d7b855-cd685\" (UID: \"2d978508-2471-43a7-970a-cd1d28247f6d\") " pod="openshift-controller-manager/controller-manager-6889d7b855-cd685"
Dec 03 06:50:16 crc kubenswrapper[4475]: I1203 06:50:16.832777 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/2d978508-2471-43a7-970a-cd1d28247f6d-client-ca\") pod \"controller-manager-6889d7b855-cd685\" (UID: \"2d978508-2471-43a7-970a-cd1d28247f6d\") " pod="openshift-controller-manager/controller-manager-6889d7b855-cd685"
Dec 03 06:50:16 crc kubenswrapper[4475]: I1203 06:50:16.933742 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/2d978508-2471-43a7-970a-cd1d28247f6d-proxy-ca-bundles\") pod \"controller-manager-6889d7b855-cd685\" (UID: \"2d978508-2471-43a7-970a-cd1d28247f6d\") " pod="openshift-controller-manager/controller-manager-6889d7b855-cd685"
Dec 03 06:50:16 crc kubenswrapper[4475]: I1203 06:50:16.933808 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/2d978508-2471-43a7-970a-cd1d28247f6d-serving-cert\") pod \"controller-manager-6889d7b855-cd685\" (UID: \"2d978508-2471-43a7-970a-cd1d28247f6d\") " pod="openshift-controller-manager/controller-manager-6889d7b855-cd685"
Dec 03 06:50:16 crc kubenswrapper[4475]: I1203 06:50:16.933866 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/2d978508-2471-43a7-970a-cd1d28247f6d-client-ca\") pod \"controller-manager-6889d7b855-cd685\" (UID: \"2d978508-2471-43a7-970a-cd1d28247f6d\") " pod="openshift-controller-manager/controller-manager-6889d7b855-cd685"
Dec 03 06:50:16 crc kubenswrapper[4475]: I1203 06:50:16.934023 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-v6j86\" (UniqueName: \"kubernetes.io/projected/2d978508-2471-43a7-970a-cd1d28247f6d-kube-api-access-v6j86\") pod \"controller-manager-6889d7b855-cd685\" (UID: \"2d978508-2471-43a7-970a-cd1d28247f6d\") " pod="openshift-controller-manager/controller-manager-6889d7b855-cd685"
Dec 03 06:50:16 crc kubenswrapper[4475]: I1203 06:50:16.934059 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2d978508-2471-43a7-970a-cd1d28247f6d-config\") pod \"controller-manager-6889d7b855-cd685\" (UID: \"2d978508-2471-43a7-970a-cd1d28247f6d\") " pod="openshift-controller-manager/controller-manager-6889d7b855-cd685"
Dec 03 06:50:16 crc kubenswrapper[4475]: I1203 06:50:16.934704 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/2d978508-2471-43a7-970a-cd1d28247f6d-client-ca\") pod \"controller-manager-6889d7b855-cd685\" (UID: \"2d978508-2471-43a7-970a-cd1d28247f6d\") " pod="openshift-controller-manager/controller-manager-6889d7b855-cd685"
Dec 03 06:50:16 crc kubenswrapper[4475]: I1203 06:50:16.934741 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/2d978508-2471-43a7-970a-cd1d28247f6d-proxy-ca-bundles\") pod \"controller-manager-6889d7b855-cd685\" (UID: \"2d978508-2471-43a7-970a-cd1d28247f6d\") " pod="openshift-controller-manager/controller-manager-6889d7b855-cd685"
Dec 03 06:50:16 crc kubenswrapper[4475]: I1203 06:50:16.935284 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2d978508-2471-43a7-970a-cd1d28247f6d-config\") pod \"controller-manager-6889d7b855-cd685\" (UID: \"2d978508-2471-43a7-970a-cd1d28247f6d\") " pod="openshift-controller-manager/controller-manager-6889d7b855-cd685"
Dec 03 06:50:16 crc kubenswrapper[4475]: I1203 06:50:16.943005 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/2d978508-2471-43a7-970a-cd1d28247f6d-serving-cert\") pod \"controller-manager-6889d7b855-cd685\" (UID: \"2d978508-2471-43a7-970a-cd1d28247f6d\") " pod="openshift-controller-manager/controller-manager-6889d7b855-cd685"
Dec 03 06:50:16 crc kubenswrapper[4475]: I1203 06:50:16.947001 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-v6j86\" (UniqueName: \"kubernetes.io/projected/2d978508-2471-43a7-970a-cd1d28247f6d-kube-api-access-v6j86\") pod \"controller-manager-6889d7b855-cd685\" (UID: \"2d978508-2471-43a7-970a-cd1d28247f6d\") " pod="openshift-controller-manager/controller-manager-6889d7b855-cd685"
Dec 03 06:50:17 crc kubenswrapper[4475]: I1203 06:50:17.035820 4475 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-6889d7b855-cd685"
Dec 03 06:50:17 crc kubenswrapper[4475]: I1203 06:50:17.379528 4475 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-6889d7b855-cd685"]
Dec 03 06:50:17 crc kubenswrapper[4475]: W1203 06:50:17.387138 4475 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2d978508_2471_43a7_970a_cd1d28247f6d.slice/crio-c7ac52ef513c522e63ab5459e853526df001f64f8e7daff9edcb8e6cdf72e3c0 WatchSource:0}: Error finding container c7ac52ef513c522e63ab5459e853526df001f64f8e7daff9edcb8e6cdf72e3c0: Status 404 returned error can't find the container with id c7ac52ef513c522e63ab5459e853526df001f64f8e7daff9edcb8e6cdf72e3c0
Dec 03 06:50:17 crc kubenswrapper[4475]: I1203 06:50:17.411261 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-6889d7b855-cd685" event={"ID":"2d978508-2471-43a7-970a-cd1d28247f6d","Type":"ContainerStarted","Data":"c7ac52ef513c522e63ab5459e853526df001f64f8e7daff9edcb8e6cdf72e3c0"}
Dec 03 06:50:18 crc kubenswrapper[4475]: I1203 06:50:18.423704 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-6889d7b855-cd685" event={"ID":"2d978508-2471-43a7-970a-cd1d28247f6d","Type":"ContainerStarted","Data":"1974612b37831790876a76c3fad8c5e6bf561b921fa62b8dc7253a9016aa92c5"}
Dec 03 06:50:18 crc kubenswrapper[4475]: I1203 06:50:18.424075 4475 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-6889d7b855-cd685"
Dec 03 06:50:18 crc kubenswrapper[4475]: I1203 06:50:18.428903 4475 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-6889d7b855-cd685"
Dec 03 06:50:18 crc kubenswrapper[4475]: I1203 06:50:18.443668 4475 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-6889d7b855-cd685" podStartSLOduration=7.443659443 podStartE2EDuration="7.443659443s" podCreationTimestamp="2025-12-03 06:50:11 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 06:50:18.442590418 +0000 UTC m=+303.247488762" watchObservedRunningTime="2025-12-03 06:50:18.443659443 +0000 UTC m=+303.248557777"
Dec 03 06:50:27 crc kubenswrapper[4475]: I1203 06:50:27.322156 4475 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-6889d7b855-cd685"]
Dec 03 06:50:27 crc kubenswrapper[4475]: I1203 06:50:27.323031 4475 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-controller-manager/controller-manager-6889d7b855-cd685" podUID="2d978508-2471-43a7-970a-cd1d28247f6d" containerName="controller-manager" containerID="cri-o://1974612b37831790876a76c3fad8c5e6bf561b921fa62b8dc7253a9016aa92c5" gracePeriod=30
Dec 03 06:50:27 crc kubenswrapper[4475]: I1203 06:50:27.474476 4475 generic.go:334] "Generic (PLEG): container finished" podID="2d978508-2471-43a7-970a-cd1d28247f6d" containerID="1974612b37831790876a76c3fad8c5e6bf561b921fa62b8dc7253a9016aa92c5" exitCode=0
Dec 03 06:50:27 crc kubenswrapper[4475]: I1203 06:50:27.474518 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-6889d7b855-cd685" event={"ID":"2d978508-2471-43a7-970a-cd1d28247f6d","Type":"ContainerDied","Data":"1974612b37831790876a76c3fad8c5e6bf561b921fa62b8dc7253a9016aa92c5"}
Dec 03 06:50:27 crc kubenswrapper[4475]: I1203 06:50:27.765499 4475 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-6889d7b855-cd685"
Dec 03 06:50:27 crc kubenswrapper[4475]: I1203 06:50:27.876949 4475 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2d978508-2471-43a7-970a-cd1d28247f6d-config\") pod \"2d978508-2471-43a7-970a-cd1d28247f6d\" (UID: \"2d978508-2471-43a7-970a-cd1d28247f6d\") "
Dec 03 06:50:27 crc kubenswrapper[4475]: I1203 06:50:27.876989 4475 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/2d978508-2471-43a7-970a-cd1d28247f6d-proxy-ca-bundles\") pod \"2d978508-2471-43a7-970a-cd1d28247f6d\" (UID: \"2d978508-2471-43a7-970a-cd1d28247f6d\") "
Dec 03 06:50:27 crc kubenswrapper[4475]: I1203 06:50:27.877087 4475 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-v6j86\" (UniqueName: \"kubernetes.io/projected/2d978508-2471-43a7-970a-cd1d28247f6d-kube-api-access-v6j86\") pod \"2d978508-2471-43a7-970a-cd1d28247f6d\" (UID: \"2d978508-2471-43a7-970a-cd1d28247f6d\") "
Dec 03 06:50:27 crc kubenswrapper[4475]: I1203 06:50:27.877784 4475 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2d978508-2471-43a7-970a-cd1d28247f6d-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "2d978508-2471-43a7-970a-cd1d28247f6d" (UID: "2d978508-2471-43a7-970a-cd1d28247f6d"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Dec 03 06:50:27 crc kubenswrapper[4475]: I1203 06:50:27.877852 4475 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2d978508-2471-43a7-970a-cd1d28247f6d-config" (OuterVolumeSpecName: "config") pod "2d978508-2471-43a7-970a-cd1d28247f6d" (UID: "2d978508-2471-43a7-970a-cd1d28247f6d"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Dec 03 06:50:27 crc kubenswrapper[4475]: I1203 06:50:27.878185 4475 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/2d978508-2471-43a7-970a-cd1d28247f6d-serving-cert\") pod \"2d978508-2471-43a7-970a-cd1d28247f6d\" (UID: \"2d978508-2471-43a7-970a-cd1d28247f6d\") "
Dec 03 06:50:27 crc kubenswrapper[4475]: I1203 06:50:27.878236 4475 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/2d978508-2471-43a7-970a-cd1d28247f6d-client-ca\") pod \"2d978508-2471-43a7-970a-cd1d28247f6d\" (UID: \"2d978508-2471-43a7-970a-cd1d28247f6d\") "
Dec 03 06:50:27 crc kubenswrapper[4475]: I1203 06:50:27.878469 4475 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2d978508-2471-43a7-970a-cd1d28247f6d-config\") on node \"crc\" DevicePath \"\""
Dec 03 06:50:27 crc kubenswrapper[4475]: I1203 06:50:27.878481 4475 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/2d978508-2471-43a7-970a-cd1d28247f6d-proxy-ca-bundles\") on node \"crc\" DevicePath \"\""
Dec 03 06:50:27 crc kubenswrapper[4475]: I1203 06:50:27.878666 4475 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2d978508-2471-43a7-970a-cd1d28247f6d-client-ca" (OuterVolumeSpecName: "client-ca") pod "2d978508-2471-43a7-970a-cd1d28247f6d" (UID: "2d978508-2471-43a7-970a-cd1d28247f6d"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Dec 03 06:50:27 crc kubenswrapper[4475]: I1203 06:50:27.883363 4475 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2d978508-2471-43a7-970a-cd1d28247f6d-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "2d978508-2471-43a7-970a-cd1d28247f6d" (UID: "2d978508-2471-43a7-970a-cd1d28247f6d"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 03 06:50:27 crc kubenswrapper[4475]: I1203 06:50:27.884222 4475 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2d978508-2471-43a7-970a-cd1d28247f6d-kube-api-access-v6j86" (OuterVolumeSpecName: "kube-api-access-v6j86") pod "2d978508-2471-43a7-970a-cd1d28247f6d" (UID: "2d978508-2471-43a7-970a-cd1d28247f6d"). InnerVolumeSpecName "kube-api-access-v6j86". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 03 06:50:27 crc kubenswrapper[4475]: I1203 06:50:27.979531 4475 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-v6j86\" (UniqueName: \"kubernetes.io/projected/2d978508-2471-43a7-970a-cd1d28247f6d-kube-api-access-v6j86\") on node \"crc\" DevicePath \"\""
Dec 03 06:50:27 crc kubenswrapper[4475]: I1203 06:50:27.979560 4475 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/2d978508-2471-43a7-970a-cd1d28247f6d-serving-cert\") on node \"crc\" DevicePath \"\""
Dec 03 06:50:27 crc kubenswrapper[4475]: I1203 06:50:27.979572 4475 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/2d978508-2471-43a7-970a-cd1d28247f6d-client-ca\") on node \"crc\" DevicePath \"\""
Dec 03 06:50:28 crc kubenswrapper[4475]: I1203 06:50:28.489287 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-6889d7b855-cd685" event={"ID":"2d978508-2471-43a7-970a-cd1d28247f6d","Type":"ContainerDied","Data":"c7ac52ef513c522e63ab5459e853526df001f64f8e7daff9edcb8e6cdf72e3c0"}
Dec 03 06:50:28 crc kubenswrapper[4475]: I1203 06:50:28.489359 4475 scope.go:117] "RemoveContainer" containerID="1974612b37831790876a76c3fad8c5e6bf561b921fa62b8dc7253a9016aa92c5"
Dec 03 06:50:28 crc kubenswrapper[4475]: I1203 06:50:28.489440 4475 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-6889d7b855-cd685"
Dec 03 06:50:28 crc kubenswrapper[4475]: I1203 06:50:28.517193 4475 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-6889d7b855-cd685"]
Dec 03 06:50:28 crc kubenswrapper[4475]: I1203 06:50:28.522055 4475 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-controller-manager/controller-manager-6889d7b855-cd685"]
Dec 03 06:50:28 crc kubenswrapper[4475]: I1203 06:50:28.730043 4475 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-7c745cfb66-j2lbc"]
Dec 03 06:50:28 crc kubenswrapper[4475]: E1203 06:50:28.730336 4475 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2d978508-2471-43a7-970a-cd1d28247f6d" containerName="controller-manager"
Dec 03 06:50:28 crc kubenswrapper[4475]: I1203 06:50:28.730355 4475 state_mem.go:107] "Deleted CPUSet assignment" podUID="2d978508-2471-43a7-970a-cd1d28247f6d" containerName="controller-manager"
Dec 03 06:50:28 crc kubenswrapper[4475]: I1203 06:50:28.730495 4475 memory_manager.go:354] "RemoveStaleState removing state" podUID="2d978508-2471-43a7-970a-cd1d28247f6d" containerName="controller-manager"
Dec 03 06:50:28 crc kubenswrapper[4475]: I1203 06:50:28.730988 4475 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-7c745cfb66-j2lbc"
Dec 03 06:50:28 crc kubenswrapper[4475]: I1203 06:50:28.735791 4475 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config"
Dec 03 06:50:28 crc kubenswrapper[4475]: I1203 06:50:28.736233 4475 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca"
Dec 03 06:50:28 crc kubenswrapper[4475]: I1203 06:50:28.736400 4475 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c"
Dec 03 06:50:28 crc kubenswrapper[4475]: I1203 06:50:28.736762 4475 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt"
Dec 03 06:50:28 crc kubenswrapper[4475]: I1203 06:50:28.737033 4475 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt"
Dec 03 06:50:28 crc kubenswrapper[4475]: I1203 06:50:28.737134 4475 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert"
Dec 03 06:50:28 crc kubenswrapper[4475]: I1203 06:50:28.737266 4475 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-7c745cfb66-j2lbc"]
Dec 03 06:50:28 crc kubenswrapper[4475]: I1203 06:50:28.740439 4475 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca"
Dec 03 06:50:28 crc kubenswrapper[4475]: I1203 06:50:28.791178 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/cfcfabff-c5ef-4146-9815-4562348e5afd-config\") pod \"controller-manager-7c745cfb66-j2lbc\" (UID: \"cfcfabff-c5ef-4146-9815-4562348e5afd\") " pod="openshift-controller-manager/controller-manager-7c745cfb66-j2lbc"
Dec 03 06:50:28 crc kubenswrapper[4475]: I1203 06:50:28.791588 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vm82n\" (UniqueName: \"kubernetes.io/projected/cfcfabff-c5ef-4146-9815-4562348e5afd-kube-api-access-vm82n\") pod \"controller-manager-7c745cfb66-j2lbc\" (UID: \"cfcfabff-c5ef-4146-9815-4562348e5afd\") " pod="openshift-controller-manager/controller-manager-7c745cfb66-j2lbc"
Dec 03 06:50:28 crc kubenswrapper[4475]: I1203 06:50:28.791687 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/cfcfabff-c5ef-4146-9815-4562348e5afd-client-ca\") pod \"controller-manager-7c745cfb66-j2lbc\" (UID: \"cfcfabff-c5ef-4146-9815-4562348e5afd\") " pod="openshift-controller-manager/controller-manager-7c745cfb66-j2lbc"
Dec 03 06:50:28 crc kubenswrapper[4475]: I1203 06:50:28.791762 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/cfcfabff-c5ef-4146-9815-4562348e5afd-serving-cert\") pod \"controller-manager-7c745cfb66-j2lbc\" (UID: \"cfcfabff-c5ef-4146-9815-4562348e5afd\") " pod="openshift-controller-manager/controller-manager-7c745cfb66-j2lbc"
Dec 03 06:50:28 crc kubenswrapper[4475]: I1203 06:50:28.791799 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/cfcfabff-c5ef-4146-9815-4562348e5afd-proxy-ca-bundles\") pod \"controller-manager-7c745cfb66-j2lbc\" (UID: \"cfcfabff-c5ef-4146-9815-4562348e5afd\") " pod="openshift-controller-manager/controller-manager-7c745cfb66-j2lbc"
Dec 03 06:50:28 crc kubenswrapper[4475]: I1203 06:50:28.892815 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/cfcfabff-c5ef-4146-9815-4562348e5afd-client-ca\") pod \"controller-manager-7c745cfb66-j2lbc\" (UID: \"cfcfabff-c5ef-4146-9815-4562348e5afd\") " pod="openshift-controller-manager/controller-manager-7c745cfb66-j2lbc"
Dec 03 06:50:28 crc kubenswrapper[4475]: I1203 06:50:28.892857 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/cfcfabff-c5ef-4146-9815-4562348e5afd-serving-cert\") pod \"controller-manager-7c745cfb66-j2lbc\" (UID: \"cfcfabff-c5ef-4146-9815-4562348e5afd\") " pod="openshift-controller-manager/controller-manager-7c745cfb66-j2lbc"
Dec 03 06:50:28 crc kubenswrapper[4475]: I1203 06:50:28.892875 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/cfcfabff-c5ef-4146-9815-4562348e5afd-proxy-ca-bundles\") pod \"controller-manager-7c745cfb66-j2lbc\" (UID: \"cfcfabff-c5ef-4146-9815-4562348e5afd\") " pod="openshift-controller-manager/controller-manager-7c745cfb66-j2lbc"
Dec 03 06:50:28 crc kubenswrapper[4475]: I1203 06:50:28.892937 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/cfcfabff-c5ef-4146-9815-4562348e5afd-config\") pod \"controller-manager-7c745cfb66-j2lbc\" (UID: \"cfcfabff-c5ef-4146-9815-4562348e5afd\") " pod="openshift-controller-manager/controller-manager-7c745cfb66-j2lbc"
Dec 03 06:50:28 crc kubenswrapper[4475]: I1203 06:50:28.892992 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vm82n\" (UniqueName: \"kubernetes.io/projected/cfcfabff-c5ef-4146-9815-4562348e5afd-kube-api-access-vm82n\") pod \"controller-manager-7c745cfb66-j2lbc\" (UID: \"cfcfabff-c5ef-4146-9815-4562348e5afd\") " pod="openshift-controller-manager/controller-manager-7c745cfb66-j2lbc"
Dec 03 06:50:28 crc kubenswrapper[4475]: I1203 06:50:28.894092 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/cfcfabff-c5ef-4146-9815-4562348e5afd-client-ca\") pod \"controller-manager-7c745cfb66-j2lbc\" (UID: \"cfcfabff-c5ef-4146-9815-4562348e5afd\") " pod="openshift-controller-manager/controller-manager-7c745cfb66-j2lbc"
Dec 03 06:50:28 crc kubenswrapper[4475]: I1203 06:50:28.894258 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/cfcfabff-c5ef-4146-9815-4562348e5afd-proxy-ca-bundles\") pod \"controller-manager-7c745cfb66-j2lbc\" (UID: \"cfcfabff-c5ef-4146-9815-4562348e5afd\") " pod="openshift-controller-manager/controller-manager-7c745cfb66-j2lbc"
Dec 03 06:50:28 crc kubenswrapper[4475]: I1203 06:50:28.894514 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/cfcfabff-c5ef-4146-9815-4562348e5afd-config\") pod \"controller-manager-7c745cfb66-j2lbc\" (UID: \"cfcfabff-c5ef-4146-9815-4562348e5afd\") " pod="openshift-controller-manager/controller-manager-7c745cfb66-j2lbc"
Dec 03 06:50:28 crc kubenswrapper[4475]: I1203 06:50:28.899616 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/cfcfabff-c5ef-4146-9815-4562348e5afd-serving-cert\") pod \"controller-manager-7c745cfb66-j2lbc\" (UID: \"cfcfabff-c5ef-4146-9815-4562348e5afd\") " pod="openshift-controller-manager/controller-manager-7c745cfb66-j2lbc"
Dec 03 06:50:28 crc kubenswrapper[4475]: I1203 06:50:28.907033 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vm82n\" (UniqueName: \"kubernetes.io/projected/cfcfabff-c5ef-4146-9815-4562348e5afd-kube-api-access-vm82n\") pod \"controller-manager-7c745cfb66-j2lbc\" (UID: \"cfcfabff-c5ef-4146-9815-4562348e5afd\") " pod="openshift-controller-manager/controller-manager-7c745cfb66-j2lbc"
Dec 03 
06:50:29 crc kubenswrapper[4475]: I1203 06:50:29.044404 4475 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-7c745cfb66-j2lbc" Dec 03 06:50:29 crc kubenswrapper[4475]: I1203 06:50:29.384056 4475 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-7c745cfb66-j2lbc"] Dec 03 06:50:29 crc kubenswrapper[4475]: I1203 06:50:29.497109 4475 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2d978508-2471-43a7-970a-cd1d28247f6d" path="/var/lib/kubelet/pods/2d978508-2471-43a7-970a-cd1d28247f6d/volumes" Dec 03 06:50:29 crc kubenswrapper[4475]: I1203 06:50:29.499634 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-7c745cfb66-j2lbc" event={"ID":"cfcfabff-c5ef-4146-9815-4562348e5afd","Type":"ContainerStarted","Data":"e590d8dffc831bff120f0e0a1eca2d22fe2cf7694e0a9578a9e853a791cbb78b"} Dec 03 06:50:29 crc kubenswrapper[4475]: I1203 06:50:29.499689 4475 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-7c745cfb66-j2lbc" Dec 03 06:50:29 crc kubenswrapper[4475]: I1203 06:50:29.499701 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-7c745cfb66-j2lbc" event={"ID":"cfcfabff-c5ef-4146-9815-4562348e5afd","Type":"ContainerStarted","Data":"f28f98163fe35c024a65624bf485c6af780c36aae02c6ad30f85d17a8cd7c81d"} Dec 03 06:50:29 crc kubenswrapper[4475]: I1203 06:50:29.503167 4475 patch_prober.go:28] interesting pod/controller-manager-7c745cfb66-j2lbc container/controller-manager namespace/openshift-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.67:8443/healthz\": dial tcp 10.217.0.67:8443: connect: connection refused" start-of-body= Dec 03 06:50:29 crc kubenswrapper[4475]: I1203 06:50:29.503235 4475 prober.go:107] "Probe failed" 
probeType="Readiness" pod="openshift-controller-manager/controller-manager-7c745cfb66-j2lbc" podUID="cfcfabff-c5ef-4146-9815-4562348e5afd" containerName="controller-manager" probeResult="failure" output="Get \"https://10.217.0.67:8443/healthz\": dial tcp 10.217.0.67:8443: connect: connection refused" Dec 03 06:50:29 crc kubenswrapper[4475]: I1203 06:50:29.517018 4475 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-7c745cfb66-j2lbc" podStartSLOduration=2.516980894 podStartE2EDuration="2.516980894s" podCreationTimestamp="2025-12-03 06:50:27 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 06:50:29.512352201 +0000 UTC m=+314.317250535" watchObservedRunningTime="2025-12-03 06:50:29.516980894 +0000 UTC m=+314.321879228" Dec 03 06:50:30 crc kubenswrapper[4475]: I1203 06:50:30.510381 4475 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-7c745cfb66-j2lbc" Dec 03 06:50:34 crc kubenswrapper[4475]: I1203 06:50:34.954226 4475 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-66df7c8f76-p5kv9"] Dec 03 06:50:34 crc kubenswrapper[4475]: I1203 06:50:34.955176 4475 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-66df7c8f76-p5kv9" Dec 03 06:50:34 crc kubenswrapper[4475]: I1203 06:50:34.964849 4475 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-66df7c8f76-p5kv9"] Dec 03 06:50:35 crc kubenswrapper[4475]: I1203 06:50:35.156780 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/8775c52a-119c-4f96-9d7b-1269edf83772-ca-trust-extracted\") pod \"image-registry-66df7c8f76-p5kv9\" (UID: \"8775c52a-119c-4f96-9d7b-1269edf83772\") " pod="openshift-image-registry/image-registry-66df7c8f76-p5kv9" Dec 03 06:50:35 crc kubenswrapper[4475]: I1203 06:50:35.156836 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-66df7c8f76-p5kv9\" (UID: \"8775c52a-119c-4f96-9d7b-1269edf83772\") " pod="openshift-image-registry/image-registry-66df7c8f76-p5kv9" Dec 03 06:50:35 crc kubenswrapper[4475]: I1203 06:50:35.156868 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pqkpk\" (UniqueName: \"kubernetes.io/projected/8775c52a-119c-4f96-9d7b-1269edf83772-kube-api-access-pqkpk\") pod \"image-registry-66df7c8f76-p5kv9\" (UID: \"8775c52a-119c-4f96-9d7b-1269edf83772\") " pod="openshift-image-registry/image-registry-66df7c8f76-p5kv9" Dec 03 06:50:35 crc kubenswrapper[4475]: I1203 06:50:35.156904 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/8775c52a-119c-4f96-9d7b-1269edf83772-bound-sa-token\") pod \"image-registry-66df7c8f76-p5kv9\" (UID: \"8775c52a-119c-4f96-9d7b-1269edf83772\") " 
pod="openshift-image-registry/image-registry-66df7c8f76-p5kv9" Dec 03 06:50:35 crc kubenswrapper[4475]: I1203 06:50:35.156944 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/8775c52a-119c-4f96-9d7b-1269edf83772-registry-certificates\") pod \"image-registry-66df7c8f76-p5kv9\" (UID: \"8775c52a-119c-4f96-9d7b-1269edf83772\") " pod="openshift-image-registry/image-registry-66df7c8f76-p5kv9" Dec 03 06:50:35 crc kubenswrapper[4475]: I1203 06:50:35.156959 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/8775c52a-119c-4f96-9d7b-1269edf83772-trusted-ca\") pod \"image-registry-66df7c8f76-p5kv9\" (UID: \"8775c52a-119c-4f96-9d7b-1269edf83772\") " pod="openshift-image-registry/image-registry-66df7c8f76-p5kv9" Dec 03 06:50:35 crc kubenswrapper[4475]: I1203 06:50:35.157082 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/8775c52a-119c-4f96-9d7b-1269edf83772-registry-tls\") pod \"image-registry-66df7c8f76-p5kv9\" (UID: \"8775c52a-119c-4f96-9d7b-1269edf83772\") " pod="openshift-image-registry/image-registry-66df7c8f76-p5kv9" Dec 03 06:50:35 crc kubenswrapper[4475]: I1203 06:50:35.157120 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/8775c52a-119c-4f96-9d7b-1269edf83772-installation-pull-secrets\") pod \"image-registry-66df7c8f76-p5kv9\" (UID: \"8775c52a-119c-4f96-9d7b-1269edf83772\") " pod="openshift-image-registry/image-registry-66df7c8f76-p5kv9" Dec 03 06:50:35 crc kubenswrapper[4475]: I1203 06:50:35.173255 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: 
\"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-66df7c8f76-p5kv9\" (UID: \"8775c52a-119c-4f96-9d7b-1269edf83772\") " pod="openshift-image-registry/image-registry-66df7c8f76-p5kv9" Dec 03 06:50:35 crc kubenswrapper[4475]: I1203 06:50:35.258784 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/8775c52a-119c-4f96-9d7b-1269edf83772-registry-certificates\") pod \"image-registry-66df7c8f76-p5kv9\" (UID: \"8775c52a-119c-4f96-9d7b-1269edf83772\") " pod="openshift-image-registry/image-registry-66df7c8f76-p5kv9" Dec 03 06:50:35 crc kubenswrapper[4475]: I1203 06:50:35.258818 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/8775c52a-119c-4f96-9d7b-1269edf83772-trusted-ca\") pod \"image-registry-66df7c8f76-p5kv9\" (UID: \"8775c52a-119c-4f96-9d7b-1269edf83772\") " pod="openshift-image-registry/image-registry-66df7c8f76-p5kv9" Dec 03 06:50:35 crc kubenswrapper[4475]: I1203 06:50:35.259497 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/8775c52a-119c-4f96-9d7b-1269edf83772-registry-tls\") pod \"image-registry-66df7c8f76-p5kv9\" (UID: \"8775c52a-119c-4f96-9d7b-1269edf83772\") " pod="openshift-image-registry/image-registry-66df7c8f76-p5kv9" Dec 03 06:50:35 crc kubenswrapper[4475]: I1203 06:50:35.259537 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/8775c52a-119c-4f96-9d7b-1269edf83772-installation-pull-secrets\") pod \"image-registry-66df7c8f76-p5kv9\" (UID: \"8775c52a-119c-4f96-9d7b-1269edf83772\") " pod="openshift-image-registry/image-registry-66df7c8f76-p5kv9" Dec 03 06:50:35 crc kubenswrapper[4475]: I1203 06:50:35.259618 4475 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/8775c52a-119c-4f96-9d7b-1269edf83772-ca-trust-extracted\") pod \"image-registry-66df7c8f76-p5kv9\" (UID: \"8775c52a-119c-4f96-9d7b-1269edf83772\") " pod="openshift-image-registry/image-registry-66df7c8f76-p5kv9" Dec 03 06:50:35 crc kubenswrapper[4475]: I1203 06:50:35.259665 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pqkpk\" (UniqueName: \"kubernetes.io/projected/8775c52a-119c-4f96-9d7b-1269edf83772-kube-api-access-pqkpk\") pod \"image-registry-66df7c8f76-p5kv9\" (UID: \"8775c52a-119c-4f96-9d7b-1269edf83772\") " pod="openshift-image-registry/image-registry-66df7c8f76-p5kv9" Dec 03 06:50:35 crc kubenswrapper[4475]: I1203 06:50:35.259715 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/8775c52a-119c-4f96-9d7b-1269edf83772-bound-sa-token\") pod \"image-registry-66df7c8f76-p5kv9\" (UID: \"8775c52a-119c-4f96-9d7b-1269edf83772\") " pod="openshift-image-registry/image-registry-66df7c8f76-p5kv9" Dec 03 06:50:35 crc kubenswrapper[4475]: I1203 06:50:35.259952 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/8775c52a-119c-4f96-9d7b-1269edf83772-registry-certificates\") pod \"image-registry-66df7c8f76-p5kv9\" (UID: \"8775c52a-119c-4f96-9d7b-1269edf83772\") " pod="openshift-image-registry/image-registry-66df7c8f76-p5kv9" Dec 03 06:50:35 crc kubenswrapper[4475]: I1203 06:50:35.260033 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/8775c52a-119c-4f96-9d7b-1269edf83772-trusted-ca\") pod \"image-registry-66df7c8f76-p5kv9\" (UID: \"8775c52a-119c-4f96-9d7b-1269edf83772\") " pod="openshift-image-registry/image-registry-66df7c8f76-p5kv9" Dec 
03 06:50:35 crc kubenswrapper[4475]: I1203 06:50:35.260226 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/8775c52a-119c-4f96-9d7b-1269edf83772-ca-trust-extracted\") pod \"image-registry-66df7c8f76-p5kv9\" (UID: \"8775c52a-119c-4f96-9d7b-1269edf83772\") " pod="openshift-image-registry/image-registry-66df7c8f76-p5kv9" Dec 03 06:50:35 crc kubenswrapper[4475]: I1203 06:50:35.263558 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/8775c52a-119c-4f96-9d7b-1269edf83772-installation-pull-secrets\") pod \"image-registry-66df7c8f76-p5kv9\" (UID: \"8775c52a-119c-4f96-9d7b-1269edf83772\") " pod="openshift-image-registry/image-registry-66df7c8f76-p5kv9" Dec 03 06:50:35 crc kubenswrapper[4475]: I1203 06:50:35.268924 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/8775c52a-119c-4f96-9d7b-1269edf83772-registry-tls\") pod \"image-registry-66df7c8f76-p5kv9\" (UID: \"8775c52a-119c-4f96-9d7b-1269edf83772\") " pod="openshift-image-registry/image-registry-66df7c8f76-p5kv9" Dec 03 06:50:35 crc kubenswrapper[4475]: I1203 06:50:35.275836 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pqkpk\" (UniqueName: \"kubernetes.io/projected/8775c52a-119c-4f96-9d7b-1269edf83772-kube-api-access-pqkpk\") pod \"image-registry-66df7c8f76-p5kv9\" (UID: \"8775c52a-119c-4f96-9d7b-1269edf83772\") " pod="openshift-image-registry/image-registry-66df7c8f76-p5kv9" Dec 03 06:50:35 crc kubenswrapper[4475]: I1203 06:50:35.281513 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/8775c52a-119c-4f96-9d7b-1269edf83772-bound-sa-token\") pod \"image-registry-66df7c8f76-p5kv9\" (UID: \"8775c52a-119c-4f96-9d7b-1269edf83772\") " 
pod="openshift-image-registry/image-registry-66df7c8f76-p5kv9" Dec 03 06:50:35 crc kubenswrapper[4475]: I1203 06:50:35.567382 4475 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-66df7c8f76-p5kv9" Dec 03 06:50:35 crc kubenswrapper[4475]: I1203 06:50:35.917443 4475 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-66df7c8f76-p5kv9"] Dec 03 06:50:35 crc kubenswrapper[4475]: W1203 06:50:35.922686 4475 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8775c52a_119c_4f96_9d7b_1269edf83772.slice/crio-64c67cee8439b936fde7fe62e0858a996a25a0bfbf77f4132360c52355b7e0dc WatchSource:0}: Error finding container 64c67cee8439b936fde7fe62e0858a996a25a0bfbf77f4132360c52355b7e0dc: Status 404 returned error can't find the container with id 64c67cee8439b936fde7fe62e0858a996a25a0bfbf77f4132360c52355b7e0dc Dec 03 06:50:36 crc kubenswrapper[4475]: I1203 06:50:36.527943 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-66df7c8f76-p5kv9" event={"ID":"8775c52a-119c-4f96-9d7b-1269edf83772","Type":"ContainerStarted","Data":"16f7f253409d1c3d8592190ab72565cddc7a6b13ad4fc52bcee1ae2c5289effd"} Dec 03 06:50:36 crc kubenswrapper[4475]: I1203 06:50:36.527993 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-66df7c8f76-p5kv9" event={"ID":"8775c52a-119c-4f96-9d7b-1269edf83772","Type":"ContainerStarted","Data":"64c67cee8439b936fde7fe62e0858a996a25a0bfbf77f4132360c52355b7e0dc"} Dec 03 06:50:36 crc kubenswrapper[4475]: I1203 06:50:36.528019 4475 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-image-registry/image-registry-66df7c8f76-p5kv9" Dec 03 06:50:36 crc kubenswrapper[4475]: I1203 06:50:36.540821 4475 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openshift-image-registry/image-registry-66df7c8f76-p5kv9" podStartSLOduration=2.540803747 podStartE2EDuration="2.540803747s" podCreationTimestamp="2025-12-03 06:50:34 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 06:50:36.540414427 +0000 UTC m=+321.345312761" watchObservedRunningTime="2025-12-03 06:50:36.540803747 +0000 UTC m=+321.345702081" Dec 03 06:50:47 crc kubenswrapper[4475]: I1203 06:50:47.317597 4475 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-56c6954dc9-2mg6p"] Dec 03 06:50:47 crc kubenswrapper[4475]: I1203 06:50:47.318089 4475 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-route-controller-manager/route-controller-manager-56c6954dc9-2mg6p" podUID="8f3c7792-82c8-436a-87fa-31c713406593" containerName="route-controller-manager" containerID="cri-o://188edb471d7b8a65132f34e73b47013174671d4c17b9a3c630b8b1e95ed04dc1" gracePeriod=30 Dec 03 06:50:47 crc kubenswrapper[4475]: I1203 06:50:47.572500 4475 generic.go:334] "Generic (PLEG): container finished" podID="8f3c7792-82c8-436a-87fa-31c713406593" containerID="188edb471d7b8a65132f34e73b47013174671d4c17b9a3c630b8b1e95ed04dc1" exitCode=0 Dec 03 06:50:47 crc kubenswrapper[4475]: I1203 06:50:47.572632 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-56c6954dc9-2mg6p" event={"ID":"8f3c7792-82c8-436a-87fa-31c713406593","Type":"ContainerDied","Data":"188edb471d7b8a65132f34e73b47013174671d4c17b9a3c630b8b1e95ed04dc1"} Dec 03 06:50:47 crc kubenswrapper[4475]: I1203 06:50:47.685261 4475 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-56c6954dc9-2mg6p" Dec 03 06:50:47 crc kubenswrapper[4475]: I1203 06:50:47.785246 4475 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-n8drl\" (UniqueName: \"kubernetes.io/projected/8f3c7792-82c8-436a-87fa-31c713406593-kube-api-access-n8drl\") pod \"8f3c7792-82c8-436a-87fa-31c713406593\" (UID: \"8f3c7792-82c8-436a-87fa-31c713406593\") " Dec 03 06:50:47 crc kubenswrapper[4475]: I1203 06:50:47.785307 4475 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8f3c7792-82c8-436a-87fa-31c713406593-config\") pod \"8f3c7792-82c8-436a-87fa-31c713406593\" (UID: \"8f3c7792-82c8-436a-87fa-31c713406593\") " Dec 03 06:50:47 crc kubenswrapper[4475]: I1203 06:50:47.785358 4475 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8f3c7792-82c8-436a-87fa-31c713406593-serving-cert\") pod \"8f3c7792-82c8-436a-87fa-31c713406593\" (UID: \"8f3c7792-82c8-436a-87fa-31c713406593\") " Dec 03 06:50:47 crc kubenswrapper[4475]: I1203 06:50:47.785426 4475 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/8f3c7792-82c8-436a-87fa-31c713406593-client-ca\") pod \"8f3c7792-82c8-436a-87fa-31c713406593\" (UID: \"8f3c7792-82c8-436a-87fa-31c713406593\") " Dec 03 06:50:47 crc kubenswrapper[4475]: I1203 06:50:47.786020 4475 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8f3c7792-82c8-436a-87fa-31c713406593-config" (OuterVolumeSpecName: "config") pod "8f3c7792-82c8-436a-87fa-31c713406593" (UID: "8f3c7792-82c8-436a-87fa-31c713406593"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 06:50:47 crc kubenswrapper[4475]: I1203 06:50:47.786031 4475 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8f3c7792-82c8-436a-87fa-31c713406593-client-ca" (OuterVolumeSpecName: "client-ca") pod "8f3c7792-82c8-436a-87fa-31c713406593" (UID: "8f3c7792-82c8-436a-87fa-31c713406593"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 06:50:47 crc kubenswrapper[4475]: I1203 06:50:47.789791 4475 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8f3c7792-82c8-436a-87fa-31c713406593-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "8f3c7792-82c8-436a-87fa-31c713406593" (UID: "8f3c7792-82c8-436a-87fa-31c713406593"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 06:50:47 crc kubenswrapper[4475]: I1203 06:50:47.789909 4475 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f3c7792-82c8-436a-87fa-31c713406593-kube-api-access-n8drl" (OuterVolumeSpecName: "kube-api-access-n8drl") pod "8f3c7792-82c8-436a-87fa-31c713406593" (UID: "8f3c7792-82c8-436a-87fa-31c713406593"). InnerVolumeSpecName "kube-api-access-n8drl". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 06:50:47 crc kubenswrapper[4475]: I1203 06:50:47.886520 4475 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/8f3c7792-82c8-436a-87fa-31c713406593-client-ca\") on node \"crc\" DevicePath \"\"" Dec 03 06:50:47 crc kubenswrapper[4475]: I1203 06:50:47.886545 4475 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-n8drl\" (UniqueName: \"kubernetes.io/projected/8f3c7792-82c8-436a-87fa-31c713406593-kube-api-access-n8drl\") on node \"crc\" DevicePath \"\"" Dec 03 06:50:47 crc kubenswrapper[4475]: I1203 06:50:47.886556 4475 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8f3c7792-82c8-436a-87fa-31c713406593-config\") on node \"crc\" DevicePath \"\"" Dec 03 06:50:47 crc kubenswrapper[4475]: I1203 06:50:47.886565 4475 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8f3c7792-82c8-436a-87fa-31c713406593-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 03 06:50:48 crc kubenswrapper[4475]: I1203 06:50:48.577946 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-56c6954dc9-2mg6p" event={"ID":"8f3c7792-82c8-436a-87fa-31c713406593","Type":"ContainerDied","Data":"448d1e1f5dc382ee71d295f47ede606d6a7d604262420939c86a3be7a0f0f684"} Dec 03 06:50:48 crc kubenswrapper[4475]: I1203 06:50:48.578143 4475 scope.go:117] "RemoveContainer" containerID="188edb471d7b8a65132f34e73b47013174671d4c17b9a3c630b8b1e95ed04dc1" Dec 03 06:50:48 crc kubenswrapper[4475]: I1203 06:50:48.578165 4475 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-56c6954dc9-2mg6p" Dec 03 06:50:48 crc kubenswrapper[4475]: I1203 06:50:48.596748 4475 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-56c6954dc9-2mg6p"] Dec 03 06:50:48 crc kubenswrapper[4475]: I1203 06:50:48.599090 4475 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-56c6954dc9-2mg6p"] Dec 03 06:50:48 crc kubenswrapper[4475]: I1203 06:50:48.737883 4475 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-56549d5446-5b8kp"] Dec 03 06:50:48 crc kubenswrapper[4475]: E1203 06:50:48.738032 4475 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8f3c7792-82c8-436a-87fa-31c713406593" containerName="route-controller-manager" Dec 03 06:50:48 crc kubenswrapper[4475]: I1203 06:50:48.738047 4475 state_mem.go:107] "Deleted CPUSet assignment" podUID="8f3c7792-82c8-436a-87fa-31c713406593" containerName="route-controller-manager" Dec 03 06:50:48 crc kubenswrapper[4475]: I1203 06:50:48.738139 4475 memory_manager.go:354] "RemoveStaleState removing state" podUID="8f3c7792-82c8-436a-87fa-31c713406593" containerName="route-controller-manager" Dec 03 06:50:48 crc kubenswrapper[4475]: I1203 06:50:48.738444 4475 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-56549d5446-5b8kp" Dec 03 06:50:48 crc kubenswrapper[4475]: I1203 06:50:48.739502 4475 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config" Dec 03 06:50:48 crc kubenswrapper[4475]: I1203 06:50:48.740395 4475 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2" Dec 03 06:50:48 crc kubenswrapper[4475]: I1203 06:50:48.740418 4475 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Dec 03 06:50:48 crc kubenswrapper[4475]: I1203 06:50:48.740462 4475 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Dec 03 06:50:48 crc kubenswrapper[4475]: I1203 06:50:48.740397 4475 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert" Dec 03 06:50:48 crc kubenswrapper[4475]: I1203 06:50:48.741294 4475 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca" Dec 03 06:50:48 crc kubenswrapper[4475]: I1203 06:50:48.745886 4475 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-56549d5446-5b8kp"] Dec 03 06:50:48 crc kubenswrapper[4475]: I1203 06:50:48.896690 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ac5fd437-d155-4c9a-840f-e9d66bf4fcc3-serving-cert\") pod \"route-controller-manager-56549d5446-5b8kp\" (UID: \"ac5fd437-d155-4c9a-840f-e9d66bf4fcc3\") " pod="openshift-route-controller-manager/route-controller-manager-56549d5446-5b8kp" Dec 03 06:50:48 crc kubenswrapper[4475]: I1203 06:50:48.896731 4475 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ac5fd437-d155-4c9a-840f-e9d66bf4fcc3-config\") pod \"route-controller-manager-56549d5446-5b8kp\" (UID: \"ac5fd437-d155-4c9a-840f-e9d66bf4fcc3\") " pod="openshift-route-controller-manager/route-controller-manager-56549d5446-5b8kp" Dec 03 06:50:48 crc kubenswrapper[4475]: I1203 06:50:48.896769 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/ac5fd437-d155-4c9a-840f-e9d66bf4fcc3-client-ca\") pod \"route-controller-manager-56549d5446-5b8kp\" (UID: \"ac5fd437-d155-4c9a-840f-e9d66bf4fcc3\") " pod="openshift-route-controller-manager/route-controller-manager-56549d5446-5b8kp" Dec 03 06:50:48 crc kubenswrapper[4475]: I1203 06:50:48.896786 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7h8sp\" (UniqueName: \"kubernetes.io/projected/ac5fd437-d155-4c9a-840f-e9d66bf4fcc3-kube-api-access-7h8sp\") pod \"route-controller-manager-56549d5446-5b8kp\" (UID: \"ac5fd437-d155-4c9a-840f-e9d66bf4fcc3\") " pod="openshift-route-controller-manager/route-controller-manager-56549d5446-5b8kp" Dec 03 06:50:48 crc kubenswrapper[4475]: I1203 06:50:48.997646 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ac5fd437-d155-4c9a-840f-e9d66bf4fcc3-serving-cert\") pod \"route-controller-manager-56549d5446-5b8kp\" (UID: \"ac5fd437-d155-4c9a-840f-e9d66bf4fcc3\") " pod="openshift-route-controller-manager/route-controller-manager-56549d5446-5b8kp" Dec 03 06:50:48 crc kubenswrapper[4475]: I1203 06:50:48.998224 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ac5fd437-d155-4c9a-840f-e9d66bf4fcc3-config\") pod \"route-controller-manager-56549d5446-5b8kp\" 
(UID: \"ac5fd437-d155-4c9a-840f-e9d66bf4fcc3\") " pod="openshift-route-controller-manager/route-controller-manager-56549d5446-5b8kp" Dec 03 06:50:48 crc kubenswrapper[4475]: I1203 06:50:48.998265 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/ac5fd437-d155-4c9a-840f-e9d66bf4fcc3-client-ca\") pod \"route-controller-manager-56549d5446-5b8kp\" (UID: \"ac5fd437-d155-4c9a-840f-e9d66bf4fcc3\") " pod="openshift-route-controller-manager/route-controller-manager-56549d5446-5b8kp" Dec 03 06:50:48 crc kubenswrapper[4475]: I1203 06:50:48.998283 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7h8sp\" (UniqueName: \"kubernetes.io/projected/ac5fd437-d155-4c9a-840f-e9d66bf4fcc3-kube-api-access-7h8sp\") pod \"route-controller-manager-56549d5446-5b8kp\" (UID: \"ac5fd437-d155-4c9a-840f-e9d66bf4fcc3\") " pod="openshift-route-controller-manager/route-controller-manager-56549d5446-5b8kp" Dec 03 06:50:48 crc kubenswrapper[4475]: I1203 06:50:48.999156 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/ac5fd437-d155-4c9a-840f-e9d66bf4fcc3-client-ca\") pod \"route-controller-manager-56549d5446-5b8kp\" (UID: \"ac5fd437-d155-4c9a-840f-e9d66bf4fcc3\") " pod="openshift-route-controller-manager/route-controller-manager-56549d5446-5b8kp" Dec 03 06:50:48 crc kubenswrapper[4475]: I1203 06:50:48.999397 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ac5fd437-d155-4c9a-840f-e9d66bf4fcc3-config\") pod \"route-controller-manager-56549d5446-5b8kp\" (UID: \"ac5fd437-d155-4c9a-840f-e9d66bf4fcc3\") " pod="openshift-route-controller-manager/route-controller-manager-56549d5446-5b8kp" Dec 03 06:50:49 crc kubenswrapper[4475]: I1203 06:50:49.001047 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"serving-cert\" (UniqueName: \"kubernetes.io/secret/ac5fd437-d155-4c9a-840f-e9d66bf4fcc3-serving-cert\") pod \"route-controller-manager-56549d5446-5b8kp\" (UID: \"ac5fd437-d155-4c9a-840f-e9d66bf4fcc3\") " pod="openshift-route-controller-manager/route-controller-manager-56549d5446-5b8kp" Dec 03 06:50:49 crc kubenswrapper[4475]: I1203 06:50:49.011027 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7h8sp\" (UniqueName: \"kubernetes.io/projected/ac5fd437-d155-4c9a-840f-e9d66bf4fcc3-kube-api-access-7h8sp\") pod \"route-controller-manager-56549d5446-5b8kp\" (UID: \"ac5fd437-d155-4c9a-840f-e9d66bf4fcc3\") " pod="openshift-route-controller-manager/route-controller-manager-56549d5446-5b8kp" Dec 03 06:50:49 crc kubenswrapper[4475]: I1203 06:50:49.048790 4475 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-56549d5446-5b8kp" Dec 03 06:50:49 crc kubenswrapper[4475]: I1203 06:50:49.372567 4475 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-56549d5446-5b8kp"] Dec 03 06:50:49 crc kubenswrapper[4475]: I1203 06:50:49.495708 4475 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8f3c7792-82c8-436a-87fa-31c713406593" path="/var/lib/kubelet/pods/8f3c7792-82c8-436a-87fa-31c713406593/volumes" Dec 03 06:50:49 crc kubenswrapper[4475]: I1203 06:50:49.583669 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-56549d5446-5b8kp" event={"ID":"ac5fd437-d155-4c9a-840f-e9d66bf4fcc3","Type":"ContainerStarted","Data":"2d535012301b85c32cb82a1d7a975310a2f768315a078ebb195aa5d79ee06414"} Dec 03 06:50:49 crc kubenswrapper[4475]: I1203 06:50:49.583698 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-56549d5446-5b8kp" 
event={"ID":"ac5fd437-d155-4c9a-840f-e9d66bf4fcc3","Type":"ContainerStarted","Data":"c8736b730533e0f2a929a2da2f2258ef230f7060eded0d9a37c52dbccab5c37b"} Dec 03 06:50:49 crc kubenswrapper[4475]: I1203 06:50:49.583868 4475 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-56549d5446-5b8kp" Dec 03 06:50:49 crc kubenswrapper[4475]: I1203 06:50:49.671233 4475 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-56549d5446-5b8kp" Dec 03 06:50:49 crc kubenswrapper[4475]: I1203 06:50:49.685090 4475 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-56549d5446-5b8kp" podStartSLOduration=2.685071906 podStartE2EDuration="2.685071906s" podCreationTimestamp="2025-12-03 06:50:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 06:50:49.600979687 +0000 UTC m=+334.405878022" watchObservedRunningTime="2025-12-03 06:50:49.685071906 +0000 UTC m=+334.489970240" Dec 03 06:50:55 crc kubenswrapper[4475]: I1203 06:50:55.572015 4475 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-image-registry/image-registry-66df7c8f76-p5kv9" Dec 03 06:50:55 crc kubenswrapper[4475]: I1203 06:50:55.604162 4475 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-dcjv5"] Dec 03 06:51:20 crc kubenswrapper[4475]: I1203 06:51:20.630369 4475 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-image-registry/image-registry-697d97f7c8-dcjv5" podUID="fe214ce1-0821-4547-ac8b-e001a0579495" containerName="registry" containerID="cri-o://709d821e9862754d261f5fd1eade869c36d7ef8db10641b46bbb5ee829981e10" gracePeriod=30 Dec 03 06:51:20 crc kubenswrapper[4475]: I1203 
06:51:20.929295 4475 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-dcjv5" Dec 03 06:51:21 crc kubenswrapper[4475]: I1203 06:51:21.021633 4475 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/fe214ce1-0821-4547-ac8b-e001a0579495-registry-tls\") pod \"fe214ce1-0821-4547-ac8b-e001a0579495\" (UID: \"fe214ce1-0821-4547-ac8b-e001a0579495\") " Dec 03 06:51:21 crc kubenswrapper[4475]: I1203 06:51:21.021854 4475 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-storage\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"fe214ce1-0821-4547-ac8b-e001a0579495\" (UID: \"fe214ce1-0821-4547-ac8b-e001a0579495\") " Dec 03 06:51:21 crc kubenswrapper[4475]: I1203 06:51:21.021887 4475 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/fe214ce1-0821-4547-ac8b-e001a0579495-trusted-ca\") pod \"fe214ce1-0821-4547-ac8b-e001a0579495\" (UID: \"fe214ce1-0821-4547-ac8b-e001a0579495\") " Dec 03 06:51:21 crc kubenswrapper[4475]: I1203 06:51:21.021931 4475 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/fe214ce1-0821-4547-ac8b-e001a0579495-registry-certificates\") pod \"fe214ce1-0821-4547-ac8b-e001a0579495\" (UID: \"fe214ce1-0821-4547-ac8b-e001a0579495\") " Dec 03 06:51:21 crc kubenswrapper[4475]: I1203 06:51:21.021972 4475 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/fe214ce1-0821-4547-ac8b-e001a0579495-ca-trust-extracted\") pod \"fe214ce1-0821-4547-ac8b-e001a0579495\" (UID: \"fe214ce1-0821-4547-ac8b-e001a0579495\") " Dec 03 06:51:21 crc kubenswrapper[4475]: 
I1203 06:51:21.021995 4475 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/fe214ce1-0821-4547-ac8b-e001a0579495-bound-sa-token\") pod \"fe214ce1-0821-4547-ac8b-e001a0579495\" (UID: \"fe214ce1-0821-4547-ac8b-e001a0579495\") " Dec 03 06:51:21 crc kubenswrapper[4475]: I1203 06:51:21.022031 4475 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/fe214ce1-0821-4547-ac8b-e001a0579495-installation-pull-secrets\") pod \"fe214ce1-0821-4547-ac8b-e001a0579495\" (UID: \"fe214ce1-0821-4547-ac8b-e001a0579495\") " Dec 03 06:51:21 crc kubenswrapper[4475]: I1203 06:51:21.022049 4475 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5rmsw\" (UniqueName: \"kubernetes.io/projected/fe214ce1-0821-4547-ac8b-e001a0579495-kube-api-access-5rmsw\") pod \"fe214ce1-0821-4547-ac8b-e001a0579495\" (UID: \"fe214ce1-0821-4547-ac8b-e001a0579495\") " Dec 03 06:51:21 crc kubenswrapper[4475]: I1203 06:51:21.022594 4475 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fe214ce1-0821-4547-ac8b-e001a0579495-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "fe214ce1-0821-4547-ac8b-e001a0579495" (UID: "fe214ce1-0821-4547-ac8b-e001a0579495"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 06:51:21 crc kubenswrapper[4475]: I1203 06:51:21.022697 4475 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fe214ce1-0821-4547-ac8b-e001a0579495-registry-certificates" (OuterVolumeSpecName: "registry-certificates") pod "fe214ce1-0821-4547-ac8b-e001a0579495" (UID: "fe214ce1-0821-4547-ac8b-e001a0579495"). InnerVolumeSpecName "registry-certificates". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 06:51:21 crc kubenswrapper[4475]: I1203 06:51:21.026139 4475 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fe214ce1-0821-4547-ac8b-e001a0579495-kube-api-access-5rmsw" (OuterVolumeSpecName: "kube-api-access-5rmsw") pod "fe214ce1-0821-4547-ac8b-e001a0579495" (UID: "fe214ce1-0821-4547-ac8b-e001a0579495"). InnerVolumeSpecName "kube-api-access-5rmsw". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 06:51:21 crc kubenswrapper[4475]: I1203 06:51:21.026317 4475 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fe214ce1-0821-4547-ac8b-e001a0579495-registry-tls" (OuterVolumeSpecName: "registry-tls") pod "fe214ce1-0821-4547-ac8b-e001a0579495" (UID: "fe214ce1-0821-4547-ac8b-e001a0579495"). InnerVolumeSpecName "registry-tls". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 06:51:21 crc kubenswrapper[4475]: I1203 06:51:21.026489 4475 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fe214ce1-0821-4547-ac8b-e001a0579495-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "fe214ce1-0821-4547-ac8b-e001a0579495" (UID: "fe214ce1-0821-4547-ac8b-e001a0579495"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 06:51:21 crc kubenswrapper[4475]: I1203 06:51:21.027090 4475 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fe214ce1-0821-4547-ac8b-e001a0579495-installation-pull-secrets" (OuterVolumeSpecName: "installation-pull-secrets") pod "fe214ce1-0821-4547-ac8b-e001a0579495" (UID: "fe214ce1-0821-4547-ac8b-e001a0579495"). InnerVolumeSpecName "installation-pull-secrets". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 06:51:21 crc kubenswrapper[4475]: I1203 06:51:21.028561 4475 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (OuterVolumeSpecName: "registry-storage") pod "fe214ce1-0821-4547-ac8b-e001a0579495" (UID: "fe214ce1-0821-4547-ac8b-e001a0579495"). InnerVolumeSpecName "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8". PluginName "kubernetes.io/csi", VolumeGidValue "" Dec 03 06:51:21 crc kubenswrapper[4475]: I1203 06:51:21.035479 4475 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/fe214ce1-0821-4547-ac8b-e001a0579495-ca-trust-extracted" (OuterVolumeSpecName: "ca-trust-extracted") pod "fe214ce1-0821-4547-ac8b-e001a0579495" (UID: "fe214ce1-0821-4547-ac8b-e001a0579495"). InnerVolumeSpecName "ca-trust-extracted". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 06:51:21 crc kubenswrapper[4475]: I1203 06:51:21.123372 4475 reconciler_common.go:293] "Volume detached for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/fe214ce1-0821-4547-ac8b-e001a0579495-registry-tls\") on node \"crc\" DevicePath \"\"" Dec 03 06:51:21 crc kubenswrapper[4475]: I1203 06:51:21.123395 4475 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/fe214ce1-0821-4547-ac8b-e001a0579495-trusted-ca\") on node \"crc\" DevicePath \"\"" Dec 03 06:51:21 crc kubenswrapper[4475]: I1203 06:51:21.123411 4475 reconciler_common.go:293] "Volume detached for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/fe214ce1-0821-4547-ac8b-e001a0579495-registry-certificates\") on node \"crc\" DevicePath \"\"" Dec 03 06:51:21 crc kubenswrapper[4475]: I1203 06:51:21.123422 4475 reconciler_common.go:293] "Volume detached for volume \"ca-trust-extracted\" (UniqueName: 
\"kubernetes.io/empty-dir/fe214ce1-0821-4547-ac8b-e001a0579495-ca-trust-extracted\") on node \"crc\" DevicePath \"\"" Dec 03 06:51:21 crc kubenswrapper[4475]: I1203 06:51:21.123444 4475 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/fe214ce1-0821-4547-ac8b-e001a0579495-bound-sa-token\") on node \"crc\" DevicePath \"\"" Dec 03 06:51:21 crc kubenswrapper[4475]: I1203 06:51:21.123467 4475 reconciler_common.go:293] "Volume detached for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/fe214ce1-0821-4547-ac8b-e001a0579495-installation-pull-secrets\") on node \"crc\" DevicePath \"\"" Dec 03 06:51:21 crc kubenswrapper[4475]: I1203 06:51:21.123476 4475 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5rmsw\" (UniqueName: \"kubernetes.io/projected/fe214ce1-0821-4547-ac8b-e001a0579495-kube-api-access-5rmsw\") on node \"crc\" DevicePath \"\"" Dec 03 06:51:21 crc kubenswrapper[4475]: I1203 06:51:21.700668 4475 generic.go:334] "Generic (PLEG): container finished" podID="fe214ce1-0821-4547-ac8b-e001a0579495" containerID="709d821e9862754d261f5fd1eade869c36d7ef8db10641b46bbb5ee829981e10" exitCode=0 Dec 03 06:51:21 crc kubenswrapper[4475]: I1203 06:51:21.700733 4475 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-dcjv5" Dec 03 06:51:21 crc kubenswrapper[4475]: I1203 06:51:21.700749 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-dcjv5" event={"ID":"fe214ce1-0821-4547-ac8b-e001a0579495","Type":"ContainerDied","Data":"709d821e9862754d261f5fd1eade869c36d7ef8db10641b46bbb5ee829981e10"} Dec 03 06:51:21 crc kubenswrapper[4475]: I1203 06:51:21.700961 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-dcjv5" event={"ID":"fe214ce1-0821-4547-ac8b-e001a0579495","Type":"ContainerDied","Data":"04d46d465e41ea7893dce520d9627958de1d845ff76685f4558b6580f429a803"} Dec 03 06:51:21 crc kubenswrapper[4475]: I1203 06:51:21.700978 4475 scope.go:117] "RemoveContainer" containerID="709d821e9862754d261f5fd1eade869c36d7ef8db10641b46bbb5ee829981e10" Dec 03 06:51:21 crc kubenswrapper[4475]: I1203 06:51:21.711410 4475 scope.go:117] "RemoveContainer" containerID="709d821e9862754d261f5fd1eade869c36d7ef8db10641b46bbb5ee829981e10" Dec 03 06:51:21 crc kubenswrapper[4475]: E1203 06:51:21.711740 4475 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"709d821e9862754d261f5fd1eade869c36d7ef8db10641b46bbb5ee829981e10\": container with ID starting with 709d821e9862754d261f5fd1eade869c36d7ef8db10641b46bbb5ee829981e10 not found: ID does not exist" containerID="709d821e9862754d261f5fd1eade869c36d7ef8db10641b46bbb5ee829981e10" Dec 03 06:51:21 crc kubenswrapper[4475]: I1203 06:51:21.711771 4475 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"709d821e9862754d261f5fd1eade869c36d7ef8db10641b46bbb5ee829981e10"} err="failed to get container status \"709d821e9862754d261f5fd1eade869c36d7ef8db10641b46bbb5ee829981e10\": rpc error: code = NotFound desc = could not find container 
\"709d821e9862754d261f5fd1eade869c36d7ef8db10641b46bbb5ee829981e10\": container with ID starting with 709d821e9862754d261f5fd1eade869c36d7ef8db10641b46bbb5ee829981e10 not found: ID does not exist" Dec 03 06:51:21 crc kubenswrapper[4475]: I1203 06:51:21.712633 4475 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-dcjv5"] Dec 03 06:51:21 crc kubenswrapper[4475]: I1203 06:51:21.715007 4475 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-dcjv5"] Dec 03 06:51:23 crc kubenswrapper[4475]: I1203 06:51:23.496263 4475 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fe214ce1-0821-4547-ac8b-e001a0579495" path="/var/lib/kubelet/pods/fe214ce1-0821-4547-ac8b-e001a0579495/volumes" Dec 03 06:51:28 crc kubenswrapper[4475]: I1203 06:51:28.933099 4475 patch_prober.go:28] interesting pod/machine-config-daemon-tjbzg container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 03 06:51:28 crc kubenswrapper[4475]: I1203 06:51:28.933333 4475 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-tjbzg" podUID="91aee7be-4a52-4598-803f-2deebe0674de" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 03 06:51:58 crc kubenswrapper[4475]: I1203 06:51:58.933090 4475 patch_prober.go:28] interesting pod/machine-config-daemon-tjbzg container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 03 06:51:58 crc kubenswrapper[4475]: I1203 06:51:58.933402 4475 prober.go:107] 
"Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-tjbzg" podUID="91aee7be-4a52-4598-803f-2deebe0674de" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 03 06:52:28 crc kubenswrapper[4475]: I1203 06:52:28.932893 4475 patch_prober.go:28] interesting pod/machine-config-daemon-tjbzg container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 03 06:52:28 crc kubenswrapper[4475]: I1203 06:52:28.933250 4475 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-tjbzg" podUID="91aee7be-4a52-4598-803f-2deebe0674de" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 03 06:52:28 crc kubenswrapper[4475]: I1203 06:52:28.933286 4475 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-tjbzg" Dec 03 06:52:28 crc kubenswrapper[4475]: I1203 06:52:28.933737 4475 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"6de2d401c62c0b82b84c560e7fbdf0f3aa849cd94b4d5542285bedcc76efb375"} pod="openshift-machine-config-operator/machine-config-daemon-tjbzg" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Dec 03 06:52:28 crc kubenswrapper[4475]: I1203 06:52:28.933789 4475 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-tjbzg" podUID="91aee7be-4a52-4598-803f-2deebe0674de" containerName="machine-config-daemon" 
containerID="cri-o://6de2d401c62c0b82b84c560e7fbdf0f3aa849cd94b4d5542285bedcc76efb375" gracePeriod=600 Dec 03 06:52:29 crc kubenswrapper[4475]: I1203 06:52:29.939692 4475 generic.go:334] "Generic (PLEG): container finished" podID="91aee7be-4a52-4598-803f-2deebe0674de" containerID="6de2d401c62c0b82b84c560e7fbdf0f3aa849cd94b4d5542285bedcc76efb375" exitCode=0 Dec 03 06:52:29 crc kubenswrapper[4475]: I1203 06:52:29.940038 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-tjbzg" event={"ID":"91aee7be-4a52-4598-803f-2deebe0674de","Type":"ContainerDied","Data":"6de2d401c62c0b82b84c560e7fbdf0f3aa849cd94b4d5542285bedcc76efb375"} Dec 03 06:52:29 crc kubenswrapper[4475]: I1203 06:52:29.940062 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-tjbzg" event={"ID":"91aee7be-4a52-4598-803f-2deebe0674de","Type":"ContainerStarted","Data":"0a13f575406937575ec2819856647e11d9c4ccb5a9ec17bf4568eec9af01a7ba"} Dec 03 06:52:29 crc kubenswrapper[4475]: I1203 06:52:29.940076 4475 scope.go:117] "RemoveContainer" containerID="159d103ae2d5d19ea94c57a59b534773f0e32f4cb379a412b63ca743e221096e" Dec 03 06:53:32 crc kubenswrapper[4475]: I1203 06:53:32.676859 4475 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-cainjector-7f985d654d-9tjf5"] Dec 03 06:53:32 crc kubenswrapper[4475]: E1203 06:53:32.677327 4475 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fe214ce1-0821-4547-ac8b-e001a0579495" containerName="registry" Dec 03 06:53:32 crc kubenswrapper[4475]: I1203 06:53:32.677338 4475 state_mem.go:107] "Deleted CPUSet assignment" podUID="fe214ce1-0821-4547-ac8b-e001a0579495" containerName="registry" Dec 03 06:53:32 crc kubenswrapper[4475]: I1203 06:53:32.677427 4475 memory_manager.go:354] "RemoveStaleState removing state" podUID="fe214ce1-0821-4547-ac8b-e001a0579495" containerName="registry" Dec 03 06:53:32 crc 
kubenswrapper[4475]: I1203 06:53:32.677746 4475 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-cainjector-7f985d654d-9tjf5" Dec 03 06:53:32 crc kubenswrapper[4475]: I1203 06:53:32.679606 4475 reflector.go:368] Caches populated for *v1.Secret from object-"cert-manager"/"cert-manager-cainjector-dockercfg-s545m" Dec 03 06:53:32 crc kubenswrapper[4475]: I1203 06:53:32.686868 4475 reflector.go:368] Caches populated for *v1.ConfigMap from object-"cert-manager"/"kube-root-ca.crt" Dec 03 06:53:32 crc kubenswrapper[4475]: I1203 06:53:32.686883 4475 reflector.go:368] Caches populated for *v1.ConfigMap from object-"cert-manager"/"openshift-service-ca.crt" Dec 03 06:53:32 crc kubenswrapper[4475]: I1203 06:53:32.707496 4475 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-webhook-5655c58dd6-dqgcw"] Dec 03 06:53:32 crc kubenswrapper[4475]: I1203 06:53:32.708199 4475 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-webhook-5655c58dd6-dqgcw" Dec 03 06:53:32 crc kubenswrapper[4475]: I1203 06:53:32.709978 4475 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-5b446d88c5-kq2bw"] Dec 03 06:53:32 crc kubenswrapper[4475]: I1203 06:53:32.710486 4475 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="cert-manager/cert-manager-5b446d88c5-kq2bw" Dec 03 06:53:32 crc kubenswrapper[4475]: I1203 06:53:32.710898 4475 reflector.go:368] Caches populated for *v1.Secret from object-"cert-manager"/"cert-manager-webhook-dockercfg-qlq6j" Dec 03 06:53:32 crc kubenswrapper[4475]: I1203 06:53:32.714359 4475 reflector.go:368] Caches populated for *v1.Secret from object-"cert-manager"/"cert-manager-dockercfg-mf9tx" Dec 03 06:53:32 crc kubenswrapper[4475]: I1203 06:53:32.718133 4475 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-webhook-5655c58dd6-dqgcw"] Dec 03 06:53:32 crc kubenswrapper[4475]: I1203 06:53:32.725473 4475 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-5b446d88c5-kq2bw"] Dec 03 06:53:32 crc kubenswrapper[4475]: I1203 06:53:32.731613 4475 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-cainjector-7f985d654d-9tjf5"] Dec 03 06:53:32 crc kubenswrapper[4475]: I1203 06:53:32.834500 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pqtx8\" (UniqueName: \"kubernetes.io/projected/793168fd-45ca-4006-baa7-311925e445ca-kube-api-access-pqtx8\") pod \"cert-manager-5b446d88c5-kq2bw\" (UID: \"793168fd-45ca-4006-baa7-311925e445ca\") " pod="cert-manager/cert-manager-5b446d88c5-kq2bw" Dec 03 06:53:32 crc kubenswrapper[4475]: I1203 06:53:32.834552 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hqkmw\" (UniqueName: \"kubernetes.io/projected/caf87b71-bd2a-4def-84ae-8286ecddf3e4-kube-api-access-hqkmw\") pod \"cert-manager-cainjector-7f985d654d-9tjf5\" (UID: \"caf87b71-bd2a-4def-84ae-8286ecddf3e4\") " pod="cert-manager/cert-manager-cainjector-7f985d654d-9tjf5" Dec 03 06:53:32 crc kubenswrapper[4475]: I1203 06:53:32.834579 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"kube-api-access-tkcj2\" (UniqueName: \"kubernetes.io/projected/37204cf9-3fa6-4f34-9b07-ae7f62ee2916-kube-api-access-tkcj2\") pod \"cert-manager-webhook-5655c58dd6-dqgcw\" (UID: \"37204cf9-3fa6-4f34-9b07-ae7f62ee2916\") " pod="cert-manager/cert-manager-webhook-5655c58dd6-dqgcw" Dec 03 06:53:32 crc kubenswrapper[4475]: I1203 06:53:32.935725 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pqtx8\" (UniqueName: \"kubernetes.io/projected/793168fd-45ca-4006-baa7-311925e445ca-kube-api-access-pqtx8\") pod \"cert-manager-5b446d88c5-kq2bw\" (UID: \"793168fd-45ca-4006-baa7-311925e445ca\") " pod="cert-manager/cert-manager-5b446d88c5-kq2bw" Dec 03 06:53:32 crc kubenswrapper[4475]: I1203 06:53:32.935769 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hqkmw\" (UniqueName: \"kubernetes.io/projected/caf87b71-bd2a-4def-84ae-8286ecddf3e4-kube-api-access-hqkmw\") pod \"cert-manager-cainjector-7f985d654d-9tjf5\" (UID: \"caf87b71-bd2a-4def-84ae-8286ecddf3e4\") " pod="cert-manager/cert-manager-cainjector-7f985d654d-9tjf5" Dec 03 06:53:32 crc kubenswrapper[4475]: I1203 06:53:32.935798 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tkcj2\" (UniqueName: \"kubernetes.io/projected/37204cf9-3fa6-4f34-9b07-ae7f62ee2916-kube-api-access-tkcj2\") pod \"cert-manager-webhook-5655c58dd6-dqgcw\" (UID: \"37204cf9-3fa6-4f34-9b07-ae7f62ee2916\") " pod="cert-manager/cert-manager-webhook-5655c58dd6-dqgcw" Dec 03 06:53:32 crc kubenswrapper[4475]: I1203 06:53:32.950635 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hqkmw\" (UniqueName: \"kubernetes.io/projected/caf87b71-bd2a-4def-84ae-8286ecddf3e4-kube-api-access-hqkmw\") pod \"cert-manager-cainjector-7f985d654d-9tjf5\" (UID: \"caf87b71-bd2a-4def-84ae-8286ecddf3e4\") " pod="cert-manager/cert-manager-cainjector-7f985d654d-9tjf5" Dec 03 06:53:32 crc 
kubenswrapper[4475]: I1203 06:53:32.950657 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tkcj2\" (UniqueName: \"kubernetes.io/projected/37204cf9-3fa6-4f34-9b07-ae7f62ee2916-kube-api-access-tkcj2\") pod \"cert-manager-webhook-5655c58dd6-dqgcw\" (UID: \"37204cf9-3fa6-4f34-9b07-ae7f62ee2916\") " pod="cert-manager/cert-manager-webhook-5655c58dd6-dqgcw" Dec 03 06:53:32 crc kubenswrapper[4475]: I1203 06:53:32.951202 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pqtx8\" (UniqueName: \"kubernetes.io/projected/793168fd-45ca-4006-baa7-311925e445ca-kube-api-access-pqtx8\") pod \"cert-manager-5b446d88c5-kq2bw\" (UID: \"793168fd-45ca-4006-baa7-311925e445ca\") " pod="cert-manager/cert-manager-5b446d88c5-kq2bw" Dec 03 06:53:32 crc kubenswrapper[4475]: I1203 06:53:32.990091 4475 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-cainjector-7f985d654d-9tjf5" Dec 03 06:53:33 crc kubenswrapper[4475]: I1203 06:53:33.019195 4475 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-webhook-5655c58dd6-dqgcw" Dec 03 06:53:33 crc kubenswrapper[4475]: I1203 06:53:33.024013 4475 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="cert-manager/cert-manager-5b446d88c5-kq2bw" Dec 03 06:53:33 crc kubenswrapper[4475]: I1203 06:53:33.124141 4475 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-cainjector-7f985d654d-9tjf5"] Dec 03 06:53:33 crc kubenswrapper[4475]: I1203 06:53:33.136087 4475 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Dec 03 06:53:33 crc kubenswrapper[4475]: I1203 06:53:33.162531 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-cainjector-7f985d654d-9tjf5" event={"ID":"caf87b71-bd2a-4def-84ae-8286ecddf3e4","Type":"ContainerStarted","Data":"f0cbd16558a1e7ba782be18bed2733ac4c8f2270f000b568350f3f77c34ac013"} Dec 03 06:53:33 crc kubenswrapper[4475]: I1203 06:53:33.178277 4475 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-5b446d88c5-kq2bw"] Dec 03 06:53:33 crc kubenswrapper[4475]: W1203 06:53:33.183058 4475 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod793168fd_45ca_4006_baa7_311925e445ca.slice/crio-12c0aa1f753104a1de8ea6eb5746a1c3f9ccd5bfb16c0af6577519e18057df82 WatchSource:0}: Error finding container 12c0aa1f753104a1de8ea6eb5746a1c3f9ccd5bfb16c0af6577519e18057df82: Status 404 returned error can't find the container with id 12c0aa1f753104a1de8ea6eb5746a1c3f9ccd5bfb16c0af6577519e18057df82 Dec 03 06:53:33 crc kubenswrapper[4475]: I1203 06:53:33.217069 4475 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-webhook-5655c58dd6-dqgcw"] Dec 03 06:53:33 crc kubenswrapper[4475]: W1203 06:53:33.219135 4475 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod37204cf9_3fa6_4f34_9b07_ae7f62ee2916.slice/crio-d42ae4e1936ff8f1fb54801a2d578ee8a8cff9c3903adaf9e86de534a4433528 WatchSource:0}: Error finding container 
d42ae4e1936ff8f1fb54801a2d578ee8a8cff9c3903adaf9e86de534a4433528: Status 404 returned error can't find the container with id d42ae4e1936ff8f1fb54801a2d578ee8a8cff9c3903adaf9e86de534a4433528 Dec 03 06:53:34 crc kubenswrapper[4475]: I1203 06:53:34.166985 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-5b446d88c5-kq2bw" event={"ID":"793168fd-45ca-4006-baa7-311925e445ca","Type":"ContainerStarted","Data":"12c0aa1f753104a1de8ea6eb5746a1c3f9ccd5bfb16c0af6577519e18057df82"} Dec 03 06:53:34 crc kubenswrapper[4475]: I1203 06:53:34.167704 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-webhook-5655c58dd6-dqgcw" event={"ID":"37204cf9-3fa6-4f34-9b07-ae7f62ee2916","Type":"ContainerStarted","Data":"d42ae4e1936ff8f1fb54801a2d578ee8a8cff9c3903adaf9e86de534a4433528"} Dec 03 06:53:36 crc kubenswrapper[4475]: I1203 06:53:36.176921 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-webhook-5655c58dd6-dqgcw" event={"ID":"37204cf9-3fa6-4f34-9b07-ae7f62ee2916","Type":"ContainerStarted","Data":"3f503efb6057ad82b4dbcc533029deaa57dcd40448eeec951126556e22cdbf24"} Dec 03 06:53:36 crc kubenswrapper[4475]: I1203 06:53:36.177151 4475 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="cert-manager/cert-manager-webhook-5655c58dd6-dqgcw" Dec 03 06:53:36 crc kubenswrapper[4475]: I1203 06:53:36.179248 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-cainjector-7f985d654d-9tjf5" event={"ID":"caf87b71-bd2a-4def-84ae-8286ecddf3e4","Type":"ContainerStarted","Data":"bd332186454486afbe8ea3e6ed14f6bf74651ff6f13c8caf937685bc89e6a4bf"} Dec 03 06:53:36 crc kubenswrapper[4475]: I1203 06:53:36.180537 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-5b446d88c5-kq2bw" 
event={"ID":"793168fd-45ca-4006-baa7-311925e445ca","Type":"ContainerStarted","Data":"7f69ea78e432a70bb8050bf03085acc17f453fe9a2e0e6583cb6c057489a19fb"} Dec 03 06:53:36 crc kubenswrapper[4475]: I1203 06:53:36.190220 4475 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-webhook-5655c58dd6-dqgcw" podStartSLOduration=1.831370551 podStartE2EDuration="4.190206168s" podCreationTimestamp="2025-12-03 06:53:32 +0000 UTC" firstStartedPulling="2025-12-03 06:53:33.222137113 +0000 UTC m=+498.027035447" lastFinishedPulling="2025-12-03 06:53:35.580972729 +0000 UTC m=+500.385871064" observedRunningTime="2025-12-03 06:53:36.188376017 +0000 UTC m=+500.993274351" watchObservedRunningTime="2025-12-03 06:53:36.190206168 +0000 UTC m=+500.995104502" Dec 03 06:53:36 crc kubenswrapper[4475]: I1203 06:53:36.198946 4475 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-cainjector-7f985d654d-9tjf5" podStartSLOduration=1.758650609 podStartE2EDuration="4.198919209s" podCreationTimestamp="2025-12-03 06:53:32 +0000 UTC" firstStartedPulling="2025-12-03 06:53:33.135885409 +0000 UTC m=+497.940783744" lastFinishedPulling="2025-12-03 06:53:35.57615401 +0000 UTC m=+500.381052344" observedRunningTime="2025-12-03 06:53:36.196278855 +0000 UTC m=+501.001177189" watchObservedRunningTime="2025-12-03 06:53:36.198919209 +0000 UTC m=+501.003817543" Dec 03 06:53:36 crc kubenswrapper[4475]: I1203 06:53:36.207557 4475 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-5b446d88c5-kq2bw" podStartSLOduration=1.8099227660000001 podStartE2EDuration="4.207547431s" podCreationTimestamp="2025-12-03 06:53:32 +0000 UTC" firstStartedPulling="2025-12-03 06:53:33.184969934 +0000 UTC m=+497.989868269" lastFinishedPulling="2025-12-03 06:53:35.5825946 +0000 UTC m=+500.387492934" observedRunningTime="2025-12-03 06:53:36.205809274 +0000 UTC m=+501.010707608" 
watchObservedRunningTime="2025-12-03 06:53:36.207547431 +0000 UTC m=+501.012445766" Dec 03 06:53:43 crc kubenswrapper[4475]: I1203 06:53:43.021983 4475 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="cert-manager/cert-manager-webhook-5655c58dd6-dqgcw" Dec 03 06:53:44 crc kubenswrapper[4475]: I1203 06:53:44.076825 4475 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-g9t4l"] Dec 03 06:53:44 crc kubenswrapper[4475]: I1203 06:53:44.077505 4475 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-g9t4l" podUID="8f42839e-dbc4-445a-a15b-c3aa14813958" containerName="ovn-controller" containerID="cri-o://5e288f95676d5823cd3cb005318489d2f629a8fb74ad17ce6a67978d76006192" gracePeriod=30 Dec 03 06:53:44 crc kubenswrapper[4475]: I1203 06:53:44.077633 4475 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-g9t4l" podUID="8f42839e-dbc4-445a-a15b-c3aa14813958" containerName="kube-rbac-proxy-ovn-metrics" containerID="cri-o://a5090474cca8b8e2ed539ea74377506638d300be7eb750b3f3285477d8c9a375" gracePeriod=30 Dec 03 06:53:44 crc kubenswrapper[4475]: I1203 06:53:44.077664 4475 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-g9t4l" podUID="8f42839e-dbc4-445a-a15b-c3aa14813958" containerName="sbdb" containerID="cri-o://66a9c7568957099255bc910496da695e2af0122f2c853c3e221c666d7c2dee78" gracePeriod=30 Dec 03 06:53:44 crc kubenswrapper[4475]: I1203 06:53:44.077675 4475 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-g9t4l" podUID="8f42839e-dbc4-445a-a15b-c3aa14813958" containerName="northd" containerID="cri-o://b3243c863a4fb593b39fc3e3b835f647e9373d8b2dec69c5ff7657ed73c8f78a" gracePeriod=30 Dec 03 06:53:44 crc kubenswrapper[4475]: I1203 06:53:44.077736 4475 
kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-g9t4l" podUID="8f42839e-dbc4-445a-a15b-c3aa14813958" containerName="ovn-acl-logging" containerID="cri-o://32897756f3658fda95db77180a0553a9d8656ed49c3ae5a017d32f5c5133a5a9" gracePeriod=30 Dec 03 06:53:44 crc kubenswrapper[4475]: I1203 06:53:44.077592 4475 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-g9t4l" podUID="8f42839e-dbc4-445a-a15b-c3aa14813958" containerName="nbdb" containerID="cri-o://60d3ec7cab1f249e81ae1db9ab97fa02e8b3c9d8376af4c6682dc3fc6f9d6d92" gracePeriod=30 Dec 03 06:53:44 crc kubenswrapper[4475]: I1203 06:53:44.079557 4475 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-g9t4l" podUID="8f42839e-dbc4-445a-a15b-c3aa14813958" containerName="kube-rbac-proxy-node" containerID="cri-o://53948489397bbbfdf5f766211088d7f12fcd2dfbc8c3da6493e5abc49e3b41f5" gracePeriod=30 Dec 03 06:53:44 crc kubenswrapper[4475]: I1203 06:53:44.115949 4475 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-g9t4l" podUID="8f42839e-dbc4-445a-a15b-c3aa14813958" containerName="ovnkube-controller" containerID="cri-o://a6d3e6ea4e349dee8dc1ae7d0814640bae610b259e41099993afa21cb3b1aa88" gracePeriod=30 Dec 03 06:53:44 crc kubenswrapper[4475]: I1203 06:53:44.211138 4475 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-g9t4l_8f42839e-dbc4-445a-a15b-c3aa14813958/ovnkube-controller/3.log" Dec 03 06:53:44 crc kubenswrapper[4475]: I1203 06:53:44.214569 4475 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-g9t4l_8f42839e-dbc4-445a-a15b-c3aa14813958/ovn-acl-logging/0.log" Dec 03 06:53:44 crc kubenswrapper[4475]: I1203 06:53:44.215032 4475 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-g9t4l_8f42839e-dbc4-445a-a15b-c3aa14813958/ovn-controller/0.log" Dec 03 06:53:44 crc kubenswrapper[4475]: I1203 06:53:44.215654 4475 generic.go:334] "Generic (PLEG): container finished" podID="8f42839e-dbc4-445a-a15b-c3aa14813958" containerID="a5090474cca8b8e2ed539ea74377506638d300be7eb750b3f3285477d8c9a375" exitCode=0 Dec 03 06:53:44 crc kubenswrapper[4475]: I1203 06:53:44.215677 4475 generic.go:334] "Generic (PLEG): container finished" podID="8f42839e-dbc4-445a-a15b-c3aa14813958" containerID="53948489397bbbfdf5f766211088d7f12fcd2dfbc8c3da6493e5abc49e3b41f5" exitCode=0 Dec 03 06:53:44 crc kubenswrapper[4475]: I1203 06:53:44.215687 4475 generic.go:334] "Generic (PLEG): container finished" podID="8f42839e-dbc4-445a-a15b-c3aa14813958" containerID="32897756f3658fda95db77180a0553a9d8656ed49c3ae5a017d32f5c5133a5a9" exitCode=143 Dec 03 06:53:44 crc kubenswrapper[4475]: I1203 06:53:44.215694 4475 generic.go:334] "Generic (PLEG): container finished" podID="8f42839e-dbc4-445a-a15b-c3aa14813958" containerID="5e288f95676d5823cd3cb005318489d2f629a8fb74ad17ce6a67978d76006192" exitCode=143 Dec 03 06:53:44 crc kubenswrapper[4475]: I1203 06:53:44.215729 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-g9t4l" event={"ID":"8f42839e-dbc4-445a-a15b-c3aa14813958","Type":"ContainerDied","Data":"a5090474cca8b8e2ed539ea74377506638d300be7eb750b3f3285477d8c9a375"} Dec 03 06:53:44 crc kubenswrapper[4475]: I1203 06:53:44.215760 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-g9t4l" event={"ID":"8f42839e-dbc4-445a-a15b-c3aa14813958","Type":"ContainerDied","Data":"53948489397bbbfdf5f766211088d7f12fcd2dfbc8c3da6493e5abc49e3b41f5"} Dec 03 06:53:44 crc kubenswrapper[4475]: I1203 06:53:44.215773 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-g9t4l" 
event={"ID":"8f42839e-dbc4-445a-a15b-c3aa14813958","Type":"ContainerDied","Data":"32897756f3658fda95db77180a0553a9d8656ed49c3ae5a017d32f5c5133a5a9"} Dec 03 06:53:44 crc kubenswrapper[4475]: I1203 06:53:44.215792 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-g9t4l" event={"ID":"8f42839e-dbc4-445a-a15b-c3aa14813958","Type":"ContainerDied","Data":"5e288f95676d5823cd3cb005318489d2f629a8fb74ad17ce6a67978d76006192"} Dec 03 06:53:44 crc kubenswrapper[4475]: I1203 06:53:44.217053 4475 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-9b2j8_f3a17c67-95e0-4889-8a30-64c08b6720f4/kube-multus/2.log" Dec 03 06:53:44 crc kubenswrapper[4475]: I1203 06:53:44.217400 4475 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-9b2j8_f3a17c67-95e0-4889-8a30-64c08b6720f4/kube-multus/1.log" Dec 03 06:53:44 crc kubenswrapper[4475]: I1203 06:53:44.217432 4475 generic.go:334] "Generic (PLEG): container finished" podID="f3a17c67-95e0-4889-8a30-64c08b6720f4" containerID="2e2971b82e4f9806c53d67763a76ebe8ebaaf116ff13a887e7d02d3fd665eafe" exitCode=2 Dec 03 06:53:44 crc kubenswrapper[4475]: I1203 06:53:44.217475 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-9b2j8" event={"ID":"f3a17c67-95e0-4889-8a30-64c08b6720f4","Type":"ContainerDied","Data":"2e2971b82e4f9806c53d67763a76ebe8ebaaf116ff13a887e7d02d3fd665eafe"} Dec 03 06:53:44 crc kubenswrapper[4475]: I1203 06:53:44.217516 4475 scope.go:117] "RemoveContainer" containerID="4124e8c8426150d1057ec040dd3bfd12c7def09c85144927fd48515e9e9e9685" Dec 03 06:53:44 crc kubenswrapper[4475]: I1203 06:53:44.217905 4475 scope.go:117] "RemoveContainer" containerID="2e2971b82e4f9806c53d67763a76ebe8ebaaf116ff13a887e7d02d3fd665eafe" Dec 03 06:53:44 crc kubenswrapper[4475]: E1203 06:53:44.218052 4475 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-multus\" with 
CrashLoopBackOff: \"back-off 20s restarting failed container=kube-multus pod=multus-9b2j8_openshift-multus(f3a17c67-95e0-4889-8a30-64c08b6720f4)\"" pod="openshift-multus/multus-9b2j8" podUID="f3a17c67-95e0-4889-8a30-64c08b6720f4" Dec 03 06:53:44 crc kubenswrapper[4475]: I1203 06:53:44.336654 4475 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-g9t4l_8f42839e-dbc4-445a-a15b-c3aa14813958/ovnkube-controller/3.log" Dec 03 06:53:44 crc kubenswrapper[4475]: I1203 06:53:44.338594 4475 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-g9t4l_8f42839e-dbc4-445a-a15b-c3aa14813958/ovn-acl-logging/0.log" Dec 03 06:53:44 crc kubenswrapper[4475]: I1203 06:53:44.338964 4475 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-g9t4l_8f42839e-dbc4-445a-a15b-c3aa14813958/ovn-controller/0.log" Dec 03 06:53:44 crc kubenswrapper[4475]: I1203 06:53:44.339270 4475 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-g9t4l" Dec 03 06:53:44 crc kubenswrapper[4475]: I1203 06:53:44.377966 4475 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-qzvpv"] Dec 03 06:53:44 crc kubenswrapper[4475]: E1203 06:53:44.378139 4475 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8f42839e-dbc4-445a-a15b-c3aa14813958" containerName="ovnkube-controller" Dec 03 06:53:44 crc kubenswrapper[4475]: I1203 06:53:44.378155 4475 state_mem.go:107] "Deleted CPUSet assignment" podUID="8f42839e-dbc4-445a-a15b-c3aa14813958" containerName="ovnkube-controller" Dec 03 06:53:44 crc kubenswrapper[4475]: E1203 06:53:44.378163 4475 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8f42839e-dbc4-445a-a15b-c3aa14813958" containerName="kubecfg-setup" Dec 03 06:53:44 crc kubenswrapper[4475]: I1203 06:53:44.378168 4475 state_mem.go:107] "Deleted CPUSet assignment" podUID="8f42839e-dbc4-445a-a15b-c3aa14813958" containerName="kubecfg-setup" Dec 03 06:53:44 crc kubenswrapper[4475]: E1203 06:53:44.378175 4475 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8f42839e-dbc4-445a-a15b-c3aa14813958" containerName="ovn-acl-logging" Dec 03 06:53:44 crc kubenswrapper[4475]: I1203 06:53:44.378180 4475 state_mem.go:107] "Deleted CPUSet assignment" podUID="8f42839e-dbc4-445a-a15b-c3aa14813958" containerName="ovn-acl-logging" Dec 03 06:53:44 crc kubenswrapper[4475]: E1203 06:53:44.378189 4475 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8f42839e-dbc4-445a-a15b-c3aa14813958" containerName="ovnkube-controller" Dec 03 06:53:44 crc kubenswrapper[4475]: I1203 06:53:44.378195 4475 state_mem.go:107] "Deleted CPUSet assignment" podUID="8f42839e-dbc4-445a-a15b-c3aa14813958" containerName="ovnkube-controller" Dec 03 06:53:44 crc kubenswrapper[4475]: E1203 06:53:44.378202 4475 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="8f42839e-dbc4-445a-a15b-c3aa14813958" containerName="ovnkube-controller" Dec 03 06:53:44 crc kubenswrapper[4475]: I1203 06:53:44.378207 4475 state_mem.go:107] "Deleted CPUSet assignment" podUID="8f42839e-dbc4-445a-a15b-c3aa14813958" containerName="ovnkube-controller" Dec 03 06:53:44 crc kubenswrapper[4475]: E1203 06:53:44.378216 4475 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8f42839e-dbc4-445a-a15b-c3aa14813958" containerName="northd" Dec 03 06:53:44 crc kubenswrapper[4475]: I1203 06:53:44.378221 4475 state_mem.go:107] "Deleted CPUSet assignment" podUID="8f42839e-dbc4-445a-a15b-c3aa14813958" containerName="northd" Dec 03 06:53:44 crc kubenswrapper[4475]: E1203 06:53:44.378227 4475 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8f42839e-dbc4-445a-a15b-c3aa14813958" containerName="nbdb" Dec 03 06:53:44 crc kubenswrapper[4475]: I1203 06:53:44.378232 4475 state_mem.go:107] "Deleted CPUSet assignment" podUID="8f42839e-dbc4-445a-a15b-c3aa14813958" containerName="nbdb" Dec 03 06:53:44 crc kubenswrapper[4475]: E1203 06:53:44.378241 4475 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8f42839e-dbc4-445a-a15b-c3aa14813958" containerName="sbdb" Dec 03 06:53:44 crc kubenswrapper[4475]: I1203 06:53:44.378246 4475 state_mem.go:107] "Deleted CPUSet assignment" podUID="8f42839e-dbc4-445a-a15b-c3aa14813958" containerName="sbdb" Dec 03 06:53:44 crc kubenswrapper[4475]: E1203 06:53:44.378253 4475 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8f42839e-dbc4-445a-a15b-c3aa14813958" containerName="kube-rbac-proxy-ovn-metrics" Dec 03 06:53:44 crc kubenswrapper[4475]: I1203 06:53:44.378259 4475 state_mem.go:107] "Deleted CPUSet assignment" podUID="8f42839e-dbc4-445a-a15b-c3aa14813958" containerName="kube-rbac-proxy-ovn-metrics" Dec 03 06:53:44 crc kubenswrapper[4475]: E1203 06:53:44.378267 4475 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8f42839e-dbc4-445a-a15b-c3aa14813958" 
containerName="ovnkube-controller" Dec 03 06:53:44 crc kubenswrapper[4475]: I1203 06:53:44.378272 4475 state_mem.go:107] "Deleted CPUSet assignment" podUID="8f42839e-dbc4-445a-a15b-c3aa14813958" containerName="ovnkube-controller" Dec 03 06:53:44 crc kubenswrapper[4475]: E1203 06:53:44.378279 4475 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8f42839e-dbc4-445a-a15b-c3aa14813958" containerName="ovn-controller" Dec 03 06:53:44 crc kubenswrapper[4475]: I1203 06:53:44.378284 4475 state_mem.go:107] "Deleted CPUSet assignment" podUID="8f42839e-dbc4-445a-a15b-c3aa14813958" containerName="ovn-controller" Dec 03 06:53:44 crc kubenswrapper[4475]: E1203 06:53:44.378292 4475 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8f42839e-dbc4-445a-a15b-c3aa14813958" containerName="kube-rbac-proxy-node" Dec 03 06:53:44 crc kubenswrapper[4475]: I1203 06:53:44.378297 4475 state_mem.go:107] "Deleted CPUSet assignment" podUID="8f42839e-dbc4-445a-a15b-c3aa14813958" containerName="kube-rbac-proxy-node" Dec 03 06:53:44 crc kubenswrapper[4475]: I1203 06:53:44.378375 4475 memory_manager.go:354] "RemoveStaleState removing state" podUID="8f42839e-dbc4-445a-a15b-c3aa14813958" containerName="sbdb" Dec 03 06:53:44 crc kubenswrapper[4475]: I1203 06:53:44.378384 4475 memory_manager.go:354] "RemoveStaleState removing state" podUID="8f42839e-dbc4-445a-a15b-c3aa14813958" containerName="nbdb" Dec 03 06:53:44 crc kubenswrapper[4475]: I1203 06:53:44.378390 4475 memory_manager.go:354] "RemoveStaleState removing state" podUID="8f42839e-dbc4-445a-a15b-c3aa14813958" containerName="ovnkube-controller" Dec 03 06:53:44 crc kubenswrapper[4475]: I1203 06:53:44.378400 4475 memory_manager.go:354] "RemoveStaleState removing state" podUID="8f42839e-dbc4-445a-a15b-c3aa14813958" containerName="northd" Dec 03 06:53:44 crc kubenswrapper[4475]: I1203 06:53:44.378407 4475 memory_manager.go:354] "RemoveStaleState removing state" podUID="8f42839e-dbc4-445a-a15b-c3aa14813958" 
containerName="ovn-controller" Dec 03 06:53:44 crc kubenswrapper[4475]: I1203 06:53:44.378413 4475 memory_manager.go:354] "RemoveStaleState removing state" podUID="8f42839e-dbc4-445a-a15b-c3aa14813958" containerName="ovn-acl-logging" Dec 03 06:53:44 crc kubenswrapper[4475]: I1203 06:53:44.378419 4475 memory_manager.go:354] "RemoveStaleState removing state" podUID="8f42839e-dbc4-445a-a15b-c3aa14813958" containerName="kube-rbac-proxy-ovn-metrics" Dec 03 06:53:44 crc kubenswrapper[4475]: I1203 06:53:44.378425 4475 memory_manager.go:354] "RemoveStaleState removing state" podUID="8f42839e-dbc4-445a-a15b-c3aa14813958" containerName="ovnkube-controller" Dec 03 06:53:44 crc kubenswrapper[4475]: I1203 06:53:44.378430 4475 memory_manager.go:354] "RemoveStaleState removing state" podUID="8f42839e-dbc4-445a-a15b-c3aa14813958" containerName="ovnkube-controller" Dec 03 06:53:44 crc kubenswrapper[4475]: I1203 06:53:44.378438 4475 memory_manager.go:354] "RemoveStaleState removing state" podUID="8f42839e-dbc4-445a-a15b-c3aa14813958" containerName="kube-rbac-proxy-node" Dec 03 06:53:44 crc kubenswrapper[4475]: I1203 06:53:44.378445 4475 memory_manager.go:354] "RemoveStaleState removing state" podUID="8f42839e-dbc4-445a-a15b-c3aa14813958" containerName="ovnkube-controller" Dec 03 06:53:44 crc kubenswrapper[4475]: E1203 06:53:44.378544 4475 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8f42839e-dbc4-445a-a15b-c3aa14813958" containerName="ovnkube-controller" Dec 03 06:53:44 crc kubenswrapper[4475]: I1203 06:53:44.378551 4475 state_mem.go:107] "Deleted CPUSet assignment" podUID="8f42839e-dbc4-445a-a15b-c3aa14813958" containerName="ovnkube-controller" Dec 03 06:53:44 crc kubenswrapper[4475]: I1203 06:53:44.378622 4475 memory_manager.go:354] "RemoveStaleState removing state" podUID="8f42839e-dbc4-445a-a15b-c3aa14813958" containerName="ovnkube-controller" Dec 03 06:53:44 crc kubenswrapper[4475]: I1203 06:53:44.380043 4475 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-qzvpv" Dec 03 06:53:44 crc kubenswrapper[4475]: I1203 06:53:44.450298 4475 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/8f42839e-dbc4-445a-a15b-c3aa14813958-ovnkube-script-lib\") pod \"8f42839e-dbc4-445a-a15b-c3aa14813958\" (UID: \"8f42839e-dbc4-445a-a15b-c3aa14813958\") " Dec 03 06:53:44 crc kubenswrapper[4475]: I1203 06:53:44.450346 4475 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/8f42839e-dbc4-445a-a15b-c3aa14813958-log-socket\") pod \"8f42839e-dbc4-445a-a15b-c3aa14813958\" (UID: \"8f42839e-dbc4-445a-a15b-c3aa14813958\") " Dec 03 06:53:44 crc kubenswrapper[4475]: I1203 06:53:44.450372 4475 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/8f42839e-dbc4-445a-a15b-c3aa14813958-ovn-node-metrics-cert\") pod \"8f42839e-dbc4-445a-a15b-c3aa14813958\" (UID: \"8f42839e-dbc4-445a-a15b-c3aa14813958\") " Dec 03 06:53:44 crc kubenswrapper[4475]: I1203 06:53:44.450391 4475 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ppdm2\" (UniqueName: \"kubernetes.io/projected/8f42839e-dbc4-445a-a15b-c3aa14813958-kube-api-access-ppdm2\") pod \"8f42839e-dbc4-445a-a15b-c3aa14813958\" (UID: \"8f42839e-dbc4-445a-a15b-c3aa14813958\") " Dec 03 06:53:44 crc kubenswrapper[4475]: I1203 06:53:44.450411 4475 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/8f42839e-dbc4-445a-a15b-c3aa14813958-systemd-units\") pod \"8f42839e-dbc4-445a-a15b-c3aa14813958\" (UID: \"8f42839e-dbc4-445a-a15b-c3aa14813958\") " Dec 03 06:53:44 crc kubenswrapper[4475]: I1203 06:53:44.450427 4475 reconciler_common.go:159] "operationExecutor.UnmountVolume 
started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/8f42839e-dbc4-445a-a15b-c3aa14813958-host-kubelet\") pod \"8f42839e-dbc4-445a-a15b-c3aa14813958\" (UID: \"8f42839e-dbc4-445a-a15b-c3aa14813958\") " Dec 03 06:53:44 crc kubenswrapper[4475]: I1203 06:53:44.450436 4475 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/8f42839e-dbc4-445a-a15b-c3aa14813958-log-socket" (OuterVolumeSpecName: "log-socket") pod "8f42839e-dbc4-445a-a15b-c3aa14813958" (UID: "8f42839e-dbc4-445a-a15b-c3aa14813958"). InnerVolumeSpecName "log-socket". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 03 06:53:44 crc kubenswrapper[4475]: I1203 06:53:44.450478 4475 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/8f42839e-dbc4-445a-a15b-c3aa14813958-host-cni-bin\") pod \"8f42839e-dbc4-445a-a15b-c3aa14813958\" (UID: \"8f42839e-dbc4-445a-a15b-c3aa14813958\") " Dec 03 06:53:44 crc kubenswrapper[4475]: I1203 06:53:44.450521 4475 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/8f42839e-dbc4-445a-a15b-c3aa14813958-host-cni-bin" (OuterVolumeSpecName: "host-cni-bin") pod "8f42839e-dbc4-445a-a15b-c3aa14813958" (UID: "8f42839e-dbc4-445a-a15b-c3aa14813958"). InnerVolumeSpecName "host-cni-bin". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 03 06:53:44 crc kubenswrapper[4475]: I1203 06:53:44.450528 4475 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/8f42839e-dbc4-445a-a15b-c3aa14813958-run-openvswitch\") pod \"8f42839e-dbc4-445a-a15b-c3aa14813958\" (UID: \"8f42839e-dbc4-445a-a15b-c3aa14813958\") " Dec 03 06:53:44 crc kubenswrapper[4475]: I1203 06:53:44.450566 4475 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/8f42839e-dbc4-445a-a15b-c3aa14813958-run-ovn\") pod \"8f42839e-dbc4-445a-a15b-c3aa14813958\" (UID: \"8f42839e-dbc4-445a-a15b-c3aa14813958\") " Dec 03 06:53:44 crc kubenswrapper[4475]: I1203 06:53:44.450596 4475 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/8f42839e-dbc4-445a-a15b-c3aa14813958-etc-openvswitch\") pod \"8f42839e-dbc4-445a-a15b-c3aa14813958\" (UID: \"8f42839e-dbc4-445a-a15b-c3aa14813958\") " Dec 03 06:53:44 crc kubenswrapper[4475]: I1203 06:53:44.450609 4475 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/8f42839e-dbc4-445a-a15b-c3aa14813958-run-systemd\") pod \"8f42839e-dbc4-445a-a15b-c3aa14813958\" (UID: \"8f42839e-dbc4-445a-a15b-c3aa14813958\") " Dec 03 06:53:44 crc kubenswrapper[4475]: I1203 06:53:44.450626 4475 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/8f42839e-dbc4-445a-a15b-c3aa14813958-env-overrides\") pod \"8f42839e-dbc4-445a-a15b-c3aa14813958\" (UID: \"8f42839e-dbc4-445a-a15b-c3aa14813958\") " Dec 03 06:53:44 crc kubenswrapper[4475]: I1203 06:53:44.450651 4475 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: 
\"kubernetes.io/host-path/8f42839e-dbc4-445a-a15b-c3aa14813958-host-run-ovn-kubernetes\") pod \"8f42839e-dbc4-445a-a15b-c3aa14813958\" (UID: \"8f42839e-dbc4-445a-a15b-c3aa14813958\") " Dec 03 06:53:44 crc kubenswrapper[4475]: I1203 06:53:44.450662 4475 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/8f42839e-dbc4-445a-a15b-c3aa14813958-host-run-netns\") pod \"8f42839e-dbc4-445a-a15b-c3aa14813958\" (UID: \"8f42839e-dbc4-445a-a15b-c3aa14813958\") " Dec 03 06:53:44 crc kubenswrapper[4475]: I1203 06:53:44.450678 4475 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/8f42839e-dbc4-445a-a15b-c3aa14813958-host-var-lib-cni-networks-ovn-kubernetes\") pod \"8f42839e-dbc4-445a-a15b-c3aa14813958\" (UID: \"8f42839e-dbc4-445a-a15b-c3aa14813958\") " Dec 03 06:53:44 crc kubenswrapper[4475]: I1203 06:53:44.450691 4475 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/8f42839e-dbc4-445a-a15b-c3aa14813958-node-log\") pod \"8f42839e-dbc4-445a-a15b-c3aa14813958\" (UID: \"8f42839e-dbc4-445a-a15b-c3aa14813958\") " Dec 03 06:53:44 crc kubenswrapper[4475]: I1203 06:53:44.450707 4475 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/8f42839e-dbc4-445a-a15b-c3aa14813958-host-cni-netd\") pod \"8f42839e-dbc4-445a-a15b-c3aa14813958\" (UID: \"8f42839e-dbc4-445a-a15b-c3aa14813958\") " Dec 03 06:53:44 crc kubenswrapper[4475]: I1203 06:53:44.450719 4475 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/8f42839e-dbc4-445a-a15b-c3aa14813958-host-slash\") pod \"8f42839e-dbc4-445a-a15b-c3aa14813958\" (UID: \"8f42839e-dbc4-445a-a15b-c3aa14813958\") " Dec 03 
06:53:44 crc kubenswrapper[4475]: I1203 06:53:44.450737 4475 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/8f42839e-dbc4-445a-a15b-c3aa14813958-ovnkube-config\") pod \"8f42839e-dbc4-445a-a15b-c3aa14813958\" (UID: \"8f42839e-dbc4-445a-a15b-c3aa14813958\") " Dec 03 06:53:44 crc kubenswrapper[4475]: I1203 06:53:44.450754 4475 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/8f42839e-dbc4-445a-a15b-c3aa14813958-var-lib-openvswitch\") pod \"8f42839e-dbc4-445a-a15b-c3aa14813958\" (UID: \"8f42839e-dbc4-445a-a15b-c3aa14813958\") " Dec 03 06:53:44 crc kubenswrapper[4475]: I1203 06:53:44.450773 4475 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8f42839e-dbc4-445a-a15b-c3aa14813958-ovnkube-script-lib" (OuterVolumeSpecName: "ovnkube-script-lib") pod "8f42839e-dbc4-445a-a15b-c3aa14813958" (UID: "8f42839e-dbc4-445a-a15b-c3aa14813958"). InnerVolumeSpecName "ovnkube-script-lib". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 06:53:44 crc kubenswrapper[4475]: I1203 06:53:44.450816 4475 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/8f42839e-dbc4-445a-a15b-c3aa14813958-host-run-ovn-kubernetes" (OuterVolumeSpecName: "host-run-ovn-kubernetes") pod "8f42839e-dbc4-445a-a15b-c3aa14813958" (UID: "8f42839e-dbc4-445a-a15b-c3aa14813958"). InnerVolumeSpecName "host-run-ovn-kubernetes". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 03 06:53:44 crc kubenswrapper[4475]: I1203 06:53:44.450836 4475 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/8f42839e-dbc4-445a-a15b-c3aa14813958-run-openvswitch" (OuterVolumeSpecName: "run-openvswitch") pod "8f42839e-dbc4-445a-a15b-c3aa14813958" (UID: "8f42839e-dbc4-445a-a15b-c3aa14813958"). 
InnerVolumeSpecName "run-openvswitch". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 03 06:53:44 crc kubenswrapper[4475]: I1203 06:53:44.450857 4475 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/8f42839e-dbc4-445a-a15b-c3aa14813958-var-lib-openvswitch" (OuterVolumeSpecName: "var-lib-openvswitch") pod "8f42839e-dbc4-445a-a15b-c3aa14813958" (UID: "8f42839e-dbc4-445a-a15b-c3aa14813958"). InnerVolumeSpecName "var-lib-openvswitch". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 03 06:53:44 crc kubenswrapper[4475]: I1203 06:53:44.450877 4475 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/8f42839e-dbc4-445a-a15b-c3aa14813958-host-run-netns" (OuterVolumeSpecName: "host-run-netns") pod "8f42839e-dbc4-445a-a15b-c3aa14813958" (UID: "8f42839e-dbc4-445a-a15b-c3aa14813958"). InnerVolumeSpecName "host-run-netns". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 03 06:53:44 crc kubenswrapper[4475]: I1203 06:53:44.450893 4475 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/8f42839e-dbc4-445a-a15b-c3aa14813958-run-ovn" (OuterVolumeSpecName: "run-ovn") pod "8f42839e-dbc4-445a-a15b-c3aa14813958" (UID: "8f42839e-dbc4-445a-a15b-c3aa14813958"). InnerVolumeSpecName "run-ovn". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 03 06:53:44 crc kubenswrapper[4475]: I1203 06:53:44.450919 4475 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/8f42839e-dbc4-445a-a15b-c3aa14813958-etc-openvswitch" (OuterVolumeSpecName: "etc-openvswitch") pod "8f42839e-dbc4-445a-a15b-c3aa14813958" (UID: "8f42839e-dbc4-445a-a15b-c3aa14813958"). InnerVolumeSpecName "etc-openvswitch". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 03 06:53:44 crc kubenswrapper[4475]: I1203 06:53:44.450966 4475 reconciler_common.go:293] "Volume detached for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/8f42839e-dbc4-445a-a15b-c3aa14813958-log-socket\") on node \"crc\" DevicePath \"\"" Dec 03 06:53:44 crc kubenswrapper[4475]: I1203 06:53:44.450977 4475 reconciler_common.go:293] "Volume detached for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/8f42839e-dbc4-445a-a15b-c3aa14813958-host-cni-bin\") on node \"crc\" DevicePath \"\"" Dec 03 06:53:44 crc kubenswrapper[4475]: I1203 06:53:44.450986 4475 reconciler_common.go:293] "Volume detached for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/8f42839e-dbc4-445a-a15b-c3aa14813958-run-openvswitch\") on node \"crc\" DevicePath \"\"" Dec 03 06:53:44 crc kubenswrapper[4475]: I1203 06:53:44.450994 4475 reconciler_common.go:293] "Volume detached for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/8f42839e-dbc4-445a-a15b-c3aa14813958-run-ovn\") on node \"crc\" DevicePath \"\"" Dec 03 06:53:44 crc kubenswrapper[4475]: I1203 06:53:44.451002 4475 reconciler_common.go:293] "Volume detached for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/8f42839e-dbc4-445a-a15b-c3aa14813958-etc-openvswitch\") on node \"crc\" DevicePath \"\"" Dec 03 06:53:44 crc kubenswrapper[4475]: I1203 06:53:44.451010 4475 reconciler_common.go:293] "Volume detached for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/8f42839e-dbc4-445a-a15b-c3aa14813958-host-run-ovn-kubernetes\") on node \"crc\" DevicePath \"\"" Dec 03 06:53:44 crc kubenswrapper[4475]: I1203 06:53:44.451017 4475 reconciler_common.go:293] "Volume detached for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/8f42839e-dbc4-445a-a15b-c3aa14813958-host-run-netns\") on node \"crc\" DevicePath \"\"" Dec 03 06:53:44 crc kubenswrapper[4475]: I1203 06:53:44.451024 4475 
reconciler_common.go:293] "Volume detached for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/8f42839e-dbc4-445a-a15b-c3aa14813958-var-lib-openvswitch\") on node \"crc\" DevicePath \"\"" Dec 03 06:53:44 crc kubenswrapper[4475]: I1203 06:53:44.451032 4475 reconciler_common.go:293] "Volume detached for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/8f42839e-dbc4-445a-a15b-c3aa14813958-ovnkube-script-lib\") on node \"crc\" DevicePath \"\"" Dec 03 06:53:44 crc kubenswrapper[4475]: I1203 06:53:44.451052 4475 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/8f42839e-dbc4-445a-a15b-c3aa14813958-host-var-lib-cni-networks-ovn-kubernetes" (OuterVolumeSpecName: "host-var-lib-cni-networks-ovn-kubernetes") pod "8f42839e-dbc4-445a-a15b-c3aa14813958" (UID: "8f42839e-dbc4-445a-a15b-c3aa14813958"). InnerVolumeSpecName "host-var-lib-cni-networks-ovn-kubernetes". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 03 06:53:44 crc kubenswrapper[4475]: I1203 06:53:44.451071 4475 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/8f42839e-dbc4-445a-a15b-c3aa14813958-node-log" (OuterVolumeSpecName: "node-log") pod "8f42839e-dbc4-445a-a15b-c3aa14813958" (UID: "8f42839e-dbc4-445a-a15b-c3aa14813958"). InnerVolumeSpecName "node-log". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 03 06:53:44 crc kubenswrapper[4475]: I1203 06:53:44.451087 4475 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/8f42839e-dbc4-445a-a15b-c3aa14813958-host-cni-netd" (OuterVolumeSpecName: "host-cni-netd") pod "8f42839e-dbc4-445a-a15b-c3aa14813958" (UID: "8f42839e-dbc4-445a-a15b-c3aa14813958"). InnerVolumeSpecName "host-cni-netd". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 03 06:53:44 crc kubenswrapper[4475]: I1203 06:53:44.451102 4475 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/8f42839e-dbc4-445a-a15b-c3aa14813958-host-slash" (OuterVolumeSpecName: "host-slash") pod "8f42839e-dbc4-445a-a15b-c3aa14813958" (UID: "8f42839e-dbc4-445a-a15b-c3aa14813958"). InnerVolumeSpecName "host-slash". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 03 06:53:44 crc kubenswrapper[4475]: I1203 06:53:44.451118 4475 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/8f42839e-dbc4-445a-a15b-c3aa14813958-systemd-units" (OuterVolumeSpecName: "systemd-units") pod "8f42839e-dbc4-445a-a15b-c3aa14813958" (UID: "8f42839e-dbc4-445a-a15b-c3aa14813958"). InnerVolumeSpecName "systemd-units". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 03 06:53:44 crc kubenswrapper[4475]: I1203 06:53:44.451159 4475 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/8f42839e-dbc4-445a-a15b-c3aa14813958-host-kubelet" (OuterVolumeSpecName: "host-kubelet") pod "8f42839e-dbc4-445a-a15b-c3aa14813958" (UID: "8f42839e-dbc4-445a-a15b-c3aa14813958"). InnerVolumeSpecName "host-kubelet". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 03 06:53:44 crc kubenswrapper[4475]: I1203 06:53:44.451405 4475 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8f42839e-dbc4-445a-a15b-c3aa14813958-ovnkube-config" (OuterVolumeSpecName: "ovnkube-config") pod "8f42839e-dbc4-445a-a15b-c3aa14813958" (UID: "8f42839e-dbc4-445a-a15b-c3aa14813958"). InnerVolumeSpecName "ovnkube-config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 06:53:44 crc kubenswrapper[4475]: I1203 06:53:44.451440 4475 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8f42839e-dbc4-445a-a15b-c3aa14813958-env-overrides" (OuterVolumeSpecName: "env-overrides") pod "8f42839e-dbc4-445a-a15b-c3aa14813958" (UID: "8f42839e-dbc4-445a-a15b-c3aa14813958"). InnerVolumeSpecName "env-overrides". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 06:53:44 crc kubenswrapper[4475]: I1203 06:53:44.454922 4475 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8f42839e-dbc4-445a-a15b-c3aa14813958-ovn-node-metrics-cert" (OuterVolumeSpecName: "ovn-node-metrics-cert") pod "8f42839e-dbc4-445a-a15b-c3aa14813958" (UID: "8f42839e-dbc4-445a-a15b-c3aa14813958"). InnerVolumeSpecName "ovn-node-metrics-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 06:53:44 crc kubenswrapper[4475]: I1203 06:53:44.455025 4475 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f42839e-dbc4-445a-a15b-c3aa14813958-kube-api-access-ppdm2" (OuterVolumeSpecName: "kube-api-access-ppdm2") pod "8f42839e-dbc4-445a-a15b-c3aa14813958" (UID: "8f42839e-dbc4-445a-a15b-c3aa14813958"). InnerVolumeSpecName "kube-api-access-ppdm2". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 06:53:44 crc kubenswrapper[4475]: I1203 06:53:44.460417 4475 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/8f42839e-dbc4-445a-a15b-c3aa14813958-run-systemd" (OuterVolumeSpecName: "run-systemd") pod "8f42839e-dbc4-445a-a15b-c3aa14813958" (UID: "8f42839e-dbc4-445a-a15b-c3aa14813958"). InnerVolumeSpecName "run-systemd". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 03 06:53:44 crc kubenswrapper[4475]: I1203 06:53:44.551882 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/d4f2e301-842d-4132-b9ad-181fbfbba4a6-run-openvswitch\") pod \"ovnkube-node-qzvpv\" (UID: \"d4f2e301-842d-4132-b9ad-181fbfbba4a6\") " pod="openshift-ovn-kubernetes/ovnkube-node-qzvpv" Dec 03 06:53:44 crc kubenswrapper[4475]: I1203 06:53:44.551919 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/d4f2e301-842d-4132-b9ad-181fbfbba4a6-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-qzvpv\" (UID: \"d4f2e301-842d-4132-b9ad-181fbfbba4a6\") " pod="openshift-ovn-kubernetes/ovnkube-node-qzvpv" Dec 03 06:53:44 crc kubenswrapper[4475]: I1203 06:53:44.551942 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/d4f2e301-842d-4132-b9ad-181fbfbba4a6-run-ovn\") pod \"ovnkube-node-qzvpv\" (UID: \"d4f2e301-842d-4132-b9ad-181fbfbba4a6\") " pod="openshift-ovn-kubernetes/ovnkube-node-qzvpv" Dec 03 06:53:44 crc kubenswrapper[4475]: I1203 06:53:44.551996 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/d4f2e301-842d-4132-b9ad-181fbfbba4a6-systemd-units\") pod \"ovnkube-node-qzvpv\" (UID: \"d4f2e301-842d-4132-b9ad-181fbfbba4a6\") " pod="openshift-ovn-kubernetes/ovnkube-node-qzvpv" Dec 03 06:53:44 crc kubenswrapper[4475]: I1203 06:53:44.552015 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/d4f2e301-842d-4132-b9ad-181fbfbba4a6-ovnkube-script-lib\") pod 
\"ovnkube-node-qzvpv\" (UID: \"d4f2e301-842d-4132-b9ad-181fbfbba4a6\") " pod="openshift-ovn-kubernetes/ovnkube-node-qzvpv" Dec 03 06:53:44 crc kubenswrapper[4475]: I1203 06:53:44.552032 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fn74h\" (UniqueName: \"kubernetes.io/projected/d4f2e301-842d-4132-b9ad-181fbfbba4a6-kube-api-access-fn74h\") pod \"ovnkube-node-qzvpv\" (UID: \"d4f2e301-842d-4132-b9ad-181fbfbba4a6\") " pod="openshift-ovn-kubernetes/ovnkube-node-qzvpv" Dec 03 06:53:44 crc kubenswrapper[4475]: I1203 06:53:44.552050 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/d4f2e301-842d-4132-b9ad-181fbfbba4a6-host-cni-netd\") pod \"ovnkube-node-qzvpv\" (UID: \"d4f2e301-842d-4132-b9ad-181fbfbba4a6\") " pod="openshift-ovn-kubernetes/ovnkube-node-qzvpv" Dec 03 06:53:44 crc kubenswrapper[4475]: I1203 06:53:44.552069 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/d4f2e301-842d-4132-b9ad-181fbfbba4a6-ovn-node-metrics-cert\") pod \"ovnkube-node-qzvpv\" (UID: \"d4f2e301-842d-4132-b9ad-181fbfbba4a6\") " pod="openshift-ovn-kubernetes/ovnkube-node-qzvpv" Dec 03 06:53:44 crc kubenswrapper[4475]: I1203 06:53:44.552089 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/d4f2e301-842d-4132-b9ad-181fbfbba4a6-env-overrides\") pod \"ovnkube-node-qzvpv\" (UID: \"d4f2e301-842d-4132-b9ad-181fbfbba4a6\") " pod="openshift-ovn-kubernetes/ovnkube-node-qzvpv" Dec 03 06:53:44 crc kubenswrapper[4475]: I1203 06:53:44.552101 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: 
\"kubernetes.io/host-path/d4f2e301-842d-4132-b9ad-181fbfbba4a6-host-cni-bin\") pod \"ovnkube-node-qzvpv\" (UID: \"d4f2e301-842d-4132-b9ad-181fbfbba4a6\") " pod="openshift-ovn-kubernetes/ovnkube-node-qzvpv" Dec 03 06:53:44 crc kubenswrapper[4475]: I1203 06:53:44.552118 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/d4f2e301-842d-4132-b9ad-181fbfbba4a6-host-kubelet\") pod \"ovnkube-node-qzvpv\" (UID: \"d4f2e301-842d-4132-b9ad-181fbfbba4a6\") " pod="openshift-ovn-kubernetes/ovnkube-node-qzvpv" Dec 03 06:53:44 crc kubenswrapper[4475]: I1203 06:53:44.552135 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/d4f2e301-842d-4132-b9ad-181fbfbba4a6-host-run-ovn-kubernetes\") pod \"ovnkube-node-qzvpv\" (UID: \"d4f2e301-842d-4132-b9ad-181fbfbba4a6\") " pod="openshift-ovn-kubernetes/ovnkube-node-qzvpv" Dec 03 06:53:44 crc kubenswrapper[4475]: I1203 06:53:44.552170 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/d4f2e301-842d-4132-b9ad-181fbfbba4a6-run-systemd\") pod \"ovnkube-node-qzvpv\" (UID: \"d4f2e301-842d-4132-b9ad-181fbfbba4a6\") " pod="openshift-ovn-kubernetes/ovnkube-node-qzvpv" Dec 03 06:53:44 crc kubenswrapper[4475]: I1203 06:53:44.552192 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/d4f2e301-842d-4132-b9ad-181fbfbba4a6-log-socket\") pod \"ovnkube-node-qzvpv\" (UID: \"d4f2e301-842d-4132-b9ad-181fbfbba4a6\") " pod="openshift-ovn-kubernetes/ovnkube-node-qzvpv" Dec 03 06:53:44 crc kubenswrapper[4475]: I1203 06:53:44.552206 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/d4f2e301-842d-4132-b9ad-181fbfbba4a6-etc-openvswitch\") pod \"ovnkube-node-qzvpv\" (UID: \"d4f2e301-842d-4132-b9ad-181fbfbba4a6\") " pod="openshift-ovn-kubernetes/ovnkube-node-qzvpv" Dec 03 06:53:44 crc kubenswrapper[4475]: I1203 06:53:44.552220 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/d4f2e301-842d-4132-b9ad-181fbfbba4a6-ovnkube-config\") pod \"ovnkube-node-qzvpv\" (UID: \"d4f2e301-842d-4132-b9ad-181fbfbba4a6\") " pod="openshift-ovn-kubernetes/ovnkube-node-qzvpv" Dec 03 06:53:44 crc kubenswrapper[4475]: I1203 06:53:44.552237 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/d4f2e301-842d-4132-b9ad-181fbfbba4a6-node-log\") pod \"ovnkube-node-qzvpv\" (UID: \"d4f2e301-842d-4132-b9ad-181fbfbba4a6\") " pod="openshift-ovn-kubernetes/ovnkube-node-qzvpv" Dec 03 06:53:44 crc kubenswrapper[4475]: I1203 06:53:44.552250 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/d4f2e301-842d-4132-b9ad-181fbfbba4a6-host-run-netns\") pod \"ovnkube-node-qzvpv\" (UID: \"d4f2e301-842d-4132-b9ad-181fbfbba4a6\") " pod="openshift-ovn-kubernetes/ovnkube-node-qzvpv" Dec 03 06:53:44 crc kubenswrapper[4475]: I1203 06:53:44.552357 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/d4f2e301-842d-4132-b9ad-181fbfbba4a6-var-lib-openvswitch\") pod \"ovnkube-node-qzvpv\" (UID: \"d4f2e301-842d-4132-b9ad-181fbfbba4a6\") " pod="openshift-ovn-kubernetes/ovnkube-node-qzvpv" Dec 03 06:53:44 crc kubenswrapper[4475]: I1203 06:53:44.552408 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d4f2e301-842d-4132-b9ad-181fbfbba4a6-host-slash\") pod \"ovnkube-node-qzvpv\" (UID: \"d4f2e301-842d-4132-b9ad-181fbfbba4a6\") " pod="openshift-ovn-kubernetes/ovnkube-node-qzvpv" Dec 03 06:53:44 crc kubenswrapper[4475]: I1203 06:53:44.552445 4475 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ppdm2\" (UniqueName: \"kubernetes.io/projected/8f42839e-dbc4-445a-a15b-c3aa14813958-kube-api-access-ppdm2\") on node \"crc\" DevicePath \"\"" Dec 03 06:53:44 crc kubenswrapper[4475]: I1203 06:53:44.552471 4475 reconciler_common.go:293] "Volume detached for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/8f42839e-dbc4-445a-a15b-c3aa14813958-systemd-units\") on node \"crc\" DevicePath \"\"" Dec 03 06:53:44 crc kubenswrapper[4475]: I1203 06:53:44.552480 4475 reconciler_common.go:293] "Volume detached for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/8f42839e-dbc4-445a-a15b-c3aa14813958-host-kubelet\") on node \"crc\" DevicePath \"\"" Dec 03 06:53:44 crc kubenswrapper[4475]: I1203 06:53:44.552489 4475 reconciler_common.go:293] "Volume detached for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/8f42839e-dbc4-445a-a15b-c3aa14813958-run-systemd\") on node \"crc\" DevicePath \"\"" Dec 03 06:53:44 crc kubenswrapper[4475]: I1203 06:53:44.552497 4475 reconciler_common.go:293] "Volume detached for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/8f42839e-dbc4-445a-a15b-c3aa14813958-env-overrides\") on node \"crc\" DevicePath \"\"" Dec 03 06:53:44 crc kubenswrapper[4475]: I1203 06:53:44.552505 4475 reconciler_common.go:293] "Volume detached for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/8f42839e-dbc4-445a-a15b-c3aa14813958-host-var-lib-cni-networks-ovn-kubernetes\") on node \"crc\" DevicePath \"\"" Dec 03 06:53:44 crc kubenswrapper[4475]: I1203 06:53:44.552513 4475 
reconciler_common.go:293] "Volume detached for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/8f42839e-dbc4-445a-a15b-c3aa14813958-node-log\") on node \"crc\" DevicePath \"\"" Dec 03 06:53:44 crc kubenswrapper[4475]: I1203 06:53:44.552541 4475 reconciler_common.go:293] "Volume detached for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/8f42839e-dbc4-445a-a15b-c3aa14813958-host-cni-netd\") on node \"crc\" DevicePath \"\"" Dec 03 06:53:44 crc kubenswrapper[4475]: I1203 06:53:44.552548 4475 reconciler_common.go:293] "Volume detached for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/8f42839e-dbc4-445a-a15b-c3aa14813958-host-slash\") on node \"crc\" DevicePath \"\"" Dec 03 06:53:44 crc kubenswrapper[4475]: I1203 06:53:44.552556 4475 reconciler_common.go:293] "Volume detached for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/8f42839e-dbc4-445a-a15b-c3aa14813958-ovnkube-config\") on node \"crc\" DevicePath \"\"" Dec 03 06:53:44 crc kubenswrapper[4475]: I1203 06:53:44.552564 4475 reconciler_common.go:293] "Volume detached for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/8f42839e-dbc4-445a-a15b-c3aa14813958-ovn-node-metrics-cert\") on node \"crc\" DevicePath \"\"" Dec 03 06:53:44 crc kubenswrapper[4475]: I1203 06:53:44.653493 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/d4f2e301-842d-4132-b9ad-181fbfbba4a6-var-lib-openvswitch\") pod \"ovnkube-node-qzvpv\" (UID: \"d4f2e301-842d-4132-b9ad-181fbfbba4a6\") " pod="openshift-ovn-kubernetes/ovnkube-node-qzvpv" Dec 03 06:53:44 crc kubenswrapper[4475]: I1203 06:53:44.653526 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d4f2e301-842d-4132-b9ad-181fbfbba4a6-host-slash\") pod \"ovnkube-node-qzvpv\" (UID: \"d4f2e301-842d-4132-b9ad-181fbfbba4a6\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-qzvpv" Dec 03 06:53:44 crc kubenswrapper[4475]: I1203 06:53:44.653555 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/d4f2e301-842d-4132-b9ad-181fbfbba4a6-run-openvswitch\") pod \"ovnkube-node-qzvpv\" (UID: \"d4f2e301-842d-4132-b9ad-181fbfbba4a6\") " pod="openshift-ovn-kubernetes/ovnkube-node-qzvpv" Dec 03 06:53:44 crc kubenswrapper[4475]: I1203 06:53:44.653573 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/d4f2e301-842d-4132-b9ad-181fbfbba4a6-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-qzvpv\" (UID: \"d4f2e301-842d-4132-b9ad-181fbfbba4a6\") " pod="openshift-ovn-kubernetes/ovnkube-node-qzvpv" Dec 03 06:53:44 crc kubenswrapper[4475]: I1203 06:53:44.653592 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/d4f2e301-842d-4132-b9ad-181fbfbba4a6-run-ovn\") pod \"ovnkube-node-qzvpv\" (UID: \"d4f2e301-842d-4132-b9ad-181fbfbba4a6\") " pod="openshift-ovn-kubernetes/ovnkube-node-qzvpv" Dec 03 06:53:44 crc kubenswrapper[4475]: I1203 06:53:44.653608 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/d4f2e301-842d-4132-b9ad-181fbfbba4a6-systemd-units\") pod \"ovnkube-node-qzvpv\" (UID: \"d4f2e301-842d-4132-b9ad-181fbfbba4a6\") " pod="openshift-ovn-kubernetes/ovnkube-node-qzvpv" Dec 03 06:53:44 crc kubenswrapper[4475]: I1203 06:53:44.653642 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/d4f2e301-842d-4132-b9ad-181fbfbba4a6-ovnkube-script-lib\") pod \"ovnkube-node-qzvpv\" (UID: \"d4f2e301-842d-4132-b9ad-181fbfbba4a6\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-qzvpv" Dec 03 06:53:44 crc kubenswrapper[4475]: I1203 06:53:44.653656 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fn74h\" (UniqueName: \"kubernetes.io/projected/d4f2e301-842d-4132-b9ad-181fbfbba4a6-kube-api-access-fn74h\") pod \"ovnkube-node-qzvpv\" (UID: \"d4f2e301-842d-4132-b9ad-181fbfbba4a6\") " pod="openshift-ovn-kubernetes/ovnkube-node-qzvpv" Dec 03 06:53:44 crc kubenswrapper[4475]: I1203 06:53:44.653672 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/d4f2e301-842d-4132-b9ad-181fbfbba4a6-host-cni-netd\") pod \"ovnkube-node-qzvpv\" (UID: \"d4f2e301-842d-4132-b9ad-181fbfbba4a6\") " pod="openshift-ovn-kubernetes/ovnkube-node-qzvpv" Dec 03 06:53:44 crc kubenswrapper[4475]: I1203 06:53:44.653692 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/d4f2e301-842d-4132-b9ad-181fbfbba4a6-ovn-node-metrics-cert\") pod \"ovnkube-node-qzvpv\" (UID: \"d4f2e301-842d-4132-b9ad-181fbfbba4a6\") " pod="openshift-ovn-kubernetes/ovnkube-node-qzvpv" Dec 03 06:53:44 crc kubenswrapper[4475]: I1203 06:53:44.653713 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/d4f2e301-842d-4132-b9ad-181fbfbba4a6-host-cni-bin\") pod \"ovnkube-node-qzvpv\" (UID: \"d4f2e301-842d-4132-b9ad-181fbfbba4a6\") " pod="openshift-ovn-kubernetes/ovnkube-node-qzvpv" Dec 03 06:53:44 crc kubenswrapper[4475]: I1203 06:53:44.653725 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/d4f2e301-842d-4132-b9ad-181fbfbba4a6-env-overrides\") pod \"ovnkube-node-qzvpv\" (UID: \"d4f2e301-842d-4132-b9ad-181fbfbba4a6\") " pod="openshift-ovn-kubernetes/ovnkube-node-qzvpv" Dec 03 
06:53:44 crc kubenswrapper[4475]: I1203 06:53:44.653741 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/d4f2e301-842d-4132-b9ad-181fbfbba4a6-host-kubelet\") pod \"ovnkube-node-qzvpv\" (UID: \"d4f2e301-842d-4132-b9ad-181fbfbba4a6\") " pod="openshift-ovn-kubernetes/ovnkube-node-qzvpv" Dec 03 06:53:44 crc kubenswrapper[4475]: I1203 06:53:44.653761 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/d4f2e301-842d-4132-b9ad-181fbfbba4a6-host-run-ovn-kubernetes\") pod \"ovnkube-node-qzvpv\" (UID: \"d4f2e301-842d-4132-b9ad-181fbfbba4a6\") " pod="openshift-ovn-kubernetes/ovnkube-node-qzvpv" Dec 03 06:53:44 crc kubenswrapper[4475]: I1203 06:53:44.653774 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/d4f2e301-842d-4132-b9ad-181fbfbba4a6-run-systemd\") pod \"ovnkube-node-qzvpv\" (UID: \"d4f2e301-842d-4132-b9ad-181fbfbba4a6\") " pod="openshift-ovn-kubernetes/ovnkube-node-qzvpv" Dec 03 06:53:44 crc kubenswrapper[4475]: I1203 06:53:44.653799 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/d4f2e301-842d-4132-b9ad-181fbfbba4a6-log-socket\") pod \"ovnkube-node-qzvpv\" (UID: \"d4f2e301-842d-4132-b9ad-181fbfbba4a6\") " pod="openshift-ovn-kubernetes/ovnkube-node-qzvpv" Dec 03 06:53:44 crc kubenswrapper[4475]: I1203 06:53:44.653812 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/d4f2e301-842d-4132-b9ad-181fbfbba4a6-etc-openvswitch\") pod \"ovnkube-node-qzvpv\" (UID: \"d4f2e301-842d-4132-b9ad-181fbfbba4a6\") " pod="openshift-ovn-kubernetes/ovnkube-node-qzvpv" Dec 03 06:53:44 crc kubenswrapper[4475]: I1203 06:53:44.653826 4475 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/d4f2e301-842d-4132-b9ad-181fbfbba4a6-ovnkube-config\") pod \"ovnkube-node-qzvpv\" (UID: \"d4f2e301-842d-4132-b9ad-181fbfbba4a6\") " pod="openshift-ovn-kubernetes/ovnkube-node-qzvpv" Dec 03 06:53:44 crc kubenswrapper[4475]: I1203 06:53:44.653847 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/d4f2e301-842d-4132-b9ad-181fbfbba4a6-node-log\") pod \"ovnkube-node-qzvpv\" (UID: \"d4f2e301-842d-4132-b9ad-181fbfbba4a6\") " pod="openshift-ovn-kubernetes/ovnkube-node-qzvpv" Dec 03 06:53:44 crc kubenswrapper[4475]: I1203 06:53:44.653860 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/d4f2e301-842d-4132-b9ad-181fbfbba4a6-host-run-netns\") pod \"ovnkube-node-qzvpv\" (UID: \"d4f2e301-842d-4132-b9ad-181fbfbba4a6\") " pod="openshift-ovn-kubernetes/ovnkube-node-qzvpv" Dec 03 06:53:44 crc kubenswrapper[4475]: I1203 06:53:44.653914 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/d4f2e301-842d-4132-b9ad-181fbfbba4a6-host-run-netns\") pod \"ovnkube-node-qzvpv\" (UID: \"d4f2e301-842d-4132-b9ad-181fbfbba4a6\") " pod="openshift-ovn-kubernetes/ovnkube-node-qzvpv" Dec 03 06:53:44 crc kubenswrapper[4475]: I1203 06:53:44.653943 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/d4f2e301-842d-4132-b9ad-181fbfbba4a6-var-lib-openvswitch\") pod \"ovnkube-node-qzvpv\" (UID: \"d4f2e301-842d-4132-b9ad-181fbfbba4a6\") " pod="openshift-ovn-kubernetes/ovnkube-node-qzvpv" Dec 03 06:53:44 crc kubenswrapper[4475]: I1203 06:53:44.653960 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: 
\"kubernetes.io/host-path/d4f2e301-842d-4132-b9ad-181fbfbba4a6-host-slash\") pod \"ovnkube-node-qzvpv\" (UID: \"d4f2e301-842d-4132-b9ad-181fbfbba4a6\") " pod="openshift-ovn-kubernetes/ovnkube-node-qzvpv" Dec 03 06:53:44 crc kubenswrapper[4475]: I1203 06:53:44.653977 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/d4f2e301-842d-4132-b9ad-181fbfbba4a6-run-openvswitch\") pod \"ovnkube-node-qzvpv\" (UID: \"d4f2e301-842d-4132-b9ad-181fbfbba4a6\") " pod="openshift-ovn-kubernetes/ovnkube-node-qzvpv" Dec 03 06:53:44 crc kubenswrapper[4475]: I1203 06:53:44.653994 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/d4f2e301-842d-4132-b9ad-181fbfbba4a6-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-qzvpv\" (UID: \"d4f2e301-842d-4132-b9ad-181fbfbba4a6\") " pod="openshift-ovn-kubernetes/ovnkube-node-qzvpv" Dec 03 06:53:44 crc kubenswrapper[4475]: I1203 06:53:44.654012 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/d4f2e301-842d-4132-b9ad-181fbfbba4a6-run-ovn\") pod \"ovnkube-node-qzvpv\" (UID: \"d4f2e301-842d-4132-b9ad-181fbfbba4a6\") " pod="openshift-ovn-kubernetes/ovnkube-node-qzvpv" Dec 03 06:53:44 crc kubenswrapper[4475]: I1203 06:53:44.654029 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/d4f2e301-842d-4132-b9ad-181fbfbba4a6-systemd-units\") pod \"ovnkube-node-qzvpv\" (UID: \"d4f2e301-842d-4132-b9ad-181fbfbba4a6\") " pod="openshift-ovn-kubernetes/ovnkube-node-qzvpv" Dec 03 06:53:44 crc kubenswrapper[4475]: I1203 06:53:44.654287 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/d4f2e301-842d-4132-b9ad-181fbfbba4a6-host-kubelet\") pod 
\"ovnkube-node-qzvpv\" (UID: \"d4f2e301-842d-4132-b9ad-181fbfbba4a6\") " pod="openshift-ovn-kubernetes/ovnkube-node-qzvpv" Dec 03 06:53:44 crc kubenswrapper[4475]: I1203 06:53:44.654609 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/d4f2e301-842d-4132-b9ad-181fbfbba4a6-log-socket\") pod \"ovnkube-node-qzvpv\" (UID: \"d4f2e301-842d-4132-b9ad-181fbfbba4a6\") " pod="openshift-ovn-kubernetes/ovnkube-node-qzvpv" Dec 03 06:53:44 crc kubenswrapper[4475]: I1203 06:53:44.654663 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/d4f2e301-842d-4132-b9ad-181fbfbba4a6-host-run-ovn-kubernetes\") pod \"ovnkube-node-qzvpv\" (UID: \"d4f2e301-842d-4132-b9ad-181fbfbba4a6\") " pod="openshift-ovn-kubernetes/ovnkube-node-qzvpv" Dec 03 06:53:44 crc kubenswrapper[4475]: I1203 06:53:44.654688 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/d4f2e301-842d-4132-b9ad-181fbfbba4a6-run-systemd\") pod \"ovnkube-node-qzvpv\" (UID: \"d4f2e301-842d-4132-b9ad-181fbfbba4a6\") " pod="openshift-ovn-kubernetes/ovnkube-node-qzvpv" Dec 03 06:53:44 crc kubenswrapper[4475]: I1203 06:53:44.654707 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/d4f2e301-842d-4132-b9ad-181fbfbba4a6-etc-openvswitch\") pod \"ovnkube-node-qzvpv\" (UID: \"d4f2e301-842d-4132-b9ad-181fbfbba4a6\") " pod="openshift-ovn-kubernetes/ovnkube-node-qzvpv" Dec 03 06:53:44 crc kubenswrapper[4475]: I1203 06:53:44.654729 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/d4f2e301-842d-4132-b9ad-181fbfbba4a6-host-cni-bin\") pod \"ovnkube-node-qzvpv\" (UID: \"d4f2e301-842d-4132-b9ad-181fbfbba4a6\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-qzvpv" Dec 03 06:53:44 crc kubenswrapper[4475]: I1203 06:53:44.654748 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/d4f2e301-842d-4132-b9ad-181fbfbba4a6-node-log\") pod \"ovnkube-node-qzvpv\" (UID: \"d4f2e301-842d-4132-b9ad-181fbfbba4a6\") " pod="openshift-ovn-kubernetes/ovnkube-node-qzvpv" Dec 03 06:53:44 crc kubenswrapper[4475]: I1203 06:53:44.654984 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/d4f2e301-842d-4132-b9ad-181fbfbba4a6-host-cni-netd\") pod \"ovnkube-node-qzvpv\" (UID: \"d4f2e301-842d-4132-b9ad-181fbfbba4a6\") " pod="openshift-ovn-kubernetes/ovnkube-node-qzvpv" Dec 03 06:53:44 crc kubenswrapper[4475]: I1203 06:53:44.655024 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/d4f2e301-842d-4132-b9ad-181fbfbba4a6-ovnkube-script-lib\") pod \"ovnkube-node-qzvpv\" (UID: \"d4f2e301-842d-4132-b9ad-181fbfbba4a6\") " pod="openshift-ovn-kubernetes/ovnkube-node-qzvpv" Dec 03 06:53:44 crc kubenswrapper[4475]: I1203 06:53:44.655067 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/d4f2e301-842d-4132-b9ad-181fbfbba4a6-ovnkube-config\") pod \"ovnkube-node-qzvpv\" (UID: \"d4f2e301-842d-4132-b9ad-181fbfbba4a6\") " pod="openshift-ovn-kubernetes/ovnkube-node-qzvpv" Dec 03 06:53:44 crc kubenswrapper[4475]: I1203 06:53:44.655237 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/d4f2e301-842d-4132-b9ad-181fbfbba4a6-env-overrides\") pod \"ovnkube-node-qzvpv\" (UID: \"d4f2e301-842d-4132-b9ad-181fbfbba4a6\") " pod="openshift-ovn-kubernetes/ovnkube-node-qzvpv" Dec 03 06:53:44 crc kubenswrapper[4475]: I1203 06:53:44.656894 4475 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/d4f2e301-842d-4132-b9ad-181fbfbba4a6-ovn-node-metrics-cert\") pod \"ovnkube-node-qzvpv\" (UID: \"d4f2e301-842d-4132-b9ad-181fbfbba4a6\") " pod="openshift-ovn-kubernetes/ovnkube-node-qzvpv" Dec 03 06:53:44 crc kubenswrapper[4475]: I1203 06:53:44.666566 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fn74h\" (UniqueName: \"kubernetes.io/projected/d4f2e301-842d-4132-b9ad-181fbfbba4a6-kube-api-access-fn74h\") pod \"ovnkube-node-qzvpv\" (UID: \"d4f2e301-842d-4132-b9ad-181fbfbba4a6\") " pod="openshift-ovn-kubernetes/ovnkube-node-qzvpv" Dec 03 06:53:44 crc kubenswrapper[4475]: I1203 06:53:44.691389 4475 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-qzvpv" Dec 03 06:53:45 crc kubenswrapper[4475]: I1203 06:53:45.222584 4475 generic.go:334] "Generic (PLEG): container finished" podID="d4f2e301-842d-4132-b9ad-181fbfbba4a6" containerID="f132ff79fcdb1f991102dd4f5b1abeb964daab06b955e8f492bb1efaf2130520" exitCode=0 Dec 03 06:53:45 crc kubenswrapper[4475]: I1203 06:53:45.222647 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-qzvpv" event={"ID":"d4f2e301-842d-4132-b9ad-181fbfbba4a6","Type":"ContainerDied","Data":"f132ff79fcdb1f991102dd4f5b1abeb964daab06b955e8f492bb1efaf2130520"} Dec 03 06:53:45 crc kubenswrapper[4475]: I1203 06:53:45.222898 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-qzvpv" event={"ID":"d4f2e301-842d-4132-b9ad-181fbfbba4a6","Type":"ContainerStarted","Data":"f0a413c3d785c07ed5b705ad4e9246914d5e76aadee1ad71959e7888c318699a"} Dec 03 06:53:45 crc kubenswrapper[4475]: I1203 06:53:45.224210 4475 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-g9t4l_8f42839e-dbc4-445a-a15b-c3aa14813958/ovnkube-controller/3.log" Dec 03 06:53:45 crc kubenswrapper[4475]: I1203 06:53:45.226161 4475 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-g9t4l_8f42839e-dbc4-445a-a15b-c3aa14813958/ovn-acl-logging/0.log" Dec 03 06:53:45 crc kubenswrapper[4475]: I1203 06:53:45.226587 4475 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-g9t4l_8f42839e-dbc4-445a-a15b-c3aa14813958/ovn-controller/0.log" Dec 03 06:53:45 crc kubenswrapper[4475]: I1203 06:53:45.226899 4475 generic.go:334] "Generic (PLEG): container finished" podID="8f42839e-dbc4-445a-a15b-c3aa14813958" containerID="a6d3e6ea4e349dee8dc1ae7d0814640bae610b259e41099993afa21cb3b1aa88" exitCode=0 Dec 03 06:53:45 crc kubenswrapper[4475]: I1203 06:53:45.226919 4475 generic.go:334] "Generic (PLEG): container finished" podID="8f42839e-dbc4-445a-a15b-c3aa14813958" containerID="66a9c7568957099255bc910496da695e2af0122f2c853c3e221c666d7c2dee78" exitCode=0 Dec 03 06:53:45 crc kubenswrapper[4475]: I1203 06:53:45.226929 4475 generic.go:334] "Generic (PLEG): container finished" podID="8f42839e-dbc4-445a-a15b-c3aa14813958" containerID="60d3ec7cab1f249e81ae1db9ab97fa02e8b3c9d8376af4c6682dc3fc6f9d6d92" exitCode=0 Dec 03 06:53:45 crc kubenswrapper[4475]: I1203 06:53:45.226939 4475 generic.go:334] "Generic (PLEG): container finished" podID="8f42839e-dbc4-445a-a15b-c3aa14813958" containerID="b3243c863a4fb593b39fc3e3b835f647e9373d8b2dec69c5ff7657ed73c8f78a" exitCode=0 Dec 03 06:53:45 crc kubenswrapper[4475]: I1203 06:53:45.226946 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-g9t4l" event={"ID":"8f42839e-dbc4-445a-a15b-c3aa14813958","Type":"ContainerDied","Data":"a6d3e6ea4e349dee8dc1ae7d0814640bae610b259e41099993afa21cb3b1aa88"} Dec 03 06:53:45 crc kubenswrapper[4475]: I1203 06:53:45.226978 4475 
util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-g9t4l" Dec 03 06:53:45 crc kubenswrapper[4475]: I1203 06:53:45.226985 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-g9t4l" event={"ID":"8f42839e-dbc4-445a-a15b-c3aa14813958","Type":"ContainerDied","Data":"66a9c7568957099255bc910496da695e2af0122f2c853c3e221c666d7c2dee78"} Dec 03 06:53:45 crc kubenswrapper[4475]: I1203 06:53:45.226997 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-g9t4l" event={"ID":"8f42839e-dbc4-445a-a15b-c3aa14813958","Type":"ContainerDied","Data":"60d3ec7cab1f249e81ae1db9ab97fa02e8b3c9d8376af4c6682dc3fc6f9d6d92"} Dec 03 06:53:45 crc kubenswrapper[4475]: I1203 06:53:45.227007 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-g9t4l" event={"ID":"8f42839e-dbc4-445a-a15b-c3aa14813958","Type":"ContainerDied","Data":"b3243c863a4fb593b39fc3e3b835f647e9373d8b2dec69c5ff7657ed73c8f78a"} Dec 03 06:53:45 crc kubenswrapper[4475]: I1203 06:53:45.227015 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-g9t4l" event={"ID":"8f42839e-dbc4-445a-a15b-c3aa14813958","Type":"ContainerDied","Data":"b4b141100ea052faea009e86b4836d44db60d742453d01254879de450e50a718"} Dec 03 06:53:45 crc kubenswrapper[4475]: I1203 06:53:45.227029 4475 scope.go:117] "RemoveContainer" containerID="a6d3e6ea4e349dee8dc1ae7d0814640bae610b259e41099993afa21cb3b1aa88" Dec 03 06:53:45 crc kubenswrapper[4475]: I1203 06:53:45.228359 4475 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-9b2j8_f3a17c67-95e0-4889-8a30-64c08b6720f4/kube-multus/2.log" Dec 03 06:53:45 crc kubenswrapper[4475]: I1203 06:53:45.264050 4475 scope.go:117] "RemoveContainer" containerID="c3d53d023db886d8a8772c0790104577a7a8914b8cf882b251e44407064c3141" Dec 03 06:53:45 crc 
kubenswrapper[4475]: I1203 06:53:45.282471 4475 scope.go:117] "RemoveContainer" containerID="66a9c7568957099255bc910496da695e2af0122f2c853c3e221c666d7c2dee78" Dec 03 06:53:45 crc kubenswrapper[4475]: I1203 06:53:45.298210 4475 scope.go:117] "RemoveContainer" containerID="60d3ec7cab1f249e81ae1db9ab97fa02e8b3c9d8376af4c6682dc3fc6f9d6d92" Dec 03 06:53:45 crc kubenswrapper[4475]: I1203 06:53:45.303068 4475 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-g9t4l"] Dec 03 06:53:45 crc kubenswrapper[4475]: I1203 06:53:45.307164 4475 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-g9t4l"] Dec 03 06:53:45 crc kubenswrapper[4475]: I1203 06:53:45.326434 4475 scope.go:117] "RemoveContainer" containerID="b3243c863a4fb593b39fc3e3b835f647e9373d8b2dec69c5ff7657ed73c8f78a" Dec 03 06:53:45 crc kubenswrapper[4475]: I1203 06:53:45.340635 4475 scope.go:117] "RemoveContainer" containerID="a5090474cca8b8e2ed539ea74377506638d300be7eb750b3f3285477d8c9a375" Dec 03 06:53:45 crc kubenswrapper[4475]: I1203 06:53:45.353406 4475 scope.go:117] "RemoveContainer" containerID="53948489397bbbfdf5f766211088d7f12fcd2dfbc8c3da6493e5abc49e3b41f5" Dec 03 06:53:45 crc kubenswrapper[4475]: I1203 06:53:45.370666 4475 scope.go:117] "RemoveContainer" containerID="32897756f3658fda95db77180a0553a9d8656ed49c3ae5a017d32f5c5133a5a9" Dec 03 06:53:45 crc kubenswrapper[4475]: I1203 06:53:45.391447 4475 scope.go:117] "RemoveContainer" containerID="5e288f95676d5823cd3cb005318489d2f629a8fb74ad17ce6a67978d76006192" Dec 03 06:53:45 crc kubenswrapper[4475]: I1203 06:53:45.404075 4475 scope.go:117] "RemoveContainer" containerID="400610ebcdc7d47ecc1345287847a1909871411a12cdb3cbf895e05039b81c2b" Dec 03 06:53:45 crc kubenswrapper[4475]: I1203 06:53:45.434021 4475 scope.go:117] "RemoveContainer" containerID="a6d3e6ea4e349dee8dc1ae7d0814640bae610b259e41099993afa21cb3b1aa88" Dec 03 06:53:45 crc kubenswrapper[4475]: E1203 06:53:45.434274 4475 
log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a6d3e6ea4e349dee8dc1ae7d0814640bae610b259e41099993afa21cb3b1aa88\": container with ID starting with a6d3e6ea4e349dee8dc1ae7d0814640bae610b259e41099993afa21cb3b1aa88 not found: ID does not exist" containerID="a6d3e6ea4e349dee8dc1ae7d0814640bae610b259e41099993afa21cb3b1aa88" Dec 03 06:53:45 crc kubenswrapper[4475]: I1203 06:53:45.434301 4475 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a6d3e6ea4e349dee8dc1ae7d0814640bae610b259e41099993afa21cb3b1aa88"} err="failed to get container status \"a6d3e6ea4e349dee8dc1ae7d0814640bae610b259e41099993afa21cb3b1aa88\": rpc error: code = NotFound desc = could not find container \"a6d3e6ea4e349dee8dc1ae7d0814640bae610b259e41099993afa21cb3b1aa88\": container with ID starting with a6d3e6ea4e349dee8dc1ae7d0814640bae610b259e41099993afa21cb3b1aa88 not found: ID does not exist" Dec 03 06:53:45 crc kubenswrapper[4475]: I1203 06:53:45.434320 4475 scope.go:117] "RemoveContainer" containerID="c3d53d023db886d8a8772c0790104577a7a8914b8cf882b251e44407064c3141" Dec 03 06:53:45 crc kubenswrapper[4475]: E1203 06:53:45.434697 4475 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c3d53d023db886d8a8772c0790104577a7a8914b8cf882b251e44407064c3141\": container with ID starting with c3d53d023db886d8a8772c0790104577a7a8914b8cf882b251e44407064c3141 not found: ID does not exist" containerID="c3d53d023db886d8a8772c0790104577a7a8914b8cf882b251e44407064c3141" Dec 03 06:53:45 crc kubenswrapper[4475]: I1203 06:53:45.434730 4475 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c3d53d023db886d8a8772c0790104577a7a8914b8cf882b251e44407064c3141"} err="failed to get container status \"c3d53d023db886d8a8772c0790104577a7a8914b8cf882b251e44407064c3141\": rpc error: code = NotFound 
desc = could not find container \"c3d53d023db886d8a8772c0790104577a7a8914b8cf882b251e44407064c3141\": container with ID starting with c3d53d023db886d8a8772c0790104577a7a8914b8cf882b251e44407064c3141 not found: ID does not exist" Dec 03 06:53:45 crc kubenswrapper[4475]: I1203 06:53:45.434751 4475 scope.go:117] "RemoveContainer" containerID="66a9c7568957099255bc910496da695e2af0122f2c853c3e221c666d7c2dee78" Dec 03 06:53:45 crc kubenswrapper[4475]: E1203 06:53:45.434963 4475 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"66a9c7568957099255bc910496da695e2af0122f2c853c3e221c666d7c2dee78\": container with ID starting with 66a9c7568957099255bc910496da695e2af0122f2c853c3e221c666d7c2dee78 not found: ID does not exist" containerID="66a9c7568957099255bc910496da695e2af0122f2c853c3e221c666d7c2dee78" Dec 03 06:53:45 crc kubenswrapper[4475]: I1203 06:53:45.434986 4475 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"66a9c7568957099255bc910496da695e2af0122f2c853c3e221c666d7c2dee78"} err="failed to get container status \"66a9c7568957099255bc910496da695e2af0122f2c853c3e221c666d7c2dee78\": rpc error: code = NotFound desc = could not find container \"66a9c7568957099255bc910496da695e2af0122f2c853c3e221c666d7c2dee78\": container with ID starting with 66a9c7568957099255bc910496da695e2af0122f2c853c3e221c666d7c2dee78 not found: ID does not exist" Dec 03 06:53:45 crc kubenswrapper[4475]: I1203 06:53:45.435001 4475 scope.go:117] "RemoveContainer" containerID="60d3ec7cab1f249e81ae1db9ab97fa02e8b3c9d8376af4c6682dc3fc6f9d6d92" Dec 03 06:53:45 crc kubenswrapper[4475]: E1203 06:53:45.435309 4475 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"60d3ec7cab1f249e81ae1db9ab97fa02e8b3c9d8376af4c6682dc3fc6f9d6d92\": container with ID starting with 
60d3ec7cab1f249e81ae1db9ab97fa02e8b3c9d8376af4c6682dc3fc6f9d6d92 not found: ID does not exist" containerID="60d3ec7cab1f249e81ae1db9ab97fa02e8b3c9d8376af4c6682dc3fc6f9d6d92" Dec 03 06:53:45 crc kubenswrapper[4475]: I1203 06:53:45.435335 4475 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"60d3ec7cab1f249e81ae1db9ab97fa02e8b3c9d8376af4c6682dc3fc6f9d6d92"} err="failed to get container status \"60d3ec7cab1f249e81ae1db9ab97fa02e8b3c9d8376af4c6682dc3fc6f9d6d92\": rpc error: code = NotFound desc = could not find container \"60d3ec7cab1f249e81ae1db9ab97fa02e8b3c9d8376af4c6682dc3fc6f9d6d92\": container with ID starting with 60d3ec7cab1f249e81ae1db9ab97fa02e8b3c9d8376af4c6682dc3fc6f9d6d92 not found: ID does not exist" Dec 03 06:53:45 crc kubenswrapper[4475]: I1203 06:53:45.435350 4475 scope.go:117] "RemoveContainer" containerID="b3243c863a4fb593b39fc3e3b835f647e9373d8b2dec69c5ff7657ed73c8f78a" Dec 03 06:53:45 crc kubenswrapper[4475]: E1203 06:53:45.435894 4475 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b3243c863a4fb593b39fc3e3b835f647e9373d8b2dec69c5ff7657ed73c8f78a\": container with ID starting with b3243c863a4fb593b39fc3e3b835f647e9373d8b2dec69c5ff7657ed73c8f78a not found: ID does not exist" containerID="b3243c863a4fb593b39fc3e3b835f647e9373d8b2dec69c5ff7657ed73c8f78a" Dec 03 06:53:45 crc kubenswrapper[4475]: I1203 06:53:45.435917 4475 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b3243c863a4fb593b39fc3e3b835f647e9373d8b2dec69c5ff7657ed73c8f78a"} err="failed to get container status \"b3243c863a4fb593b39fc3e3b835f647e9373d8b2dec69c5ff7657ed73c8f78a\": rpc error: code = NotFound desc = could not find container \"b3243c863a4fb593b39fc3e3b835f647e9373d8b2dec69c5ff7657ed73c8f78a\": container with ID starting with b3243c863a4fb593b39fc3e3b835f647e9373d8b2dec69c5ff7657ed73c8f78a not found: ID does not 
exist" Dec 03 06:53:45 crc kubenswrapper[4475]: I1203 06:53:45.435931 4475 scope.go:117] "RemoveContainer" containerID="a5090474cca8b8e2ed539ea74377506638d300be7eb750b3f3285477d8c9a375" Dec 03 06:53:45 crc kubenswrapper[4475]: E1203 06:53:45.436218 4475 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a5090474cca8b8e2ed539ea74377506638d300be7eb750b3f3285477d8c9a375\": container with ID starting with a5090474cca8b8e2ed539ea74377506638d300be7eb750b3f3285477d8c9a375 not found: ID does not exist" containerID="a5090474cca8b8e2ed539ea74377506638d300be7eb750b3f3285477d8c9a375" Dec 03 06:53:45 crc kubenswrapper[4475]: I1203 06:53:45.436237 4475 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a5090474cca8b8e2ed539ea74377506638d300be7eb750b3f3285477d8c9a375"} err="failed to get container status \"a5090474cca8b8e2ed539ea74377506638d300be7eb750b3f3285477d8c9a375\": rpc error: code = NotFound desc = could not find container \"a5090474cca8b8e2ed539ea74377506638d300be7eb750b3f3285477d8c9a375\": container with ID starting with a5090474cca8b8e2ed539ea74377506638d300be7eb750b3f3285477d8c9a375 not found: ID does not exist" Dec 03 06:53:45 crc kubenswrapper[4475]: I1203 06:53:45.436249 4475 scope.go:117] "RemoveContainer" containerID="53948489397bbbfdf5f766211088d7f12fcd2dfbc8c3da6493e5abc49e3b41f5" Dec 03 06:53:45 crc kubenswrapper[4475]: E1203 06:53:45.437459 4475 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"53948489397bbbfdf5f766211088d7f12fcd2dfbc8c3da6493e5abc49e3b41f5\": container with ID starting with 53948489397bbbfdf5f766211088d7f12fcd2dfbc8c3da6493e5abc49e3b41f5 not found: ID does not exist" containerID="53948489397bbbfdf5f766211088d7f12fcd2dfbc8c3da6493e5abc49e3b41f5" Dec 03 06:53:45 crc kubenswrapper[4475]: I1203 06:53:45.437500 4475 pod_container_deletor.go:53] 
"DeleteContainer returned error" containerID={"Type":"cri-o","ID":"53948489397bbbfdf5f766211088d7f12fcd2dfbc8c3da6493e5abc49e3b41f5"} err="failed to get container status \"53948489397bbbfdf5f766211088d7f12fcd2dfbc8c3da6493e5abc49e3b41f5\": rpc error: code = NotFound desc = could not find container \"53948489397bbbfdf5f766211088d7f12fcd2dfbc8c3da6493e5abc49e3b41f5\": container with ID starting with 53948489397bbbfdf5f766211088d7f12fcd2dfbc8c3da6493e5abc49e3b41f5 not found: ID does not exist" Dec 03 06:53:45 crc kubenswrapper[4475]: I1203 06:53:45.437525 4475 scope.go:117] "RemoveContainer" containerID="32897756f3658fda95db77180a0553a9d8656ed49c3ae5a017d32f5c5133a5a9" Dec 03 06:53:45 crc kubenswrapper[4475]: E1203 06:53:45.437761 4475 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"32897756f3658fda95db77180a0553a9d8656ed49c3ae5a017d32f5c5133a5a9\": container with ID starting with 32897756f3658fda95db77180a0553a9d8656ed49c3ae5a017d32f5c5133a5a9 not found: ID does not exist" containerID="32897756f3658fda95db77180a0553a9d8656ed49c3ae5a017d32f5c5133a5a9" Dec 03 06:53:45 crc kubenswrapper[4475]: I1203 06:53:45.437789 4475 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"32897756f3658fda95db77180a0553a9d8656ed49c3ae5a017d32f5c5133a5a9"} err="failed to get container status \"32897756f3658fda95db77180a0553a9d8656ed49c3ae5a017d32f5c5133a5a9\": rpc error: code = NotFound desc = could not find container \"32897756f3658fda95db77180a0553a9d8656ed49c3ae5a017d32f5c5133a5a9\": container with ID starting with 32897756f3658fda95db77180a0553a9d8656ed49c3ae5a017d32f5c5133a5a9 not found: ID does not exist" Dec 03 06:53:45 crc kubenswrapper[4475]: I1203 06:53:45.437803 4475 scope.go:117] "RemoveContainer" containerID="5e288f95676d5823cd3cb005318489d2f629a8fb74ad17ce6a67978d76006192" Dec 03 06:53:45 crc kubenswrapper[4475]: E1203 06:53:45.437977 4475 log.go:32] 
"ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5e288f95676d5823cd3cb005318489d2f629a8fb74ad17ce6a67978d76006192\": container with ID starting with 5e288f95676d5823cd3cb005318489d2f629a8fb74ad17ce6a67978d76006192 not found: ID does not exist" containerID="5e288f95676d5823cd3cb005318489d2f629a8fb74ad17ce6a67978d76006192" Dec 03 06:53:45 crc kubenswrapper[4475]: I1203 06:53:45.437990 4475 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5e288f95676d5823cd3cb005318489d2f629a8fb74ad17ce6a67978d76006192"} err="failed to get container status \"5e288f95676d5823cd3cb005318489d2f629a8fb74ad17ce6a67978d76006192\": rpc error: code = NotFound desc = could not find container \"5e288f95676d5823cd3cb005318489d2f629a8fb74ad17ce6a67978d76006192\": container with ID starting with 5e288f95676d5823cd3cb005318489d2f629a8fb74ad17ce6a67978d76006192 not found: ID does not exist" Dec 03 06:53:45 crc kubenswrapper[4475]: I1203 06:53:45.438001 4475 scope.go:117] "RemoveContainer" containerID="400610ebcdc7d47ecc1345287847a1909871411a12cdb3cbf895e05039b81c2b" Dec 03 06:53:45 crc kubenswrapper[4475]: E1203 06:53:45.438149 4475 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"400610ebcdc7d47ecc1345287847a1909871411a12cdb3cbf895e05039b81c2b\": container with ID starting with 400610ebcdc7d47ecc1345287847a1909871411a12cdb3cbf895e05039b81c2b not found: ID does not exist" containerID="400610ebcdc7d47ecc1345287847a1909871411a12cdb3cbf895e05039b81c2b" Dec 03 06:53:45 crc kubenswrapper[4475]: I1203 06:53:45.438163 4475 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"400610ebcdc7d47ecc1345287847a1909871411a12cdb3cbf895e05039b81c2b"} err="failed to get container status \"400610ebcdc7d47ecc1345287847a1909871411a12cdb3cbf895e05039b81c2b\": rpc error: code = NotFound desc = could 
not find container \"400610ebcdc7d47ecc1345287847a1909871411a12cdb3cbf895e05039b81c2b\": container with ID starting with 400610ebcdc7d47ecc1345287847a1909871411a12cdb3cbf895e05039b81c2b not found: ID does not exist" Dec 03 06:53:45 crc kubenswrapper[4475]: I1203 06:53:45.438175 4475 scope.go:117] "RemoveContainer" containerID="a6d3e6ea4e349dee8dc1ae7d0814640bae610b259e41099993afa21cb3b1aa88" Dec 03 06:53:45 crc kubenswrapper[4475]: I1203 06:53:45.438329 4475 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a6d3e6ea4e349dee8dc1ae7d0814640bae610b259e41099993afa21cb3b1aa88"} err="failed to get container status \"a6d3e6ea4e349dee8dc1ae7d0814640bae610b259e41099993afa21cb3b1aa88\": rpc error: code = NotFound desc = could not find container \"a6d3e6ea4e349dee8dc1ae7d0814640bae610b259e41099993afa21cb3b1aa88\": container with ID starting with a6d3e6ea4e349dee8dc1ae7d0814640bae610b259e41099993afa21cb3b1aa88 not found: ID does not exist" Dec 03 06:53:45 crc kubenswrapper[4475]: I1203 06:53:45.438342 4475 scope.go:117] "RemoveContainer" containerID="c3d53d023db886d8a8772c0790104577a7a8914b8cf882b251e44407064c3141" Dec 03 06:53:45 crc kubenswrapper[4475]: I1203 06:53:45.439768 4475 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c3d53d023db886d8a8772c0790104577a7a8914b8cf882b251e44407064c3141"} err="failed to get container status \"c3d53d023db886d8a8772c0790104577a7a8914b8cf882b251e44407064c3141\": rpc error: code = NotFound desc = could not find container \"c3d53d023db886d8a8772c0790104577a7a8914b8cf882b251e44407064c3141\": container with ID starting with c3d53d023db886d8a8772c0790104577a7a8914b8cf882b251e44407064c3141 not found: ID does not exist" Dec 03 06:53:45 crc kubenswrapper[4475]: I1203 06:53:45.439802 4475 scope.go:117] "RemoveContainer" containerID="66a9c7568957099255bc910496da695e2af0122f2c853c3e221c666d7c2dee78" Dec 03 06:53:45 crc kubenswrapper[4475]: I1203 
06:53:45.440004 4475 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"66a9c7568957099255bc910496da695e2af0122f2c853c3e221c666d7c2dee78"} err="failed to get container status \"66a9c7568957099255bc910496da695e2af0122f2c853c3e221c666d7c2dee78\": rpc error: code = NotFound desc = could not find container \"66a9c7568957099255bc910496da695e2af0122f2c853c3e221c666d7c2dee78\": container with ID starting with 66a9c7568957099255bc910496da695e2af0122f2c853c3e221c666d7c2dee78 not found: ID does not exist" Dec 03 06:53:45 crc kubenswrapper[4475]: I1203 06:53:45.440024 4475 scope.go:117] "RemoveContainer" containerID="60d3ec7cab1f249e81ae1db9ab97fa02e8b3c9d8376af4c6682dc3fc6f9d6d92" Dec 03 06:53:45 crc kubenswrapper[4475]: I1203 06:53:45.440247 4475 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"60d3ec7cab1f249e81ae1db9ab97fa02e8b3c9d8376af4c6682dc3fc6f9d6d92"} err="failed to get container status \"60d3ec7cab1f249e81ae1db9ab97fa02e8b3c9d8376af4c6682dc3fc6f9d6d92\": rpc error: code = NotFound desc = could not find container \"60d3ec7cab1f249e81ae1db9ab97fa02e8b3c9d8376af4c6682dc3fc6f9d6d92\": container with ID starting with 60d3ec7cab1f249e81ae1db9ab97fa02e8b3c9d8376af4c6682dc3fc6f9d6d92 not found: ID does not exist" Dec 03 06:53:45 crc kubenswrapper[4475]: I1203 06:53:45.440279 4475 scope.go:117] "RemoveContainer" containerID="b3243c863a4fb593b39fc3e3b835f647e9373d8b2dec69c5ff7657ed73c8f78a" Dec 03 06:53:45 crc kubenswrapper[4475]: I1203 06:53:45.440963 4475 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b3243c863a4fb593b39fc3e3b835f647e9373d8b2dec69c5ff7657ed73c8f78a"} err="failed to get container status \"b3243c863a4fb593b39fc3e3b835f647e9373d8b2dec69c5ff7657ed73c8f78a\": rpc error: code = NotFound desc = could not find container \"b3243c863a4fb593b39fc3e3b835f647e9373d8b2dec69c5ff7657ed73c8f78a\": container with ID starting with 
b3243c863a4fb593b39fc3e3b835f647e9373d8b2dec69c5ff7657ed73c8f78a not found: ID does not exist" Dec 03 06:53:45 crc kubenswrapper[4475]: I1203 06:53:45.440981 4475 scope.go:117] "RemoveContainer" containerID="a5090474cca8b8e2ed539ea74377506638d300be7eb750b3f3285477d8c9a375" Dec 03 06:53:45 crc kubenswrapper[4475]: I1203 06:53:45.441387 4475 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a5090474cca8b8e2ed539ea74377506638d300be7eb750b3f3285477d8c9a375"} err="failed to get container status \"a5090474cca8b8e2ed539ea74377506638d300be7eb750b3f3285477d8c9a375\": rpc error: code = NotFound desc = could not find container \"a5090474cca8b8e2ed539ea74377506638d300be7eb750b3f3285477d8c9a375\": container with ID starting with a5090474cca8b8e2ed539ea74377506638d300be7eb750b3f3285477d8c9a375 not found: ID does not exist" Dec 03 06:53:45 crc kubenswrapper[4475]: I1203 06:53:45.441431 4475 scope.go:117] "RemoveContainer" containerID="53948489397bbbfdf5f766211088d7f12fcd2dfbc8c3da6493e5abc49e3b41f5" Dec 03 06:53:45 crc kubenswrapper[4475]: I1203 06:53:45.441677 4475 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"53948489397bbbfdf5f766211088d7f12fcd2dfbc8c3da6493e5abc49e3b41f5"} err="failed to get container status \"53948489397bbbfdf5f766211088d7f12fcd2dfbc8c3da6493e5abc49e3b41f5\": rpc error: code = NotFound desc = could not find container \"53948489397bbbfdf5f766211088d7f12fcd2dfbc8c3da6493e5abc49e3b41f5\": container with ID starting with 53948489397bbbfdf5f766211088d7f12fcd2dfbc8c3da6493e5abc49e3b41f5 not found: ID does not exist" Dec 03 06:53:45 crc kubenswrapper[4475]: I1203 06:53:45.441694 4475 scope.go:117] "RemoveContainer" containerID="32897756f3658fda95db77180a0553a9d8656ed49c3ae5a017d32f5c5133a5a9" Dec 03 06:53:45 crc kubenswrapper[4475]: I1203 06:53:45.441915 4475 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"32897756f3658fda95db77180a0553a9d8656ed49c3ae5a017d32f5c5133a5a9"} err="failed to get container status \"32897756f3658fda95db77180a0553a9d8656ed49c3ae5a017d32f5c5133a5a9\": rpc error: code = NotFound desc = could not find container \"32897756f3658fda95db77180a0553a9d8656ed49c3ae5a017d32f5c5133a5a9\": container with ID starting with 32897756f3658fda95db77180a0553a9d8656ed49c3ae5a017d32f5c5133a5a9 not found: ID does not exist" Dec 03 06:53:45 crc kubenswrapper[4475]: I1203 06:53:45.441936 4475 scope.go:117] "RemoveContainer" containerID="5e288f95676d5823cd3cb005318489d2f629a8fb74ad17ce6a67978d76006192" Dec 03 06:53:45 crc kubenswrapper[4475]: I1203 06:53:45.442164 4475 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5e288f95676d5823cd3cb005318489d2f629a8fb74ad17ce6a67978d76006192"} err="failed to get container status \"5e288f95676d5823cd3cb005318489d2f629a8fb74ad17ce6a67978d76006192\": rpc error: code = NotFound desc = could not find container \"5e288f95676d5823cd3cb005318489d2f629a8fb74ad17ce6a67978d76006192\": container with ID starting with 5e288f95676d5823cd3cb005318489d2f629a8fb74ad17ce6a67978d76006192 not found: ID does not exist" Dec 03 06:53:45 crc kubenswrapper[4475]: I1203 06:53:45.442181 4475 scope.go:117] "RemoveContainer" containerID="400610ebcdc7d47ecc1345287847a1909871411a12cdb3cbf895e05039b81c2b" Dec 03 06:53:45 crc kubenswrapper[4475]: I1203 06:53:45.442389 4475 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"400610ebcdc7d47ecc1345287847a1909871411a12cdb3cbf895e05039b81c2b"} err="failed to get container status \"400610ebcdc7d47ecc1345287847a1909871411a12cdb3cbf895e05039b81c2b\": rpc error: code = NotFound desc = could not find container \"400610ebcdc7d47ecc1345287847a1909871411a12cdb3cbf895e05039b81c2b\": container with ID starting with 400610ebcdc7d47ecc1345287847a1909871411a12cdb3cbf895e05039b81c2b not found: ID does not 
exist" Dec 03 06:53:45 crc kubenswrapper[4475]: I1203 06:53:45.442412 4475 scope.go:117] "RemoveContainer" containerID="a6d3e6ea4e349dee8dc1ae7d0814640bae610b259e41099993afa21cb3b1aa88" Dec 03 06:53:45 crc kubenswrapper[4475]: I1203 06:53:45.442625 4475 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a6d3e6ea4e349dee8dc1ae7d0814640bae610b259e41099993afa21cb3b1aa88"} err="failed to get container status \"a6d3e6ea4e349dee8dc1ae7d0814640bae610b259e41099993afa21cb3b1aa88\": rpc error: code = NotFound desc = could not find container \"a6d3e6ea4e349dee8dc1ae7d0814640bae610b259e41099993afa21cb3b1aa88\": container with ID starting with a6d3e6ea4e349dee8dc1ae7d0814640bae610b259e41099993afa21cb3b1aa88 not found: ID does not exist" Dec 03 06:53:45 crc kubenswrapper[4475]: I1203 06:53:45.442648 4475 scope.go:117] "RemoveContainer" containerID="c3d53d023db886d8a8772c0790104577a7a8914b8cf882b251e44407064c3141" Dec 03 06:53:45 crc kubenswrapper[4475]: I1203 06:53:45.442873 4475 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c3d53d023db886d8a8772c0790104577a7a8914b8cf882b251e44407064c3141"} err="failed to get container status \"c3d53d023db886d8a8772c0790104577a7a8914b8cf882b251e44407064c3141\": rpc error: code = NotFound desc = could not find container \"c3d53d023db886d8a8772c0790104577a7a8914b8cf882b251e44407064c3141\": container with ID starting with c3d53d023db886d8a8772c0790104577a7a8914b8cf882b251e44407064c3141 not found: ID does not exist" Dec 03 06:53:45 crc kubenswrapper[4475]: I1203 06:53:45.442894 4475 scope.go:117] "RemoveContainer" containerID="66a9c7568957099255bc910496da695e2af0122f2c853c3e221c666d7c2dee78" Dec 03 06:53:45 crc kubenswrapper[4475]: I1203 06:53:45.443208 4475 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"66a9c7568957099255bc910496da695e2af0122f2c853c3e221c666d7c2dee78"} err="failed to get container status 
\"66a9c7568957099255bc910496da695e2af0122f2c853c3e221c666d7c2dee78\": rpc error: code = NotFound desc = could not find container \"66a9c7568957099255bc910496da695e2af0122f2c853c3e221c666d7c2dee78\": container with ID starting with 66a9c7568957099255bc910496da695e2af0122f2c853c3e221c666d7c2dee78 not found: ID does not exist" Dec 03 06:53:45 crc kubenswrapper[4475]: I1203 06:53:45.443228 4475 scope.go:117] "RemoveContainer" containerID="60d3ec7cab1f249e81ae1db9ab97fa02e8b3c9d8376af4c6682dc3fc6f9d6d92" Dec 03 06:53:45 crc kubenswrapper[4475]: I1203 06:53:45.443386 4475 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"60d3ec7cab1f249e81ae1db9ab97fa02e8b3c9d8376af4c6682dc3fc6f9d6d92"} err="failed to get container status \"60d3ec7cab1f249e81ae1db9ab97fa02e8b3c9d8376af4c6682dc3fc6f9d6d92\": rpc error: code = NotFound desc = could not find container \"60d3ec7cab1f249e81ae1db9ab97fa02e8b3c9d8376af4c6682dc3fc6f9d6d92\": container with ID starting with 60d3ec7cab1f249e81ae1db9ab97fa02e8b3c9d8376af4c6682dc3fc6f9d6d92 not found: ID does not exist" Dec 03 06:53:45 crc kubenswrapper[4475]: I1203 06:53:45.443405 4475 scope.go:117] "RemoveContainer" containerID="b3243c863a4fb593b39fc3e3b835f647e9373d8b2dec69c5ff7657ed73c8f78a" Dec 03 06:53:45 crc kubenswrapper[4475]: I1203 06:53:45.443580 4475 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b3243c863a4fb593b39fc3e3b835f647e9373d8b2dec69c5ff7657ed73c8f78a"} err="failed to get container status \"b3243c863a4fb593b39fc3e3b835f647e9373d8b2dec69c5ff7657ed73c8f78a\": rpc error: code = NotFound desc = could not find container \"b3243c863a4fb593b39fc3e3b835f647e9373d8b2dec69c5ff7657ed73c8f78a\": container with ID starting with b3243c863a4fb593b39fc3e3b835f647e9373d8b2dec69c5ff7657ed73c8f78a not found: ID does not exist" Dec 03 06:53:45 crc kubenswrapper[4475]: I1203 06:53:45.443597 4475 scope.go:117] "RemoveContainer" 
containerID="a5090474cca8b8e2ed539ea74377506638d300be7eb750b3f3285477d8c9a375" Dec 03 06:53:45 crc kubenswrapper[4475]: I1203 06:53:45.443840 4475 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a5090474cca8b8e2ed539ea74377506638d300be7eb750b3f3285477d8c9a375"} err="failed to get container status \"a5090474cca8b8e2ed539ea74377506638d300be7eb750b3f3285477d8c9a375\": rpc error: code = NotFound desc = could not find container \"a5090474cca8b8e2ed539ea74377506638d300be7eb750b3f3285477d8c9a375\": container with ID starting with a5090474cca8b8e2ed539ea74377506638d300be7eb750b3f3285477d8c9a375 not found: ID does not exist" Dec 03 06:53:45 crc kubenswrapper[4475]: I1203 06:53:45.443860 4475 scope.go:117] "RemoveContainer" containerID="53948489397bbbfdf5f766211088d7f12fcd2dfbc8c3da6493e5abc49e3b41f5" Dec 03 06:53:45 crc kubenswrapper[4475]: I1203 06:53:45.444923 4475 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"53948489397bbbfdf5f766211088d7f12fcd2dfbc8c3da6493e5abc49e3b41f5"} err="failed to get container status \"53948489397bbbfdf5f766211088d7f12fcd2dfbc8c3da6493e5abc49e3b41f5\": rpc error: code = NotFound desc = could not find container \"53948489397bbbfdf5f766211088d7f12fcd2dfbc8c3da6493e5abc49e3b41f5\": container with ID starting with 53948489397bbbfdf5f766211088d7f12fcd2dfbc8c3da6493e5abc49e3b41f5 not found: ID does not exist" Dec 03 06:53:45 crc kubenswrapper[4475]: I1203 06:53:45.444941 4475 scope.go:117] "RemoveContainer" containerID="32897756f3658fda95db77180a0553a9d8656ed49c3ae5a017d32f5c5133a5a9" Dec 03 06:53:45 crc kubenswrapper[4475]: I1203 06:53:45.445142 4475 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"32897756f3658fda95db77180a0553a9d8656ed49c3ae5a017d32f5c5133a5a9"} err="failed to get container status \"32897756f3658fda95db77180a0553a9d8656ed49c3ae5a017d32f5c5133a5a9\": rpc error: code = NotFound desc = could 
not find container \"32897756f3658fda95db77180a0553a9d8656ed49c3ae5a017d32f5c5133a5a9\": container with ID starting with 32897756f3658fda95db77180a0553a9d8656ed49c3ae5a017d32f5c5133a5a9 not found: ID does not exist" Dec 03 06:53:45 crc kubenswrapper[4475]: I1203 06:53:45.445167 4475 scope.go:117] "RemoveContainer" containerID="5e288f95676d5823cd3cb005318489d2f629a8fb74ad17ce6a67978d76006192" Dec 03 06:53:45 crc kubenswrapper[4475]: I1203 06:53:45.445363 4475 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5e288f95676d5823cd3cb005318489d2f629a8fb74ad17ce6a67978d76006192"} err="failed to get container status \"5e288f95676d5823cd3cb005318489d2f629a8fb74ad17ce6a67978d76006192\": rpc error: code = NotFound desc = could not find container \"5e288f95676d5823cd3cb005318489d2f629a8fb74ad17ce6a67978d76006192\": container with ID starting with 5e288f95676d5823cd3cb005318489d2f629a8fb74ad17ce6a67978d76006192 not found: ID does not exist" Dec 03 06:53:45 crc kubenswrapper[4475]: I1203 06:53:45.445380 4475 scope.go:117] "RemoveContainer" containerID="400610ebcdc7d47ecc1345287847a1909871411a12cdb3cbf895e05039b81c2b" Dec 03 06:53:45 crc kubenswrapper[4475]: I1203 06:53:45.445559 4475 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"400610ebcdc7d47ecc1345287847a1909871411a12cdb3cbf895e05039b81c2b"} err="failed to get container status \"400610ebcdc7d47ecc1345287847a1909871411a12cdb3cbf895e05039b81c2b\": rpc error: code = NotFound desc = could not find container \"400610ebcdc7d47ecc1345287847a1909871411a12cdb3cbf895e05039b81c2b\": container with ID starting with 400610ebcdc7d47ecc1345287847a1909871411a12cdb3cbf895e05039b81c2b not found: ID does not exist" Dec 03 06:53:45 crc kubenswrapper[4475]: I1203 06:53:45.445577 4475 scope.go:117] "RemoveContainer" containerID="a6d3e6ea4e349dee8dc1ae7d0814640bae610b259e41099993afa21cb3b1aa88" Dec 03 06:53:45 crc kubenswrapper[4475]: I1203 
06:53:45.445750 4475 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a6d3e6ea4e349dee8dc1ae7d0814640bae610b259e41099993afa21cb3b1aa88"} err="failed to get container status \"a6d3e6ea4e349dee8dc1ae7d0814640bae610b259e41099993afa21cb3b1aa88\": rpc error: code = NotFound desc = could not find container \"a6d3e6ea4e349dee8dc1ae7d0814640bae610b259e41099993afa21cb3b1aa88\": container with ID starting with a6d3e6ea4e349dee8dc1ae7d0814640bae610b259e41099993afa21cb3b1aa88 not found: ID does not exist" Dec 03 06:53:45 crc kubenswrapper[4475]: I1203 06:53:45.445769 4475 scope.go:117] "RemoveContainer" containerID="c3d53d023db886d8a8772c0790104577a7a8914b8cf882b251e44407064c3141" Dec 03 06:53:45 crc kubenswrapper[4475]: I1203 06:53:45.445978 4475 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c3d53d023db886d8a8772c0790104577a7a8914b8cf882b251e44407064c3141"} err="failed to get container status \"c3d53d023db886d8a8772c0790104577a7a8914b8cf882b251e44407064c3141\": rpc error: code = NotFound desc = could not find container \"c3d53d023db886d8a8772c0790104577a7a8914b8cf882b251e44407064c3141\": container with ID starting with c3d53d023db886d8a8772c0790104577a7a8914b8cf882b251e44407064c3141 not found: ID does not exist" Dec 03 06:53:45 crc kubenswrapper[4475]: I1203 06:53:45.445995 4475 scope.go:117] "RemoveContainer" containerID="66a9c7568957099255bc910496da695e2af0122f2c853c3e221c666d7c2dee78" Dec 03 06:53:45 crc kubenswrapper[4475]: I1203 06:53:45.446157 4475 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"66a9c7568957099255bc910496da695e2af0122f2c853c3e221c666d7c2dee78"} err="failed to get container status \"66a9c7568957099255bc910496da695e2af0122f2c853c3e221c666d7c2dee78\": rpc error: code = NotFound desc = could not find container \"66a9c7568957099255bc910496da695e2af0122f2c853c3e221c666d7c2dee78\": container with ID starting with 
66a9c7568957099255bc910496da695e2af0122f2c853c3e221c666d7c2dee78 not found: ID does not exist" Dec 03 06:53:45 crc kubenswrapper[4475]: I1203 06:53:45.446173 4475 scope.go:117] "RemoveContainer" containerID="60d3ec7cab1f249e81ae1db9ab97fa02e8b3c9d8376af4c6682dc3fc6f9d6d92" Dec 03 06:53:45 crc kubenswrapper[4475]: I1203 06:53:45.446344 4475 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"60d3ec7cab1f249e81ae1db9ab97fa02e8b3c9d8376af4c6682dc3fc6f9d6d92"} err="failed to get container status \"60d3ec7cab1f249e81ae1db9ab97fa02e8b3c9d8376af4c6682dc3fc6f9d6d92\": rpc error: code = NotFound desc = could not find container \"60d3ec7cab1f249e81ae1db9ab97fa02e8b3c9d8376af4c6682dc3fc6f9d6d92\": container with ID starting with 60d3ec7cab1f249e81ae1db9ab97fa02e8b3c9d8376af4c6682dc3fc6f9d6d92 not found: ID does not exist" Dec 03 06:53:45 crc kubenswrapper[4475]: I1203 06:53:45.446363 4475 scope.go:117] "RemoveContainer" containerID="b3243c863a4fb593b39fc3e3b835f647e9373d8b2dec69c5ff7657ed73c8f78a" Dec 03 06:53:45 crc kubenswrapper[4475]: I1203 06:53:45.446534 4475 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b3243c863a4fb593b39fc3e3b835f647e9373d8b2dec69c5ff7657ed73c8f78a"} err="failed to get container status \"b3243c863a4fb593b39fc3e3b835f647e9373d8b2dec69c5ff7657ed73c8f78a\": rpc error: code = NotFound desc = could not find container \"b3243c863a4fb593b39fc3e3b835f647e9373d8b2dec69c5ff7657ed73c8f78a\": container with ID starting with b3243c863a4fb593b39fc3e3b835f647e9373d8b2dec69c5ff7657ed73c8f78a not found: ID does not exist" Dec 03 06:53:45 crc kubenswrapper[4475]: I1203 06:53:45.446546 4475 scope.go:117] "RemoveContainer" containerID="a5090474cca8b8e2ed539ea74377506638d300be7eb750b3f3285477d8c9a375" Dec 03 06:53:45 crc kubenswrapper[4475]: I1203 06:53:45.446704 4475 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"a5090474cca8b8e2ed539ea74377506638d300be7eb750b3f3285477d8c9a375"} err="failed to get container status \"a5090474cca8b8e2ed539ea74377506638d300be7eb750b3f3285477d8c9a375\": rpc error: code = NotFound desc = could not find container \"a5090474cca8b8e2ed539ea74377506638d300be7eb750b3f3285477d8c9a375\": container with ID starting with a5090474cca8b8e2ed539ea74377506638d300be7eb750b3f3285477d8c9a375 not found: ID does not exist" Dec 03 06:53:45 crc kubenswrapper[4475]: I1203 06:53:45.446722 4475 scope.go:117] "RemoveContainer" containerID="53948489397bbbfdf5f766211088d7f12fcd2dfbc8c3da6493e5abc49e3b41f5" Dec 03 06:53:45 crc kubenswrapper[4475]: I1203 06:53:45.446866 4475 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"53948489397bbbfdf5f766211088d7f12fcd2dfbc8c3da6493e5abc49e3b41f5"} err="failed to get container status \"53948489397bbbfdf5f766211088d7f12fcd2dfbc8c3da6493e5abc49e3b41f5\": rpc error: code = NotFound desc = could not find container \"53948489397bbbfdf5f766211088d7f12fcd2dfbc8c3da6493e5abc49e3b41f5\": container with ID starting with 53948489397bbbfdf5f766211088d7f12fcd2dfbc8c3da6493e5abc49e3b41f5 not found: ID does not exist" Dec 03 06:53:45 crc kubenswrapper[4475]: I1203 06:53:45.446885 4475 scope.go:117] "RemoveContainer" containerID="32897756f3658fda95db77180a0553a9d8656ed49c3ae5a017d32f5c5133a5a9" Dec 03 06:53:45 crc kubenswrapper[4475]: I1203 06:53:45.447050 4475 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"32897756f3658fda95db77180a0553a9d8656ed49c3ae5a017d32f5c5133a5a9"} err="failed to get container status \"32897756f3658fda95db77180a0553a9d8656ed49c3ae5a017d32f5c5133a5a9\": rpc error: code = NotFound desc = could not find container \"32897756f3658fda95db77180a0553a9d8656ed49c3ae5a017d32f5c5133a5a9\": container with ID starting with 32897756f3658fda95db77180a0553a9d8656ed49c3ae5a017d32f5c5133a5a9 not found: ID does not 
exist" Dec 03 06:53:45 crc kubenswrapper[4475]: I1203 06:53:45.447062 4475 scope.go:117] "RemoveContainer" containerID="5e288f95676d5823cd3cb005318489d2f629a8fb74ad17ce6a67978d76006192" Dec 03 06:53:45 crc kubenswrapper[4475]: I1203 06:53:45.447194 4475 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5e288f95676d5823cd3cb005318489d2f629a8fb74ad17ce6a67978d76006192"} err="failed to get container status \"5e288f95676d5823cd3cb005318489d2f629a8fb74ad17ce6a67978d76006192\": rpc error: code = NotFound desc = could not find container \"5e288f95676d5823cd3cb005318489d2f629a8fb74ad17ce6a67978d76006192\": container with ID starting with 5e288f95676d5823cd3cb005318489d2f629a8fb74ad17ce6a67978d76006192 not found: ID does not exist" Dec 03 06:53:45 crc kubenswrapper[4475]: I1203 06:53:45.447209 4475 scope.go:117] "RemoveContainer" containerID="400610ebcdc7d47ecc1345287847a1909871411a12cdb3cbf895e05039b81c2b" Dec 03 06:53:45 crc kubenswrapper[4475]: I1203 06:53:45.447338 4475 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"400610ebcdc7d47ecc1345287847a1909871411a12cdb3cbf895e05039b81c2b"} err="failed to get container status \"400610ebcdc7d47ecc1345287847a1909871411a12cdb3cbf895e05039b81c2b\": rpc error: code = NotFound desc = could not find container \"400610ebcdc7d47ecc1345287847a1909871411a12cdb3cbf895e05039b81c2b\": container with ID starting with 400610ebcdc7d47ecc1345287847a1909871411a12cdb3cbf895e05039b81c2b not found: ID does not exist" Dec 03 06:53:45 crc kubenswrapper[4475]: I1203 06:53:45.496186 4475 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8f42839e-dbc4-445a-a15b-c3aa14813958" path="/var/lib/kubelet/pods/8f42839e-dbc4-445a-a15b-c3aa14813958/volumes" Dec 03 06:53:46 crc kubenswrapper[4475]: I1203 06:53:46.235037 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-qzvpv" 
event={"ID":"d4f2e301-842d-4132-b9ad-181fbfbba4a6","Type":"ContainerStarted","Data":"7122a53ca8c57c4c29ffcfb92ad055095323e776939371acca8709e6cb5e3e2a"} Dec 03 06:53:46 crc kubenswrapper[4475]: I1203 06:53:46.235209 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-qzvpv" event={"ID":"d4f2e301-842d-4132-b9ad-181fbfbba4a6","Type":"ContainerStarted","Data":"b16c38c5ba5c9fdb13ff2507b30f30719ba80a00d6de3d0ec682a4efd3005436"} Dec 03 06:53:46 crc kubenswrapper[4475]: I1203 06:53:46.235220 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-qzvpv" event={"ID":"d4f2e301-842d-4132-b9ad-181fbfbba4a6","Type":"ContainerStarted","Data":"6665de1adc9e99b91d88c56abe922e047c33fda96fbfa8aa2f1ebb5cc3edb3ac"} Dec 03 06:53:46 crc kubenswrapper[4475]: I1203 06:53:46.235228 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-qzvpv" event={"ID":"d4f2e301-842d-4132-b9ad-181fbfbba4a6","Type":"ContainerStarted","Data":"916e8068c4faeb5b2e90ed6fcc5d032952c2b00f1b59533cdfdc4199086fc9d2"} Dec 03 06:53:46 crc kubenswrapper[4475]: I1203 06:53:46.235235 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-qzvpv" event={"ID":"d4f2e301-842d-4132-b9ad-181fbfbba4a6","Type":"ContainerStarted","Data":"c5e9ea64dec95624dd7af7cfac4ca9ad4f4358dc6fa89fc00c1291724113a2b9"} Dec 03 06:53:46 crc kubenswrapper[4475]: I1203 06:53:46.235244 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-qzvpv" event={"ID":"d4f2e301-842d-4132-b9ad-181fbfbba4a6","Type":"ContainerStarted","Data":"93008d6834489aed6b84e6258a87f79eb1331cffc2c3e17cb4d9d5337369c3a2"} Dec 03 06:53:48 crc kubenswrapper[4475]: I1203 06:53:48.253075 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-qzvpv" 
event={"ID":"d4f2e301-842d-4132-b9ad-181fbfbba4a6","Type":"ContainerStarted","Data":"42fa36e7f5fe990d632bd3032f4a2f5d4560673d4248e988e8dfbe27f3186fa0"} Dec 03 06:53:50 crc kubenswrapper[4475]: I1203 06:53:50.264508 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-qzvpv" event={"ID":"d4f2e301-842d-4132-b9ad-181fbfbba4a6","Type":"ContainerStarted","Data":"03d17def51d86f256c5381fc1f525bd272b15c2f7516844c7a064bff546a48a4"} Dec 03 06:53:50 crc kubenswrapper[4475]: I1203 06:53:50.265642 4475 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-qzvpv" Dec 03 06:53:50 crc kubenswrapper[4475]: I1203 06:53:50.265674 4475 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-qzvpv" Dec 03 06:53:50 crc kubenswrapper[4475]: I1203 06:53:50.265712 4475 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-qzvpv" Dec 03 06:53:50 crc kubenswrapper[4475]: I1203 06:53:50.283551 4475 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-qzvpv" Dec 03 06:53:50 crc kubenswrapper[4475]: I1203 06:53:50.284978 4475 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-qzvpv" Dec 03 06:53:50 crc kubenswrapper[4475]: I1203 06:53:50.308463 4475 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-qzvpv" podStartSLOduration=6.308433743 podStartE2EDuration="6.308433743s" podCreationTimestamp="2025-12-03 06:53:44 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 06:53:50.289179953 +0000 UTC m=+515.094078288" watchObservedRunningTime="2025-12-03 06:53:50.308433743 +0000 UTC m=+515.113332077" Dec 03 06:53:54 crc 
kubenswrapper[4475]: I1203 06:53:54.491072 4475 scope.go:117] "RemoveContainer" containerID="2e2971b82e4f9806c53d67763a76ebe8ebaaf116ff13a887e7d02d3fd665eafe" Dec 03 06:53:54 crc kubenswrapper[4475]: E1203 06:53:54.491400 4475 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-multus\" with CrashLoopBackOff: \"back-off 20s restarting failed container=kube-multus pod=multus-9b2j8_openshift-multus(f3a17c67-95e0-4889-8a30-64c08b6720f4)\"" pod="openshift-multus/multus-9b2j8" podUID="f3a17c67-95e0-4889-8a30-64c08b6720f4" Dec 03 06:54:09 crc kubenswrapper[4475]: I1203 06:54:09.491795 4475 scope.go:117] "RemoveContainer" containerID="2e2971b82e4f9806c53d67763a76ebe8ebaaf116ff13a887e7d02d3fd665eafe" Dec 03 06:54:10 crc kubenswrapper[4475]: I1203 06:54:10.338852 4475 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-9b2j8_f3a17c67-95e0-4889-8a30-64c08b6720f4/kube-multus/2.log" Dec 03 06:54:10 crc kubenswrapper[4475]: I1203 06:54:10.339057 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-9b2j8" event={"ID":"f3a17c67-95e0-4889-8a30-64c08b6720f4","Type":"ContainerStarted","Data":"249f649d4ee97cda3f6116872542008d02533792bab30f74e3a2c84d3010a683"} Dec 03 06:54:13 crc kubenswrapper[4475]: I1203 06:54:13.234778 4475 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fsvmcd"] Dec 03 06:54:13 crc kubenswrapper[4475]: I1203 06:54:13.235932 4475 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fsvmcd" Dec 03 06:54:13 crc kubenswrapper[4475]: I1203 06:54:13.237553 4475 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"default-dockercfg-vmwhc" Dec 03 06:54:13 crc kubenswrapper[4475]: I1203 06:54:13.243739 4475 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fsvmcd"] Dec 03 06:54:13 crc kubenswrapper[4475]: I1203 06:54:13.335121 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m2rnb\" (UniqueName: \"kubernetes.io/projected/d039f93a-9cbc-4b89-a072-33a04e41750b-kube-api-access-m2rnb\") pod \"5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fsvmcd\" (UID: \"d039f93a-9cbc-4b89-a072-33a04e41750b\") " pod="openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fsvmcd" Dec 03 06:54:13 crc kubenswrapper[4475]: I1203 06:54:13.335157 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/d039f93a-9cbc-4b89-a072-33a04e41750b-util\") pod \"5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fsvmcd\" (UID: \"d039f93a-9cbc-4b89-a072-33a04e41750b\") " pod="openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fsvmcd" Dec 03 06:54:13 crc kubenswrapper[4475]: I1203 06:54:13.335202 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/d039f93a-9cbc-4b89-a072-33a04e41750b-bundle\") pod \"5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fsvmcd\" (UID: \"d039f93a-9cbc-4b89-a072-33a04e41750b\") " pod="openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fsvmcd" Dec 03 06:54:13 crc kubenswrapper[4475]: 
I1203 06:54:13.435955 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-m2rnb\" (UniqueName: \"kubernetes.io/projected/d039f93a-9cbc-4b89-a072-33a04e41750b-kube-api-access-m2rnb\") pod \"5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fsvmcd\" (UID: \"d039f93a-9cbc-4b89-a072-33a04e41750b\") " pod="openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fsvmcd" Dec 03 06:54:13 crc kubenswrapper[4475]: I1203 06:54:13.435998 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/d039f93a-9cbc-4b89-a072-33a04e41750b-util\") pod \"5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fsvmcd\" (UID: \"d039f93a-9cbc-4b89-a072-33a04e41750b\") " pod="openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fsvmcd" Dec 03 06:54:13 crc kubenswrapper[4475]: I1203 06:54:13.436030 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/d039f93a-9cbc-4b89-a072-33a04e41750b-bundle\") pod \"5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fsvmcd\" (UID: \"d039f93a-9cbc-4b89-a072-33a04e41750b\") " pod="openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fsvmcd" Dec 03 06:54:13 crc kubenswrapper[4475]: I1203 06:54:13.436406 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/d039f93a-9cbc-4b89-a072-33a04e41750b-bundle\") pod \"5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fsvmcd\" (UID: \"d039f93a-9cbc-4b89-a072-33a04e41750b\") " pod="openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fsvmcd" Dec 03 06:54:13 crc kubenswrapper[4475]: I1203 06:54:13.436691 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: 
\"kubernetes.io/empty-dir/d039f93a-9cbc-4b89-a072-33a04e41750b-util\") pod \"5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fsvmcd\" (UID: \"d039f93a-9cbc-4b89-a072-33a04e41750b\") " pod="openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fsvmcd" Dec 03 06:54:13 crc kubenswrapper[4475]: I1203 06:54:13.450781 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-m2rnb\" (UniqueName: \"kubernetes.io/projected/d039f93a-9cbc-4b89-a072-33a04e41750b-kube-api-access-m2rnb\") pod \"5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fsvmcd\" (UID: \"d039f93a-9cbc-4b89-a072-33a04e41750b\") " pod="openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fsvmcd" Dec 03 06:54:13 crc kubenswrapper[4475]: I1203 06:54:13.546842 4475 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fsvmcd" Dec 03 06:54:13 crc kubenswrapper[4475]: I1203 06:54:13.880987 4475 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fsvmcd"] Dec 03 06:54:13 crc kubenswrapper[4475]: W1203 06:54:13.886373 4475 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd039f93a_9cbc_4b89_a072_33a04e41750b.slice/crio-b863283f13933279db01d8980e8ff34c06ca14602a4c14204d0e9180d7da22f8 WatchSource:0}: Error finding container b863283f13933279db01d8980e8ff34c06ca14602a4c14204d0e9180d7da22f8: Status 404 returned error can't find the container with id b863283f13933279db01d8980e8ff34c06ca14602a4c14204d0e9180d7da22f8 Dec 03 06:54:14 crc kubenswrapper[4475]: I1203 06:54:14.355432 4475 generic.go:334] "Generic (PLEG): container finished" podID="d039f93a-9cbc-4b89-a072-33a04e41750b" containerID="901093c6b03ac96acc36b5b023dbc9fe7bc9a60503b6d643cfde9079a46bdb92" exitCode=0 
Dec 03 06:54:14 crc kubenswrapper[4475]: I1203 06:54:14.355494 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fsvmcd" event={"ID":"d039f93a-9cbc-4b89-a072-33a04e41750b","Type":"ContainerDied","Data":"901093c6b03ac96acc36b5b023dbc9fe7bc9a60503b6d643cfde9079a46bdb92"} Dec 03 06:54:14 crc kubenswrapper[4475]: I1203 06:54:14.355546 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fsvmcd" event={"ID":"d039f93a-9cbc-4b89-a072-33a04e41750b","Type":"ContainerStarted","Data":"b863283f13933279db01d8980e8ff34c06ca14602a4c14204d0e9180d7da22f8"} Dec 03 06:54:14 crc kubenswrapper[4475]: I1203 06:54:14.707575 4475 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-qzvpv" Dec 03 06:54:16 crc kubenswrapper[4475]: I1203 06:54:16.370901 4475 generic.go:334] "Generic (PLEG): container finished" podID="d039f93a-9cbc-4b89-a072-33a04e41750b" containerID="cb15a1cf07f75e78a2d2dcfe3d01e41e2fff5d466210d4e71800c19148b5d04d" exitCode=0 Dec 03 06:54:16 crc kubenswrapper[4475]: I1203 06:54:16.371258 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fsvmcd" event={"ID":"d039f93a-9cbc-4b89-a072-33a04e41750b","Type":"ContainerDied","Data":"cb15a1cf07f75e78a2d2dcfe3d01e41e2fff5d466210d4e71800c19148b5d04d"} Dec 03 06:54:17 crc kubenswrapper[4475]: I1203 06:54:17.380876 4475 generic.go:334] "Generic (PLEG): container finished" podID="d039f93a-9cbc-4b89-a072-33a04e41750b" containerID="1269b02a2c35eeb59e703df5e2d8fca7dd719645127f3c6b654a35f1b072be0f" exitCode=0 Dec 03 06:54:17 crc kubenswrapper[4475]: I1203 06:54:17.380917 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fsvmcd" 
event={"ID":"d039f93a-9cbc-4b89-a072-33a04e41750b","Type":"ContainerDied","Data":"1269b02a2c35eeb59e703df5e2d8fca7dd719645127f3c6b654a35f1b072be0f"} Dec 03 06:54:18 crc kubenswrapper[4475]: I1203 06:54:18.545210 4475 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fsvmcd" Dec 03 06:54:18 crc kubenswrapper[4475]: I1203 06:54:18.689866 4475 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-m2rnb\" (UniqueName: \"kubernetes.io/projected/d039f93a-9cbc-4b89-a072-33a04e41750b-kube-api-access-m2rnb\") pod \"d039f93a-9cbc-4b89-a072-33a04e41750b\" (UID: \"d039f93a-9cbc-4b89-a072-33a04e41750b\") " Dec 03 06:54:18 crc kubenswrapper[4475]: I1203 06:54:18.689933 4475 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/d039f93a-9cbc-4b89-a072-33a04e41750b-util\") pod \"d039f93a-9cbc-4b89-a072-33a04e41750b\" (UID: \"d039f93a-9cbc-4b89-a072-33a04e41750b\") " Dec 03 06:54:18 crc kubenswrapper[4475]: I1203 06:54:18.689952 4475 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/d039f93a-9cbc-4b89-a072-33a04e41750b-bundle\") pod \"d039f93a-9cbc-4b89-a072-33a04e41750b\" (UID: \"d039f93a-9cbc-4b89-a072-33a04e41750b\") " Dec 03 06:54:18 crc kubenswrapper[4475]: I1203 06:54:18.690534 4475 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d039f93a-9cbc-4b89-a072-33a04e41750b-bundle" (OuterVolumeSpecName: "bundle") pod "d039f93a-9cbc-4b89-a072-33a04e41750b" (UID: "d039f93a-9cbc-4b89-a072-33a04e41750b"). InnerVolumeSpecName "bundle". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 06:54:18 crc kubenswrapper[4475]: I1203 06:54:18.694778 4475 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d039f93a-9cbc-4b89-a072-33a04e41750b-kube-api-access-m2rnb" (OuterVolumeSpecName: "kube-api-access-m2rnb") pod "d039f93a-9cbc-4b89-a072-33a04e41750b" (UID: "d039f93a-9cbc-4b89-a072-33a04e41750b"). InnerVolumeSpecName "kube-api-access-m2rnb". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 06:54:18 crc kubenswrapper[4475]: I1203 06:54:18.700008 4475 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d039f93a-9cbc-4b89-a072-33a04e41750b-util" (OuterVolumeSpecName: "util") pod "d039f93a-9cbc-4b89-a072-33a04e41750b" (UID: "d039f93a-9cbc-4b89-a072-33a04e41750b"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 06:54:18 crc kubenswrapper[4475]: I1203 06:54:18.791469 4475 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/d039f93a-9cbc-4b89-a072-33a04e41750b-util\") on node \"crc\" DevicePath \"\"" Dec 03 06:54:18 crc kubenswrapper[4475]: I1203 06:54:18.791497 4475 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/d039f93a-9cbc-4b89-a072-33a04e41750b-bundle\") on node \"crc\" DevicePath \"\"" Dec 03 06:54:18 crc kubenswrapper[4475]: I1203 06:54:18.791511 4475 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-m2rnb\" (UniqueName: \"kubernetes.io/projected/d039f93a-9cbc-4b89-a072-33a04e41750b-kube-api-access-m2rnb\") on node \"crc\" DevicePath \"\"" Dec 03 06:54:19 crc kubenswrapper[4475]: I1203 06:54:19.389748 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fsvmcd" 
event={"ID":"d039f93a-9cbc-4b89-a072-33a04e41750b","Type":"ContainerDied","Data":"b863283f13933279db01d8980e8ff34c06ca14602a4c14204d0e9180d7da22f8"} Dec 03 06:54:19 crc kubenswrapper[4475]: I1203 06:54:19.389779 4475 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b863283f13933279db01d8980e8ff34c06ca14602a4c14204d0e9180d7da22f8" Dec 03 06:54:19 crc kubenswrapper[4475]: I1203 06:54:19.389794 4475 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fsvmcd" Dec 03 06:54:21 crc kubenswrapper[4475]: I1203 06:54:21.082343 4475 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-operator-5b5b58f5c8-vr9qp"] Dec 03 06:54:21 crc kubenswrapper[4475]: E1203 06:54:21.082699 4475 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d039f93a-9cbc-4b89-a072-33a04e41750b" containerName="util" Dec 03 06:54:21 crc kubenswrapper[4475]: I1203 06:54:21.082710 4475 state_mem.go:107] "Deleted CPUSet assignment" podUID="d039f93a-9cbc-4b89-a072-33a04e41750b" containerName="util" Dec 03 06:54:21 crc kubenswrapper[4475]: E1203 06:54:21.082724 4475 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d039f93a-9cbc-4b89-a072-33a04e41750b" containerName="extract" Dec 03 06:54:21 crc kubenswrapper[4475]: I1203 06:54:21.082730 4475 state_mem.go:107] "Deleted CPUSet assignment" podUID="d039f93a-9cbc-4b89-a072-33a04e41750b" containerName="extract" Dec 03 06:54:21 crc kubenswrapper[4475]: E1203 06:54:21.082741 4475 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d039f93a-9cbc-4b89-a072-33a04e41750b" containerName="pull" Dec 03 06:54:21 crc kubenswrapper[4475]: I1203 06:54:21.082746 4475 state_mem.go:107] "Deleted CPUSet assignment" podUID="d039f93a-9cbc-4b89-a072-33a04e41750b" containerName="pull" Dec 03 06:54:21 crc kubenswrapper[4475]: I1203 06:54:21.082849 4475 memory_manager.go:354] 
"RemoveStaleState removing state" podUID="d039f93a-9cbc-4b89-a072-33a04e41750b" containerName="extract" Dec 03 06:54:21 crc kubenswrapper[4475]: I1203 06:54:21.083158 4475 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-operator-5b5b58f5c8-vr9qp" Dec 03 06:54:21 crc kubenswrapper[4475]: I1203 06:54:21.084963 4475 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"nmstate-operator-dockercfg-72fjb" Dec 03 06:54:21 crc kubenswrapper[4475]: I1203 06:54:21.085170 4475 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-nmstate"/"openshift-service-ca.crt" Dec 03 06:54:21 crc kubenswrapper[4475]: I1203 06:54:21.085287 4475 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-nmstate"/"kube-root-ca.crt" Dec 03 06:54:21 crc kubenswrapper[4475]: I1203 06:54:21.094846 4475 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-operator-5b5b58f5c8-vr9qp"] Dec 03 06:54:21 crc kubenswrapper[4475]: I1203 06:54:21.120309 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fczfx\" (UniqueName: \"kubernetes.io/projected/faedbce8-2bde-4b18-ae3e-1b4b38be191f-kube-api-access-fczfx\") pod \"nmstate-operator-5b5b58f5c8-vr9qp\" (UID: \"faedbce8-2bde-4b18-ae3e-1b4b38be191f\") " pod="openshift-nmstate/nmstate-operator-5b5b58f5c8-vr9qp" Dec 03 06:54:21 crc kubenswrapper[4475]: I1203 06:54:21.220971 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fczfx\" (UniqueName: \"kubernetes.io/projected/faedbce8-2bde-4b18-ae3e-1b4b38be191f-kube-api-access-fczfx\") pod \"nmstate-operator-5b5b58f5c8-vr9qp\" (UID: \"faedbce8-2bde-4b18-ae3e-1b4b38be191f\") " pod="openshift-nmstate/nmstate-operator-5b5b58f5c8-vr9qp" Dec 03 06:54:21 crc kubenswrapper[4475]: I1203 06:54:21.235632 4475 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"kube-api-access-fczfx\" (UniqueName: \"kubernetes.io/projected/faedbce8-2bde-4b18-ae3e-1b4b38be191f-kube-api-access-fczfx\") pod \"nmstate-operator-5b5b58f5c8-vr9qp\" (UID: \"faedbce8-2bde-4b18-ae3e-1b4b38be191f\") " pod="openshift-nmstate/nmstate-operator-5b5b58f5c8-vr9qp" Dec 03 06:54:21 crc kubenswrapper[4475]: I1203 06:54:21.394358 4475 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-operator-5b5b58f5c8-vr9qp" Dec 03 06:54:21 crc kubenswrapper[4475]: I1203 06:54:21.745832 4475 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-operator-5b5b58f5c8-vr9qp"] Dec 03 06:54:21 crc kubenswrapper[4475]: W1203 06:54:21.750524 4475 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podfaedbce8_2bde_4b18_ae3e_1b4b38be191f.slice/crio-265a6bc61156a209a6aee42107d12835c282cd3b9853b10f7c3ca0c4322e496e WatchSource:0}: Error finding container 265a6bc61156a209a6aee42107d12835c282cd3b9853b10f7c3ca0c4322e496e: Status 404 returned error can't find the container with id 265a6bc61156a209a6aee42107d12835c282cd3b9853b10f7c3ca0c4322e496e Dec 03 06:54:22 crc kubenswrapper[4475]: I1203 06:54:22.403076 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-operator-5b5b58f5c8-vr9qp" event={"ID":"faedbce8-2bde-4b18-ae3e-1b4b38be191f","Type":"ContainerStarted","Data":"265a6bc61156a209a6aee42107d12835c282cd3b9853b10f7c3ca0c4322e496e"} Dec 03 06:54:24 crc kubenswrapper[4475]: I1203 06:54:24.416897 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-operator-5b5b58f5c8-vr9qp" event={"ID":"faedbce8-2bde-4b18-ae3e-1b4b38be191f","Type":"ContainerStarted","Data":"c0fbec80762fba1006eb4ba93275391a7b19353695bd1065d475cd222bcb2f4d"} Dec 03 06:54:24 crc kubenswrapper[4475]: I1203 06:54:24.429397 4475 pod_startup_latency_tracker.go:104] "Observed pod startup 
duration" pod="openshift-nmstate/nmstate-operator-5b5b58f5c8-vr9qp" podStartSLOduration=1.75930295 podStartE2EDuration="3.429382956s" podCreationTimestamp="2025-12-03 06:54:21 +0000 UTC" firstStartedPulling="2025-12-03 06:54:21.75216978 +0000 UTC m=+546.557068114" lastFinishedPulling="2025-12-03 06:54:23.422249785 +0000 UTC m=+548.227148120" observedRunningTime="2025-12-03 06:54:24.428421868 +0000 UTC m=+549.233320202" watchObservedRunningTime="2025-12-03 06:54:24.429382956 +0000 UTC m=+549.234281289" Dec 03 06:54:25 crc kubenswrapper[4475]: I1203 06:54:25.198084 4475 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-metrics-7f946cbc9-qbz9q"] Dec 03 06:54:25 crc kubenswrapper[4475]: I1203 06:54:25.198947 4475 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-metrics-7f946cbc9-qbz9q" Dec 03 06:54:25 crc kubenswrapper[4475]: I1203 06:54:25.201626 4475 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"nmstate-handler-dockercfg-25wzr" Dec 03 06:54:25 crc kubenswrapper[4475]: I1203 06:54:25.205099 4475 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-metrics-7f946cbc9-qbz9q"] Dec 03 06:54:25 crc kubenswrapper[4475]: I1203 06:54:25.211178 4475 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-webhook-5f6d4c5ccb-q9gh9"] Dec 03 06:54:25 crc kubenswrapper[4475]: I1203 06:54:25.211736 4475 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-nmstate/nmstate-webhook-5f6d4c5ccb-q9gh9" Dec 03 06:54:25 crc kubenswrapper[4475]: I1203 06:54:25.222469 4475 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"openshift-nmstate-webhook" Dec 03 06:54:25 crc kubenswrapper[4475]: I1203 06:54:25.233758 4475 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-handler-bcvvb"] Dec 03 06:54:25 crc kubenswrapper[4475]: I1203 06:54:25.234296 4475 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-handler-bcvvb" Dec 03 06:54:25 crc kubenswrapper[4475]: I1203 06:54:25.238350 4475 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-webhook-5f6d4c5ccb-q9gh9"] Dec 03 06:54:25 crc kubenswrapper[4475]: I1203 06:54:25.266461 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nmstate-lock\" (UniqueName: \"kubernetes.io/host-path/f6f3f3f7-6a4b-4a08-9d41-c2437b3514e9-nmstate-lock\") pod \"nmstate-handler-bcvvb\" (UID: \"f6f3f3f7-6a4b-4a08-9d41-c2437b3514e9\") " pod="openshift-nmstate/nmstate-handler-bcvvb" Dec 03 06:54:25 crc kubenswrapper[4475]: I1203 06:54:25.266528 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cktbc\" (UniqueName: \"kubernetes.io/projected/f6f3f3f7-6a4b-4a08-9d41-c2437b3514e9-kube-api-access-cktbc\") pod \"nmstate-handler-bcvvb\" (UID: \"f6f3f3f7-6a4b-4a08-9d41-c2437b3514e9\") " pod="openshift-nmstate/nmstate-handler-bcvvb" Dec 03 06:54:25 crc kubenswrapper[4475]: I1203 06:54:25.266562 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dbus-socket\" (UniqueName: \"kubernetes.io/host-path/f6f3f3f7-6a4b-4a08-9d41-c2437b3514e9-dbus-socket\") pod \"nmstate-handler-bcvvb\" (UID: \"f6f3f3f7-6a4b-4a08-9d41-c2437b3514e9\") " pod="openshift-nmstate/nmstate-handler-bcvvb" Dec 03 
06:54:25 crc kubenswrapper[4475]: I1203 06:54:25.266579 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t8dx5\" (UniqueName: \"kubernetes.io/projected/1445cccd-0b52-44ac-9c56-ca17499b0bfb-kube-api-access-t8dx5\") pod \"nmstate-metrics-7f946cbc9-qbz9q\" (UID: \"1445cccd-0b52-44ac-9c56-ca17499b0bfb\") " pod="openshift-nmstate/nmstate-metrics-7f946cbc9-qbz9q" Dec 03 06:54:25 crc kubenswrapper[4475]: I1203 06:54:25.266611 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovs-socket\" (UniqueName: \"kubernetes.io/host-path/f6f3f3f7-6a4b-4a08-9d41-c2437b3514e9-ovs-socket\") pod \"nmstate-handler-bcvvb\" (UID: \"f6f3f3f7-6a4b-4a08-9d41-c2437b3514e9\") " pod="openshift-nmstate/nmstate-handler-bcvvb" Dec 03 06:54:25 crc kubenswrapper[4475]: I1203 06:54:25.266635 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-key-pair\" (UniqueName: \"kubernetes.io/secret/9aa1aae1-b774-43f4-9874-3740e3f74a1f-tls-key-pair\") pod \"nmstate-webhook-5f6d4c5ccb-q9gh9\" (UID: \"9aa1aae1-b774-43f4-9874-3740e3f74a1f\") " pod="openshift-nmstate/nmstate-webhook-5f6d4c5ccb-q9gh9" Dec 03 06:54:25 crc kubenswrapper[4475]: I1203 06:54:25.266674 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jb4vk\" (UniqueName: \"kubernetes.io/projected/9aa1aae1-b774-43f4-9874-3740e3f74a1f-kube-api-access-jb4vk\") pod \"nmstate-webhook-5f6d4c5ccb-q9gh9\" (UID: \"9aa1aae1-b774-43f4-9874-3740e3f74a1f\") " pod="openshift-nmstate/nmstate-webhook-5f6d4c5ccb-q9gh9" Dec 03 06:54:25 crc kubenswrapper[4475]: I1203 06:54:25.367813 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovs-socket\" (UniqueName: \"kubernetes.io/host-path/f6f3f3f7-6a4b-4a08-9d41-c2437b3514e9-ovs-socket\") pod \"nmstate-handler-bcvvb\" (UID: 
\"f6f3f3f7-6a4b-4a08-9d41-c2437b3514e9\") " pod="openshift-nmstate/nmstate-handler-bcvvb" Dec 03 06:54:25 crc kubenswrapper[4475]: I1203 06:54:25.367852 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tls-key-pair\" (UniqueName: \"kubernetes.io/secret/9aa1aae1-b774-43f4-9874-3740e3f74a1f-tls-key-pair\") pod \"nmstate-webhook-5f6d4c5ccb-q9gh9\" (UID: \"9aa1aae1-b774-43f4-9874-3740e3f74a1f\") " pod="openshift-nmstate/nmstate-webhook-5f6d4c5ccb-q9gh9" Dec 03 06:54:25 crc kubenswrapper[4475]: I1203 06:54:25.367884 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jb4vk\" (UniqueName: \"kubernetes.io/projected/9aa1aae1-b774-43f4-9874-3740e3f74a1f-kube-api-access-jb4vk\") pod \"nmstate-webhook-5f6d4c5ccb-q9gh9\" (UID: \"9aa1aae1-b774-43f4-9874-3740e3f74a1f\") " pod="openshift-nmstate/nmstate-webhook-5f6d4c5ccb-q9gh9" Dec 03 06:54:25 crc kubenswrapper[4475]: I1203 06:54:25.367919 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nmstate-lock\" (UniqueName: \"kubernetes.io/host-path/f6f3f3f7-6a4b-4a08-9d41-c2437b3514e9-nmstate-lock\") pod \"nmstate-handler-bcvvb\" (UID: \"f6f3f3f7-6a4b-4a08-9d41-c2437b3514e9\") " pod="openshift-nmstate/nmstate-handler-bcvvb" Dec 03 06:54:25 crc kubenswrapper[4475]: I1203 06:54:25.367947 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cktbc\" (UniqueName: \"kubernetes.io/projected/f6f3f3f7-6a4b-4a08-9d41-c2437b3514e9-kube-api-access-cktbc\") pod \"nmstate-handler-bcvvb\" (UID: \"f6f3f3f7-6a4b-4a08-9d41-c2437b3514e9\") " pod="openshift-nmstate/nmstate-handler-bcvvb" Dec 03 06:54:25 crc kubenswrapper[4475]: I1203 06:54:25.367955 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovs-socket\" (UniqueName: \"kubernetes.io/host-path/f6f3f3f7-6a4b-4a08-9d41-c2437b3514e9-ovs-socket\") pod \"nmstate-handler-bcvvb\" (UID: 
\"f6f3f3f7-6a4b-4a08-9d41-c2437b3514e9\") " pod="openshift-nmstate/nmstate-handler-bcvvb" Dec 03 06:54:25 crc kubenswrapper[4475]: I1203 06:54:25.367975 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dbus-socket\" (UniqueName: \"kubernetes.io/host-path/f6f3f3f7-6a4b-4a08-9d41-c2437b3514e9-dbus-socket\") pod \"nmstate-handler-bcvvb\" (UID: \"f6f3f3f7-6a4b-4a08-9d41-c2437b3514e9\") " pod="openshift-nmstate/nmstate-handler-bcvvb" Dec 03 06:54:25 crc kubenswrapper[4475]: I1203 06:54:25.368018 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-t8dx5\" (UniqueName: \"kubernetes.io/projected/1445cccd-0b52-44ac-9c56-ca17499b0bfb-kube-api-access-t8dx5\") pod \"nmstate-metrics-7f946cbc9-qbz9q\" (UID: \"1445cccd-0b52-44ac-9c56-ca17499b0bfb\") " pod="openshift-nmstate/nmstate-metrics-7f946cbc9-qbz9q" Dec 03 06:54:25 crc kubenswrapper[4475]: I1203 06:54:25.368167 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dbus-socket\" (UniqueName: \"kubernetes.io/host-path/f6f3f3f7-6a4b-4a08-9d41-c2437b3514e9-dbus-socket\") pod \"nmstate-handler-bcvvb\" (UID: \"f6f3f3f7-6a4b-4a08-9d41-c2437b3514e9\") " pod="openshift-nmstate/nmstate-handler-bcvvb" Dec 03 06:54:25 crc kubenswrapper[4475]: I1203 06:54:25.368211 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nmstate-lock\" (UniqueName: \"kubernetes.io/host-path/f6f3f3f7-6a4b-4a08-9d41-c2437b3514e9-nmstate-lock\") pod \"nmstate-handler-bcvvb\" (UID: \"f6f3f3f7-6a4b-4a08-9d41-c2437b3514e9\") " pod="openshift-nmstate/nmstate-handler-bcvvb" Dec 03 06:54:25 crc kubenswrapper[4475]: E1203 06:54:25.368335 4475 secret.go:188] Couldn't get secret openshift-nmstate/openshift-nmstate-webhook: secret "openshift-nmstate-webhook" not found Dec 03 06:54:25 crc kubenswrapper[4475]: E1203 06:54:25.368431 4475 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/secret/9aa1aae1-b774-43f4-9874-3740e3f74a1f-tls-key-pair podName:9aa1aae1-b774-43f4-9874-3740e3f74a1f nodeName:}" failed. No retries permitted until 2025-12-03 06:54:25.868415305 +0000 UTC m=+550.673313629 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "tls-key-pair" (UniqueName: "kubernetes.io/secret/9aa1aae1-b774-43f4-9874-3740e3f74a1f-tls-key-pair") pod "nmstate-webhook-5f6d4c5ccb-q9gh9" (UID: "9aa1aae1-b774-43f4-9874-3740e3f74a1f") : secret "openshift-nmstate-webhook" not found Dec 03 06:54:25 crc kubenswrapper[4475]: I1203 06:54:25.387513 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cktbc\" (UniqueName: \"kubernetes.io/projected/f6f3f3f7-6a4b-4a08-9d41-c2437b3514e9-kube-api-access-cktbc\") pod \"nmstate-handler-bcvvb\" (UID: \"f6f3f3f7-6a4b-4a08-9d41-c2437b3514e9\") " pod="openshift-nmstate/nmstate-handler-bcvvb" Dec 03 06:54:25 crc kubenswrapper[4475]: I1203 06:54:25.387521 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jb4vk\" (UniqueName: \"kubernetes.io/projected/9aa1aae1-b774-43f4-9874-3740e3f74a1f-kube-api-access-jb4vk\") pod \"nmstate-webhook-5f6d4c5ccb-q9gh9\" (UID: \"9aa1aae1-b774-43f4-9874-3740e3f74a1f\") " pod="openshift-nmstate/nmstate-webhook-5f6d4c5ccb-q9gh9" Dec 03 06:54:25 crc kubenswrapper[4475]: I1203 06:54:25.394987 4475 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-console-plugin-7fbb5f6569-hkcsg"] Dec 03 06:54:25 crc kubenswrapper[4475]: I1203 06:54:25.395645 4475 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-nmstate/nmstate-console-plugin-7fbb5f6569-hkcsg" Dec 03 06:54:25 crc kubenswrapper[4475]: I1203 06:54:25.399299 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-t8dx5\" (UniqueName: \"kubernetes.io/projected/1445cccd-0b52-44ac-9c56-ca17499b0bfb-kube-api-access-t8dx5\") pod \"nmstate-metrics-7f946cbc9-qbz9q\" (UID: \"1445cccd-0b52-44ac-9c56-ca17499b0bfb\") " pod="openshift-nmstate/nmstate-metrics-7f946cbc9-qbz9q" Dec 03 06:54:25 crc kubenswrapper[4475]: I1203 06:54:25.401420 4475 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"default-dockercfg-865sf" Dec 03 06:54:25 crc kubenswrapper[4475]: I1203 06:54:25.401471 4475 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-nmstate"/"nginx-conf" Dec 03 06:54:25 crc kubenswrapper[4475]: I1203 06:54:25.401903 4475 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"plugin-serving-cert" Dec 03 06:54:25 crc kubenswrapper[4475]: I1203 06:54:25.413895 4475 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-console-plugin-7fbb5f6569-hkcsg"] Dec 03 06:54:25 crc kubenswrapper[4475]: I1203 06:54:25.469359 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/81cc19b5-f628-4824-8fc6-47eb42683d1a-plugin-serving-cert\") pod \"nmstate-console-plugin-7fbb5f6569-hkcsg\" (UID: \"81cc19b5-f628-4824-8fc6-47eb42683d1a\") " pod="openshift-nmstate/nmstate-console-plugin-7fbb5f6569-hkcsg" Dec 03 06:54:25 crc kubenswrapper[4475]: I1203 06:54:25.469430 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vmkqh\" (UniqueName: \"kubernetes.io/projected/81cc19b5-f628-4824-8fc6-47eb42683d1a-kube-api-access-vmkqh\") pod \"nmstate-console-plugin-7fbb5f6569-hkcsg\" (UID: 
\"81cc19b5-f628-4824-8fc6-47eb42683d1a\") " pod="openshift-nmstate/nmstate-console-plugin-7fbb5f6569-hkcsg" Dec 03 06:54:25 crc kubenswrapper[4475]: I1203 06:54:25.469475 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/81cc19b5-f628-4824-8fc6-47eb42683d1a-nginx-conf\") pod \"nmstate-console-plugin-7fbb5f6569-hkcsg\" (UID: \"81cc19b5-f628-4824-8fc6-47eb42683d1a\") " pod="openshift-nmstate/nmstate-console-plugin-7fbb5f6569-hkcsg" Dec 03 06:54:25 crc kubenswrapper[4475]: I1203 06:54:25.510997 4475 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-metrics-7f946cbc9-qbz9q" Dec 03 06:54:25 crc kubenswrapper[4475]: I1203 06:54:25.549240 4475 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-handler-bcvvb" Dec 03 06:54:25 crc kubenswrapper[4475]: W1203 06:54:25.565082 4475 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf6f3f3f7_6a4b_4a08_9d41_c2437b3514e9.slice/crio-035cadf41955ac5c1a2d749c1627d97b8255ff12af8a1310eafe65ffe7368cf9 WatchSource:0}: Error finding container 035cadf41955ac5c1a2d749c1627d97b8255ff12af8a1310eafe65ffe7368cf9: Status 404 returned error can't find the container with id 035cadf41955ac5c1a2d749c1627d97b8255ff12af8a1310eafe65ffe7368cf9 Dec 03 06:54:25 crc kubenswrapper[4475]: I1203 06:54:25.570993 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/81cc19b5-f628-4824-8fc6-47eb42683d1a-plugin-serving-cert\") pod \"nmstate-console-plugin-7fbb5f6569-hkcsg\" (UID: \"81cc19b5-f628-4824-8fc6-47eb42683d1a\") " pod="openshift-nmstate/nmstate-console-plugin-7fbb5f6569-hkcsg" Dec 03 06:54:25 crc kubenswrapper[4475]: I1203 06:54:25.571045 4475 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-vmkqh\" (UniqueName: \"kubernetes.io/projected/81cc19b5-f628-4824-8fc6-47eb42683d1a-kube-api-access-vmkqh\") pod \"nmstate-console-plugin-7fbb5f6569-hkcsg\" (UID: \"81cc19b5-f628-4824-8fc6-47eb42683d1a\") " pod="openshift-nmstate/nmstate-console-plugin-7fbb5f6569-hkcsg" Dec 03 06:54:25 crc kubenswrapper[4475]: I1203 06:54:25.571076 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/81cc19b5-f628-4824-8fc6-47eb42683d1a-nginx-conf\") pod \"nmstate-console-plugin-7fbb5f6569-hkcsg\" (UID: \"81cc19b5-f628-4824-8fc6-47eb42683d1a\") " pod="openshift-nmstate/nmstate-console-plugin-7fbb5f6569-hkcsg" Dec 03 06:54:25 crc kubenswrapper[4475]: I1203 06:54:25.572034 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/81cc19b5-f628-4824-8fc6-47eb42683d1a-nginx-conf\") pod \"nmstate-console-plugin-7fbb5f6569-hkcsg\" (UID: \"81cc19b5-f628-4824-8fc6-47eb42683d1a\") " pod="openshift-nmstate/nmstate-console-plugin-7fbb5f6569-hkcsg" Dec 03 06:54:25 crc kubenswrapper[4475]: I1203 06:54:25.579835 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/81cc19b5-f628-4824-8fc6-47eb42683d1a-plugin-serving-cert\") pod \"nmstate-console-plugin-7fbb5f6569-hkcsg\" (UID: \"81cc19b5-f628-4824-8fc6-47eb42683d1a\") " pod="openshift-nmstate/nmstate-console-plugin-7fbb5f6569-hkcsg" Dec 03 06:54:25 crc kubenswrapper[4475]: I1203 06:54:25.584076 4475 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console/console-64fd965765-njz6h"] Dec 03 06:54:25 crc kubenswrapper[4475]: I1203 06:54:25.584697 4475 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-64fd965765-njz6h" Dec 03 06:54:25 crc kubenswrapper[4475]: I1203 06:54:25.603352 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vmkqh\" (UniqueName: \"kubernetes.io/projected/81cc19b5-f628-4824-8fc6-47eb42683d1a-kube-api-access-vmkqh\") pod \"nmstate-console-plugin-7fbb5f6569-hkcsg\" (UID: \"81cc19b5-f628-4824-8fc6-47eb42683d1a\") " pod="openshift-nmstate/nmstate-console-plugin-7fbb5f6569-hkcsg" Dec 03 06:54:25 crc kubenswrapper[4475]: I1203 06:54:25.611115 4475 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-64fd965765-njz6h"] Dec 03 06:54:25 crc kubenswrapper[4475]: I1203 06:54:25.672494 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/0572ddd8-9cff-40b5-a884-430609f6d54d-oauth-serving-cert\") pod \"console-64fd965765-njz6h\" (UID: \"0572ddd8-9cff-40b5-a884-430609f6d54d\") " pod="openshift-console/console-64fd965765-njz6h" Dec 03 06:54:25 crc kubenswrapper[4475]: I1203 06:54:25.672559 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/0572ddd8-9cff-40b5-a884-430609f6d54d-service-ca\") pod \"console-64fd965765-njz6h\" (UID: \"0572ddd8-9cff-40b5-a884-430609f6d54d\") " pod="openshift-console/console-64fd965765-njz6h" Dec 03 06:54:25 crc kubenswrapper[4475]: I1203 06:54:25.672590 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/0572ddd8-9cff-40b5-a884-430609f6d54d-console-oauth-config\") pod \"console-64fd965765-njz6h\" (UID: \"0572ddd8-9cff-40b5-a884-430609f6d54d\") " pod="openshift-console/console-64fd965765-njz6h" Dec 03 06:54:25 crc kubenswrapper[4475]: I1203 06:54:25.672626 4475 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/0572ddd8-9cff-40b5-a884-430609f6d54d-trusted-ca-bundle\") pod \"console-64fd965765-njz6h\" (UID: \"0572ddd8-9cff-40b5-a884-430609f6d54d\") " pod="openshift-console/console-64fd965765-njz6h" Dec 03 06:54:25 crc kubenswrapper[4475]: I1203 06:54:25.672688 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nt5fp\" (UniqueName: \"kubernetes.io/projected/0572ddd8-9cff-40b5-a884-430609f6d54d-kube-api-access-nt5fp\") pod \"console-64fd965765-njz6h\" (UID: \"0572ddd8-9cff-40b5-a884-430609f6d54d\") " pod="openshift-console/console-64fd965765-njz6h" Dec 03 06:54:25 crc kubenswrapper[4475]: I1203 06:54:25.672703 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/0572ddd8-9cff-40b5-a884-430609f6d54d-console-serving-cert\") pod \"console-64fd965765-njz6h\" (UID: \"0572ddd8-9cff-40b5-a884-430609f6d54d\") " pod="openshift-console/console-64fd965765-njz6h" Dec 03 06:54:25 crc kubenswrapper[4475]: I1203 06:54:25.672721 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/0572ddd8-9cff-40b5-a884-430609f6d54d-console-config\") pod \"console-64fd965765-njz6h\" (UID: \"0572ddd8-9cff-40b5-a884-430609f6d54d\") " pod="openshift-console/console-64fd965765-njz6h" Dec 03 06:54:25 crc kubenswrapper[4475]: I1203 06:54:25.733673 4475 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-nmstate/nmstate-console-plugin-7fbb5f6569-hkcsg" Dec 03 06:54:25 crc kubenswrapper[4475]: I1203 06:54:25.774043 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/0572ddd8-9cff-40b5-a884-430609f6d54d-trusted-ca-bundle\") pod \"console-64fd965765-njz6h\" (UID: \"0572ddd8-9cff-40b5-a884-430609f6d54d\") " pod="openshift-console/console-64fd965765-njz6h" Dec 03 06:54:25 crc kubenswrapper[4475]: I1203 06:54:25.774109 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nt5fp\" (UniqueName: \"kubernetes.io/projected/0572ddd8-9cff-40b5-a884-430609f6d54d-kube-api-access-nt5fp\") pod \"console-64fd965765-njz6h\" (UID: \"0572ddd8-9cff-40b5-a884-430609f6d54d\") " pod="openshift-console/console-64fd965765-njz6h" Dec 03 06:54:25 crc kubenswrapper[4475]: I1203 06:54:25.774130 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/0572ddd8-9cff-40b5-a884-430609f6d54d-console-serving-cert\") pod \"console-64fd965765-njz6h\" (UID: \"0572ddd8-9cff-40b5-a884-430609f6d54d\") " pod="openshift-console/console-64fd965765-njz6h" Dec 03 06:54:25 crc kubenswrapper[4475]: I1203 06:54:25.774145 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/0572ddd8-9cff-40b5-a884-430609f6d54d-console-config\") pod \"console-64fd965765-njz6h\" (UID: \"0572ddd8-9cff-40b5-a884-430609f6d54d\") " pod="openshift-console/console-64fd965765-njz6h" Dec 03 06:54:25 crc kubenswrapper[4475]: I1203 06:54:25.774321 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/0572ddd8-9cff-40b5-a884-430609f6d54d-oauth-serving-cert\") pod \"console-64fd965765-njz6h\" (UID: 
\"0572ddd8-9cff-40b5-a884-430609f6d54d\") " pod="openshift-console/console-64fd965765-njz6h" Dec 03 06:54:25 crc kubenswrapper[4475]: I1203 06:54:25.774362 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/0572ddd8-9cff-40b5-a884-430609f6d54d-service-ca\") pod \"console-64fd965765-njz6h\" (UID: \"0572ddd8-9cff-40b5-a884-430609f6d54d\") " pod="openshift-console/console-64fd965765-njz6h" Dec 03 06:54:25 crc kubenswrapper[4475]: I1203 06:54:25.774396 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/0572ddd8-9cff-40b5-a884-430609f6d54d-console-oauth-config\") pod \"console-64fd965765-njz6h\" (UID: \"0572ddd8-9cff-40b5-a884-430609f6d54d\") " pod="openshift-console/console-64fd965765-njz6h" Dec 03 06:54:25 crc kubenswrapper[4475]: I1203 06:54:25.775091 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/0572ddd8-9cff-40b5-a884-430609f6d54d-trusted-ca-bundle\") pod \"console-64fd965765-njz6h\" (UID: \"0572ddd8-9cff-40b5-a884-430609f6d54d\") " pod="openshift-console/console-64fd965765-njz6h" Dec 03 06:54:25 crc kubenswrapper[4475]: I1203 06:54:25.775201 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/0572ddd8-9cff-40b5-a884-430609f6d54d-oauth-serving-cert\") pod \"console-64fd965765-njz6h\" (UID: \"0572ddd8-9cff-40b5-a884-430609f6d54d\") " pod="openshift-console/console-64fd965765-njz6h" Dec 03 06:54:25 crc kubenswrapper[4475]: I1203 06:54:25.775281 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/0572ddd8-9cff-40b5-a884-430609f6d54d-console-config\") pod \"console-64fd965765-njz6h\" (UID: \"0572ddd8-9cff-40b5-a884-430609f6d54d\") " 
pod="openshift-console/console-64fd965765-njz6h" Dec 03 06:54:25 crc kubenswrapper[4475]: I1203 06:54:25.775318 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/0572ddd8-9cff-40b5-a884-430609f6d54d-service-ca\") pod \"console-64fd965765-njz6h\" (UID: \"0572ddd8-9cff-40b5-a884-430609f6d54d\") " pod="openshift-console/console-64fd965765-njz6h" Dec 03 06:54:25 crc kubenswrapper[4475]: I1203 06:54:25.779115 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/0572ddd8-9cff-40b5-a884-430609f6d54d-console-oauth-config\") pod \"console-64fd965765-njz6h\" (UID: \"0572ddd8-9cff-40b5-a884-430609f6d54d\") " pod="openshift-console/console-64fd965765-njz6h" Dec 03 06:54:25 crc kubenswrapper[4475]: I1203 06:54:25.779116 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/0572ddd8-9cff-40b5-a884-430609f6d54d-console-serving-cert\") pod \"console-64fd965765-njz6h\" (UID: \"0572ddd8-9cff-40b5-a884-430609f6d54d\") " pod="openshift-console/console-64fd965765-njz6h" Dec 03 06:54:25 crc kubenswrapper[4475]: I1203 06:54:25.787238 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nt5fp\" (UniqueName: \"kubernetes.io/projected/0572ddd8-9cff-40b5-a884-430609f6d54d-kube-api-access-nt5fp\") pod \"console-64fd965765-njz6h\" (UID: \"0572ddd8-9cff-40b5-a884-430609f6d54d\") " pod="openshift-console/console-64fd965765-njz6h" Dec 03 06:54:25 crc kubenswrapper[4475]: I1203 06:54:25.875855 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tls-key-pair\" (UniqueName: \"kubernetes.io/secret/9aa1aae1-b774-43f4-9874-3740e3f74a1f-tls-key-pair\") pod \"nmstate-webhook-5f6d4c5ccb-q9gh9\" (UID: \"9aa1aae1-b774-43f4-9874-3740e3f74a1f\") " pod="openshift-nmstate/nmstate-webhook-5f6d4c5ccb-q9gh9" Dec 
03 06:54:25 crc kubenswrapper[4475]: I1203 06:54:25.878208 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tls-key-pair\" (UniqueName: \"kubernetes.io/secret/9aa1aae1-b774-43f4-9874-3740e3f74a1f-tls-key-pair\") pod \"nmstate-webhook-5f6d4c5ccb-q9gh9\" (UID: \"9aa1aae1-b774-43f4-9874-3740e3f74a1f\") " pod="openshift-nmstate/nmstate-webhook-5f6d4c5ccb-q9gh9" Dec 03 06:54:25 crc kubenswrapper[4475]: I1203 06:54:25.911307 4475 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-64fd965765-njz6h" Dec 03 06:54:25 crc kubenswrapper[4475]: I1203 06:54:25.914003 4475 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-metrics-7f946cbc9-qbz9q"] Dec 03 06:54:26 crc kubenswrapper[4475]: I1203 06:54:26.064048 4475 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-console-plugin-7fbb5f6569-hkcsg"] Dec 03 06:54:26 crc kubenswrapper[4475]: W1203 06:54:26.067326 4475 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod81cc19b5_f628_4824_8fc6_47eb42683d1a.slice/crio-7c06b1b6db8063ccaa85934bdc25ef86b58411998031b44d7118c4fca89298b8 WatchSource:0}: Error finding container 7c06b1b6db8063ccaa85934bdc25ef86b58411998031b44d7118c4fca89298b8: Status 404 returned error can't find the container with id 7c06b1b6db8063ccaa85934bdc25ef86b58411998031b44d7118c4fca89298b8 Dec 03 06:54:26 crc kubenswrapper[4475]: I1203 06:54:26.131290 4475 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-nmstate/nmstate-webhook-5f6d4c5ccb-q9gh9" Dec 03 06:54:26 crc kubenswrapper[4475]: I1203 06:54:26.252005 4475 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-64fd965765-njz6h"] Dec 03 06:54:26 crc kubenswrapper[4475]: I1203 06:54:26.425609 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-console-plugin-7fbb5f6569-hkcsg" event={"ID":"81cc19b5-f628-4824-8fc6-47eb42683d1a","Type":"ContainerStarted","Data":"7c06b1b6db8063ccaa85934bdc25ef86b58411998031b44d7118c4fca89298b8"} Dec 03 06:54:26 crc kubenswrapper[4475]: I1203 06:54:26.426742 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-handler-bcvvb" event={"ID":"f6f3f3f7-6a4b-4a08-9d41-c2437b3514e9","Type":"ContainerStarted","Data":"035cadf41955ac5c1a2d749c1627d97b8255ff12af8a1310eafe65ffe7368cf9"} Dec 03 06:54:26 crc kubenswrapper[4475]: I1203 06:54:26.427892 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-64fd965765-njz6h" event={"ID":"0572ddd8-9cff-40b5-a884-430609f6d54d","Type":"ContainerStarted","Data":"23124f365c1ff90c3baa77f18a422b449caa834c7f7bef89078eead55e4b55e6"} Dec 03 06:54:26 crc kubenswrapper[4475]: I1203 06:54:26.427918 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-64fd965765-njz6h" event={"ID":"0572ddd8-9cff-40b5-a884-430609f6d54d","Type":"ContainerStarted","Data":"577a485d65b7569f5472b2e69905c6a7b4684d22fc688e5b20537bd6b48975f8"} Dec 03 06:54:26 crc kubenswrapper[4475]: I1203 06:54:26.428602 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-metrics-7f946cbc9-qbz9q" event={"ID":"1445cccd-0b52-44ac-9c56-ca17499b0bfb","Type":"ContainerStarted","Data":"d55c2079ae5073820d719014e882d14647dce5434cf135f7d1b1b98da69eeb26"} Dec 03 06:54:26 crc kubenswrapper[4475]: I1203 06:54:26.461594 4475 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openshift-console/console-64fd965765-njz6h" podStartSLOduration=1.461577898 podStartE2EDuration="1.461577898s" podCreationTimestamp="2025-12-03 06:54:25 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 06:54:26.44455629 +0000 UTC m=+551.249454634" watchObservedRunningTime="2025-12-03 06:54:26.461577898 +0000 UTC m=+551.266476233" Dec 03 06:54:26 crc kubenswrapper[4475]: I1203 06:54:26.464035 4475 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-webhook-5f6d4c5ccb-q9gh9"] Dec 03 06:54:26 crc kubenswrapper[4475]: W1203 06:54:26.469269 4475 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9aa1aae1_b774_43f4_9874_3740e3f74a1f.slice/crio-52b3ec93c13903f6188e10a51d2a387fef87cf6e5ec6f751273e45ddad38fe94 WatchSource:0}: Error finding container 52b3ec93c13903f6188e10a51d2a387fef87cf6e5ec6f751273e45ddad38fe94: Status 404 returned error can't find the container with id 52b3ec93c13903f6188e10a51d2a387fef87cf6e5ec6f751273e45ddad38fe94 Dec 03 06:54:27 crc kubenswrapper[4475]: I1203 06:54:27.449162 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-webhook-5f6d4c5ccb-q9gh9" event={"ID":"9aa1aae1-b774-43f4-9874-3740e3f74a1f","Type":"ContainerStarted","Data":"52b3ec93c13903f6188e10a51d2a387fef87cf6e5ec6f751273e45ddad38fe94"} Dec 03 06:54:28 crc kubenswrapper[4475]: I1203 06:54:28.456989 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-webhook-5f6d4c5ccb-q9gh9" event={"ID":"9aa1aae1-b774-43f4-9874-3740e3f74a1f","Type":"ContainerStarted","Data":"d7843deb2949d702167bf59c0fea81a85238605cff6333db5b85b14464a29c7d"} Dec 03 06:54:28 crc kubenswrapper[4475]: I1203 06:54:28.457297 4475 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-nmstate/nmstate-webhook-5f6d4c5ccb-q9gh9" Dec 03 
06:54:28 crc kubenswrapper[4475]: I1203 06:54:28.458803 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-handler-bcvvb" event={"ID":"f6f3f3f7-6a4b-4a08-9d41-c2437b3514e9","Type":"ContainerStarted","Data":"29debae92ae7fc2cc88feba8ffb3f514cc10d57d78748e3d23c95f581e548bbb"} Dec 03 06:54:28 crc kubenswrapper[4475]: I1203 06:54:28.458929 4475 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-nmstate/nmstate-handler-bcvvb" Dec 03 06:54:28 crc kubenswrapper[4475]: I1203 06:54:28.460529 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-metrics-7f946cbc9-qbz9q" event={"ID":"1445cccd-0b52-44ac-9c56-ca17499b0bfb","Type":"ContainerStarted","Data":"09fdce9dadcdaa24f79bf09cb3bd102aefba226c07040193240829a3adeb6cc6"} Dec 03 06:54:28 crc kubenswrapper[4475]: I1203 06:54:28.481950 4475 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-webhook-5f6d4c5ccb-q9gh9" podStartSLOduration=2.072525634 podStartE2EDuration="3.481940542s" podCreationTimestamp="2025-12-03 06:54:25 +0000 UTC" firstStartedPulling="2025-12-03 06:54:26.471030775 +0000 UTC m=+551.275929109" lastFinishedPulling="2025-12-03 06:54:27.880445683 +0000 UTC m=+552.685344017" observedRunningTime="2025-12-03 06:54:28.471865776 +0000 UTC m=+553.276764110" watchObservedRunningTime="2025-12-03 06:54:28.481940542 +0000 UTC m=+553.286838876" Dec 03 06:54:28 crc kubenswrapper[4475]: I1203 06:54:28.485291 4475 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-handler-bcvvb" podStartSLOduration=1.159835539 podStartE2EDuration="3.48528432s" podCreationTimestamp="2025-12-03 06:54:25 +0000 UTC" firstStartedPulling="2025-12-03 06:54:25.568408251 +0000 UTC m=+550.373306585" lastFinishedPulling="2025-12-03 06:54:27.893857033 +0000 UTC m=+552.698755366" observedRunningTime="2025-12-03 06:54:28.483275071 +0000 UTC m=+553.288173426" 
watchObservedRunningTime="2025-12-03 06:54:28.48528432 +0000 UTC m=+553.290182654" Dec 03 06:54:29 crc kubenswrapper[4475]: I1203 06:54:29.467336 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-console-plugin-7fbb5f6569-hkcsg" event={"ID":"81cc19b5-f628-4824-8fc6-47eb42683d1a","Type":"ContainerStarted","Data":"22aeffc2ea4a2b501d5da5bec3eb4093b38a25dc203e0c66c9addcb2bd8007f0"} Dec 03 06:54:30 crc kubenswrapper[4475]: I1203 06:54:30.476438 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-metrics-7f946cbc9-qbz9q" event={"ID":"1445cccd-0b52-44ac-9c56-ca17499b0bfb","Type":"ContainerStarted","Data":"030af2f390d05a864e49bc923204dca33d0bbd0cc094b07532c9753c0c271f80"} Dec 03 06:54:30 crc kubenswrapper[4475]: I1203 06:54:30.489565 4475 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-console-plugin-7fbb5f6569-hkcsg" podStartSLOduration=2.831613166 podStartE2EDuration="5.489550244s" podCreationTimestamp="2025-12-03 06:54:25 +0000 UTC" firstStartedPulling="2025-12-03 06:54:26.068467286 +0000 UTC m=+550.873365619" lastFinishedPulling="2025-12-03 06:54:28.726404363 +0000 UTC m=+553.531302697" observedRunningTime="2025-12-03 06:54:29.481150653 +0000 UTC m=+554.286049007" watchObservedRunningTime="2025-12-03 06:54:30.489550244 +0000 UTC m=+555.294448578" Dec 03 06:54:30 crc kubenswrapper[4475]: I1203 06:54:30.491717 4475 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-metrics-7f946cbc9-qbz9q" podStartSLOduration=1.6639343389999999 podStartE2EDuration="5.491711337s" podCreationTimestamp="2025-12-03 06:54:25 +0000 UTC" firstStartedPulling="2025-12-03 06:54:25.92371711 +0000 UTC m=+550.728615435" lastFinishedPulling="2025-12-03 06:54:29.7514941 +0000 UTC m=+554.556392433" observedRunningTime="2025-12-03 06:54:30.489141095 +0000 UTC m=+555.294039429" watchObservedRunningTime="2025-12-03 06:54:30.491711337 +0000 
UTC m=+555.296609672" Dec 03 06:54:35 crc kubenswrapper[4475]: I1203 06:54:35.572714 4475 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-nmstate/nmstate-handler-bcvvb" Dec 03 06:54:35 crc kubenswrapper[4475]: I1203 06:54:35.912393 4475 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/console-64fd965765-njz6h" Dec 03 06:54:35 crc kubenswrapper[4475]: I1203 06:54:35.912643 4475 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-64fd965765-njz6h" Dec 03 06:54:35 crc kubenswrapper[4475]: I1203 06:54:35.916060 4475 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-64fd965765-njz6h" Dec 03 06:54:36 crc kubenswrapper[4475]: I1203 06:54:36.505859 4475 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-64fd965765-njz6h" Dec 03 06:54:36 crc kubenswrapper[4475]: I1203 06:54:36.545502 4475 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-console/console-f9d7485db-dbxhk"] Dec 03 06:54:46 crc kubenswrapper[4475]: I1203 06:54:46.140065 4475 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-nmstate/nmstate-webhook-5f6d4c5ccb-q9gh9" Dec 03 06:54:55 crc kubenswrapper[4475]: I1203 06:54:55.054847 4475 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83zx54d"] Dec 03 06:54:55 crc kubenswrapper[4475]: I1203 06:54:55.056052 4475 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83zx54d" Dec 03 06:54:55 crc kubenswrapper[4475]: I1203 06:54:55.057578 4475 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"default-dockercfg-vmwhc" Dec 03 06:54:55 crc kubenswrapper[4475]: I1203 06:54:55.065614 4475 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83zx54d"] Dec 03 06:54:55 crc kubenswrapper[4475]: I1203 06:54:55.205900 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6qjzb\" (UniqueName: \"kubernetes.io/projected/f9a3a5f0-6c30-4bd1-b5a9-b84e2a116073-kube-api-access-6qjzb\") pod \"af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83zx54d\" (UID: \"f9a3a5f0-6c30-4bd1-b5a9-b84e2a116073\") " pod="openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83zx54d" Dec 03 06:54:55 crc kubenswrapper[4475]: I1203 06:54:55.205976 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/f9a3a5f0-6c30-4bd1-b5a9-b84e2a116073-util\") pod \"af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83zx54d\" (UID: \"f9a3a5f0-6c30-4bd1-b5a9-b84e2a116073\") " pod="openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83zx54d" Dec 03 06:54:55 crc kubenswrapper[4475]: I1203 06:54:55.206029 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/f9a3a5f0-6c30-4bd1-b5a9-b84e2a116073-bundle\") pod \"af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83zx54d\" (UID: \"f9a3a5f0-6c30-4bd1-b5a9-b84e2a116073\") " pod="openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83zx54d" Dec 03 06:54:55 crc kubenswrapper[4475]: 
I1203 06:54:55.307398 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/f9a3a5f0-6c30-4bd1-b5a9-b84e2a116073-util\") pod \"af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83zx54d\" (UID: \"f9a3a5f0-6c30-4bd1-b5a9-b84e2a116073\") " pod="openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83zx54d" Dec 03 06:54:55 crc kubenswrapper[4475]: I1203 06:54:55.307445 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/f9a3a5f0-6c30-4bd1-b5a9-b84e2a116073-bundle\") pod \"af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83zx54d\" (UID: \"f9a3a5f0-6c30-4bd1-b5a9-b84e2a116073\") " pod="openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83zx54d" Dec 03 06:54:55 crc kubenswrapper[4475]: I1203 06:54:55.307510 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6qjzb\" (UniqueName: \"kubernetes.io/projected/f9a3a5f0-6c30-4bd1-b5a9-b84e2a116073-kube-api-access-6qjzb\") pod \"af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83zx54d\" (UID: \"f9a3a5f0-6c30-4bd1-b5a9-b84e2a116073\") " pod="openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83zx54d" Dec 03 06:54:55 crc kubenswrapper[4475]: I1203 06:54:55.308000 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/f9a3a5f0-6c30-4bd1-b5a9-b84e2a116073-util\") pod \"af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83zx54d\" (UID: \"f9a3a5f0-6c30-4bd1-b5a9-b84e2a116073\") " pod="openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83zx54d" Dec 03 06:54:55 crc kubenswrapper[4475]: I1203 06:54:55.308003 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: 
\"kubernetes.io/empty-dir/f9a3a5f0-6c30-4bd1-b5a9-b84e2a116073-bundle\") pod \"af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83zx54d\" (UID: \"f9a3a5f0-6c30-4bd1-b5a9-b84e2a116073\") " pod="openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83zx54d" Dec 03 06:54:55 crc kubenswrapper[4475]: I1203 06:54:55.323007 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6qjzb\" (UniqueName: \"kubernetes.io/projected/f9a3a5f0-6c30-4bd1-b5a9-b84e2a116073-kube-api-access-6qjzb\") pod \"af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83zx54d\" (UID: \"f9a3a5f0-6c30-4bd1-b5a9-b84e2a116073\") " pod="openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83zx54d" Dec 03 06:54:55 crc kubenswrapper[4475]: I1203 06:54:55.367524 4475 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83zx54d" Dec 03 06:54:55 crc kubenswrapper[4475]: I1203 06:54:55.705246 4475 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83zx54d"] Dec 03 06:54:55 crc kubenswrapper[4475]: W1203 06:54:55.708058 4475 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf9a3a5f0_6c30_4bd1_b5a9_b84e2a116073.slice/crio-32727726db862ffbb67282d4663cbc642c05d59aeab92c95c0a3c7e238a33b24 WatchSource:0}: Error finding container 32727726db862ffbb67282d4663cbc642c05d59aeab92c95c0a3c7e238a33b24: Status 404 returned error can't find the container with id 32727726db862ffbb67282d4663cbc642c05d59aeab92c95c0a3c7e238a33b24 Dec 03 06:54:56 crc kubenswrapper[4475]: I1203 06:54:56.603040 4475 generic.go:334] "Generic (PLEG): container finished" podID="f9a3a5f0-6c30-4bd1-b5a9-b84e2a116073" containerID="9381f1bd32135fe1dd24ad14adc9677f4c8c5151ddb35e3db54670fb35efd550" 
exitCode=0 Dec 03 06:54:56 crc kubenswrapper[4475]: I1203 06:54:56.603078 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83zx54d" event={"ID":"f9a3a5f0-6c30-4bd1-b5a9-b84e2a116073","Type":"ContainerDied","Data":"9381f1bd32135fe1dd24ad14adc9677f4c8c5151ddb35e3db54670fb35efd550"} Dec 03 06:54:56 crc kubenswrapper[4475]: I1203 06:54:56.603216 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83zx54d" event={"ID":"f9a3a5f0-6c30-4bd1-b5a9-b84e2a116073","Type":"ContainerStarted","Data":"32727726db862ffbb67282d4663cbc642c05d59aeab92c95c0a3c7e238a33b24"} Dec 03 06:54:58 crc kubenswrapper[4475]: I1203 06:54:58.612576 4475 generic.go:334] "Generic (PLEG): container finished" podID="f9a3a5f0-6c30-4bd1-b5a9-b84e2a116073" containerID="472f75e3bbedb2b873df458cdf9ba30f5fa6dea895aa353a391e2f66d94d045a" exitCode=0 Dec 03 06:54:58 crc kubenswrapper[4475]: I1203 06:54:58.612661 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83zx54d" event={"ID":"f9a3a5f0-6c30-4bd1-b5a9-b84e2a116073","Type":"ContainerDied","Data":"472f75e3bbedb2b873df458cdf9ba30f5fa6dea895aa353a391e2f66d94d045a"} Dec 03 06:54:58 crc kubenswrapper[4475]: I1203 06:54:58.933381 4475 patch_prober.go:28] interesting pod/machine-config-daemon-tjbzg container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 03 06:54:58 crc kubenswrapper[4475]: I1203 06:54:58.933434 4475 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-tjbzg" podUID="91aee7be-4a52-4598-803f-2deebe0674de" containerName="machine-config-daemon" probeResult="failure" 
output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 03 06:54:59 crc kubenswrapper[4475]: I1203 06:54:59.618912 4475 generic.go:334] "Generic (PLEG): container finished" podID="f9a3a5f0-6c30-4bd1-b5a9-b84e2a116073" containerID="cfef8798b84d4601918cd700e23766d64813129a6f93d2a392a36fb800850ebd" exitCode=0 Dec 03 06:54:59 crc kubenswrapper[4475]: I1203 06:54:59.618946 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83zx54d" event={"ID":"f9a3a5f0-6c30-4bd1-b5a9-b84e2a116073","Type":"ContainerDied","Data":"cfef8798b84d4601918cd700e23766d64813129a6f93d2a392a36fb800850ebd"} Dec 03 06:55:00 crc kubenswrapper[4475]: I1203 06:55:00.781462 4475 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83zx54d" Dec 03 06:55:00 crc kubenswrapper[4475]: I1203 06:55:00.866764 4475 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/f9a3a5f0-6c30-4bd1-b5a9-b84e2a116073-util\") pod \"f9a3a5f0-6c30-4bd1-b5a9-b84e2a116073\" (UID: \"f9a3a5f0-6c30-4bd1-b5a9-b84e2a116073\") " Dec 03 06:55:00 crc kubenswrapper[4475]: I1203 06:55:00.866794 4475 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/f9a3a5f0-6c30-4bd1-b5a9-b84e2a116073-bundle\") pod \"f9a3a5f0-6c30-4bd1-b5a9-b84e2a116073\" (UID: \"f9a3a5f0-6c30-4bd1-b5a9-b84e2a116073\") " Dec 03 06:55:00 crc kubenswrapper[4475]: I1203 06:55:00.866833 4475 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6qjzb\" (UniqueName: \"kubernetes.io/projected/f9a3a5f0-6c30-4bd1-b5a9-b84e2a116073-kube-api-access-6qjzb\") pod \"f9a3a5f0-6c30-4bd1-b5a9-b84e2a116073\" (UID: \"f9a3a5f0-6c30-4bd1-b5a9-b84e2a116073\") " Dec 03 
06:55:00 crc kubenswrapper[4475]: I1203 06:55:00.867620 4475 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f9a3a5f0-6c30-4bd1-b5a9-b84e2a116073-bundle" (OuterVolumeSpecName: "bundle") pod "f9a3a5f0-6c30-4bd1-b5a9-b84e2a116073" (UID: "f9a3a5f0-6c30-4bd1-b5a9-b84e2a116073"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 06:55:00 crc kubenswrapper[4475]: I1203 06:55:00.870627 4475 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f9a3a5f0-6c30-4bd1-b5a9-b84e2a116073-kube-api-access-6qjzb" (OuterVolumeSpecName: "kube-api-access-6qjzb") pod "f9a3a5f0-6c30-4bd1-b5a9-b84e2a116073" (UID: "f9a3a5f0-6c30-4bd1-b5a9-b84e2a116073"). InnerVolumeSpecName "kube-api-access-6qjzb". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 06:55:00 crc kubenswrapper[4475]: I1203 06:55:00.876690 4475 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f9a3a5f0-6c30-4bd1-b5a9-b84e2a116073-util" (OuterVolumeSpecName: "util") pod "f9a3a5f0-6c30-4bd1-b5a9-b84e2a116073" (UID: "f9a3a5f0-6c30-4bd1-b5a9-b84e2a116073"). InnerVolumeSpecName "util". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 06:55:00 crc kubenswrapper[4475]: I1203 06:55:00.968319 4475 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/f9a3a5f0-6c30-4bd1-b5a9-b84e2a116073-util\") on node \"crc\" DevicePath \"\"" Dec 03 06:55:00 crc kubenswrapper[4475]: I1203 06:55:00.968343 4475 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/f9a3a5f0-6c30-4bd1-b5a9-b84e2a116073-bundle\") on node \"crc\" DevicePath \"\"" Dec 03 06:55:00 crc kubenswrapper[4475]: I1203 06:55:00.968352 4475 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6qjzb\" (UniqueName: \"kubernetes.io/projected/f9a3a5f0-6c30-4bd1-b5a9-b84e2a116073-kube-api-access-6qjzb\") on node \"crc\" DevicePath \"\"" Dec 03 06:55:01 crc kubenswrapper[4475]: I1203 06:55:01.573713 4475 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-console/console-f9d7485db-dbxhk" podUID="09928a8e-a70b-4916-9ae2-4dbe952aa514" containerName="console" containerID="cri-o://ff221eab7072c1a47c65c1bf0b2f037bea028f49986757610b09d9f683dd4157" gracePeriod=15 Dec 03 06:55:01 crc kubenswrapper[4475]: I1203 06:55:01.637785 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83zx54d" event={"ID":"f9a3a5f0-6c30-4bd1-b5a9-b84e2a116073","Type":"ContainerDied","Data":"32727726db862ffbb67282d4663cbc642c05d59aeab92c95c0a3c7e238a33b24"} Dec 03 06:55:01 crc kubenswrapper[4475]: I1203 06:55:01.637831 4475 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="32727726db862ffbb67282d4663cbc642c05d59aeab92c95c0a3c7e238a33b24" Dec 03 06:55:01 crc kubenswrapper[4475]: I1203 06:55:01.637849 4475 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83zx54d" Dec 03 06:55:01 crc kubenswrapper[4475]: I1203 06:55:01.861906 4475 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-f9d7485db-dbxhk_09928a8e-a70b-4916-9ae2-4dbe952aa514/console/0.log" Dec 03 06:55:01 crc kubenswrapper[4475]: I1203 06:55:01.861969 4475 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-f9d7485db-dbxhk" Dec 03 06:55:01 crc kubenswrapper[4475]: I1203 06:55:01.877490 4475 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/09928a8e-a70b-4916-9ae2-4dbe952aa514-console-serving-cert\") pod \"09928a8e-a70b-4916-9ae2-4dbe952aa514\" (UID: \"09928a8e-a70b-4916-9ae2-4dbe952aa514\") " Dec 03 06:55:01 crc kubenswrapper[4475]: I1203 06:55:01.877524 4475 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8smn2\" (UniqueName: \"kubernetes.io/projected/09928a8e-a70b-4916-9ae2-4dbe952aa514-kube-api-access-8smn2\") pod \"09928a8e-a70b-4916-9ae2-4dbe952aa514\" (UID: \"09928a8e-a70b-4916-9ae2-4dbe952aa514\") " Dec 03 06:55:01 crc kubenswrapper[4475]: I1203 06:55:01.877561 4475 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/09928a8e-a70b-4916-9ae2-4dbe952aa514-console-config\") pod \"09928a8e-a70b-4916-9ae2-4dbe952aa514\" (UID: \"09928a8e-a70b-4916-9ae2-4dbe952aa514\") " Dec 03 06:55:01 crc kubenswrapper[4475]: I1203 06:55:01.877607 4475 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/09928a8e-a70b-4916-9ae2-4dbe952aa514-service-ca\") pod \"09928a8e-a70b-4916-9ae2-4dbe952aa514\" (UID: \"09928a8e-a70b-4916-9ae2-4dbe952aa514\") " Dec 03 06:55:01 crc 
kubenswrapper[4475]: I1203 06:55:01.877621 4475 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/09928a8e-a70b-4916-9ae2-4dbe952aa514-console-oauth-config\") pod \"09928a8e-a70b-4916-9ae2-4dbe952aa514\" (UID: \"09928a8e-a70b-4916-9ae2-4dbe952aa514\") " Dec 03 06:55:01 crc kubenswrapper[4475]: I1203 06:55:01.877708 4475 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/09928a8e-a70b-4916-9ae2-4dbe952aa514-oauth-serving-cert\") pod \"09928a8e-a70b-4916-9ae2-4dbe952aa514\" (UID: \"09928a8e-a70b-4916-9ae2-4dbe952aa514\") " Dec 03 06:55:01 crc kubenswrapper[4475]: I1203 06:55:01.877748 4475 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/09928a8e-a70b-4916-9ae2-4dbe952aa514-trusted-ca-bundle\") pod \"09928a8e-a70b-4916-9ae2-4dbe952aa514\" (UID: \"09928a8e-a70b-4916-9ae2-4dbe952aa514\") " Dec 03 06:55:01 crc kubenswrapper[4475]: I1203 06:55:01.878518 4475 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09928a8e-a70b-4916-9ae2-4dbe952aa514-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "09928a8e-a70b-4916-9ae2-4dbe952aa514" (UID: "09928a8e-a70b-4916-9ae2-4dbe952aa514"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 06:55:01 crc kubenswrapper[4475]: I1203 06:55:01.878560 4475 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09928a8e-a70b-4916-9ae2-4dbe952aa514-console-config" (OuterVolumeSpecName: "console-config") pod "09928a8e-a70b-4916-9ae2-4dbe952aa514" (UID: "09928a8e-a70b-4916-9ae2-4dbe952aa514"). InnerVolumeSpecName "console-config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 06:55:01 crc kubenswrapper[4475]: I1203 06:55:01.878953 4475 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09928a8e-a70b-4916-9ae2-4dbe952aa514-service-ca" (OuterVolumeSpecName: "service-ca") pod "09928a8e-a70b-4916-9ae2-4dbe952aa514" (UID: "09928a8e-a70b-4916-9ae2-4dbe952aa514"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 06:55:01 crc kubenswrapper[4475]: I1203 06:55:01.879401 4475 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09928a8e-a70b-4916-9ae2-4dbe952aa514-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "09928a8e-a70b-4916-9ae2-4dbe952aa514" (UID: "09928a8e-a70b-4916-9ae2-4dbe952aa514"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 06:55:01 crc kubenswrapper[4475]: I1203 06:55:01.883828 4475 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09928a8e-a70b-4916-9ae2-4dbe952aa514-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "09928a8e-a70b-4916-9ae2-4dbe952aa514" (UID: "09928a8e-a70b-4916-9ae2-4dbe952aa514"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 06:55:01 crc kubenswrapper[4475]: I1203 06:55:01.884009 4475 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09928a8e-a70b-4916-9ae2-4dbe952aa514-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "09928a8e-a70b-4916-9ae2-4dbe952aa514" (UID: "09928a8e-a70b-4916-9ae2-4dbe952aa514"). InnerVolumeSpecName "console-serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 06:55:01 crc kubenswrapper[4475]: I1203 06:55:01.886244 4475 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/09928a8e-a70b-4916-9ae2-4dbe952aa514-kube-api-access-8smn2" (OuterVolumeSpecName: "kube-api-access-8smn2") pod "09928a8e-a70b-4916-9ae2-4dbe952aa514" (UID: "09928a8e-a70b-4916-9ae2-4dbe952aa514"). InnerVolumeSpecName "kube-api-access-8smn2". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 06:55:01 crc kubenswrapper[4475]: I1203 06:55:01.978980 4475 reconciler_common.go:293] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/09928a8e-a70b-4916-9ae2-4dbe952aa514-oauth-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 03 06:55:01 crc kubenswrapper[4475]: I1203 06:55:01.979004 4475 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/09928a8e-a70b-4916-9ae2-4dbe952aa514-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 03 06:55:01 crc kubenswrapper[4475]: I1203 06:55:01.979014 4475 reconciler_common.go:293] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/09928a8e-a70b-4916-9ae2-4dbe952aa514-console-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 03 06:55:01 crc kubenswrapper[4475]: I1203 06:55:01.979023 4475 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8smn2\" (UniqueName: \"kubernetes.io/projected/09928a8e-a70b-4916-9ae2-4dbe952aa514-kube-api-access-8smn2\") on node \"crc\" DevicePath \"\"" Dec 03 06:55:01 crc kubenswrapper[4475]: I1203 06:55:01.979033 4475 reconciler_common.go:293] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/09928a8e-a70b-4916-9ae2-4dbe952aa514-console-config\") on node \"crc\" DevicePath \"\"" Dec 03 06:55:01 crc kubenswrapper[4475]: I1203 06:55:01.979042 4475 reconciler_common.go:293] "Volume 
detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/09928a8e-a70b-4916-9ae2-4dbe952aa514-service-ca\") on node \"crc\" DevicePath \"\"" Dec 03 06:55:01 crc kubenswrapper[4475]: I1203 06:55:01.979050 4475 reconciler_common.go:293] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/09928a8e-a70b-4916-9ae2-4dbe952aa514-console-oauth-config\") on node \"crc\" DevicePath \"\"" Dec 03 06:55:02 crc kubenswrapper[4475]: I1203 06:55:02.643471 4475 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-f9d7485db-dbxhk_09928a8e-a70b-4916-9ae2-4dbe952aa514/console/0.log" Dec 03 06:55:02 crc kubenswrapper[4475]: I1203 06:55:02.643514 4475 generic.go:334] "Generic (PLEG): container finished" podID="09928a8e-a70b-4916-9ae2-4dbe952aa514" containerID="ff221eab7072c1a47c65c1bf0b2f037bea028f49986757610b09d9f683dd4157" exitCode=2 Dec 03 06:55:02 crc kubenswrapper[4475]: I1203 06:55:02.643542 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-dbxhk" event={"ID":"09928a8e-a70b-4916-9ae2-4dbe952aa514","Type":"ContainerDied","Data":"ff221eab7072c1a47c65c1bf0b2f037bea028f49986757610b09d9f683dd4157"} Dec 03 06:55:02 crc kubenswrapper[4475]: I1203 06:55:02.643571 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-dbxhk" event={"ID":"09928a8e-a70b-4916-9ae2-4dbe952aa514","Type":"ContainerDied","Data":"737f45ed55d673b657d03e384499671b6d6c3703c989815d081e556197f7fb49"} Dec 03 06:55:02 crc kubenswrapper[4475]: I1203 06:55:02.643582 4475 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-f9d7485db-dbxhk" Dec 03 06:55:02 crc kubenswrapper[4475]: I1203 06:55:02.643590 4475 scope.go:117] "RemoveContainer" containerID="ff221eab7072c1a47c65c1bf0b2f037bea028f49986757610b09d9f683dd4157" Dec 03 06:55:02 crc kubenswrapper[4475]: I1203 06:55:02.655857 4475 scope.go:117] "RemoveContainer" containerID="ff221eab7072c1a47c65c1bf0b2f037bea028f49986757610b09d9f683dd4157" Dec 03 06:55:02 crc kubenswrapper[4475]: E1203 06:55:02.656145 4475 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ff221eab7072c1a47c65c1bf0b2f037bea028f49986757610b09d9f683dd4157\": container with ID starting with ff221eab7072c1a47c65c1bf0b2f037bea028f49986757610b09d9f683dd4157 not found: ID does not exist" containerID="ff221eab7072c1a47c65c1bf0b2f037bea028f49986757610b09d9f683dd4157" Dec 03 06:55:02 crc kubenswrapper[4475]: I1203 06:55:02.656177 4475 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ff221eab7072c1a47c65c1bf0b2f037bea028f49986757610b09d9f683dd4157"} err="failed to get container status \"ff221eab7072c1a47c65c1bf0b2f037bea028f49986757610b09d9f683dd4157\": rpc error: code = NotFound desc = could not find container \"ff221eab7072c1a47c65c1bf0b2f037bea028f49986757610b09d9f683dd4157\": container with ID starting with ff221eab7072c1a47c65c1bf0b2f037bea028f49986757610b09d9f683dd4157 not found: ID does not exist" Dec 03 06:55:02 crc kubenswrapper[4475]: I1203 06:55:02.663834 4475 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-console/console-f9d7485db-dbxhk"] Dec 03 06:55:02 crc kubenswrapper[4475]: I1203 06:55:02.666476 4475 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-f9d7485db-dbxhk"] Dec 03 06:55:03 crc kubenswrapper[4475]: I1203 06:55:03.496920 4475 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="09928a8e-a70b-4916-9ae2-4dbe952aa514" 
path="/var/lib/kubelet/pods/09928a8e-a70b-4916-9ae2-4dbe952aa514/volumes" Dec 03 06:55:10 crc kubenswrapper[4475]: I1203 06:55:10.702429 4475 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/metallb-operator-controller-manager-cccf8fc64-b9db4"] Dec 03 06:55:10 crc kubenswrapper[4475]: E1203 06:55:10.703066 4475 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="09928a8e-a70b-4916-9ae2-4dbe952aa514" containerName="console" Dec 03 06:55:10 crc kubenswrapper[4475]: I1203 06:55:10.703079 4475 state_mem.go:107] "Deleted CPUSet assignment" podUID="09928a8e-a70b-4916-9ae2-4dbe952aa514" containerName="console" Dec 03 06:55:10 crc kubenswrapper[4475]: E1203 06:55:10.703096 4475 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f9a3a5f0-6c30-4bd1-b5a9-b84e2a116073" containerName="extract" Dec 03 06:55:10 crc kubenswrapper[4475]: I1203 06:55:10.703102 4475 state_mem.go:107] "Deleted CPUSet assignment" podUID="f9a3a5f0-6c30-4bd1-b5a9-b84e2a116073" containerName="extract" Dec 03 06:55:10 crc kubenswrapper[4475]: E1203 06:55:10.703111 4475 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f9a3a5f0-6c30-4bd1-b5a9-b84e2a116073" containerName="pull" Dec 03 06:55:10 crc kubenswrapper[4475]: I1203 06:55:10.703117 4475 state_mem.go:107] "Deleted CPUSet assignment" podUID="f9a3a5f0-6c30-4bd1-b5a9-b84e2a116073" containerName="pull" Dec 03 06:55:10 crc kubenswrapper[4475]: E1203 06:55:10.703136 4475 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f9a3a5f0-6c30-4bd1-b5a9-b84e2a116073" containerName="util" Dec 03 06:55:10 crc kubenswrapper[4475]: I1203 06:55:10.703141 4475 state_mem.go:107] "Deleted CPUSet assignment" podUID="f9a3a5f0-6c30-4bd1-b5a9-b84e2a116073" containerName="util" Dec 03 06:55:10 crc kubenswrapper[4475]: I1203 06:55:10.703307 4475 memory_manager.go:354] "RemoveStaleState removing state" podUID="f9a3a5f0-6c30-4bd1-b5a9-b84e2a116073" containerName="extract" Dec 03 06:55:10 crc 
kubenswrapper[4475]: I1203 06:55:10.703315 4475 memory_manager.go:354] "RemoveStaleState removing state" podUID="09928a8e-a70b-4916-9ae2-4dbe952aa514" containerName="console" Dec 03 06:55:10 crc kubenswrapper[4475]: I1203 06:55:10.703795 4475 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/metallb-operator-controller-manager-cccf8fc64-b9db4" Dec 03 06:55:10 crc kubenswrapper[4475]: I1203 06:55:10.709404 4475 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"kube-root-ca.crt" Dec 03 06:55:10 crc kubenswrapper[4475]: I1203 06:55:10.709590 4475 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-operator-controller-manager-service-cert" Dec 03 06:55:10 crc kubenswrapper[4475]: I1203 06:55:10.711032 4475 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-operator-webhook-server-cert" Dec 03 06:55:10 crc kubenswrapper[4475]: I1203 06:55:10.711257 4475 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"openshift-service-ca.crt" Dec 03 06:55:10 crc kubenswrapper[4475]: I1203 06:55:10.711489 4475 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"manager-account-dockercfg-6txrv" Dec 03 06:55:10 crc kubenswrapper[4475]: I1203 06:55:10.721096 4475 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-controller-manager-cccf8fc64-b9db4"] Dec 03 06:55:10 crc kubenswrapper[4475]: I1203 06:55:10.774226 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/7d2b1eeb-c334-47d6-a9a2-6f4647aa1fd1-webhook-cert\") pod \"metallb-operator-controller-manager-cccf8fc64-b9db4\" (UID: \"7d2b1eeb-c334-47d6-a9a2-6f4647aa1fd1\") " pod="metallb-system/metallb-operator-controller-manager-cccf8fc64-b9db4" Dec 03 06:55:10 crc kubenswrapper[4475]: I1203 
06:55:10.774284 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/7d2b1eeb-c334-47d6-a9a2-6f4647aa1fd1-apiservice-cert\") pod \"metallb-operator-controller-manager-cccf8fc64-b9db4\" (UID: \"7d2b1eeb-c334-47d6-a9a2-6f4647aa1fd1\") " pod="metallb-system/metallb-operator-controller-manager-cccf8fc64-b9db4" Dec 03 06:55:10 crc kubenswrapper[4475]: I1203 06:55:10.774412 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jv66v\" (UniqueName: \"kubernetes.io/projected/7d2b1eeb-c334-47d6-a9a2-6f4647aa1fd1-kube-api-access-jv66v\") pod \"metallb-operator-controller-manager-cccf8fc64-b9db4\" (UID: \"7d2b1eeb-c334-47d6-a9a2-6f4647aa1fd1\") " pod="metallb-system/metallb-operator-controller-manager-cccf8fc64-b9db4" Dec 03 06:55:10 crc kubenswrapper[4475]: I1203 06:55:10.875339 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jv66v\" (UniqueName: \"kubernetes.io/projected/7d2b1eeb-c334-47d6-a9a2-6f4647aa1fd1-kube-api-access-jv66v\") pod \"metallb-operator-controller-manager-cccf8fc64-b9db4\" (UID: \"7d2b1eeb-c334-47d6-a9a2-6f4647aa1fd1\") " pod="metallb-system/metallb-operator-controller-manager-cccf8fc64-b9db4" Dec 03 06:55:10 crc kubenswrapper[4475]: I1203 06:55:10.875398 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/7d2b1eeb-c334-47d6-a9a2-6f4647aa1fd1-webhook-cert\") pod \"metallb-operator-controller-manager-cccf8fc64-b9db4\" (UID: \"7d2b1eeb-c334-47d6-a9a2-6f4647aa1fd1\") " pod="metallb-system/metallb-operator-controller-manager-cccf8fc64-b9db4" Dec 03 06:55:10 crc kubenswrapper[4475]: I1203 06:55:10.875432 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: 
\"kubernetes.io/secret/7d2b1eeb-c334-47d6-a9a2-6f4647aa1fd1-apiservice-cert\") pod \"metallb-operator-controller-manager-cccf8fc64-b9db4\" (UID: \"7d2b1eeb-c334-47d6-a9a2-6f4647aa1fd1\") " pod="metallb-system/metallb-operator-controller-manager-cccf8fc64-b9db4" Dec 03 06:55:10 crc kubenswrapper[4475]: I1203 06:55:10.882139 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/7d2b1eeb-c334-47d6-a9a2-6f4647aa1fd1-webhook-cert\") pod \"metallb-operator-controller-manager-cccf8fc64-b9db4\" (UID: \"7d2b1eeb-c334-47d6-a9a2-6f4647aa1fd1\") " pod="metallb-system/metallb-operator-controller-manager-cccf8fc64-b9db4" Dec 03 06:55:10 crc kubenswrapper[4475]: I1203 06:55:10.882141 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/7d2b1eeb-c334-47d6-a9a2-6f4647aa1fd1-apiservice-cert\") pod \"metallb-operator-controller-manager-cccf8fc64-b9db4\" (UID: \"7d2b1eeb-c334-47d6-a9a2-6f4647aa1fd1\") " pod="metallb-system/metallb-operator-controller-manager-cccf8fc64-b9db4" Dec 03 06:55:10 crc kubenswrapper[4475]: I1203 06:55:10.899180 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jv66v\" (UniqueName: \"kubernetes.io/projected/7d2b1eeb-c334-47d6-a9a2-6f4647aa1fd1-kube-api-access-jv66v\") pod \"metallb-operator-controller-manager-cccf8fc64-b9db4\" (UID: \"7d2b1eeb-c334-47d6-a9a2-6f4647aa1fd1\") " pod="metallb-system/metallb-operator-controller-manager-cccf8fc64-b9db4" Dec 03 06:55:10 crc kubenswrapper[4475]: I1203 06:55:10.960978 4475 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/metallb-operator-webhook-server-7bd4dd88cc-tc6l6"] Dec 03 06:55:10 crc kubenswrapper[4475]: I1203 06:55:10.961572 4475 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/metallb-operator-webhook-server-7bd4dd88cc-tc6l6" Dec 03 06:55:10 crc kubenswrapper[4475]: I1203 06:55:10.964615 4475 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-webhook-cert" Dec 03 06:55:10 crc kubenswrapper[4475]: I1203 06:55:10.964793 4475 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"controller-dockercfg-mtws5" Dec 03 06:55:10 crc kubenswrapper[4475]: I1203 06:55:10.965155 4475 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-operator-webhook-server-service-cert" Dec 03 06:55:10 crc kubenswrapper[4475]: I1203 06:55:10.975934 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/811c7050-6da5-49f0-b71d-0a7ee6502122-apiservice-cert\") pod \"metallb-operator-webhook-server-7bd4dd88cc-tc6l6\" (UID: \"811c7050-6da5-49f0-b71d-0a7ee6502122\") " pod="metallb-system/metallb-operator-webhook-server-7bd4dd88cc-tc6l6" Dec 03 06:55:10 crc kubenswrapper[4475]: I1203 06:55:10.975971 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vm2pb\" (UniqueName: \"kubernetes.io/projected/811c7050-6da5-49f0-b71d-0a7ee6502122-kube-api-access-vm2pb\") pod \"metallb-operator-webhook-server-7bd4dd88cc-tc6l6\" (UID: \"811c7050-6da5-49f0-b71d-0a7ee6502122\") " pod="metallb-system/metallb-operator-webhook-server-7bd4dd88cc-tc6l6" Dec 03 06:55:10 crc kubenswrapper[4475]: I1203 06:55:10.975995 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/811c7050-6da5-49f0-b71d-0a7ee6502122-webhook-cert\") pod \"metallb-operator-webhook-server-7bd4dd88cc-tc6l6\" (UID: \"811c7050-6da5-49f0-b71d-0a7ee6502122\") " pod="metallb-system/metallb-operator-webhook-server-7bd4dd88cc-tc6l6" 
Dec 03 06:55:10 crc kubenswrapper[4475]: I1203 06:55:10.981267 4475 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-webhook-server-7bd4dd88cc-tc6l6"] Dec 03 06:55:11 crc kubenswrapper[4475]: I1203 06:55:11.033982 4475 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/metallb-operator-controller-manager-cccf8fc64-b9db4" Dec 03 06:55:11 crc kubenswrapper[4475]: I1203 06:55:11.076704 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vm2pb\" (UniqueName: \"kubernetes.io/projected/811c7050-6da5-49f0-b71d-0a7ee6502122-kube-api-access-vm2pb\") pod \"metallb-operator-webhook-server-7bd4dd88cc-tc6l6\" (UID: \"811c7050-6da5-49f0-b71d-0a7ee6502122\") " pod="metallb-system/metallb-operator-webhook-server-7bd4dd88cc-tc6l6" Dec 03 06:55:11 crc kubenswrapper[4475]: I1203 06:55:11.076924 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/811c7050-6da5-49f0-b71d-0a7ee6502122-webhook-cert\") pod \"metallb-operator-webhook-server-7bd4dd88cc-tc6l6\" (UID: \"811c7050-6da5-49f0-b71d-0a7ee6502122\") " pod="metallb-system/metallb-operator-webhook-server-7bd4dd88cc-tc6l6" Dec 03 06:55:11 crc kubenswrapper[4475]: I1203 06:55:11.076990 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/811c7050-6da5-49f0-b71d-0a7ee6502122-apiservice-cert\") pod \"metallb-operator-webhook-server-7bd4dd88cc-tc6l6\" (UID: \"811c7050-6da5-49f0-b71d-0a7ee6502122\") " pod="metallb-system/metallb-operator-webhook-server-7bd4dd88cc-tc6l6" Dec 03 06:55:11 crc kubenswrapper[4475]: I1203 06:55:11.082984 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/811c7050-6da5-49f0-b71d-0a7ee6502122-apiservice-cert\") pod 
\"metallb-operator-webhook-server-7bd4dd88cc-tc6l6\" (UID: \"811c7050-6da5-49f0-b71d-0a7ee6502122\") " pod="metallb-system/metallb-operator-webhook-server-7bd4dd88cc-tc6l6" Dec 03 06:55:11 crc kubenswrapper[4475]: I1203 06:55:11.083037 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/811c7050-6da5-49f0-b71d-0a7ee6502122-webhook-cert\") pod \"metallb-operator-webhook-server-7bd4dd88cc-tc6l6\" (UID: \"811c7050-6da5-49f0-b71d-0a7ee6502122\") " pod="metallb-system/metallb-operator-webhook-server-7bd4dd88cc-tc6l6" Dec 03 06:55:11 crc kubenswrapper[4475]: I1203 06:55:11.090281 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vm2pb\" (UniqueName: \"kubernetes.io/projected/811c7050-6da5-49f0-b71d-0a7ee6502122-kube-api-access-vm2pb\") pod \"metallb-operator-webhook-server-7bd4dd88cc-tc6l6\" (UID: \"811c7050-6da5-49f0-b71d-0a7ee6502122\") " pod="metallb-system/metallb-operator-webhook-server-7bd4dd88cc-tc6l6" Dec 03 06:55:11 crc kubenswrapper[4475]: I1203 06:55:11.259272 4475 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-controller-manager-cccf8fc64-b9db4"] Dec 03 06:55:11 crc kubenswrapper[4475]: W1203 06:55:11.268904 4475 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7d2b1eeb_c334_47d6_a9a2_6f4647aa1fd1.slice/crio-51807803ea1714ed3d55db1500df5982153fade2e7ab6fcc37fbc1edaf6305d2 WatchSource:0}: Error finding container 51807803ea1714ed3d55db1500df5982153fade2e7ab6fcc37fbc1edaf6305d2: Status 404 returned error can't find the container with id 51807803ea1714ed3d55db1500df5982153fade2e7ab6fcc37fbc1edaf6305d2 Dec 03 06:55:11 crc kubenswrapper[4475]: I1203 06:55:11.272139 4475 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/metallb-operator-webhook-server-7bd4dd88cc-tc6l6" Dec 03 06:55:11 crc kubenswrapper[4475]: I1203 06:55:11.440839 4475 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-webhook-server-7bd4dd88cc-tc6l6"] Dec 03 06:55:11 crc kubenswrapper[4475]: W1203 06:55:11.451282 4475 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod811c7050_6da5_49f0_b71d_0a7ee6502122.slice/crio-7c6df01bef306f739504b2eab0398f0a8f48cbc71c6ee1ee8bd28252b1aacfd1 WatchSource:0}: Error finding container 7c6df01bef306f739504b2eab0398f0a8f48cbc71c6ee1ee8bd28252b1aacfd1: Status 404 returned error can't find the container with id 7c6df01bef306f739504b2eab0398f0a8f48cbc71c6ee1ee8bd28252b1aacfd1 Dec 03 06:55:11 crc kubenswrapper[4475]: I1203 06:55:11.703260 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-webhook-server-7bd4dd88cc-tc6l6" event={"ID":"811c7050-6da5-49f0-b71d-0a7ee6502122","Type":"ContainerStarted","Data":"7c6df01bef306f739504b2eab0398f0a8f48cbc71c6ee1ee8bd28252b1aacfd1"} Dec 03 06:55:11 crc kubenswrapper[4475]: I1203 06:55:11.704583 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-controller-manager-cccf8fc64-b9db4" event={"ID":"7d2b1eeb-c334-47d6-a9a2-6f4647aa1fd1","Type":"ContainerStarted","Data":"51807803ea1714ed3d55db1500df5982153fade2e7ab6fcc37fbc1edaf6305d2"} Dec 03 06:55:14 crc kubenswrapper[4475]: I1203 06:55:14.726244 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-controller-manager-cccf8fc64-b9db4" event={"ID":"7d2b1eeb-c334-47d6-a9a2-6f4647aa1fd1","Type":"ContainerStarted","Data":"acfc9a4e89dba5c1131dccaf77f3f15f230391c27f11d480408d2a5f9c728792"} Dec 03 06:55:14 crc kubenswrapper[4475]: I1203 06:55:14.726530 4475 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="metallb-system/metallb-operator-controller-manager-cccf8fc64-b9db4" Dec 03 06:55:14 crc kubenswrapper[4475]: I1203 06:55:14.746756 4475 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/metallb-operator-controller-manager-cccf8fc64-b9db4" podStartSLOduration=2.38811879 podStartE2EDuration="4.746739457s" podCreationTimestamp="2025-12-03 06:55:10 +0000 UTC" firstStartedPulling="2025-12-03 06:55:11.273115653 +0000 UTC m=+596.078013988" lastFinishedPulling="2025-12-03 06:55:13.631736321 +0000 UTC m=+598.436634655" observedRunningTime="2025-12-03 06:55:14.739414322 +0000 UTC m=+599.544312656" watchObservedRunningTime="2025-12-03 06:55:14.746739457 +0000 UTC m=+599.551637782" Dec 03 06:55:15 crc kubenswrapper[4475]: I1203 06:55:15.732129 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-webhook-server-7bd4dd88cc-tc6l6" event={"ID":"811c7050-6da5-49f0-b71d-0a7ee6502122","Type":"ContainerStarted","Data":"fa372005eb5557d53c674fad071279913dba2866d33b016935f9b3f3e027a034"} Dec 03 06:55:15 crc kubenswrapper[4475]: I1203 06:55:15.732487 4475 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/metallb-operator-webhook-server-7bd4dd88cc-tc6l6" Dec 03 06:55:28 crc kubenswrapper[4475]: I1203 06:55:28.933847 4475 patch_prober.go:28] interesting pod/machine-config-daemon-tjbzg container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 03 06:55:28 crc kubenswrapper[4475]: I1203 06:55:28.934205 4475 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-tjbzg" podUID="91aee7be-4a52-4598-803f-2deebe0674de" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: 
connection refused" Dec 03 06:55:31 crc kubenswrapper[4475]: I1203 06:55:31.276041 4475 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/metallb-operator-webhook-server-7bd4dd88cc-tc6l6" Dec 03 06:55:31 crc kubenswrapper[4475]: I1203 06:55:31.294167 4475 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/metallb-operator-webhook-server-7bd4dd88cc-tc6l6" podStartSLOduration=17.84076005 podStartE2EDuration="21.29415242s" podCreationTimestamp="2025-12-03 06:55:10 +0000 UTC" firstStartedPulling="2025-12-03 06:55:11.458049196 +0000 UTC m=+596.262947530" lastFinishedPulling="2025-12-03 06:55:14.911441565 +0000 UTC m=+599.716339900" observedRunningTime="2025-12-03 06:55:15.75532619 +0000 UTC m=+600.560224524" watchObservedRunningTime="2025-12-03 06:55:31.29415242 +0000 UTC m=+616.099050754" Dec 03 06:55:51 crc kubenswrapper[4475]: I1203 06:55:51.036264 4475 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/metallb-operator-controller-manager-cccf8fc64-b9db4" Dec 03 06:55:51 crc kubenswrapper[4475]: I1203 06:55:51.539461 4475 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/frr-k8s-webhook-server-7fcb986d4-5fqhn"] Dec 03 06:55:51 crc kubenswrapper[4475]: I1203 06:55:51.540127 4475 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/frr-k8s-webhook-server-7fcb986d4-5fqhn" Dec 03 06:55:51 crc kubenswrapper[4475]: I1203 06:55:51.542465 4475 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/frr-k8s-cl9w7"] Dec 03 06:55:51 crc kubenswrapper[4475]: I1203 06:55:51.544178 4475 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/frr-k8s-cl9w7" Dec 03 06:55:51 crc kubenswrapper[4475]: I1203 06:55:51.547635 4475 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"frr-k8s-webhook-server-cert" Dec 03 06:55:51 crc kubenswrapper[4475]: I1203 06:55:51.547754 4475 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"frr-k8s-certs-secret" Dec 03 06:55:51 crc kubenswrapper[4475]: I1203 06:55:51.547870 4475 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"frr-k8s-daemon-dockercfg-z22jt" Dec 03 06:55:51 crc kubenswrapper[4475]: I1203 06:55:51.547937 4475 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"frr-startup" Dec 03 06:55:51 crc kubenswrapper[4475]: I1203 06:55:51.555470 4475 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/frr-k8s-webhook-server-7fcb986d4-5fqhn"] Dec 03 06:55:51 crc kubenswrapper[4475]: I1203 06:55:51.609261 4475 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/speaker-w6hc2"] Dec 03 06:55:51 crc kubenswrapper[4475]: I1203 06:55:51.609979 4475 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/speaker-w6hc2" Dec 03 06:55:51 crc kubenswrapper[4475]: I1203 06:55:51.612718 4475 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"speaker-certs-secret" Dec 03 06:55:51 crc kubenswrapper[4475]: I1203 06:55:51.612965 4475 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-memberlist" Dec 03 06:55:51 crc kubenswrapper[4475]: I1203 06:55:51.613081 4475 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"speaker-dockercfg-8cwfz" Dec 03 06:55:51 crc kubenswrapper[4475]: I1203 06:55:51.619811 4475 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"metallb-excludel2" Dec 03 06:55:51 crc kubenswrapper[4475]: I1203 06:55:51.621445 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"reloader\" (UniqueName: \"kubernetes.io/empty-dir/4b61aba1-4e8e-4494-8ecd-3f082e5db4b3-reloader\") pod \"frr-k8s-cl9w7\" (UID: \"4b61aba1-4e8e-4494-8ecd-3f082e5db4b3\") " pod="metallb-system/frr-k8s-cl9w7" Dec 03 06:55:51 crc kubenswrapper[4475]: I1203 06:55:51.621497 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/3d28e2e6-5871-45b6-8e27-5e1f20df1ead-cert\") pod \"frr-k8s-webhook-server-7fcb986d4-5fqhn\" (UID: \"3d28e2e6-5871-45b6-8e27-5e1f20df1ead\") " pod="metallb-system/frr-k8s-webhook-server-7fcb986d4-5fqhn" Dec 03 06:55:51 crc kubenswrapper[4475]: I1203 06:55:51.621517 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"frr-conf\" (UniqueName: \"kubernetes.io/empty-dir/4b61aba1-4e8e-4494-8ecd-3f082e5db4b3-frr-conf\") pod \"frr-k8s-cl9w7\" (UID: \"4b61aba1-4e8e-4494-8ecd-3f082e5db4b3\") " pod="metallb-system/frr-k8s-cl9w7" Dec 03 06:55:51 crc kubenswrapper[4475]: I1203 06:55:51.621536 4475 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"frr-startup\" (UniqueName: \"kubernetes.io/configmap/4b61aba1-4e8e-4494-8ecd-3f082e5db4b3-frr-startup\") pod \"frr-k8s-cl9w7\" (UID: \"4b61aba1-4e8e-4494-8ecd-3f082e5db4b3\") " pod="metallb-system/frr-k8s-cl9w7" Dec 03 06:55:51 crc kubenswrapper[4475]: I1203 06:55:51.621550 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"frr-sockets\" (UniqueName: \"kubernetes.io/empty-dir/4b61aba1-4e8e-4494-8ecd-3f082e5db4b3-frr-sockets\") pod \"frr-k8s-cl9w7\" (UID: \"4b61aba1-4e8e-4494-8ecd-3f082e5db4b3\") " pod="metallb-system/frr-k8s-cl9w7" Dec 03 06:55:51 crc kubenswrapper[4475]: I1203 06:55:51.621593 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/8758b06a-2c57-4889-8611-ec84a0fed8eb-metrics-certs\") pod \"speaker-w6hc2\" (UID: \"8758b06a-2c57-4889-8611-ec84a0fed8eb\") " pod="metallb-system/speaker-w6hc2" Dec 03 06:55:51 crc kubenswrapper[4475]: I1203 06:55:51.621607 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metallb-excludel2\" (UniqueName: \"kubernetes.io/configmap/8758b06a-2c57-4889-8611-ec84a0fed8eb-metallb-excludel2\") pod \"speaker-w6hc2\" (UID: \"8758b06a-2c57-4889-8611-ec84a0fed8eb\") " pod="metallb-system/speaker-w6hc2" Dec 03 06:55:51 crc kubenswrapper[4475]: I1203 06:55:51.621625 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/8758b06a-2c57-4889-8611-ec84a0fed8eb-memberlist\") pod \"speaker-w6hc2\" (UID: \"8758b06a-2c57-4889-8611-ec84a0fed8eb\") " pod="metallb-system/speaker-w6hc2" Dec 03 06:55:51 crc kubenswrapper[4475]: I1203 06:55:51.621640 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6kltj\" 
(UniqueName: \"kubernetes.io/projected/3d28e2e6-5871-45b6-8e27-5e1f20df1ead-kube-api-access-6kltj\") pod \"frr-k8s-webhook-server-7fcb986d4-5fqhn\" (UID: \"3d28e2e6-5871-45b6-8e27-5e1f20df1ead\") " pod="metallb-system/frr-k8s-webhook-server-7fcb986d4-5fqhn" Dec 03 06:55:51 crc kubenswrapper[4475]: I1203 06:55:51.621711 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/4b61aba1-4e8e-4494-8ecd-3f082e5db4b3-metrics-certs\") pod \"frr-k8s-cl9w7\" (UID: \"4b61aba1-4e8e-4494-8ecd-3f082e5db4b3\") " pod="metallb-system/frr-k8s-cl9w7" Dec 03 06:55:51 crc kubenswrapper[4475]: I1203 06:55:51.621734 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2k6sg\" (UniqueName: \"kubernetes.io/projected/4b61aba1-4e8e-4494-8ecd-3f082e5db4b3-kube-api-access-2k6sg\") pod \"frr-k8s-cl9w7\" (UID: \"4b61aba1-4e8e-4494-8ecd-3f082e5db4b3\") " pod="metallb-system/frr-k8s-cl9w7" Dec 03 06:55:51 crc kubenswrapper[4475]: I1203 06:55:51.621757 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics\" (UniqueName: \"kubernetes.io/empty-dir/4b61aba1-4e8e-4494-8ecd-3f082e5db4b3-metrics\") pod \"frr-k8s-cl9w7\" (UID: \"4b61aba1-4e8e-4494-8ecd-3f082e5db4b3\") " pod="metallb-system/frr-k8s-cl9w7" Dec 03 06:55:51 crc kubenswrapper[4475]: I1203 06:55:51.621780 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gg5s2\" (UniqueName: \"kubernetes.io/projected/8758b06a-2c57-4889-8611-ec84a0fed8eb-kube-api-access-gg5s2\") pod \"speaker-w6hc2\" (UID: \"8758b06a-2c57-4889-8611-ec84a0fed8eb\") " pod="metallb-system/speaker-w6hc2" Dec 03 06:55:51 crc kubenswrapper[4475]: I1203 06:55:51.629383 4475 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/controller-f8648f98b-9vqjd"] Dec 03 06:55:51 crc 
kubenswrapper[4475]: I1203 06:55:51.630225 4475 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/controller-f8648f98b-9vqjd" Dec 03 06:55:51 crc kubenswrapper[4475]: I1203 06:55:51.635287 4475 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"controller-certs-secret" Dec 03 06:55:51 crc kubenswrapper[4475]: I1203 06:55:51.643537 4475 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/controller-f8648f98b-9vqjd"] Dec 03 06:55:51 crc kubenswrapper[4475]: I1203 06:55:51.722551 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/4b61aba1-4e8e-4494-8ecd-3f082e5db4b3-metrics-certs\") pod \"frr-k8s-cl9w7\" (UID: \"4b61aba1-4e8e-4494-8ecd-3f082e5db4b3\") " pod="metallb-system/frr-k8s-cl9w7" Dec 03 06:55:51 crc kubenswrapper[4475]: I1203 06:55:51.722775 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2k6sg\" (UniqueName: \"kubernetes.io/projected/4b61aba1-4e8e-4494-8ecd-3f082e5db4b3-kube-api-access-2k6sg\") pod \"frr-k8s-cl9w7\" (UID: \"4b61aba1-4e8e-4494-8ecd-3f082e5db4b3\") " pod="metallb-system/frr-k8s-cl9w7" Dec 03 06:55:51 crc kubenswrapper[4475]: I1203 06:55:51.722799 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/3f0b0024-0ac9-4ff6-bbdf-f8409695a684-cert\") pod \"controller-f8648f98b-9vqjd\" (UID: \"3f0b0024-0ac9-4ff6-bbdf-f8409695a684\") " pod="metallb-system/controller-f8648f98b-9vqjd" Dec 03 06:55:51 crc kubenswrapper[4475]: I1203 06:55:51.722828 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics\" (UniqueName: \"kubernetes.io/empty-dir/4b61aba1-4e8e-4494-8ecd-3f082e5db4b3-metrics\") pod \"frr-k8s-cl9w7\" (UID: \"4b61aba1-4e8e-4494-8ecd-3f082e5db4b3\") " pod="metallb-system/frr-k8s-cl9w7" Dec 03 06:55:51 crc 
kubenswrapper[4475]: E1203 06:55:51.722712 4475 secret.go:188] Couldn't get secret metallb-system/frr-k8s-certs-secret: secret "frr-k8s-certs-secret" not found Dec 03 06:55:51 crc kubenswrapper[4475]: I1203 06:55:51.722857 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gg5s2\" (UniqueName: \"kubernetes.io/projected/8758b06a-2c57-4889-8611-ec84a0fed8eb-kube-api-access-gg5s2\") pod \"speaker-w6hc2\" (UID: \"8758b06a-2c57-4889-8611-ec84a0fed8eb\") " pod="metallb-system/speaker-w6hc2" Dec 03 06:55:51 crc kubenswrapper[4475]: E1203 06:55:51.722911 4475 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/4b61aba1-4e8e-4494-8ecd-3f082e5db4b3-metrics-certs podName:4b61aba1-4e8e-4494-8ecd-3f082e5db4b3 nodeName:}" failed. No retries permitted until 2025-12-03 06:55:52.222890539 +0000 UTC m=+637.027788874 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/4b61aba1-4e8e-4494-8ecd-3f082e5db4b3-metrics-certs") pod "frr-k8s-cl9w7" (UID: "4b61aba1-4e8e-4494-8ecd-3f082e5db4b3") : secret "frr-k8s-certs-secret" not found Dec 03 06:55:51 crc kubenswrapper[4475]: I1203 06:55:51.722928 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"reloader\" (UniqueName: \"kubernetes.io/empty-dir/4b61aba1-4e8e-4494-8ecd-3f082e5db4b3-reloader\") pod \"frr-k8s-cl9w7\" (UID: \"4b61aba1-4e8e-4494-8ecd-3f082e5db4b3\") " pod="metallb-system/frr-k8s-cl9w7" Dec 03 06:55:51 crc kubenswrapper[4475]: I1203 06:55:51.722959 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/3d28e2e6-5871-45b6-8e27-5e1f20df1ead-cert\") pod \"frr-k8s-webhook-server-7fcb986d4-5fqhn\" (UID: \"3d28e2e6-5871-45b6-8e27-5e1f20df1ead\") " pod="metallb-system/frr-k8s-webhook-server-7fcb986d4-5fqhn" Dec 03 06:55:51 crc kubenswrapper[4475]: I1203 06:55:51.722986 4475 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"frr-conf\" (UniqueName: \"kubernetes.io/empty-dir/4b61aba1-4e8e-4494-8ecd-3f082e5db4b3-frr-conf\") pod \"frr-k8s-cl9w7\" (UID: \"4b61aba1-4e8e-4494-8ecd-3f082e5db4b3\") " pod="metallb-system/frr-k8s-cl9w7" Dec 03 06:55:51 crc kubenswrapper[4475]: I1203 06:55:51.723009 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"frr-startup\" (UniqueName: \"kubernetes.io/configmap/4b61aba1-4e8e-4494-8ecd-3f082e5db4b3-frr-startup\") pod \"frr-k8s-cl9w7\" (UID: \"4b61aba1-4e8e-4494-8ecd-3f082e5db4b3\") " pod="metallb-system/frr-k8s-cl9w7" Dec 03 06:55:51 crc kubenswrapper[4475]: I1203 06:55:51.723022 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"frr-sockets\" (UniqueName: \"kubernetes.io/empty-dir/4b61aba1-4e8e-4494-8ecd-3f082e5db4b3-frr-sockets\") pod \"frr-k8s-cl9w7\" (UID: \"4b61aba1-4e8e-4494-8ecd-3f082e5db4b3\") " pod="metallb-system/frr-k8s-cl9w7" Dec 03 06:55:51 crc kubenswrapper[4475]: I1203 06:55:51.723064 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/3f0b0024-0ac9-4ff6-bbdf-f8409695a684-metrics-certs\") pod \"controller-f8648f98b-9vqjd\" (UID: \"3f0b0024-0ac9-4ff6-bbdf-f8409695a684\") " pod="metallb-system/controller-f8648f98b-9vqjd" Dec 03 06:55:51 crc kubenswrapper[4475]: I1203 06:55:51.723079 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/8758b06a-2c57-4889-8611-ec84a0fed8eb-metrics-certs\") pod \"speaker-w6hc2\" (UID: \"8758b06a-2c57-4889-8611-ec84a0fed8eb\") " pod="metallb-system/speaker-w6hc2" Dec 03 06:55:51 crc kubenswrapper[4475]: I1203 06:55:51.723093 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metallb-excludel2\" (UniqueName: 
\"kubernetes.io/configmap/8758b06a-2c57-4889-8611-ec84a0fed8eb-metallb-excludel2\") pod \"speaker-w6hc2\" (UID: \"8758b06a-2c57-4889-8611-ec84a0fed8eb\") " pod="metallb-system/speaker-w6hc2" Dec 03 06:55:51 crc kubenswrapper[4475]: I1203 06:55:51.723112 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xnzxd\" (UniqueName: \"kubernetes.io/projected/3f0b0024-0ac9-4ff6-bbdf-f8409695a684-kube-api-access-xnzxd\") pod \"controller-f8648f98b-9vqjd\" (UID: \"3f0b0024-0ac9-4ff6-bbdf-f8409695a684\") " pod="metallb-system/controller-f8648f98b-9vqjd" Dec 03 06:55:51 crc kubenswrapper[4475]: I1203 06:55:51.723133 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/8758b06a-2c57-4889-8611-ec84a0fed8eb-memberlist\") pod \"speaker-w6hc2\" (UID: \"8758b06a-2c57-4889-8611-ec84a0fed8eb\") " pod="metallb-system/speaker-w6hc2" Dec 03 06:55:51 crc kubenswrapper[4475]: I1203 06:55:51.723148 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6kltj\" (UniqueName: \"kubernetes.io/projected/3d28e2e6-5871-45b6-8e27-5e1f20df1ead-kube-api-access-6kltj\") pod \"frr-k8s-webhook-server-7fcb986d4-5fqhn\" (UID: \"3d28e2e6-5871-45b6-8e27-5e1f20df1ead\") " pod="metallb-system/frr-k8s-webhook-server-7fcb986d4-5fqhn" Dec 03 06:55:51 crc kubenswrapper[4475]: I1203 06:55:51.723298 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics\" (UniqueName: \"kubernetes.io/empty-dir/4b61aba1-4e8e-4494-8ecd-3f082e5db4b3-metrics\") pod \"frr-k8s-cl9w7\" (UID: \"4b61aba1-4e8e-4494-8ecd-3f082e5db4b3\") " pod="metallb-system/frr-k8s-cl9w7" Dec 03 06:55:51 crc kubenswrapper[4475]: I1203 06:55:51.723564 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"reloader\" (UniqueName: \"kubernetes.io/empty-dir/4b61aba1-4e8e-4494-8ecd-3f082e5db4b3-reloader\") pod \"frr-k8s-cl9w7\" 
(UID: \"4b61aba1-4e8e-4494-8ecd-3f082e5db4b3\") " pod="metallb-system/frr-k8s-cl9w7" Dec 03 06:55:51 crc kubenswrapper[4475]: I1203 06:55:51.723595 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"frr-sockets\" (UniqueName: \"kubernetes.io/empty-dir/4b61aba1-4e8e-4494-8ecd-3f082e5db4b3-frr-sockets\") pod \"frr-k8s-cl9w7\" (UID: \"4b61aba1-4e8e-4494-8ecd-3f082e5db4b3\") " pod="metallb-system/frr-k8s-cl9w7" Dec 03 06:55:51 crc kubenswrapper[4475]: E1203 06:55:51.723628 4475 secret.go:188] Couldn't get secret metallb-system/metallb-memberlist: secret "metallb-memberlist" not found Dec 03 06:55:51 crc kubenswrapper[4475]: E1203 06:55:51.723647 4475 secret.go:188] Couldn't get secret metallb-system/speaker-certs-secret: secret "speaker-certs-secret" not found Dec 03 06:55:51 crc kubenswrapper[4475]: E1203 06:55:51.723654 4475 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8758b06a-2c57-4889-8611-ec84a0fed8eb-memberlist podName:8758b06a-2c57-4889-8611-ec84a0fed8eb nodeName:}" failed. No retries permitted until 2025-12-03 06:55:52.223644407 +0000 UTC m=+637.028542741 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "memberlist" (UniqueName: "kubernetes.io/secret/8758b06a-2c57-4889-8611-ec84a0fed8eb-memberlist") pod "speaker-w6hc2" (UID: "8758b06a-2c57-4889-8611-ec84a0fed8eb") : secret "metallb-memberlist" not found Dec 03 06:55:51 crc kubenswrapper[4475]: E1203 06:55:51.723671 4475 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8758b06a-2c57-4889-8611-ec84a0fed8eb-metrics-certs podName:8758b06a-2c57-4889-8611-ec84a0fed8eb nodeName:}" failed. No retries permitted until 2025-12-03 06:55:52.223663704 +0000 UTC m=+637.028562037 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/8758b06a-2c57-4889-8611-ec84a0fed8eb-metrics-certs") pod "speaker-w6hc2" (UID: "8758b06a-2c57-4889-8611-ec84a0fed8eb") : secret "speaker-certs-secret" not found Dec 03 06:55:51 crc kubenswrapper[4475]: I1203 06:55:51.723807 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"frr-conf\" (UniqueName: \"kubernetes.io/empty-dir/4b61aba1-4e8e-4494-8ecd-3f082e5db4b3-frr-conf\") pod \"frr-k8s-cl9w7\" (UID: \"4b61aba1-4e8e-4494-8ecd-3f082e5db4b3\") " pod="metallb-system/frr-k8s-cl9w7" Dec 03 06:55:51 crc kubenswrapper[4475]: I1203 06:55:51.724145 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"frr-startup\" (UniqueName: \"kubernetes.io/configmap/4b61aba1-4e8e-4494-8ecd-3f082e5db4b3-frr-startup\") pod \"frr-k8s-cl9w7\" (UID: \"4b61aba1-4e8e-4494-8ecd-3f082e5db4b3\") " pod="metallb-system/frr-k8s-cl9w7" Dec 03 06:55:51 crc kubenswrapper[4475]: I1203 06:55:51.724167 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metallb-excludel2\" (UniqueName: \"kubernetes.io/configmap/8758b06a-2c57-4889-8611-ec84a0fed8eb-metallb-excludel2\") pod \"speaker-w6hc2\" (UID: \"8758b06a-2c57-4889-8611-ec84a0fed8eb\") " pod="metallb-system/speaker-w6hc2" Dec 03 06:55:51 crc kubenswrapper[4475]: I1203 06:55:51.737020 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/3d28e2e6-5871-45b6-8e27-5e1f20df1ead-cert\") pod \"frr-k8s-webhook-server-7fcb986d4-5fqhn\" (UID: \"3d28e2e6-5871-45b6-8e27-5e1f20df1ead\") " pod="metallb-system/frr-k8s-webhook-server-7fcb986d4-5fqhn" Dec 03 06:55:51 crc kubenswrapper[4475]: I1203 06:55:51.737044 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2k6sg\" (UniqueName: \"kubernetes.io/projected/4b61aba1-4e8e-4494-8ecd-3f082e5db4b3-kube-api-access-2k6sg\") pod \"frr-k8s-cl9w7\" (UID: 
\"4b61aba1-4e8e-4494-8ecd-3f082e5db4b3\") " pod="metallb-system/frr-k8s-cl9w7" Dec 03 06:55:51 crc kubenswrapper[4475]: I1203 06:55:51.739834 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6kltj\" (UniqueName: \"kubernetes.io/projected/3d28e2e6-5871-45b6-8e27-5e1f20df1ead-kube-api-access-6kltj\") pod \"frr-k8s-webhook-server-7fcb986d4-5fqhn\" (UID: \"3d28e2e6-5871-45b6-8e27-5e1f20df1ead\") " pod="metallb-system/frr-k8s-webhook-server-7fcb986d4-5fqhn" Dec 03 06:55:51 crc kubenswrapper[4475]: I1203 06:55:51.742904 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gg5s2\" (UniqueName: \"kubernetes.io/projected/8758b06a-2c57-4889-8611-ec84a0fed8eb-kube-api-access-gg5s2\") pod \"speaker-w6hc2\" (UID: \"8758b06a-2c57-4889-8611-ec84a0fed8eb\") " pod="metallb-system/speaker-w6hc2" Dec 03 06:55:51 crc kubenswrapper[4475]: I1203 06:55:51.824877 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/3f0b0024-0ac9-4ff6-bbdf-f8409695a684-cert\") pod \"controller-f8648f98b-9vqjd\" (UID: \"3f0b0024-0ac9-4ff6-bbdf-f8409695a684\") " pod="metallb-system/controller-f8648f98b-9vqjd" Dec 03 06:55:51 crc kubenswrapper[4475]: I1203 06:55:51.824966 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/3f0b0024-0ac9-4ff6-bbdf-f8409695a684-metrics-certs\") pod \"controller-f8648f98b-9vqjd\" (UID: \"3f0b0024-0ac9-4ff6-bbdf-f8409695a684\") " pod="metallb-system/controller-f8648f98b-9vqjd" Dec 03 06:55:51 crc kubenswrapper[4475]: I1203 06:55:51.824992 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xnzxd\" (UniqueName: \"kubernetes.io/projected/3f0b0024-0ac9-4ff6-bbdf-f8409695a684-kube-api-access-xnzxd\") pod \"controller-f8648f98b-9vqjd\" (UID: \"3f0b0024-0ac9-4ff6-bbdf-f8409695a684\") " 
pod="metallb-system/controller-f8648f98b-9vqjd" Dec 03 06:55:51 crc kubenswrapper[4475]: E1203 06:55:51.825154 4475 secret.go:188] Couldn't get secret metallb-system/controller-certs-secret: secret "controller-certs-secret" not found Dec 03 06:55:51 crc kubenswrapper[4475]: E1203 06:55:51.825222 4475 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3f0b0024-0ac9-4ff6-bbdf-f8409695a684-metrics-certs podName:3f0b0024-0ac9-4ff6-bbdf-f8409695a684 nodeName:}" failed. No retries permitted until 2025-12-03 06:55:52.325206492 +0000 UTC m=+637.130104826 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/3f0b0024-0ac9-4ff6-bbdf-f8409695a684-metrics-certs") pod "controller-f8648f98b-9vqjd" (UID: "3f0b0024-0ac9-4ff6-bbdf-f8409695a684") : secret "controller-certs-secret" not found Dec 03 06:55:51 crc kubenswrapper[4475]: I1203 06:55:51.826363 4475 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-webhook-cert" Dec 03 06:55:51 crc kubenswrapper[4475]: I1203 06:55:51.837831 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/3f0b0024-0ac9-4ff6-bbdf-f8409695a684-cert\") pod \"controller-f8648f98b-9vqjd\" (UID: \"3f0b0024-0ac9-4ff6-bbdf-f8409695a684\") " pod="metallb-system/controller-f8648f98b-9vqjd" Dec 03 06:55:51 crc kubenswrapper[4475]: I1203 06:55:51.838328 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xnzxd\" (UniqueName: \"kubernetes.io/projected/3f0b0024-0ac9-4ff6-bbdf-f8409695a684-kube-api-access-xnzxd\") pod \"controller-f8648f98b-9vqjd\" (UID: \"3f0b0024-0ac9-4ff6-bbdf-f8409695a684\") " pod="metallb-system/controller-f8648f98b-9vqjd" Dec 03 06:55:51 crc kubenswrapper[4475]: I1203 06:55:51.856214 4475 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/frr-k8s-webhook-server-7fcb986d4-5fqhn" Dec 03 06:55:52 crc kubenswrapper[4475]: I1203 06:55:52.213325 4475 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/frr-k8s-webhook-server-7fcb986d4-5fqhn"] Dec 03 06:55:52 crc kubenswrapper[4475]: I1203 06:55:52.230425 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/8758b06a-2c57-4889-8611-ec84a0fed8eb-metrics-certs\") pod \"speaker-w6hc2\" (UID: \"8758b06a-2c57-4889-8611-ec84a0fed8eb\") " pod="metallb-system/speaker-w6hc2" Dec 03 06:55:52 crc kubenswrapper[4475]: I1203 06:55:52.230484 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/8758b06a-2c57-4889-8611-ec84a0fed8eb-memberlist\") pod \"speaker-w6hc2\" (UID: \"8758b06a-2c57-4889-8611-ec84a0fed8eb\") " pod="metallb-system/speaker-w6hc2" Dec 03 06:55:52 crc kubenswrapper[4475]: I1203 06:55:52.230527 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/4b61aba1-4e8e-4494-8ecd-3f082e5db4b3-metrics-certs\") pod \"frr-k8s-cl9w7\" (UID: \"4b61aba1-4e8e-4494-8ecd-3f082e5db4b3\") " pod="metallb-system/frr-k8s-cl9w7" Dec 03 06:55:52 crc kubenswrapper[4475]: E1203 06:55:52.231124 4475 secret.go:188] Couldn't get secret metallb-system/metallb-memberlist: secret "metallb-memberlist" not found Dec 03 06:55:52 crc kubenswrapper[4475]: E1203 06:55:52.231186 4475 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8758b06a-2c57-4889-8611-ec84a0fed8eb-memberlist podName:8758b06a-2c57-4889-8611-ec84a0fed8eb nodeName:}" failed. No retries permitted until 2025-12-03 06:55:53.231171528 +0000 UTC m=+638.036069863 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "memberlist" (UniqueName: "kubernetes.io/secret/8758b06a-2c57-4889-8611-ec84a0fed8eb-memberlist") pod "speaker-w6hc2" (UID: "8758b06a-2c57-4889-8611-ec84a0fed8eb") : secret "metallb-memberlist" not found Dec 03 06:55:52 crc kubenswrapper[4475]: I1203 06:55:52.233103 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/4b61aba1-4e8e-4494-8ecd-3f082e5db4b3-metrics-certs\") pod \"frr-k8s-cl9w7\" (UID: \"4b61aba1-4e8e-4494-8ecd-3f082e5db4b3\") " pod="metallb-system/frr-k8s-cl9w7" Dec 03 06:55:52 crc kubenswrapper[4475]: I1203 06:55:52.233623 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/8758b06a-2c57-4889-8611-ec84a0fed8eb-metrics-certs\") pod \"speaker-w6hc2\" (UID: \"8758b06a-2c57-4889-8611-ec84a0fed8eb\") " pod="metallb-system/speaker-w6hc2" Dec 03 06:55:52 crc kubenswrapper[4475]: I1203 06:55:52.331931 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/3f0b0024-0ac9-4ff6-bbdf-f8409695a684-metrics-certs\") pod \"controller-f8648f98b-9vqjd\" (UID: \"3f0b0024-0ac9-4ff6-bbdf-f8409695a684\") " pod="metallb-system/controller-f8648f98b-9vqjd" Dec 03 06:55:52 crc kubenswrapper[4475]: I1203 06:55:52.334592 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/3f0b0024-0ac9-4ff6-bbdf-f8409695a684-metrics-certs\") pod \"controller-f8648f98b-9vqjd\" (UID: \"3f0b0024-0ac9-4ff6-bbdf-f8409695a684\") " pod="metallb-system/controller-f8648f98b-9vqjd" Dec 03 06:55:52 crc kubenswrapper[4475]: I1203 06:55:52.463641 4475 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/frr-k8s-cl9w7" Dec 03 06:55:52 crc kubenswrapper[4475]: I1203 06:55:52.539337 4475 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/controller-f8648f98b-9vqjd" Dec 03 06:55:52 crc kubenswrapper[4475]: I1203 06:55:52.874027 4475 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/controller-f8648f98b-9vqjd"] Dec 03 06:55:52 crc kubenswrapper[4475]: I1203 06:55:52.877858 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-webhook-server-7fcb986d4-5fqhn" event={"ID":"3d28e2e6-5871-45b6-8e27-5e1f20df1ead","Type":"ContainerStarted","Data":"1d6d36c46d1c8c36d37ec357c24460507f90fbe1c210dd49e58e2ba9e3575e92"} Dec 03 06:55:52 crc kubenswrapper[4475]: I1203 06:55:52.879486 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-cl9w7" event={"ID":"4b61aba1-4e8e-4494-8ecd-3f082e5db4b3","Type":"ContainerStarted","Data":"de039db3392ddd605add0098fbf6df1fbb2be51572d439212b5d268449f866d6"} Dec 03 06:55:53 crc kubenswrapper[4475]: I1203 06:55:53.245076 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/8758b06a-2c57-4889-8611-ec84a0fed8eb-memberlist\") pod \"speaker-w6hc2\" (UID: \"8758b06a-2c57-4889-8611-ec84a0fed8eb\") " pod="metallb-system/speaker-w6hc2" Dec 03 06:55:53 crc kubenswrapper[4475]: I1203 06:55:53.257958 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/8758b06a-2c57-4889-8611-ec84a0fed8eb-memberlist\") pod \"speaker-w6hc2\" (UID: \"8758b06a-2c57-4889-8611-ec84a0fed8eb\") " pod="metallb-system/speaker-w6hc2" Dec 03 06:55:53 crc kubenswrapper[4475]: I1203 06:55:53.422585 4475 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/speaker-w6hc2" Dec 03 06:55:53 crc kubenswrapper[4475]: W1203 06:55:53.437682 4475 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8758b06a_2c57_4889_8611_ec84a0fed8eb.slice/crio-d080543d77192a351e581ebf7f68b0e2b0048674c88a4627615b3bc65fc7f84c WatchSource:0}: Error finding container d080543d77192a351e581ebf7f68b0e2b0048674c88a4627615b3bc65fc7f84c: Status 404 returned error can't find the container with id d080543d77192a351e581ebf7f68b0e2b0048674c88a4627615b3bc65fc7f84c Dec 03 06:55:53 crc kubenswrapper[4475]: I1203 06:55:53.893714 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/speaker-w6hc2" event={"ID":"8758b06a-2c57-4889-8611-ec84a0fed8eb","Type":"ContainerStarted","Data":"99b769e95315ac567c906593d2555e635c418aecc0dd954abb44758d39c462d4"} Dec 03 06:55:53 crc kubenswrapper[4475]: I1203 06:55:53.893924 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/speaker-w6hc2" event={"ID":"8758b06a-2c57-4889-8611-ec84a0fed8eb","Type":"ContainerStarted","Data":"21d47a4a536be3b4741c9ebd9c2f3f146101da6ad536271cfc63d28361a96586"} Dec 03 06:55:53 crc kubenswrapper[4475]: I1203 06:55:53.893989 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/speaker-w6hc2" event={"ID":"8758b06a-2c57-4889-8611-ec84a0fed8eb","Type":"ContainerStarted","Data":"d080543d77192a351e581ebf7f68b0e2b0048674c88a4627615b3bc65fc7f84c"} Dec 03 06:55:53 crc kubenswrapper[4475]: I1203 06:55:53.894932 4475 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/speaker-w6hc2" Dec 03 06:55:53 crc kubenswrapper[4475]: I1203 06:55:53.897659 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/controller-f8648f98b-9vqjd" event={"ID":"3f0b0024-0ac9-4ff6-bbdf-f8409695a684","Type":"ContainerStarted","Data":"eec19534bd4297a3333b972d7084eb579cc07e4c9aaa0ab3c225f843aff5bbb3"} 
Dec 03 06:55:53 crc kubenswrapper[4475]: I1203 06:55:53.897694 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/controller-f8648f98b-9vqjd" event={"ID":"3f0b0024-0ac9-4ff6-bbdf-f8409695a684","Type":"ContainerStarted","Data":"36e2be903344ece89571fe71971b7ace67c83a6785fa83005e6f6e6f76710d31"} Dec 03 06:55:53 crc kubenswrapper[4475]: I1203 06:55:53.897707 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/controller-f8648f98b-9vqjd" event={"ID":"3f0b0024-0ac9-4ff6-bbdf-f8409695a684","Type":"ContainerStarted","Data":"339ef16e2b279709ab553f9f321a6c4e387571be1f49db9aabd04cf8ae48c1ba"} Dec 03 06:55:53 crc kubenswrapper[4475]: I1203 06:55:53.897918 4475 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/controller-f8648f98b-9vqjd" Dec 03 06:55:53 crc kubenswrapper[4475]: I1203 06:55:53.927393 4475 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/speaker-w6hc2" podStartSLOduration=2.9273790159999997 podStartE2EDuration="2.927379016s" podCreationTimestamp="2025-12-03 06:55:51 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 06:55:53.910977233 +0000 UTC m=+638.715875567" watchObservedRunningTime="2025-12-03 06:55:53.927379016 +0000 UTC m=+638.732277349" Dec 03 06:55:53 crc kubenswrapper[4475]: I1203 06:55:53.928199 4475 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/controller-f8648f98b-9vqjd" podStartSLOduration=2.928193988 podStartE2EDuration="2.928193988s" podCreationTimestamp="2025-12-03 06:55:51 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 06:55:53.923213202 +0000 UTC m=+638.728111536" watchObservedRunningTime="2025-12-03 06:55:53.928193988 +0000 UTC m=+638.733092321" Dec 03 06:55:58 crc kubenswrapper[4475]: I1203 
06:55:58.933012 4475 patch_prober.go:28] interesting pod/machine-config-daemon-tjbzg container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 03 06:55:58 crc kubenswrapper[4475]: I1203 06:55:58.933215 4475 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-tjbzg" podUID="91aee7be-4a52-4598-803f-2deebe0674de" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 03 06:55:58 crc kubenswrapper[4475]: I1203 06:55:58.933261 4475 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-tjbzg" Dec 03 06:55:58 crc kubenswrapper[4475]: I1203 06:55:58.933805 4475 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"0a13f575406937575ec2819856647e11d9c4ccb5a9ec17bf4568eec9af01a7ba"} pod="openshift-machine-config-operator/machine-config-daemon-tjbzg" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Dec 03 06:55:58 crc kubenswrapper[4475]: I1203 06:55:58.933864 4475 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-tjbzg" podUID="91aee7be-4a52-4598-803f-2deebe0674de" containerName="machine-config-daemon" containerID="cri-o://0a13f575406937575ec2819856647e11d9c4ccb5a9ec17bf4568eec9af01a7ba" gracePeriod=600 Dec 03 06:55:58 crc kubenswrapper[4475]: I1203 06:55:58.935653 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-webhook-server-7fcb986d4-5fqhn" 
event={"ID":"3d28e2e6-5871-45b6-8e27-5e1f20df1ead","Type":"ContainerStarted","Data":"0919aa2c7e31b5e1464456f1cbc55ada55c3fd8212af73a9d4c02cf7e23540d4"} Dec 03 06:55:58 crc kubenswrapper[4475]: I1203 06:55:58.935782 4475 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/frr-k8s-webhook-server-7fcb986d4-5fqhn" Dec 03 06:55:58 crc kubenswrapper[4475]: I1203 06:55:58.937657 4475 generic.go:334] "Generic (PLEG): container finished" podID="4b61aba1-4e8e-4494-8ecd-3f082e5db4b3" containerID="8a23472daecafeed30a2d6dc1f3a01f571367f45786f8df1b043f8cf5ff541e3" exitCode=0 Dec 03 06:55:58 crc kubenswrapper[4475]: I1203 06:55:58.937737 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-cl9w7" event={"ID":"4b61aba1-4e8e-4494-8ecd-3f082e5db4b3","Type":"ContainerDied","Data":"8a23472daecafeed30a2d6dc1f3a01f571367f45786f8df1b043f8cf5ff541e3"} Dec 03 06:55:58 crc kubenswrapper[4475]: I1203 06:55:58.949531 4475 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/frr-k8s-webhook-server-7fcb986d4-5fqhn" podStartSLOduration=1.39623101 podStartE2EDuration="7.949504809s" podCreationTimestamp="2025-12-03 06:55:51 +0000 UTC" firstStartedPulling="2025-12-03 06:55:52.221976278 +0000 UTC m=+637.026874612" lastFinishedPulling="2025-12-03 06:55:58.775250078 +0000 UTC m=+643.580148411" observedRunningTime="2025-12-03 06:55:58.94750603 +0000 UTC m=+643.752404355" watchObservedRunningTime="2025-12-03 06:55:58.949504809 +0000 UTC m=+643.754403144" Dec 03 06:55:59 crc kubenswrapper[4475]: I1203 06:55:59.945489 4475 generic.go:334] "Generic (PLEG): container finished" podID="4b61aba1-4e8e-4494-8ecd-3f082e5db4b3" containerID="ff6c5f78c6dddb2d0ae68d09f07d87d7341edcee5e411b6bf9296cc47cbe6ed6" exitCode=0 Dec 03 06:55:59 crc kubenswrapper[4475]: I1203 06:55:59.945550 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-cl9w7" 
event={"ID":"4b61aba1-4e8e-4494-8ecd-3f082e5db4b3","Type":"ContainerDied","Data":"ff6c5f78c6dddb2d0ae68d09f07d87d7341edcee5e411b6bf9296cc47cbe6ed6"} Dec 03 06:55:59 crc kubenswrapper[4475]: I1203 06:55:59.948716 4475 generic.go:334] "Generic (PLEG): container finished" podID="91aee7be-4a52-4598-803f-2deebe0674de" containerID="0a13f575406937575ec2819856647e11d9c4ccb5a9ec17bf4568eec9af01a7ba" exitCode=0 Dec 03 06:55:59 crc kubenswrapper[4475]: I1203 06:55:59.948792 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-tjbzg" event={"ID":"91aee7be-4a52-4598-803f-2deebe0674de","Type":"ContainerDied","Data":"0a13f575406937575ec2819856647e11d9c4ccb5a9ec17bf4568eec9af01a7ba"} Dec 03 06:55:59 crc kubenswrapper[4475]: I1203 06:55:59.948835 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-tjbzg" event={"ID":"91aee7be-4a52-4598-803f-2deebe0674de","Type":"ContainerStarted","Data":"9e442459db76920abc97188abe20663d3f8869ff7e3f567458064e516a3ad52c"} Dec 03 06:55:59 crc kubenswrapper[4475]: I1203 06:55:59.948858 4475 scope.go:117] "RemoveContainer" containerID="6de2d401c62c0b82b84c560e7fbdf0f3aa849cd94b4d5542285bedcc76efb375" Dec 03 06:56:00 crc kubenswrapper[4475]: I1203 06:56:00.956985 4475 generic.go:334] "Generic (PLEG): container finished" podID="4b61aba1-4e8e-4494-8ecd-3f082e5db4b3" containerID="8904f0b09a638311ff6ac5413da3d5ec9cd260300284ebdcf69c6381af1bc295" exitCode=0 Dec 03 06:56:00 crc kubenswrapper[4475]: I1203 06:56:00.957183 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-cl9w7" event={"ID":"4b61aba1-4e8e-4494-8ecd-3f082e5db4b3","Type":"ContainerDied","Data":"8904f0b09a638311ff6ac5413da3d5ec9cd260300284ebdcf69c6381af1bc295"} Dec 03 06:56:01 crc kubenswrapper[4475]: I1203 06:56:01.971591 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-cl9w7" 
event={"ID":"4b61aba1-4e8e-4494-8ecd-3f082e5db4b3","Type":"ContainerStarted","Data":"d116121601362955fb00082e132927c4335a0b7f2037edc2566b28bc9d9f151c"} Dec 03 06:56:01 crc kubenswrapper[4475]: I1203 06:56:01.972006 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-cl9w7" event={"ID":"4b61aba1-4e8e-4494-8ecd-3f082e5db4b3","Type":"ContainerStarted","Data":"0be4d6d41738df90f47561468d02cf2cb146dd005a51c54cc27addd0777d11ae"} Dec 03 06:56:01 crc kubenswrapper[4475]: I1203 06:56:01.972022 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-cl9w7" event={"ID":"4b61aba1-4e8e-4494-8ecd-3f082e5db4b3","Type":"ContainerStarted","Data":"667853930f0c597a2aa37d4f97fe1c5e3fda8bc64387ad6d77c6a989f2f97aff"} Dec 03 06:56:01 crc kubenswrapper[4475]: I1203 06:56:01.972033 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-cl9w7" event={"ID":"4b61aba1-4e8e-4494-8ecd-3f082e5db4b3","Type":"ContainerStarted","Data":"2dcbdf7254d0c347136bb281739671688de26b987241beab3edb5d5ac4f1ca2b"} Dec 03 06:56:01 crc kubenswrapper[4475]: I1203 06:56:01.972042 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-cl9w7" event={"ID":"4b61aba1-4e8e-4494-8ecd-3f082e5db4b3","Type":"ContainerStarted","Data":"d31525b9915c200aeae3d04e48239dbc9dfca1dfb31721d877eb4cb2593c389b"} Dec 03 06:56:01 crc kubenswrapper[4475]: I1203 06:56:01.972052 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-cl9w7" event={"ID":"4b61aba1-4e8e-4494-8ecd-3f082e5db4b3","Type":"ContainerStarted","Data":"5d85b4912a6ae579737d2d416f6b2fa19b8a034bf8a05074f298fe6702a1bfe9"} Dec 03 06:56:01 crc kubenswrapper[4475]: I1203 06:56:01.972094 4475 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/frr-k8s-cl9w7" Dec 03 06:56:02 crc kubenswrapper[4475]: I1203 06:56:02.001434 4475 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="metallb-system/frr-k8s-cl9w7" podStartSLOduration=4.760041942 podStartE2EDuration="11.001419138s" podCreationTimestamp="2025-12-03 06:55:51 +0000 UTC" firstStartedPulling="2025-12-03 06:55:52.540295078 +0000 UTC m=+637.345193412" lastFinishedPulling="2025-12-03 06:55:58.781672275 +0000 UTC m=+643.586570608" observedRunningTime="2025-12-03 06:56:02.000800344 +0000 UTC m=+646.805698679" watchObservedRunningTime="2025-12-03 06:56:02.001419138 +0000 UTC m=+646.806317482" Dec 03 06:56:02 crc kubenswrapper[4475]: I1203 06:56:02.464897 4475 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="metallb-system/frr-k8s-cl9w7" Dec 03 06:56:02 crc kubenswrapper[4475]: I1203 06:56:02.498894 4475 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="metallb-system/frr-k8s-cl9w7" Dec 03 06:56:02 crc kubenswrapper[4475]: I1203 06:56:02.542481 4475 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/controller-f8648f98b-9vqjd" Dec 03 06:56:03 crc kubenswrapper[4475]: I1203 06:56:03.426881 4475 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/speaker-w6hc2" Dec 03 06:56:05 crc kubenswrapper[4475]: I1203 06:56:05.439743 4475 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-index-qg9hs"] Dec 03 06:56:05 crc kubenswrapper[4475]: I1203 06:56:05.441536 4475 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-index-qg9hs" Dec 03 06:56:05 crc kubenswrapper[4475]: I1203 06:56:05.443398 4475 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-operators"/"openshift-service-ca.crt" Dec 03 06:56:05 crc kubenswrapper[4475]: I1203 06:56:05.447567 4475 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-operator-index-dockercfg-72sz2" Dec 03 06:56:05 crc kubenswrapper[4475]: I1203 06:56:05.455933 4475 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-index-qg9hs"] Dec 03 06:56:05 crc kubenswrapper[4475]: I1203 06:56:05.463470 4475 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-operators"/"kube-root-ca.crt" Dec 03 06:56:05 crc kubenswrapper[4475]: I1203 06:56:05.540167 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gbst5\" (UniqueName: \"kubernetes.io/projected/ac8ece8a-0806-452f-97a8-c28849b5192a-kube-api-access-gbst5\") pod \"openstack-operator-index-qg9hs\" (UID: \"ac8ece8a-0806-452f-97a8-c28849b5192a\") " pod="openstack-operators/openstack-operator-index-qg9hs" Dec 03 06:56:05 crc kubenswrapper[4475]: I1203 06:56:05.641650 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gbst5\" (UniqueName: \"kubernetes.io/projected/ac8ece8a-0806-452f-97a8-c28849b5192a-kube-api-access-gbst5\") pod \"openstack-operator-index-qg9hs\" (UID: \"ac8ece8a-0806-452f-97a8-c28849b5192a\") " pod="openstack-operators/openstack-operator-index-qg9hs" Dec 03 06:56:05 crc kubenswrapper[4475]: I1203 06:56:05.659624 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gbst5\" (UniqueName: \"kubernetes.io/projected/ac8ece8a-0806-452f-97a8-c28849b5192a-kube-api-access-gbst5\") pod \"openstack-operator-index-qg9hs\" (UID: 
\"ac8ece8a-0806-452f-97a8-c28849b5192a\") " pod="openstack-operators/openstack-operator-index-qg9hs" Dec 03 06:56:05 crc kubenswrapper[4475]: I1203 06:56:05.757369 4475 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-index-qg9hs" Dec 03 06:56:05 crc kubenswrapper[4475]: I1203 06:56:05.939709 4475 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-index-qg9hs"] Dec 03 06:56:05 crc kubenswrapper[4475]: I1203 06:56:05.996781 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-qg9hs" event={"ID":"ac8ece8a-0806-452f-97a8-c28849b5192a","Type":"ContainerStarted","Data":"7b64dc411becbd80738473cbe46d7190c3619135325b81b52fa10b8579b990ba"} Dec 03 06:56:07 crc kubenswrapper[4475]: I1203 06:56:07.004240 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-qg9hs" event={"ID":"ac8ece8a-0806-452f-97a8-c28849b5192a","Type":"ContainerStarted","Data":"9f0fac939e1e9f6f654a02936f424525a39fb4980baef75983fc6a2a36c0c839"} Dec 03 06:56:11 crc kubenswrapper[4475]: I1203 06:56:11.860776 4475 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/frr-k8s-webhook-server-7fcb986d4-5fqhn" Dec 03 06:56:11 crc kubenswrapper[4475]: I1203 06:56:11.872299 4475 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-operator-index-qg9hs" podStartSLOduration=6.138057388 podStartE2EDuration="6.872287313s" podCreationTimestamp="2025-12-03 06:56:05 +0000 UTC" firstStartedPulling="2025-12-03 06:56:05.947038219 +0000 UTC m=+650.751936554" lastFinishedPulling="2025-12-03 06:56:06.681268144 +0000 UTC m=+651.486166479" observedRunningTime="2025-12-03 06:56:07.022033777 +0000 UTC m=+651.826932111" watchObservedRunningTime="2025-12-03 06:56:11.872287313 +0000 UTC m=+656.677185648" Dec 03 06:56:12 crc kubenswrapper[4475]: 
I1203 06:56:12.468302 4475 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/frr-k8s-cl9w7" Dec 03 06:56:15 crc kubenswrapper[4475]: I1203 06:56:15.759047 4475 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-operator-index-qg9hs" Dec 03 06:56:15 crc kubenswrapper[4475]: I1203 06:56:15.760028 4475 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack-operators/openstack-operator-index-qg9hs" Dec 03 06:56:15 crc kubenswrapper[4475]: I1203 06:56:15.779498 4475 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack-operators/openstack-operator-index-qg9hs" Dec 03 06:56:16 crc kubenswrapper[4475]: I1203 06:56:16.077585 4475 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-operator-index-qg9hs" Dec 03 06:56:22 crc kubenswrapper[4475]: I1203 06:56:22.760372 4475 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/98dc3bd0b5c63de8bc52e3558b9d3e72fafafb6fd127fd2510d22068644zhzr"] Dec 03 06:56:22 crc kubenswrapper[4475]: I1203 06:56:22.761556 4475 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/98dc3bd0b5c63de8bc52e3558b9d3e72fafafb6fd127fd2510d22068644zhzr" Dec 03 06:56:22 crc kubenswrapper[4475]: I1203 06:56:22.763206 4475 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"default-dockercfg-h5kg6" Dec 03 06:56:22 crc kubenswrapper[4475]: I1203 06:56:22.769755 4475 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/98dc3bd0b5c63de8bc52e3558b9d3e72fafafb6fd127fd2510d22068644zhzr"] Dec 03 06:56:22 crc kubenswrapper[4475]: I1203 06:56:22.831229 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zll8k\" (UniqueName: \"kubernetes.io/projected/3c328c73-b2cc-453b-9544-4320ab920701-kube-api-access-zll8k\") pod \"98dc3bd0b5c63de8bc52e3558b9d3e72fafafb6fd127fd2510d22068644zhzr\" (UID: \"3c328c73-b2cc-453b-9544-4320ab920701\") " pod="openstack-operators/98dc3bd0b5c63de8bc52e3558b9d3e72fafafb6fd127fd2510d22068644zhzr" Dec 03 06:56:22 crc kubenswrapper[4475]: I1203 06:56:22.831297 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/3c328c73-b2cc-453b-9544-4320ab920701-bundle\") pod \"98dc3bd0b5c63de8bc52e3558b9d3e72fafafb6fd127fd2510d22068644zhzr\" (UID: \"3c328c73-b2cc-453b-9544-4320ab920701\") " pod="openstack-operators/98dc3bd0b5c63de8bc52e3558b9d3e72fafafb6fd127fd2510d22068644zhzr" Dec 03 06:56:22 crc kubenswrapper[4475]: I1203 06:56:22.831329 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/3c328c73-b2cc-453b-9544-4320ab920701-util\") pod \"98dc3bd0b5c63de8bc52e3558b9d3e72fafafb6fd127fd2510d22068644zhzr\" (UID: \"3c328c73-b2cc-453b-9544-4320ab920701\") " pod="openstack-operators/98dc3bd0b5c63de8bc52e3558b9d3e72fafafb6fd127fd2510d22068644zhzr" Dec 03 06:56:22 crc kubenswrapper[4475]: I1203 
06:56:22.932913 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/3c328c73-b2cc-453b-9544-4320ab920701-bundle\") pod \"98dc3bd0b5c63de8bc52e3558b9d3e72fafafb6fd127fd2510d22068644zhzr\" (UID: \"3c328c73-b2cc-453b-9544-4320ab920701\") " pod="openstack-operators/98dc3bd0b5c63de8bc52e3558b9d3e72fafafb6fd127fd2510d22068644zhzr" Dec 03 06:56:22 crc kubenswrapper[4475]: I1203 06:56:22.932965 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/3c328c73-b2cc-453b-9544-4320ab920701-util\") pod \"98dc3bd0b5c63de8bc52e3558b9d3e72fafafb6fd127fd2510d22068644zhzr\" (UID: \"3c328c73-b2cc-453b-9544-4320ab920701\") " pod="openstack-operators/98dc3bd0b5c63de8bc52e3558b9d3e72fafafb6fd127fd2510d22068644zhzr" Dec 03 06:56:22 crc kubenswrapper[4475]: I1203 06:56:22.933010 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zll8k\" (UniqueName: \"kubernetes.io/projected/3c328c73-b2cc-453b-9544-4320ab920701-kube-api-access-zll8k\") pod \"98dc3bd0b5c63de8bc52e3558b9d3e72fafafb6fd127fd2510d22068644zhzr\" (UID: \"3c328c73-b2cc-453b-9544-4320ab920701\") " pod="openstack-operators/98dc3bd0b5c63de8bc52e3558b9d3e72fafafb6fd127fd2510d22068644zhzr" Dec 03 06:56:22 crc kubenswrapper[4475]: I1203 06:56:22.933335 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/3c328c73-b2cc-453b-9544-4320ab920701-util\") pod \"98dc3bd0b5c63de8bc52e3558b9d3e72fafafb6fd127fd2510d22068644zhzr\" (UID: \"3c328c73-b2cc-453b-9544-4320ab920701\") " pod="openstack-operators/98dc3bd0b5c63de8bc52e3558b9d3e72fafafb6fd127fd2510d22068644zhzr" Dec 03 06:56:22 crc kubenswrapper[4475]: I1203 06:56:22.933386 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: 
\"kubernetes.io/empty-dir/3c328c73-b2cc-453b-9544-4320ab920701-bundle\") pod \"98dc3bd0b5c63de8bc52e3558b9d3e72fafafb6fd127fd2510d22068644zhzr\" (UID: \"3c328c73-b2cc-453b-9544-4320ab920701\") " pod="openstack-operators/98dc3bd0b5c63de8bc52e3558b9d3e72fafafb6fd127fd2510d22068644zhzr" Dec 03 06:56:22 crc kubenswrapper[4475]: I1203 06:56:22.946898 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zll8k\" (UniqueName: \"kubernetes.io/projected/3c328c73-b2cc-453b-9544-4320ab920701-kube-api-access-zll8k\") pod \"98dc3bd0b5c63de8bc52e3558b9d3e72fafafb6fd127fd2510d22068644zhzr\" (UID: \"3c328c73-b2cc-453b-9544-4320ab920701\") " pod="openstack-operators/98dc3bd0b5c63de8bc52e3558b9d3e72fafafb6fd127fd2510d22068644zhzr" Dec 03 06:56:23 crc kubenswrapper[4475]: I1203 06:56:23.075063 4475 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/98dc3bd0b5c63de8bc52e3558b9d3e72fafafb6fd127fd2510d22068644zhzr" Dec 03 06:56:23 crc kubenswrapper[4475]: I1203 06:56:23.409845 4475 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/98dc3bd0b5c63de8bc52e3558b9d3e72fafafb6fd127fd2510d22068644zhzr"] Dec 03 06:56:24 crc kubenswrapper[4475]: I1203 06:56:24.099814 4475 generic.go:334] "Generic (PLEG): container finished" podID="3c328c73-b2cc-453b-9544-4320ab920701" containerID="00a139b5d4939a120f888e201c051300931de29011ff271d7371b5625715d718" exitCode=0 Dec 03 06:56:24 crc kubenswrapper[4475]: I1203 06:56:24.099849 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/98dc3bd0b5c63de8bc52e3558b9d3e72fafafb6fd127fd2510d22068644zhzr" event={"ID":"3c328c73-b2cc-453b-9544-4320ab920701","Type":"ContainerDied","Data":"00a139b5d4939a120f888e201c051300931de29011ff271d7371b5625715d718"} Dec 03 06:56:24 crc kubenswrapper[4475]: I1203 06:56:24.100025 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack-operators/98dc3bd0b5c63de8bc52e3558b9d3e72fafafb6fd127fd2510d22068644zhzr" event={"ID":"3c328c73-b2cc-453b-9544-4320ab920701","Type":"ContainerStarted","Data":"c5411f8042406f1a8277ef5f8d433a6864f4d7c52e42c4b79fba08cbeccb8727"} Dec 03 06:56:25 crc kubenswrapper[4475]: I1203 06:56:25.105305 4475 generic.go:334] "Generic (PLEG): container finished" podID="3c328c73-b2cc-453b-9544-4320ab920701" containerID="3874bfe8c3bc3ec8870cb909aece6a67b2436482508cdf8af5d8a8e6576836b1" exitCode=0 Dec 03 06:56:25 crc kubenswrapper[4475]: I1203 06:56:25.105339 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/98dc3bd0b5c63de8bc52e3558b9d3e72fafafb6fd127fd2510d22068644zhzr" event={"ID":"3c328c73-b2cc-453b-9544-4320ab920701","Type":"ContainerDied","Data":"3874bfe8c3bc3ec8870cb909aece6a67b2436482508cdf8af5d8a8e6576836b1"} Dec 03 06:56:26 crc kubenswrapper[4475]: I1203 06:56:26.111833 4475 generic.go:334] "Generic (PLEG): container finished" podID="3c328c73-b2cc-453b-9544-4320ab920701" containerID="309d6e5ddfa7f2b27678c0342dfd5cd17ec81f20fbbf5d266eadcda4096e24f9" exitCode=0 Dec 03 06:56:26 crc kubenswrapper[4475]: I1203 06:56:26.111935 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/98dc3bd0b5c63de8bc52e3558b9d3e72fafafb6fd127fd2510d22068644zhzr" event={"ID":"3c328c73-b2cc-453b-9544-4320ab920701","Type":"ContainerDied","Data":"309d6e5ddfa7f2b27678c0342dfd5cd17ec81f20fbbf5d266eadcda4096e24f9"} Dec 03 06:56:27 crc kubenswrapper[4475]: I1203 06:56:27.295606 4475 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/98dc3bd0b5c63de8bc52e3558b9d3e72fafafb6fd127fd2510d22068644zhzr" Dec 03 06:56:27 crc kubenswrapper[4475]: I1203 06:56:27.380711 4475 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zll8k\" (UniqueName: \"kubernetes.io/projected/3c328c73-b2cc-453b-9544-4320ab920701-kube-api-access-zll8k\") pod \"3c328c73-b2cc-453b-9544-4320ab920701\" (UID: \"3c328c73-b2cc-453b-9544-4320ab920701\") " Dec 03 06:56:27 crc kubenswrapper[4475]: I1203 06:56:27.380761 4475 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/3c328c73-b2cc-453b-9544-4320ab920701-util\") pod \"3c328c73-b2cc-453b-9544-4320ab920701\" (UID: \"3c328c73-b2cc-453b-9544-4320ab920701\") " Dec 03 06:56:27 crc kubenswrapper[4475]: I1203 06:56:27.380787 4475 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/3c328c73-b2cc-453b-9544-4320ab920701-bundle\") pod \"3c328c73-b2cc-453b-9544-4320ab920701\" (UID: \"3c328c73-b2cc-453b-9544-4320ab920701\") " Dec 03 06:56:27 crc kubenswrapper[4475]: I1203 06:56:27.381401 4475 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3c328c73-b2cc-453b-9544-4320ab920701-bundle" (OuterVolumeSpecName: "bundle") pod "3c328c73-b2cc-453b-9544-4320ab920701" (UID: "3c328c73-b2cc-453b-9544-4320ab920701"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 06:56:27 crc kubenswrapper[4475]: I1203 06:56:27.384702 4475 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3c328c73-b2cc-453b-9544-4320ab920701-kube-api-access-zll8k" (OuterVolumeSpecName: "kube-api-access-zll8k") pod "3c328c73-b2cc-453b-9544-4320ab920701" (UID: "3c328c73-b2cc-453b-9544-4320ab920701"). InnerVolumeSpecName "kube-api-access-zll8k". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 06:56:27 crc kubenswrapper[4475]: I1203 06:56:27.390785 4475 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3c328c73-b2cc-453b-9544-4320ab920701-util" (OuterVolumeSpecName: "util") pod "3c328c73-b2cc-453b-9544-4320ab920701" (UID: "3c328c73-b2cc-453b-9544-4320ab920701"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 06:56:27 crc kubenswrapper[4475]: I1203 06:56:27.481989 4475 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zll8k\" (UniqueName: \"kubernetes.io/projected/3c328c73-b2cc-453b-9544-4320ab920701-kube-api-access-zll8k\") on node \"crc\" DevicePath \"\"" Dec 03 06:56:27 crc kubenswrapper[4475]: I1203 06:56:27.482021 4475 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/3c328c73-b2cc-453b-9544-4320ab920701-util\") on node \"crc\" DevicePath \"\"" Dec 03 06:56:27 crc kubenswrapper[4475]: I1203 06:56:27.482031 4475 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/3c328c73-b2cc-453b-9544-4320ab920701-bundle\") on node \"crc\" DevicePath \"\"" Dec 03 06:56:28 crc kubenswrapper[4475]: I1203 06:56:28.122554 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/98dc3bd0b5c63de8bc52e3558b9d3e72fafafb6fd127fd2510d22068644zhzr" event={"ID":"3c328c73-b2cc-453b-9544-4320ab920701","Type":"ContainerDied","Data":"c5411f8042406f1a8277ef5f8d433a6864f4d7c52e42c4b79fba08cbeccb8727"} Dec 03 06:56:28 crc kubenswrapper[4475]: I1203 06:56:28.122763 4475 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c5411f8042406f1a8277ef5f8d433a6864f4d7c52e42c4b79fba08cbeccb8727" Dec 03 06:56:28 crc kubenswrapper[4475]: I1203 06:56:28.122588 4475 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/98dc3bd0b5c63de8bc52e3558b9d3e72fafafb6fd127fd2510d22068644zhzr" Dec 03 06:56:30 crc kubenswrapper[4475]: I1203 06:56:30.248375 4475 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-controller-operator-7dd5c7bb7c-8lf58"] Dec 03 06:56:30 crc kubenswrapper[4475]: E1203 06:56:30.248774 4475 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3c328c73-b2cc-453b-9544-4320ab920701" containerName="extract" Dec 03 06:56:30 crc kubenswrapper[4475]: I1203 06:56:30.248787 4475 state_mem.go:107] "Deleted CPUSet assignment" podUID="3c328c73-b2cc-453b-9544-4320ab920701" containerName="extract" Dec 03 06:56:30 crc kubenswrapper[4475]: E1203 06:56:30.248796 4475 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3c328c73-b2cc-453b-9544-4320ab920701" containerName="util" Dec 03 06:56:30 crc kubenswrapper[4475]: I1203 06:56:30.248801 4475 state_mem.go:107] "Deleted CPUSet assignment" podUID="3c328c73-b2cc-453b-9544-4320ab920701" containerName="util" Dec 03 06:56:30 crc kubenswrapper[4475]: E1203 06:56:30.248819 4475 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3c328c73-b2cc-453b-9544-4320ab920701" containerName="pull" Dec 03 06:56:30 crc kubenswrapper[4475]: I1203 06:56:30.248825 4475 state_mem.go:107] "Deleted CPUSet assignment" podUID="3c328c73-b2cc-453b-9544-4320ab920701" containerName="pull" Dec 03 06:56:30 crc kubenswrapper[4475]: I1203 06:56:30.248923 4475 memory_manager.go:354] "RemoveStaleState removing state" podUID="3c328c73-b2cc-453b-9544-4320ab920701" containerName="extract" Dec 03 06:56:30 crc kubenswrapper[4475]: I1203 06:56:30.249247 4475 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-controller-operator-7dd5c7bb7c-8lf58" Dec 03 06:56:30 crc kubenswrapper[4475]: I1203 06:56:30.261741 4475 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-operator-controller-operator-dockercfg-thvmc" Dec 03 06:56:30 crc kubenswrapper[4475]: I1203 06:56:30.275983 4475 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-operator-7dd5c7bb7c-8lf58"] Dec 03 06:56:30 crc kubenswrapper[4475]: I1203 06:56:30.310816 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j6h8w\" (UniqueName: \"kubernetes.io/projected/0b339176-2688-42cf-be82-09a2400c76bc-kube-api-access-j6h8w\") pod \"openstack-operator-controller-operator-7dd5c7bb7c-8lf58\" (UID: \"0b339176-2688-42cf-be82-09a2400c76bc\") " pod="openstack-operators/openstack-operator-controller-operator-7dd5c7bb7c-8lf58" Dec 03 06:56:30 crc kubenswrapper[4475]: I1203 06:56:30.412079 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-j6h8w\" (UniqueName: \"kubernetes.io/projected/0b339176-2688-42cf-be82-09a2400c76bc-kube-api-access-j6h8w\") pod \"openstack-operator-controller-operator-7dd5c7bb7c-8lf58\" (UID: \"0b339176-2688-42cf-be82-09a2400c76bc\") " pod="openstack-operators/openstack-operator-controller-operator-7dd5c7bb7c-8lf58" Dec 03 06:56:30 crc kubenswrapper[4475]: I1203 06:56:30.431106 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-j6h8w\" (UniqueName: \"kubernetes.io/projected/0b339176-2688-42cf-be82-09a2400c76bc-kube-api-access-j6h8w\") pod \"openstack-operator-controller-operator-7dd5c7bb7c-8lf58\" (UID: \"0b339176-2688-42cf-be82-09a2400c76bc\") " pod="openstack-operators/openstack-operator-controller-operator-7dd5c7bb7c-8lf58" Dec 03 06:56:30 crc kubenswrapper[4475]: I1203 06:56:30.561664 4475 
util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-controller-operator-7dd5c7bb7c-8lf58" Dec 03 06:56:30 crc kubenswrapper[4475]: I1203 06:56:30.920042 4475 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-operator-7dd5c7bb7c-8lf58"] Dec 03 06:56:31 crc kubenswrapper[4475]: I1203 06:56:31.137070 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-operator-7dd5c7bb7c-8lf58" event={"ID":"0b339176-2688-42cf-be82-09a2400c76bc","Type":"ContainerStarted","Data":"13c15ebae52dacf18f68d2c156fdd98dce31063ed79ee923faf061c8226c901a"} Dec 03 06:56:35 crc kubenswrapper[4475]: I1203 06:56:35.155548 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-operator-7dd5c7bb7c-8lf58" event={"ID":"0b339176-2688-42cf-be82-09a2400c76bc","Type":"ContainerStarted","Data":"821f5be37c404b830d42a28da6aac6fb9d11fde438956b0dcdc8042bc0562e64"} Dec 03 06:56:35 crc kubenswrapper[4475]: I1203 06:56:35.156535 4475 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-operator-controller-operator-7dd5c7bb7c-8lf58" Dec 03 06:56:35 crc kubenswrapper[4475]: I1203 06:56:35.179703 4475 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-operator-controller-operator-7dd5c7bb7c-8lf58" podStartSLOduration=1.7672530850000001 podStartE2EDuration="5.179690142s" podCreationTimestamp="2025-12-03 06:56:30 +0000 UTC" firstStartedPulling="2025-12-03 06:56:30.930231978 +0000 UTC m=+675.735130312" lastFinishedPulling="2025-12-03 06:56:34.342669035 +0000 UTC m=+679.147567369" observedRunningTime="2025-12-03 06:56:35.178400786 +0000 UTC m=+679.983299141" watchObservedRunningTime="2025-12-03 06:56:35.179690142 +0000 UTC m=+679.984588476" Dec 03 06:56:40 crc kubenswrapper[4475]: I1203 
06:56:40.564224 4475 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-operator-controller-operator-7dd5c7bb7c-8lf58" Dec 03 06:56:57 crc kubenswrapper[4475]: I1203 06:56:57.401879 4475 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/cinder-operator-controller-manager-859b6ccc6-w8zbc"] Dec 03 06:56:57 crc kubenswrapper[4475]: I1203 06:56:57.403089 4475 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/cinder-operator-controller-manager-859b6ccc6-w8zbc" Dec 03 06:56:57 crc kubenswrapper[4475]: I1203 06:56:57.404487 4475 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"cinder-operator-controller-manager-dockercfg-xv8vc" Dec 03 06:56:57 crc kubenswrapper[4475]: I1203 06:56:57.405869 4475 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/barbican-operator-controller-manager-7d9dfd778-ddwp9"] Dec 03 06:56:57 crc kubenswrapper[4475]: I1203 06:56:57.406696 4475 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/barbican-operator-controller-manager-7d9dfd778-ddwp9" Dec 03 06:56:57 crc kubenswrapper[4475]: I1203 06:56:57.410337 4475 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"barbican-operator-controller-manager-dockercfg-qw7d5" Dec 03 06:56:57 crc kubenswrapper[4475]: I1203 06:56:57.432353 4475 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/barbican-operator-controller-manager-7d9dfd778-ddwp9"] Dec 03 06:56:57 crc kubenswrapper[4475]: I1203 06:56:57.435162 4475 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/cinder-operator-controller-manager-859b6ccc6-w8zbc"] Dec 03 06:56:57 crc kubenswrapper[4475]: I1203 06:56:57.439513 4475 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/designate-operator-controller-manager-78b4bc895b-dxtxl"] Dec 03 06:56:57 crc kubenswrapper[4475]: I1203 06:56:57.440298 4475 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/designate-operator-controller-manager-78b4bc895b-dxtxl" Dec 03 06:56:57 crc kubenswrapper[4475]: I1203 06:56:57.441635 4475 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"designate-operator-controller-manager-dockercfg-nwfcd" Dec 03 06:56:57 crc kubenswrapper[4475]: I1203 06:56:57.442256 4475 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/glance-operator-controller-manager-77987cd8cd-pfb7t"] Dec 03 06:56:57 crc kubenswrapper[4475]: I1203 06:56:57.443021 4475 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/glance-operator-controller-manager-77987cd8cd-pfb7t" Dec 03 06:56:57 crc kubenswrapper[4475]: I1203 06:56:57.444361 4475 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"glance-operator-controller-manager-dockercfg-k8slg" Dec 03 06:56:57 crc kubenswrapper[4475]: I1203 06:56:57.448626 4475 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/designate-operator-controller-manager-78b4bc895b-dxtxl"] Dec 03 06:56:57 crc kubenswrapper[4475]: I1203 06:56:57.452528 4475 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/glance-operator-controller-manager-77987cd8cd-pfb7t"] Dec 03 06:56:57 crc kubenswrapper[4475]: I1203 06:56:57.517291 4475 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/horizon-operator-controller-manager-68c6d99b8f-wkdz6"] Dec 03 06:56:57 crc kubenswrapper[4475]: I1203 06:56:57.531232 4475 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/infra-operator-controller-manager-57548d458d-7q4rl"] Dec 03 06:56:57 crc kubenswrapper[4475]: I1203 06:56:57.533003 4475 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/horizon-operator-controller-manager-68c6d99b8f-wkdz6" Dec 03 06:56:57 crc kubenswrapper[4475]: I1203 06:56:57.536607 4475 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"horizon-operator-controller-manager-dockercfg-nbbn9" Dec 03 06:56:57 crc kubenswrapper[4475]: I1203 06:56:57.540340 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kfmbl\" (UniqueName: \"kubernetes.io/projected/67f4c1a1-c620-41ff-ab1a-bc603c755c6e-kube-api-access-kfmbl\") pod \"barbican-operator-controller-manager-7d9dfd778-ddwp9\" (UID: \"67f4c1a1-c620-41ff-ab1a-bc603c755c6e\") " pod="openstack-operators/barbican-operator-controller-manager-7d9dfd778-ddwp9" Dec 03 06:56:57 crc kubenswrapper[4475]: I1203 06:56:57.540402 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m2vrz\" (UniqueName: \"kubernetes.io/projected/82ac6fd2-3363-4fa3-901b-90781ae2db4e-kube-api-access-m2vrz\") pod \"designate-operator-controller-manager-78b4bc895b-dxtxl\" (UID: \"82ac6fd2-3363-4fa3-901b-90781ae2db4e\") " pod="openstack-operators/designate-operator-controller-manager-78b4bc895b-dxtxl" Dec 03 06:56:57 crc kubenswrapper[4475]: I1203 06:56:57.540470 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rwwf5\" (UniqueName: \"kubernetes.io/projected/17fcf4b6-1bbf-4bda-a621-f6b563c6d7ae-kube-api-access-rwwf5\") pod \"glance-operator-controller-manager-77987cd8cd-pfb7t\" (UID: \"17fcf4b6-1bbf-4bda-a621-f6b563c6d7ae\") " pod="openstack-operators/glance-operator-controller-manager-77987cd8cd-pfb7t" Dec 03 06:56:57 crc kubenswrapper[4475]: I1203 06:56:57.540507 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ftmcc\" (UniqueName: 
\"kubernetes.io/projected/17bf114e-8421-4eb6-a2be-466746af3f1e-kube-api-access-ftmcc\") pod \"cinder-operator-controller-manager-859b6ccc6-w8zbc\" (UID: \"17bf114e-8421-4eb6-a2be-466746af3f1e\") " pod="openstack-operators/cinder-operator-controller-manager-859b6ccc6-w8zbc" Dec 03 06:56:57 crc kubenswrapper[4475]: I1203 06:56:57.544604 4475 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/infra-operator-controller-manager-57548d458d-7q4rl" Dec 03 06:56:57 crc kubenswrapper[4475]: I1203 06:56:57.552921 4475 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/ironic-operator-controller-manager-6c548fd776-2s7sc"] Dec 03 06:56:57 crc kubenswrapper[4475]: I1203 06:56:57.553202 4475 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"infra-operator-webhook-server-cert" Dec 03 06:56:57 crc kubenswrapper[4475]: I1203 06:56:57.553316 4475 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"infra-operator-controller-manager-dockercfg-rbpbs" Dec 03 06:56:57 crc kubenswrapper[4475]: I1203 06:56:57.553881 4475 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/ironic-operator-controller-manager-6c548fd776-2s7sc" Dec 03 06:56:57 crc kubenswrapper[4475]: I1203 06:56:57.556763 4475 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/heat-operator-controller-manager-5f64f6f8bb-hjzmk"] Dec 03 06:56:57 crc kubenswrapper[4475]: I1203 06:56:57.557728 4475 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/heat-operator-controller-manager-5f64f6f8bb-hjzmk" Dec 03 06:56:57 crc kubenswrapper[4475]: I1203 06:56:57.563979 4475 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"heat-operator-controller-manager-dockercfg-sg9bv" Dec 03 06:56:57 crc kubenswrapper[4475]: I1203 06:56:57.564151 4475 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"ironic-operator-controller-manager-dockercfg-dstbh" Dec 03 06:56:57 crc kubenswrapper[4475]: I1203 06:56:57.571024 4475 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/horizon-operator-controller-manager-68c6d99b8f-wkdz6"] Dec 03 06:56:57 crc kubenswrapper[4475]: I1203 06:56:57.583463 4475 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/infra-operator-controller-manager-57548d458d-7q4rl"] Dec 03 06:56:57 crc kubenswrapper[4475]: I1203 06:56:57.604478 4475 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ironic-operator-controller-manager-6c548fd776-2s7sc"] Dec 03 06:56:57 crc kubenswrapper[4475]: I1203 06:56:57.604514 4475 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/heat-operator-controller-manager-5f64f6f8bb-hjzmk"] Dec 03 06:56:57 crc kubenswrapper[4475]: I1203 06:56:57.619753 4475 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/keystone-operator-controller-manager-7765d96ddf-bg8cq"] Dec 03 06:56:57 crc kubenswrapper[4475]: I1203 06:56:57.620902 4475 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/keystone-operator-controller-manager-7765d96ddf-bg8cq" Dec 03 06:56:57 crc kubenswrapper[4475]: I1203 06:56:57.625381 4475 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/manila-operator-controller-manager-7c79b5df47-xffjq"] Dec 03 06:56:57 crc kubenswrapper[4475]: I1203 06:56:57.626375 4475 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/manila-operator-controller-manager-7c79b5df47-xffjq" Dec 03 06:56:57 crc kubenswrapper[4475]: I1203 06:56:57.626660 4475 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"keystone-operator-controller-manager-dockercfg-7v7rm" Dec 03 06:56:57 crc kubenswrapper[4475]: I1203 06:56:57.629396 4475 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"manila-operator-controller-manager-dockercfg-88gdd" Dec 03 06:56:57 crc kubenswrapper[4475]: I1203 06:56:57.640500 4475 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/manila-operator-controller-manager-7c79b5df47-xffjq"] Dec 03 06:56:57 crc kubenswrapper[4475]: I1203 06:56:57.641426 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kfmbl\" (UniqueName: \"kubernetes.io/projected/67f4c1a1-c620-41ff-ab1a-bc603c755c6e-kube-api-access-kfmbl\") pod \"barbican-operator-controller-manager-7d9dfd778-ddwp9\" (UID: \"67f4c1a1-c620-41ff-ab1a-bc603c755c6e\") " pod="openstack-operators/barbican-operator-controller-manager-7d9dfd778-ddwp9" Dec 03 06:56:57 crc kubenswrapper[4475]: I1203 06:56:57.641487 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zl2rd\" (UniqueName: \"kubernetes.io/projected/3ad9f39b-3080-4828-aa52-20410ce25c66-kube-api-access-zl2rd\") pod \"horizon-operator-controller-manager-68c6d99b8f-wkdz6\" (UID: \"3ad9f39b-3080-4828-aa52-20410ce25c66\") " 
pod="openstack-operators/horizon-operator-controller-manager-68c6d99b8f-wkdz6" Dec 03 06:56:57 crc kubenswrapper[4475]: I1203 06:56:57.641514 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wqnbt\" (UniqueName: \"kubernetes.io/projected/abd2d6fd-7465-4606-aba4-f6e6501e6a39-kube-api-access-wqnbt\") pod \"heat-operator-controller-manager-5f64f6f8bb-hjzmk\" (UID: \"abd2d6fd-7465-4606-aba4-f6e6501e6a39\") " pod="openstack-operators/heat-operator-controller-manager-5f64f6f8bb-hjzmk" Dec 03 06:56:57 crc kubenswrapper[4475]: I1203 06:56:57.641537 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-m2vrz\" (UniqueName: \"kubernetes.io/projected/82ac6fd2-3363-4fa3-901b-90781ae2db4e-kube-api-access-m2vrz\") pod \"designate-operator-controller-manager-78b4bc895b-dxtxl\" (UID: \"82ac6fd2-3363-4fa3-901b-90781ae2db4e\") " pod="openstack-operators/designate-operator-controller-manager-78b4bc895b-dxtxl" Dec 03 06:56:57 crc kubenswrapper[4475]: I1203 06:56:57.641575 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/ea572cb7-8b01-4dcf-b3a4-617f79c6ef0e-cert\") pod \"infra-operator-controller-manager-57548d458d-7q4rl\" (UID: \"ea572cb7-8b01-4dcf-b3a4-617f79c6ef0e\") " pod="openstack-operators/infra-operator-controller-manager-57548d458d-7q4rl" Dec 03 06:56:57 crc kubenswrapper[4475]: I1203 06:56:57.641592 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qhm49\" (UniqueName: \"kubernetes.io/projected/6996078d-75b0-42b4-89af-6b8f8f7be702-kube-api-access-qhm49\") pod \"ironic-operator-controller-manager-6c548fd776-2s7sc\" (UID: \"6996078d-75b0-42b4-89af-6b8f8f7be702\") " pod="openstack-operators/ironic-operator-controller-manager-6c548fd776-2s7sc" Dec 03 06:56:57 crc kubenswrapper[4475]: I1203 06:56:57.641609 
4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h96kf\" (UniqueName: \"kubernetes.io/projected/ea572cb7-8b01-4dcf-b3a4-617f79c6ef0e-kube-api-access-h96kf\") pod \"infra-operator-controller-manager-57548d458d-7q4rl\" (UID: \"ea572cb7-8b01-4dcf-b3a4-617f79c6ef0e\") " pod="openstack-operators/infra-operator-controller-manager-57548d458d-7q4rl" Dec 03 06:56:57 crc kubenswrapper[4475]: I1203 06:56:57.641631 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rwwf5\" (UniqueName: \"kubernetes.io/projected/17fcf4b6-1bbf-4bda-a621-f6b563c6d7ae-kube-api-access-rwwf5\") pod \"glance-operator-controller-manager-77987cd8cd-pfb7t\" (UID: \"17fcf4b6-1bbf-4bda-a621-f6b563c6d7ae\") " pod="openstack-operators/glance-operator-controller-manager-77987cd8cd-pfb7t" Dec 03 06:56:57 crc kubenswrapper[4475]: I1203 06:56:57.641668 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ftmcc\" (UniqueName: \"kubernetes.io/projected/17bf114e-8421-4eb6-a2be-466746af3f1e-kube-api-access-ftmcc\") pod \"cinder-operator-controller-manager-859b6ccc6-w8zbc\" (UID: \"17bf114e-8421-4eb6-a2be-466746af3f1e\") " pod="openstack-operators/cinder-operator-controller-manager-859b6ccc6-w8zbc" Dec 03 06:56:57 crc kubenswrapper[4475]: I1203 06:56:57.644794 4475 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/mariadb-operator-controller-manager-56bbcc9d85-hq2sj"] Dec 03 06:56:57 crc kubenswrapper[4475]: I1203 06:56:57.645654 4475 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/mariadb-operator-controller-manager-56bbcc9d85-hq2sj" Dec 03 06:56:57 crc kubenswrapper[4475]: I1203 06:56:57.656729 4475 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"mariadb-operator-controller-manager-dockercfg-9tmfn" Dec 03 06:56:57 crc kubenswrapper[4475]: I1203 06:56:57.680408 4475 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/keystone-operator-controller-manager-7765d96ddf-bg8cq"] Dec 03 06:56:57 crc kubenswrapper[4475]: I1203 06:56:57.681220 4475 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/mariadb-operator-controller-manager-56bbcc9d85-hq2sj"] Dec 03 06:56:57 crc kubenswrapper[4475]: I1203 06:56:57.686926 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rwwf5\" (UniqueName: \"kubernetes.io/projected/17fcf4b6-1bbf-4bda-a621-f6b563c6d7ae-kube-api-access-rwwf5\") pod \"glance-operator-controller-manager-77987cd8cd-pfb7t\" (UID: \"17fcf4b6-1bbf-4bda-a621-f6b563c6d7ae\") " pod="openstack-operators/glance-operator-controller-manager-77987cd8cd-pfb7t" Dec 03 06:56:57 crc kubenswrapper[4475]: I1203 06:56:57.689551 4475 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/nova-operator-controller-manager-697bc559fc-7hmqk"] Dec 03 06:56:57 crc kubenswrapper[4475]: I1203 06:56:57.690599 4475 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/nova-operator-controller-manager-697bc559fc-7hmqk" Dec 03 06:56:57 crc kubenswrapper[4475]: I1203 06:56:57.699315 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-m2vrz\" (UniqueName: \"kubernetes.io/projected/82ac6fd2-3363-4fa3-901b-90781ae2db4e-kube-api-access-m2vrz\") pod \"designate-operator-controller-manager-78b4bc895b-dxtxl\" (UID: \"82ac6fd2-3363-4fa3-901b-90781ae2db4e\") " pod="openstack-operators/designate-operator-controller-manager-78b4bc895b-dxtxl" Dec 03 06:56:57 crc kubenswrapper[4475]: I1203 06:56:57.711424 4475 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"nova-operator-controller-manager-dockercfg-pdg6r" Dec 03 06:56:57 crc kubenswrapper[4475]: I1203 06:56:57.711896 4475 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/neutron-operator-controller-manager-5fdfd5b6b5-mzq79"] Dec 03 06:56:57 crc kubenswrapper[4475]: I1203 06:56:57.712402 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ftmcc\" (UniqueName: \"kubernetes.io/projected/17bf114e-8421-4eb6-a2be-466746af3f1e-kube-api-access-ftmcc\") pod \"cinder-operator-controller-manager-859b6ccc6-w8zbc\" (UID: \"17bf114e-8421-4eb6-a2be-466746af3f1e\") " pod="openstack-operators/cinder-operator-controller-manager-859b6ccc6-w8zbc" Dec 03 06:56:57 crc kubenswrapper[4475]: I1203 06:56:57.714101 4475 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/neutron-operator-controller-manager-5fdfd5b6b5-mzq79" Dec 03 06:56:57 crc kubenswrapper[4475]: I1203 06:56:57.717022 4475 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"neutron-operator-controller-manager-dockercfg-kcpbc" Dec 03 06:56:57 crc kubenswrapper[4475]: I1203 06:56:57.731526 4475 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/cinder-operator-controller-manager-859b6ccc6-w8zbc" Dec 03 06:56:57 crc kubenswrapper[4475]: I1203 06:56:57.739221 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kfmbl\" (UniqueName: \"kubernetes.io/projected/67f4c1a1-c620-41ff-ab1a-bc603c755c6e-kube-api-access-kfmbl\") pod \"barbican-operator-controller-manager-7d9dfd778-ddwp9\" (UID: \"67f4c1a1-c620-41ff-ab1a-bc603c755c6e\") " pod="openstack-operators/barbican-operator-controller-manager-7d9dfd778-ddwp9" Dec 03 06:56:57 crc kubenswrapper[4475]: I1203 06:56:57.742204 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zl2rd\" (UniqueName: \"kubernetes.io/projected/3ad9f39b-3080-4828-aa52-20410ce25c66-kube-api-access-zl2rd\") pod \"horizon-operator-controller-manager-68c6d99b8f-wkdz6\" (UID: \"3ad9f39b-3080-4828-aa52-20410ce25c66\") " pod="openstack-operators/horizon-operator-controller-manager-68c6d99b8f-wkdz6" Dec 03 06:56:57 crc kubenswrapper[4475]: I1203 06:56:57.742233 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wqnbt\" (UniqueName: \"kubernetes.io/projected/abd2d6fd-7465-4606-aba4-f6e6501e6a39-kube-api-access-wqnbt\") pod \"heat-operator-controller-manager-5f64f6f8bb-hjzmk\" (UID: \"abd2d6fd-7465-4606-aba4-f6e6501e6a39\") " pod="openstack-operators/heat-operator-controller-manager-5f64f6f8bb-hjzmk" Dec 03 06:56:57 crc kubenswrapper[4475]: I1203 06:56:57.742255 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lk4wx\" (UniqueName: \"kubernetes.io/projected/db3ddcfa-8d48-4336-bcc9-04d361dfb8e7-kube-api-access-lk4wx\") pod \"keystone-operator-controller-manager-7765d96ddf-bg8cq\" (UID: \"db3ddcfa-8d48-4336-bcc9-04d361dfb8e7\") " pod="openstack-operators/keystone-operator-controller-manager-7765d96ddf-bg8cq" Dec 03 06:56:57 crc kubenswrapper[4475]: I1203 
06:56:57.742292 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/ea572cb7-8b01-4dcf-b3a4-617f79c6ef0e-cert\") pod \"infra-operator-controller-manager-57548d458d-7q4rl\" (UID: \"ea572cb7-8b01-4dcf-b3a4-617f79c6ef0e\") " pod="openstack-operators/infra-operator-controller-manager-57548d458d-7q4rl" Dec 03 06:56:57 crc kubenswrapper[4475]: I1203 06:56:57.742309 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qhm49\" (UniqueName: \"kubernetes.io/projected/6996078d-75b0-42b4-89af-6b8f8f7be702-kube-api-access-qhm49\") pod \"ironic-operator-controller-manager-6c548fd776-2s7sc\" (UID: \"6996078d-75b0-42b4-89af-6b8f8f7be702\") " pod="openstack-operators/ironic-operator-controller-manager-6c548fd776-2s7sc" Dec 03 06:56:57 crc kubenswrapper[4475]: I1203 06:56:57.742326 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-h96kf\" (UniqueName: \"kubernetes.io/projected/ea572cb7-8b01-4dcf-b3a4-617f79c6ef0e-kube-api-access-h96kf\") pod \"infra-operator-controller-manager-57548d458d-7q4rl\" (UID: \"ea572cb7-8b01-4dcf-b3a4-617f79c6ef0e\") " pod="openstack-operators/infra-operator-controller-manager-57548d458d-7q4rl" Dec 03 06:56:57 crc kubenswrapper[4475]: I1203 06:56:57.742345 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rdb7n\" (UniqueName: \"kubernetes.io/projected/0e10e107-5685-4ab7-a0d0-e2ede376c24e-kube-api-access-rdb7n\") pod \"manila-operator-controller-manager-7c79b5df47-xffjq\" (UID: \"0e10e107-5685-4ab7-a0d0-e2ede376c24e\") " pod="openstack-operators/manila-operator-controller-manager-7c79b5df47-xffjq" Dec 03 06:56:57 crc kubenswrapper[4475]: I1203 06:56:57.742372 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tzbjj\" (UniqueName: 
\"kubernetes.io/projected/4f41afa1-2e04-40ef-99ab-51303689a06d-kube-api-access-tzbjj\") pod \"mariadb-operator-controller-manager-56bbcc9d85-hq2sj\" (UID: \"4f41afa1-2e04-40ef-99ab-51303689a06d\") " pod="openstack-operators/mariadb-operator-controller-manager-56bbcc9d85-hq2sj" Dec 03 06:56:57 crc kubenswrapper[4475]: E1203 06:56:57.742758 4475 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found Dec 03 06:56:57 crc kubenswrapper[4475]: E1203 06:56:57.742811 4475 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ea572cb7-8b01-4dcf-b3a4-617f79c6ef0e-cert podName:ea572cb7-8b01-4dcf-b3a4-617f79c6ef0e nodeName:}" failed. No retries permitted until 2025-12-03 06:56:58.242791008 +0000 UTC m=+703.047689342 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/ea572cb7-8b01-4dcf-b3a4-617f79c6ef0e-cert") pod "infra-operator-controller-manager-57548d458d-7q4rl" (UID: "ea572cb7-8b01-4dcf-b3a4-617f79c6ef0e") : secret "infra-operator-webhook-server-cert" not found Dec 03 06:56:57 crc kubenswrapper[4475]: I1203 06:56:57.753604 4475 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/designate-operator-controller-manager-78b4bc895b-dxtxl" Dec 03 06:56:57 crc kubenswrapper[4475]: I1203 06:56:57.758519 4475 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/glance-operator-controller-manager-77987cd8cd-pfb7t" Dec 03 06:56:57 crc kubenswrapper[4475]: I1203 06:56:57.786658 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wqnbt\" (UniqueName: \"kubernetes.io/projected/abd2d6fd-7465-4606-aba4-f6e6501e6a39-kube-api-access-wqnbt\") pod \"heat-operator-controller-manager-5f64f6f8bb-hjzmk\" (UID: \"abd2d6fd-7465-4606-aba4-f6e6501e6a39\") " pod="openstack-operators/heat-operator-controller-manager-5f64f6f8bb-hjzmk" Dec 03 06:56:57 crc kubenswrapper[4475]: I1203 06:56:57.789808 4475 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/nova-operator-controller-manager-697bc559fc-7hmqk"] Dec 03 06:56:57 crc kubenswrapper[4475]: I1203 06:56:57.789883 4475 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/neutron-operator-controller-manager-5fdfd5b6b5-mzq79"] Dec 03 06:56:57 crc kubenswrapper[4475]: I1203 06:56:57.789897 4475 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/octavia-operator-controller-manager-998648c74-qj57b"] Dec 03 06:56:57 crc kubenswrapper[4475]: I1203 06:56:57.790047 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-h96kf\" (UniqueName: \"kubernetes.io/projected/ea572cb7-8b01-4dcf-b3a4-617f79c6ef0e-kube-api-access-h96kf\") pod \"infra-operator-controller-manager-57548d458d-7q4rl\" (UID: \"ea572cb7-8b01-4dcf-b3a4-617f79c6ef0e\") " pod="openstack-operators/infra-operator-controller-manager-57548d458d-7q4rl" Dec 03 06:56:57 crc kubenswrapper[4475]: I1203 06:56:57.791123 4475 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/octavia-operator-controller-manager-998648c74-qj57b"] Dec 03 06:56:57 crc kubenswrapper[4475]: I1203 06:56:57.791208 4475 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/octavia-operator-controller-manager-998648c74-qj57b" Dec 03 06:56:57 crc kubenswrapper[4475]: I1203 06:56:57.796727 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zl2rd\" (UniqueName: \"kubernetes.io/projected/3ad9f39b-3080-4828-aa52-20410ce25c66-kube-api-access-zl2rd\") pod \"horizon-operator-controller-manager-68c6d99b8f-wkdz6\" (UID: \"3ad9f39b-3080-4828-aa52-20410ce25c66\") " pod="openstack-operators/horizon-operator-controller-manager-68c6d99b8f-wkdz6" Dec 03 06:56:57 crc kubenswrapper[4475]: I1203 06:56:57.797269 4475 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"octavia-operator-controller-manager-dockercfg-md6vb" Dec 03 06:56:57 crc kubenswrapper[4475]: I1203 06:56:57.802644 4475 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-baremetal-operator-controller-manager-55d86b6686c8cxj"] Dec 03 06:56:57 crc kubenswrapper[4475]: I1203 06:56:57.803892 4475 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-baremetal-operator-controller-manager-55d86b6686c8cxj" Dec 03 06:56:57 crc kubenswrapper[4475]: I1203 06:56:57.816019 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qhm49\" (UniqueName: \"kubernetes.io/projected/6996078d-75b0-42b4-89af-6b8f8f7be702-kube-api-access-qhm49\") pod \"ironic-operator-controller-manager-6c548fd776-2s7sc\" (UID: \"6996078d-75b0-42b4-89af-6b8f8f7be702\") " pod="openstack-operators/ironic-operator-controller-manager-6c548fd776-2s7sc" Dec 03 06:56:57 crc kubenswrapper[4475]: I1203 06:56:57.818186 4475 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-baremetal-operator-webhook-server-cert" Dec 03 06:56:57 crc kubenswrapper[4475]: I1203 06:56:57.818522 4475 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-baremetal-operator-controller-manager-dockercfg-qwn7j" Dec 03 06:56:57 crc kubenswrapper[4475]: I1203 06:56:57.831735 4475 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/ovn-operator-controller-manager-b6456fdb6-8fnp4"] Dec 03 06:56:57 crc kubenswrapper[4475]: I1203 06:56:57.833470 4475 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/ovn-operator-controller-manager-b6456fdb6-8fnp4" Dec 03 06:56:57 crc kubenswrapper[4475]: I1203 06:56:57.835884 4475 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"ovn-operator-controller-manager-dockercfg-sf66q" Dec 03 06:56:57 crc kubenswrapper[4475]: I1203 06:56:57.841532 4475 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/placement-operator-controller-manager-78f8948974-kmfjk"] Dec 03 06:56:57 crc kubenswrapper[4475]: I1203 06:56:57.842498 4475 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/placement-operator-controller-manager-78f8948974-kmfjk" Dec 03 06:56:57 crc kubenswrapper[4475]: I1203 06:56:57.843309 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lk4wx\" (UniqueName: \"kubernetes.io/projected/db3ddcfa-8d48-4336-bcc9-04d361dfb8e7-kube-api-access-lk4wx\") pod \"keystone-operator-controller-manager-7765d96ddf-bg8cq\" (UID: \"db3ddcfa-8d48-4336-bcc9-04d361dfb8e7\") " pod="openstack-operators/keystone-operator-controller-manager-7765d96ddf-bg8cq" Dec 03 06:56:57 crc kubenswrapper[4475]: I1203 06:56:57.843361 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6srkb\" (UniqueName: \"kubernetes.io/projected/576ef351-044d-41c7-8292-4a81ff83296b-kube-api-access-6srkb\") pod \"neutron-operator-controller-manager-5fdfd5b6b5-mzq79\" (UID: \"576ef351-044d-41c7-8292-4a81ff83296b\") " pod="openstack-operators/neutron-operator-controller-manager-5fdfd5b6b5-mzq79" Dec 03 06:56:57 crc kubenswrapper[4475]: I1203 06:56:57.843386 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rdb7n\" (UniqueName: \"kubernetes.io/projected/0e10e107-5685-4ab7-a0d0-e2ede376c24e-kube-api-access-rdb7n\") pod \"manila-operator-controller-manager-7c79b5df47-xffjq\" (UID: \"0e10e107-5685-4ab7-a0d0-e2ede376c24e\") " pod="openstack-operators/manila-operator-controller-manager-7c79b5df47-xffjq" Dec 03 06:56:57 crc kubenswrapper[4475]: I1203 06:56:57.843413 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tzbjj\" (UniqueName: \"kubernetes.io/projected/4f41afa1-2e04-40ef-99ab-51303689a06d-kube-api-access-tzbjj\") pod \"mariadb-operator-controller-manager-56bbcc9d85-hq2sj\" (UID: \"4f41afa1-2e04-40ef-99ab-51303689a06d\") " pod="openstack-operators/mariadb-operator-controller-manager-56bbcc9d85-hq2sj" Dec 03 06:56:57 crc 
kubenswrapper[4475]: I1203 06:56:57.843430 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8nxr7\" (UniqueName: \"kubernetes.io/projected/8a43730c-24cb-4520-893b-0b4c4750cce8-kube-api-access-8nxr7\") pod \"nova-operator-controller-manager-697bc559fc-7hmqk\" (UID: \"8a43730c-24cb-4520-893b-0b4c4750cce8\") " pod="openstack-operators/nova-operator-controller-manager-697bc559fc-7hmqk" Dec 03 06:56:57 crc kubenswrapper[4475]: I1203 06:56:57.846129 4475 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"placement-operator-controller-manager-dockercfg-7sk7k" Dec 03 06:56:57 crc kubenswrapper[4475]: I1203 06:56:57.855351 4475 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ovn-operator-controller-manager-b6456fdb6-8fnp4"] Dec 03 06:56:57 crc kubenswrapper[4475]: I1203 06:56:57.856027 4475 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/horizon-operator-controller-manager-68c6d99b8f-wkdz6" Dec 03 06:56:57 crc kubenswrapper[4475]: I1203 06:56:57.861617 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lk4wx\" (UniqueName: \"kubernetes.io/projected/db3ddcfa-8d48-4336-bcc9-04d361dfb8e7-kube-api-access-lk4wx\") pod \"keystone-operator-controller-manager-7765d96ddf-bg8cq\" (UID: \"db3ddcfa-8d48-4336-bcc9-04d361dfb8e7\") " pod="openstack-operators/keystone-operator-controller-manager-7765d96ddf-bg8cq" Dec 03 06:56:57 crc kubenswrapper[4475]: I1203 06:56:57.869062 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rdb7n\" (UniqueName: \"kubernetes.io/projected/0e10e107-5685-4ab7-a0d0-e2ede376c24e-kube-api-access-rdb7n\") pod \"manila-operator-controller-manager-7c79b5df47-xffjq\" (UID: \"0e10e107-5685-4ab7-a0d0-e2ede376c24e\") " pod="openstack-operators/manila-operator-controller-manager-7c79b5df47-xffjq" Dec 03 06:56:57 crc 
kubenswrapper[4475]: I1203 06:56:57.886433 4475 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/ironic-operator-controller-manager-6c548fd776-2s7sc" Dec 03 06:56:57 crc kubenswrapper[4475]: I1203 06:56:57.901127 4475 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/heat-operator-controller-manager-5f64f6f8bb-hjzmk" Dec 03 06:56:57 crc kubenswrapper[4475]: I1203 06:56:57.905474 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tzbjj\" (UniqueName: \"kubernetes.io/projected/4f41afa1-2e04-40ef-99ab-51303689a06d-kube-api-access-tzbjj\") pod \"mariadb-operator-controller-manager-56bbcc9d85-hq2sj\" (UID: \"4f41afa1-2e04-40ef-99ab-51303689a06d\") " pod="openstack-operators/mariadb-operator-controller-manager-56bbcc9d85-hq2sj" Dec 03 06:56:57 crc kubenswrapper[4475]: I1203 06:56:57.934507 4475 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/swift-operator-controller-manager-5f8c65bbfc-vgtpk"] Dec 03 06:56:57 crc kubenswrapper[4475]: I1203 06:56:57.936712 4475 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/swift-operator-controller-manager-5f8c65bbfc-vgtpk" Dec 03 06:56:57 crc kubenswrapper[4475]: I1203 06:56:57.936852 4475 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/keystone-operator-controller-manager-7765d96ddf-bg8cq" Dec 03 06:56:57 crc kubenswrapper[4475]: I1203 06:56:57.939039 4475 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"swift-operator-controller-manager-dockercfg-2lmbl" Dec 03 06:56:57 crc kubenswrapper[4475]: I1203 06:56:57.944955 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/033c67ed-2564-4f1c-a477-ac9b901c6b0d-cert\") pod \"openstack-baremetal-operator-controller-manager-55d86b6686c8cxj\" (UID: \"033c67ed-2564-4f1c-a477-ac9b901c6b0d\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-55d86b6686c8cxj" Dec 03 06:56:57 crc kubenswrapper[4475]: I1203 06:56:57.945004 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-flccd\" (UniqueName: \"kubernetes.io/projected/542d827b-14f9-4d32-bf00-1ff3749b9cb9-kube-api-access-flccd\") pod \"ovn-operator-controller-manager-b6456fdb6-8fnp4\" (UID: \"542d827b-14f9-4d32-bf00-1ff3749b9cb9\") " pod="openstack-operators/ovn-operator-controller-manager-b6456fdb6-8fnp4" Dec 03 06:56:57 crc kubenswrapper[4475]: I1203 06:56:57.945066 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6srkb\" (UniqueName: \"kubernetes.io/projected/576ef351-044d-41c7-8292-4a81ff83296b-kube-api-access-6srkb\") pod \"neutron-operator-controller-manager-5fdfd5b6b5-mzq79\" (UID: \"576ef351-044d-41c7-8292-4a81ff83296b\") " pod="openstack-operators/neutron-operator-controller-manager-5fdfd5b6b5-mzq79" Dec 03 06:56:57 crc kubenswrapper[4475]: I1203 06:56:57.945107 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nbp49\" (UniqueName: \"kubernetes.io/projected/bb78bd55-770d-43ca-a8a2-8e22819004f9-kube-api-access-nbp49\") pod 
\"placement-operator-controller-manager-78f8948974-kmfjk\" (UID: \"bb78bd55-770d-43ca-a8a2-8e22819004f9\") " pod="openstack-operators/placement-operator-controller-manager-78f8948974-kmfjk" Dec 03 06:56:57 crc kubenswrapper[4475]: I1203 06:56:57.945128 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xh9wj\" (UniqueName: \"kubernetes.io/projected/033c67ed-2564-4f1c-a477-ac9b901c6b0d-kube-api-access-xh9wj\") pod \"openstack-baremetal-operator-controller-manager-55d86b6686c8cxj\" (UID: \"033c67ed-2564-4f1c-a477-ac9b901c6b0d\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-55d86b6686c8cxj" Dec 03 06:56:57 crc kubenswrapper[4475]: I1203 06:56:57.945173 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8nxr7\" (UniqueName: \"kubernetes.io/projected/8a43730c-24cb-4520-893b-0b4c4750cce8-kube-api-access-8nxr7\") pod \"nova-operator-controller-manager-697bc559fc-7hmqk\" (UID: \"8a43730c-24cb-4520-893b-0b4c4750cce8\") " pod="openstack-operators/nova-operator-controller-manager-697bc559fc-7hmqk" Dec 03 06:56:57 crc kubenswrapper[4475]: I1203 06:56:57.945208 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-56dvr\" (UniqueName: \"kubernetes.io/projected/de989a5f-3b3a-4576-bc74-fb7484356b12-kube-api-access-56dvr\") pod \"octavia-operator-controller-manager-998648c74-qj57b\" (UID: \"de989a5f-3b3a-4576-bc74-fb7484356b12\") " pod="openstack-operators/octavia-operator-controller-manager-998648c74-qj57b" Dec 03 06:56:57 crc kubenswrapper[4475]: I1203 06:56:57.955172 4475 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/placement-operator-controller-manager-78f8948974-kmfjk"] Dec 03 06:56:57 crc kubenswrapper[4475]: I1203 06:56:57.967851 4475 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openstack-operators/openstack-baremetal-operator-controller-manager-55d86b6686c8cxj"] Dec 03 06:56:57 crc kubenswrapper[4475]: I1203 06:56:57.968644 4475 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/manila-operator-controller-manager-7c79b5df47-xffjq" Dec 03 06:56:57 crc kubenswrapper[4475]: I1203 06:56:57.970389 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6srkb\" (UniqueName: \"kubernetes.io/projected/576ef351-044d-41c7-8292-4a81ff83296b-kube-api-access-6srkb\") pod \"neutron-operator-controller-manager-5fdfd5b6b5-mzq79\" (UID: \"576ef351-044d-41c7-8292-4a81ff83296b\") " pod="openstack-operators/neutron-operator-controller-manager-5fdfd5b6b5-mzq79" Dec 03 06:56:57 crc kubenswrapper[4475]: I1203 06:56:57.969099 4475 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/mariadb-operator-controller-manager-56bbcc9d85-hq2sj" Dec 03 06:56:58 crc kubenswrapper[4475]: I1203 06:56:58.016539 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8nxr7\" (UniqueName: \"kubernetes.io/projected/8a43730c-24cb-4520-893b-0b4c4750cce8-kube-api-access-8nxr7\") pod \"nova-operator-controller-manager-697bc559fc-7hmqk\" (UID: \"8a43730c-24cb-4520-893b-0b4c4750cce8\") " pod="openstack-operators/nova-operator-controller-manager-697bc559fc-7hmqk" Dec 03 06:56:58 crc kubenswrapper[4475]: I1203 06:56:58.020328 4475 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/swift-operator-controller-manager-5f8c65bbfc-vgtpk"] Dec 03 06:56:58 crc kubenswrapper[4475]: I1203 06:56:58.030680 4475 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/barbican-operator-controller-manager-7d9dfd778-ddwp9" Dec 03 06:56:58 crc kubenswrapper[4475]: I1203 06:56:58.035217 4475 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/telemetry-operator-controller-manager-76cc84c6bb-zq8pq"] Dec 03 06:56:58 crc kubenswrapper[4475]: I1203 06:56:58.036264 4475 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/telemetry-operator-controller-manager-76cc84c6bb-zq8pq" Dec 03 06:56:58 crc kubenswrapper[4475]: I1203 06:56:58.037720 4475 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"telemetry-operator-controller-manager-dockercfg-74tjt" Dec 03 06:56:58 crc kubenswrapper[4475]: I1203 06:56:58.047303 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/033c67ed-2564-4f1c-a477-ac9b901c6b0d-cert\") pod \"openstack-baremetal-operator-controller-manager-55d86b6686c8cxj\" (UID: \"033c67ed-2564-4f1c-a477-ac9b901c6b0d\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-55d86b6686c8cxj" Dec 03 06:56:58 crc kubenswrapper[4475]: I1203 06:56:58.047357 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-flccd\" (UniqueName: \"kubernetes.io/projected/542d827b-14f9-4d32-bf00-1ff3749b9cb9-kube-api-access-flccd\") pod \"ovn-operator-controller-manager-b6456fdb6-8fnp4\" (UID: \"542d827b-14f9-4d32-bf00-1ff3749b9cb9\") " pod="openstack-operators/ovn-operator-controller-manager-b6456fdb6-8fnp4" Dec 03 06:56:58 crc kubenswrapper[4475]: I1203 06:56:58.047428 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nbp49\" (UniqueName: \"kubernetes.io/projected/bb78bd55-770d-43ca-a8a2-8e22819004f9-kube-api-access-nbp49\") pod \"placement-operator-controller-manager-78f8948974-kmfjk\" (UID: 
\"bb78bd55-770d-43ca-a8a2-8e22819004f9\") " pod="openstack-operators/placement-operator-controller-manager-78f8948974-kmfjk" Dec 03 06:56:58 crc kubenswrapper[4475]: I1203 06:56:58.047460 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xh9wj\" (UniqueName: \"kubernetes.io/projected/033c67ed-2564-4f1c-a477-ac9b901c6b0d-kube-api-access-xh9wj\") pod \"openstack-baremetal-operator-controller-manager-55d86b6686c8cxj\" (UID: \"033c67ed-2564-4f1c-a477-ac9b901c6b0d\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-55d86b6686c8cxj" Dec 03 06:56:58 crc kubenswrapper[4475]: I1203 06:56:58.047684 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9r7sx\" (UniqueName: \"kubernetes.io/projected/31bcdf0e-dd3a-4219-8ca4-15be9d19172f-kube-api-access-9r7sx\") pod \"swift-operator-controller-manager-5f8c65bbfc-vgtpk\" (UID: \"31bcdf0e-dd3a-4219-8ca4-15be9d19172f\") " pod="openstack-operators/swift-operator-controller-manager-5f8c65bbfc-vgtpk" Dec 03 06:56:58 crc kubenswrapper[4475]: I1203 06:56:58.047833 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-56dvr\" (UniqueName: \"kubernetes.io/projected/de989a5f-3b3a-4576-bc74-fb7484356b12-kube-api-access-56dvr\") pod \"octavia-operator-controller-manager-998648c74-qj57b\" (UID: \"de989a5f-3b3a-4576-bc74-fb7484356b12\") " pod="openstack-operators/octavia-operator-controller-manager-998648c74-qj57b" Dec 03 06:56:58 crc kubenswrapper[4475]: E1203 06:56:58.049215 4475 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Dec 03 06:56:58 crc kubenswrapper[4475]: E1203 06:56:58.049493 4475 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/033c67ed-2564-4f1c-a477-ac9b901c6b0d-cert 
podName:033c67ed-2564-4f1c-a477-ac9b901c6b0d nodeName:}" failed. No retries permitted until 2025-12-03 06:56:58.549478579 +0000 UTC m=+703.354376913 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/033c67ed-2564-4f1c-a477-ac9b901c6b0d-cert") pod "openstack-baremetal-operator-controller-manager-55d86b6686c8cxj" (UID: "033c67ed-2564-4f1c-a477-ac9b901c6b0d") : secret "openstack-baremetal-operator-webhook-server-cert" not found Dec 03 06:56:58 crc kubenswrapper[4475]: I1203 06:56:58.066849 4475 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/telemetry-operator-controller-manager-76cc84c6bb-zq8pq"] Dec 03 06:56:58 crc kubenswrapper[4475]: I1203 06:56:58.088647 4475 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/test-operator-controller-manager-5854674fcc-csbzh"] Dec 03 06:56:58 crc kubenswrapper[4475]: I1203 06:56:58.089978 4475 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/test-operator-controller-manager-5854674fcc-csbzh" Dec 03 06:56:58 crc kubenswrapper[4475]: I1203 06:56:58.091386 4475 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/nova-operator-controller-manager-697bc559fc-7hmqk" Dec 03 06:56:58 crc kubenswrapper[4475]: I1203 06:56:58.092347 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nbp49\" (UniqueName: \"kubernetes.io/projected/bb78bd55-770d-43ca-a8a2-8e22819004f9-kube-api-access-nbp49\") pod \"placement-operator-controller-manager-78f8948974-kmfjk\" (UID: \"bb78bd55-770d-43ca-a8a2-8e22819004f9\") " pod="openstack-operators/placement-operator-controller-manager-78f8948974-kmfjk" Dec 03 06:56:58 crc kubenswrapper[4475]: I1203 06:56:58.092349 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-56dvr\" (UniqueName: \"kubernetes.io/projected/de989a5f-3b3a-4576-bc74-fb7484356b12-kube-api-access-56dvr\") pod \"octavia-operator-controller-manager-998648c74-qj57b\" (UID: \"de989a5f-3b3a-4576-bc74-fb7484356b12\") " pod="openstack-operators/octavia-operator-controller-manager-998648c74-qj57b" Dec 03 06:56:58 crc kubenswrapper[4475]: I1203 06:56:58.092777 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-flccd\" (UniqueName: \"kubernetes.io/projected/542d827b-14f9-4d32-bf00-1ff3749b9cb9-kube-api-access-flccd\") pod \"ovn-operator-controller-manager-b6456fdb6-8fnp4\" (UID: \"542d827b-14f9-4d32-bf00-1ff3749b9cb9\") " pod="openstack-operators/ovn-operator-controller-manager-b6456fdb6-8fnp4" Dec 03 06:56:58 crc kubenswrapper[4475]: I1203 06:56:58.092941 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xh9wj\" (UniqueName: \"kubernetes.io/projected/033c67ed-2564-4f1c-a477-ac9b901c6b0d-kube-api-access-xh9wj\") pod \"openstack-baremetal-operator-controller-manager-55d86b6686c8cxj\" (UID: \"033c67ed-2564-4f1c-a477-ac9b901c6b0d\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-55d86b6686c8cxj" Dec 03 06:56:58 crc kubenswrapper[4475]: I1203 06:56:58.099063 4475 
reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"test-operator-controller-manager-dockercfg-548lj" Dec 03 06:56:58 crc kubenswrapper[4475]: I1203 06:56:58.105231 4475 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/test-operator-controller-manager-5854674fcc-csbzh"] Dec 03 06:56:58 crc kubenswrapper[4475]: I1203 06:56:58.132912 4475 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/neutron-operator-controller-manager-5fdfd5b6b5-mzq79" Dec 03 06:56:58 crc kubenswrapper[4475]: I1203 06:56:58.148986 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9r7sx\" (UniqueName: \"kubernetes.io/projected/31bcdf0e-dd3a-4219-8ca4-15be9d19172f-kube-api-access-9r7sx\") pod \"swift-operator-controller-manager-5f8c65bbfc-vgtpk\" (UID: \"31bcdf0e-dd3a-4219-8ca4-15be9d19172f\") " pod="openstack-operators/swift-operator-controller-manager-5f8c65bbfc-vgtpk" Dec 03 06:56:58 crc kubenswrapper[4475]: I1203 06:56:58.149037 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qqrlf\" (UniqueName: \"kubernetes.io/projected/67c05515-1eb8-4c77-acbc-70ea437f1313-kube-api-access-qqrlf\") pod \"telemetry-operator-controller-manager-76cc84c6bb-zq8pq\" (UID: \"67c05515-1eb8-4c77-acbc-70ea437f1313\") " pod="openstack-operators/telemetry-operator-controller-manager-76cc84c6bb-zq8pq" Dec 03 06:56:58 crc kubenswrapper[4475]: I1203 06:56:58.149078 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l8xch\" (UniqueName: \"kubernetes.io/projected/d569ccab-198f-441f-add9-b7c06254c8d0-kube-api-access-l8xch\") pod \"test-operator-controller-manager-5854674fcc-csbzh\" (UID: \"d569ccab-198f-441f-add9-b7c06254c8d0\") " pod="openstack-operators/test-operator-controller-manager-5854674fcc-csbzh" Dec 03 06:56:58 crc 
kubenswrapper[4475]: I1203 06:56:58.162736 4475 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/octavia-operator-controller-manager-998648c74-qj57b" Dec 03 06:56:58 crc kubenswrapper[4475]: I1203 06:56:58.163795 4475 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/watcher-operator-controller-manager-769dc69bc-kmdw2"] Dec 03 06:56:58 crc kubenswrapper[4475]: I1203 06:56:58.165042 4475 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/watcher-operator-controller-manager-769dc69bc-kmdw2" Dec 03 06:56:58 crc kubenswrapper[4475]: I1203 06:56:58.170072 4475 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"watcher-operator-controller-manager-dockercfg-6dp4s" Dec 03 06:56:58 crc kubenswrapper[4475]: I1203 06:56:58.178945 4475 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/watcher-operator-controller-manager-769dc69bc-kmdw2"] Dec 03 06:56:58 crc kubenswrapper[4475]: I1203 06:56:58.179483 4475 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/ovn-operator-controller-manager-b6456fdb6-8fnp4" Dec 03 06:56:58 crc kubenswrapper[4475]: I1203 06:56:58.180727 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9r7sx\" (UniqueName: \"kubernetes.io/projected/31bcdf0e-dd3a-4219-8ca4-15be9d19172f-kube-api-access-9r7sx\") pod \"swift-operator-controller-manager-5f8c65bbfc-vgtpk\" (UID: \"31bcdf0e-dd3a-4219-8ca4-15be9d19172f\") " pod="openstack-operators/swift-operator-controller-manager-5f8c65bbfc-vgtpk" Dec 03 06:56:58 crc kubenswrapper[4475]: I1203 06:56:58.203091 4475 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-controller-manager-9f56fc979-kk65s"] Dec 03 06:56:58 crc kubenswrapper[4475]: I1203 06:56:58.204093 4475 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-controller-manager-9f56fc979-kk65s" Dec 03 06:56:58 crc kubenswrapper[4475]: I1203 06:56:58.208892 4475 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/placement-operator-controller-manager-78f8948974-kmfjk" Dec 03 06:56:58 crc kubenswrapper[4475]: I1203 06:56:58.211333 4475 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-manager-9f56fc979-kk65s"] Dec 03 06:56:58 crc kubenswrapper[4475]: I1203 06:56:58.221623 4475 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"metrics-server-cert" Dec 03 06:56:58 crc kubenswrapper[4475]: I1203 06:56:58.221817 4475 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"webhook-server-cert" Dec 03 06:56:58 crc kubenswrapper[4475]: I1203 06:56:58.222673 4475 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-operator-controller-manager-dockercfg-wjzk4" Dec 03 06:56:58 crc kubenswrapper[4475]: I1203 06:56:58.250175 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qkbcq\" (UniqueName: \"kubernetes.io/projected/fede2bdc-4902-40e6-b0cf-9827f5bbe268-kube-api-access-qkbcq\") pod \"watcher-operator-controller-manager-769dc69bc-kmdw2\" (UID: \"fede2bdc-4902-40e6-b0cf-9827f5bbe268\") " pod="openstack-operators/watcher-operator-controller-manager-769dc69bc-kmdw2" Dec 03 06:56:58 crc kubenswrapper[4475]: I1203 06:56:58.250214 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mzjfn\" (UniqueName: \"kubernetes.io/projected/454cf583-e2ff-4ad7-a07e-f7d4e881c67b-kube-api-access-mzjfn\") pod \"openstack-operator-controller-manager-9f56fc979-kk65s\" (UID: \"454cf583-e2ff-4ad7-a07e-f7d4e881c67b\") " 
pod="openstack-operators/openstack-operator-controller-manager-9f56fc979-kk65s" Dec 03 06:56:58 crc kubenswrapper[4475]: I1203 06:56:58.250255 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/454cf583-e2ff-4ad7-a07e-f7d4e881c67b-webhook-certs\") pod \"openstack-operator-controller-manager-9f56fc979-kk65s\" (UID: \"454cf583-e2ff-4ad7-a07e-f7d4e881c67b\") " pod="openstack-operators/openstack-operator-controller-manager-9f56fc979-kk65s" Dec 03 06:56:58 crc kubenswrapper[4475]: I1203 06:56:58.250283 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qqrlf\" (UniqueName: \"kubernetes.io/projected/67c05515-1eb8-4c77-acbc-70ea437f1313-kube-api-access-qqrlf\") pod \"telemetry-operator-controller-manager-76cc84c6bb-zq8pq\" (UID: \"67c05515-1eb8-4c77-acbc-70ea437f1313\") " pod="openstack-operators/telemetry-operator-controller-manager-76cc84c6bb-zq8pq" Dec 03 06:56:58 crc kubenswrapper[4475]: I1203 06:56:58.250336 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/454cf583-e2ff-4ad7-a07e-f7d4e881c67b-metrics-certs\") pod \"openstack-operator-controller-manager-9f56fc979-kk65s\" (UID: \"454cf583-e2ff-4ad7-a07e-f7d4e881c67b\") " pod="openstack-operators/openstack-operator-controller-manager-9f56fc979-kk65s" Dec 03 06:56:58 crc kubenswrapper[4475]: I1203 06:56:58.250357 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-l8xch\" (UniqueName: \"kubernetes.io/projected/d569ccab-198f-441f-add9-b7c06254c8d0-kube-api-access-l8xch\") pod \"test-operator-controller-manager-5854674fcc-csbzh\" (UID: \"d569ccab-198f-441f-add9-b7c06254c8d0\") " pod="openstack-operators/test-operator-controller-manager-5854674fcc-csbzh" Dec 03 06:56:58 crc kubenswrapper[4475]: I1203 06:56:58.250483 4475 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/ea572cb7-8b01-4dcf-b3a4-617f79c6ef0e-cert\") pod \"infra-operator-controller-manager-57548d458d-7q4rl\" (UID: \"ea572cb7-8b01-4dcf-b3a4-617f79c6ef0e\") " pod="openstack-operators/infra-operator-controller-manager-57548d458d-7q4rl" Dec 03 06:56:58 crc kubenswrapper[4475]: E1203 06:56:58.250643 4475 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found Dec 03 06:56:58 crc kubenswrapper[4475]: E1203 06:56:58.250697 4475 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ea572cb7-8b01-4dcf-b3a4-617f79c6ef0e-cert podName:ea572cb7-8b01-4dcf-b3a4-617f79c6ef0e nodeName:}" failed. No retries permitted until 2025-12-03 06:56:59.250678714 +0000 UTC m=+704.055577048 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/ea572cb7-8b01-4dcf-b3a4-617f79c6ef0e-cert") pod "infra-operator-controller-manager-57548d458d-7q4rl" (UID: "ea572cb7-8b01-4dcf-b3a4-617f79c6ef0e") : secret "infra-operator-webhook-server-cert" not found Dec 03 06:56:58 crc kubenswrapper[4475]: I1203 06:56:58.252194 4475 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-lhbp6"] Dec 03 06:56:58 crc kubenswrapper[4475]: I1203 06:56:58.262399 4475 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-lhbp6" Dec 03 06:56:58 crc kubenswrapper[4475]: I1203 06:56:58.273711 4475 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"rabbitmq-cluster-operator-controller-manager-dockercfg-nqmt9" Dec 03 06:56:58 crc kubenswrapper[4475]: I1203 06:56:58.277502 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qqrlf\" (UniqueName: \"kubernetes.io/projected/67c05515-1eb8-4c77-acbc-70ea437f1313-kube-api-access-qqrlf\") pod \"telemetry-operator-controller-manager-76cc84c6bb-zq8pq\" (UID: \"67c05515-1eb8-4c77-acbc-70ea437f1313\") " pod="openstack-operators/telemetry-operator-controller-manager-76cc84c6bb-zq8pq" Dec 03 06:56:58 crc kubenswrapper[4475]: I1203 06:56:58.295953 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-l8xch\" (UniqueName: \"kubernetes.io/projected/d569ccab-198f-441f-add9-b7c06254c8d0-kube-api-access-l8xch\") pod \"test-operator-controller-manager-5854674fcc-csbzh\" (UID: \"d569ccab-198f-441f-add9-b7c06254c8d0\") " pod="openstack-operators/test-operator-controller-manager-5854674fcc-csbzh" Dec 03 06:56:58 crc kubenswrapper[4475]: I1203 06:56:58.296020 4475 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-lhbp6"] Dec 03 06:56:58 crc kubenswrapper[4475]: I1203 06:56:58.319938 4475 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/swift-operator-controller-manager-5f8c65bbfc-vgtpk" Dec 03 06:56:58 crc kubenswrapper[4475]: I1203 06:56:58.351921 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qkbcq\" (UniqueName: \"kubernetes.io/projected/fede2bdc-4902-40e6-b0cf-9827f5bbe268-kube-api-access-qkbcq\") pod \"watcher-operator-controller-manager-769dc69bc-kmdw2\" (UID: \"fede2bdc-4902-40e6-b0cf-9827f5bbe268\") " pod="openstack-operators/watcher-operator-controller-manager-769dc69bc-kmdw2" Dec 03 06:56:58 crc kubenswrapper[4475]: I1203 06:56:58.351962 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mzjfn\" (UniqueName: \"kubernetes.io/projected/454cf583-e2ff-4ad7-a07e-f7d4e881c67b-kube-api-access-mzjfn\") pod \"openstack-operator-controller-manager-9f56fc979-kk65s\" (UID: \"454cf583-e2ff-4ad7-a07e-f7d4e881c67b\") " pod="openstack-operators/openstack-operator-controller-manager-9f56fc979-kk65s" Dec 03 06:56:58 crc kubenswrapper[4475]: I1203 06:56:58.352007 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/454cf583-e2ff-4ad7-a07e-f7d4e881c67b-webhook-certs\") pod \"openstack-operator-controller-manager-9f56fc979-kk65s\" (UID: \"454cf583-e2ff-4ad7-a07e-f7d4e881c67b\") " pod="openstack-operators/openstack-operator-controller-manager-9f56fc979-kk65s" Dec 03 06:56:58 crc kubenswrapper[4475]: I1203 06:56:58.352050 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/454cf583-e2ff-4ad7-a07e-f7d4e881c67b-metrics-certs\") pod \"openstack-operator-controller-manager-9f56fc979-kk65s\" (UID: \"454cf583-e2ff-4ad7-a07e-f7d4e881c67b\") " pod="openstack-operators/openstack-operator-controller-manager-9f56fc979-kk65s" Dec 03 06:56:58 crc kubenswrapper[4475]: I1203 06:56:58.352100 4475 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-skl9v\" (UniqueName: \"kubernetes.io/projected/5afd068b-0798-44fc-a0c3-498e5d3803df-kube-api-access-skl9v\") pod \"rabbitmq-cluster-operator-manager-668c99d594-lhbp6\" (UID: \"5afd068b-0798-44fc-a0c3-498e5d3803df\") " pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-lhbp6" Dec 03 06:56:58 crc kubenswrapper[4475]: E1203 06:56:58.352655 4475 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Dec 03 06:56:58 crc kubenswrapper[4475]: E1203 06:56:58.352693 4475 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/454cf583-e2ff-4ad7-a07e-f7d4e881c67b-webhook-certs podName:454cf583-e2ff-4ad7-a07e-f7d4e881c67b nodeName:}" failed. No retries permitted until 2025-12-03 06:56:58.852680522 +0000 UTC m=+703.657578846 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/454cf583-e2ff-4ad7-a07e-f7d4e881c67b-webhook-certs") pod "openstack-operator-controller-manager-9f56fc979-kk65s" (UID: "454cf583-e2ff-4ad7-a07e-f7d4e881c67b") : secret "webhook-server-cert" not found Dec 03 06:56:58 crc kubenswrapper[4475]: E1203 06:56:58.352852 4475 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found Dec 03 06:56:58 crc kubenswrapper[4475]: E1203 06:56:58.352887 4475 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/454cf583-e2ff-4ad7-a07e-f7d4e881c67b-metrics-certs podName:454cf583-e2ff-4ad7-a07e-f7d4e881c67b nodeName:}" failed. No retries permitted until 2025-12-03 06:56:58.852879446 +0000 UTC m=+703.657777780 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/454cf583-e2ff-4ad7-a07e-f7d4e881c67b-metrics-certs") pod "openstack-operator-controller-manager-9f56fc979-kk65s" (UID: "454cf583-e2ff-4ad7-a07e-f7d4e881c67b") : secret "metrics-server-cert" not found Dec 03 06:56:58 crc kubenswrapper[4475]: I1203 06:56:58.377892 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qkbcq\" (UniqueName: \"kubernetes.io/projected/fede2bdc-4902-40e6-b0cf-9827f5bbe268-kube-api-access-qkbcq\") pod \"watcher-operator-controller-manager-769dc69bc-kmdw2\" (UID: \"fede2bdc-4902-40e6-b0cf-9827f5bbe268\") " pod="openstack-operators/watcher-operator-controller-manager-769dc69bc-kmdw2" Dec 03 06:56:58 crc kubenswrapper[4475]: I1203 06:56:58.378024 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mzjfn\" (UniqueName: \"kubernetes.io/projected/454cf583-e2ff-4ad7-a07e-f7d4e881c67b-kube-api-access-mzjfn\") pod \"openstack-operator-controller-manager-9f56fc979-kk65s\" (UID: \"454cf583-e2ff-4ad7-a07e-f7d4e881c67b\") " pod="openstack-operators/openstack-operator-controller-manager-9f56fc979-kk65s" Dec 03 06:56:58 crc kubenswrapper[4475]: I1203 06:56:58.378219 4475 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/telemetry-operator-controller-manager-76cc84c6bb-zq8pq" Dec 03 06:56:58 crc kubenswrapper[4475]: I1203 06:56:58.414666 4475 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/glance-operator-controller-manager-77987cd8cd-pfb7t"] Dec 03 06:56:58 crc kubenswrapper[4475]: I1203 06:56:58.423015 4475 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/test-operator-controller-manager-5854674fcc-csbzh" Dec 03 06:56:58 crc kubenswrapper[4475]: I1203 06:56:58.456985 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-skl9v\" (UniqueName: \"kubernetes.io/projected/5afd068b-0798-44fc-a0c3-498e5d3803df-kube-api-access-skl9v\") pod \"rabbitmq-cluster-operator-manager-668c99d594-lhbp6\" (UID: \"5afd068b-0798-44fc-a0c3-498e5d3803df\") " pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-lhbp6" Dec 03 06:56:58 crc kubenswrapper[4475]: I1203 06:56:58.485666 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-skl9v\" (UniqueName: \"kubernetes.io/projected/5afd068b-0798-44fc-a0c3-498e5d3803df-kube-api-access-skl9v\") pod \"rabbitmq-cluster-operator-manager-668c99d594-lhbp6\" (UID: \"5afd068b-0798-44fc-a0c3-498e5d3803df\") " pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-lhbp6" Dec 03 06:56:58 crc kubenswrapper[4475]: I1203 06:56:58.502400 4475 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/watcher-operator-controller-manager-769dc69bc-kmdw2" Dec 03 06:56:58 crc kubenswrapper[4475]: I1203 06:56:58.558685 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/033c67ed-2564-4f1c-a477-ac9b901c6b0d-cert\") pod \"openstack-baremetal-operator-controller-manager-55d86b6686c8cxj\" (UID: \"033c67ed-2564-4f1c-a477-ac9b901c6b0d\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-55d86b6686c8cxj" Dec 03 06:56:58 crc kubenswrapper[4475]: E1203 06:56:58.559079 4475 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Dec 03 06:56:58 crc kubenswrapper[4475]: E1203 06:56:58.559127 4475 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/033c67ed-2564-4f1c-a477-ac9b901c6b0d-cert podName:033c67ed-2564-4f1c-a477-ac9b901c6b0d nodeName:}" failed. No retries permitted until 2025-12-03 06:56:59.559113429 +0000 UTC m=+704.364011764 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/033c67ed-2564-4f1c-a477-ac9b901c6b0d-cert") pod "openstack-baremetal-operator-controller-manager-55d86b6686c8cxj" (UID: "033c67ed-2564-4f1c-a477-ac9b901c6b0d") : secret "openstack-baremetal-operator-webhook-server-cert" not found Dec 03 06:56:58 crc kubenswrapper[4475]: I1203 06:56:58.608502 4475 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-lhbp6" Dec 03 06:56:58 crc kubenswrapper[4475]: I1203 06:56:58.656959 4475 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/cinder-operator-controller-manager-859b6ccc6-w8zbc"] Dec 03 06:56:58 crc kubenswrapper[4475]: I1203 06:56:58.709014 4475 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/designate-operator-controller-manager-78b4bc895b-dxtxl"] Dec 03 06:56:58 crc kubenswrapper[4475]: I1203 06:56:58.807142 4475 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/keystone-operator-controller-manager-7765d96ddf-bg8cq"] Dec 03 06:56:58 crc kubenswrapper[4475]: I1203 06:56:58.823526 4475 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/mariadb-operator-controller-manager-56bbcc9d85-hq2sj"] Dec 03 06:56:58 crc kubenswrapper[4475]: I1203 06:56:58.864493 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/454cf583-e2ff-4ad7-a07e-f7d4e881c67b-webhook-certs\") pod \"openstack-operator-controller-manager-9f56fc979-kk65s\" (UID: \"454cf583-e2ff-4ad7-a07e-f7d4e881c67b\") " pod="openstack-operators/openstack-operator-controller-manager-9f56fc979-kk65s" Dec 03 06:56:58 crc kubenswrapper[4475]: I1203 06:56:58.864663 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/454cf583-e2ff-4ad7-a07e-f7d4e881c67b-metrics-certs\") pod \"openstack-operator-controller-manager-9f56fc979-kk65s\" (UID: \"454cf583-e2ff-4ad7-a07e-f7d4e881c67b\") " pod="openstack-operators/openstack-operator-controller-manager-9f56fc979-kk65s" Dec 03 06:56:58 crc kubenswrapper[4475]: E1203 06:56:58.864930 4475 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Dec 03 06:56:58 crc 
kubenswrapper[4475]: E1203 06:56:58.864986 4475 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found Dec 03 06:56:58 crc kubenswrapper[4475]: E1203 06:56:58.865022 4475 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/454cf583-e2ff-4ad7-a07e-f7d4e881c67b-metrics-certs podName:454cf583-e2ff-4ad7-a07e-f7d4e881c67b nodeName:}" failed. No retries permitted until 2025-12-03 06:56:59.865005865 +0000 UTC m=+704.669904189 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/454cf583-e2ff-4ad7-a07e-f7d4e881c67b-metrics-certs") pod "openstack-operator-controller-manager-9f56fc979-kk65s" (UID: "454cf583-e2ff-4ad7-a07e-f7d4e881c67b") : secret "metrics-server-cert" not found Dec 03 06:56:58 crc kubenswrapper[4475]: E1203 06:56:58.865990 4475 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/454cf583-e2ff-4ad7-a07e-f7d4e881c67b-webhook-certs podName:454cf583-e2ff-4ad7-a07e-f7d4e881c67b nodeName:}" failed. No retries permitted until 2025-12-03 06:56:59.865976781 +0000 UTC m=+704.670875115 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/454cf583-e2ff-4ad7-a07e-f7d4e881c67b-webhook-certs") pod "openstack-operator-controller-manager-9f56fc979-kk65s" (UID: "454cf583-e2ff-4ad7-a07e-f7d4e881c67b") : secret "webhook-server-cert" not found Dec 03 06:56:59 crc kubenswrapper[4475]: I1203 06:56:59.197217 4475 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/neutron-operator-controller-manager-5fdfd5b6b5-mzq79"] Dec 03 06:56:59 crc kubenswrapper[4475]: W1203 06:56:59.203984 4475 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod576ef351_044d_41c7_8292_4a81ff83296b.slice/crio-17e6a4e9591111dd13d6368e516716e9ec3198a952538042fdc828db2f459721 WatchSource:0}: Error finding container 17e6a4e9591111dd13d6368e516716e9ec3198a952538042fdc828db2f459721: Status 404 returned error can't find the container with id 17e6a4e9591111dd13d6368e516716e9ec3198a952538042fdc828db2f459721 Dec 03 06:56:59 crc kubenswrapper[4475]: I1203 06:56:59.223694 4475 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/nova-operator-controller-manager-697bc559fc-7hmqk"] Dec 03 06:56:59 crc kubenswrapper[4475]: I1203 06:56:59.231066 4475 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ironic-operator-controller-manager-6c548fd776-2s7sc"] Dec 03 06:56:59 crc kubenswrapper[4475]: I1203 06:56:59.261407 4475 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/horizon-operator-controller-manager-68c6d99b8f-wkdz6"] Dec 03 06:56:59 crc kubenswrapper[4475]: I1203 06:56:59.272206 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/ea572cb7-8b01-4dcf-b3a4-617f79c6ef0e-cert\") pod \"infra-operator-controller-manager-57548d458d-7q4rl\" (UID: \"ea572cb7-8b01-4dcf-b3a4-617f79c6ef0e\") " 
pod="openstack-operators/infra-operator-controller-manager-57548d458d-7q4rl" Dec 03 06:56:59 crc kubenswrapper[4475]: E1203 06:56:59.272343 4475 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found Dec 03 06:56:59 crc kubenswrapper[4475]: E1203 06:56:59.272389 4475 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ea572cb7-8b01-4dcf-b3a4-617f79c6ef0e-cert podName:ea572cb7-8b01-4dcf-b3a4-617f79c6ef0e nodeName:}" failed. No retries permitted until 2025-12-03 06:57:01.272375522 +0000 UTC m=+706.077273856 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/ea572cb7-8b01-4dcf-b3a4-617f79c6ef0e-cert") pod "infra-operator-controller-manager-57548d458d-7q4rl" (UID: "ea572cb7-8b01-4dcf-b3a4-617f79c6ef0e") : secret "infra-operator-webhook-server-cert" not found Dec 03 06:56:59 crc kubenswrapper[4475]: I1203 06:56:59.273610 4475 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/barbican-operator-controller-manager-7d9dfd778-ddwp9"] Dec 03 06:56:59 crc kubenswrapper[4475]: I1203 06:56:59.286526 4475 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/placement-operator-controller-manager-78f8948974-kmfjk"] Dec 03 06:56:59 crc kubenswrapper[4475]: I1203 06:56:59.289038 4475 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/manila-operator-controller-manager-7c79b5df47-xffjq"] Dec 03 06:56:59 crc kubenswrapper[4475]: I1203 06:56:59.295111 4475 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ovn-operator-controller-manager-b6456fdb6-8fnp4"] Dec 03 06:56:59 crc kubenswrapper[4475]: E1203 06:56:59.327810 4475 kuberuntime_manager.go:1274] "Unhandled Error" err="container 
&Container{Name:manager,Image:quay.io/openstack-k8s-operators/octavia-operator@sha256:d9a3694865a7d54ee96397add18c3898886e98d079aa20876a0f4de1fa7a7168,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-56dvr,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod octavia-operator-controller-manager-998648c74-qj57b_openstack-operators(de989a5f-3b3a-4576-bc74-fb7484356b12): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Dec 03 06:56:59 crc kubenswrapper[4475]: E1203 06:56:59.335411 4475 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:kube-rbac-proxy,Image:quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0,Command:[],Args:[--secure-listen-address=0.0.0.0:8443 --upstream=http://127.0.0.1:8080/ --logtostderr=true --v=0],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:https,HostPort:0,ContainerPort:8443,Protocol:TCP,HostIP:,},},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{134217728 0} {} BinarySI},},Requests:ResourceList{cpu: {{5 -3} {} 5m DecimalSI},memory: {{67108864 0} {} 
BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-56dvr,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod octavia-operator-controller-manager-998648c74-qj57b_openstack-operators(de989a5f-3b3a-4576-bc74-fb7484356b12): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Dec 03 06:56:59 crc kubenswrapper[4475]: I1203 06:56:59.335558 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/keystone-operator-controller-manager-7765d96ddf-bg8cq" event={"ID":"db3ddcfa-8d48-4336-bcc9-04d361dfb8e7","Type":"ContainerStarted","Data":"5f5e8ebb6c2a4a0b9147d310abe62f7c89035a648f42c0928cbf84036b886a3e"} Dec 03 06:56:59 crc kubenswrapper[4475]: E1203 06:56:59.337185 4475 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\", failed to \"StartContainer\" for \"kube-rbac-proxy\" with ErrImagePull: \"pull QPS exceeded\"]" pod="openstack-operators/octavia-operator-controller-manager-998648c74-qj57b" podUID="de989a5f-3b3a-4576-bc74-fb7484356b12" Dec 03 06:56:59 crc kubenswrapper[4475]: I1203 06:56:59.337435 4475 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openstack-operators/octavia-operator-controller-manager-998648c74-qj57b"] Dec 03 06:56:59 crc kubenswrapper[4475]: E1203 06:56:59.337725 4475 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/swift-operator@sha256:2a3d21728a8bfb4e64617e63e61e2d1cb70a383ea3e8f846e0c3c3c02d2b0a9d,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-9r7sx,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000660000,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod swift-operator-controller-manager-5f8c65bbfc-vgtpk_openstack-operators(31bcdf0e-dd3a-4219-8ca4-15be9d19172f): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Dec 03 06:56:59 crc kubenswrapper[4475]: I1203 06:56:59.338243 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/manila-operator-controller-manager-7c79b5df47-xffjq" event={"ID":"0e10e107-5685-4ab7-a0d0-e2ede376c24e","Type":"ContainerStarted","Data":"486200cace6542c3eab836681a4fc82ce26307d9701595ece99ffad3b132d57c"} Dec 03 06:56:59 crc kubenswrapper[4475]: E1203 06:56:59.339674 4475 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:kube-rbac-proxy,Image:quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0,Command:[],Args:[--secure-listen-address=0.0.0.0:8443 --upstream=http://127.0.0.1:8080/ --logtostderr=true --v=0],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:https,HostPort:0,ContainerPort:8443,Protocol:TCP,HostIP:,},},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{134217728 0} {} BinarySI},},Requests:ResourceList{cpu: {{5 -3} {} 5m DecimalSI},memory: {{67108864 0} {} 
BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-9r7sx,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000660000,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod swift-operator-controller-manager-5f8c65bbfc-vgtpk_openstack-operators(31bcdf0e-dd3a-4219-8ca4-15be9d19172f): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Dec 03 06:56:59 crc kubenswrapper[4475]: E1203 06:56:59.340841 4475 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\", failed to \"StartContainer\" for \"kube-rbac-proxy\" with ErrImagePull: \"pull QPS exceeded\"]" pod="openstack-operators/swift-operator-controller-manager-5f8c65bbfc-vgtpk" podUID="31bcdf0e-dd3a-4219-8ca4-15be9d19172f" Dec 03 06:56:59 crc kubenswrapper[4475]: I1203 06:56:59.345263 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ironic-operator-controller-manager-6c548fd776-2s7sc" event={"ID":"6996078d-75b0-42b4-89af-6b8f8f7be702","Type":"ContainerStarted","Data":"860d2ce988c2799b59354f16aadc8332192e9355e50ff809b7ba05594426260f"} Dec 03 06:56:59 crc kubenswrapper[4475]: W1203 06:56:59.346155 4475 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod67c05515_1eb8_4c77_acbc_70ea437f1313.slice/crio-0186dc0ddf74a72caffabda08ca3a689fc3595b00123db808788c45558d68690 WatchSource:0}: Error finding container 0186dc0ddf74a72caffabda08ca3a689fc3595b00123db808788c45558d68690: Status 404 returned error can't find the container with id 0186dc0ddf74a72caffabda08ca3a689fc3595b00123db808788c45558d68690 Dec 03 06:56:59 crc kubenswrapper[4475]: I1203 06:56:59.347742 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/placement-operator-controller-manager-78f8948974-kmfjk" event={"ID":"bb78bd55-770d-43ca-a8a2-8e22819004f9","Type":"ContainerStarted","Data":"e200a90a1a8cac13695006cf8b6b4ce109630a3b15af5bb94833b15336928bff"} Dec 03 06:56:59 crc kubenswrapper[4475]: I1203 06:56:59.348333 4475 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/test-operator-controller-manager-5854674fcc-csbzh"] Dec 03 06:56:59 crc kubenswrapper[4475]: E1203 06:56:59.350084 4475 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/heat-operator@sha256:c4abfc148600dfa85915f3dc911d988ea2335f26cb6b8d749fe79bfe53e5e429,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} 
BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-wqnbt,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod heat-operator-controller-manager-5f64f6f8bb-hjzmk_openstack-operators(abd2d6fd-7465-4606-aba4-f6e6501e6a39): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Dec 03 06:56:59 crc kubenswrapper[4475]: I1203 06:56:59.352019 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/glance-operator-controller-manager-77987cd8cd-pfb7t" event={"ID":"17fcf4b6-1bbf-4bda-a621-f6b563c6d7ae","Type":"ContainerStarted","Data":"ca9165e59c43afd9b9a7f3588960340c4d593b0f18e93b7af6499a4a9c4ba700"} Dec 03 06:56:59 
crc kubenswrapper[4475]: E1203 06:56:59.352018 4475 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:kube-rbac-proxy,Image:quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0,Command:[],Args:[--secure-listen-address=0.0.0.0:8443 --upstream=http://127.0.0.1:8080/ --logtostderr=true --v=0],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:https,HostPort:0,ContainerPort:8443,Protocol:TCP,HostIP:,},},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{134217728 0} {} BinarySI},},Requests:ResourceList{cpu: {{5 -3} {} 5m DecimalSI},memory: {{67108864 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-wqnbt,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod heat-operator-controller-manager-5f64f6f8bb-hjzmk_openstack-operators(abd2d6fd-7465-4606-aba4-f6e6501e6a39): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Dec 03 06:56:59 crc kubenswrapper[4475]: E1203 06:56:59.352185 4475 kuberuntime_manager.go:1274] "Unhandled Error" err="container 
&Container{Name:manager,Image:quay.io/openstack-k8s-operators/telemetry-operator@sha256:7d66757c0af67104f0389e851a7cc0daa44443ad202d157417bd86bbb57cc385,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-qqrlf,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod telemetry-operator-controller-manager-76cc84c6bb-zq8pq_openstack-operators(67c05515-1eb8-4c77-acbc-70ea437f1313): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Dec 03 06:56:59 crc kubenswrapper[4475]: E1203 06:56:59.353272 4475 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/test-operator@sha256:101b3e007d8c9f2e183262d7712f986ad51256448099069bc14f1ea5f997ab94,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} 
BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-l8xch,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod test-operator-controller-manager-5854674fcc-csbzh_openstack-operators(d569ccab-198f-441f-add9-b7c06254c8d0): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Dec 03 06:56:59 crc kubenswrapper[4475]: E1203 06:56:59.353348 4475 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\", failed to \"StartContainer\" for \"kube-rbac-proxy\" with ErrImagePull: \"pull QPS exceeded\"]" 
pod="openstack-operators/heat-operator-controller-manager-5f64f6f8bb-hjzmk" podUID="abd2d6fd-7465-4606-aba4-f6e6501e6a39" Dec 03 06:56:59 crc kubenswrapper[4475]: E1203 06:56:59.355240 4475 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:operator,Image:quay.io/openstack-k8s-operators/rabbitmq-cluster-operator@sha256:893e66303c1b0bc1d00a299a3f0380bad55c8dc813c8a1c6a4aab379f5aa12a2,Command:[/manager],Args:[],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:metrics,HostPort:0,ContainerPort:9782,Protocol:TCP,HostIP:,},},Env:[]EnvVar{EnvVar{Name:OPERATOR_NAMESPACE,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:metadata.namespace,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{200 -3} {} 200m DecimalSI},memory: {{524288000 0} {} 500Mi BinarySI},},Requests:ResourceList{cpu: {{5 -3} {} 5m DecimalSI},memory: {{67108864 0} {} 
BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-skl9v,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000660000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod rabbitmq-cluster-operator-manager-668c99d594-lhbp6_openstack-operators(5afd068b-0798-44fc-a0c3-498e5d3803df): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Dec 03 06:56:59 crc kubenswrapper[4475]: E1203 06:56:59.355327 4475 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:kube-rbac-proxy,Image:quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0,Command:[],Args:[--secure-listen-address=0.0.0.0:8443 --upstream=http://127.0.0.1:8080/ --logtostderr=true --v=0],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:https,HostPort:0,ContainerPort:8443,Protocol:TCP,HostIP:,},},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{134217728 0} {} BinarySI},},Requests:ResourceList{cpu: {{5 -3} {} 5m DecimalSI},memory: {{67108864 0} {} 
BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-qqrlf,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod telemetry-operator-controller-manager-76cc84c6bb-zq8pq_openstack-operators(67c05515-1eb8-4c77-acbc-70ea437f1313): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Dec 03 06:56:59 crc kubenswrapper[4475]: I1203 06:56:59.356154 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ovn-operator-controller-manager-b6456fdb6-8fnp4" event={"ID":"542d827b-14f9-4d32-bf00-1ff3749b9cb9","Type":"ContainerStarted","Data":"d89783b0315baad504d21b039bc7c0ed12fb0a4a78515b7b4f2be1e35b2d7da9"} Dec 03 06:56:59 crc kubenswrapper[4475]: E1203 06:56:59.356271 4475 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:kube-rbac-proxy,Image:quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0,Command:[],Args:[--secure-listen-address=0.0.0.0:8443 --upstream=http://127.0.0.1:8080/ --logtostderr=true --v=0],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:https,HostPort:0,ContainerPort:8443,Protocol:TCP,HostIP:,},},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{134217728 0} {} 
BinarySI},},Requests:ResourceList{cpu: {{5 -3} {} 5m DecimalSI},memory: {{67108864 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-l8xch,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod test-operator-controller-manager-5854674fcc-csbzh_openstack-operators(d569ccab-198f-441f-add9-b7c06254c8d0): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Dec 03 06:56:59 crc kubenswrapper[4475]: E1203 06:56:59.356348 4475 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"operator\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-lhbp6" podUID="5afd068b-0798-44fc-a0c3-498e5d3803df" Dec 03 06:56:59 crc kubenswrapper[4475]: E1203 06:56:59.356402 4475 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\", failed to \"StartContainer\" for \"kube-rbac-proxy\" with ErrImagePull: \"pull QPS exceeded\"]" pod="openstack-operators/telemetry-operator-controller-manager-76cc84c6bb-zq8pq" podUID="67c05515-1eb8-4c77-acbc-70ea437f1313" Dec 03 06:56:59 crc kubenswrapper[4475]: E1203 06:56:59.356584 4475 
kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/watcher-operator@sha256:9aa8c03633e4b934c57868c1660acf47e7d386ac86bcb344df262c9ad76b8621,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-qkbcq,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod watcher-operator-controller-manager-769dc69bc-kmdw2_openstack-operators(fede2bdc-4902-40e6-b0cf-9827f5bbe268): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Dec 03 06:56:59 crc kubenswrapper[4475]: I1203 06:56:59.356846 4475 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-lhbp6"] Dec 03 06:56:59 crc kubenswrapper[4475]: E1203 06:56:59.357490 4475 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\", failed to \"StartContainer\" for \"kube-rbac-proxy\" with ErrImagePull: \"pull QPS exceeded\"]" pod="openstack-operators/test-operator-controller-manager-5854674fcc-csbzh" podUID="d569ccab-198f-441f-add9-b7c06254c8d0" Dec 03 06:56:59 crc kubenswrapper[4475]: I1203 06:56:59.358076 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/neutron-operator-controller-manager-5fdfd5b6b5-mzq79" event={"ID":"576ef351-044d-41c7-8292-4a81ff83296b","Type":"ContainerStarted","Data":"17e6a4e9591111dd13d6368e516716e9ec3198a952538042fdc828db2f459721"} Dec 03 06:56:59 crc kubenswrapper[4475]: E1203 
06:56:59.358262 4475 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:kube-rbac-proxy,Image:quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0,Command:[],Args:[--secure-listen-address=0.0.0.0:8443 --upstream=http://127.0.0.1:8080/ --logtostderr=true --v=0],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:https,HostPort:0,ContainerPort:8443,Protocol:TCP,HostIP:,},},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{134217728 0} {} BinarySI},},Requests:ResourceList{cpu: {{5 -3} {} 5m DecimalSI},memory: {{67108864 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-qkbcq,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod watcher-operator-controller-manager-769dc69bc-kmdw2_openstack-operators(fede2bdc-4902-40e6-b0cf-9827f5bbe268): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Dec 03 06:56:59 crc kubenswrapper[4475]: I1203 06:56:59.359200 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/cinder-operator-controller-manager-859b6ccc6-w8zbc" 
event={"ID":"17bf114e-8421-4eb6-a2be-466746af3f1e","Type":"ContainerStarted","Data":"5c7ef550052f4507f86b8d88403e242b0bbd39efdf1b5105ed23dc33305a85a2"} Dec 03 06:56:59 crc kubenswrapper[4475]: E1203 06:56:59.359611 4475 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\", failed to \"StartContainer\" for \"kube-rbac-proxy\" with ErrImagePull: \"pull QPS exceeded\"]" pod="openstack-operators/watcher-operator-controller-manager-769dc69bc-kmdw2" podUID="fede2bdc-4902-40e6-b0cf-9827f5bbe268" Dec 03 06:56:59 crc kubenswrapper[4475]: I1203 06:56:59.360216 4475 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/heat-operator-controller-manager-5f64f6f8bb-hjzmk"] Dec 03 06:56:59 crc kubenswrapper[4475]: I1203 06:56:59.360245 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/horizon-operator-controller-manager-68c6d99b8f-wkdz6" event={"ID":"3ad9f39b-3080-4828-aa52-20410ce25c66","Type":"ContainerStarted","Data":"0c0d5e8bf2177ec5e667b4278ad2f2cde48cfff09e5e1ae99edd358c71c6bfb3"} Dec 03 06:56:59 crc kubenswrapper[4475]: I1203 06:56:59.361834 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/barbican-operator-controller-manager-7d9dfd778-ddwp9" event={"ID":"67f4c1a1-c620-41ff-ab1a-bc603c755c6e","Type":"ContainerStarted","Data":"ed95cb8058ed951cc734791a91f8aa638e5be6775f6be44d67627c5281de510a"} Dec 03 06:56:59 crc kubenswrapper[4475]: I1203 06:56:59.363515 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/nova-operator-controller-manager-697bc559fc-7hmqk" event={"ID":"8a43730c-24cb-4520-893b-0b4c4750cce8","Type":"ContainerStarted","Data":"08f90a366b84e0fa898d6d6c9497f131d8b4090f70482988f544212c1a321380"} Dec 03 06:56:59 crc kubenswrapper[4475]: I1203 06:56:59.364390 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack-operators/designate-operator-controller-manager-78b4bc895b-dxtxl" event={"ID":"82ac6fd2-3363-4fa3-901b-90781ae2db4e","Type":"ContainerStarted","Data":"2cd5b325194644f8859658c5de2d96492d9caeebb77e00b214aa422848b8665d"} Dec 03 06:56:59 crc kubenswrapper[4475]: I1203 06:56:59.365902 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/mariadb-operator-controller-manager-56bbcc9d85-hq2sj" event={"ID":"4f41afa1-2e04-40ef-99ab-51303689a06d","Type":"ContainerStarted","Data":"92c08abb1d7fe993264a81d6d248e89277eca3a639e0084cb56096d99f269602"} Dec 03 06:56:59 crc kubenswrapper[4475]: I1203 06:56:59.378182 4475 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/swift-operator-controller-manager-5f8c65bbfc-vgtpk"] Dec 03 06:56:59 crc kubenswrapper[4475]: I1203 06:56:59.384759 4475 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/telemetry-operator-controller-manager-76cc84c6bb-zq8pq"] Dec 03 06:56:59 crc kubenswrapper[4475]: I1203 06:56:59.392429 4475 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/watcher-operator-controller-manager-769dc69bc-kmdw2"] Dec 03 06:56:59 crc kubenswrapper[4475]: I1203 06:56:59.577987 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/033c67ed-2564-4f1c-a477-ac9b901c6b0d-cert\") pod \"openstack-baremetal-operator-controller-manager-55d86b6686c8cxj\" (UID: \"033c67ed-2564-4f1c-a477-ac9b901c6b0d\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-55d86b6686c8cxj" Dec 03 06:56:59 crc kubenswrapper[4475]: E1203 06:56:59.578135 4475 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Dec 03 06:56:59 crc kubenswrapper[4475]: E1203 06:56:59.578176 4475 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/secret/033c67ed-2564-4f1c-a477-ac9b901c6b0d-cert podName:033c67ed-2564-4f1c-a477-ac9b901c6b0d nodeName:}" failed. No retries permitted until 2025-12-03 06:57:01.57816394 +0000 UTC m=+706.383062275 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/033c67ed-2564-4f1c-a477-ac9b901c6b0d-cert") pod "openstack-baremetal-operator-controller-manager-55d86b6686c8cxj" (UID: "033c67ed-2564-4f1c-a477-ac9b901c6b0d") : secret "openstack-baremetal-operator-webhook-server-cert" not found Dec 03 06:56:59 crc kubenswrapper[4475]: I1203 06:56:59.880654 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/454cf583-e2ff-4ad7-a07e-f7d4e881c67b-webhook-certs\") pod \"openstack-operator-controller-manager-9f56fc979-kk65s\" (UID: \"454cf583-e2ff-4ad7-a07e-f7d4e881c67b\") " pod="openstack-operators/openstack-operator-controller-manager-9f56fc979-kk65s" Dec 03 06:56:59 crc kubenswrapper[4475]: I1203 06:56:59.880728 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/454cf583-e2ff-4ad7-a07e-f7d4e881c67b-metrics-certs\") pod \"openstack-operator-controller-manager-9f56fc979-kk65s\" (UID: \"454cf583-e2ff-4ad7-a07e-f7d4e881c67b\") " pod="openstack-operators/openstack-operator-controller-manager-9f56fc979-kk65s" Dec 03 06:56:59 crc kubenswrapper[4475]: E1203 06:56:59.880879 4475 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found Dec 03 06:56:59 crc kubenswrapper[4475]: E1203 06:56:59.880931 4475 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/454cf583-e2ff-4ad7-a07e-f7d4e881c67b-metrics-certs podName:454cf583-e2ff-4ad7-a07e-f7d4e881c67b nodeName:}" failed. 
No retries permitted until 2025-12-03 06:57:01.880920306 +0000 UTC m=+706.685818631 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/454cf583-e2ff-4ad7-a07e-f7d4e881c67b-metrics-certs") pod "openstack-operator-controller-manager-9f56fc979-kk65s" (UID: "454cf583-e2ff-4ad7-a07e-f7d4e881c67b") : secret "metrics-server-cert" not found Dec 03 06:56:59 crc kubenswrapper[4475]: E1203 06:56:59.880974 4475 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Dec 03 06:56:59 crc kubenswrapper[4475]: E1203 06:56:59.881000 4475 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/454cf583-e2ff-4ad7-a07e-f7d4e881c67b-webhook-certs podName:454cf583-e2ff-4ad7-a07e-f7d4e881c67b nodeName:}" failed. No retries permitted until 2025-12-03 06:57:01.880993685 +0000 UTC m=+706.685892019 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/454cf583-e2ff-4ad7-a07e-f7d4e881c67b-webhook-certs") pod "openstack-operator-controller-manager-9f56fc979-kk65s" (UID: "454cf583-e2ff-4ad7-a07e-f7d4e881c67b") : secret "webhook-server-cert" not found Dec 03 06:57:00 crc kubenswrapper[4475]: I1203 06:57:00.380709 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/octavia-operator-controller-manager-998648c74-qj57b" event={"ID":"de989a5f-3b3a-4576-bc74-fb7484356b12","Type":"ContainerStarted","Data":"643ec1ccd8f7f6b9d3893e1e3b617dc3ba4b953fb66e5ebe05c34132096bc538"} Dec 03 06:57:00 crc kubenswrapper[4475]: I1203 06:57:00.383973 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/watcher-operator-controller-manager-769dc69bc-kmdw2" event={"ID":"fede2bdc-4902-40e6-b0cf-9827f5bbe268","Type":"ContainerStarted","Data":"539ba59b921f9aeb9a449075f5e693f077f5e326cdba3d7a94b95959d1fb1b9c"} Dec 03 06:57:00 crc kubenswrapper[4475]: I1203 
06:57:00.386988 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-lhbp6" event={"ID":"5afd068b-0798-44fc-a0c3-498e5d3803df","Type":"ContainerStarted","Data":"81db83e5cf1100bb90dafd7dc220f999ecfad89162925d7a210fa1193e042111"} Dec 03 06:57:00 crc kubenswrapper[4475]: E1203 06:57:00.387893 4475 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/octavia-operator@sha256:d9a3694865a7d54ee96397add18c3898886e98d079aa20876a0f4de1fa7a7168\\\"\", failed to \"StartContainer\" for \"kube-rbac-proxy\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0\\\"\"]" pod="openstack-operators/octavia-operator-controller-manager-998648c74-qj57b" podUID="de989a5f-3b3a-4576-bc74-fb7484356b12" Dec 03 06:57:00 crc kubenswrapper[4475]: E1203 06:57:00.389014 4475 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"operator\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/rabbitmq-cluster-operator@sha256:893e66303c1b0bc1d00a299a3f0380bad55c8dc813c8a1c6a4aab379f5aa12a2\\\"\"" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-lhbp6" podUID="5afd068b-0798-44fc-a0c3-498e5d3803df" Dec 03 06:57:00 crc kubenswrapper[4475]: E1203 06:57:00.396577 4475 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/watcher-operator@sha256:9aa8c03633e4b934c57868c1660acf47e7d386ac86bcb344df262c9ad76b8621\\\"\", failed to \"StartContainer\" for \"kube-rbac-proxy\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0\\\"\"]" 
pod="openstack-operators/watcher-operator-controller-manager-769dc69bc-kmdw2" podUID="fede2bdc-4902-40e6-b0cf-9827f5bbe268" Dec 03 06:57:00 crc kubenswrapper[4475]: I1203 06:57:00.397302 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/swift-operator-controller-manager-5f8c65bbfc-vgtpk" event={"ID":"31bcdf0e-dd3a-4219-8ca4-15be9d19172f","Type":"ContainerStarted","Data":"83a24a877354a28ae261c0159969185aca3ac43db5a02c03a2a5e5a209f2fc52"} Dec 03 06:57:00 crc kubenswrapper[4475]: E1203 06:57:00.414145 4475 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/swift-operator@sha256:2a3d21728a8bfb4e64617e63e61e2d1cb70a383ea3e8f846e0c3c3c02d2b0a9d\\\"\", failed to \"StartContainer\" for \"kube-rbac-proxy\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0\\\"\"]" pod="openstack-operators/swift-operator-controller-manager-5f8c65bbfc-vgtpk" podUID="31bcdf0e-dd3a-4219-8ca4-15be9d19172f" Dec 03 06:57:00 crc kubenswrapper[4475]: I1203 06:57:00.414508 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/heat-operator-controller-manager-5f64f6f8bb-hjzmk" event={"ID":"abd2d6fd-7465-4606-aba4-f6e6501e6a39","Type":"ContainerStarted","Data":"449553513f954dfbbbc0386b0753ba2106f22713c25b19c65ef35434802ef949"} Dec 03 06:57:00 crc kubenswrapper[4475]: I1203 06:57:00.424469 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/telemetry-operator-controller-manager-76cc84c6bb-zq8pq" event={"ID":"67c05515-1eb8-4c77-acbc-70ea437f1313","Type":"ContainerStarted","Data":"0186dc0ddf74a72caffabda08ca3a689fc3595b00123db808788c45558d68690"} Dec 03 06:57:00 crc kubenswrapper[4475]: E1203 06:57:00.425281 4475 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"manager\" with 
ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/heat-operator@sha256:c4abfc148600dfa85915f3dc911d988ea2335f26cb6b8d749fe79bfe53e5e429\\\"\", failed to \"StartContainer\" for \"kube-rbac-proxy\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0\\\"\"]" pod="openstack-operators/heat-operator-controller-manager-5f64f6f8bb-hjzmk" podUID="abd2d6fd-7465-4606-aba4-f6e6501e6a39" Dec 03 06:57:00 crc kubenswrapper[4475]: I1203 06:57:00.426316 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/test-operator-controller-manager-5854674fcc-csbzh" event={"ID":"d569ccab-198f-441f-add9-b7c06254c8d0","Type":"ContainerStarted","Data":"4cae7e95855fbbbbcaf51c939df093233a70cd478ad90c5955075d6c56e27c92"} Dec 03 06:57:00 crc kubenswrapper[4475]: E1203 06:57:00.430402 4475 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/test-operator@sha256:101b3e007d8c9f2e183262d7712f986ad51256448099069bc14f1ea5f997ab94\\\"\", failed to \"StartContainer\" for \"kube-rbac-proxy\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0\\\"\"]" pod="openstack-operators/test-operator-controller-manager-5854674fcc-csbzh" podUID="d569ccab-198f-441f-add9-b7c06254c8d0" Dec 03 06:57:00 crc kubenswrapper[4475]: E1203 06:57:00.430532 4475 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/telemetry-operator@sha256:7d66757c0af67104f0389e851a7cc0daa44443ad202d157417bd86bbb57cc385\\\"\", failed to \"StartContainer\" for \"kube-rbac-proxy\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0\\\"\"]" 
pod="openstack-operators/telemetry-operator-controller-manager-76cc84c6bb-zq8pq" podUID="67c05515-1eb8-4c77-acbc-70ea437f1313" Dec 03 06:57:01 crc kubenswrapper[4475]: I1203 06:57:01.312895 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/ea572cb7-8b01-4dcf-b3a4-617f79c6ef0e-cert\") pod \"infra-operator-controller-manager-57548d458d-7q4rl\" (UID: \"ea572cb7-8b01-4dcf-b3a4-617f79c6ef0e\") " pod="openstack-operators/infra-operator-controller-manager-57548d458d-7q4rl" Dec 03 06:57:01 crc kubenswrapper[4475]: E1203 06:57:01.313077 4475 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found Dec 03 06:57:01 crc kubenswrapper[4475]: E1203 06:57:01.313124 4475 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ea572cb7-8b01-4dcf-b3a4-617f79c6ef0e-cert podName:ea572cb7-8b01-4dcf-b3a4-617f79c6ef0e nodeName:}" failed. No retries permitted until 2025-12-03 06:57:05.31311146 +0000 UTC m=+710.118009794 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/ea572cb7-8b01-4dcf-b3a4-617f79c6ef0e-cert") pod "infra-operator-controller-manager-57548d458d-7q4rl" (UID: "ea572cb7-8b01-4dcf-b3a4-617f79c6ef0e") : secret "infra-operator-webhook-server-cert" not found Dec 03 06:57:01 crc kubenswrapper[4475]: E1203 06:57:01.443139 4475 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"operator\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/rabbitmq-cluster-operator@sha256:893e66303c1b0bc1d00a299a3f0380bad55c8dc813c8a1c6a4aab379f5aa12a2\\\"\"" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-lhbp6" podUID="5afd068b-0798-44fc-a0c3-498e5d3803df" Dec 03 06:57:01 crc kubenswrapper[4475]: E1203 06:57:01.444226 4475 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/telemetry-operator@sha256:7d66757c0af67104f0389e851a7cc0daa44443ad202d157417bd86bbb57cc385\\\"\", failed to \"StartContainer\" for \"kube-rbac-proxy\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0\\\"\"]" pod="openstack-operators/telemetry-operator-controller-manager-76cc84c6bb-zq8pq" podUID="67c05515-1eb8-4c77-acbc-70ea437f1313" Dec 03 06:57:01 crc kubenswrapper[4475]: E1203 06:57:01.444675 4475 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/swift-operator@sha256:2a3d21728a8bfb4e64617e63e61e2d1cb70a383ea3e8f846e0c3c3c02d2b0a9d\\\"\", failed to \"StartContainer\" for \"kube-rbac-proxy\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0\\\"\"]" 
pod="openstack-operators/swift-operator-controller-manager-5f8c65bbfc-vgtpk" podUID="31bcdf0e-dd3a-4219-8ca4-15be9d19172f" Dec 03 06:57:01 crc kubenswrapper[4475]: E1203 06:57:01.445036 4475 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/watcher-operator@sha256:9aa8c03633e4b934c57868c1660acf47e7d386ac86bcb344df262c9ad76b8621\\\"\", failed to \"StartContainer\" for \"kube-rbac-proxy\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0\\\"\"]" pod="openstack-operators/watcher-operator-controller-manager-769dc69bc-kmdw2" podUID="fede2bdc-4902-40e6-b0cf-9827f5bbe268" Dec 03 06:57:01 crc kubenswrapper[4475]: E1203 06:57:01.447812 4475 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/octavia-operator@sha256:d9a3694865a7d54ee96397add18c3898886e98d079aa20876a0f4de1fa7a7168\\\"\", failed to \"StartContainer\" for \"kube-rbac-proxy\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0\\\"\"]" pod="openstack-operators/octavia-operator-controller-manager-998648c74-qj57b" podUID="de989a5f-3b3a-4576-bc74-fb7484356b12" Dec 03 06:57:01 crc kubenswrapper[4475]: E1203 06:57:01.448804 4475 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/heat-operator@sha256:c4abfc148600dfa85915f3dc911d988ea2335f26cb6b8d749fe79bfe53e5e429\\\"\", failed to \"StartContainer\" for \"kube-rbac-proxy\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0\\\"\"]" 
pod="openstack-operators/heat-operator-controller-manager-5f64f6f8bb-hjzmk" podUID="abd2d6fd-7465-4606-aba4-f6e6501e6a39" Dec 03 06:57:01 crc kubenswrapper[4475]: E1203 06:57:01.448805 4475 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/test-operator@sha256:101b3e007d8c9f2e183262d7712f986ad51256448099069bc14f1ea5f997ab94\\\"\", failed to \"StartContainer\" for \"kube-rbac-proxy\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0\\\"\"]" pod="openstack-operators/test-operator-controller-manager-5854674fcc-csbzh" podUID="d569ccab-198f-441f-add9-b7c06254c8d0" Dec 03 06:57:01 crc kubenswrapper[4475]: I1203 06:57:01.624228 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/033c67ed-2564-4f1c-a477-ac9b901c6b0d-cert\") pod \"openstack-baremetal-operator-controller-manager-55d86b6686c8cxj\" (UID: \"033c67ed-2564-4f1c-a477-ac9b901c6b0d\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-55d86b6686c8cxj" Dec 03 06:57:01 crc kubenswrapper[4475]: E1203 06:57:01.625359 4475 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Dec 03 06:57:01 crc kubenswrapper[4475]: E1203 06:57:01.625415 4475 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/033c67ed-2564-4f1c-a477-ac9b901c6b0d-cert podName:033c67ed-2564-4f1c-a477-ac9b901c6b0d nodeName:}" failed. No retries permitted until 2025-12-03 06:57:05.625401969 +0000 UTC m=+710.430300293 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/033c67ed-2564-4f1c-a477-ac9b901c6b0d-cert") pod "openstack-baremetal-operator-controller-manager-55d86b6686c8cxj" (UID: "033c67ed-2564-4f1c-a477-ac9b901c6b0d") : secret "openstack-baremetal-operator-webhook-server-cert" not found Dec 03 06:57:01 crc kubenswrapper[4475]: I1203 06:57:01.931622 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/454cf583-e2ff-4ad7-a07e-f7d4e881c67b-metrics-certs\") pod \"openstack-operator-controller-manager-9f56fc979-kk65s\" (UID: \"454cf583-e2ff-4ad7-a07e-f7d4e881c67b\") " pod="openstack-operators/openstack-operator-controller-manager-9f56fc979-kk65s" Dec 03 06:57:01 crc kubenswrapper[4475]: E1203 06:57:01.931763 4475 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found Dec 03 06:57:01 crc kubenswrapper[4475]: E1203 06:57:01.931855 4475 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/454cf583-e2ff-4ad7-a07e-f7d4e881c67b-metrics-certs podName:454cf583-e2ff-4ad7-a07e-f7d4e881c67b nodeName:}" failed. No retries permitted until 2025-12-03 06:57:05.931838487 +0000 UTC m=+710.736736821 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/454cf583-e2ff-4ad7-a07e-f7d4e881c67b-metrics-certs") pod "openstack-operator-controller-manager-9f56fc979-kk65s" (UID: "454cf583-e2ff-4ad7-a07e-f7d4e881c67b") : secret "metrics-server-cert" not found Dec 03 06:57:01 crc kubenswrapper[4475]: I1203 06:57:01.932040 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/454cf583-e2ff-4ad7-a07e-f7d4e881c67b-webhook-certs\") pod \"openstack-operator-controller-manager-9f56fc979-kk65s\" (UID: \"454cf583-e2ff-4ad7-a07e-f7d4e881c67b\") " pod="openstack-operators/openstack-operator-controller-manager-9f56fc979-kk65s" Dec 03 06:57:01 crc kubenswrapper[4475]: E1203 06:57:01.932752 4475 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Dec 03 06:57:01 crc kubenswrapper[4475]: E1203 06:57:01.933295 4475 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/454cf583-e2ff-4ad7-a07e-f7d4e881c67b-webhook-certs podName:454cf583-e2ff-4ad7-a07e-f7d4e881c67b nodeName:}" failed. No retries permitted until 2025-12-03 06:57:05.933262977 +0000 UTC m=+710.738161310 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/454cf583-e2ff-4ad7-a07e-f7d4e881c67b-webhook-certs") pod "openstack-operator-controller-manager-9f56fc979-kk65s" (UID: "454cf583-e2ff-4ad7-a07e-f7d4e881c67b") : secret "webhook-server-cert" not found Dec 03 06:57:05 crc kubenswrapper[4475]: I1203 06:57:05.383465 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/ea572cb7-8b01-4dcf-b3a4-617f79c6ef0e-cert\") pod \"infra-operator-controller-manager-57548d458d-7q4rl\" (UID: \"ea572cb7-8b01-4dcf-b3a4-617f79c6ef0e\") " pod="openstack-operators/infra-operator-controller-manager-57548d458d-7q4rl" Dec 03 06:57:05 crc kubenswrapper[4475]: E1203 06:57:05.383678 4475 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found Dec 03 06:57:05 crc kubenswrapper[4475]: E1203 06:57:05.384425 4475 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ea572cb7-8b01-4dcf-b3a4-617f79c6ef0e-cert podName:ea572cb7-8b01-4dcf-b3a4-617f79c6ef0e nodeName:}" failed. No retries permitted until 2025-12-03 06:57:13.38439474 +0000 UTC m=+718.189293074 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/ea572cb7-8b01-4dcf-b3a4-617f79c6ef0e-cert") pod "infra-operator-controller-manager-57548d458d-7q4rl" (UID: "ea572cb7-8b01-4dcf-b3a4-617f79c6ef0e") : secret "infra-operator-webhook-server-cert" not found Dec 03 06:57:05 crc kubenswrapper[4475]: I1203 06:57:05.693542 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/033c67ed-2564-4f1c-a477-ac9b901c6b0d-cert\") pod \"openstack-baremetal-operator-controller-manager-55d86b6686c8cxj\" (UID: \"033c67ed-2564-4f1c-a477-ac9b901c6b0d\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-55d86b6686c8cxj" Dec 03 06:57:05 crc kubenswrapper[4475]: E1203 06:57:05.693833 4475 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Dec 03 06:57:05 crc kubenswrapper[4475]: E1203 06:57:05.693888 4475 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/033c67ed-2564-4f1c-a477-ac9b901c6b0d-cert podName:033c67ed-2564-4f1c-a477-ac9b901c6b0d nodeName:}" failed. No retries permitted until 2025-12-03 06:57:13.693868692 +0000 UTC m=+718.498767025 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/033c67ed-2564-4f1c-a477-ac9b901c6b0d-cert") pod "openstack-baremetal-operator-controller-manager-55d86b6686c8cxj" (UID: "033c67ed-2564-4f1c-a477-ac9b901c6b0d") : secret "openstack-baremetal-operator-webhook-server-cert" not found Dec 03 06:57:05 crc kubenswrapper[4475]: I1203 06:57:05.999077 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/454cf583-e2ff-4ad7-a07e-f7d4e881c67b-webhook-certs\") pod \"openstack-operator-controller-manager-9f56fc979-kk65s\" (UID: \"454cf583-e2ff-4ad7-a07e-f7d4e881c67b\") " pod="openstack-operators/openstack-operator-controller-manager-9f56fc979-kk65s" Dec 03 06:57:05 crc kubenswrapper[4475]: I1203 06:57:05.999161 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/454cf583-e2ff-4ad7-a07e-f7d4e881c67b-metrics-certs\") pod \"openstack-operator-controller-manager-9f56fc979-kk65s\" (UID: \"454cf583-e2ff-4ad7-a07e-f7d4e881c67b\") " pod="openstack-operators/openstack-operator-controller-manager-9f56fc979-kk65s" Dec 03 06:57:05 crc kubenswrapper[4475]: E1203 06:57:05.999285 4475 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found Dec 03 06:57:05 crc kubenswrapper[4475]: E1203 06:57:05.999341 4475 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/454cf583-e2ff-4ad7-a07e-f7d4e881c67b-metrics-certs podName:454cf583-e2ff-4ad7-a07e-f7d4e881c67b nodeName:}" failed. No retries permitted until 2025-12-03 06:57:13.999324485 +0000 UTC m=+718.804222819 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/454cf583-e2ff-4ad7-a07e-f7d4e881c67b-metrics-certs") pod "openstack-operator-controller-manager-9f56fc979-kk65s" (UID: "454cf583-e2ff-4ad7-a07e-f7d4e881c67b") : secret "metrics-server-cert" not found Dec 03 06:57:05 crc kubenswrapper[4475]: E1203 06:57:05.999283 4475 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Dec 03 06:57:05 crc kubenswrapper[4475]: E1203 06:57:05.999423 4475 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/454cf583-e2ff-4ad7-a07e-f7d4e881c67b-webhook-certs podName:454cf583-e2ff-4ad7-a07e-f7d4e881c67b nodeName:}" failed. No retries permitted until 2025-12-03 06:57:13.999408012 +0000 UTC m=+718.804306346 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/454cf583-e2ff-4ad7-a07e-f7d4e881c67b-webhook-certs") pod "openstack-operator-controller-manager-9f56fc979-kk65s" (UID: "454cf583-e2ff-4ad7-a07e-f7d4e881c67b") : secret "webhook-server-cert" not found Dec 03 06:57:11 crc kubenswrapper[4475]: E1203 06:57:11.142003 4475 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/horizon-operator@sha256:9e847f4dbdea19ab997f32a02b3680a9bd966f9c705911645c3866a19fda9ea5" Dec 03 06:57:11 crc kubenswrapper[4475]: E1203 06:57:11.142322 4475 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/horizon-operator@sha256:9e847f4dbdea19ab997f32a02b3680a9bd966f9c705911645c3866a19fda9ea5,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 
--metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-zl2rd,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod horizon-operator-controller-manager-68c6d99b8f-wkdz6_openstack-operators(3ad9f39b-3080-4828-aa52-20410ce25c66): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Dec 03 06:57:12 crc kubenswrapper[4475]: E1203 06:57:12.046390 4475 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/manila-operator@sha256:2e59cfbeefc3aff0bb0a6ae9ce2235129f5173c98dd5ee8dac229ad4895faea9" Dec 03 06:57:12 crc kubenswrapper[4475]: E1203 06:57:12.046711 4475 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/manila-operator@sha256:2e59cfbeefc3aff0bb0a6ae9ce2235129f5173c98dd5ee8dac229ad4895faea9,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 
--metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-rdb7n,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod manila-operator-controller-manager-7c79b5df47-xffjq_openstack-operators(0e10e107-5685-4ab7-a0d0-e2ede376c24e): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Dec 03 06:57:12 crc kubenswrapper[4475]: E1203 06:57:12.430326 4475 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/neutron-operator@sha256:0b3fb69f35c151895d3dffd514974a9f9fe1c77c3bca69b78b81efb183cf4557" Dec 03 06:57:12 crc kubenswrapper[4475]: E1203 06:57:12.430478 4475 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/neutron-operator@sha256:0b3fb69f35c151895d3dffd514974a9f9fe1c77c3bca69b78b81efb183cf4557,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 
--metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-6srkb,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod neutron-operator-controller-manager-5fdfd5b6b5-mzq79_openstack-operators(576ef351-044d-41c7-8292-4a81ff83296b): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Dec 03 06:57:13 crc kubenswrapper[4475]: I1203 06:57:13.389399 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/ea572cb7-8b01-4dcf-b3a4-617f79c6ef0e-cert\") pod \"infra-operator-controller-manager-57548d458d-7q4rl\" (UID: \"ea572cb7-8b01-4dcf-b3a4-617f79c6ef0e\") " pod="openstack-operators/infra-operator-controller-manager-57548d458d-7q4rl" Dec 03 06:57:13 crc kubenswrapper[4475]: E1203 06:57:13.389566 4475 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found Dec 03 06:57:13 crc kubenswrapper[4475]: E1203 06:57:13.389642 4475 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ea572cb7-8b01-4dcf-b3a4-617f79c6ef0e-cert podName:ea572cb7-8b01-4dcf-b3a4-617f79c6ef0e nodeName:}" failed. 
No retries permitted until 2025-12-03 06:57:29.389623398 +0000 UTC m=+734.194521732 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/ea572cb7-8b01-4dcf-b3a4-617f79c6ef0e-cert") pod "infra-operator-controller-manager-57548d458d-7q4rl" (UID: "ea572cb7-8b01-4dcf-b3a4-617f79c6ef0e") : secret "infra-operator-webhook-server-cert" not found Dec 03 06:57:13 crc kubenswrapper[4475]: E1203 06:57:13.556059 4475 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/designate-operator@sha256:9f68d7bc8c6bce38f46dee8a8272d5365c49fe7b32b2af52e8ac884e212f3a85" Dec 03 06:57:13 crc kubenswrapper[4475]: E1203 06:57:13.556195 4475 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/designate-operator@sha256:9f68d7bc8c6bce38f46dee8a8272d5365c49fe7b32b2af52e8ac884e212f3a85,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-m2vrz,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod designate-operator-controller-manager-78b4bc895b-dxtxl_openstack-operators(82ac6fd2-3363-4fa3-901b-90781ae2db4e): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Dec 03 06:57:13 crc kubenswrapper[4475]: I1203 06:57:13.792701 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/033c67ed-2564-4f1c-a477-ac9b901c6b0d-cert\") pod \"openstack-baremetal-operator-controller-manager-55d86b6686c8cxj\" (UID: \"033c67ed-2564-4f1c-a477-ac9b901c6b0d\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-55d86b6686c8cxj" Dec 03 06:57:13 crc kubenswrapper[4475]: I1203 06:57:13.797441 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: 
\"kubernetes.io/secret/033c67ed-2564-4f1c-a477-ac9b901c6b0d-cert\") pod \"openstack-baremetal-operator-controller-manager-55d86b6686c8cxj\" (UID: \"033c67ed-2564-4f1c-a477-ac9b901c6b0d\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-55d86b6686c8cxj" Dec 03 06:57:14 crc kubenswrapper[4475]: E1203 06:57:14.032178 4475 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/keystone-operator@sha256:72ad6517987f674af0d0ae092cbb874aeae909c8b8b60188099c311762ebc8f7" Dec 03 06:57:14 crc kubenswrapper[4475]: E1203 06:57:14.032332 4475 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/keystone-operator@sha256:72ad6517987f674af0d0ae092cbb874aeae909c8b8b60188099c311762ebc8f7,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-lk4wx,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod keystone-operator-controller-manager-7765d96ddf-bg8cq_openstack-operators(db3ddcfa-8d48-4336-bcc9-04d361dfb8e7): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Dec 03 06:57:14 crc kubenswrapper[4475]: I1203 06:57:14.077276 4475 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-baremetal-operator-controller-manager-55d86b6686c8cxj" Dec 03 06:57:14 crc kubenswrapper[4475]: I1203 06:57:14.095175 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/454cf583-e2ff-4ad7-a07e-f7d4e881c67b-webhook-certs\") pod \"openstack-operator-controller-manager-9f56fc979-kk65s\" (UID: \"454cf583-e2ff-4ad7-a07e-f7d4e881c67b\") " pod="openstack-operators/openstack-operator-controller-manager-9f56fc979-kk65s" Dec 03 06:57:14 crc kubenswrapper[4475]: I1203 06:57:14.095229 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/454cf583-e2ff-4ad7-a07e-f7d4e881c67b-metrics-certs\") pod \"openstack-operator-controller-manager-9f56fc979-kk65s\" (UID: \"454cf583-e2ff-4ad7-a07e-f7d4e881c67b\") " pod="openstack-operators/openstack-operator-controller-manager-9f56fc979-kk65s" Dec 03 06:57:14 crc kubenswrapper[4475]: E1203 06:57:14.095407 4475 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found Dec 03 06:57:14 crc kubenswrapper[4475]: E1203 06:57:14.095475 4475 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/454cf583-e2ff-4ad7-a07e-f7d4e881c67b-metrics-certs podName:454cf583-e2ff-4ad7-a07e-f7d4e881c67b nodeName:}" failed. No retries permitted until 2025-12-03 06:57:30.095441427 +0000 UTC m=+734.900339760 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/454cf583-e2ff-4ad7-a07e-f7d4e881c67b-metrics-certs") pod "openstack-operator-controller-manager-9f56fc979-kk65s" (UID: "454cf583-e2ff-4ad7-a07e-f7d4e881c67b") : secret "metrics-server-cert" not found Dec 03 06:57:14 crc kubenswrapper[4475]: I1203 06:57:14.100015 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/454cf583-e2ff-4ad7-a07e-f7d4e881c67b-webhook-certs\") pod \"openstack-operator-controller-manager-9f56fc979-kk65s\" (UID: \"454cf583-e2ff-4ad7-a07e-f7d4e881c67b\") " pod="openstack-operators/openstack-operator-controller-manager-9f56fc979-kk65s" Dec 03 06:57:14 crc kubenswrapper[4475]: E1203 06:57:14.481891 4475 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/nova-operator@sha256:779f0cee6024d0fb8f259b036fe790e62aa5a3b0431ea9bf15a6e7d02e2e5670" Dec 03 06:57:14 crc kubenswrapper[4475]: E1203 06:57:14.482016 4475 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/nova-operator@sha256:779f0cee6024d0fb8f259b036fe790e62aa5a3b0431ea9bf15a6e7d02e2e5670,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} 
BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-8nxr7,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod nova-operator-controller-manager-697bc559fc-7hmqk_openstack-operators(8a43730c-24cb-4520-893b-0b4c4750cce8): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Dec 03 06:57:15 crc kubenswrapper[4475]: I1203 06:57:15.271380 4475 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-baremetal-operator-controller-manager-55d86b6686c8cxj"] Dec 03 06:57:15 crc kubenswrapper[4475]: I1203 06:57:15.532036 4475 kubelet.go:2453] "SyncLoop (PLEG): 
event for pod" pod="openstack-operators/ovn-operator-controller-manager-b6456fdb6-8fnp4" event={"ID":"542d827b-14f9-4d32-bf00-1ff3749b9cb9","Type":"ContainerStarted","Data":"60b145398c7bda2b1bc84ee45e07fa200189fbdd4a12b38fb8b49d38b256b70b"} Dec 03 06:57:15 crc kubenswrapper[4475]: I1203 06:57:15.537648 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/placement-operator-controller-manager-78f8948974-kmfjk" event={"ID":"bb78bd55-770d-43ca-a8a2-8e22819004f9","Type":"ContainerStarted","Data":"5daed6f3b09ebf1bcb14686b054fe618d546c7a9266f6f9ba08b226889813899"} Dec 03 06:57:15 crc kubenswrapper[4475]: W1203 06:57:15.615090 4475 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod033c67ed_2564_4f1c_a477_ac9b901c6b0d.slice/crio-e075071863c2db5756ce21273462d807b9a8bec645e6ccee2f52ac2b89c22541 WatchSource:0}: Error finding container e075071863c2db5756ce21273462d807b9a8bec645e6ccee2f52ac2b89c22541: Status 404 returned error can't find the container with id e075071863c2db5756ce21273462d807b9a8bec645e6ccee2f52ac2b89c22541 Dec 03 06:57:16 crc kubenswrapper[4475]: I1203 06:57:16.544396 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/barbican-operator-controller-manager-7d9dfd778-ddwp9" event={"ID":"67f4c1a1-c620-41ff-ab1a-bc603c755c6e","Type":"ContainerStarted","Data":"c22303bf3b2c511be439f5ce3995c90e5ba9e99de571970440a55dfccf3d1a5b"} Dec 03 06:57:16 crc kubenswrapper[4475]: I1203 06:57:16.553100 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ironic-operator-controller-manager-6c548fd776-2s7sc" event={"ID":"6996078d-75b0-42b4-89af-6b8f8f7be702","Type":"ContainerStarted","Data":"e691c019be3c693017a5497e650d061f3489a76060231931f7e894c25754c2f8"} Dec 03 06:57:16 crc kubenswrapper[4475]: I1203 06:57:16.555941 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack-operators/glance-operator-controller-manager-77987cd8cd-pfb7t" event={"ID":"17fcf4b6-1bbf-4bda-a621-f6b563c6d7ae","Type":"ContainerStarted","Data":"adb5e68ce13be490b7927f6578c2d7872724da0d4a64fbca329a753c7203c5f9"} Dec 03 06:57:16 crc kubenswrapper[4475]: I1203 06:57:16.557342 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-baremetal-operator-controller-manager-55d86b6686c8cxj" event={"ID":"033c67ed-2564-4f1c-a477-ac9b901c6b0d","Type":"ContainerStarted","Data":"e075071863c2db5756ce21273462d807b9a8bec645e6ccee2f52ac2b89c22541"} Dec 03 06:57:16 crc kubenswrapper[4475]: I1203 06:57:16.558649 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/mariadb-operator-controller-manager-56bbcc9d85-hq2sj" event={"ID":"4f41afa1-2e04-40ef-99ab-51303689a06d","Type":"ContainerStarted","Data":"1640d533ee2673520ad7cc2f1177b9cc703446915e9205b198ef8eae6de498d8"} Dec 03 06:57:16 crc kubenswrapper[4475]: I1203 06:57:16.559908 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/cinder-operator-controller-manager-859b6ccc6-w8zbc" event={"ID":"17bf114e-8421-4eb6-a2be-466746af3f1e","Type":"ContainerStarted","Data":"38a66c50f71cc9ce3650da1a104175a7013477411883349429be3c2485d69282"} Dec 03 06:57:24 crc kubenswrapper[4475]: E1203 06:57:24.321741 4475 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/manila-operator-controller-manager-7c79b5df47-xffjq" podUID="0e10e107-5685-4ab7-a0d0-e2ede376c24e" Dec 03 06:57:24 crc kubenswrapper[4475]: E1203 06:57:24.406421 4475 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" 
pod="openstack-operators/keystone-operator-controller-manager-7765d96ddf-bg8cq" podUID="db3ddcfa-8d48-4336-bcc9-04d361dfb8e7" Dec 03 06:57:24 crc kubenswrapper[4475]: E1203 06:57:24.535359 4475 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/designate-operator-controller-manager-78b4bc895b-dxtxl" podUID="82ac6fd2-3363-4fa3-901b-90781ae2db4e" Dec 03 06:57:24 crc kubenswrapper[4475]: I1203 06:57:24.634642 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/heat-operator-controller-manager-5f64f6f8bb-hjzmk" event={"ID":"abd2d6fd-7465-4606-aba4-f6e6501e6a39","Type":"ContainerStarted","Data":"33325988545fa5b921059b515173f2bb509ffbfe751aa11b150d3438d3629c9b"} Dec 03 06:57:24 crc kubenswrapper[4475]: I1203 06:57:24.642095 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/keystone-operator-controller-manager-7765d96ddf-bg8cq" event={"ID":"db3ddcfa-8d48-4336-bcc9-04d361dfb8e7","Type":"ContainerStarted","Data":"5a273434c057da15bad168c1ca214161ffd8f362ccfa2036cc0c50b50505e0ac"} Dec 03 06:57:24 crc kubenswrapper[4475]: I1203 06:57:24.645707 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/octavia-operator-controller-manager-998648c74-qj57b" event={"ID":"de989a5f-3b3a-4576-bc74-fb7484356b12","Type":"ContainerStarted","Data":"56ba987a3276aa4f6034de4a979a3a6955a7e0f61711d3de3c1025560346a042"} Dec 03 06:57:24 crc kubenswrapper[4475]: I1203 06:57:24.648740 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/watcher-operator-controller-manager-769dc69bc-kmdw2" event={"ID":"fede2bdc-4902-40e6-b0cf-9827f5bbe268","Type":"ContainerStarted","Data":"f6c41793f088faaa852524a4eca2994d2c9e6965544b4001d61a3de1f14b22fe"} Dec 03 06:57:24 crc kubenswrapper[4475]: I1203 06:57:24.652487 4475 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="openstack-operators/designate-operator-controller-manager-78b4bc895b-dxtxl" event={"ID":"82ac6fd2-3363-4fa3-901b-90781ae2db4e","Type":"ContainerStarted","Data":"b7b68710f84efb60c68c12c792e4985ed0a6bb394050018f6b9f273ed89ac692"} Dec 03 06:57:24 crc kubenswrapper[4475]: I1203 06:57:24.671135 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-lhbp6" event={"ID":"5afd068b-0798-44fc-a0c3-498e5d3803df","Type":"ContainerStarted","Data":"bbc418602ab69b886e219d14ae75c898f6bc2e93b7bbc770fcb5515916f65697"} Dec 03 06:57:24 crc kubenswrapper[4475]: I1203 06:57:24.684786 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/swift-operator-controller-manager-5f8c65bbfc-vgtpk" event={"ID":"31bcdf0e-dd3a-4219-8ca4-15be9d19172f","Type":"ContainerStarted","Data":"64989cdeefeaf9017aadc944850bd78c4b2d7552064e215f342bfc889783a20c"} Dec 03 06:57:24 crc kubenswrapper[4475]: I1203 06:57:24.695583 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/test-operator-controller-manager-5854674fcc-csbzh" event={"ID":"d569ccab-198f-441f-add9-b7c06254c8d0","Type":"ContainerStarted","Data":"fa248e94f4f4eea9e215b66c8ba14ed7555dc02d9ba91af43f3447c90cd17ccd"} Dec 03 06:57:24 crc kubenswrapper[4475]: I1203 06:57:24.715518 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/cinder-operator-controller-manager-859b6ccc6-w8zbc" event={"ID":"17bf114e-8421-4eb6-a2be-466746af3f1e","Type":"ContainerStarted","Data":"2863abc46cc7f9d30809b1245fd405dd1a671fd6d5638670151e49c582c26cec"} Dec 03 06:57:24 crc kubenswrapper[4475]: I1203 06:57:24.716097 4475 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/cinder-operator-controller-manager-859b6ccc6-w8zbc" Dec 03 06:57:24 crc kubenswrapper[4475]: I1203 06:57:24.719891 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack-operators/openstack-baremetal-operator-controller-manager-55d86b6686c8cxj" event={"ID":"033c67ed-2564-4f1c-a477-ac9b901c6b0d","Type":"ContainerStarted","Data":"9c9dae24cd091e28766e63539ccdce0d6ddab69ae0650cb57af028669670eb7b"} Dec 03 06:57:24 crc kubenswrapper[4475]: I1203 06:57:24.722525 4475 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/cinder-operator-controller-manager-859b6ccc6-w8zbc" Dec 03 06:57:24 crc kubenswrapper[4475]: I1203 06:57:24.738580 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/manila-operator-controller-manager-7c79b5df47-xffjq" event={"ID":"0e10e107-5685-4ab7-a0d0-e2ede376c24e","Type":"ContainerStarted","Data":"21a022f9546e1a51d7466abd05e1725b66471e760fca2c3d9452e95025731021"} Dec 03 06:57:24 crc kubenswrapper[4475]: I1203 06:57:24.762616 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/telemetry-operator-controller-manager-76cc84c6bb-zq8pq" event={"ID":"67c05515-1eb8-4c77-acbc-70ea437f1313","Type":"ContainerStarted","Data":"92708cf3fb32dd1d0638c4ff866df17c1c710d5324c67b91d12ae8d30c97ee2e"} Dec 03 06:57:24 crc kubenswrapper[4475]: I1203 06:57:24.769249 4475 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-lhbp6" podStartSLOduration=2.087920462 podStartE2EDuration="26.76921159s" podCreationTimestamp="2025-12-03 06:56:58 +0000 UTC" firstStartedPulling="2025-12-03 06:56:59.355172295 +0000 UTC m=+704.160070629" lastFinishedPulling="2025-12-03 06:57:24.036463423 +0000 UTC m=+728.841361757" observedRunningTime="2025-12-03 06:57:24.767024957 +0000 UTC m=+729.571923292" watchObservedRunningTime="2025-12-03 06:57:24.76921159 +0000 UTC m=+729.574109914" Dec 03 06:57:24 crc kubenswrapper[4475]: I1203 06:57:24.797903 4475 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openstack-operators/cinder-operator-controller-manager-859b6ccc6-w8zbc" podStartSLOduration=2.564459545 podStartE2EDuration="27.797881602s" podCreationTimestamp="2025-12-03 06:56:57 +0000 UTC" firstStartedPulling="2025-12-03 06:56:58.796742619 +0000 UTC m=+703.601640953" lastFinishedPulling="2025-12-03 06:57:24.030164675 +0000 UTC m=+728.835063010" observedRunningTime="2025-12-03 06:57:24.797347537 +0000 UTC m=+729.602245871" watchObservedRunningTime="2025-12-03 06:57:24.797881602 +0000 UTC m=+729.602779936" Dec 03 06:57:24 crc kubenswrapper[4475]: E1203 06:57:24.944728 4475 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/horizon-operator-controller-manager-68c6d99b8f-wkdz6" podUID="3ad9f39b-3080-4828-aa52-20410ce25c66" Dec 03 06:57:25 crc kubenswrapper[4475]: E1203 06:57:25.344170 4475 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/nova-operator-controller-manager-697bc559fc-7hmqk" podUID="8a43730c-24cb-4520-893b-0b4c4750cce8" Dec 03 06:57:25 crc kubenswrapper[4475]: E1203 06:57:25.396875 4475 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/neutron-operator-controller-manager-5fdfd5b6b5-mzq79" podUID="576ef351-044d-41c7-8292-4a81ff83296b" Dec 03 06:57:25 crc kubenswrapper[4475]: I1203 06:57:25.782683 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-baremetal-operator-controller-manager-55d86b6686c8cxj" 
event={"ID":"033c67ed-2564-4f1c-a477-ac9b901c6b0d","Type":"ContainerStarted","Data":"b90a09b70b1e58110b7a4ed551883a0cf4ef6f812ac80bb8df8d17cb7303e9d2"} Dec 03 06:57:25 crc kubenswrapper[4475]: I1203 06:57:25.783422 4475 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-baremetal-operator-controller-manager-55d86b6686c8cxj" Dec 03 06:57:25 crc kubenswrapper[4475]: I1203 06:57:25.786924 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/test-operator-controller-manager-5854674fcc-csbzh" event={"ID":"d569ccab-198f-441f-add9-b7c06254c8d0","Type":"ContainerStarted","Data":"a69b0347e5feb9506826fc38f3b7d6c87080d805267d39c9be77ff8cd28aada6"} Dec 03 06:57:25 crc kubenswrapper[4475]: I1203 06:57:25.787281 4475 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/test-operator-controller-manager-5854674fcc-csbzh" Dec 03 06:57:25 crc kubenswrapper[4475]: I1203 06:57:25.792267 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/mariadb-operator-controller-manager-56bbcc9d85-hq2sj" event={"ID":"4f41afa1-2e04-40ef-99ab-51303689a06d","Type":"ContainerStarted","Data":"ebed86e7d1660b16d670c67b550a89f3256aa37917d78167998c7948e6f29d30"} Dec 03 06:57:25 crc kubenswrapper[4475]: I1203 06:57:25.792954 4475 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/mariadb-operator-controller-manager-56bbcc9d85-hq2sj" Dec 03 06:57:25 crc kubenswrapper[4475]: I1203 06:57:25.797702 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/watcher-operator-controller-manager-769dc69bc-kmdw2" event={"ID":"fede2bdc-4902-40e6-b0cf-9827f5bbe268","Type":"ContainerStarted","Data":"85e1edad9aa451d1eabc5d82ae151e9eed6bec752e7e0ff2cdf2d1a7cd2cd445"} Dec 03 06:57:25 crc kubenswrapper[4475]: I1203 06:57:25.798094 4475 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openstack-operators/watcher-operator-controller-manager-769dc69bc-kmdw2" Dec 03 06:57:25 crc kubenswrapper[4475]: I1203 06:57:25.809911 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/horizon-operator-controller-manager-68c6d99b8f-wkdz6" event={"ID":"3ad9f39b-3080-4828-aa52-20410ce25c66","Type":"ContainerStarted","Data":"c40052446da5ebd15a580088ad047d4d2d41a563059989e8d6c895f82b592fc2"} Dec 03 06:57:25 crc kubenswrapper[4475]: I1203 06:57:25.812880 4475 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/mariadb-operator-controller-manager-56bbcc9d85-hq2sj" Dec 03 06:57:25 crc kubenswrapper[4475]: I1203 06:57:25.842983 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/telemetry-operator-controller-manager-76cc84c6bb-zq8pq" event={"ID":"67c05515-1eb8-4c77-acbc-70ea437f1313","Type":"ContainerStarted","Data":"d6fa0332d4261c9df34dcf9bc96761d69e4f922ae939b0d2227dd07f1c1e444d"} Dec 03 06:57:25 crc kubenswrapper[4475]: I1203 06:57:25.843286 4475 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/telemetry-operator-controller-manager-76cc84c6bb-zq8pq" Dec 03 06:57:25 crc kubenswrapper[4475]: I1203 06:57:25.858425 4475 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-baremetal-operator-controller-manager-55d86b6686c8cxj" podStartSLOduration=20.49809282 podStartE2EDuration="28.858396447s" podCreationTimestamp="2025-12-03 06:56:57 +0000 UTC" firstStartedPulling="2025-12-03 06:57:15.624588314 +0000 UTC m=+720.429486648" lastFinishedPulling="2025-12-03 06:57:23.984891942 +0000 UTC m=+728.789790275" observedRunningTime="2025-12-03 06:57:25.847260859 +0000 UTC m=+730.652159203" watchObservedRunningTime="2025-12-03 06:57:25.858396447 +0000 UTC m=+730.663294781" Dec 03 06:57:25 crc kubenswrapper[4475]: I1203 06:57:25.863664 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack-operators/octavia-operator-controller-manager-998648c74-qj57b" event={"ID":"de989a5f-3b3a-4576-bc74-fb7484356b12","Type":"ContainerStarted","Data":"cae0813bfb17ff5a49a14075860c84c1b656b5241611dc0616b4c23dfd44d60d"} Dec 03 06:57:25 crc kubenswrapper[4475]: I1203 06:57:25.864036 4475 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/octavia-operator-controller-manager-998648c74-qj57b" Dec 03 06:57:25 crc kubenswrapper[4475]: I1203 06:57:25.870736 4475 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/mariadb-operator-controller-manager-56bbcc9d85-hq2sj" podStartSLOduration=3.6996826609999998 podStartE2EDuration="28.870721101s" podCreationTimestamp="2025-12-03 06:56:57 +0000 UTC" firstStartedPulling="2025-12-03 06:56:58.85909721 +0000 UTC m=+703.663995545" lastFinishedPulling="2025-12-03 06:57:24.030135651 +0000 UTC m=+728.835033985" observedRunningTime="2025-12-03 06:57:25.868353518 +0000 UTC m=+730.673251851" watchObservedRunningTime="2025-12-03 06:57:25.870721101 +0000 UTC m=+730.675619435" Dec 03 06:57:25 crc kubenswrapper[4475]: I1203 06:57:25.884669 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/barbican-operator-controller-manager-7d9dfd778-ddwp9" event={"ID":"67f4c1a1-c620-41ff-ab1a-bc603c755c6e","Type":"ContainerStarted","Data":"ef2b90475c5a8d4329f93a4d49f7b716968e4309b779ee8d03c649d4590aed9d"} Dec 03 06:57:25 crc kubenswrapper[4475]: I1203 06:57:25.884956 4475 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/barbican-operator-controller-manager-7d9dfd778-ddwp9" Dec 03 06:57:25 crc kubenswrapper[4475]: I1203 06:57:25.887968 4475 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/barbican-operator-controller-manager-7d9dfd778-ddwp9" Dec 03 06:57:25 crc kubenswrapper[4475]: I1203 06:57:25.910664 4475 pod_startup_latency_tracker.go:104] "Observed pod 
startup duration" pod="openstack-operators/test-operator-controller-manager-5854674fcc-csbzh" podStartSLOduration=4.252232523 podStartE2EDuration="28.910654051s" podCreationTimestamp="2025-12-03 06:56:57 +0000 UTC" firstStartedPulling="2025-12-03 06:56:59.353022581 +0000 UTC m=+704.157920916" lastFinishedPulling="2025-12-03 06:57:24.01144411 +0000 UTC m=+728.816342444" observedRunningTime="2025-12-03 06:57:25.897869701 +0000 UTC m=+730.702768045" watchObservedRunningTime="2025-12-03 06:57:25.910654051 +0000 UTC m=+730.715552384" Dec 03 06:57:25 crc kubenswrapper[4475]: I1203 06:57:25.920196 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/placement-operator-controller-manager-78f8948974-kmfjk" event={"ID":"bb78bd55-770d-43ca-a8a2-8e22819004f9","Type":"ContainerStarted","Data":"d2805e3f8216a9ae8b7bba44641d9f7521056a0240641f8292a67059f85604d0"} Dec 03 06:57:25 crc kubenswrapper[4475]: I1203 06:57:25.921017 4475 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/placement-operator-controller-manager-78f8948974-kmfjk" Dec 03 06:57:25 crc kubenswrapper[4475]: I1203 06:57:25.927695 4475 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/telemetry-operator-controller-manager-76cc84c6bb-zq8pq" podStartSLOduration=4.328502534 podStartE2EDuration="28.927678665s" podCreationTimestamp="2025-12-03 06:56:57 +0000 UTC" firstStartedPulling="2025-12-03 06:56:59.352104966 +0000 UTC m=+704.157003299" lastFinishedPulling="2025-12-03 06:57:23.951281096 +0000 UTC m=+728.756179430" observedRunningTime="2025-12-03 06:57:25.925514504 +0000 UTC m=+730.730412858" watchObservedRunningTime="2025-12-03 06:57:25.927678665 +0000 UTC m=+730.732576999" Dec 03 06:57:25 crc kubenswrapper[4475]: I1203 06:57:25.930947 4475 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/placement-operator-controller-manager-78f8948974-kmfjk" Dec 03 06:57:25 crc 
kubenswrapper[4475]: I1203 06:57:25.959584 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/glance-operator-controller-manager-77987cd8cd-pfb7t" event={"ID":"17fcf4b6-1bbf-4bda-a621-f6b563c6d7ae","Type":"ContainerStarted","Data":"1db68cccb1790bce82988d429541ec0ad7caad64460265b0b46673aa6bedcc74"} Dec 03 06:57:25 crc kubenswrapper[4475]: I1203 06:57:25.961109 4475 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/glance-operator-controller-manager-77987cd8cd-pfb7t" Dec 03 06:57:25 crc kubenswrapper[4475]: I1203 06:57:25.961778 4475 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/glance-operator-controller-manager-77987cd8cd-pfb7t" Dec 03 06:57:25 crc kubenswrapper[4475]: I1203 06:57:25.979152 4475 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/watcher-operator-controller-manager-769dc69bc-kmdw2" podStartSLOduration=4.336024625 podStartE2EDuration="28.979138968s" podCreationTimestamp="2025-12-03 06:56:57 +0000 UTC" firstStartedPulling="2025-12-03 06:56:59.356474485 +0000 UTC m=+704.161372819" lastFinishedPulling="2025-12-03 06:57:23.999588829 +0000 UTC m=+728.804487162" observedRunningTime="2025-12-03 06:57:25.975937666 +0000 UTC m=+730.780836000" watchObservedRunningTime="2025-12-03 06:57:25.979138968 +0000 UTC m=+730.784037302" Dec 03 06:57:25 crc kubenswrapper[4475]: I1203 06:57:25.983204 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/nova-operator-controller-manager-697bc559fc-7hmqk" event={"ID":"8a43730c-24cb-4520-893b-0b4c4750cce8","Type":"ContainerStarted","Data":"3da51b4760a3149611dc4afd77a22566321752804c49c4565b3e5239eca5d047"} Dec 03 06:57:26 crc kubenswrapper[4475]: I1203 06:57:26.001083 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ovn-operator-controller-manager-b6456fdb6-8fnp4" 
event={"ID":"542d827b-14f9-4d32-bf00-1ff3749b9cb9","Type":"ContainerStarted","Data":"57f818790fb98509f7b5cad52a64368610eafa49c69448f6cdee9b85de2e00b0"} Dec 03 06:57:26 crc kubenswrapper[4475]: I1203 06:57:26.001980 4475 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/ovn-operator-controller-manager-b6456fdb6-8fnp4" Dec 03 06:57:26 crc kubenswrapper[4475]: I1203 06:57:26.006702 4475 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/barbican-operator-controller-manager-7d9dfd778-ddwp9" podStartSLOduration=4.225861208 podStartE2EDuration="29.006691107s" podCreationTimestamp="2025-12-03 06:56:57 +0000 UTC" firstStartedPulling="2025-12-03 06:56:59.309086626 +0000 UTC m=+704.113984961" lastFinishedPulling="2025-12-03 06:57:24.089916526 +0000 UTC m=+728.894814860" observedRunningTime="2025-12-03 06:57:26.00365203 +0000 UTC m=+730.808550374" watchObservedRunningTime="2025-12-03 06:57:26.006691107 +0000 UTC m=+730.811589440" Dec 03 06:57:26 crc kubenswrapper[4475]: I1203 06:57:26.008287 4475 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/ovn-operator-controller-manager-b6456fdb6-8fnp4" Dec 03 06:57:26 crc kubenswrapper[4475]: I1203 06:57:26.041527 4475 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/glance-operator-controller-manager-77987cd8cd-pfb7t" podStartSLOduration=3.611236004 podStartE2EDuration="29.041495919s" podCreationTimestamp="2025-12-03 06:56:57 +0000 UTC" firstStartedPulling="2025-12-03 06:56:58.599607991 +0000 UTC m=+703.404506325" lastFinishedPulling="2025-12-03 06:57:24.029867907 +0000 UTC m=+728.834766240" observedRunningTime="2025-12-03 06:57:26.027831254 +0000 UTC m=+730.832729588" watchObservedRunningTime="2025-12-03 06:57:26.041495919 +0000 UTC m=+730.846394253" Dec 03 06:57:26 crc kubenswrapper[4475]: I1203 06:57:26.044687 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack-operators/heat-operator-controller-manager-5f64f6f8bb-hjzmk" event={"ID":"abd2d6fd-7465-4606-aba4-f6e6501e6a39","Type":"ContainerStarted","Data":"a339e791d1ae2536c8467a116456052572c7f5f8ccf4e3bf741ca3b9f4c3ca33"} Dec 03 06:57:26 crc kubenswrapper[4475]: I1203 06:57:26.045529 4475 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/heat-operator-controller-manager-5f64f6f8bb-hjzmk" Dec 03 06:57:26 crc kubenswrapper[4475]: I1203 06:57:26.062965 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/neutron-operator-controller-manager-5fdfd5b6b5-mzq79" event={"ID":"576ef351-044d-41c7-8292-4a81ff83296b","Type":"ContainerStarted","Data":"20cf5a61f9429b40a641d92c4966c60a6ae7582dcc7edba796511272285c0b52"} Dec 03 06:57:26 crc kubenswrapper[4475]: I1203 06:57:26.067107 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ironic-operator-controller-manager-6c548fd776-2s7sc" event={"ID":"6996078d-75b0-42b4-89af-6b8f8f7be702","Type":"ContainerStarted","Data":"888b67162a72248c45fac2bda5230d22b498d4d9fdebd29b75b7f6963596d0f4"} Dec 03 06:57:26 crc kubenswrapper[4475]: I1203 06:57:26.067670 4475 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/ironic-operator-controller-manager-6c548fd776-2s7sc" Dec 03 06:57:26 crc kubenswrapper[4475]: I1203 06:57:26.070639 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/swift-operator-controller-manager-5f8c65bbfc-vgtpk" event={"ID":"31bcdf0e-dd3a-4219-8ca4-15be9d19172f","Type":"ContainerStarted","Data":"a988fe08e6a1b4bc27b140bee871ebbda721ab9a8e959f8fab63178da3f85b04"} Dec 03 06:57:26 crc kubenswrapper[4475]: I1203 06:57:26.070661 4475 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/swift-operator-controller-manager-5f8c65bbfc-vgtpk" Dec 03 06:57:26 crc kubenswrapper[4475]: I1203 06:57:26.075903 4475 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/placement-operator-controller-manager-78f8948974-kmfjk" podStartSLOduration=4.391138745 podStartE2EDuration="29.075886352s" podCreationTimestamp="2025-12-03 06:56:57 +0000 UTC" firstStartedPulling="2025-12-03 06:56:59.326312159 +0000 UTC m=+704.131210493" lastFinishedPulling="2025-12-03 06:57:24.011059766 +0000 UTC m=+728.815958100" observedRunningTime="2025-12-03 06:57:26.063431002 +0000 UTC m=+730.868329335" watchObservedRunningTime="2025-12-03 06:57:26.075886352 +0000 UTC m=+730.880784685" Dec 03 06:57:26 crc kubenswrapper[4475]: I1203 06:57:26.090597 4475 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/ironic-operator-controller-manager-6c548fd776-2s7sc" Dec 03 06:57:26 crc kubenswrapper[4475]: I1203 06:57:26.115587 4475 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/octavia-operator-controller-manager-998648c74-qj57b" podStartSLOduration=4.508809866 podStartE2EDuration="29.115565542s" podCreationTimestamp="2025-12-03 06:56:57 +0000 UTC" firstStartedPulling="2025-12-03 06:56:59.327611112 +0000 UTC m=+704.132509446" lastFinishedPulling="2025-12-03 06:57:23.934366787 +0000 UTC m=+728.739265122" observedRunningTime="2025-12-03 06:57:26.105225923 +0000 UTC m=+730.910124257" watchObservedRunningTime="2025-12-03 06:57:26.115565542 +0000 UTC m=+730.920463877" Dec 03 06:57:26 crc kubenswrapper[4475]: I1203 06:57:26.139885 4475 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/ovn-operator-controller-manager-b6456fdb6-8fnp4" podStartSLOduration=4.421733948 podStartE2EDuration="29.139861796s" podCreationTimestamp="2025-12-03 06:56:57 +0000 UTC" firstStartedPulling="2025-12-03 06:56:59.309479034 +0000 UTC m=+704.114377369" lastFinishedPulling="2025-12-03 06:57:24.027606883 +0000 UTC m=+728.832505217" observedRunningTime="2025-12-03 
06:57:26.138285372 +0000 UTC m=+730.943183705" watchObservedRunningTime="2025-12-03 06:57:26.139861796 +0000 UTC m=+730.944760130" Dec 03 06:57:26 crc kubenswrapper[4475]: I1203 06:57:26.181064 4475 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/swift-operator-controller-manager-5f8c65bbfc-vgtpk" podStartSLOduration=4.933544132 podStartE2EDuration="29.181039367s" podCreationTimestamp="2025-12-03 06:56:57 +0000 UTC" firstStartedPulling="2025-12-03 06:56:59.337624397 +0000 UTC m=+704.142522731" lastFinishedPulling="2025-12-03 06:57:23.585119632 +0000 UTC m=+728.390017966" observedRunningTime="2025-12-03 06:57:26.179098976 +0000 UTC m=+730.983997320" watchObservedRunningTime="2025-12-03 06:57:26.181039367 +0000 UTC m=+730.985937701" Dec 03 06:57:26 crc kubenswrapper[4475]: I1203 06:57:26.233544 4475 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/heat-operator-controller-manager-5f64f6f8bb-hjzmk" podStartSLOduration=12.939865306 podStartE2EDuration="29.23352034s" podCreationTimestamp="2025-12-03 06:56:57 +0000 UTC" firstStartedPulling="2025-12-03 06:56:59.349913964 +0000 UTC m=+704.154812298" lastFinishedPulling="2025-12-03 06:57:15.643568999 +0000 UTC m=+720.448467332" observedRunningTime="2025-12-03 06:57:26.231232186 +0000 UTC m=+731.036130509" watchObservedRunningTime="2025-12-03 06:57:26.23352034 +0000 UTC m=+731.038418674" Dec 03 06:57:26 crc kubenswrapper[4475]: I1203 06:57:26.256335 4475 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/ironic-operator-controller-manager-6c548fd776-2s7sc" podStartSLOduration=4.469594455 podStartE2EDuration="29.256320189s" podCreationTimestamp="2025-12-03 06:56:57 +0000 UTC" firstStartedPulling="2025-12-03 06:56:59.242707345 +0000 UTC m=+704.047605679" lastFinishedPulling="2025-12-03 06:57:24.029433079 +0000 UTC m=+728.834331413" observedRunningTime="2025-12-03 06:57:26.254469729 +0000 UTC 
m=+731.059368062" watchObservedRunningTime="2025-12-03 06:57:26.256320189 +0000 UTC m=+731.061218523" Dec 03 06:57:27 crc kubenswrapper[4475]: I1203 06:57:27.078200 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/horizon-operator-controller-manager-68c6d99b8f-wkdz6" event={"ID":"3ad9f39b-3080-4828-aa52-20410ce25c66","Type":"ContainerStarted","Data":"8386faa43c0f5c1034f9963fb7874eed96486a5f0fc1b874ccd14c0431898a17"} Dec 03 06:57:27 crc kubenswrapper[4475]: I1203 06:57:27.078492 4475 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/horizon-operator-controller-manager-68c6d99b8f-wkdz6" Dec 03 06:57:27 crc kubenswrapper[4475]: I1203 06:57:27.080442 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/nova-operator-controller-manager-697bc559fc-7hmqk" event={"ID":"8a43730c-24cb-4520-893b-0b4c4750cce8","Type":"ContainerStarted","Data":"822f3f75779b87c5bea47ca2a20cebd32f58b911017ac946a8c8fba8476dfc0c"} Dec 03 06:57:27 crc kubenswrapper[4475]: I1203 06:57:27.080503 4475 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/nova-operator-controller-manager-697bc559fc-7hmqk" Dec 03 06:57:27 crc kubenswrapper[4475]: I1203 06:57:27.082375 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/designate-operator-controller-manager-78b4bc895b-dxtxl" event={"ID":"82ac6fd2-3363-4fa3-901b-90781ae2db4e","Type":"ContainerStarted","Data":"d4cd8c43df436954039be0ac8de0b6f5bd9ca5fcb4069420fff29a8078742745"} Dec 03 06:57:27 crc kubenswrapper[4475]: I1203 06:57:27.086816 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/keystone-operator-controller-manager-7765d96ddf-bg8cq" event={"ID":"db3ddcfa-8d48-4336-bcc9-04d361dfb8e7","Type":"ContainerStarted","Data":"529a416514041c12a32cbcced8869edb6b6cb76028f021d48f23fba04afbd31d"} Dec 03 06:57:27 crc kubenswrapper[4475]: I1203 06:57:27.087513 4475 
kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/keystone-operator-controller-manager-7765d96ddf-bg8cq" Dec 03 06:57:27 crc kubenswrapper[4475]: I1203 06:57:27.093710 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/manila-operator-controller-manager-7c79b5df47-xffjq" event={"ID":"0e10e107-5685-4ab7-a0d0-e2ede376c24e","Type":"ContainerStarted","Data":"0f919a9a3515673aa66b2d31014725e0df1f1f0ad36994b3825de95718a1348d"} Dec 03 06:57:27 crc kubenswrapper[4475]: I1203 06:57:27.094368 4475 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/manila-operator-controller-manager-7c79b5df47-xffjq" Dec 03 06:57:27 crc kubenswrapper[4475]: I1203 06:57:27.101794 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/neutron-operator-controller-manager-5fdfd5b6b5-mzq79" event={"ID":"576ef351-044d-41c7-8292-4a81ff83296b","Type":"ContainerStarted","Data":"e37138c858a12b18763453d69fffbdec522f691a1bf66c36363e6b8ac1afb4f4"} Dec 03 06:57:27 crc kubenswrapper[4475]: I1203 06:57:27.101921 4475 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/neutron-operator-controller-manager-5fdfd5b6b5-mzq79" Dec 03 06:57:27 crc kubenswrapper[4475]: I1203 06:57:27.116636 4475 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/horizon-operator-controller-manager-68c6d99b8f-wkdz6" podStartSLOduration=2.948529288 podStartE2EDuration="30.116601981s" podCreationTimestamp="2025-12-03 06:56:57 +0000 UTC" firstStartedPulling="2025-12-03 06:56:59.264223668 +0000 UTC m=+704.069122002" lastFinishedPulling="2025-12-03 06:57:26.432296362 +0000 UTC m=+731.237194695" observedRunningTime="2025-12-03 06:57:27.114921551 +0000 UTC m=+731.919819885" watchObservedRunningTime="2025-12-03 06:57:27.116601981 +0000 UTC m=+731.921500305" Dec 03 06:57:27 crc kubenswrapper[4475]: I1203 06:57:27.170313 4475 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/designate-operator-controller-manager-78b4bc895b-dxtxl" podStartSLOduration=3.454344769 podStartE2EDuration="30.170288252s" podCreationTimestamp="2025-12-03 06:56:57 +0000 UTC" firstStartedPulling="2025-12-03 06:56:58.768585855 +0000 UTC m=+703.573484189" lastFinishedPulling="2025-12-03 06:57:25.484529339 +0000 UTC m=+730.289427672" observedRunningTime="2025-12-03 06:57:27.158107178 +0000 UTC m=+731.963005512" watchObservedRunningTime="2025-12-03 06:57:27.170288252 +0000 UTC m=+731.975186585" Dec 03 06:57:27 crc kubenswrapper[4475]: I1203 06:57:27.235594 4475 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/keystone-operator-controller-manager-7765d96ddf-bg8cq" podStartSLOduration=3.617063083 podStartE2EDuration="30.235573902s" podCreationTimestamp="2025-12-03 06:56:57 +0000 UTC" firstStartedPulling="2025-12-03 06:56:58.866065498 +0000 UTC m=+703.670963832" lastFinishedPulling="2025-12-03 06:57:25.484576327 +0000 UTC m=+730.289474651" observedRunningTime="2025-12-03 06:57:27.225936984 +0000 UTC m=+732.030835328" watchObservedRunningTime="2025-12-03 06:57:27.235573902 +0000 UTC m=+732.040472236" Dec 03 06:57:27 crc kubenswrapper[4475]: I1203 06:57:27.272181 4475 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/nova-operator-controller-manager-697bc559fc-7hmqk" podStartSLOduration=2.9785410260000003 podStartE2EDuration="30.272161047s" podCreationTimestamp="2025-12-03 06:56:57 +0000 UTC" firstStartedPulling="2025-12-03 06:56:59.228550865 +0000 UTC m=+704.033449199" lastFinishedPulling="2025-12-03 06:57:26.522170886 +0000 UTC m=+731.327069220" observedRunningTime="2025-12-03 06:57:27.267586081 +0000 UTC m=+732.072484415" watchObservedRunningTime="2025-12-03 06:57:27.272161047 +0000 UTC m=+732.077059381" Dec 03 06:57:27 crc kubenswrapper[4475]: I1203 06:57:27.326883 4475 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/manila-operator-controller-manager-7c79b5df47-xffjq" podStartSLOduration=3.9790587840000002 podStartE2EDuration="30.326868538s" podCreationTimestamp="2025-12-03 06:56:57 +0000 UTC" firstStartedPulling="2025-12-03 06:56:59.302905069 +0000 UTC m=+704.107803403" lastFinishedPulling="2025-12-03 06:57:25.650714823 +0000 UTC m=+730.455613157" observedRunningTime="2025-12-03 06:57:27.300669976 +0000 UTC m=+732.105568310" watchObservedRunningTime="2025-12-03 06:57:27.326868538 +0000 UTC m=+732.131766872" Dec 03 06:57:27 crc kubenswrapper[4475]: I1203 06:57:27.330275 4475 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/neutron-operator-controller-manager-5fdfd5b6b5-mzq79" podStartSLOduration=2.925250853 podStartE2EDuration="30.33026645s" podCreationTimestamp="2025-12-03 06:56:57 +0000 UTC" firstStartedPulling="2025-12-03 06:56:59.206065821 +0000 UTC m=+704.010964155" lastFinishedPulling="2025-12-03 06:57:26.611081417 +0000 UTC m=+731.415979752" observedRunningTime="2025-12-03 06:57:27.325858838 +0000 UTC m=+732.130757172" watchObservedRunningTime="2025-12-03 06:57:27.33026645 +0000 UTC m=+732.135164784" Dec 03 06:57:27 crc kubenswrapper[4475]: I1203 06:57:27.754473 4475 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/designate-operator-controller-manager-78b4bc895b-dxtxl" Dec 03 06:57:29 crc kubenswrapper[4475]: I1203 06:57:29.423014 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/ea572cb7-8b01-4dcf-b3a4-617f79c6ef0e-cert\") pod \"infra-operator-controller-manager-57548d458d-7q4rl\" (UID: \"ea572cb7-8b01-4dcf-b3a4-617f79c6ef0e\") " pod="openstack-operators/infra-operator-controller-manager-57548d458d-7q4rl" Dec 03 06:57:29 crc kubenswrapper[4475]: I1203 06:57:29.432538 4475 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/ea572cb7-8b01-4dcf-b3a4-617f79c6ef0e-cert\") pod \"infra-operator-controller-manager-57548d458d-7q4rl\" (UID: \"ea572cb7-8b01-4dcf-b3a4-617f79c6ef0e\") " pod="openstack-operators/infra-operator-controller-manager-57548d458d-7q4rl" Dec 03 06:57:29 crc kubenswrapper[4475]: I1203 06:57:29.673197 4475 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"infra-operator-controller-manager-dockercfg-rbpbs" Dec 03 06:57:29 crc kubenswrapper[4475]: I1203 06:57:29.681991 4475 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/infra-operator-controller-manager-57548d458d-7q4rl" Dec 03 06:57:30 crc kubenswrapper[4475]: I1203 06:57:30.059033 4475 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/infra-operator-controller-manager-57548d458d-7q4rl"] Dec 03 06:57:30 crc kubenswrapper[4475]: W1203 06:57:30.063637 4475 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podea572cb7_8b01_4dcf_b3a4_617f79c6ef0e.slice/crio-ad8c48305b3421ef525415482c187dc8bd29fdf32f43b100bccac9a695a4877e WatchSource:0}: Error finding container ad8c48305b3421ef525415482c187dc8bd29fdf32f43b100bccac9a695a4877e: Status 404 returned error can't find the container with id ad8c48305b3421ef525415482c187dc8bd29fdf32f43b100bccac9a695a4877e Dec 03 06:57:30 crc kubenswrapper[4475]: I1203 06:57:30.117477 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/infra-operator-controller-manager-57548d458d-7q4rl" event={"ID":"ea572cb7-8b01-4dcf-b3a4-617f79c6ef0e","Type":"ContainerStarted","Data":"ad8c48305b3421ef525415482c187dc8bd29fdf32f43b100bccac9a695a4877e"} Dec 03 06:57:30 crc kubenswrapper[4475]: I1203 06:57:30.135617 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: 
\"kubernetes.io/secret/454cf583-e2ff-4ad7-a07e-f7d4e881c67b-metrics-certs\") pod \"openstack-operator-controller-manager-9f56fc979-kk65s\" (UID: \"454cf583-e2ff-4ad7-a07e-f7d4e881c67b\") " pod="openstack-operators/openstack-operator-controller-manager-9f56fc979-kk65s" Dec 03 06:57:30 crc kubenswrapper[4475]: I1203 06:57:30.142951 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/454cf583-e2ff-4ad7-a07e-f7d4e881c67b-metrics-certs\") pod \"openstack-operator-controller-manager-9f56fc979-kk65s\" (UID: \"454cf583-e2ff-4ad7-a07e-f7d4e881c67b\") " pod="openstack-operators/openstack-operator-controller-manager-9f56fc979-kk65s" Dec 03 06:57:30 crc kubenswrapper[4475]: I1203 06:57:30.371353 4475 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-operator-controller-manager-dockercfg-wjzk4" Dec 03 06:57:30 crc kubenswrapper[4475]: I1203 06:57:30.380841 4475 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-controller-manager-9f56fc979-kk65s" Dec 03 06:57:30 crc kubenswrapper[4475]: I1203 06:57:30.763856 4475 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-manager-9f56fc979-kk65s"] Dec 03 06:57:31 crc kubenswrapper[4475]: I1203 06:57:31.134527 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-manager-9f56fc979-kk65s" event={"ID":"454cf583-e2ff-4ad7-a07e-f7d4e881c67b","Type":"ContainerStarted","Data":"4bbd410e40b31dc5c73640ea49467275c871f7fd74c3a2e2b76dec5af7e2e068"} Dec 03 06:57:31 crc kubenswrapper[4475]: I1203 06:57:31.134596 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-manager-9f56fc979-kk65s" event={"ID":"454cf583-e2ff-4ad7-a07e-f7d4e881c67b","Type":"ContainerStarted","Data":"06a4d755463a063f1cb7cae78a852182910179ad9efca86353a65e311e952030"} Dec 03 06:57:31 crc kubenswrapper[4475]: I1203 06:57:31.170070 4475 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-operator-controller-manager-9f56fc979-kk65s" podStartSLOduration=33.170052525 podStartE2EDuration="33.170052525s" podCreationTimestamp="2025-12-03 06:56:58 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 06:57:31.160115072 +0000 UTC m=+735.965013405" watchObservedRunningTime="2025-12-03 06:57:31.170052525 +0000 UTC m=+735.974950859" Dec 03 06:57:32 crc kubenswrapper[4475]: I1203 06:57:32.142385 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/infra-operator-controller-manager-57548d458d-7q4rl" event={"ID":"ea572cb7-8b01-4dcf-b3a4-617f79c6ef0e","Type":"ContainerStarted","Data":"439e3a467ec52332ba10ca06f209b700e39ae979ed32420b075deab208fd854d"} Dec 03 06:57:32 crc kubenswrapper[4475]: 
I1203 06:57:32.142434 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/infra-operator-controller-manager-57548d458d-7q4rl" event={"ID":"ea572cb7-8b01-4dcf-b3a4-617f79c6ef0e","Type":"ContainerStarted","Data":"2a517aaff81578c56d5ec6434550bc7a234bad4cafa84516afdd086512d59d41"} Dec 03 06:57:32 crc kubenswrapper[4475]: I1203 06:57:32.143931 4475 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-operator-controller-manager-9f56fc979-kk65s" Dec 03 06:57:32 crc kubenswrapper[4475]: I1203 06:57:32.159128 4475 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/infra-operator-controller-manager-57548d458d-7q4rl" podStartSLOduration=33.571001407 podStartE2EDuration="35.159111964s" podCreationTimestamp="2025-12-03 06:56:57 +0000 UTC" firstStartedPulling="2025-12-03 06:57:30.066823059 +0000 UTC m=+734.871721392" lastFinishedPulling="2025-12-03 06:57:31.654933614 +0000 UTC m=+736.459831949" observedRunningTime="2025-12-03 06:57:32.156839129 +0000 UTC m=+736.961737463" watchObservedRunningTime="2025-12-03 06:57:32.159111964 +0000 UTC m=+736.964010298" Dec 03 06:57:33 crc kubenswrapper[4475]: I1203 06:57:33.150251 4475 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/infra-operator-controller-manager-57548d458d-7q4rl" Dec 03 06:57:34 crc kubenswrapper[4475]: I1203 06:57:34.082587 4475 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-baremetal-operator-controller-manager-55d86b6686c8cxj" Dec 03 06:57:37 crc kubenswrapper[4475]: I1203 06:57:37.756426 4475 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/designate-operator-controller-manager-78b4bc895b-dxtxl" Dec 03 06:57:37 crc kubenswrapper[4475]: I1203 06:57:37.859292 4475 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openstack-operators/horizon-operator-controller-manager-68c6d99b8f-wkdz6" Dec 03 06:57:37 crc kubenswrapper[4475]: I1203 06:57:37.904793 4475 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/heat-operator-controller-manager-5f64f6f8bb-hjzmk" Dec 03 06:57:37 crc kubenswrapper[4475]: I1203 06:57:37.939826 4475 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/keystone-operator-controller-manager-7765d96ddf-bg8cq" Dec 03 06:57:37 crc kubenswrapper[4475]: I1203 06:57:37.970411 4475 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/manila-operator-controller-manager-7c79b5df47-xffjq" Dec 03 06:57:38 crc kubenswrapper[4475]: I1203 06:57:38.095207 4475 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/nova-operator-controller-manager-697bc559fc-7hmqk" Dec 03 06:57:38 crc kubenswrapper[4475]: I1203 06:57:38.135423 4475 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/neutron-operator-controller-manager-5fdfd5b6b5-mzq79" Dec 03 06:57:38 crc kubenswrapper[4475]: I1203 06:57:38.164993 4475 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/octavia-operator-controller-manager-998648c74-qj57b" Dec 03 06:57:38 crc kubenswrapper[4475]: I1203 06:57:38.323666 4475 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/swift-operator-controller-manager-5f8c65bbfc-vgtpk" Dec 03 06:57:38 crc kubenswrapper[4475]: I1203 06:57:38.381275 4475 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/telemetry-operator-controller-manager-76cc84c6bb-zq8pq" Dec 03 06:57:38 crc kubenswrapper[4475]: I1203 06:57:38.425641 4475 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openstack-operators/test-operator-controller-manager-5854674fcc-csbzh" Dec 03 06:57:38 crc kubenswrapper[4475]: I1203 06:57:38.504551 4475 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/watcher-operator-controller-manager-769dc69bc-kmdw2" Dec 03 06:57:39 crc kubenswrapper[4475]: I1203 06:57:39.687760 4475 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/infra-operator-controller-manager-57548d458d-7q4rl" Dec 03 06:57:40 crc kubenswrapper[4475]: I1203 06:57:40.386342 4475 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-operator-controller-manager-9f56fc979-kk65s" Dec 03 06:57:42 crc kubenswrapper[4475]: I1203 06:57:42.591466 4475 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt" Dec 03 06:57:55 crc kubenswrapper[4475]: I1203 06:57:55.727877 4475 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-6569bcd497-9vjn4"] Dec 03 06:57:55 crc kubenswrapper[4475]: I1203 06:57:55.729148 4475 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-6569bcd497-9vjn4" Dec 03 06:57:55 crc kubenswrapper[4475]: I1203 06:57:55.731257 4475 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openshift-service-ca.crt" Dec 03 06:57:55 crc kubenswrapper[4475]: I1203 06:57:55.731530 4475 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"kube-root-ca.crt" Dec 03 06:57:55 crc kubenswrapper[4475]: I1203 06:57:55.731589 4475 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dnsmasq-dns-dockercfg-nppln" Dec 03 06:57:55 crc kubenswrapper[4475]: I1203 06:57:55.731642 4475 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"dns" Dec 03 06:57:55 crc kubenswrapper[4475]: I1203 06:57:55.748230 4475 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-6569bcd497-9vjn4"] Dec 03 06:57:55 crc kubenswrapper[4475]: I1203 06:57:55.768014 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/194314aa-621d-43ab-94ba-62ba52e1208a-config\") pod \"dnsmasq-dns-6569bcd497-9vjn4\" (UID: \"194314aa-621d-43ab-94ba-62ba52e1208a\") " pod="openstack/dnsmasq-dns-6569bcd497-9vjn4" Dec 03 06:57:55 crc kubenswrapper[4475]: I1203 06:57:55.768061 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xhfhk\" (UniqueName: \"kubernetes.io/projected/194314aa-621d-43ab-94ba-62ba52e1208a-kube-api-access-xhfhk\") pod \"dnsmasq-dns-6569bcd497-9vjn4\" (UID: \"194314aa-621d-43ab-94ba-62ba52e1208a\") " pod="openstack/dnsmasq-dns-6569bcd497-9vjn4" Dec 03 06:57:55 crc kubenswrapper[4475]: I1203 06:57:55.792915 4475 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-7d86d69cf7-ltbwj"] Dec 03 06:57:55 crc kubenswrapper[4475]: I1203 06:57:55.793929 4475 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-7d86d69cf7-ltbwj" Dec 03 06:57:55 crc kubenswrapper[4475]: I1203 06:57:55.796345 4475 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"dns-svc" Dec 03 06:57:55 crc kubenswrapper[4475]: I1203 06:57:55.806465 4475 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-7d86d69cf7-ltbwj"] Dec 03 06:57:55 crc kubenswrapper[4475]: I1203 06:57:55.869035 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9whs9\" (UniqueName: \"kubernetes.io/projected/d57229df-2d5e-482f-bdeb-ae9b0a04b0ec-kube-api-access-9whs9\") pod \"dnsmasq-dns-7d86d69cf7-ltbwj\" (UID: \"d57229df-2d5e-482f-bdeb-ae9b0a04b0ec\") " pod="openstack/dnsmasq-dns-7d86d69cf7-ltbwj" Dec 03 06:57:55 crc kubenswrapper[4475]: I1203 06:57:55.869125 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d57229df-2d5e-482f-bdeb-ae9b0a04b0ec-config\") pod \"dnsmasq-dns-7d86d69cf7-ltbwj\" (UID: \"d57229df-2d5e-482f-bdeb-ae9b0a04b0ec\") " pod="openstack/dnsmasq-dns-7d86d69cf7-ltbwj" Dec 03 06:57:55 crc kubenswrapper[4475]: I1203 06:57:55.869156 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/194314aa-621d-43ab-94ba-62ba52e1208a-config\") pod \"dnsmasq-dns-6569bcd497-9vjn4\" (UID: \"194314aa-621d-43ab-94ba-62ba52e1208a\") " pod="openstack/dnsmasq-dns-6569bcd497-9vjn4" Dec 03 06:57:55 crc kubenswrapper[4475]: I1203 06:57:55.869175 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/d57229df-2d5e-482f-bdeb-ae9b0a04b0ec-dns-svc\") pod \"dnsmasq-dns-7d86d69cf7-ltbwj\" (UID: \"d57229df-2d5e-482f-bdeb-ae9b0a04b0ec\") " pod="openstack/dnsmasq-dns-7d86d69cf7-ltbwj" Dec 03 06:57:55 crc 
kubenswrapper[4475]: I1203 06:57:55.869205 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xhfhk\" (UniqueName: \"kubernetes.io/projected/194314aa-621d-43ab-94ba-62ba52e1208a-kube-api-access-xhfhk\") pod \"dnsmasq-dns-6569bcd497-9vjn4\" (UID: \"194314aa-621d-43ab-94ba-62ba52e1208a\") " pod="openstack/dnsmasq-dns-6569bcd497-9vjn4" Dec 03 06:57:55 crc kubenswrapper[4475]: I1203 06:57:55.870562 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/194314aa-621d-43ab-94ba-62ba52e1208a-config\") pod \"dnsmasq-dns-6569bcd497-9vjn4\" (UID: \"194314aa-621d-43ab-94ba-62ba52e1208a\") " pod="openstack/dnsmasq-dns-6569bcd497-9vjn4" Dec 03 06:57:55 crc kubenswrapper[4475]: I1203 06:57:55.884590 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xhfhk\" (UniqueName: \"kubernetes.io/projected/194314aa-621d-43ab-94ba-62ba52e1208a-kube-api-access-xhfhk\") pod \"dnsmasq-dns-6569bcd497-9vjn4\" (UID: \"194314aa-621d-43ab-94ba-62ba52e1208a\") " pod="openstack/dnsmasq-dns-6569bcd497-9vjn4" Dec 03 06:57:55 crc kubenswrapper[4475]: I1203 06:57:55.970603 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/d57229df-2d5e-482f-bdeb-ae9b0a04b0ec-dns-svc\") pod \"dnsmasq-dns-7d86d69cf7-ltbwj\" (UID: \"d57229df-2d5e-482f-bdeb-ae9b0a04b0ec\") " pod="openstack/dnsmasq-dns-7d86d69cf7-ltbwj" Dec 03 06:57:55 crc kubenswrapper[4475]: I1203 06:57:55.970690 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9whs9\" (UniqueName: \"kubernetes.io/projected/d57229df-2d5e-482f-bdeb-ae9b0a04b0ec-kube-api-access-9whs9\") pod \"dnsmasq-dns-7d86d69cf7-ltbwj\" (UID: \"d57229df-2d5e-482f-bdeb-ae9b0a04b0ec\") " pod="openstack/dnsmasq-dns-7d86d69cf7-ltbwj" Dec 03 06:57:55 crc kubenswrapper[4475]: I1203 06:57:55.970773 4475 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d57229df-2d5e-482f-bdeb-ae9b0a04b0ec-config\") pod \"dnsmasq-dns-7d86d69cf7-ltbwj\" (UID: \"d57229df-2d5e-482f-bdeb-ae9b0a04b0ec\") " pod="openstack/dnsmasq-dns-7d86d69cf7-ltbwj" Dec 03 06:57:55 crc kubenswrapper[4475]: I1203 06:57:55.971470 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d57229df-2d5e-482f-bdeb-ae9b0a04b0ec-config\") pod \"dnsmasq-dns-7d86d69cf7-ltbwj\" (UID: \"d57229df-2d5e-482f-bdeb-ae9b0a04b0ec\") " pod="openstack/dnsmasq-dns-7d86d69cf7-ltbwj" Dec 03 06:57:55 crc kubenswrapper[4475]: I1203 06:57:55.971643 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/d57229df-2d5e-482f-bdeb-ae9b0a04b0ec-dns-svc\") pod \"dnsmasq-dns-7d86d69cf7-ltbwj\" (UID: \"d57229df-2d5e-482f-bdeb-ae9b0a04b0ec\") " pod="openstack/dnsmasq-dns-7d86d69cf7-ltbwj" Dec 03 06:57:55 crc kubenswrapper[4475]: I1203 06:57:55.991809 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9whs9\" (UniqueName: \"kubernetes.io/projected/d57229df-2d5e-482f-bdeb-ae9b0a04b0ec-kube-api-access-9whs9\") pod \"dnsmasq-dns-7d86d69cf7-ltbwj\" (UID: \"d57229df-2d5e-482f-bdeb-ae9b0a04b0ec\") " pod="openstack/dnsmasq-dns-7d86d69cf7-ltbwj" Dec 03 06:57:56 crc kubenswrapper[4475]: I1203 06:57:56.041330 4475 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6569bcd497-9vjn4" Dec 03 06:57:56 crc kubenswrapper[4475]: I1203 06:57:56.110703 4475 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-7d86d69cf7-ltbwj" Dec 03 06:57:56 crc kubenswrapper[4475]: I1203 06:57:56.428439 4475 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-6569bcd497-9vjn4"] Dec 03 06:57:56 crc kubenswrapper[4475]: I1203 06:57:56.508400 4475 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-7d86d69cf7-ltbwj"] Dec 03 06:57:57 crc kubenswrapper[4475]: I1203 06:57:57.273865 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7d86d69cf7-ltbwj" event={"ID":"d57229df-2d5e-482f-bdeb-ae9b0a04b0ec","Type":"ContainerStarted","Data":"e495253001e00b5389debf1d2b73450c32c1fd3655c06e51808a58cc661116a2"} Dec 03 06:57:57 crc kubenswrapper[4475]: I1203 06:57:57.276176 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6569bcd497-9vjn4" event={"ID":"194314aa-621d-43ab-94ba-62ba52e1208a","Type":"ContainerStarted","Data":"5ed97d6057b9f8842495000012233e4b113eb39e8945fb05e0abfd660d6079f5"} Dec 03 06:57:58 crc kubenswrapper[4475]: I1203 06:57:58.899118 4475 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6569bcd497-9vjn4"] Dec 03 06:57:58 crc kubenswrapper[4475]: I1203 06:57:58.928778 4475 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-55746cbb65-vl4j9"] Dec 03 06:57:58 crc kubenswrapper[4475]: I1203 06:57:58.934437 4475 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-55746cbb65-vl4j9" Dec 03 06:57:58 crc kubenswrapper[4475]: I1203 06:57:58.944015 4475 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-55746cbb65-vl4j9"] Dec 03 06:57:59 crc kubenswrapper[4475]: I1203 06:57:59.010090 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tjdm9\" (UniqueName: \"kubernetes.io/projected/4cd4f066-5f36-4e37-963d-8aa66bf9267c-kube-api-access-tjdm9\") pod \"dnsmasq-dns-55746cbb65-vl4j9\" (UID: \"4cd4f066-5f36-4e37-963d-8aa66bf9267c\") " pod="openstack/dnsmasq-dns-55746cbb65-vl4j9" Dec 03 06:57:59 crc kubenswrapper[4475]: I1203 06:57:59.010154 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/4cd4f066-5f36-4e37-963d-8aa66bf9267c-dns-svc\") pod \"dnsmasq-dns-55746cbb65-vl4j9\" (UID: \"4cd4f066-5f36-4e37-963d-8aa66bf9267c\") " pod="openstack/dnsmasq-dns-55746cbb65-vl4j9" Dec 03 06:57:59 crc kubenswrapper[4475]: I1203 06:57:59.010349 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4cd4f066-5f36-4e37-963d-8aa66bf9267c-config\") pod \"dnsmasq-dns-55746cbb65-vl4j9\" (UID: \"4cd4f066-5f36-4e37-963d-8aa66bf9267c\") " pod="openstack/dnsmasq-dns-55746cbb65-vl4j9" Dec 03 06:57:59 crc kubenswrapper[4475]: I1203 06:57:59.112172 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tjdm9\" (UniqueName: \"kubernetes.io/projected/4cd4f066-5f36-4e37-963d-8aa66bf9267c-kube-api-access-tjdm9\") pod \"dnsmasq-dns-55746cbb65-vl4j9\" (UID: \"4cd4f066-5f36-4e37-963d-8aa66bf9267c\") " pod="openstack/dnsmasq-dns-55746cbb65-vl4j9" Dec 03 06:57:59 crc kubenswrapper[4475]: I1203 06:57:59.112264 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"dns-svc\" (UniqueName: \"kubernetes.io/configmap/4cd4f066-5f36-4e37-963d-8aa66bf9267c-dns-svc\") pod \"dnsmasq-dns-55746cbb65-vl4j9\" (UID: \"4cd4f066-5f36-4e37-963d-8aa66bf9267c\") " pod="openstack/dnsmasq-dns-55746cbb65-vl4j9" Dec 03 06:57:59 crc kubenswrapper[4475]: I1203 06:57:59.112299 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4cd4f066-5f36-4e37-963d-8aa66bf9267c-config\") pod \"dnsmasq-dns-55746cbb65-vl4j9\" (UID: \"4cd4f066-5f36-4e37-963d-8aa66bf9267c\") " pod="openstack/dnsmasq-dns-55746cbb65-vl4j9" Dec 03 06:57:59 crc kubenswrapper[4475]: I1203 06:57:59.113165 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4cd4f066-5f36-4e37-963d-8aa66bf9267c-config\") pod \"dnsmasq-dns-55746cbb65-vl4j9\" (UID: \"4cd4f066-5f36-4e37-963d-8aa66bf9267c\") " pod="openstack/dnsmasq-dns-55746cbb65-vl4j9" Dec 03 06:57:59 crc kubenswrapper[4475]: I1203 06:57:59.113219 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/4cd4f066-5f36-4e37-963d-8aa66bf9267c-dns-svc\") pod \"dnsmasq-dns-55746cbb65-vl4j9\" (UID: \"4cd4f066-5f36-4e37-963d-8aa66bf9267c\") " pod="openstack/dnsmasq-dns-55746cbb65-vl4j9" Dec 03 06:57:59 crc kubenswrapper[4475]: I1203 06:57:59.141901 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tjdm9\" (UniqueName: \"kubernetes.io/projected/4cd4f066-5f36-4e37-963d-8aa66bf9267c-kube-api-access-tjdm9\") pod \"dnsmasq-dns-55746cbb65-vl4j9\" (UID: \"4cd4f066-5f36-4e37-963d-8aa66bf9267c\") " pod="openstack/dnsmasq-dns-55746cbb65-vl4j9" Dec 03 06:57:59 crc kubenswrapper[4475]: I1203 06:57:59.183307 4475 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-7d86d69cf7-ltbwj"] Dec 03 06:57:59 crc kubenswrapper[4475]: I1203 06:57:59.241482 4475 kubelet.go:2421] "SyncLoop ADD" 
source="api" pods=["openstack/dnsmasq-dns-59654bbc49-vdxqb"] Dec 03 06:57:59 crc kubenswrapper[4475]: I1203 06:57:59.244412 4475 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-59654bbc49-vdxqb" Dec 03 06:57:59 crc kubenswrapper[4475]: I1203 06:57:59.258889 4475 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-55746cbb65-vl4j9" Dec 03 06:57:59 crc kubenswrapper[4475]: I1203 06:57:59.267643 4475 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-59654bbc49-vdxqb"] Dec 03 06:57:59 crc kubenswrapper[4475]: I1203 06:57:59.316488 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-25968\" (UniqueName: \"kubernetes.io/projected/8b92cc5c-e27e-49bc-949c-7c98a208180a-kube-api-access-25968\") pod \"dnsmasq-dns-59654bbc49-vdxqb\" (UID: \"8b92cc5c-e27e-49bc-949c-7c98a208180a\") " pod="openstack/dnsmasq-dns-59654bbc49-vdxqb" Dec 03 06:57:59 crc kubenswrapper[4475]: I1203 06:57:59.316590 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/8b92cc5c-e27e-49bc-949c-7c98a208180a-dns-svc\") pod \"dnsmasq-dns-59654bbc49-vdxqb\" (UID: \"8b92cc5c-e27e-49bc-949c-7c98a208180a\") " pod="openstack/dnsmasq-dns-59654bbc49-vdxqb" Dec 03 06:57:59 crc kubenswrapper[4475]: I1203 06:57:59.316650 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8b92cc5c-e27e-49bc-949c-7c98a208180a-config\") pod \"dnsmasq-dns-59654bbc49-vdxqb\" (UID: \"8b92cc5c-e27e-49bc-949c-7c98a208180a\") " pod="openstack/dnsmasq-dns-59654bbc49-vdxqb" Dec 03 06:57:59 crc kubenswrapper[4475]: I1203 06:57:59.417773 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: 
\"kubernetes.io/configmap/8b92cc5c-e27e-49bc-949c-7c98a208180a-dns-svc\") pod \"dnsmasq-dns-59654bbc49-vdxqb\" (UID: \"8b92cc5c-e27e-49bc-949c-7c98a208180a\") " pod="openstack/dnsmasq-dns-59654bbc49-vdxqb" Dec 03 06:57:59 crc kubenswrapper[4475]: I1203 06:57:59.417853 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8b92cc5c-e27e-49bc-949c-7c98a208180a-config\") pod \"dnsmasq-dns-59654bbc49-vdxqb\" (UID: \"8b92cc5c-e27e-49bc-949c-7c98a208180a\") " pod="openstack/dnsmasq-dns-59654bbc49-vdxqb" Dec 03 06:57:59 crc kubenswrapper[4475]: I1203 06:57:59.417871 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-25968\" (UniqueName: \"kubernetes.io/projected/8b92cc5c-e27e-49bc-949c-7c98a208180a-kube-api-access-25968\") pod \"dnsmasq-dns-59654bbc49-vdxqb\" (UID: \"8b92cc5c-e27e-49bc-949c-7c98a208180a\") " pod="openstack/dnsmasq-dns-59654bbc49-vdxqb" Dec 03 06:57:59 crc kubenswrapper[4475]: I1203 06:57:59.418792 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/8b92cc5c-e27e-49bc-949c-7c98a208180a-dns-svc\") pod \"dnsmasq-dns-59654bbc49-vdxqb\" (UID: \"8b92cc5c-e27e-49bc-949c-7c98a208180a\") " pod="openstack/dnsmasq-dns-59654bbc49-vdxqb" Dec 03 06:57:59 crc kubenswrapper[4475]: I1203 06:57:59.419302 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8b92cc5c-e27e-49bc-949c-7c98a208180a-config\") pod \"dnsmasq-dns-59654bbc49-vdxqb\" (UID: \"8b92cc5c-e27e-49bc-949c-7c98a208180a\") " pod="openstack/dnsmasq-dns-59654bbc49-vdxqb" Dec 03 06:57:59 crc kubenswrapper[4475]: I1203 06:57:59.461210 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-25968\" (UniqueName: \"kubernetes.io/projected/8b92cc5c-e27e-49bc-949c-7c98a208180a-kube-api-access-25968\") pod 
\"dnsmasq-dns-59654bbc49-vdxqb\" (UID: \"8b92cc5c-e27e-49bc-949c-7c98a208180a\") " pod="openstack/dnsmasq-dns-59654bbc49-vdxqb" Dec 03 06:57:59 crc kubenswrapper[4475]: I1203 06:57:59.565715 4475 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-59654bbc49-vdxqb" Dec 03 06:57:59 crc kubenswrapper[4475]: I1203 06:57:59.961823 4475 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-55746cbb65-vl4j9"] Dec 03 06:57:59 crc kubenswrapper[4475]: W1203 06:57:59.963663 4475 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod4cd4f066_5f36_4e37_963d_8aa66bf9267c.slice/crio-3fc5092e54690e5bbd984dda1c934641ce9ca2dcba9ed8226db4e08e3f012107 WatchSource:0}: Error finding container 3fc5092e54690e5bbd984dda1c934641ce9ca2dcba9ed8226db4e08e3f012107: Status 404 returned error can't find the container with id 3fc5092e54690e5bbd984dda1c934641ce9ca2dcba9ed8226db4e08e3f012107 Dec 03 06:58:00 crc kubenswrapper[4475]: I1203 06:58:00.058378 4475 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-server-0"] Dec 03 06:58:00 crc kubenswrapper[4475]: I1203 06:58:00.059466 4475 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-server-0" Dec 03 06:58:00 crc kubenswrapper[4475]: I1203 06:58:00.064893 4475 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-server-dockercfg-8rdmk" Dec 03 06:58:00 crc kubenswrapper[4475]: I1203 06:58:00.065040 4475 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-erlang-cookie" Dec 03 06:58:00 crc kubenswrapper[4475]: I1203 06:58:00.065164 4475 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-config-data" Dec 03 06:58:00 crc kubenswrapper[4475]: I1203 06:58:00.065825 4475 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-rabbitmq-svc" Dec 03 06:58:00 crc kubenswrapper[4475]: I1203 06:58:00.066010 4475 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-server-conf" Dec 03 06:58:00 crc kubenswrapper[4475]: I1203 06:58:00.066900 4475 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-plugins-conf" Dec 03 06:58:00 crc kubenswrapper[4475]: I1203 06:58:00.068404 4475 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-default-user" Dec 03 06:58:00 crc kubenswrapper[4475]: I1203 06:58:00.080111 4475 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"] Dec 03 06:58:00 crc kubenswrapper[4475]: I1203 06:58:00.117505 4475 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-59654bbc49-vdxqb"] Dec 03 06:58:00 crc kubenswrapper[4475]: I1203 06:58:00.133297 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/386645cd-74e5-45bc-b3e4-0a326e5349f1-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"386645cd-74e5-45bc-b3e4-0a326e5349f1\") " pod="openstack/rabbitmq-server-0" Dec 03 06:58:00 crc kubenswrapper[4475]: I1203 
06:58:00.133337 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/386645cd-74e5-45bc-b3e4-0a326e5349f1-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"386645cd-74e5-45bc-b3e4-0a326e5349f1\") " pod="openstack/rabbitmq-server-0" Dec 03 06:58:00 crc kubenswrapper[4475]: I1203 06:58:00.133358 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/386645cd-74e5-45bc-b3e4-0a326e5349f1-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"386645cd-74e5-45bc-b3e4-0a326e5349f1\") " pod="openstack/rabbitmq-server-0" Dec 03 06:58:00 crc kubenswrapper[4475]: I1203 06:58:00.133377 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/386645cd-74e5-45bc-b3e4-0a326e5349f1-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"386645cd-74e5-45bc-b3e4-0a326e5349f1\") " pod="openstack/rabbitmq-server-0" Dec 03 06:58:00 crc kubenswrapper[4475]: I1203 06:58:00.133441 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"rabbitmq-server-0\" (UID: \"386645cd-74e5-45bc-b3e4-0a326e5349f1\") " pod="openstack/rabbitmq-server-0" Dec 03 06:58:00 crc kubenswrapper[4475]: I1203 06:58:00.133477 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dtn6k\" (UniqueName: \"kubernetes.io/projected/386645cd-74e5-45bc-b3e4-0a326e5349f1-kube-api-access-dtn6k\") pod \"rabbitmq-server-0\" (UID: \"386645cd-74e5-45bc-b3e4-0a326e5349f1\") " pod="openstack/rabbitmq-server-0" Dec 03 06:58:00 crc kubenswrapper[4475]: I1203 06:58:00.133503 4475 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/386645cd-74e5-45bc-b3e4-0a326e5349f1-config-data\") pod \"rabbitmq-server-0\" (UID: \"386645cd-74e5-45bc-b3e4-0a326e5349f1\") " pod="openstack/rabbitmq-server-0" Dec 03 06:58:00 crc kubenswrapper[4475]: I1203 06:58:00.133522 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/386645cd-74e5-45bc-b3e4-0a326e5349f1-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"386645cd-74e5-45bc-b3e4-0a326e5349f1\") " pod="openstack/rabbitmq-server-0" Dec 03 06:58:00 crc kubenswrapper[4475]: I1203 06:58:00.133573 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/386645cd-74e5-45bc-b3e4-0a326e5349f1-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"386645cd-74e5-45bc-b3e4-0a326e5349f1\") " pod="openstack/rabbitmq-server-0" Dec 03 06:58:00 crc kubenswrapper[4475]: I1203 06:58:00.133603 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/386645cd-74e5-45bc-b3e4-0a326e5349f1-pod-info\") pod \"rabbitmq-server-0\" (UID: \"386645cd-74e5-45bc-b3e4-0a326e5349f1\") " pod="openstack/rabbitmq-server-0" Dec 03 06:58:00 crc kubenswrapper[4475]: I1203 06:58:00.133620 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/386645cd-74e5-45bc-b3e4-0a326e5349f1-server-conf\") pod \"rabbitmq-server-0\" (UID: \"386645cd-74e5-45bc-b3e4-0a326e5349f1\") " pod="openstack/rabbitmq-server-0" Dec 03 06:58:00 crc kubenswrapper[4475]: I1203 06:58:00.234684 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: 
\"kubernetes.io/configmap/386645cd-74e5-45bc-b3e4-0a326e5349f1-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"386645cd-74e5-45bc-b3e4-0a326e5349f1\") " pod="openstack/rabbitmq-server-0" Dec 03 06:58:00 crc kubenswrapper[4475]: I1203 06:58:00.234737 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/386645cd-74e5-45bc-b3e4-0a326e5349f1-server-conf\") pod \"rabbitmq-server-0\" (UID: \"386645cd-74e5-45bc-b3e4-0a326e5349f1\") " pod="openstack/rabbitmq-server-0" Dec 03 06:58:00 crc kubenswrapper[4475]: I1203 06:58:00.234757 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/386645cd-74e5-45bc-b3e4-0a326e5349f1-pod-info\") pod \"rabbitmq-server-0\" (UID: \"386645cd-74e5-45bc-b3e4-0a326e5349f1\") " pod="openstack/rabbitmq-server-0" Dec 03 06:58:00 crc kubenswrapper[4475]: I1203 06:58:00.234778 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/386645cd-74e5-45bc-b3e4-0a326e5349f1-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"386645cd-74e5-45bc-b3e4-0a326e5349f1\") " pod="openstack/rabbitmq-server-0" Dec 03 06:58:00 crc kubenswrapper[4475]: I1203 06:58:00.234803 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/386645cd-74e5-45bc-b3e4-0a326e5349f1-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"386645cd-74e5-45bc-b3e4-0a326e5349f1\") " pod="openstack/rabbitmq-server-0" Dec 03 06:58:00 crc kubenswrapper[4475]: I1203 06:58:00.234851 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/386645cd-74e5-45bc-b3e4-0a326e5349f1-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"386645cd-74e5-45bc-b3e4-0a326e5349f1\") " 
pod="openstack/rabbitmq-server-0" Dec 03 06:58:00 crc kubenswrapper[4475]: I1203 06:58:00.234870 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/386645cd-74e5-45bc-b3e4-0a326e5349f1-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"386645cd-74e5-45bc-b3e4-0a326e5349f1\") " pod="openstack/rabbitmq-server-0" Dec 03 06:58:00 crc kubenswrapper[4475]: I1203 06:58:00.234893 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"rabbitmq-server-0\" (UID: \"386645cd-74e5-45bc-b3e4-0a326e5349f1\") " pod="openstack/rabbitmq-server-0" Dec 03 06:58:00 crc kubenswrapper[4475]: I1203 06:58:00.234910 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dtn6k\" (UniqueName: \"kubernetes.io/projected/386645cd-74e5-45bc-b3e4-0a326e5349f1-kube-api-access-dtn6k\") pod \"rabbitmq-server-0\" (UID: \"386645cd-74e5-45bc-b3e4-0a326e5349f1\") " pod="openstack/rabbitmq-server-0" Dec 03 06:58:00 crc kubenswrapper[4475]: I1203 06:58:00.234937 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/386645cd-74e5-45bc-b3e4-0a326e5349f1-config-data\") pod \"rabbitmq-server-0\" (UID: \"386645cd-74e5-45bc-b3e4-0a326e5349f1\") " pod="openstack/rabbitmq-server-0" Dec 03 06:58:00 crc kubenswrapper[4475]: I1203 06:58:00.234972 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/386645cd-74e5-45bc-b3e4-0a326e5349f1-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"386645cd-74e5-45bc-b3e4-0a326e5349f1\") " pod="openstack/rabbitmq-server-0" Dec 03 06:58:00 crc kubenswrapper[4475]: I1203 06:58:00.235823 4475 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume 
\"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"rabbitmq-server-0\" (UID: \"386645cd-74e5-45bc-b3e4-0a326e5349f1\") device mount path \"/mnt/openstack/pv06\"" pod="openstack/rabbitmq-server-0" Dec 03 06:58:00 crc kubenswrapper[4475]: I1203 06:58:00.236237 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/386645cd-74e5-45bc-b3e4-0a326e5349f1-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"386645cd-74e5-45bc-b3e4-0a326e5349f1\") " pod="openstack/rabbitmq-server-0" Dec 03 06:58:00 crc kubenswrapper[4475]: I1203 06:58:00.236624 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/386645cd-74e5-45bc-b3e4-0a326e5349f1-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"386645cd-74e5-45bc-b3e4-0a326e5349f1\") " pod="openstack/rabbitmq-server-0" Dec 03 06:58:00 crc kubenswrapper[4475]: I1203 06:58:00.236628 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/386645cd-74e5-45bc-b3e4-0a326e5349f1-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"386645cd-74e5-45bc-b3e4-0a326e5349f1\") " pod="openstack/rabbitmq-server-0" Dec 03 06:58:00 crc kubenswrapper[4475]: I1203 06:58:00.236999 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/386645cd-74e5-45bc-b3e4-0a326e5349f1-server-conf\") pod \"rabbitmq-server-0\" (UID: \"386645cd-74e5-45bc-b3e4-0a326e5349f1\") " pod="openstack/rabbitmq-server-0" Dec 03 06:58:00 crc kubenswrapper[4475]: I1203 06:58:00.237177 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/386645cd-74e5-45bc-b3e4-0a326e5349f1-config-data\") pod \"rabbitmq-server-0\" (UID: 
\"386645cd-74e5-45bc-b3e4-0a326e5349f1\") " pod="openstack/rabbitmq-server-0" Dec 03 06:58:00 crc kubenswrapper[4475]: I1203 06:58:00.241059 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/386645cd-74e5-45bc-b3e4-0a326e5349f1-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"386645cd-74e5-45bc-b3e4-0a326e5349f1\") " pod="openstack/rabbitmq-server-0" Dec 03 06:58:00 crc kubenswrapper[4475]: I1203 06:58:00.244009 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/386645cd-74e5-45bc-b3e4-0a326e5349f1-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"386645cd-74e5-45bc-b3e4-0a326e5349f1\") " pod="openstack/rabbitmq-server-0" Dec 03 06:58:00 crc kubenswrapper[4475]: I1203 06:58:00.245642 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/386645cd-74e5-45bc-b3e4-0a326e5349f1-pod-info\") pod \"rabbitmq-server-0\" (UID: \"386645cd-74e5-45bc-b3e4-0a326e5349f1\") " pod="openstack/rabbitmq-server-0" Dec 03 06:58:00 crc kubenswrapper[4475]: I1203 06:58:00.246197 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/386645cd-74e5-45bc-b3e4-0a326e5349f1-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"386645cd-74e5-45bc-b3e4-0a326e5349f1\") " pod="openstack/rabbitmq-server-0" Dec 03 06:58:00 crc kubenswrapper[4475]: I1203 06:58:00.251033 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dtn6k\" (UniqueName: \"kubernetes.io/projected/386645cd-74e5-45bc-b3e4-0a326e5349f1-kube-api-access-dtn6k\") pod \"rabbitmq-server-0\" (UID: \"386645cd-74e5-45bc-b3e4-0a326e5349f1\") " pod="openstack/rabbitmq-server-0" Dec 03 06:58:00 crc kubenswrapper[4475]: I1203 06:58:00.257599 4475 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"rabbitmq-server-0\" (UID: \"386645cd-74e5-45bc-b3e4-0a326e5349f1\") " pod="openstack/rabbitmq-server-0" Dec 03 06:58:00 crc kubenswrapper[4475]: I1203 06:58:00.304649 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-55746cbb65-vl4j9" event={"ID":"4cd4f066-5f36-4e37-963d-8aa66bf9267c","Type":"ContainerStarted","Data":"3fc5092e54690e5bbd984dda1c934641ce9ca2dcba9ed8226db4e08e3f012107"} Dec 03 06:58:00 crc kubenswrapper[4475]: I1203 06:58:00.308492 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-59654bbc49-vdxqb" event={"ID":"8b92cc5c-e27e-49bc-949c-7c98a208180a","Type":"ContainerStarted","Data":"6523189721d8bc441d5f0e45acf28e354164feaf541bd6086e7e1def16eb10e8"} Dec 03 06:58:00 crc kubenswrapper[4475]: I1203 06:58:00.385319 4475 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-0" Dec 03 06:58:00 crc kubenswrapper[4475]: I1203 06:58:00.434014 4475 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Dec 03 06:58:00 crc kubenswrapper[4475]: I1203 06:58:00.435223 4475 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Dec 03 06:58:00 crc kubenswrapper[4475]: I1203 06:58:00.443571 4475 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-default-user" Dec 03 06:58:00 crc kubenswrapper[4475]: I1203 06:58:00.444928 4475 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-server-dockercfg-8m6hj" Dec 03 06:58:00 crc kubenswrapper[4475]: I1203 06:58:00.445300 4475 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-config-data" Dec 03 06:58:00 crc kubenswrapper[4475]: I1203 06:58:00.445483 4475 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-plugins-conf" Dec 03 06:58:00 crc kubenswrapper[4475]: I1203 06:58:00.445610 4475 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-server-conf" Dec 03 06:58:00 crc kubenswrapper[4475]: I1203 06:58:00.445743 4475 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-erlang-cookie" Dec 03 06:58:00 crc kubenswrapper[4475]: I1203 06:58:00.445906 4475 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-rabbitmq-cell1-svc" Dec 03 06:58:00 crc kubenswrapper[4475]: I1203 06:58:00.448780 4475 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Dec 03 06:58:00 crc kubenswrapper[4475]: I1203 06:58:00.547493 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/6447be14-8b0d-4514-a7c2-53da228c70c2-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"6447be14-8b0d-4514-a7c2-53da228c70c2\") " pod="openstack/rabbitmq-cell1-server-0" Dec 03 06:58:00 crc kubenswrapper[4475]: I1203 06:58:00.547746 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"kube-api-access-jdhdr\" (UniqueName: \"kubernetes.io/projected/6447be14-8b0d-4514-a7c2-53da228c70c2-kube-api-access-jdhdr\") pod \"rabbitmq-cell1-server-0\" (UID: \"6447be14-8b0d-4514-a7c2-53da228c70c2\") " pod="openstack/rabbitmq-cell1-server-0" Dec 03 06:58:00 crc kubenswrapper[4475]: I1203 06:58:00.547787 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/6447be14-8b0d-4514-a7c2-53da228c70c2-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"6447be14-8b0d-4514-a7c2-53da228c70c2\") " pod="openstack/rabbitmq-cell1-server-0" Dec 03 06:58:00 crc kubenswrapper[4475]: I1203 06:58:00.547812 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"6447be14-8b0d-4514-a7c2-53da228c70c2\") " pod="openstack/rabbitmq-cell1-server-0" Dec 03 06:58:00 crc kubenswrapper[4475]: I1203 06:58:00.547853 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/6447be14-8b0d-4514-a7c2-53da228c70c2-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"6447be14-8b0d-4514-a7c2-53da228c70c2\") " pod="openstack/rabbitmq-cell1-server-0" Dec 03 06:58:00 crc kubenswrapper[4475]: I1203 06:58:00.547872 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/6447be14-8b0d-4514-a7c2-53da228c70c2-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"6447be14-8b0d-4514-a7c2-53da228c70c2\") " pod="openstack/rabbitmq-cell1-server-0" Dec 03 06:58:00 crc kubenswrapper[4475]: I1203 06:58:00.547915 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/6447be14-8b0d-4514-a7c2-53da228c70c2-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"6447be14-8b0d-4514-a7c2-53da228c70c2\") " pod="openstack/rabbitmq-cell1-server-0" Dec 03 06:58:00 crc kubenswrapper[4475]: I1203 06:58:00.547976 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/6447be14-8b0d-4514-a7c2-53da228c70c2-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"6447be14-8b0d-4514-a7c2-53da228c70c2\") " pod="openstack/rabbitmq-cell1-server-0" Dec 03 06:58:00 crc kubenswrapper[4475]: I1203 06:58:00.548014 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/6447be14-8b0d-4514-a7c2-53da228c70c2-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"6447be14-8b0d-4514-a7c2-53da228c70c2\") " pod="openstack/rabbitmq-cell1-server-0" Dec 03 06:58:00 crc kubenswrapper[4475]: I1203 06:58:00.548033 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/6447be14-8b0d-4514-a7c2-53da228c70c2-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"6447be14-8b0d-4514-a7c2-53da228c70c2\") " pod="openstack/rabbitmq-cell1-server-0" Dec 03 06:58:00 crc kubenswrapper[4475]: I1203 06:58:00.548065 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/6447be14-8b0d-4514-a7c2-53da228c70c2-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"6447be14-8b0d-4514-a7c2-53da228c70c2\") " pod="openstack/rabbitmq-cell1-server-0" Dec 03 06:58:00 crc kubenswrapper[4475]: I1203 06:58:00.651271 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: 
\"kubernetes.io/empty-dir/6447be14-8b0d-4514-a7c2-53da228c70c2-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"6447be14-8b0d-4514-a7c2-53da228c70c2\") " pod="openstack/rabbitmq-cell1-server-0" Dec 03 06:58:00 crc kubenswrapper[4475]: I1203 06:58:00.651333 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/6447be14-8b0d-4514-a7c2-53da228c70c2-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"6447be14-8b0d-4514-a7c2-53da228c70c2\") " pod="openstack/rabbitmq-cell1-server-0" Dec 03 06:58:00 crc kubenswrapper[4475]: I1203 06:58:00.651387 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/6447be14-8b0d-4514-a7c2-53da228c70c2-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"6447be14-8b0d-4514-a7c2-53da228c70c2\") " pod="openstack/rabbitmq-cell1-server-0" Dec 03 06:58:00 crc kubenswrapper[4475]: I1203 06:58:00.651408 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/6447be14-8b0d-4514-a7c2-53da228c70c2-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"6447be14-8b0d-4514-a7c2-53da228c70c2\") " pod="openstack/rabbitmq-cell1-server-0" Dec 03 06:58:00 crc kubenswrapper[4475]: I1203 06:58:00.651503 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/6447be14-8b0d-4514-a7c2-53da228c70c2-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"6447be14-8b0d-4514-a7c2-53da228c70c2\") " pod="openstack/rabbitmq-cell1-server-0" Dec 03 06:58:00 crc kubenswrapper[4475]: I1203 06:58:00.651542 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/6447be14-8b0d-4514-a7c2-53da228c70c2-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" 
(UID: \"6447be14-8b0d-4514-a7c2-53da228c70c2\") " pod="openstack/rabbitmq-cell1-server-0" Dec 03 06:58:00 crc kubenswrapper[4475]: I1203 06:58:00.651594 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jdhdr\" (UniqueName: \"kubernetes.io/projected/6447be14-8b0d-4514-a7c2-53da228c70c2-kube-api-access-jdhdr\") pod \"rabbitmq-cell1-server-0\" (UID: \"6447be14-8b0d-4514-a7c2-53da228c70c2\") " pod="openstack/rabbitmq-cell1-server-0" Dec 03 06:58:00 crc kubenswrapper[4475]: I1203 06:58:00.651620 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/6447be14-8b0d-4514-a7c2-53da228c70c2-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"6447be14-8b0d-4514-a7c2-53da228c70c2\") " pod="openstack/rabbitmq-cell1-server-0" Dec 03 06:58:00 crc kubenswrapper[4475]: I1203 06:58:00.651691 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"6447be14-8b0d-4514-a7c2-53da228c70c2\") " pod="openstack/rabbitmq-cell1-server-0" Dec 03 06:58:00 crc kubenswrapper[4475]: I1203 06:58:00.651724 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/6447be14-8b0d-4514-a7c2-53da228c70c2-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"6447be14-8b0d-4514-a7c2-53da228c70c2\") " pod="openstack/rabbitmq-cell1-server-0" Dec 03 06:58:00 crc kubenswrapper[4475]: I1203 06:58:00.651749 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/6447be14-8b0d-4514-a7c2-53da228c70c2-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"6447be14-8b0d-4514-a7c2-53da228c70c2\") " pod="openstack/rabbitmq-cell1-server-0" Dec 03 06:58:00 
crc kubenswrapper[4475]: I1203 06:58:00.652857 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/6447be14-8b0d-4514-a7c2-53da228c70c2-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"6447be14-8b0d-4514-a7c2-53da228c70c2\") " pod="openstack/rabbitmq-cell1-server-0" Dec 03 06:58:00 crc kubenswrapper[4475]: I1203 06:58:00.652966 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/6447be14-8b0d-4514-a7c2-53da228c70c2-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"6447be14-8b0d-4514-a7c2-53da228c70c2\") " pod="openstack/rabbitmq-cell1-server-0" Dec 03 06:58:00 crc kubenswrapper[4475]: I1203 06:58:00.653137 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/6447be14-8b0d-4514-a7c2-53da228c70c2-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"6447be14-8b0d-4514-a7c2-53da228c70c2\") " pod="openstack/rabbitmq-cell1-server-0" Dec 03 06:58:00 crc kubenswrapper[4475]: I1203 06:58:00.653364 4475 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"6447be14-8b0d-4514-a7c2-53da228c70c2\") device mount path \"/mnt/openstack/pv12\"" pod="openstack/rabbitmq-cell1-server-0" Dec 03 06:58:00 crc kubenswrapper[4475]: I1203 06:58:00.654403 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/6447be14-8b0d-4514-a7c2-53da228c70c2-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"6447be14-8b0d-4514-a7c2-53da228c70c2\") " pod="openstack/rabbitmq-cell1-server-0" Dec 03 06:58:00 crc kubenswrapper[4475]: I1203 06:58:00.654546 4475 operation_generator.go:637] "MountVolume.SetUp succeeded 
for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/6447be14-8b0d-4514-a7c2-53da228c70c2-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"6447be14-8b0d-4514-a7c2-53da228c70c2\") " pod="openstack/rabbitmq-cell1-server-0" Dec 03 06:58:00 crc kubenswrapper[4475]: I1203 06:58:00.657521 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/6447be14-8b0d-4514-a7c2-53da228c70c2-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"6447be14-8b0d-4514-a7c2-53da228c70c2\") " pod="openstack/rabbitmq-cell1-server-0" Dec 03 06:58:00 crc kubenswrapper[4475]: I1203 06:58:00.667937 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/6447be14-8b0d-4514-a7c2-53da228c70c2-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"6447be14-8b0d-4514-a7c2-53da228c70c2\") " pod="openstack/rabbitmq-cell1-server-0" Dec 03 06:58:00 crc kubenswrapper[4475]: I1203 06:58:00.670989 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/6447be14-8b0d-4514-a7c2-53da228c70c2-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"6447be14-8b0d-4514-a7c2-53da228c70c2\") " pod="openstack/rabbitmq-cell1-server-0" Dec 03 06:58:00 crc kubenswrapper[4475]: I1203 06:58:00.677179 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/6447be14-8b0d-4514-a7c2-53da228c70c2-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"6447be14-8b0d-4514-a7c2-53da228c70c2\") " pod="openstack/rabbitmq-cell1-server-0" Dec 03 06:58:00 crc kubenswrapper[4475]: I1203 06:58:00.687985 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jdhdr\" (UniqueName: \"kubernetes.io/projected/6447be14-8b0d-4514-a7c2-53da228c70c2-kube-api-access-jdhdr\") pod 
\"rabbitmq-cell1-server-0\" (UID: \"6447be14-8b0d-4514-a7c2-53da228c70c2\") " pod="openstack/rabbitmq-cell1-server-0" Dec 03 06:58:00 crc kubenswrapper[4475]: I1203 06:58:00.714890 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"6447be14-8b0d-4514-a7c2-53da228c70c2\") " pod="openstack/rabbitmq-cell1-server-0" Dec 03 06:58:00 crc kubenswrapper[4475]: I1203 06:58:00.775050 4475 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Dec 03 06:58:00 crc kubenswrapper[4475]: I1203 06:58:00.862726 4475 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"] Dec 03 06:58:00 crc kubenswrapper[4475]: W1203 06:58:00.892636 4475 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod386645cd_74e5_45bc_b3e4_0a326e5349f1.slice/crio-d9276793d2a0033e80141eadec5bd4c55cf789534249ec3e081fe04916a19426 WatchSource:0}: Error finding container d9276793d2a0033e80141eadec5bd4c55cf789534249ec3e081fe04916a19426: Status 404 returned error can't find the container with id d9276793d2a0033e80141eadec5bd4c55cf789534249ec3e081fe04916a19426 Dec 03 06:58:01 crc kubenswrapper[4475]: I1203 06:58:01.190506 4475 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Dec 03 06:58:01 crc kubenswrapper[4475]: W1203 06:58:01.204724 4475 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6447be14_8b0d_4514_a7c2_53da228c70c2.slice/crio-6c51c5714877dda187663bf6897fc2521befecff47c42e29fac8a5564fc90959 WatchSource:0}: Error finding container 6c51c5714877dda187663bf6897fc2521befecff47c42e29fac8a5564fc90959: Status 404 returned error can't find the container with id 
6c51c5714877dda187663bf6897fc2521befecff47c42e29fac8a5564fc90959 Dec 03 06:58:01 crc kubenswrapper[4475]: I1203 06:58:01.321369 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"6447be14-8b0d-4514-a7c2-53da228c70c2","Type":"ContainerStarted","Data":"6c51c5714877dda187663bf6897fc2521befecff47c42e29fac8a5564fc90959"} Dec 03 06:58:01 crc kubenswrapper[4475]: I1203 06:58:01.322727 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"386645cd-74e5-45bc-b3e4-0a326e5349f1","Type":"ContainerStarted","Data":"d9276793d2a0033e80141eadec5bd4c55cf789534249ec3e081fe04916a19426"} Dec 03 06:58:01 crc kubenswrapper[4475]: I1203 06:58:01.700930 4475 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstack-galera-0"] Dec 03 06:58:01 crc kubenswrapper[4475]: I1203 06:58:01.702262 4475 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstack-galera-0" Dec 03 06:58:01 crc kubenswrapper[4475]: I1203 06:58:01.706204 4475 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-galera-openstack-svc" Dec 03 06:58:01 crc kubenswrapper[4475]: I1203 06:58:01.706491 4475 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-config-data" Dec 03 06:58:01 crc kubenswrapper[4475]: I1203 06:58:01.706625 4475 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"galera-openstack-dockercfg-x2s2d" Dec 03 06:58:01 crc kubenswrapper[4475]: I1203 06:58:01.707567 4475 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-scripts" Dec 03 06:58:01 crc kubenswrapper[4475]: I1203 06:58:01.709850 4475 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-galera-0"] Dec 03 06:58:01 crc kubenswrapper[4475]: I1203 06:58:01.712735 4475 reflector.go:368] Caches populated for *v1.Secret from 
object-"openstack"/"combined-ca-bundle" Dec 03 06:58:01 crc kubenswrapper[4475]: I1203 06:58:01.877018 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/47365d7f-1974-40d3-b6de-c033b41729ba-operator-scripts\") pod \"openstack-galera-0\" (UID: \"47365d7f-1974-40d3-b6de-c033b41729ba\") " pod="openstack/openstack-galera-0" Dec 03 06:58:01 crc kubenswrapper[4475]: I1203 06:58:01.877222 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/47365d7f-1974-40d3-b6de-c033b41729ba-combined-ca-bundle\") pod \"openstack-galera-0\" (UID: \"47365d7f-1974-40d3-b6de-c033b41729ba\") " pod="openstack/openstack-galera-0" Dec 03 06:58:01 crc kubenswrapper[4475]: I1203 06:58:01.877270 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/47365d7f-1974-40d3-b6de-c033b41729ba-config-data-generated\") pod \"openstack-galera-0\" (UID: \"47365d7f-1974-40d3-b6de-c033b41729ba\") " pod="openstack/openstack-galera-0" Dec 03 06:58:01 crc kubenswrapper[4475]: I1203 06:58:01.877355 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/47365d7f-1974-40d3-b6de-c033b41729ba-config-data-default\") pod \"openstack-galera-0\" (UID: \"47365d7f-1974-40d3-b6de-c033b41729ba\") " pod="openstack/openstack-galera-0" Dec 03 06:58:01 crc kubenswrapper[4475]: I1203 06:58:01.877381 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/47365d7f-1974-40d3-b6de-c033b41729ba-kolla-config\") pod \"openstack-galera-0\" (UID: \"47365d7f-1974-40d3-b6de-c033b41729ba\") " 
pod="openstack/openstack-galera-0" Dec 03 06:58:01 crc kubenswrapper[4475]: I1203 06:58:01.877583 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mvrgr\" (UniqueName: \"kubernetes.io/projected/47365d7f-1974-40d3-b6de-c033b41729ba-kube-api-access-mvrgr\") pod \"openstack-galera-0\" (UID: \"47365d7f-1974-40d3-b6de-c033b41729ba\") " pod="openstack/openstack-galera-0" Dec 03 06:58:01 crc kubenswrapper[4475]: I1203 06:58:01.877665 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"openstack-galera-0\" (UID: \"47365d7f-1974-40d3-b6de-c033b41729ba\") " pod="openstack/openstack-galera-0" Dec 03 06:58:01 crc kubenswrapper[4475]: I1203 06:58:01.877722 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/47365d7f-1974-40d3-b6de-c033b41729ba-galera-tls-certs\") pod \"openstack-galera-0\" (UID: \"47365d7f-1974-40d3-b6de-c033b41729ba\") " pod="openstack/openstack-galera-0" Dec 03 06:58:01 crc kubenswrapper[4475]: I1203 06:58:01.979726 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/47365d7f-1974-40d3-b6de-c033b41729ba-combined-ca-bundle\") pod \"openstack-galera-0\" (UID: \"47365d7f-1974-40d3-b6de-c033b41729ba\") " pod="openstack/openstack-galera-0" Dec 03 06:58:01 crc kubenswrapper[4475]: I1203 06:58:01.979856 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/47365d7f-1974-40d3-b6de-c033b41729ba-config-data-generated\") pod \"openstack-galera-0\" (UID: \"47365d7f-1974-40d3-b6de-c033b41729ba\") " pod="openstack/openstack-galera-0" Dec 03 06:58:01 crc kubenswrapper[4475]: I1203 
06:58:01.979928 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/47365d7f-1974-40d3-b6de-c033b41729ba-config-data-default\") pod \"openstack-galera-0\" (UID: \"47365d7f-1974-40d3-b6de-c033b41729ba\") " pod="openstack/openstack-galera-0" Dec 03 06:58:01 crc kubenswrapper[4475]: I1203 06:58:01.979972 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/47365d7f-1974-40d3-b6de-c033b41729ba-kolla-config\") pod \"openstack-galera-0\" (UID: \"47365d7f-1974-40d3-b6de-c033b41729ba\") " pod="openstack/openstack-galera-0" Dec 03 06:58:01 crc kubenswrapper[4475]: I1203 06:58:01.979997 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mvrgr\" (UniqueName: \"kubernetes.io/projected/47365d7f-1974-40d3-b6de-c033b41729ba-kube-api-access-mvrgr\") pod \"openstack-galera-0\" (UID: \"47365d7f-1974-40d3-b6de-c033b41729ba\") " pod="openstack/openstack-galera-0" Dec 03 06:58:01 crc kubenswrapper[4475]: I1203 06:58:01.980020 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"openstack-galera-0\" (UID: \"47365d7f-1974-40d3-b6de-c033b41729ba\") " pod="openstack/openstack-galera-0" Dec 03 06:58:01 crc kubenswrapper[4475]: I1203 06:58:01.980059 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/47365d7f-1974-40d3-b6de-c033b41729ba-galera-tls-certs\") pod \"openstack-galera-0\" (UID: \"47365d7f-1974-40d3-b6de-c033b41729ba\") " pod="openstack/openstack-galera-0" Dec 03 06:58:01 crc kubenswrapper[4475]: I1203 06:58:01.980076 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: 
\"kubernetes.io/configmap/47365d7f-1974-40d3-b6de-c033b41729ba-operator-scripts\") pod \"openstack-galera-0\" (UID: \"47365d7f-1974-40d3-b6de-c033b41729ba\") " pod="openstack/openstack-galera-0" Dec 03 06:58:01 crc kubenswrapper[4475]: I1203 06:58:01.980906 4475 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"openstack-galera-0\" (UID: \"47365d7f-1974-40d3-b6de-c033b41729ba\") device mount path \"/mnt/openstack/pv08\"" pod="openstack/openstack-galera-0" Dec 03 06:58:01 crc kubenswrapper[4475]: I1203 06:58:01.982603 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/47365d7f-1974-40d3-b6de-c033b41729ba-kolla-config\") pod \"openstack-galera-0\" (UID: \"47365d7f-1974-40d3-b6de-c033b41729ba\") " pod="openstack/openstack-galera-0" Dec 03 06:58:01 crc kubenswrapper[4475]: I1203 06:58:01.982820 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/47365d7f-1974-40d3-b6de-c033b41729ba-config-data-default\") pod \"openstack-galera-0\" (UID: \"47365d7f-1974-40d3-b6de-c033b41729ba\") " pod="openstack/openstack-galera-0" Dec 03 06:58:01 crc kubenswrapper[4475]: I1203 06:58:01.984651 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/47365d7f-1974-40d3-b6de-c033b41729ba-operator-scripts\") pod \"openstack-galera-0\" (UID: \"47365d7f-1974-40d3-b6de-c033b41729ba\") " pod="openstack/openstack-galera-0" Dec 03 06:58:01 crc kubenswrapper[4475]: I1203 06:58:01.994890 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/47365d7f-1974-40d3-b6de-c033b41729ba-galera-tls-certs\") pod \"openstack-galera-0\" (UID: \"47365d7f-1974-40d3-b6de-c033b41729ba\") " 
pod="openstack/openstack-galera-0" Dec 03 06:58:02 crc kubenswrapper[4475]: I1203 06:58:02.010733 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/47365d7f-1974-40d3-b6de-c033b41729ba-combined-ca-bundle\") pod \"openstack-galera-0\" (UID: \"47365d7f-1974-40d3-b6de-c033b41729ba\") " pod="openstack/openstack-galera-0" Dec 03 06:58:02 crc kubenswrapper[4475]: I1203 06:58:02.015639 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/47365d7f-1974-40d3-b6de-c033b41729ba-config-data-generated\") pod \"openstack-galera-0\" (UID: \"47365d7f-1974-40d3-b6de-c033b41729ba\") " pod="openstack/openstack-galera-0" Dec 03 06:58:02 crc kubenswrapper[4475]: I1203 06:58:02.019557 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mvrgr\" (UniqueName: \"kubernetes.io/projected/47365d7f-1974-40d3-b6de-c033b41729ba-kube-api-access-mvrgr\") pod \"openstack-galera-0\" (UID: \"47365d7f-1974-40d3-b6de-c033b41729ba\") " pod="openstack/openstack-galera-0" Dec 03 06:58:02 crc kubenswrapper[4475]: I1203 06:58:02.029175 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"openstack-galera-0\" (UID: \"47365d7f-1974-40d3-b6de-c033b41729ba\") " pod="openstack/openstack-galera-0" Dec 03 06:58:02 crc kubenswrapper[4475]: I1203 06:58:02.036080 4475 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstack-galera-0" Dec 03 06:58:02 crc kubenswrapper[4475]: I1203 06:58:02.358349 4475 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-galera-0"] Dec 03 06:58:02 crc kubenswrapper[4475]: W1203 06:58:02.394575 4475 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod47365d7f_1974_40d3_b6de_c033b41729ba.slice/crio-5985704fe0a7c16d82afb6894154dd0459e7f4d1ce41cb216b651a03e9903f80 WatchSource:0}: Error finding container 5985704fe0a7c16d82afb6894154dd0459e7f4d1ce41cb216b651a03e9903f80: Status 404 returned error can't find the container with id 5985704fe0a7c16d82afb6894154dd0459e7f4d1ce41cb216b651a03e9903f80 Dec 03 06:58:03 crc kubenswrapper[4475]: I1203 06:58:03.060150 4475 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstack-cell1-galera-0"] Dec 03 06:58:03 crc kubenswrapper[4475]: I1203 06:58:03.061804 4475 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstack-cell1-galera-0" Dec 03 06:58:03 crc kubenswrapper[4475]: I1203 06:58:03.066539 4475 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-galera-openstack-cell1-svc" Dec 03 06:58:03 crc kubenswrapper[4475]: I1203 06:58:03.068942 4475 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"galera-openstack-cell1-dockercfg-m2pgk" Dec 03 06:58:03 crc kubenswrapper[4475]: I1203 06:58:03.069245 4475 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-cell1-scripts" Dec 03 06:58:03 crc kubenswrapper[4475]: I1203 06:58:03.069354 4475 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-cell1-config-data" Dec 03 06:58:03 crc kubenswrapper[4475]: I1203 06:58:03.070176 4475 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-cell1-galera-0"] Dec 03 06:58:03 crc kubenswrapper[4475]: I1203 06:58:03.202875 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"openstack-cell1-galera-0\" (UID: \"aaf991fb-6470-4403-bf72-660c8c5ce811\") " pod="openstack/openstack-cell1-galera-0" Dec 03 06:58:03 crc kubenswrapper[4475]: I1203 06:58:03.202961 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/aaf991fb-6470-4403-bf72-660c8c5ce811-galera-tls-certs\") pod \"openstack-cell1-galera-0\" (UID: \"aaf991fb-6470-4403-bf72-660c8c5ce811\") " pod="openstack/openstack-cell1-galera-0" Dec 03 06:58:03 crc kubenswrapper[4475]: I1203 06:58:03.203018 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/aaf991fb-6470-4403-bf72-660c8c5ce811-config-data-generated\") 
pod \"openstack-cell1-galera-0\" (UID: \"aaf991fb-6470-4403-bf72-660c8c5ce811\") " pod="openstack/openstack-cell1-galera-0" Dec 03 06:58:03 crc kubenswrapper[4475]: I1203 06:58:03.203041 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/aaf991fb-6470-4403-bf72-660c8c5ce811-operator-scripts\") pod \"openstack-cell1-galera-0\" (UID: \"aaf991fb-6470-4403-bf72-660c8c5ce811\") " pod="openstack/openstack-cell1-galera-0" Dec 03 06:58:03 crc kubenswrapper[4475]: I1203 06:58:03.203160 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-df2xl\" (UniqueName: \"kubernetes.io/projected/aaf991fb-6470-4403-bf72-660c8c5ce811-kube-api-access-df2xl\") pod \"openstack-cell1-galera-0\" (UID: \"aaf991fb-6470-4403-bf72-660c8c5ce811\") " pod="openstack/openstack-cell1-galera-0" Dec 03 06:58:03 crc kubenswrapper[4475]: I1203 06:58:03.203228 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/aaf991fb-6470-4403-bf72-660c8c5ce811-config-data-default\") pod \"openstack-cell1-galera-0\" (UID: \"aaf991fb-6470-4403-bf72-660c8c5ce811\") " pod="openstack/openstack-cell1-galera-0" Dec 03 06:58:03 crc kubenswrapper[4475]: I1203 06:58:03.203279 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/aaf991fb-6470-4403-bf72-660c8c5ce811-combined-ca-bundle\") pod \"openstack-cell1-galera-0\" (UID: \"aaf991fb-6470-4403-bf72-660c8c5ce811\") " pod="openstack/openstack-cell1-galera-0" Dec 03 06:58:03 crc kubenswrapper[4475]: I1203 06:58:03.203324 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: 
\"kubernetes.io/configmap/aaf991fb-6470-4403-bf72-660c8c5ce811-kolla-config\") pod \"openstack-cell1-galera-0\" (UID: \"aaf991fb-6470-4403-bf72-660c8c5ce811\") " pod="openstack/openstack-cell1-galera-0" Dec 03 06:58:03 crc kubenswrapper[4475]: I1203 06:58:03.304736 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/aaf991fb-6470-4403-bf72-660c8c5ce811-config-data-default\") pod \"openstack-cell1-galera-0\" (UID: \"aaf991fb-6470-4403-bf72-660c8c5ce811\") " pod="openstack/openstack-cell1-galera-0" Dec 03 06:58:03 crc kubenswrapper[4475]: I1203 06:58:03.304806 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/aaf991fb-6470-4403-bf72-660c8c5ce811-combined-ca-bundle\") pod \"openstack-cell1-galera-0\" (UID: \"aaf991fb-6470-4403-bf72-660c8c5ce811\") " pod="openstack/openstack-cell1-galera-0" Dec 03 06:58:03 crc kubenswrapper[4475]: I1203 06:58:03.304830 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/aaf991fb-6470-4403-bf72-660c8c5ce811-kolla-config\") pod \"openstack-cell1-galera-0\" (UID: \"aaf991fb-6470-4403-bf72-660c8c5ce811\") " pod="openstack/openstack-cell1-galera-0" Dec 03 06:58:03 crc kubenswrapper[4475]: I1203 06:58:03.304880 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"openstack-cell1-galera-0\" (UID: \"aaf991fb-6470-4403-bf72-660c8c5ce811\") " pod="openstack/openstack-cell1-galera-0" Dec 03 06:58:03 crc kubenswrapper[4475]: I1203 06:58:03.304897 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/aaf991fb-6470-4403-bf72-660c8c5ce811-galera-tls-certs\") pod 
\"openstack-cell1-galera-0\" (UID: \"aaf991fb-6470-4403-bf72-660c8c5ce811\") " pod="openstack/openstack-cell1-galera-0" Dec 03 06:58:03 crc kubenswrapper[4475]: I1203 06:58:03.304923 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/aaf991fb-6470-4403-bf72-660c8c5ce811-config-data-generated\") pod \"openstack-cell1-galera-0\" (UID: \"aaf991fb-6470-4403-bf72-660c8c5ce811\") " pod="openstack/openstack-cell1-galera-0" Dec 03 06:58:03 crc kubenswrapper[4475]: I1203 06:58:03.304945 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/aaf991fb-6470-4403-bf72-660c8c5ce811-operator-scripts\") pod \"openstack-cell1-galera-0\" (UID: \"aaf991fb-6470-4403-bf72-660c8c5ce811\") " pod="openstack/openstack-cell1-galera-0" Dec 03 06:58:03 crc kubenswrapper[4475]: I1203 06:58:03.304982 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-df2xl\" (UniqueName: \"kubernetes.io/projected/aaf991fb-6470-4403-bf72-660c8c5ce811-kube-api-access-df2xl\") pod \"openstack-cell1-galera-0\" (UID: \"aaf991fb-6470-4403-bf72-660c8c5ce811\") " pod="openstack/openstack-cell1-galera-0" Dec 03 06:58:03 crc kubenswrapper[4475]: I1203 06:58:03.305810 4475 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"openstack-cell1-galera-0\" (UID: \"aaf991fb-6470-4403-bf72-660c8c5ce811\") device mount path \"/mnt/openstack/pv01\"" pod="openstack/openstack-cell1-galera-0" Dec 03 06:58:03 crc kubenswrapper[4475]: I1203 06:58:03.305861 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/aaf991fb-6470-4403-bf72-660c8c5ce811-kolla-config\") pod \"openstack-cell1-galera-0\" (UID: 
\"aaf991fb-6470-4403-bf72-660c8c5ce811\") " pod="openstack/openstack-cell1-galera-0" Dec 03 06:58:03 crc kubenswrapper[4475]: I1203 06:58:03.305848 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/aaf991fb-6470-4403-bf72-660c8c5ce811-config-data-generated\") pod \"openstack-cell1-galera-0\" (UID: \"aaf991fb-6470-4403-bf72-660c8c5ce811\") " pod="openstack/openstack-cell1-galera-0" Dec 03 06:58:03 crc kubenswrapper[4475]: I1203 06:58:03.306845 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/aaf991fb-6470-4403-bf72-660c8c5ce811-operator-scripts\") pod \"openstack-cell1-galera-0\" (UID: \"aaf991fb-6470-4403-bf72-660c8c5ce811\") " pod="openstack/openstack-cell1-galera-0" Dec 03 06:58:03 crc kubenswrapper[4475]: I1203 06:58:03.307029 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/aaf991fb-6470-4403-bf72-660c8c5ce811-config-data-default\") pod \"openstack-cell1-galera-0\" (UID: \"aaf991fb-6470-4403-bf72-660c8c5ce811\") " pod="openstack/openstack-cell1-galera-0" Dec 03 06:58:03 crc kubenswrapper[4475]: I1203 06:58:03.318690 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/aaf991fb-6470-4403-bf72-660c8c5ce811-combined-ca-bundle\") pod \"openstack-cell1-galera-0\" (UID: \"aaf991fb-6470-4403-bf72-660c8c5ce811\") " pod="openstack/openstack-cell1-galera-0" Dec 03 06:58:03 crc kubenswrapper[4475]: I1203 06:58:03.319741 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/aaf991fb-6470-4403-bf72-660c8c5ce811-galera-tls-certs\") pod \"openstack-cell1-galera-0\" (UID: \"aaf991fb-6470-4403-bf72-660c8c5ce811\") " pod="openstack/openstack-cell1-galera-0" Dec 03 06:58:03 
crc kubenswrapper[4475]: I1203 06:58:03.325596 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-df2xl\" (UniqueName: \"kubernetes.io/projected/aaf991fb-6470-4403-bf72-660c8c5ce811-kube-api-access-df2xl\") pod \"openstack-cell1-galera-0\" (UID: \"aaf991fb-6470-4403-bf72-660c8c5ce811\") " pod="openstack/openstack-cell1-galera-0" Dec 03 06:58:03 crc kubenswrapper[4475]: I1203 06:58:03.357718 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"openstack-cell1-galera-0\" (UID: \"aaf991fb-6470-4403-bf72-660c8c5ce811\") " pod="openstack/openstack-cell1-galera-0" Dec 03 06:58:03 crc kubenswrapper[4475]: I1203 06:58:03.383719 4475 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstack-cell1-galera-0" Dec 03 06:58:03 crc kubenswrapper[4475]: I1203 06:58:03.412696 4475 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/memcached-0"] Dec 03 06:58:03 crc kubenswrapper[4475]: I1203 06:58:03.413547 4475 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/memcached-0" Dec 03 06:58:03 crc kubenswrapper[4475]: I1203 06:58:03.419910 4475 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"memcached-memcached-dockercfg-w2r9w" Dec 03 06:58:03 crc kubenswrapper[4475]: I1203 06:58:03.420270 4475 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"memcached-config-data" Dec 03 06:58:03 crc kubenswrapper[4475]: I1203 06:58:03.420401 4475 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-memcached-svc" Dec 03 06:58:03 crc kubenswrapper[4475]: I1203 06:58:03.432664 4475 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/memcached-0"] Dec 03 06:58:03 crc kubenswrapper[4475]: I1203 06:58:03.462490 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"47365d7f-1974-40d3-b6de-c033b41729ba","Type":"ContainerStarted","Data":"5985704fe0a7c16d82afb6894154dd0459e7f4d1ce41cb216b651a03e9903f80"} Dec 03 06:58:03 crc kubenswrapper[4475]: I1203 06:58:03.512031 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/baa7315d-844e-472d-b13d-9932aff2326f-memcached-tls-certs\") pod \"memcached-0\" (UID: \"baa7315d-844e-472d-b13d-9932aff2326f\") " pod="openstack/memcached-0" Dec 03 06:58:03 crc kubenswrapper[4475]: I1203 06:58:03.512139 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/baa7315d-844e-472d-b13d-9932aff2326f-config-data\") pod \"memcached-0\" (UID: \"baa7315d-844e-472d-b13d-9932aff2326f\") " pod="openstack/memcached-0" Dec 03 06:58:03 crc kubenswrapper[4475]: I1203 06:58:03.512185 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: 
\"kubernetes.io/configmap/baa7315d-844e-472d-b13d-9932aff2326f-kolla-config\") pod \"memcached-0\" (UID: \"baa7315d-844e-472d-b13d-9932aff2326f\") " pod="openstack/memcached-0" Dec 03 06:58:03 crc kubenswrapper[4475]: I1203 06:58:03.512232 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/baa7315d-844e-472d-b13d-9932aff2326f-combined-ca-bundle\") pod \"memcached-0\" (UID: \"baa7315d-844e-472d-b13d-9932aff2326f\") " pod="openstack/memcached-0" Dec 03 06:58:03 crc kubenswrapper[4475]: I1203 06:58:03.512252 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wsxnq\" (UniqueName: \"kubernetes.io/projected/baa7315d-844e-472d-b13d-9932aff2326f-kube-api-access-wsxnq\") pod \"memcached-0\" (UID: \"baa7315d-844e-472d-b13d-9932aff2326f\") " pod="openstack/memcached-0" Dec 03 06:58:03 crc kubenswrapper[4475]: I1203 06:58:03.614567 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/baa7315d-844e-472d-b13d-9932aff2326f-combined-ca-bundle\") pod \"memcached-0\" (UID: \"baa7315d-844e-472d-b13d-9932aff2326f\") " pod="openstack/memcached-0" Dec 03 06:58:03 crc kubenswrapper[4475]: I1203 06:58:03.614625 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wsxnq\" (UniqueName: \"kubernetes.io/projected/baa7315d-844e-472d-b13d-9932aff2326f-kube-api-access-wsxnq\") pod \"memcached-0\" (UID: \"baa7315d-844e-472d-b13d-9932aff2326f\") " pod="openstack/memcached-0" Dec 03 06:58:03 crc kubenswrapper[4475]: I1203 06:58:03.614741 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/baa7315d-844e-472d-b13d-9932aff2326f-memcached-tls-certs\") pod \"memcached-0\" (UID: 
\"baa7315d-844e-472d-b13d-9932aff2326f\") " pod="openstack/memcached-0" Dec 03 06:58:03 crc kubenswrapper[4475]: I1203 06:58:03.614762 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/baa7315d-844e-472d-b13d-9932aff2326f-config-data\") pod \"memcached-0\" (UID: \"baa7315d-844e-472d-b13d-9932aff2326f\") " pod="openstack/memcached-0" Dec 03 06:58:03 crc kubenswrapper[4475]: I1203 06:58:03.614777 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/baa7315d-844e-472d-b13d-9932aff2326f-kolla-config\") pod \"memcached-0\" (UID: \"baa7315d-844e-472d-b13d-9932aff2326f\") " pod="openstack/memcached-0" Dec 03 06:58:03 crc kubenswrapper[4475]: I1203 06:58:03.615436 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/baa7315d-844e-472d-b13d-9932aff2326f-kolla-config\") pod \"memcached-0\" (UID: \"baa7315d-844e-472d-b13d-9932aff2326f\") " pod="openstack/memcached-0" Dec 03 06:58:03 crc kubenswrapper[4475]: I1203 06:58:03.616631 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/baa7315d-844e-472d-b13d-9932aff2326f-config-data\") pod \"memcached-0\" (UID: \"baa7315d-844e-472d-b13d-9932aff2326f\") " pod="openstack/memcached-0" Dec 03 06:58:03 crc kubenswrapper[4475]: I1203 06:58:03.628929 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/baa7315d-844e-472d-b13d-9932aff2326f-combined-ca-bundle\") pod \"memcached-0\" (UID: \"baa7315d-844e-472d-b13d-9932aff2326f\") " pod="openstack/memcached-0" Dec 03 06:58:03 crc kubenswrapper[4475]: I1203 06:58:03.632804 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"memcached-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/baa7315d-844e-472d-b13d-9932aff2326f-memcached-tls-certs\") pod \"memcached-0\" (UID: \"baa7315d-844e-472d-b13d-9932aff2326f\") " pod="openstack/memcached-0" Dec 03 06:58:03 crc kubenswrapper[4475]: I1203 06:58:03.634093 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wsxnq\" (UniqueName: \"kubernetes.io/projected/baa7315d-844e-472d-b13d-9932aff2326f-kube-api-access-wsxnq\") pod \"memcached-0\" (UID: \"baa7315d-844e-472d-b13d-9932aff2326f\") " pod="openstack/memcached-0" Dec 03 06:58:03 crc kubenswrapper[4475]: I1203 06:58:03.752118 4475 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/memcached-0" Dec 03 06:58:03 crc kubenswrapper[4475]: I1203 06:58:03.921743 4475 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-cell1-galera-0"] Dec 03 06:58:03 crc kubenswrapper[4475]: W1203 06:58:03.993980 4475 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podaaf991fb_6470_4403_bf72_660c8c5ce811.slice/crio-fdfddc03c17d344f152e7261d6f495248b80d7f7f0ae0af3c2aedb53712672ae WatchSource:0}: Error finding container fdfddc03c17d344f152e7261d6f495248b80d7f7f0ae0af3c2aedb53712672ae: Status 404 returned error can't find the container with id fdfddc03c17d344f152e7261d6f495248b80d7f7f0ae0af3c2aedb53712672ae Dec 03 06:58:04 crc kubenswrapper[4475]: I1203 06:58:04.243965 4475 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/memcached-0"] Dec 03 06:58:04 crc kubenswrapper[4475]: W1203 06:58:04.256020 4475 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podbaa7315d_844e_472d_b13d_9932aff2326f.slice/crio-0cea872f3639352116f795e406ce3bde16f009a9aaf89fa6020b7b030878079d WatchSource:0}: Error finding container 0cea872f3639352116f795e406ce3bde16f009a9aaf89fa6020b7b030878079d: Status 404 returned error 
can't find the container with id 0cea872f3639352116f795e406ce3bde16f009a9aaf89fa6020b7b030878079d Dec 03 06:58:04 crc kubenswrapper[4475]: I1203 06:58:04.498934 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"aaf991fb-6470-4403-bf72-660c8c5ce811","Type":"ContainerStarted","Data":"fdfddc03c17d344f152e7261d6f495248b80d7f7f0ae0af3c2aedb53712672ae"} Dec 03 06:58:04 crc kubenswrapper[4475]: I1203 06:58:04.507871 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/memcached-0" event={"ID":"baa7315d-844e-472d-b13d-9932aff2326f","Type":"ContainerStarted","Data":"0cea872f3639352116f795e406ce3bde16f009a9aaf89fa6020b7b030878079d"} Dec 03 06:58:05 crc kubenswrapper[4475]: I1203 06:58:05.186028 4475 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/kube-state-metrics-0"] Dec 03 06:58:05 crc kubenswrapper[4475]: I1203 06:58:05.186950 4475 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0" Dec 03 06:58:05 crc kubenswrapper[4475]: I1203 06:58:05.192410 4475 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"telemetry-ceilometer-dockercfg-2jdbx" Dec 03 06:58:05 crc kubenswrapper[4475]: I1203 06:58:05.211211 4475 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"] Dec 03 06:58:05 crc kubenswrapper[4475]: I1203 06:58:05.365275 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kngg9\" (UniqueName: \"kubernetes.io/projected/fd232cd1-6aca-43cc-9876-535cd9eb39eb-kube-api-access-kngg9\") pod \"kube-state-metrics-0\" (UID: \"fd232cd1-6aca-43cc-9876-535cd9eb39eb\") " pod="openstack/kube-state-metrics-0" Dec 03 06:58:05 crc kubenswrapper[4475]: I1203 06:58:05.468444 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kngg9\" (UniqueName: 
\"kubernetes.io/projected/fd232cd1-6aca-43cc-9876-535cd9eb39eb-kube-api-access-kngg9\") pod \"kube-state-metrics-0\" (UID: \"fd232cd1-6aca-43cc-9876-535cd9eb39eb\") " pod="openstack/kube-state-metrics-0" Dec 03 06:58:05 crc kubenswrapper[4475]: I1203 06:58:05.495773 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kngg9\" (UniqueName: \"kubernetes.io/projected/fd232cd1-6aca-43cc-9876-535cd9eb39eb-kube-api-access-kngg9\") pod \"kube-state-metrics-0\" (UID: \"fd232cd1-6aca-43cc-9876-535cd9eb39eb\") " pod="openstack/kube-state-metrics-0" Dec 03 06:58:05 crc kubenswrapper[4475]: I1203 06:58:05.513578 4475 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0" Dec 03 06:58:06 crc kubenswrapper[4475]: I1203 06:58:06.113918 4475 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"] Dec 03 06:58:06 crc kubenswrapper[4475]: I1203 06:58:06.579363 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"fd232cd1-6aca-43cc-9876-535cd9eb39eb","Type":"ContainerStarted","Data":"b90f6eef308feb8fb35c00508576513c29309ca5e6fd44ca1e369659d34d1726"} Dec 03 06:58:08 crc kubenswrapper[4475]: I1203 06:58:08.644737 4475 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-hcblc"] Dec 03 06:58:08 crc kubenswrapper[4475]: I1203 06:58:08.649019 4475 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-hcblc" Dec 03 06:58:08 crc kubenswrapper[4475]: I1203 06:58:08.652733 4475 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovncontroller-ovncontroller-dockercfg-m56vq" Dec 03 06:58:08 crc kubenswrapper[4475]: I1203 06:58:08.666727 4475 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovncontroller-ovndbs" Dec 03 06:58:08 crc kubenswrapper[4475]: I1203 06:58:08.666968 4475 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-scripts" Dec 03 06:58:08 crc kubenswrapper[4475]: I1203 06:58:08.675279 4475 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-hcblc"] Dec 03 06:58:08 crc kubenswrapper[4475]: I1203 06:58:08.719951 4475 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-ovs-blpfh"] Dec 03 06:58:08 crc kubenswrapper[4475]: I1203 06:58:08.721164 4475 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-ovs-blpfh"] Dec 03 06:58:08 crc kubenswrapper[4475]: I1203 06:58:08.721243 4475 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-ovs-blpfh" Dec 03 06:58:08 crc kubenswrapper[4475]: I1203 06:58:08.743624 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/462c6048-51ec-46dd-8eda-64398e53ce5b-var-run-ovn\") pod \"ovn-controller-hcblc\" (UID: \"462c6048-51ec-46dd-8eda-64398e53ce5b\") " pod="openstack/ovn-controller-hcblc" Dec 03 06:58:08 crc kubenswrapper[4475]: I1203 06:58:08.744949 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/462c6048-51ec-46dd-8eda-64398e53ce5b-var-log-ovn\") pod \"ovn-controller-hcblc\" (UID: \"462c6048-51ec-46dd-8eda-64398e53ce5b\") " pod="openstack/ovn-controller-hcblc" Dec 03 06:58:08 crc kubenswrapper[4475]: I1203 06:58:08.745057 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/462c6048-51ec-46dd-8eda-64398e53ce5b-scripts\") pod \"ovn-controller-hcblc\" (UID: \"462c6048-51ec-46dd-8eda-64398e53ce5b\") " pod="openstack/ovn-controller-hcblc" Dec 03 06:58:08 crc kubenswrapper[4475]: I1203 06:58:08.745093 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/462c6048-51ec-46dd-8eda-64398e53ce5b-var-run\") pod \"ovn-controller-hcblc\" (UID: \"462c6048-51ec-46dd-8eda-64398e53ce5b\") " pod="openstack/ovn-controller-hcblc" Dec 03 06:58:08 crc kubenswrapper[4475]: I1203 06:58:08.745114 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/462c6048-51ec-46dd-8eda-64398e53ce5b-combined-ca-bundle\") pod \"ovn-controller-hcblc\" (UID: \"462c6048-51ec-46dd-8eda-64398e53ce5b\") " pod="openstack/ovn-controller-hcblc" Dec 03 
06:58:08 crc kubenswrapper[4475]: I1203 06:58:08.745138 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2cpv5\" (UniqueName: \"kubernetes.io/projected/462c6048-51ec-46dd-8eda-64398e53ce5b-kube-api-access-2cpv5\") pod \"ovn-controller-hcblc\" (UID: \"462c6048-51ec-46dd-8eda-64398e53ce5b\") " pod="openstack/ovn-controller-hcblc" Dec 03 06:58:08 crc kubenswrapper[4475]: I1203 06:58:08.745187 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-controller-tls-certs\" (UniqueName: \"kubernetes.io/secret/462c6048-51ec-46dd-8eda-64398e53ce5b-ovn-controller-tls-certs\") pod \"ovn-controller-hcblc\" (UID: \"462c6048-51ec-46dd-8eda-64398e53ce5b\") " pod="openstack/ovn-controller-hcblc" Dec 03 06:58:08 crc kubenswrapper[4475]: I1203 06:58:08.847101 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/462c6048-51ec-46dd-8eda-64398e53ce5b-scripts\") pod \"ovn-controller-hcblc\" (UID: \"462c6048-51ec-46dd-8eda-64398e53ce5b\") " pod="openstack/ovn-controller-hcblc" Dec 03 06:58:08 crc kubenswrapper[4475]: I1203 06:58:08.847141 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/9ae131a1-4eba-47eb-9904-0906e5be196a-scripts\") pod \"ovn-controller-ovs-blpfh\" (UID: \"9ae131a1-4eba-47eb-9904-0906e5be196a\") " pod="openstack/ovn-controller-ovs-blpfh" Dec 03 06:58:08 crc kubenswrapper[4475]: I1203 06:58:08.847169 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/9ae131a1-4eba-47eb-9904-0906e5be196a-var-run\") pod \"ovn-controller-ovs-blpfh\" (UID: \"9ae131a1-4eba-47eb-9904-0906e5be196a\") " pod="openstack/ovn-controller-ovs-blpfh" Dec 03 06:58:08 crc kubenswrapper[4475]: I1203 
06:58:08.847201 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/462c6048-51ec-46dd-8eda-64398e53ce5b-var-run\") pod \"ovn-controller-hcblc\" (UID: \"462c6048-51ec-46dd-8eda-64398e53ce5b\") " pod="openstack/ovn-controller-hcblc" Dec 03 06:58:08 crc kubenswrapper[4475]: I1203 06:58:08.847226 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/462c6048-51ec-46dd-8eda-64398e53ce5b-combined-ca-bundle\") pod \"ovn-controller-hcblc\" (UID: \"462c6048-51ec-46dd-8eda-64398e53ce5b\") " pod="openstack/ovn-controller-hcblc" Dec 03 06:58:08 crc kubenswrapper[4475]: I1203 06:58:08.847246 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2cpv5\" (UniqueName: \"kubernetes.io/projected/462c6048-51ec-46dd-8eda-64398e53ce5b-kube-api-access-2cpv5\") pod \"ovn-controller-hcblc\" (UID: \"462c6048-51ec-46dd-8eda-64398e53ce5b\") " pod="openstack/ovn-controller-hcblc" Dec 03 06:58:08 crc kubenswrapper[4475]: I1203 06:58:08.847263 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-ovs\" (UniqueName: \"kubernetes.io/host-path/9ae131a1-4eba-47eb-9904-0906e5be196a-etc-ovs\") pod \"ovn-controller-ovs-blpfh\" (UID: \"9ae131a1-4eba-47eb-9904-0906e5be196a\") " pod="openstack/ovn-controller-ovs-blpfh" Dec 03 06:58:08 crc kubenswrapper[4475]: I1203 06:58:08.847283 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib\" (UniqueName: \"kubernetes.io/host-path/9ae131a1-4eba-47eb-9904-0906e5be196a-var-lib\") pod \"ovn-controller-ovs-blpfh\" (UID: \"9ae131a1-4eba-47eb-9904-0906e5be196a\") " pod="openstack/ovn-controller-ovs-blpfh" Dec 03 06:58:08 crc kubenswrapper[4475]: I1203 06:58:08.847310 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"ovn-controller-tls-certs\" (UniqueName: \"kubernetes.io/secret/462c6048-51ec-46dd-8eda-64398e53ce5b-ovn-controller-tls-certs\") pod \"ovn-controller-hcblc\" (UID: \"462c6048-51ec-46dd-8eda-64398e53ce5b\") " pod="openstack/ovn-controller-hcblc" Dec 03 06:58:08 crc kubenswrapper[4475]: I1203 06:58:08.847332 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2vrs5\" (UniqueName: \"kubernetes.io/projected/9ae131a1-4eba-47eb-9904-0906e5be196a-kube-api-access-2vrs5\") pod \"ovn-controller-ovs-blpfh\" (UID: \"9ae131a1-4eba-47eb-9904-0906e5be196a\") " pod="openstack/ovn-controller-ovs-blpfh" Dec 03 06:58:08 crc kubenswrapper[4475]: I1203 06:58:08.847354 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/462c6048-51ec-46dd-8eda-64398e53ce5b-var-run-ovn\") pod \"ovn-controller-hcblc\" (UID: \"462c6048-51ec-46dd-8eda-64398e53ce5b\") " pod="openstack/ovn-controller-hcblc" Dec 03 06:58:08 crc kubenswrapper[4475]: I1203 06:58:08.847376 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/462c6048-51ec-46dd-8eda-64398e53ce5b-var-log-ovn\") pod \"ovn-controller-hcblc\" (UID: \"462c6048-51ec-46dd-8eda-64398e53ce5b\") " pod="openstack/ovn-controller-hcblc" Dec 03 06:58:08 crc kubenswrapper[4475]: I1203 06:58:08.847395 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/9ae131a1-4eba-47eb-9904-0906e5be196a-var-log\") pod \"ovn-controller-ovs-blpfh\" (UID: \"9ae131a1-4eba-47eb-9904-0906e5be196a\") " pod="openstack/ovn-controller-ovs-blpfh" Dec 03 06:58:08 crc kubenswrapper[4475]: I1203 06:58:08.848128 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log-ovn\" (UniqueName: 
\"kubernetes.io/host-path/462c6048-51ec-46dd-8eda-64398e53ce5b-var-log-ovn\") pod \"ovn-controller-hcblc\" (UID: \"462c6048-51ec-46dd-8eda-64398e53ce5b\") " pod="openstack/ovn-controller-hcblc" Dec 03 06:58:08 crc kubenswrapper[4475]: I1203 06:58:08.848156 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/462c6048-51ec-46dd-8eda-64398e53ce5b-var-run\") pod \"ovn-controller-hcblc\" (UID: \"462c6048-51ec-46dd-8eda-64398e53ce5b\") " pod="openstack/ovn-controller-hcblc" Dec 03 06:58:08 crc kubenswrapper[4475]: I1203 06:58:08.849072 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/462c6048-51ec-46dd-8eda-64398e53ce5b-var-run-ovn\") pod \"ovn-controller-hcblc\" (UID: \"462c6048-51ec-46dd-8eda-64398e53ce5b\") " pod="openstack/ovn-controller-hcblc" Dec 03 06:58:08 crc kubenswrapper[4475]: I1203 06:58:08.849981 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/462c6048-51ec-46dd-8eda-64398e53ce5b-scripts\") pod \"ovn-controller-hcblc\" (UID: \"462c6048-51ec-46dd-8eda-64398e53ce5b\") " pod="openstack/ovn-controller-hcblc" Dec 03 06:58:08 crc kubenswrapper[4475]: I1203 06:58:08.861401 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2cpv5\" (UniqueName: \"kubernetes.io/projected/462c6048-51ec-46dd-8eda-64398e53ce5b-kube-api-access-2cpv5\") pod \"ovn-controller-hcblc\" (UID: \"462c6048-51ec-46dd-8eda-64398e53ce5b\") " pod="openstack/ovn-controller-hcblc" Dec 03 06:58:08 crc kubenswrapper[4475]: I1203 06:58:08.867438 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-controller-tls-certs\" (UniqueName: \"kubernetes.io/secret/462c6048-51ec-46dd-8eda-64398e53ce5b-ovn-controller-tls-certs\") pod \"ovn-controller-hcblc\" (UID: \"462c6048-51ec-46dd-8eda-64398e53ce5b\") " 
pod="openstack/ovn-controller-hcblc" Dec 03 06:58:08 crc kubenswrapper[4475]: I1203 06:58:08.873222 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/462c6048-51ec-46dd-8eda-64398e53ce5b-combined-ca-bundle\") pod \"ovn-controller-hcblc\" (UID: \"462c6048-51ec-46dd-8eda-64398e53ce5b\") " pod="openstack/ovn-controller-hcblc" Dec 03 06:58:08 crc kubenswrapper[4475]: I1203 06:58:08.949124 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/9ae131a1-4eba-47eb-9904-0906e5be196a-scripts\") pod \"ovn-controller-ovs-blpfh\" (UID: \"9ae131a1-4eba-47eb-9904-0906e5be196a\") " pod="openstack/ovn-controller-ovs-blpfh" Dec 03 06:58:08 crc kubenswrapper[4475]: I1203 06:58:08.949174 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/9ae131a1-4eba-47eb-9904-0906e5be196a-var-run\") pod \"ovn-controller-ovs-blpfh\" (UID: \"9ae131a1-4eba-47eb-9904-0906e5be196a\") " pod="openstack/ovn-controller-ovs-blpfh" Dec 03 06:58:08 crc kubenswrapper[4475]: I1203 06:58:08.949214 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-ovs\" (UniqueName: \"kubernetes.io/host-path/9ae131a1-4eba-47eb-9904-0906e5be196a-etc-ovs\") pod \"ovn-controller-ovs-blpfh\" (UID: \"9ae131a1-4eba-47eb-9904-0906e5be196a\") " pod="openstack/ovn-controller-ovs-blpfh" Dec 03 06:58:08 crc kubenswrapper[4475]: I1203 06:58:08.949232 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib\" (UniqueName: \"kubernetes.io/host-path/9ae131a1-4eba-47eb-9904-0906e5be196a-var-lib\") pod \"ovn-controller-ovs-blpfh\" (UID: \"9ae131a1-4eba-47eb-9904-0906e5be196a\") " pod="openstack/ovn-controller-ovs-blpfh" Dec 03 06:58:08 crc kubenswrapper[4475]: I1203 06:58:08.949267 4475 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"kube-api-access-2vrs5\" (UniqueName: \"kubernetes.io/projected/9ae131a1-4eba-47eb-9904-0906e5be196a-kube-api-access-2vrs5\") pod \"ovn-controller-ovs-blpfh\" (UID: \"9ae131a1-4eba-47eb-9904-0906e5be196a\") " pod="openstack/ovn-controller-ovs-blpfh" Dec 03 06:58:08 crc kubenswrapper[4475]: I1203 06:58:08.949300 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/9ae131a1-4eba-47eb-9904-0906e5be196a-var-log\") pod \"ovn-controller-ovs-blpfh\" (UID: \"9ae131a1-4eba-47eb-9904-0906e5be196a\") " pod="openstack/ovn-controller-ovs-blpfh" Dec 03 06:58:08 crc kubenswrapper[4475]: I1203 06:58:08.949502 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/9ae131a1-4eba-47eb-9904-0906e5be196a-var-log\") pod \"ovn-controller-ovs-blpfh\" (UID: \"9ae131a1-4eba-47eb-9904-0906e5be196a\") " pod="openstack/ovn-controller-ovs-blpfh" Dec 03 06:58:08 crc kubenswrapper[4475]: I1203 06:58:08.950667 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/9ae131a1-4eba-47eb-9904-0906e5be196a-var-run\") pod \"ovn-controller-ovs-blpfh\" (UID: \"9ae131a1-4eba-47eb-9904-0906e5be196a\") " pod="openstack/ovn-controller-ovs-blpfh" Dec 03 06:58:08 crc kubenswrapper[4475]: I1203 06:58:08.950712 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib\" (UniqueName: \"kubernetes.io/host-path/9ae131a1-4eba-47eb-9904-0906e5be196a-var-lib\") pod \"ovn-controller-ovs-blpfh\" (UID: \"9ae131a1-4eba-47eb-9904-0906e5be196a\") " pod="openstack/ovn-controller-ovs-blpfh" Dec 03 06:58:08 crc kubenswrapper[4475]: I1203 06:58:08.950712 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-ovs\" (UniqueName: \"kubernetes.io/host-path/9ae131a1-4eba-47eb-9904-0906e5be196a-etc-ovs\") pod \"ovn-controller-ovs-blpfh\" (UID: 
\"9ae131a1-4eba-47eb-9904-0906e5be196a\") " pod="openstack/ovn-controller-ovs-blpfh" Dec 03 06:58:08 crc kubenswrapper[4475]: I1203 06:58:08.951654 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/9ae131a1-4eba-47eb-9904-0906e5be196a-scripts\") pod \"ovn-controller-ovs-blpfh\" (UID: \"9ae131a1-4eba-47eb-9904-0906e5be196a\") " pod="openstack/ovn-controller-ovs-blpfh" Dec 03 06:58:08 crc kubenswrapper[4475]: I1203 06:58:08.966241 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2vrs5\" (UniqueName: \"kubernetes.io/projected/9ae131a1-4eba-47eb-9904-0906e5be196a-kube-api-access-2vrs5\") pod \"ovn-controller-ovs-blpfh\" (UID: \"9ae131a1-4eba-47eb-9904-0906e5be196a\") " pod="openstack/ovn-controller-ovs-blpfh" Dec 03 06:58:08 crc kubenswrapper[4475]: I1203 06:58:08.986947 4475 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-hcblc" Dec 03 06:58:09 crc kubenswrapper[4475]: I1203 06:58:09.032969 4475 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-ovs-blpfh" Dec 03 06:58:09 crc kubenswrapper[4475]: I1203 06:58:09.543778 4475 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-hcblc"] Dec 03 06:58:09 crc kubenswrapper[4475]: W1203 06:58:09.593481 4475 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod462c6048_51ec_46dd_8eda_64398e53ce5b.slice/crio-16258fed8b062eb08136ba1594e58db0b9f916303d83783699a2b637b6a87584 WatchSource:0}: Error finding container 16258fed8b062eb08136ba1594e58db0b9f916303d83783699a2b637b6a87584: Status 404 returned error can't find the container with id 16258fed8b062eb08136ba1594e58db0b9f916303d83783699a2b637b6a87584 Dec 03 06:58:09 crc kubenswrapper[4475]: I1203 06:58:09.682498 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-hcblc" event={"ID":"462c6048-51ec-46dd-8eda-64398e53ce5b","Type":"ContainerStarted","Data":"16258fed8b062eb08136ba1594e58db0b9f916303d83783699a2b637b6a87584"} Dec 03 06:58:09 crc kubenswrapper[4475]: I1203 06:58:09.685428 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"fd232cd1-6aca-43cc-9876-535cd9eb39eb","Type":"ContainerStarted","Data":"27b5d750e28b7a2b4a5d00fbd91c782b113e6570522ce95e330baa819b48925b"} Dec 03 06:58:09 crc kubenswrapper[4475]: I1203 06:58:09.686026 4475 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/kube-state-metrics-0" Dec 03 06:58:09 crc kubenswrapper[4475]: I1203 06:58:09.713302 4475 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/kube-state-metrics-0" podStartSLOduration=1.765117507 podStartE2EDuration="4.713288458s" podCreationTimestamp="2025-12-03 06:58:05 +0000 UTC" firstStartedPulling="2025-12-03 06:58:06.184444201 +0000 UTC m=+770.989342535" lastFinishedPulling="2025-12-03 06:58:09.132615153 +0000 UTC m=+773.937513486" 
observedRunningTime="2025-12-03 06:58:09.708242897 +0000 UTC m=+774.513141231" watchObservedRunningTime="2025-12-03 06:58:09.713288458 +0000 UTC m=+774.518186792" Dec 03 06:58:09 crc kubenswrapper[4475]: I1203 06:58:09.896595 4475 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-ovs-blpfh"] Dec 03 06:58:09 crc kubenswrapper[4475]: I1203 06:58:09.943549 4475 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-metrics-df9n4"] Dec 03 06:58:09 crc kubenswrapper[4475]: I1203 06:58:09.950345 4475 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-metrics-df9n4" Dec 03 06:58:09 crc kubenswrapper[4475]: I1203 06:58:09.954144 4475 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovn-metrics" Dec 03 06:58:09 crc kubenswrapper[4475]: I1203 06:58:09.954279 4475 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-metrics-config" Dec 03 06:58:09 crc kubenswrapper[4475]: I1203 06:58:09.981851 4475 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-metrics-df9n4"] Dec 03 06:58:10 crc kubenswrapper[4475]: I1203 06:58:10.082391 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dpctq\" (UniqueName: \"kubernetes.io/projected/bc0d464b-1908-45ac-8ee3-2d470c68deda-kube-api-access-dpctq\") pod \"ovn-controller-metrics-df9n4\" (UID: \"bc0d464b-1908-45ac-8ee3-2d470c68deda\") " pod="openstack/ovn-controller-metrics-df9n4" Dec 03 06:58:10 crc kubenswrapper[4475]: I1203 06:58:10.082555 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/bc0d464b-1908-45ac-8ee3-2d470c68deda-config\") pod \"ovn-controller-metrics-df9n4\" (UID: \"bc0d464b-1908-45ac-8ee3-2d470c68deda\") " pod="openstack/ovn-controller-metrics-df9n4" Dec 03 06:58:10 
crc kubenswrapper[4475]: I1203 06:58:10.082586 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/bc0d464b-1908-45ac-8ee3-2d470c68deda-metrics-certs-tls-certs\") pod \"ovn-controller-metrics-df9n4\" (UID: \"bc0d464b-1908-45ac-8ee3-2d470c68deda\") " pod="openstack/ovn-controller-metrics-df9n4" Dec 03 06:58:10 crc kubenswrapper[4475]: I1203 06:58:10.082607 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/host-path/bc0d464b-1908-45ac-8ee3-2d470c68deda-ovn-rundir\") pod \"ovn-controller-metrics-df9n4\" (UID: \"bc0d464b-1908-45ac-8ee3-2d470c68deda\") " pod="openstack/ovn-controller-metrics-df9n4" Dec 03 06:58:10 crc kubenswrapper[4475]: I1203 06:58:10.082624 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovs-rundir\" (UniqueName: \"kubernetes.io/host-path/bc0d464b-1908-45ac-8ee3-2d470c68deda-ovs-rundir\") pod \"ovn-controller-metrics-df9n4\" (UID: \"bc0d464b-1908-45ac-8ee3-2d470c68deda\") " pod="openstack/ovn-controller-metrics-df9n4" Dec 03 06:58:10 crc kubenswrapper[4475]: I1203 06:58:10.082726 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bc0d464b-1908-45ac-8ee3-2d470c68deda-combined-ca-bundle\") pod \"ovn-controller-metrics-df9n4\" (UID: \"bc0d464b-1908-45ac-8ee3-2d470c68deda\") " pod="openstack/ovn-controller-metrics-df9n4" Dec 03 06:58:10 crc kubenswrapper[4475]: I1203 06:58:10.184543 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dpctq\" (UniqueName: \"kubernetes.io/projected/bc0d464b-1908-45ac-8ee3-2d470c68deda-kube-api-access-dpctq\") pod \"ovn-controller-metrics-df9n4\" (UID: \"bc0d464b-1908-45ac-8ee3-2d470c68deda\") " 
pod="openstack/ovn-controller-metrics-df9n4" Dec 03 06:58:10 crc kubenswrapper[4475]: I1203 06:58:10.184643 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/bc0d464b-1908-45ac-8ee3-2d470c68deda-config\") pod \"ovn-controller-metrics-df9n4\" (UID: \"bc0d464b-1908-45ac-8ee3-2d470c68deda\") " pod="openstack/ovn-controller-metrics-df9n4" Dec 03 06:58:10 crc kubenswrapper[4475]: I1203 06:58:10.184665 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/bc0d464b-1908-45ac-8ee3-2d470c68deda-metrics-certs-tls-certs\") pod \"ovn-controller-metrics-df9n4\" (UID: \"bc0d464b-1908-45ac-8ee3-2d470c68deda\") " pod="openstack/ovn-controller-metrics-df9n4" Dec 03 06:58:10 crc kubenswrapper[4475]: I1203 06:58:10.184684 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/host-path/bc0d464b-1908-45ac-8ee3-2d470c68deda-ovn-rundir\") pod \"ovn-controller-metrics-df9n4\" (UID: \"bc0d464b-1908-45ac-8ee3-2d470c68deda\") " pod="openstack/ovn-controller-metrics-df9n4" Dec 03 06:58:10 crc kubenswrapper[4475]: I1203 06:58:10.184702 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovs-rundir\" (UniqueName: \"kubernetes.io/host-path/bc0d464b-1908-45ac-8ee3-2d470c68deda-ovs-rundir\") pod \"ovn-controller-metrics-df9n4\" (UID: \"bc0d464b-1908-45ac-8ee3-2d470c68deda\") " pod="openstack/ovn-controller-metrics-df9n4" Dec 03 06:58:10 crc kubenswrapper[4475]: I1203 06:58:10.185108 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/host-path/bc0d464b-1908-45ac-8ee3-2d470c68deda-ovn-rundir\") pod \"ovn-controller-metrics-df9n4\" (UID: \"bc0d464b-1908-45ac-8ee3-2d470c68deda\") " pod="openstack/ovn-controller-metrics-df9n4" Dec 03 06:58:10 crc kubenswrapper[4475]: 
I1203 06:58:10.185260 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovs-rundir\" (UniqueName: \"kubernetes.io/host-path/bc0d464b-1908-45ac-8ee3-2d470c68deda-ovs-rundir\") pod \"ovn-controller-metrics-df9n4\" (UID: \"bc0d464b-1908-45ac-8ee3-2d470c68deda\") " pod="openstack/ovn-controller-metrics-df9n4" Dec 03 06:58:10 crc kubenswrapper[4475]: I1203 06:58:10.185321 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/bc0d464b-1908-45ac-8ee3-2d470c68deda-config\") pod \"ovn-controller-metrics-df9n4\" (UID: \"bc0d464b-1908-45ac-8ee3-2d470c68deda\") " pod="openstack/ovn-controller-metrics-df9n4" Dec 03 06:58:10 crc kubenswrapper[4475]: I1203 06:58:10.185409 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bc0d464b-1908-45ac-8ee3-2d470c68deda-combined-ca-bundle\") pod \"ovn-controller-metrics-df9n4\" (UID: \"bc0d464b-1908-45ac-8ee3-2d470c68deda\") " pod="openstack/ovn-controller-metrics-df9n4" Dec 03 06:58:10 crc kubenswrapper[4475]: I1203 06:58:10.190186 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bc0d464b-1908-45ac-8ee3-2d470c68deda-combined-ca-bundle\") pod \"ovn-controller-metrics-df9n4\" (UID: \"bc0d464b-1908-45ac-8ee3-2d470c68deda\") " pod="openstack/ovn-controller-metrics-df9n4" Dec 03 06:58:10 crc kubenswrapper[4475]: I1203 06:58:10.190537 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/bc0d464b-1908-45ac-8ee3-2d470c68deda-metrics-certs-tls-certs\") pod \"ovn-controller-metrics-df9n4\" (UID: \"bc0d464b-1908-45ac-8ee3-2d470c68deda\") " pod="openstack/ovn-controller-metrics-df9n4" Dec 03 06:58:10 crc kubenswrapper[4475]: I1203 06:58:10.197750 4475 operation_generator.go:637] "MountVolume.SetUp succeeded 
for volume \"kube-api-access-dpctq\" (UniqueName: \"kubernetes.io/projected/bc0d464b-1908-45ac-8ee3-2d470c68deda-kube-api-access-dpctq\") pod \"ovn-controller-metrics-df9n4\" (UID: \"bc0d464b-1908-45ac-8ee3-2d470c68deda\") " pod="openstack/ovn-controller-metrics-df9n4" Dec 03 06:58:10 crc kubenswrapper[4475]: I1203 06:58:10.274362 4475 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-metrics-df9n4" Dec 03 06:58:10 crc kubenswrapper[4475]: I1203 06:58:10.695338 4475 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-metrics-df9n4"] Dec 03 06:58:10 crc kubenswrapper[4475]: I1203 06:58:10.700364 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-blpfh" event={"ID":"9ae131a1-4eba-47eb-9904-0906e5be196a","Type":"ContainerStarted","Data":"65cee51477e1faa348427100b4dad2ec95e1fd785b6bed8f76c13d79c7f5c769"} Dec 03 06:58:10 crc kubenswrapper[4475]: W1203 06:58:10.712676 4475 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podbc0d464b_1908_45ac_8ee3_2d470c68deda.slice/crio-e7d6d0c90e3e799c7b5225e4b00455cb641bff94d8ed1c84ea255a925e7d2e7b WatchSource:0}: Error finding container e7d6d0c90e3e799c7b5225e4b00455cb641bff94d8ed1c84ea255a925e7d2e7b: Status 404 returned error can't find the container with id e7d6d0c90e3e799c7b5225e4b00455cb641bff94d8ed1c84ea255a925e7d2e7b Dec 03 06:58:10 crc kubenswrapper[4475]: I1203 06:58:10.832988 4475 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovsdbserver-sb-0"] Dec 03 06:58:10 crc kubenswrapper[4475]: I1203 06:58:10.836258 4475 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovsdbserver-sb-0" Dec 03 06:58:10 crc kubenswrapper[4475]: I1203 06:58:10.836901 4475 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-sb-0"] Dec 03 06:58:10 crc kubenswrapper[4475]: I1203 06:58:10.840631 4475 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovndbcluster-sb-ovndbs" Dec 03 06:58:10 crc kubenswrapper[4475]: I1203 06:58:10.841162 4475 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-sb-config" Dec 03 06:58:10 crc kubenswrapper[4475]: I1203 06:58:10.841364 4475 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovncluster-ovndbcluster-sb-dockercfg-s5rn7" Dec 03 06:58:10 crc kubenswrapper[4475]: I1203 06:58:10.855569 4475 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-sb-scripts" Dec 03 06:58:11 crc kubenswrapper[4475]: I1203 06:58:11.000016 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/9919f7e5-42de-45e9-b403-8e5394d971ad-metrics-certs-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"9919f7e5-42de-45e9-b403-8e5394d971ad\") " pod="openstack/ovsdbserver-sb-0" Dec 03 06:58:11 crc kubenswrapper[4475]: I1203 06:58:11.000062 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9919f7e5-42de-45e9-b403-8e5394d971ad-combined-ca-bundle\") pod \"ovsdbserver-sb-0\" (UID: \"9919f7e5-42de-45e9-b403-8e5394d971ad\") " pod="openstack/ovsdbserver-sb-0" Dec 03 06:58:11 crc kubenswrapper[4475]: I1203 06:58:11.000095 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lbmnt\" (UniqueName: \"kubernetes.io/projected/9919f7e5-42de-45e9-b403-8e5394d971ad-kube-api-access-lbmnt\") pod 
\"ovsdbserver-sb-0\" (UID: \"9919f7e5-42de-45e9-b403-8e5394d971ad\") " pod="openstack/ovsdbserver-sb-0" Dec 03 06:58:11 crc kubenswrapper[4475]: I1203 06:58:11.000146 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/9919f7e5-42de-45e9-b403-8e5394d971ad-ovsdb-rundir\") pod \"ovsdbserver-sb-0\" (UID: \"9919f7e5-42de-45e9-b403-8e5394d971ad\") " pod="openstack/ovsdbserver-sb-0" Dec 03 06:58:11 crc kubenswrapper[4475]: I1203 06:58:11.000173 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/9919f7e5-42de-45e9-b403-8e5394d971ad-scripts\") pod \"ovsdbserver-sb-0\" (UID: \"9919f7e5-42de-45e9-b403-8e5394d971ad\") " pod="openstack/ovsdbserver-sb-0" Dec 03 06:58:11 crc kubenswrapper[4475]: I1203 06:58:11.000199 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/9919f7e5-42de-45e9-b403-8e5394d971ad-ovsdbserver-sb-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"9919f7e5-42de-45e9-b403-8e5394d971ad\") " pod="openstack/ovsdbserver-sb-0" Dec 03 06:58:11 crc kubenswrapper[4475]: I1203 06:58:11.000214 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"ovsdbserver-sb-0\" (UID: \"9919f7e5-42de-45e9-b403-8e5394d971ad\") " pod="openstack/ovsdbserver-sb-0" Dec 03 06:58:11 crc kubenswrapper[4475]: I1203 06:58:11.000236 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9919f7e5-42de-45e9-b403-8e5394d971ad-config\") pod \"ovsdbserver-sb-0\" (UID: \"9919f7e5-42de-45e9-b403-8e5394d971ad\") " pod="openstack/ovsdbserver-sb-0" Dec 03 06:58:11 
crc kubenswrapper[4475]: I1203 06:58:11.101756 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9919f7e5-42de-45e9-b403-8e5394d971ad-config\") pod \"ovsdbserver-sb-0\" (UID: \"9919f7e5-42de-45e9-b403-8e5394d971ad\") " pod="openstack/ovsdbserver-sb-0" Dec 03 06:58:11 crc kubenswrapper[4475]: I1203 06:58:11.101808 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/9919f7e5-42de-45e9-b403-8e5394d971ad-metrics-certs-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"9919f7e5-42de-45e9-b403-8e5394d971ad\") " pod="openstack/ovsdbserver-sb-0" Dec 03 06:58:11 crc kubenswrapper[4475]: I1203 06:58:11.101832 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9919f7e5-42de-45e9-b403-8e5394d971ad-combined-ca-bundle\") pod \"ovsdbserver-sb-0\" (UID: \"9919f7e5-42de-45e9-b403-8e5394d971ad\") " pod="openstack/ovsdbserver-sb-0" Dec 03 06:58:11 crc kubenswrapper[4475]: I1203 06:58:11.101869 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lbmnt\" (UniqueName: \"kubernetes.io/projected/9919f7e5-42de-45e9-b403-8e5394d971ad-kube-api-access-lbmnt\") pod \"ovsdbserver-sb-0\" (UID: \"9919f7e5-42de-45e9-b403-8e5394d971ad\") " pod="openstack/ovsdbserver-sb-0" Dec 03 06:58:11 crc kubenswrapper[4475]: I1203 06:58:11.101919 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/9919f7e5-42de-45e9-b403-8e5394d971ad-ovsdb-rundir\") pod \"ovsdbserver-sb-0\" (UID: \"9919f7e5-42de-45e9-b403-8e5394d971ad\") " pod="openstack/ovsdbserver-sb-0" Dec 03 06:58:11 crc kubenswrapper[4475]: I1203 06:58:11.101944 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: 
\"kubernetes.io/configmap/9919f7e5-42de-45e9-b403-8e5394d971ad-scripts\") pod \"ovsdbserver-sb-0\" (UID: \"9919f7e5-42de-45e9-b403-8e5394d971ad\") " pod="openstack/ovsdbserver-sb-0" Dec 03 06:58:11 crc kubenswrapper[4475]: I1203 06:58:11.101970 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/9919f7e5-42de-45e9-b403-8e5394d971ad-ovsdbserver-sb-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"9919f7e5-42de-45e9-b403-8e5394d971ad\") " pod="openstack/ovsdbserver-sb-0" Dec 03 06:58:11 crc kubenswrapper[4475]: I1203 06:58:11.101984 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"ovsdbserver-sb-0\" (UID: \"9919f7e5-42de-45e9-b403-8e5394d971ad\") " pod="openstack/ovsdbserver-sb-0" Dec 03 06:58:11 crc kubenswrapper[4475]: I1203 06:58:11.103274 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9919f7e5-42de-45e9-b403-8e5394d971ad-config\") pod \"ovsdbserver-sb-0\" (UID: \"9919f7e5-42de-45e9-b403-8e5394d971ad\") " pod="openstack/ovsdbserver-sb-0" Dec 03 06:58:11 crc kubenswrapper[4475]: I1203 06:58:11.103908 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/9919f7e5-42de-45e9-b403-8e5394d971ad-ovsdb-rundir\") pod \"ovsdbserver-sb-0\" (UID: \"9919f7e5-42de-45e9-b403-8e5394d971ad\") " pod="openstack/ovsdbserver-sb-0" Dec 03 06:58:11 crc kubenswrapper[4475]: I1203 06:58:11.104553 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/9919f7e5-42de-45e9-b403-8e5394d971ad-scripts\") pod \"ovsdbserver-sb-0\" (UID: \"9919f7e5-42de-45e9-b403-8e5394d971ad\") " pod="openstack/ovsdbserver-sb-0" Dec 03 06:58:11 crc kubenswrapper[4475]: I1203 
06:58:11.104609 4475 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"ovsdbserver-sb-0\" (UID: \"9919f7e5-42de-45e9-b403-8e5394d971ad\") device mount path \"/mnt/openstack/pv04\"" pod="openstack/ovsdbserver-sb-0" Dec 03 06:58:11 crc kubenswrapper[4475]: I1203 06:58:11.109848 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/9919f7e5-42de-45e9-b403-8e5394d971ad-metrics-certs-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"9919f7e5-42de-45e9-b403-8e5394d971ad\") " pod="openstack/ovsdbserver-sb-0" Dec 03 06:58:11 crc kubenswrapper[4475]: I1203 06:58:11.112284 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9919f7e5-42de-45e9-b403-8e5394d971ad-combined-ca-bundle\") pod \"ovsdbserver-sb-0\" (UID: \"9919f7e5-42de-45e9-b403-8e5394d971ad\") " pod="openstack/ovsdbserver-sb-0" Dec 03 06:58:11 crc kubenswrapper[4475]: I1203 06:58:11.125779 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/9919f7e5-42de-45e9-b403-8e5394d971ad-ovsdbserver-sb-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"9919f7e5-42de-45e9-b403-8e5394d971ad\") " pod="openstack/ovsdbserver-sb-0" Dec 03 06:58:11 crc kubenswrapper[4475]: I1203 06:58:11.126829 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lbmnt\" (UniqueName: \"kubernetes.io/projected/9919f7e5-42de-45e9-b403-8e5394d971ad-kube-api-access-lbmnt\") pod \"ovsdbserver-sb-0\" (UID: \"9919f7e5-42de-45e9-b403-8e5394d971ad\") " pod="openstack/ovsdbserver-sb-0" Dec 03 06:58:11 crc kubenswrapper[4475]: I1203 06:58:11.140932 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage04-crc\" (UniqueName: 
\"kubernetes.io/local-volume/local-storage04-crc\") pod \"ovsdbserver-sb-0\" (UID: \"9919f7e5-42de-45e9-b403-8e5394d971ad\") " pod="openstack/ovsdbserver-sb-0" Dec 03 06:58:11 crc kubenswrapper[4475]: I1203 06:58:11.216713 4475 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-sb-0" Dec 03 06:58:11 crc kubenswrapper[4475]: I1203 06:58:11.715014 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-metrics-df9n4" event={"ID":"bc0d464b-1908-45ac-8ee3-2d470c68deda","Type":"ContainerStarted","Data":"e7d6d0c90e3e799c7b5225e4b00455cb641bff94d8ed1c84ea255a925e7d2e7b"} Dec 03 06:58:11 crc kubenswrapper[4475]: I1203 06:58:11.761126 4475 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-sb-0"] Dec 03 06:58:11 crc kubenswrapper[4475]: W1203 06:58:11.769664 4475 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod9919f7e5_42de_45e9_b403_8e5394d971ad.slice/crio-197c2c7da3b4f770418145389392778ab88e90635fa49ee8e1f244040d1bdadb WatchSource:0}: Error finding container 197c2c7da3b4f770418145389392778ab88e90635fa49ee8e1f244040d1bdadb: Status 404 returned error can't find the container with id 197c2c7da3b4f770418145389392778ab88e90635fa49ee8e1f244040d1bdadb Dec 03 06:58:12 crc kubenswrapper[4475]: I1203 06:58:12.080956 4475 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovsdbserver-nb-0"] Dec 03 06:58:12 crc kubenswrapper[4475]: I1203 06:58:12.082025 4475 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovsdbserver-nb-0" Dec 03 06:58:12 crc kubenswrapper[4475]: I1203 06:58:12.089469 4475 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovncluster-ovndbcluster-nb-dockercfg-6fw7f" Dec 03 06:58:12 crc kubenswrapper[4475]: I1203 06:58:12.089640 4475 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-nb-config" Dec 03 06:58:12 crc kubenswrapper[4475]: I1203 06:58:12.089758 4475 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-nb-scripts" Dec 03 06:58:12 crc kubenswrapper[4475]: I1203 06:58:12.091759 4475 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovndbcluster-nb-ovndbs" Dec 03 06:58:12 crc kubenswrapper[4475]: I1203 06:58:12.097728 4475 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-nb-0"] Dec 03 06:58:12 crc kubenswrapper[4475]: I1203 06:58:12.231662 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"ovsdbserver-nb-0\" (UID: \"2526f69f-3f5a-4155-bd03-96c2b2c7035b\") " pod="openstack/ovsdbserver-nb-0" Dec 03 06:58:12 crc kubenswrapper[4475]: I1203 06:58:12.231705 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/2526f69f-3f5a-4155-bd03-96c2b2c7035b-scripts\") pod \"ovsdbserver-nb-0\" (UID: \"2526f69f-3f5a-4155-bd03-96c2b2c7035b\") " pod="openstack/ovsdbserver-nb-0" Dec 03 06:58:12 crc kubenswrapper[4475]: I1203 06:58:12.231752 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/2526f69f-3f5a-4155-bd03-96c2b2c7035b-ovsdb-rundir\") pod \"ovsdbserver-nb-0\" (UID: \"2526f69f-3f5a-4155-bd03-96c2b2c7035b\") " 
pod="openstack/ovsdbserver-nb-0" Dec 03 06:58:12 crc kubenswrapper[4475]: I1203 06:58:12.231811 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2526f69f-3f5a-4155-bd03-96c2b2c7035b-combined-ca-bundle\") pod \"ovsdbserver-nb-0\" (UID: \"2526f69f-3f5a-4155-bd03-96c2b2c7035b\") " pod="openstack/ovsdbserver-nb-0" Dec 03 06:58:12 crc kubenswrapper[4475]: I1203 06:58:12.231835 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2526f69f-3f5a-4155-bd03-96c2b2c7035b-config\") pod \"ovsdbserver-nb-0\" (UID: \"2526f69f-3f5a-4155-bd03-96c2b2c7035b\") " pod="openstack/ovsdbserver-nb-0" Dec 03 06:58:12 crc kubenswrapper[4475]: I1203 06:58:12.231860 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/2526f69f-3f5a-4155-bd03-96c2b2c7035b-metrics-certs-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"2526f69f-3f5a-4155-bd03-96c2b2c7035b\") " pod="openstack/ovsdbserver-nb-0" Dec 03 06:58:12 crc kubenswrapper[4475]: I1203 06:58:12.231972 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2kvtl\" (UniqueName: \"kubernetes.io/projected/2526f69f-3f5a-4155-bd03-96c2b2c7035b-kube-api-access-2kvtl\") pod \"ovsdbserver-nb-0\" (UID: \"2526f69f-3f5a-4155-bd03-96c2b2c7035b\") " pod="openstack/ovsdbserver-nb-0" Dec 03 06:58:12 crc kubenswrapper[4475]: I1203 06:58:12.232033 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/2526f69f-3f5a-4155-bd03-96c2b2c7035b-ovsdbserver-nb-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"2526f69f-3f5a-4155-bd03-96c2b2c7035b\") " pod="openstack/ovsdbserver-nb-0" Dec 03 
06:58:12 crc kubenswrapper[4475]: I1203 06:58:12.333779 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2526f69f-3f5a-4155-bd03-96c2b2c7035b-combined-ca-bundle\") pod \"ovsdbserver-nb-0\" (UID: \"2526f69f-3f5a-4155-bd03-96c2b2c7035b\") " pod="openstack/ovsdbserver-nb-0" Dec 03 06:58:12 crc kubenswrapper[4475]: I1203 06:58:12.333851 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2526f69f-3f5a-4155-bd03-96c2b2c7035b-config\") pod \"ovsdbserver-nb-0\" (UID: \"2526f69f-3f5a-4155-bd03-96c2b2c7035b\") " pod="openstack/ovsdbserver-nb-0" Dec 03 06:58:12 crc kubenswrapper[4475]: I1203 06:58:12.333871 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/2526f69f-3f5a-4155-bd03-96c2b2c7035b-metrics-certs-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"2526f69f-3f5a-4155-bd03-96c2b2c7035b\") " pod="openstack/ovsdbserver-nb-0" Dec 03 06:58:12 crc kubenswrapper[4475]: I1203 06:58:12.333903 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2kvtl\" (UniqueName: \"kubernetes.io/projected/2526f69f-3f5a-4155-bd03-96c2b2c7035b-kube-api-access-2kvtl\") pod \"ovsdbserver-nb-0\" (UID: \"2526f69f-3f5a-4155-bd03-96c2b2c7035b\") " pod="openstack/ovsdbserver-nb-0" Dec 03 06:58:12 crc kubenswrapper[4475]: I1203 06:58:12.333933 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/2526f69f-3f5a-4155-bd03-96c2b2c7035b-ovsdbserver-nb-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"2526f69f-3f5a-4155-bd03-96c2b2c7035b\") " pod="openstack/ovsdbserver-nb-0" Dec 03 06:58:12 crc kubenswrapper[4475]: I1203 06:58:12.333960 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"ovsdbserver-nb-0\" (UID: \"2526f69f-3f5a-4155-bd03-96c2b2c7035b\") " pod="openstack/ovsdbserver-nb-0" Dec 03 06:58:12 crc kubenswrapper[4475]: I1203 06:58:12.333984 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/2526f69f-3f5a-4155-bd03-96c2b2c7035b-scripts\") pod \"ovsdbserver-nb-0\" (UID: \"2526f69f-3f5a-4155-bd03-96c2b2c7035b\") " pod="openstack/ovsdbserver-nb-0" Dec 03 06:58:12 crc kubenswrapper[4475]: I1203 06:58:12.334047 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/2526f69f-3f5a-4155-bd03-96c2b2c7035b-ovsdb-rundir\") pod \"ovsdbserver-nb-0\" (UID: \"2526f69f-3f5a-4155-bd03-96c2b2c7035b\") " pod="openstack/ovsdbserver-nb-0" Dec 03 06:58:12 crc kubenswrapper[4475]: I1203 06:58:12.334504 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/2526f69f-3f5a-4155-bd03-96c2b2c7035b-ovsdb-rundir\") pod \"ovsdbserver-nb-0\" (UID: \"2526f69f-3f5a-4155-bd03-96c2b2c7035b\") " pod="openstack/ovsdbserver-nb-0" Dec 03 06:58:12 crc kubenswrapper[4475]: I1203 06:58:12.334551 4475 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"ovsdbserver-nb-0\" (UID: \"2526f69f-3f5a-4155-bd03-96c2b2c7035b\") device mount path \"/mnt/openstack/pv03\"" pod="openstack/ovsdbserver-nb-0" Dec 03 06:58:12 crc kubenswrapper[4475]: I1203 06:58:12.334810 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2526f69f-3f5a-4155-bd03-96c2b2c7035b-config\") pod \"ovsdbserver-nb-0\" (UID: \"2526f69f-3f5a-4155-bd03-96c2b2c7035b\") " pod="openstack/ovsdbserver-nb-0" Dec 03 06:58:12 
crc kubenswrapper[4475]: I1203 06:58:12.335273 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/2526f69f-3f5a-4155-bd03-96c2b2c7035b-scripts\") pod \"ovsdbserver-nb-0\" (UID: \"2526f69f-3f5a-4155-bd03-96c2b2c7035b\") " pod="openstack/ovsdbserver-nb-0" Dec 03 06:58:12 crc kubenswrapper[4475]: I1203 06:58:12.341762 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2526f69f-3f5a-4155-bd03-96c2b2c7035b-combined-ca-bundle\") pod \"ovsdbserver-nb-0\" (UID: \"2526f69f-3f5a-4155-bd03-96c2b2c7035b\") " pod="openstack/ovsdbserver-nb-0" Dec 03 06:58:12 crc kubenswrapper[4475]: I1203 06:58:12.342480 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/2526f69f-3f5a-4155-bd03-96c2b2c7035b-metrics-certs-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"2526f69f-3f5a-4155-bd03-96c2b2c7035b\") " pod="openstack/ovsdbserver-nb-0" Dec 03 06:58:12 crc kubenswrapper[4475]: I1203 06:58:12.349428 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2kvtl\" (UniqueName: \"kubernetes.io/projected/2526f69f-3f5a-4155-bd03-96c2b2c7035b-kube-api-access-2kvtl\") pod \"ovsdbserver-nb-0\" (UID: \"2526f69f-3f5a-4155-bd03-96c2b2c7035b\") " pod="openstack/ovsdbserver-nb-0" Dec 03 06:58:12 crc kubenswrapper[4475]: I1203 06:58:12.357506 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/2526f69f-3f5a-4155-bd03-96c2b2c7035b-ovsdbserver-nb-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"2526f69f-3f5a-4155-bd03-96c2b2c7035b\") " pod="openstack/ovsdbserver-nb-0" Dec 03 06:58:12 crc kubenswrapper[4475]: I1203 06:58:12.359603 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage03-crc\" (UniqueName: 
\"kubernetes.io/local-volume/local-storage03-crc\") pod \"ovsdbserver-nb-0\" (UID: \"2526f69f-3f5a-4155-bd03-96c2b2c7035b\") " pod="openstack/ovsdbserver-nb-0" Dec 03 06:58:12 crc kubenswrapper[4475]: I1203 06:58:12.430676 4475 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-nb-0" Dec 03 06:58:12 crc kubenswrapper[4475]: I1203 06:58:12.726548 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" event={"ID":"9919f7e5-42de-45e9-b403-8e5394d971ad","Type":"ContainerStarted","Data":"197c2c7da3b4f770418145389392778ab88e90635fa49ee8e1f244040d1bdadb"} Dec 03 06:58:12 crc kubenswrapper[4475]: I1203 06:58:12.923825 4475 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-nb-0"] Dec 03 06:58:12 crc kubenswrapper[4475]: W1203 06:58:12.932007 4475 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod2526f69f_3f5a_4155_bd03_96c2b2c7035b.slice/crio-1d073df42a25619fd36c801d505be030ece9b8b56b3e109a39392c3c8825c26e WatchSource:0}: Error finding container 1d073df42a25619fd36c801d505be030ece9b8b56b3e109a39392c3c8825c26e: Status 404 returned error can't find the container with id 1d073df42a25619fd36c801d505be030ece9b8b56b3e109a39392c3c8825c26e Dec 03 06:58:13 crc kubenswrapper[4475]: I1203 06:58:13.748426 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"2526f69f-3f5a-4155-bd03-96c2b2c7035b","Type":"ContainerStarted","Data":"1d073df42a25619fd36c801d505be030ece9b8b56b3e109a39392c3c8825c26e"} Dec 03 06:58:15 crc kubenswrapper[4475]: I1203 06:58:15.525901 4475 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/kube-state-metrics-0" Dec 03 06:58:15 crc kubenswrapper[4475]: I1203 06:58:15.762638 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-metrics-df9n4" 
event={"ID":"bc0d464b-1908-45ac-8ee3-2d470c68deda","Type":"ContainerStarted","Data":"a210992140ad06684df2f448f1f9cb7d6b81fa217bd050b91d221a3aabf3ab0a"} Dec 03 06:58:15 crc kubenswrapper[4475]: I1203 06:58:15.775418 4475 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-controller-metrics-df9n4" podStartSLOduration=2.468498336 podStartE2EDuration="6.775407827s" podCreationTimestamp="2025-12-03 06:58:09 +0000 UTC" firstStartedPulling="2025-12-03 06:58:10.725244488 +0000 UTC m=+775.530142822" lastFinishedPulling="2025-12-03 06:58:15.032153978 +0000 UTC m=+779.837052313" observedRunningTime="2025-12-03 06:58:15.774942111 +0000 UTC m=+780.579840445" watchObservedRunningTime="2025-12-03 06:58:15.775407827 +0000 UTC m=+780.580306161" Dec 03 06:58:16 crc kubenswrapper[4475]: I1203 06:58:16.095974 4475 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-55746cbb65-vl4j9"] Dec 03 06:58:16 crc kubenswrapper[4475]: I1203 06:58:16.100086 4475 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-844556c789-r5js2"] Dec 03 06:58:16 crc kubenswrapper[4475]: I1203 06:58:16.101124 4475 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-844556c789-r5js2" Dec 03 06:58:16 crc kubenswrapper[4475]: I1203 06:58:16.102643 4475 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovsdbserver-sb" Dec 03 06:58:16 crc kubenswrapper[4475]: I1203 06:58:16.113293 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/48f10a7f-1ae5-49e6-924c-8de4068edaea-dns-svc\") pod \"dnsmasq-dns-844556c789-r5js2\" (UID: \"48f10a7f-1ae5-49e6-924c-8de4068edaea\") " pod="openstack/dnsmasq-dns-844556c789-r5js2" Dec 03 06:58:16 crc kubenswrapper[4475]: I1203 06:58:16.113339 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kkv4k\" (UniqueName: \"kubernetes.io/projected/48f10a7f-1ae5-49e6-924c-8de4068edaea-kube-api-access-kkv4k\") pod \"dnsmasq-dns-844556c789-r5js2\" (UID: \"48f10a7f-1ae5-49e6-924c-8de4068edaea\") " pod="openstack/dnsmasq-dns-844556c789-r5js2" Dec 03 06:58:16 crc kubenswrapper[4475]: I1203 06:58:16.113358 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/48f10a7f-1ae5-49e6-924c-8de4068edaea-config\") pod \"dnsmasq-dns-844556c789-r5js2\" (UID: \"48f10a7f-1ae5-49e6-924c-8de4068edaea\") " pod="openstack/dnsmasq-dns-844556c789-r5js2" Dec 03 06:58:16 crc kubenswrapper[4475]: I1203 06:58:16.113400 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/48f10a7f-1ae5-49e6-924c-8de4068edaea-ovsdbserver-sb\") pod \"dnsmasq-dns-844556c789-r5js2\" (UID: \"48f10a7f-1ae5-49e6-924c-8de4068edaea\") " pod="openstack/dnsmasq-dns-844556c789-r5js2" Dec 03 06:58:16 crc kubenswrapper[4475]: I1203 06:58:16.127966 4475 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openstack/dnsmasq-dns-844556c789-r5js2"] Dec 03 06:58:16 crc kubenswrapper[4475]: I1203 06:58:16.222281 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/48f10a7f-1ae5-49e6-924c-8de4068edaea-ovsdbserver-sb\") pod \"dnsmasq-dns-844556c789-r5js2\" (UID: \"48f10a7f-1ae5-49e6-924c-8de4068edaea\") " pod="openstack/dnsmasq-dns-844556c789-r5js2" Dec 03 06:58:16 crc kubenswrapper[4475]: I1203 06:58:16.222406 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/48f10a7f-1ae5-49e6-924c-8de4068edaea-dns-svc\") pod \"dnsmasq-dns-844556c789-r5js2\" (UID: \"48f10a7f-1ae5-49e6-924c-8de4068edaea\") " pod="openstack/dnsmasq-dns-844556c789-r5js2" Dec 03 06:58:16 crc kubenswrapper[4475]: I1203 06:58:16.222746 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kkv4k\" (UniqueName: \"kubernetes.io/projected/48f10a7f-1ae5-49e6-924c-8de4068edaea-kube-api-access-kkv4k\") pod \"dnsmasq-dns-844556c789-r5js2\" (UID: \"48f10a7f-1ae5-49e6-924c-8de4068edaea\") " pod="openstack/dnsmasq-dns-844556c789-r5js2" Dec 03 06:58:16 crc kubenswrapper[4475]: I1203 06:58:16.222774 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/48f10a7f-1ae5-49e6-924c-8de4068edaea-config\") pod \"dnsmasq-dns-844556c789-r5js2\" (UID: \"48f10a7f-1ae5-49e6-924c-8de4068edaea\") " pod="openstack/dnsmasq-dns-844556c789-r5js2" Dec 03 06:58:16 crc kubenswrapper[4475]: I1203 06:58:16.229511 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/48f10a7f-1ae5-49e6-924c-8de4068edaea-dns-svc\") pod \"dnsmasq-dns-844556c789-r5js2\" (UID: \"48f10a7f-1ae5-49e6-924c-8de4068edaea\") " pod="openstack/dnsmasq-dns-844556c789-r5js2" Dec 03 06:58:16 crc kubenswrapper[4475]: 
I1203 06:58:16.229892 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/48f10a7f-1ae5-49e6-924c-8de4068edaea-ovsdbserver-sb\") pod \"dnsmasq-dns-844556c789-r5js2\" (UID: \"48f10a7f-1ae5-49e6-924c-8de4068edaea\") " pod="openstack/dnsmasq-dns-844556c789-r5js2" Dec 03 06:58:16 crc kubenswrapper[4475]: I1203 06:58:16.230036 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/48f10a7f-1ae5-49e6-924c-8de4068edaea-config\") pod \"dnsmasq-dns-844556c789-r5js2\" (UID: \"48f10a7f-1ae5-49e6-924c-8de4068edaea\") " pod="openstack/dnsmasq-dns-844556c789-r5js2" Dec 03 06:58:16 crc kubenswrapper[4475]: I1203 06:58:16.247346 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kkv4k\" (UniqueName: \"kubernetes.io/projected/48f10a7f-1ae5-49e6-924c-8de4068edaea-kube-api-access-kkv4k\") pod \"dnsmasq-dns-844556c789-r5js2\" (UID: \"48f10a7f-1ae5-49e6-924c-8de4068edaea\") " pod="openstack/dnsmasq-dns-844556c789-r5js2" Dec 03 06:58:16 crc kubenswrapper[4475]: I1203 06:58:16.252551 4475 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-59654bbc49-vdxqb"] Dec 03 06:58:16 crc kubenswrapper[4475]: I1203 06:58:16.287051 4475 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-5849c4cb99-2bndp"] Dec 03 06:58:16 crc kubenswrapper[4475]: I1203 06:58:16.290776 4475 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5849c4cb99-2bndp" Dec 03 06:58:16 crc kubenswrapper[4475]: I1203 06:58:16.298544 4475 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovsdbserver-nb" Dec 03 06:58:16 crc kubenswrapper[4475]: I1203 06:58:16.315042 4475 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5849c4cb99-2bndp"] Dec 03 06:58:16 crc kubenswrapper[4475]: I1203 06:58:16.427408 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/da48be59-9479-477c-a208-d623edd61159-dns-svc\") pod \"dnsmasq-dns-5849c4cb99-2bndp\" (UID: \"da48be59-9479-477c-a208-d623edd61159\") " pod="openstack/dnsmasq-dns-5849c4cb99-2bndp" Dec 03 06:58:16 crc kubenswrapper[4475]: I1203 06:58:16.427482 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/da48be59-9479-477c-a208-d623edd61159-ovsdbserver-nb\") pod \"dnsmasq-dns-5849c4cb99-2bndp\" (UID: \"da48be59-9479-477c-a208-d623edd61159\") " pod="openstack/dnsmasq-dns-5849c4cb99-2bndp" Dec 03 06:58:16 crc kubenswrapper[4475]: I1203 06:58:16.428011 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/da48be59-9479-477c-a208-d623edd61159-config\") pod \"dnsmasq-dns-5849c4cb99-2bndp\" (UID: \"da48be59-9479-477c-a208-d623edd61159\") " pod="openstack/dnsmasq-dns-5849c4cb99-2bndp" Dec 03 06:58:16 crc kubenswrapper[4475]: I1203 06:58:16.428092 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/da48be59-9479-477c-a208-d623edd61159-ovsdbserver-sb\") pod \"dnsmasq-dns-5849c4cb99-2bndp\" (UID: \"da48be59-9479-477c-a208-d623edd61159\") " pod="openstack/dnsmasq-dns-5849c4cb99-2bndp" 
Dec 03 06:58:16 crc kubenswrapper[4475]: I1203 06:58:16.428140 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gwqxb\" (UniqueName: \"kubernetes.io/projected/da48be59-9479-477c-a208-d623edd61159-kube-api-access-gwqxb\") pod \"dnsmasq-dns-5849c4cb99-2bndp\" (UID: \"da48be59-9479-477c-a208-d623edd61159\") " pod="openstack/dnsmasq-dns-5849c4cb99-2bndp" Dec 03 06:58:16 crc kubenswrapper[4475]: I1203 06:58:16.453230 4475 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-844556c789-r5js2" Dec 03 06:58:16 crc kubenswrapper[4475]: I1203 06:58:16.529678 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/da48be59-9479-477c-a208-d623edd61159-dns-svc\") pod \"dnsmasq-dns-5849c4cb99-2bndp\" (UID: \"da48be59-9479-477c-a208-d623edd61159\") " pod="openstack/dnsmasq-dns-5849c4cb99-2bndp" Dec 03 06:58:16 crc kubenswrapper[4475]: I1203 06:58:16.530368 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/da48be59-9479-477c-a208-d623edd61159-ovsdbserver-nb\") pod \"dnsmasq-dns-5849c4cb99-2bndp\" (UID: \"da48be59-9479-477c-a208-d623edd61159\") " pod="openstack/dnsmasq-dns-5849c4cb99-2bndp" Dec 03 06:58:16 crc kubenswrapper[4475]: I1203 06:58:16.530426 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/da48be59-9479-477c-a208-d623edd61159-config\") pod \"dnsmasq-dns-5849c4cb99-2bndp\" (UID: \"da48be59-9479-477c-a208-d623edd61159\") " pod="openstack/dnsmasq-dns-5849c4cb99-2bndp" Dec 03 06:58:16 crc kubenswrapper[4475]: I1203 06:58:16.530488 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/da48be59-9479-477c-a208-d623edd61159-ovsdbserver-sb\") 
pod \"dnsmasq-dns-5849c4cb99-2bndp\" (UID: \"da48be59-9479-477c-a208-d623edd61159\") " pod="openstack/dnsmasq-dns-5849c4cb99-2bndp" Dec 03 06:58:16 crc kubenswrapper[4475]: I1203 06:58:16.530538 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gwqxb\" (UniqueName: \"kubernetes.io/projected/da48be59-9479-477c-a208-d623edd61159-kube-api-access-gwqxb\") pod \"dnsmasq-dns-5849c4cb99-2bndp\" (UID: \"da48be59-9479-477c-a208-d623edd61159\") " pod="openstack/dnsmasq-dns-5849c4cb99-2bndp" Dec 03 06:58:16 crc kubenswrapper[4475]: I1203 06:58:16.531635 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/da48be59-9479-477c-a208-d623edd61159-ovsdbserver-nb\") pod \"dnsmasq-dns-5849c4cb99-2bndp\" (UID: \"da48be59-9479-477c-a208-d623edd61159\") " pod="openstack/dnsmasq-dns-5849c4cb99-2bndp" Dec 03 06:58:16 crc kubenswrapper[4475]: I1203 06:58:16.531675 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/da48be59-9479-477c-a208-d623edd61159-ovsdbserver-sb\") pod \"dnsmasq-dns-5849c4cb99-2bndp\" (UID: \"da48be59-9479-477c-a208-d623edd61159\") " pod="openstack/dnsmasq-dns-5849c4cb99-2bndp" Dec 03 06:58:16 crc kubenswrapper[4475]: I1203 06:58:16.531962 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/da48be59-9479-477c-a208-d623edd61159-dns-svc\") pod \"dnsmasq-dns-5849c4cb99-2bndp\" (UID: \"da48be59-9479-477c-a208-d623edd61159\") " pod="openstack/dnsmasq-dns-5849c4cb99-2bndp" Dec 03 06:58:16 crc kubenswrapper[4475]: I1203 06:58:16.532579 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/da48be59-9479-477c-a208-d623edd61159-config\") pod \"dnsmasq-dns-5849c4cb99-2bndp\" (UID: \"da48be59-9479-477c-a208-d623edd61159\") " 
pod="openstack/dnsmasq-dns-5849c4cb99-2bndp" Dec 03 06:58:16 crc kubenswrapper[4475]: I1203 06:58:16.554078 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gwqxb\" (UniqueName: \"kubernetes.io/projected/da48be59-9479-477c-a208-d623edd61159-kube-api-access-gwqxb\") pod \"dnsmasq-dns-5849c4cb99-2bndp\" (UID: \"da48be59-9479-477c-a208-d623edd61159\") " pod="openstack/dnsmasq-dns-5849c4cb99-2bndp" Dec 03 06:58:16 crc kubenswrapper[4475]: I1203 06:58:16.606879 4475 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5849c4cb99-2bndp" Dec 03 06:58:16 crc kubenswrapper[4475]: I1203 06:58:16.897179 4475 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-844556c789-r5js2"] Dec 03 06:58:17 crc kubenswrapper[4475]: I1203 06:58:17.056775 4475 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5849c4cb99-2bndp"] Dec 03 06:58:17 crc kubenswrapper[4475]: I1203 06:58:17.785436 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5849c4cb99-2bndp" event={"ID":"da48be59-9479-477c-a208-d623edd61159","Type":"ContainerStarted","Data":"d04c4d1ac4c3c94a525f3e9e5a763892a63b40e7f9ec7b5aa8f67a44b7ee7ff6"} Dec 03 06:58:17 crc kubenswrapper[4475]: I1203 06:58:17.786680 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-844556c789-r5js2" event={"ID":"48f10a7f-1ae5-49e6-924c-8de4068edaea","Type":"ContainerStarted","Data":"4e9f0643951779d43743a52b51f6f1d806e97a9e1169844c23fa70a5d5395f45"} Dec 03 06:58:28 crc kubenswrapper[4475]: I1203 06:58:28.933252 4475 patch_prober.go:28] interesting pod/machine-config-daemon-tjbzg container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 03 06:58:28 crc kubenswrapper[4475]: I1203 06:58:28.933711 
4475 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-tjbzg" podUID="91aee7be-4a52-4598-803f-2deebe0674de" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 03 06:58:42 crc kubenswrapper[4475]: E1203 06:58:42.339390 4475 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.rdoproject.org/podified-antelope-centos9/openstack-neutron-server:65066e8ca260a75886ae57f157049605" Dec 03 06:58:42 crc kubenswrapper[4475]: E1203 06:58:42.339739 4475 kuberuntime_image.go:55] "Failed to pull image" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.rdoproject.org/podified-antelope-centos9/openstack-neutron-server:65066e8ca260a75886ae57f157049605" Dec 03 06:58:42 crc kubenswrapper[4475]: E1203 06:58:42.339848 4475 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:init,Image:quay.rdoproject.org/podified-antelope-centos9/openstack-neutron-server:65066e8ca260a75886ae57f157049605,Command:[/bin/bash],Args:[-c dnsmasq --interface=* --conf-dir=/etc/dnsmasq.d --hostsdir=/etc/dnsmasq.d/hosts --keep-in-foreground --log-debug --bind-interfaces --listen-address=$(POD_IP) --port 5353 --log-facility=- --no-hosts --domain-needed --no-resolv --bogus-priv --log-queries 
--test],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:nffh5bdhf4h5f8h79h55h77h58fh56dh7bh6fh578hbch55dh68h56bhd9h65dh57ch658hc9h566h666h688h58h65dh684h5d7h6ch575h5d6h88q,ValueFrom:nil,},EnvVar{Name:POD_IP,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:status.podIP,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/etc/dnsmasq.d/config.cfg,SubPath:dns,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-xhfhk,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod dnsmasq-dns-6569bcd497-9vjn4_openstack(194314aa-621d-43ab-94ba-62ba52e1208a): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Dec 03 06:58:42 crc kubenswrapper[4475]: E1203 06:58:42.341905 4475 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"init\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" 
pod="openstack/dnsmasq-dns-6569bcd497-9vjn4" podUID="194314aa-621d-43ab-94ba-62ba52e1208a" Dec 03 06:58:43 crc kubenswrapper[4475]: I1203 06:58:43.218945 4475 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6569bcd497-9vjn4" Dec 03 06:58:43 crc kubenswrapper[4475]: I1203 06:58:43.322885 4475 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/194314aa-621d-43ab-94ba-62ba52e1208a-config\") pod \"194314aa-621d-43ab-94ba-62ba52e1208a\" (UID: \"194314aa-621d-43ab-94ba-62ba52e1208a\") " Dec 03 06:58:43 crc kubenswrapper[4475]: I1203 06:58:43.322993 4475 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xhfhk\" (UniqueName: \"kubernetes.io/projected/194314aa-621d-43ab-94ba-62ba52e1208a-kube-api-access-xhfhk\") pod \"194314aa-621d-43ab-94ba-62ba52e1208a\" (UID: \"194314aa-621d-43ab-94ba-62ba52e1208a\") " Dec 03 06:58:43 crc kubenswrapper[4475]: I1203 06:58:43.323770 4475 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/194314aa-621d-43ab-94ba-62ba52e1208a-config" (OuterVolumeSpecName: "config") pod "194314aa-621d-43ab-94ba-62ba52e1208a" (UID: "194314aa-621d-43ab-94ba-62ba52e1208a"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 06:58:43 crc kubenswrapper[4475]: I1203 06:58:43.327484 4475 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/194314aa-621d-43ab-94ba-62ba52e1208a-kube-api-access-xhfhk" (OuterVolumeSpecName: "kube-api-access-xhfhk") pod "194314aa-621d-43ab-94ba-62ba52e1208a" (UID: "194314aa-621d-43ab-94ba-62ba52e1208a"). InnerVolumeSpecName "kube-api-access-xhfhk". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 06:58:43 crc kubenswrapper[4475]: I1203 06:58:43.424557 4475 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/194314aa-621d-43ab-94ba-62ba52e1208a-config\") on node \"crc\" DevicePath \"\"" Dec 03 06:58:43 crc kubenswrapper[4475]: I1203 06:58:43.424580 4475 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xhfhk\" (UniqueName: \"kubernetes.io/projected/194314aa-621d-43ab-94ba-62ba52e1208a-kube-api-access-xhfhk\") on node \"crc\" DevicePath \"\"" Dec 03 06:58:43 crc kubenswrapper[4475]: I1203 06:58:43.965683 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"386645cd-74e5-45bc-b3e4-0a326e5349f1","Type":"ContainerStarted","Data":"ee598506c3416b4517e59d2ed9d555835bc5aaa108285dc206bf5b914fcfd214"} Dec 03 06:58:43 crc kubenswrapper[4475]: I1203 06:58:43.967190 4475 generic.go:334] "Generic (PLEG): container finished" podID="da48be59-9479-477c-a208-d623edd61159" containerID="060804bad0b84efc5b1c52bc0c58a60ce6de4e2fdb49d2833ec582a384108738" exitCode=0 Dec 03 06:58:43 crc kubenswrapper[4475]: I1203 06:58:43.967255 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5849c4cb99-2bndp" event={"ID":"da48be59-9479-477c-a208-d623edd61159","Type":"ContainerDied","Data":"060804bad0b84efc5b1c52bc0c58a60ce6de4e2fdb49d2833ec582a384108738"} Dec 03 06:58:43 crc kubenswrapper[4475]: I1203 06:58:43.969381 4475 generic.go:334] "Generic (PLEG): container finished" podID="8b92cc5c-e27e-49bc-949c-7c98a208180a" containerID="311499b56eafb2d2ed0397d25a32129d4688354e2da653433aa1a4176a143983" exitCode=0 Dec 03 06:58:43 crc kubenswrapper[4475]: I1203 06:58:43.969499 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-59654bbc49-vdxqb" 
event={"ID":"8b92cc5c-e27e-49bc-949c-7c98a208180a","Type":"ContainerDied","Data":"311499b56eafb2d2ed0397d25a32129d4688354e2da653433aa1a4176a143983"} Dec 03 06:58:43 crc kubenswrapper[4475]: I1203 06:58:43.973304 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6569bcd497-9vjn4" event={"ID":"194314aa-621d-43ab-94ba-62ba52e1208a","Type":"ContainerDied","Data":"5ed97d6057b9f8842495000012233e4b113eb39e8945fb05e0abfd660d6079f5"} Dec 03 06:58:43 crc kubenswrapper[4475]: I1203 06:58:43.973363 4475 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6569bcd497-9vjn4" Dec 03 06:58:43 crc kubenswrapper[4475]: I1203 06:58:43.974998 4475 generic.go:334] "Generic (PLEG): container finished" podID="9ae131a1-4eba-47eb-9904-0906e5be196a" containerID="244eb7289a2a6b7d52533613f6149c81c18f06026f9664e76e6535006f5d4050" exitCode=0 Dec 03 06:58:43 crc kubenswrapper[4475]: I1203 06:58:43.975023 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-blpfh" event={"ID":"9ae131a1-4eba-47eb-9904-0906e5be196a","Type":"ContainerDied","Data":"244eb7289a2a6b7d52533613f6149c81c18f06026f9664e76e6535006f5d4050"} Dec 03 06:58:43 crc kubenswrapper[4475]: I1203 06:58:43.977852 4475 generic.go:334] "Generic (PLEG): container finished" podID="48f10a7f-1ae5-49e6-924c-8de4068edaea" containerID="47b000fe48e1c3f0bd782d4e4113dcc8e7e96a66e661ca7daea6a793a0595224" exitCode=0 Dec 03 06:58:43 crc kubenswrapper[4475]: I1203 06:58:43.977905 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-844556c789-r5js2" event={"ID":"48f10a7f-1ae5-49e6-924c-8de4068edaea","Type":"ContainerDied","Data":"47b000fe48e1c3f0bd782d4e4113dcc8e7e96a66e661ca7daea6a793a0595224"} Dec 03 06:58:43 crc kubenswrapper[4475]: I1203 06:58:43.979830 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" 
event={"ID":"2526f69f-3f5a-4155-bd03-96c2b2c7035b","Type":"ContainerStarted","Data":"2a78d0d15fcc838f23c0f135dcfcbee7449f4d51a9e1cd85100bf779e9b85552"} Dec 03 06:58:43 crc kubenswrapper[4475]: I1203 06:58:43.984293 4475 generic.go:334] "Generic (PLEG): container finished" podID="d57229df-2d5e-482f-bdeb-ae9b0a04b0ec" containerID="370a7a36185c224b55c42ad332bb5d5e409daa568cec9f39a31d0d48d1cecda4" exitCode=0 Dec 03 06:58:43 crc kubenswrapper[4475]: I1203 06:58:43.984348 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7d86d69cf7-ltbwj" event={"ID":"d57229df-2d5e-482f-bdeb-ae9b0a04b0ec","Type":"ContainerDied","Data":"370a7a36185c224b55c42ad332bb5d5e409daa568cec9f39a31d0d48d1cecda4"} Dec 03 06:58:43 crc kubenswrapper[4475]: I1203 06:58:43.991255 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-hcblc" event={"ID":"462c6048-51ec-46dd-8eda-64398e53ce5b","Type":"ContainerStarted","Data":"41695ff21d5a1341699250e4ca63b05de5d0e14cce03740113045b56fb7b9b94"} Dec 03 06:58:43 crc kubenswrapper[4475]: I1203 06:58:43.991946 4475 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-controller-hcblc" Dec 03 06:58:43 crc kubenswrapper[4475]: I1203 06:58:43.999905 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"47365d7f-1974-40d3-b6de-c033b41729ba","Type":"ContainerStarted","Data":"eef4b31bf419efb35a35eedadf8f31a03c38b8827fc87ccfb43007884c35e337"} Dec 03 06:58:44 crc kubenswrapper[4475]: I1203 06:58:44.018259 4475 generic.go:334] "Generic (PLEG): container finished" podID="4cd4f066-5f36-4e37-963d-8aa66bf9267c" containerID="465432222d40043598bfc697256851222fe5295f468c05ec3e16062c57464a92" exitCode=0 Dec 03 06:58:44 crc kubenswrapper[4475]: I1203 06:58:44.018318 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-55746cbb65-vl4j9" 
event={"ID":"4cd4f066-5f36-4e37-963d-8aa66bf9267c","Type":"ContainerDied","Data":"465432222d40043598bfc697256851222fe5295f468c05ec3e16062c57464a92"} Dec 03 06:58:44 crc kubenswrapper[4475]: I1203 06:58:44.027801 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"aaf991fb-6470-4403-bf72-660c8c5ce811","Type":"ContainerStarted","Data":"acb1c5092be513f35fc2d1d699e94d61ecb78127aff60bc832fa5230690b6474"} Dec 03 06:58:44 crc kubenswrapper[4475]: I1203 06:58:44.033194 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" event={"ID":"9919f7e5-42de-45e9-b403-8e5394d971ad","Type":"ContainerStarted","Data":"057676ccdf273c9f7804ba63ff39bdcb9df1eae4456f697bce11e26655481b5e"} Dec 03 06:58:44 crc kubenswrapper[4475]: I1203 06:58:44.050750 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/memcached-0" event={"ID":"baa7315d-844e-472d-b13d-9932aff2326f","Type":"ContainerStarted","Data":"394d880e1437c290d155bd7b59d2f378176f218fd9f59825ddc0e4cab76ceb1b"} Dec 03 06:58:44 crc kubenswrapper[4475]: I1203 06:58:44.051042 4475 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/memcached-0" Dec 03 06:58:44 crc kubenswrapper[4475]: I1203 06:58:44.124326 4475 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/memcached-0" podStartSLOduration=2.664113577 podStartE2EDuration="41.124310633s" podCreationTimestamp="2025-12-03 06:58:03 +0000 UTC" firstStartedPulling="2025-12-03 06:58:04.258495091 +0000 UTC m=+769.063393425" lastFinishedPulling="2025-12-03 06:58:42.718692147 +0000 UTC m=+807.523590481" observedRunningTime="2025-12-03 06:58:44.118005372 +0000 UTC m=+808.922903716" watchObservedRunningTime="2025-12-03 06:58:44.124310633 +0000 UTC m=+808.929208967" Dec 03 06:58:44 crc kubenswrapper[4475]: I1203 06:58:44.186784 4475 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-controller-hcblc" 
podStartSLOduration=3.006402078 podStartE2EDuration="36.186759035s" podCreationTimestamp="2025-12-03 06:58:08 +0000 UTC" firstStartedPulling="2025-12-03 06:58:09.604810458 +0000 UTC m=+774.409708792" lastFinishedPulling="2025-12-03 06:58:42.785167415 +0000 UTC m=+807.590065749" observedRunningTime="2025-12-03 06:58:44.184859383 +0000 UTC m=+808.989757717" watchObservedRunningTime="2025-12-03 06:58:44.186759035 +0000 UTC m=+808.991657370" Dec 03 06:58:44 crc kubenswrapper[4475]: I1203 06:58:44.248795 4475 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6569bcd497-9vjn4"] Dec 03 06:58:44 crc kubenswrapper[4475]: I1203 06:58:44.260625 4475 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-6569bcd497-9vjn4"] Dec 03 06:58:44 crc kubenswrapper[4475]: I1203 06:58:44.394396 4475 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7d86d69cf7-ltbwj" Dec 03 06:58:44 crc kubenswrapper[4475]: I1203 06:58:44.422600 4475 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-55746cbb65-vl4j9" Dec 03 06:58:44 crc kubenswrapper[4475]: I1203 06:58:44.435202 4475 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-59654bbc49-vdxqb" Dec 03 06:58:44 crc kubenswrapper[4475]: I1203 06:58:44.543021 4475 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/4cd4f066-5f36-4e37-963d-8aa66bf9267c-dns-svc\") pod \"4cd4f066-5f36-4e37-963d-8aa66bf9267c\" (UID: \"4cd4f066-5f36-4e37-963d-8aa66bf9267c\") " Dec 03 06:58:44 crc kubenswrapper[4475]: I1203 06:58:44.543320 4475 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8b92cc5c-e27e-49bc-949c-7c98a208180a-config\") pod \"8b92cc5c-e27e-49bc-949c-7c98a208180a\" (UID: \"8b92cc5c-e27e-49bc-949c-7c98a208180a\") " Dec 03 06:58:44 crc kubenswrapper[4475]: I1203 06:58:44.543864 4475 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/8b92cc5c-e27e-49bc-949c-7c98a208180a-dns-svc\") pod \"8b92cc5c-e27e-49bc-949c-7c98a208180a\" (UID: \"8b92cc5c-e27e-49bc-949c-7c98a208180a\") " Dec 03 06:58:44 crc kubenswrapper[4475]: I1203 06:58:44.543918 4475 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4cd4f066-5f36-4e37-963d-8aa66bf9267c-config\") pod \"4cd4f066-5f36-4e37-963d-8aa66bf9267c\" (UID: \"4cd4f066-5f36-4e37-963d-8aa66bf9267c\") " Dec 03 06:58:44 crc kubenswrapper[4475]: I1203 06:58:44.543976 4475 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-25968\" (UniqueName: \"kubernetes.io/projected/8b92cc5c-e27e-49bc-949c-7c98a208180a-kube-api-access-25968\") pod \"8b92cc5c-e27e-49bc-949c-7c98a208180a\" (UID: \"8b92cc5c-e27e-49bc-949c-7c98a208180a\") " Dec 03 06:58:44 crc kubenswrapper[4475]: I1203 06:58:44.543996 4475 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9whs9\" (UniqueName: 
\"kubernetes.io/projected/d57229df-2d5e-482f-bdeb-ae9b0a04b0ec-kube-api-access-9whs9\") pod \"d57229df-2d5e-482f-bdeb-ae9b0a04b0ec\" (UID: \"d57229df-2d5e-482f-bdeb-ae9b0a04b0ec\") " Dec 03 06:58:44 crc kubenswrapper[4475]: I1203 06:58:44.544031 4475 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d57229df-2d5e-482f-bdeb-ae9b0a04b0ec-config\") pod \"d57229df-2d5e-482f-bdeb-ae9b0a04b0ec\" (UID: \"d57229df-2d5e-482f-bdeb-ae9b0a04b0ec\") " Dec 03 06:58:44 crc kubenswrapper[4475]: I1203 06:58:44.544073 4475 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/d57229df-2d5e-482f-bdeb-ae9b0a04b0ec-dns-svc\") pod \"d57229df-2d5e-482f-bdeb-ae9b0a04b0ec\" (UID: \"d57229df-2d5e-482f-bdeb-ae9b0a04b0ec\") " Dec 03 06:58:44 crc kubenswrapper[4475]: I1203 06:58:44.544125 4475 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tjdm9\" (UniqueName: \"kubernetes.io/projected/4cd4f066-5f36-4e37-963d-8aa66bf9267c-kube-api-access-tjdm9\") pod \"4cd4f066-5f36-4e37-963d-8aa66bf9267c\" (UID: \"4cd4f066-5f36-4e37-963d-8aa66bf9267c\") " Dec 03 06:58:44 crc kubenswrapper[4475]: I1203 06:58:44.548481 4475 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8b92cc5c-e27e-49bc-949c-7c98a208180a-kube-api-access-25968" (OuterVolumeSpecName: "kube-api-access-25968") pod "8b92cc5c-e27e-49bc-949c-7c98a208180a" (UID: "8b92cc5c-e27e-49bc-949c-7c98a208180a"). InnerVolumeSpecName "kube-api-access-25968". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 06:58:44 crc kubenswrapper[4475]: I1203 06:58:44.558237 4475 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d57229df-2d5e-482f-bdeb-ae9b0a04b0ec-kube-api-access-9whs9" (OuterVolumeSpecName: "kube-api-access-9whs9") pod "d57229df-2d5e-482f-bdeb-ae9b0a04b0ec" (UID: "d57229df-2d5e-482f-bdeb-ae9b0a04b0ec"). InnerVolumeSpecName "kube-api-access-9whs9". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 06:58:44 crc kubenswrapper[4475]: I1203 06:58:44.559007 4475 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-25968\" (UniqueName: \"kubernetes.io/projected/8b92cc5c-e27e-49bc-949c-7c98a208180a-kube-api-access-25968\") on node \"crc\" DevicePath \"\"" Dec 03 06:58:44 crc kubenswrapper[4475]: I1203 06:58:44.559025 4475 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9whs9\" (UniqueName: \"kubernetes.io/projected/d57229df-2d5e-482f-bdeb-ae9b0a04b0ec-kube-api-access-9whs9\") on node \"crc\" DevicePath \"\"" Dec 03 06:58:44 crc kubenswrapper[4475]: I1203 06:58:44.569793 4475 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4cd4f066-5f36-4e37-963d-8aa66bf9267c-kube-api-access-tjdm9" (OuterVolumeSpecName: "kube-api-access-tjdm9") pod "4cd4f066-5f36-4e37-963d-8aa66bf9267c" (UID: "4cd4f066-5f36-4e37-963d-8aa66bf9267c"). InnerVolumeSpecName "kube-api-access-tjdm9". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 06:58:44 crc kubenswrapper[4475]: I1203 06:58:44.570535 4475 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4cd4f066-5f36-4e37-963d-8aa66bf9267c-config" (OuterVolumeSpecName: "config") pod "4cd4f066-5f36-4e37-963d-8aa66bf9267c" (UID: "4cd4f066-5f36-4e37-963d-8aa66bf9267c"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 06:58:44 crc kubenswrapper[4475]: I1203 06:58:44.576763 4475 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8b92cc5c-e27e-49bc-949c-7c98a208180a-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "8b92cc5c-e27e-49bc-949c-7c98a208180a" (UID: "8b92cc5c-e27e-49bc-949c-7c98a208180a"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 06:58:44 crc kubenswrapper[4475]: I1203 06:58:44.577216 4475 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d57229df-2d5e-482f-bdeb-ae9b0a04b0ec-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "d57229df-2d5e-482f-bdeb-ae9b0a04b0ec" (UID: "d57229df-2d5e-482f-bdeb-ae9b0a04b0ec"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 06:58:44 crc kubenswrapper[4475]: I1203 06:58:44.584233 4475 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4cd4f066-5f36-4e37-963d-8aa66bf9267c-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "4cd4f066-5f36-4e37-963d-8aa66bf9267c" (UID: "4cd4f066-5f36-4e37-963d-8aa66bf9267c"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 06:58:44 crc kubenswrapper[4475]: I1203 06:58:44.592322 4475 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8b92cc5c-e27e-49bc-949c-7c98a208180a-config" (OuterVolumeSpecName: "config") pod "8b92cc5c-e27e-49bc-949c-7c98a208180a" (UID: "8b92cc5c-e27e-49bc-949c-7c98a208180a"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 06:58:44 crc kubenswrapper[4475]: I1203 06:58:44.598570 4475 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d57229df-2d5e-482f-bdeb-ae9b0a04b0ec-config" (OuterVolumeSpecName: "config") pod "d57229df-2d5e-482f-bdeb-ae9b0a04b0ec" (UID: "d57229df-2d5e-482f-bdeb-ae9b0a04b0ec"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 06:58:44 crc kubenswrapper[4475]: I1203 06:58:44.660271 4475 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tjdm9\" (UniqueName: \"kubernetes.io/projected/4cd4f066-5f36-4e37-963d-8aa66bf9267c-kube-api-access-tjdm9\") on node \"crc\" DevicePath \"\"" Dec 03 06:58:44 crc kubenswrapper[4475]: I1203 06:58:44.660302 4475 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/4cd4f066-5f36-4e37-963d-8aa66bf9267c-dns-svc\") on node \"crc\" DevicePath \"\"" Dec 03 06:58:44 crc kubenswrapper[4475]: I1203 06:58:44.660311 4475 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8b92cc5c-e27e-49bc-949c-7c98a208180a-config\") on node \"crc\" DevicePath \"\"" Dec 03 06:58:44 crc kubenswrapper[4475]: I1203 06:58:44.660320 4475 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/8b92cc5c-e27e-49bc-949c-7c98a208180a-dns-svc\") on node \"crc\" DevicePath \"\"" Dec 03 06:58:44 crc kubenswrapper[4475]: I1203 06:58:44.660328 4475 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4cd4f066-5f36-4e37-963d-8aa66bf9267c-config\") on node \"crc\" DevicePath \"\"" Dec 03 06:58:44 crc kubenswrapper[4475]: I1203 06:58:44.660335 4475 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d57229df-2d5e-482f-bdeb-ae9b0a04b0ec-config\") on node 
\"crc\" DevicePath \"\"" Dec 03 06:58:44 crc kubenswrapper[4475]: I1203 06:58:44.660368 4475 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/d57229df-2d5e-482f-bdeb-ae9b0a04b0ec-dns-svc\") on node \"crc\" DevicePath \"\"" Dec 03 06:58:45 crc kubenswrapper[4475]: I1203 06:58:45.060100 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-59654bbc49-vdxqb" event={"ID":"8b92cc5c-e27e-49bc-949c-7c98a208180a","Type":"ContainerDied","Data":"6523189721d8bc441d5f0e45acf28e354164feaf541bd6086e7e1def16eb10e8"} Dec 03 06:58:45 crc kubenswrapper[4475]: I1203 06:58:45.060332 4475 scope.go:117] "RemoveContainer" containerID="311499b56eafb2d2ed0397d25a32129d4688354e2da653433aa1a4176a143983" Dec 03 06:58:45 crc kubenswrapper[4475]: I1203 06:58:45.060262 4475 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-59654bbc49-vdxqb" Dec 03 06:58:45 crc kubenswrapper[4475]: I1203 06:58:45.062671 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7d86d69cf7-ltbwj" event={"ID":"d57229df-2d5e-482f-bdeb-ae9b0a04b0ec","Type":"ContainerDied","Data":"e495253001e00b5389debf1d2b73450c32c1fd3655c06e51808a58cc661116a2"} Dec 03 06:58:45 crc kubenswrapper[4475]: I1203 06:58:45.062691 4475 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-7d86d69cf7-ltbwj" Dec 03 06:58:45 crc kubenswrapper[4475]: I1203 06:58:45.064956 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5849c4cb99-2bndp" event={"ID":"da48be59-9479-477c-a208-d623edd61159","Type":"ContainerStarted","Data":"1f74715d797f43c5a5bd3f449aad818ecf4cc03fb14c64717837b4d4f5726966"} Dec 03 06:58:45 crc kubenswrapper[4475]: I1203 06:58:45.065649 4475 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-5849c4cb99-2bndp" Dec 03 06:58:45 crc kubenswrapper[4475]: I1203 06:58:45.073118 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"6447be14-8b0d-4514-a7c2-53da228c70c2","Type":"ContainerStarted","Data":"814640df107b4a181b7a3e64b7ebd3309204ef4f37192d2b3e5488422cc1410d"} Dec 03 06:58:45 crc kubenswrapper[4475]: I1203 06:58:45.074592 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-844556c789-r5js2" event={"ID":"48f10a7f-1ae5-49e6-924c-8de4068edaea","Type":"ContainerStarted","Data":"c25b2f5d96cc68c17d02b0f86038a61339eaf0a2988bb88c767b2f17f5f7382e"} Dec 03 06:58:45 crc kubenswrapper[4475]: I1203 06:58:45.074656 4475 scope.go:117] "RemoveContainer" containerID="370a7a36185c224b55c42ad332bb5d5e409daa568cec9f39a31d0d48d1cecda4" Dec 03 06:58:45 crc kubenswrapper[4475]: I1203 06:58:45.081718 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"2526f69f-3f5a-4155-bd03-96c2b2c7035b","Type":"ContainerStarted","Data":"7ce1bb25e12027a6eb830e52aeec4395a2d010d4b61de7f44c8dc5ae856084a0"} Dec 03 06:58:45 crc kubenswrapper[4475]: I1203 06:58:45.087018 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" event={"ID":"9919f7e5-42de-45e9-b403-8e5394d971ad","Type":"ContainerStarted","Data":"40d6d7f12295b0b38ad23f94cfae56a0d2b2634cc640330981e152bfc0083857"} Dec 03 06:58:45 crc 
kubenswrapper[4475]: I1203 06:58:45.092022 4475 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-5849c4cb99-2bndp" podStartSLOduration=3.311250798 podStartE2EDuration="29.092013637s" podCreationTimestamp="2025-12-03 06:58:16 +0000 UTC" firstStartedPulling="2025-12-03 06:58:17.078413887 +0000 UTC m=+781.883312221" lastFinishedPulling="2025-12-03 06:58:42.859176726 +0000 UTC m=+807.664075060" observedRunningTime="2025-12-03 06:58:45.089365096 +0000 UTC m=+809.894263430" watchObservedRunningTime="2025-12-03 06:58:45.092013637 +0000 UTC m=+809.896911972" Dec 03 06:58:45 crc kubenswrapper[4475]: I1203 06:58:45.094603 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-blpfh" event={"ID":"9ae131a1-4eba-47eb-9904-0906e5be196a","Type":"ContainerStarted","Data":"78b71ec2e186c67db24f400da5b6a0a7440ee50617a376efa21ef89e493423d1"} Dec 03 06:58:45 crc kubenswrapper[4475]: I1203 06:58:45.094637 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-blpfh" event={"ID":"9ae131a1-4eba-47eb-9904-0906e5be196a","Type":"ContainerStarted","Data":"4fd1086583841b784b03e65656735cd5902899ead807ada77fc7035ad84d3fc3"} Dec 03 06:58:45 crc kubenswrapper[4475]: I1203 06:58:45.094705 4475 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-controller-ovs-blpfh" Dec 03 06:58:45 crc kubenswrapper[4475]: I1203 06:58:45.094716 4475 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-controller-ovs-blpfh" Dec 03 06:58:45 crc kubenswrapper[4475]: I1203 06:58:45.100889 4475 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-55746cbb65-vl4j9" Dec 03 06:58:45 crc kubenswrapper[4475]: I1203 06:58:45.100926 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-55746cbb65-vl4j9" event={"ID":"4cd4f066-5f36-4e37-963d-8aa66bf9267c","Type":"ContainerDied","Data":"3fc5092e54690e5bbd984dda1c934641ce9ca2dcba9ed8226db4e08e3f012107"} Dec 03 06:58:45 crc kubenswrapper[4475]: I1203 06:58:45.104371 4475 scope.go:117] "RemoveContainer" containerID="465432222d40043598bfc697256851222fe5295f468c05ec3e16062c57464a92" Dec 03 06:58:45 crc kubenswrapper[4475]: I1203 06:58:45.149399 4475 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovsdbserver-nb-0" podStartSLOduration=4.274316045 podStartE2EDuration="34.149385722s" podCreationTimestamp="2025-12-03 06:58:11 +0000 UTC" firstStartedPulling="2025-12-03 06:58:12.936084983 +0000 UTC m=+777.740983317" lastFinishedPulling="2025-12-03 06:58:42.81115466 +0000 UTC m=+807.616052994" observedRunningTime="2025-12-03 06:58:45.144380066 +0000 UTC m=+809.949278419" watchObservedRunningTime="2025-12-03 06:58:45.149385722 +0000 UTC m=+809.954284056" Dec 03 06:58:45 crc kubenswrapper[4475]: I1203 06:58:45.179206 4475 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-844556c789-r5js2" podStartSLOduration=3.257986513 podStartE2EDuration="29.179190538s" podCreationTimestamp="2025-12-03 06:58:16 +0000 UTC" firstStartedPulling="2025-12-03 06:58:16.907420618 +0000 UTC m=+781.712318952" lastFinishedPulling="2025-12-03 06:58:42.828624643 +0000 UTC m=+807.633522977" observedRunningTime="2025-12-03 06:58:45.165123746 +0000 UTC m=+809.970022080" watchObservedRunningTime="2025-12-03 06:58:45.179190538 +0000 UTC m=+809.984088871" Dec 03 06:58:45 crc kubenswrapper[4475]: I1203 06:58:45.179593 4475 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovsdbserver-sb-0" podStartSLOduration=5.125227632 
podStartE2EDuration="36.179587646s" podCreationTimestamp="2025-12-03 06:58:09 +0000 UTC" firstStartedPulling="2025-12-03 06:58:11.772598575 +0000 UTC m=+776.577496909" lastFinishedPulling="2025-12-03 06:58:42.82695859 +0000 UTC m=+807.631856923" observedRunningTime="2025-12-03 06:58:45.177168264 +0000 UTC m=+809.982066587" watchObservedRunningTime="2025-12-03 06:58:45.179587646 +0000 UTC m=+809.984485979" Dec 03 06:58:45 crc kubenswrapper[4475]: I1203 06:58:45.200141 4475 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-7d86d69cf7-ltbwj"] Dec 03 06:58:45 crc kubenswrapper[4475]: I1203 06:58:45.206041 4475 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-7d86d69cf7-ltbwj"] Dec 03 06:58:45 crc kubenswrapper[4475]: I1203 06:58:45.225129 4475 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-59654bbc49-vdxqb"] Dec 03 06:58:45 crc kubenswrapper[4475]: I1203 06:58:45.228881 4475 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-59654bbc49-vdxqb"] Dec 03 06:58:45 crc kubenswrapper[4475]: I1203 06:58:45.235392 4475 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-controller-ovs-blpfh" podStartSLOduration=4.378174535 podStartE2EDuration="37.235383924s" podCreationTimestamp="2025-12-03 06:58:08 +0000 UTC" firstStartedPulling="2025-12-03 06:58:09.917615672 +0000 UTC m=+774.722514005" lastFinishedPulling="2025-12-03 06:58:42.77482506 +0000 UTC m=+807.579723394" observedRunningTime="2025-12-03 06:58:45.233133101 +0000 UTC m=+810.038031445" watchObservedRunningTime="2025-12-03 06:58:45.235383924 +0000 UTC m=+810.040282248" Dec 03 06:58:45 crc kubenswrapper[4475]: I1203 06:58:45.253800 4475 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-55746cbb65-vl4j9"] Dec 03 06:58:45 crc kubenswrapper[4475]: I1203 06:58:45.267663 4475 kubelet.go:2431] "SyncLoop REMOVE" source="api" 
pods=["openstack/dnsmasq-dns-55746cbb65-vl4j9"] Dec 03 06:58:45 crc kubenswrapper[4475]: I1203 06:58:45.431807 4475 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/ovsdbserver-nb-0" Dec 03 06:58:45 crc kubenswrapper[4475]: I1203 06:58:45.499337 4475 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="194314aa-621d-43ab-94ba-62ba52e1208a" path="/var/lib/kubelet/pods/194314aa-621d-43ab-94ba-62ba52e1208a/volumes" Dec 03 06:58:45 crc kubenswrapper[4475]: I1203 06:58:45.499673 4475 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4cd4f066-5f36-4e37-963d-8aa66bf9267c" path="/var/lib/kubelet/pods/4cd4f066-5f36-4e37-963d-8aa66bf9267c/volumes" Dec 03 06:58:45 crc kubenswrapper[4475]: I1203 06:58:45.500096 4475 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8b92cc5c-e27e-49bc-949c-7c98a208180a" path="/var/lib/kubelet/pods/8b92cc5c-e27e-49bc-949c-7c98a208180a/volumes" Dec 03 06:58:45 crc kubenswrapper[4475]: I1203 06:58:45.500520 4475 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d57229df-2d5e-482f-bdeb-ae9b0a04b0ec" path="/var/lib/kubelet/pods/d57229df-2d5e-482f-bdeb-ae9b0a04b0ec/volumes" Dec 03 06:58:46 crc kubenswrapper[4475]: I1203 06:58:46.108819 4475 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-844556c789-r5js2" Dec 03 06:58:46 crc kubenswrapper[4475]: I1203 06:58:46.216823 4475 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovsdbserver-sb-0" Dec 03 06:58:47 crc kubenswrapper[4475]: I1203 06:58:47.114820 4475 generic.go:334] "Generic (PLEG): container finished" podID="aaf991fb-6470-4403-bf72-660c8c5ce811" containerID="acb1c5092be513f35fc2d1d699e94d61ecb78127aff60bc832fa5230690b6474" exitCode=0 Dec 03 06:58:47 crc kubenswrapper[4475]: I1203 06:58:47.114905 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" 
event={"ID":"aaf991fb-6470-4403-bf72-660c8c5ce811","Type":"ContainerDied","Data":"acb1c5092be513f35fc2d1d699e94d61ecb78127aff60bc832fa5230690b6474"} Dec 03 06:58:47 crc kubenswrapper[4475]: I1203 06:58:47.216974 4475 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/ovsdbserver-sb-0" Dec 03 06:58:47 crc kubenswrapper[4475]: I1203 06:58:47.245867 4475 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/ovsdbserver-sb-0" Dec 03 06:58:47 crc kubenswrapper[4475]: I1203 06:58:47.431124 4475 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovsdbserver-nb-0" Dec 03 06:58:48 crc kubenswrapper[4475]: I1203 06:58:48.121275 4475 generic.go:334] "Generic (PLEG): container finished" podID="47365d7f-1974-40d3-b6de-c033b41729ba" containerID="eef4b31bf419efb35a35eedadf8f31a03c38b8827fc87ccfb43007884c35e337" exitCode=0 Dec 03 06:58:48 crc kubenswrapper[4475]: I1203 06:58:48.121346 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"47365d7f-1974-40d3-b6de-c033b41729ba","Type":"ContainerDied","Data":"eef4b31bf419efb35a35eedadf8f31a03c38b8827fc87ccfb43007884c35e337"} Dec 03 06:58:48 crc kubenswrapper[4475]: I1203 06:58:48.151351 4475 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovsdbserver-sb-0" Dec 03 06:58:48 crc kubenswrapper[4475]: I1203 06:58:48.457557 4475 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/ovsdbserver-nb-0" Dec 03 06:58:48 crc kubenswrapper[4475]: I1203 06:58:48.755334 4475 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/memcached-0" Dec 03 06:58:49 crc kubenswrapper[4475]: I1203 06:58:49.129292 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" 
event={"ID":"aaf991fb-6470-4403-bf72-660c8c5ce811","Type":"ContainerStarted","Data":"137744a5bb3c827771b0b591a8d4e1892732f248613abfe1bf7ed314e6f520ed"} Dec 03 06:58:49 crc kubenswrapper[4475]: I1203 06:58:49.130836 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"47365d7f-1974-40d3-b6de-c033b41729ba","Type":"ContainerStarted","Data":"dcf840e462f3690e98a3ec662984b02e126f557e3d2498e0fdc211f9df462bb0"} Dec 03 06:58:49 crc kubenswrapper[4475]: I1203 06:58:49.158961 4475 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/openstack-cell1-galera-0" podStartSLOduration=8.387786347 podStartE2EDuration="47.158947234s" podCreationTimestamp="2025-12-03 06:58:02 +0000 UTC" firstStartedPulling="2025-12-03 06:58:04.001438557 +0000 UTC m=+768.806336890" lastFinishedPulling="2025-12-03 06:58:42.772599442 +0000 UTC m=+807.577497777" observedRunningTime="2025-12-03 06:58:49.146057037 +0000 UTC m=+813.950955371" watchObservedRunningTime="2025-12-03 06:58:49.158947234 +0000 UTC m=+813.963845569" Dec 03 06:58:49 crc kubenswrapper[4475]: I1203 06:58:49.162165 4475 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovsdbserver-nb-0" Dec 03 06:58:49 crc kubenswrapper[4475]: I1203 06:58:49.162472 4475 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/openstack-galera-0" podStartSLOduration=8.728363757 podStartE2EDuration="49.162466865s" podCreationTimestamp="2025-12-03 06:58:00 +0000 UTC" firstStartedPulling="2025-12-03 06:58:02.396171187 +0000 UTC m=+767.201069522" lastFinishedPulling="2025-12-03 06:58:42.830274296 +0000 UTC m=+807.635172630" observedRunningTime="2025-12-03 06:58:49.157949659 +0000 UTC m=+813.962848003" watchObservedRunningTime="2025-12-03 06:58:49.162466865 +0000 UTC m=+813.967365200" Dec 03 06:58:49 crc kubenswrapper[4475]: I1203 06:58:49.267266 4475 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openstack/ovn-northd-0"] Dec 03 06:58:49 crc kubenswrapper[4475]: E1203 06:58:49.267516 4475 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8b92cc5c-e27e-49bc-949c-7c98a208180a" containerName="init" Dec 03 06:58:49 crc kubenswrapper[4475]: I1203 06:58:49.267532 4475 state_mem.go:107] "Deleted CPUSet assignment" podUID="8b92cc5c-e27e-49bc-949c-7c98a208180a" containerName="init" Dec 03 06:58:49 crc kubenswrapper[4475]: E1203 06:58:49.267547 4475 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d57229df-2d5e-482f-bdeb-ae9b0a04b0ec" containerName="init" Dec 03 06:58:49 crc kubenswrapper[4475]: I1203 06:58:49.267552 4475 state_mem.go:107] "Deleted CPUSet assignment" podUID="d57229df-2d5e-482f-bdeb-ae9b0a04b0ec" containerName="init" Dec 03 06:58:49 crc kubenswrapper[4475]: E1203 06:58:49.267567 4475 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4cd4f066-5f36-4e37-963d-8aa66bf9267c" containerName="init" Dec 03 06:58:49 crc kubenswrapper[4475]: I1203 06:58:49.267572 4475 state_mem.go:107] "Deleted CPUSet assignment" podUID="4cd4f066-5f36-4e37-963d-8aa66bf9267c" containerName="init" Dec 03 06:58:49 crc kubenswrapper[4475]: I1203 06:58:49.267690 4475 memory_manager.go:354] "RemoveStaleState removing state" podUID="8b92cc5c-e27e-49bc-949c-7c98a208180a" containerName="init" Dec 03 06:58:49 crc kubenswrapper[4475]: I1203 06:58:49.267702 4475 memory_manager.go:354] "RemoveStaleState removing state" podUID="4cd4f066-5f36-4e37-963d-8aa66bf9267c" containerName="init" Dec 03 06:58:49 crc kubenswrapper[4475]: I1203 06:58:49.267716 4475 memory_manager.go:354] "RemoveStaleState removing state" podUID="d57229df-2d5e-482f-bdeb-ae9b0a04b0ec" containerName="init" Dec 03 06:58:49 crc kubenswrapper[4475]: I1203 06:58:49.268348 4475 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-northd-0" Dec 03 06:58:49 crc kubenswrapper[4475]: W1203 06:58:49.270023 4475 reflector.go:561] object-"openstack"/"ovnnorthd-scripts": failed to list *v1.ConfigMap: configmaps "ovnnorthd-scripts" is forbidden: User "system:node:crc" cannot list resource "configmaps" in API group "" in the namespace "openstack": no relationship found between node 'crc' and this object Dec 03 06:58:49 crc kubenswrapper[4475]: W1203 06:58:49.270049 4475 reflector.go:561] object-"openstack"/"cert-ovnnorthd-ovndbs": failed to list *v1.Secret: secrets "cert-ovnnorthd-ovndbs" is forbidden: User "system:node:crc" cannot list resource "secrets" in API group "" in the namespace "openstack": no relationship found between node 'crc' and this object Dec 03 06:58:49 crc kubenswrapper[4475]: E1203 06:58:49.270059 4475 reflector.go:158] "Unhandled Error" err="object-\"openstack\"/\"ovnnorthd-scripts\": Failed to watch *v1.ConfigMap: failed to list *v1.ConfigMap: configmaps \"ovnnorthd-scripts\" is forbidden: User \"system:node:crc\" cannot list resource \"configmaps\" in API group \"\" in the namespace \"openstack\": no relationship found between node 'crc' and this object" logger="UnhandledError" Dec 03 06:58:49 crc kubenswrapper[4475]: E1203 06:58:49.270100 4475 reflector.go:158] "Unhandled Error" err="object-\"openstack\"/\"cert-ovnnorthd-ovndbs\": Failed to watch *v1.Secret: failed to list *v1.Secret: secrets \"cert-ovnnorthd-ovndbs\" is forbidden: User \"system:node:crc\" cannot list resource \"secrets\" in API group \"\" in the namespace \"openstack\": no relationship found between node 'crc' and this object" logger="UnhandledError" Dec 03 06:58:49 crc kubenswrapper[4475]: W1203 06:58:49.270115 4475 reflector.go:561] object-"openstack"/"ovnnorthd-config": failed to list *v1.ConfigMap: configmaps "ovnnorthd-config" is forbidden: User "system:node:crc" cannot list resource "configmaps" in API group "" in the namespace "openstack": no relationship 
found between node 'crc' and this object Dec 03 06:58:49 crc kubenswrapper[4475]: E1203 06:58:49.270128 4475 reflector.go:158] "Unhandled Error" err="object-\"openstack\"/\"ovnnorthd-config\": Failed to watch *v1.ConfigMap: failed to list *v1.ConfigMap: configmaps \"ovnnorthd-config\" is forbidden: User \"system:node:crc\" cannot list resource \"configmaps\" in API group \"\" in the namespace \"openstack\": no relationship found between node 'crc' and this object" logger="UnhandledError" Dec 03 06:58:49 crc kubenswrapper[4475]: W1203 06:58:49.270766 4475 reflector.go:561] object-"openstack"/"ovnnorthd-ovnnorthd-dockercfg-zj25h": failed to list *v1.Secret: secrets "ovnnorthd-ovnnorthd-dockercfg-zj25h" is forbidden: User "system:node:crc" cannot list resource "secrets" in API group "" in the namespace "openstack": no relationship found between node 'crc' and this object Dec 03 06:58:49 crc kubenswrapper[4475]: E1203 06:58:49.270876 4475 reflector.go:158] "Unhandled Error" err="object-\"openstack\"/\"ovnnorthd-ovnnorthd-dockercfg-zj25h\": Failed to watch *v1.Secret: failed to list *v1.Secret: secrets \"ovnnorthd-ovnnorthd-dockercfg-zj25h\" is forbidden: User \"system:node:crc\" cannot list resource \"secrets\" in API group \"\" in the namespace \"openstack\": no relationship found between node 'crc' and this object" logger="UnhandledError" Dec 03 06:58:49 crc kubenswrapper[4475]: I1203 06:58:49.279070 4475 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-northd-0"] Dec 03 06:58:49 crc kubenswrapper[4475]: I1203 06:58:49.337714 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c0aa43be-dd4b-4b61-ba35-1a2bde22fba4-combined-ca-bundle\") pod \"ovn-northd-0\" (UID: \"c0aa43be-dd4b-4b61-ba35-1a2bde22fba4\") " pod="openstack/ovn-northd-0" Dec 03 06:58:49 crc kubenswrapper[4475]: I1203 06:58:49.337745 4475 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/c0aa43be-dd4b-4b61-ba35-1a2bde22fba4-ovn-northd-tls-certs\") pod \"ovn-northd-0\" (UID: \"c0aa43be-dd4b-4b61-ba35-1a2bde22fba4\") " pod="openstack/ovn-northd-0" Dec 03 06:58:49 crc kubenswrapper[4475]: I1203 06:58:49.337771 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bbk6p\" (UniqueName: \"kubernetes.io/projected/c0aa43be-dd4b-4b61-ba35-1a2bde22fba4-kube-api-access-bbk6p\") pod \"ovn-northd-0\" (UID: \"c0aa43be-dd4b-4b61-ba35-1a2bde22fba4\") " pod="openstack/ovn-northd-0" Dec 03 06:58:49 crc kubenswrapper[4475]: I1203 06:58:49.337787 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/c0aa43be-dd4b-4b61-ba35-1a2bde22fba4-ovn-rundir\") pod \"ovn-northd-0\" (UID: \"c0aa43be-dd4b-4b61-ba35-1a2bde22fba4\") " pod="openstack/ovn-northd-0" Dec 03 06:58:49 crc kubenswrapper[4475]: I1203 06:58:49.337993 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/c0aa43be-dd4b-4b61-ba35-1a2bde22fba4-metrics-certs-tls-certs\") pod \"ovn-northd-0\" (UID: \"c0aa43be-dd4b-4b61-ba35-1a2bde22fba4\") " pod="openstack/ovn-northd-0" Dec 03 06:58:49 crc kubenswrapper[4475]: I1203 06:58:49.338108 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c0aa43be-dd4b-4b61-ba35-1a2bde22fba4-config\") pod \"ovn-northd-0\" (UID: \"c0aa43be-dd4b-4b61-ba35-1a2bde22fba4\") " pod="openstack/ovn-northd-0" Dec 03 06:58:49 crc kubenswrapper[4475]: I1203 06:58:49.338133 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: 
\"kubernetes.io/configmap/c0aa43be-dd4b-4b61-ba35-1a2bde22fba4-scripts\") pod \"ovn-northd-0\" (UID: \"c0aa43be-dd4b-4b61-ba35-1a2bde22fba4\") " pod="openstack/ovn-northd-0" Dec 03 06:58:49 crc kubenswrapper[4475]: I1203 06:58:49.439297 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/c0aa43be-dd4b-4b61-ba35-1a2bde22fba4-metrics-certs-tls-certs\") pod \"ovn-northd-0\" (UID: \"c0aa43be-dd4b-4b61-ba35-1a2bde22fba4\") " pod="openstack/ovn-northd-0" Dec 03 06:58:49 crc kubenswrapper[4475]: I1203 06:58:49.439895 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c0aa43be-dd4b-4b61-ba35-1a2bde22fba4-config\") pod \"ovn-northd-0\" (UID: \"c0aa43be-dd4b-4b61-ba35-1a2bde22fba4\") " pod="openstack/ovn-northd-0" Dec 03 06:58:49 crc kubenswrapper[4475]: I1203 06:58:49.439920 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/c0aa43be-dd4b-4b61-ba35-1a2bde22fba4-scripts\") pod \"ovn-northd-0\" (UID: \"c0aa43be-dd4b-4b61-ba35-1a2bde22fba4\") " pod="openstack/ovn-northd-0" Dec 03 06:58:49 crc kubenswrapper[4475]: I1203 06:58:49.439966 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c0aa43be-dd4b-4b61-ba35-1a2bde22fba4-combined-ca-bundle\") pod \"ovn-northd-0\" (UID: \"c0aa43be-dd4b-4b61-ba35-1a2bde22fba4\") " pod="openstack/ovn-northd-0" Dec 03 06:58:49 crc kubenswrapper[4475]: I1203 06:58:49.439981 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/c0aa43be-dd4b-4b61-ba35-1a2bde22fba4-ovn-northd-tls-certs\") pod \"ovn-northd-0\" (UID: \"c0aa43be-dd4b-4b61-ba35-1a2bde22fba4\") " pod="openstack/ovn-northd-0" Dec 03 06:58:49 crc kubenswrapper[4475]: 
I1203 06:58:49.440002 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bbk6p\" (UniqueName: \"kubernetes.io/projected/c0aa43be-dd4b-4b61-ba35-1a2bde22fba4-kube-api-access-bbk6p\") pod \"ovn-northd-0\" (UID: \"c0aa43be-dd4b-4b61-ba35-1a2bde22fba4\") " pod="openstack/ovn-northd-0" Dec 03 06:58:49 crc kubenswrapper[4475]: I1203 06:58:49.440016 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/c0aa43be-dd4b-4b61-ba35-1a2bde22fba4-ovn-rundir\") pod \"ovn-northd-0\" (UID: \"c0aa43be-dd4b-4b61-ba35-1a2bde22fba4\") " pod="openstack/ovn-northd-0" Dec 03 06:58:49 crc kubenswrapper[4475]: I1203 06:58:49.440336 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/c0aa43be-dd4b-4b61-ba35-1a2bde22fba4-ovn-rundir\") pod \"ovn-northd-0\" (UID: \"c0aa43be-dd4b-4b61-ba35-1a2bde22fba4\") " pod="openstack/ovn-northd-0" Dec 03 06:58:49 crc kubenswrapper[4475]: I1203 06:58:49.442440 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/c0aa43be-dd4b-4b61-ba35-1a2bde22fba4-metrics-certs-tls-certs\") pod \"ovn-northd-0\" (UID: \"c0aa43be-dd4b-4b61-ba35-1a2bde22fba4\") " pod="openstack/ovn-northd-0" Dec 03 06:58:49 crc kubenswrapper[4475]: I1203 06:58:49.442975 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c0aa43be-dd4b-4b61-ba35-1a2bde22fba4-combined-ca-bundle\") pod \"ovn-northd-0\" (UID: \"c0aa43be-dd4b-4b61-ba35-1a2bde22fba4\") " pod="openstack/ovn-northd-0" Dec 03 06:58:49 crc kubenswrapper[4475]: I1203 06:58:49.457495 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bbk6p\" (UniqueName: 
\"kubernetes.io/projected/c0aa43be-dd4b-4b61-ba35-1a2bde22fba4-kube-api-access-bbk6p\") pod \"ovn-northd-0\" (UID: \"c0aa43be-dd4b-4b61-ba35-1a2bde22fba4\") " pod="openstack/ovn-northd-0" Dec 03 06:58:50 crc kubenswrapper[4475]: I1203 06:58:50.350047 4475 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovnnorthd-scripts" Dec 03 06:58:50 crc kubenswrapper[4475]: I1203 06:58:50.351419 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/c0aa43be-dd4b-4b61-ba35-1a2bde22fba4-scripts\") pod \"ovn-northd-0\" (UID: \"c0aa43be-dd4b-4b61-ba35-1a2bde22fba4\") " pod="openstack/ovn-northd-0" Dec 03 06:58:50 crc kubenswrapper[4475]: E1203 06:58:50.441234 4475 secret.go:188] Couldn't get secret openstack/cert-ovnnorthd-ovndbs: failed to sync secret cache: timed out waiting for the condition Dec 03 06:58:50 crc kubenswrapper[4475]: E1203 06:58:50.441260 4475 configmap.go:193] Couldn't get configMap openstack/ovnnorthd-config: failed to sync configmap cache: timed out waiting for the condition Dec 03 06:58:50 crc kubenswrapper[4475]: E1203 06:58:50.441308 4475 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c0aa43be-dd4b-4b61-ba35-1a2bde22fba4-ovn-northd-tls-certs podName:c0aa43be-dd4b-4b61-ba35-1a2bde22fba4 nodeName:}" failed. No retries permitted until 2025-12-03 06:58:50.941292069 +0000 UTC m=+815.746190403 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "ovn-northd-tls-certs" (UniqueName: "kubernetes.io/secret/c0aa43be-dd4b-4b61-ba35-1a2bde22fba4-ovn-northd-tls-certs") pod "ovn-northd-0" (UID: "c0aa43be-dd4b-4b61-ba35-1a2bde22fba4") : failed to sync secret cache: timed out waiting for the condition Dec 03 06:58:50 crc kubenswrapper[4475]: E1203 06:58:50.441323 4475 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/c0aa43be-dd4b-4b61-ba35-1a2bde22fba4-config podName:c0aa43be-dd4b-4b61-ba35-1a2bde22fba4 nodeName:}" failed. No retries permitted until 2025-12-03 06:58:50.941317356 +0000 UTC m=+815.746215691 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "config" (UniqueName: "kubernetes.io/configmap/c0aa43be-dd4b-4b61-ba35-1a2bde22fba4-config") pod "ovn-northd-0" (UID: "c0aa43be-dd4b-4b61-ba35-1a2bde22fba4") : failed to sync configmap cache: timed out waiting for the condition Dec 03 06:58:50 crc kubenswrapper[4475]: I1203 06:58:50.466958 4475 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovnnorthd-ovnnorthd-dockercfg-zj25h" Dec 03 06:58:50 crc kubenswrapper[4475]: I1203 06:58:50.676009 4475 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovnnorthd-config" Dec 03 06:58:50 crc kubenswrapper[4475]: I1203 06:58:50.804442 4475 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovnnorthd-ovndbs" Dec 03 06:58:50 crc kubenswrapper[4475]: I1203 06:58:50.958206 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c0aa43be-dd4b-4b61-ba35-1a2bde22fba4-config\") pod \"ovn-northd-0\" (UID: \"c0aa43be-dd4b-4b61-ba35-1a2bde22fba4\") " pod="openstack/ovn-northd-0" Dec 03 06:58:50 crc kubenswrapper[4475]: I1203 06:58:50.958270 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-northd-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/c0aa43be-dd4b-4b61-ba35-1a2bde22fba4-ovn-northd-tls-certs\") pod \"ovn-northd-0\" (UID: \"c0aa43be-dd4b-4b61-ba35-1a2bde22fba4\") " pod="openstack/ovn-northd-0" Dec 03 06:58:50 crc kubenswrapper[4475]: I1203 06:58:50.958911 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c0aa43be-dd4b-4b61-ba35-1a2bde22fba4-config\") pod \"ovn-northd-0\" (UID: \"c0aa43be-dd4b-4b61-ba35-1a2bde22fba4\") " pod="openstack/ovn-northd-0" Dec 03 06:58:50 crc kubenswrapper[4475]: I1203 06:58:50.961744 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/c0aa43be-dd4b-4b61-ba35-1a2bde22fba4-ovn-northd-tls-certs\") pod \"ovn-northd-0\" (UID: \"c0aa43be-dd4b-4b61-ba35-1a2bde22fba4\") " pod="openstack/ovn-northd-0" Dec 03 06:58:51 crc kubenswrapper[4475]: I1203 06:58:51.087293 4475 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-northd-0" Dec 03 06:58:51 crc kubenswrapper[4475]: I1203 06:58:51.445911 4475 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-northd-0"] Dec 03 06:58:51 crc kubenswrapper[4475]: I1203 06:58:51.458616 4475 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-844556c789-r5js2" Dec 03 06:58:51 crc kubenswrapper[4475]: I1203 06:58:51.460093 4475 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Dec 03 06:58:51 crc kubenswrapper[4475]: I1203 06:58:51.609169 4475 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-5849c4cb99-2bndp" Dec 03 06:58:51 crc kubenswrapper[4475]: I1203 06:58:51.643201 4475 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-844556c789-r5js2"] Dec 03 06:58:52 crc kubenswrapper[4475]: I1203 06:58:52.036939 4475 kubelet.go:2542] "SyncLoop (probe)" 
probe="readiness" status="" pod="openstack/openstack-galera-0" Dec 03 06:58:52 crc kubenswrapper[4475]: I1203 06:58:52.036973 4475 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/openstack-galera-0" Dec 03 06:58:52 crc kubenswrapper[4475]: I1203 06:58:52.153968 4475 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-844556c789-r5js2" podUID="48f10a7f-1ae5-49e6-924c-8de4068edaea" containerName="dnsmasq-dns" containerID="cri-o://c25b2f5d96cc68c17d02b0f86038a61339eaf0a2988bb88c767b2f17f5f7382e" gracePeriod=10 Dec 03 06:58:52 crc kubenswrapper[4475]: I1203 06:58:52.154230 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"c0aa43be-dd4b-4b61-ba35-1a2bde22fba4","Type":"ContainerStarted","Data":"21b0fc7bb8d6c9eb778fb0b681b62b44caf2040da94f73181b945b89cba5de9c"} Dec 03 06:58:52 crc kubenswrapper[4475]: I1203 06:58:52.486148 4475 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-844556c789-r5js2" Dec 03 06:58:52 crc kubenswrapper[4475]: I1203 06:58:52.577153 4475 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kkv4k\" (UniqueName: \"kubernetes.io/projected/48f10a7f-1ae5-49e6-924c-8de4068edaea-kube-api-access-kkv4k\") pod \"48f10a7f-1ae5-49e6-924c-8de4068edaea\" (UID: \"48f10a7f-1ae5-49e6-924c-8de4068edaea\") " Dec 03 06:58:52 crc kubenswrapper[4475]: I1203 06:58:52.577306 4475 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/48f10a7f-1ae5-49e6-924c-8de4068edaea-ovsdbserver-sb\") pod \"48f10a7f-1ae5-49e6-924c-8de4068edaea\" (UID: \"48f10a7f-1ae5-49e6-924c-8de4068edaea\") " Dec 03 06:58:52 crc kubenswrapper[4475]: I1203 06:58:52.577367 4475 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/48f10a7f-1ae5-49e6-924c-8de4068edaea-dns-svc\") pod \"48f10a7f-1ae5-49e6-924c-8de4068edaea\" (UID: \"48f10a7f-1ae5-49e6-924c-8de4068edaea\") " Dec 03 06:58:52 crc kubenswrapper[4475]: I1203 06:58:52.577400 4475 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/48f10a7f-1ae5-49e6-924c-8de4068edaea-config\") pod \"48f10a7f-1ae5-49e6-924c-8de4068edaea\" (UID: \"48f10a7f-1ae5-49e6-924c-8de4068edaea\") " Dec 03 06:58:52 crc kubenswrapper[4475]: I1203 06:58:52.589946 4475 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/48f10a7f-1ae5-49e6-924c-8de4068edaea-kube-api-access-kkv4k" (OuterVolumeSpecName: "kube-api-access-kkv4k") pod "48f10a7f-1ae5-49e6-924c-8de4068edaea" (UID: "48f10a7f-1ae5-49e6-924c-8de4068edaea"). InnerVolumeSpecName "kube-api-access-kkv4k". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 06:58:52 crc kubenswrapper[4475]: I1203 06:58:52.607942 4475 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/48f10a7f-1ae5-49e6-924c-8de4068edaea-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "48f10a7f-1ae5-49e6-924c-8de4068edaea" (UID: "48f10a7f-1ae5-49e6-924c-8de4068edaea"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 06:58:52 crc kubenswrapper[4475]: I1203 06:58:52.608978 4475 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/48f10a7f-1ae5-49e6-924c-8de4068edaea-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "48f10a7f-1ae5-49e6-924c-8de4068edaea" (UID: "48f10a7f-1ae5-49e6-924c-8de4068edaea"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 06:58:52 crc kubenswrapper[4475]: I1203 06:58:52.609697 4475 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/48f10a7f-1ae5-49e6-924c-8de4068edaea-config" (OuterVolumeSpecName: "config") pod "48f10a7f-1ae5-49e6-924c-8de4068edaea" (UID: "48f10a7f-1ae5-49e6-924c-8de4068edaea"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 06:58:52 crc kubenswrapper[4475]: I1203 06:58:52.679305 4475 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/48f10a7f-1ae5-49e6-924c-8de4068edaea-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Dec 03 06:58:52 crc kubenswrapper[4475]: I1203 06:58:52.679330 4475 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/48f10a7f-1ae5-49e6-924c-8de4068edaea-dns-svc\") on node \"crc\" DevicePath \"\"" Dec 03 06:58:52 crc kubenswrapper[4475]: I1203 06:58:52.679338 4475 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/48f10a7f-1ae5-49e6-924c-8de4068edaea-config\") on node \"crc\" DevicePath \"\"" Dec 03 06:58:52 crc kubenswrapper[4475]: I1203 06:58:52.679347 4475 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kkv4k\" (UniqueName: \"kubernetes.io/projected/48f10a7f-1ae5-49e6-924c-8de4068edaea-kube-api-access-kkv4k\") on node \"crc\" DevicePath \"\"" Dec 03 06:58:53 crc kubenswrapper[4475]: I1203 06:58:53.160300 4475 generic.go:334] "Generic (PLEG): container finished" podID="48f10a7f-1ae5-49e6-924c-8de4068edaea" containerID="c25b2f5d96cc68c17d02b0f86038a61339eaf0a2988bb88c767b2f17f5f7382e" exitCode=0 Dec 03 06:58:53 crc kubenswrapper[4475]: I1203 06:58:53.160494 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-844556c789-r5js2" event={"ID":"48f10a7f-1ae5-49e6-924c-8de4068edaea","Type":"ContainerDied","Data":"c25b2f5d96cc68c17d02b0f86038a61339eaf0a2988bb88c767b2f17f5f7382e"} Dec 03 06:58:53 crc kubenswrapper[4475]: I1203 06:58:53.160547 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-844556c789-r5js2" event={"ID":"48f10a7f-1ae5-49e6-924c-8de4068edaea","Type":"ContainerDied","Data":"4e9f0643951779d43743a52b51f6f1d806e97a9e1169844c23fa70a5d5395f45"} Dec 03 
06:58:53 crc kubenswrapper[4475]: I1203 06:58:53.160553 4475 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-844556c789-r5js2" Dec 03 06:58:53 crc kubenswrapper[4475]: I1203 06:58:53.160564 4475 scope.go:117] "RemoveContainer" containerID="c25b2f5d96cc68c17d02b0f86038a61339eaf0a2988bb88c767b2f17f5f7382e" Dec 03 06:58:53 crc kubenswrapper[4475]: I1203 06:58:53.173879 4475 scope.go:117] "RemoveContainer" containerID="47b000fe48e1c3f0bd782d4e4113dcc8e7e96a66e661ca7daea6a793a0595224" Dec 03 06:58:53 crc kubenswrapper[4475]: I1203 06:58:53.184118 4475 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-844556c789-r5js2"] Dec 03 06:58:53 crc kubenswrapper[4475]: I1203 06:58:53.188792 4475 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-844556c789-r5js2"] Dec 03 06:58:53 crc kubenswrapper[4475]: I1203 06:58:53.198504 4475 scope.go:117] "RemoveContainer" containerID="c25b2f5d96cc68c17d02b0f86038a61339eaf0a2988bb88c767b2f17f5f7382e" Dec 03 06:58:53 crc kubenswrapper[4475]: E1203 06:58:53.198856 4475 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c25b2f5d96cc68c17d02b0f86038a61339eaf0a2988bb88c767b2f17f5f7382e\": container with ID starting with c25b2f5d96cc68c17d02b0f86038a61339eaf0a2988bb88c767b2f17f5f7382e not found: ID does not exist" containerID="c25b2f5d96cc68c17d02b0f86038a61339eaf0a2988bb88c767b2f17f5f7382e" Dec 03 06:58:53 crc kubenswrapper[4475]: I1203 06:58:53.198893 4475 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c25b2f5d96cc68c17d02b0f86038a61339eaf0a2988bb88c767b2f17f5f7382e"} err="failed to get container status \"c25b2f5d96cc68c17d02b0f86038a61339eaf0a2988bb88c767b2f17f5f7382e\": rpc error: code = NotFound desc = could not find container \"c25b2f5d96cc68c17d02b0f86038a61339eaf0a2988bb88c767b2f17f5f7382e\": container with ID 
starting with c25b2f5d96cc68c17d02b0f86038a61339eaf0a2988bb88c767b2f17f5f7382e not found: ID does not exist" Dec 03 06:58:53 crc kubenswrapper[4475]: I1203 06:58:53.198911 4475 scope.go:117] "RemoveContainer" containerID="47b000fe48e1c3f0bd782d4e4113dcc8e7e96a66e661ca7daea6a793a0595224" Dec 03 06:58:53 crc kubenswrapper[4475]: E1203 06:58:53.199423 4475 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"47b000fe48e1c3f0bd782d4e4113dcc8e7e96a66e661ca7daea6a793a0595224\": container with ID starting with 47b000fe48e1c3f0bd782d4e4113dcc8e7e96a66e661ca7daea6a793a0595224 not found: ID does not exist" containerID="47b000fe48e1c3f0bd782d4e4113dcc8e7e96a66e661ca7daea6a793a0595224" Dec 03 06:58:53 crc kubenswrapper[4475]: I1203 06:58:53.199497 4475 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"47b000fe48e1c3f0bd782d4e4113dcc8e7e96a66e661ca7daea6a793a0595224"} err="failed to get container status \"47b000fe48e1c3f0bd782d4e4113dcc8e7e96a66e661ca7daea6a793a0595224\": rpc error: code = NotFound desc = could not find container \"47b000fe48e1c3f0bd782d4e4113dcc8e7e96a66e661ca7daea6a793a0595224\": container with ID starting with 47b000fe48e1c3f0bd782d4e4113dcc8e7e96a66e661ca7daea6a793a0595224 not found: ID does not exist" Dec 03 06:58:53 crc kubenswrapper[4475]: I1203 06:58:53.384333 4475 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/openstack-cell1-galera-0" Dec 03 06:58:53 crc kubenswrapper[4475]: I1203 06:58:53.384521 4475 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/openstack-cell1-galera-0" Dec 03 06:58:53 crc kubenswrapper[4475]: I1203 06:58:53.436429 4475 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/openstack-cell1-galera-0" Dec 03 06:58:53 crc kubenswrapper[4475]: I1203 06:58:53.497362 4475 kubelet_volumes.go:163] "Cleaned up orphaned pod 
volumes dir" podUID="48f10a7f-1ae5-49e6-924c-8de4068edaea" path="/var/lib/kubelet/pods/48f10a7f-1ae5-49e6-924c-8de4068edaea/volumes" Dec 03 06:58:54 crc kubenswrapper[4475]: I1203 06:58:54.219919 4475 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/openstack-cell1-galera-0" Dec 03 06:58:55 crc kubenswrapper[4475]: I1203 06:58:55.531723 4475 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-6476ddd6b5-ppjlc"] Dec 03 06:58:55 crc kubenswrapper[4475]: E1203 06:58:55.531985 4475 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="48f10a7f-1ae5-49e6-924c-8de4068edaea" containerName="init" Dec 03 06:58:55 crc kubenswrapper[4475]: I1203 06:58:55.531998 4475 state_mem.go:107] "Deleted CPUSet assignment" podUID="48f10a7f-1ae5-49e6-924c-8de4068edaea" containerName="init" Dec 03 06:58:55 crc kubenswrapper[4475]: E1203 06:58:55.532019 4475 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="48f10a7f-1ae5-49e6-924c-8de4068edaea" containerName="dnsmasq-dns" Dec 03 06:58:55 crc kubenswrapper[4475]: I1203 06:58:55.532025 4475 state_mem.go:107] "Deleted CPUSet assignment" podUID="48f10a7f-1ae5-49e6-924c-8de4068edaea" containerName="dnsmasq-dns" Dec 03 06:58:55 crc kubenswrapper[4475]: I1203 06:58:55.532154 4475 memory_manager.go:354] "RemoveStaleState removing state" podUID="48f10a7f-1ae5-49e6-924c-8de4068edaea" containerName="dnsmasq-dns" Dec 03 06:58:55 crc kubenswrapper[4475]: I1203 06:58:55.532792 4475 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-6476ddd6b5-ppjlc" Dec 03 06:58:55 crc kubenswrapper[4475]: I1203 06:58:55.555697 4475 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-6476ddd6b5-ppjlc"] Dec 03 06:58:55 crc kubenswrapper[4475]: I1203 06:58:55.614666 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xftn2\" (UniqueName: \"kubernetes.io/projected/87fdba6a-e11c-4bbd-becc-78999065efa8-kube-api-access-xftn2\") pod \"dnsmasq-dns-6476ddd6b5-ppjlc\" (UID: \"87fdba6a-e11c-4bbd-becc-78999065efa8\") " pod="openstack/dnsmasq-dns-6476ddd6b5-ppjlc" Dec 03 06:58:55 crc kubenswrapper[4475]: I1203 06:58:55.615127 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/87fdba6a-e11c-4bbd-becc-78999065efa8-ovsdbserver-sb\") pod \"dnsmasq-dns-6476ddd6b5-ppjlc\" (UID: \"87fdba6a-e11c-4bbd-becc-78999065efa8\") " pod="openstack/dnsmasq-dns-6476ddd6b5-ppjlc" Dec 03 06:58:55 crc kubenswrapper[4475]: I1203 06:58:55.615245 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/87fdba6a-e11c-4bbd-becc-78999065efa8-ovsdbserver-nb\") pod \"dnsmasq-dns-6476ddd6b5-ppjlc\" (UID: \"87fdba6a-e11c-4bbd-becc-78999065efa8\") " pod="openstack/dnsmasq-dns-6476ddd6b5-ppjlc" Dec 03 06:58:55 crc kubenswrapper[4475]: I1203 06:58:55.615346 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/87fdba6a-e11c-4bbd-becc-78999065efa8-dns-svc\") pod \"dnsmasq-dns-6476ddd6b5-ppjlc\" (UID: \"87fdba6a-e11c-4bbd-becc-78999065efa8\") " pod="openstack/dnsmasq-dns-6476ddd6b5-ppjlc" Dec 03 06:58:55 crc kubenswrapper[4475]: I1203 06:58:55.615441 4475 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/87fdba6a-e11c-4bbd-becc-78999065efa8-config\") pod \"dnsmasq-dns-6476ddd6b5-ppjlc\" (UID: \"87fdba6a-e11c-4bbd-becc-78999065efa8\") " pod="openstack/dnsmasq-dns-6476ddd6b5-ppjlc" Dec 03 06:58:55 crc kubenswrapper[4475]: I1203 06:58:55.717004 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/87fdba6a-e11c-4bbd-becc-78999065efa8-ovsdbserver-sb\") pod \"dnsmasq-dns-6476ddd6b5-ppjlc\" (UID: \"87fdba6a-e11c-4bbd-becc-78999065efa8\") " pod="openstack/dnsmasq-dns-6476ddd6b5-ppjlc" Dec 03 06:58:55 crc kubenswrapper[4475]: I1203 06:58:55.717228 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/87fdba6a-e11c-4bbd-becc-78999065efa8-ovsdbserver-nb\") pod \"dnsmasq-dns-6476ddd6b5-ppjlc\" (UID: \"87fdba6a-e11c-4bbd-becc-78999065efa8\") " pod="openstack/dnsmasq-dns-6476ddd6b5-ppjlc" Dec 03 06:58:55 crc kubenswrapper[4475]: I1203 06:58:55.717373 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/87fdba6a-e11c-4bbd-becc-78999065efa8-dns-svc\") pod \"dnsmasq-dns-6476ddd6b5-ppjlc\" (UID: \"87fdba6a-e11c-4bbd-becc-78999065efa8\") " pod="openstack/dnsmasq-dns-6476ddd6b5-ppjlc" Dec 03 06:58:55 crc kubenswrapper[4475]: I1203 06:58:55.717481 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/87fdba6a-e11c-4bbd-becc-78999065efa8-config\") pod \"dnsmasq-dns-6476ddd6b5-ppjlc\" (UID: \"87fdba6a-e11c-4bbd-becc-78999065efa8\") " pod="openstack/dnsmasq-dns-6476ddd6b5-ppjlc" Dec 03 06:58:55 crc kubenswrapper[4475]: I1203 06:58:55.717577 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xftn2\" 
(UniqueName: \"kubernetes.io/projected/87fdba6a-e11c-4bbd-becc-78999065efa8-kube-api-access-xftn2\") pod \"dnsmasq-dns-6476ddd6b5-ppjlc\" (UID: \"87fdba6a-e11c-4bbd-becc-78999065efa8\") " pod="openstack/dnsmasq-dns-6476ddd6b5-ppjlc" Dec 03 06:58:55 crc kubenswrapper[4475]: I1203 06:58:55.717824 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/87fdba6a-e11c-4bbd-becc-78999065efa8-ovsdbserver-sb\") pod \"dnsmasq-dns-6476ddd6b5-ppjlc\" (UID: \"87fdba6a-e11c-4bbd-becc-78999065efa8\") " pod="openstack/dnsmasq-dns-6476ddd6b5-ppjlc" Dec 03 06:58:55 crc kubenswrapper[4475]: I1203 06:58:55.718345 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/87fdba6a-e11c-4bbd-becc-78999065efa8-dns-svc\") pod \"dnsmasq-dns-6476ddd6b5-ppjlc\" (UID: \"87fdba6a-e11c-4bbd-becc-78999065efa8\") " pod="openstack/dnsmasq-dns-6476ddd6b5-ppjlc" Dec 03 06:58:55 crc kubenswrapper[4475]: I1203 06:58:55.718702 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/87fdba6a-e11c-4bbd-becc-78999065efa8-config\") pod \"dnsmasq-dns-6476ddd6b5-ppjlc\" (UID: \"87fdba6a-e11c-4bbd-becc-78999065efa8\") " pod="openstack/dnsmasq-dns-6476ddd6b5-ppjlc" Dec 03 06:58:55 crc kubenswrapper[4475]: I1203 06:58:55.718851 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/87fdba6a-e11c-4bbd-becc-78999065efa8-ovsdbserver-nb\") pod \"dnsmasq-dns-6476ddd6b5-ppjlc\" (UID: \"87fdba6a-e11c-4bbd-becc-78999065efa8\") " pod="openstack/dnsmasq-dns-6476ddd6b5-ppjlc" Dec 03 06:58:55 crc kubenswrapper[4475]: I1203 06:58:55.739615 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xftn2\" (UniqueName: \"kubernetes.io/projected/87fdba6a-e11c-4bbd-becc-78999065efa8-kube-api-access-xftn2\") pod 
\"dnsmasq-dns-6476ddd6b5-ppjlc\" (UID: \"87fdba6a-e11c-4bbd-becc-78999065efa8\") " pod="openstack/dnsmasq-dns-6476ddd6b5-ppjlc" Dec 03 06:58:55 crc kubenswrapper[4475]: I1203 06:58:55.845857 4475 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6476ddd6b5-ppjlc" Dec 03 06:58:56 crc kubenswrapper[4475]: I1203 06:58:56.092967 4475 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/openstack-galera-0" Dec 03 06:58:56 crc kubenswrapper[4475]: I1203 06:58:56.144863 4475 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/openstack-galera-0" Dec 03 06:58:56 crc kubenswrapper[4475]: I1203 06:58:56.258850 4475 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-6476ddd6b5-ppjlc"] Dec 03 06:58:56 crc kubenswrapper[4475]: I1203 06:58:56.640016 4475 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/swift-storage-0"] Dec 03 06:58:56 crc kubenswrapper[4475]: I1203 06:58:56.648921 4475 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-storage-0" Dec 03 06:58:56 crc kubenswrapper[4475]: I1203 06:58:56.651742 4475 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"swift-swift-dockercfg-z62rp" Dec 03 06:58:56 crc kubenswrapper[4475]: I1203 06:58:56.651826 4475 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"swift-ring-files" Dec 03 06:58:56 crc kubenswrapper[4475]: I1203 06:58:56.651853 4475 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"swift-storage-config-data" Dec 03 06:58:56 crc kubenswrapper[4475]: I1203 06:58:56.651907 4475 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"swift-conf" Dec 03 06:58:56 crc kubenswrapper[4475]: I1203 06:58:56.655787 4475 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-storage-0"] Dec 03 06:58:56 crc kubenswrapper[4475]: I1203 06:58:56.830691 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ztjzh\" (UniqueName: \"kubernetes.io/projected/4b95df68-1a9d-403e-ab8f-87335fd821fe-kube-api-access-ztjzh\") pod \"swift-storage-0\" (UID: \"4b95df68-1a9d-403e-ab8f-87335fd821fe\") " pod="openstack/swift-storage-0" Dec 03 06:58:56 crc kubenswrapper[4475]: I1203 06:58:56.831121 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/4b95df68-1a9d-403e-ab8f-87335fd821fe-cache\") pod \"swift-storage-0\" (UID: \"4b95df68-1a9d-403e-ab8f-87335fd821fe\") " pod="openstack/swift-storage-0" Dec 03 06:58:56 crc kubenswrapper[4475]: I1203 06:58:56.831227 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lock\" (UniqueName: \"kubernetes.io/empty-dir/4b95df68-1a9d-403e-ab8f-87335fd821fe-lock\") pod \"swift-storage-0\" (UID: \"4b95df68-1a9d-403e-ab8f-87335fd821fe\") " pod="openstack/swift-storage-0" Dec 
03 06:58:56 crc kubenswrapper[4475]: I1203 06:58:56.831366 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/4b95df68-1a9d-403e-ab8f-87335fd821fe-etc-swift\") pod \"swift-storage-0\" (UID: \"4b95df68-1a9d-403e-ab8f-87335fd821fe\") " pod="openstack/swift-storage-0" Dec 03 06:58:56 crc kubenswrapper[4475]: I1203 06:58:56.831489 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"swift-storage-0\" (UID: \"4b95df68-1a9d-403e-ab8f-87335fd821fe\") " pod="openstack/swift-storage-0" Dec 03 06:58:56 crc kubenswrapper[4475]: I1203 06:58:56.932672 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/4b95df68-1a9d-403e-ab8f-87335fd821fe-etc-swift\") pod \"swift-storage-0\" (UID: \"4b95df68-1a9d-403e-ab8f-87335fd821fe\") " pod="openstack/swift-storage-0" Dec 03 06:58:56 crc kubenswrapper[4475]: I1203 06:58:56.932979 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"swift-storage-0\" (UID: \"4b95df68-1a9d-403e-ab8f-87335fd821fe\") " pod="openstack/swift-storage-0" Dec 03 06:58:56 crc kubenswrapper[4475]: I1203 06:58:56.933065 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ztjzh\" (UniqueName: \"kubernetes.io/projected/4b95df68-1a9d-403e-ab8f-87335fd821fe-kube-api-access-ztjzh\") pod \"swift-storage-0\" (UID: \"4b95df68-1a9d-403e-ab8f-87335fd821fe\") " pod="openstack/swift-storage-0" Dec 03 06:58:56 crc kubenswrapper[4475]: I1203 06:58:56.933145 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cache\" (UniqueName: 
\"kubernetes.io/empty-dir/4b95df68-1a9d-403e-ab8f-87335fd821fe-cache\") pod \"swift-storage-0\" (UID: \"4b95df68-1a9d-403e-ab8f-87335fd821fe\") " pod="openstack/swift-storage-0" Dec 03 06:58:56 crc kubenswrapper[4475]: I1203 06:58:56.933227 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"lock\" (UniqueName: \"kubernetes.io/empty-dir/4b95df68-1a9d-403e-ab8f-87335fd821fe-lock\") pod \"swift-storage-0\" (UID: \"4b95df68-1a9d-403e-ab8f-87335fd821fe\") " pod="openstack/swift-storage-0" Dec 03 06:58:56 crc kubenswrapper[4475]: I1203 06:58:56.933261 4475 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"swift-storage-0\" (UID: \"4b95df68-1a9d-403e-ab8f-87335fd821fe\") device mount path \"/mnt/openstack/pv02\"" pod="openstack/swift-storage-0" Dec 03 06:58:56 crc kubenswrapper[4475]: I1203 06:58:56.933599 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/4b95df68-1a9d-403e-ab8f-87335fd821fe-cache\") pod \"swift-storage-0\" (UID: \"4b95df68-1a9d-403e-ab8f-87335fd821fe\") " pod="openstack/swift-storage-0" Dec 03 06:58:56 crc kubenswrapper[4475]: E1203 06:58:56.932809 4475 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Dec 03 06:58:56 crc kubenswrapper[4475]: E1203 06:58:56.933648 4475 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Dec 03 06:58:56 crc kubenswrapper[4475]: I1203 06:58:56.933646 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"lock\" (UniqueName: \"kubernetes.io/empty-dir/4b95df68-1a9d-403e-ab8f-87335fd821fe-lock\") pod \"swift-storage-0\" (UID: \"4b95df68-1a9d-403e-ab8f-87335fd821fe\") " pod="openstack/swift-storage-0" Dec 03 06:58:56 crc kubenswrapper[4475]: 
E1203 06:58:56.933685 4475 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/4b95df68-1a9d-403e-ab8f-87335fd821fe-etc-swift podName:4b95df68-1a9d-403e-ab8f-87335fd821fe nodeName:}" failed. No retries permitted until 2025-12-03 06:58:57.433671658 +0000 UTC m=+822.238569992 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/4b95df68-1a9d-403e-ab8f-87335fd821fe-etc-swift") pod "swift-storage-0" (UID: "4b95df68-1a9d-403e-ab8f-87335fd821fe") : configmap "swift-ring-files" not found Dec 03 06:58:56 crc kubenswrapper[4475]: I1203 06:58:56.946748 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ztjzh\" (UniqueName: \"kubernetes.io/projected/4b95df68-1a9d-403e-ab8f-87335fd821fe-kube-api-access-ztjzh\") pod \"swift-storage-0\" (UID: \"4b95df68-1a9d-403e-ab8f-87335fd821fe\") " pod="openstack/swift-storage-0" Dec 03 06:58:56 crc kubenswrapper[4475]: I1203 06:58:56.948270 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"swift-storage-0\" (UID: \"4b95df68-1a9d-403e-ab8f-87335fd821fe\") " pod="openstack/swift-storage-0" Dec 03 06:58:57 crc kubenswrapper[4475]: I1203 06:58:57.178870 4475 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/swift-ring-rebalance-c2ch7"] Dec 03 06:58:57 crc kubenswrapper[4475]: I1203 06:58:57.180010 4475 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-ring-rebalance-c2ch7" Dec 03 06:58:57 crc kubenswrapper[4475]: I1203 06:58:57.183158 4475 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"swift-proxy-config-data" Dec 03 06:58:57 crc kubenswrapper[4475]: I1203 06:58:57.184055 4475 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"swift-ring-config-data" Dec 03 06:58:57 crc kubenswrapper[4475]: I1203 06:58:57.184844 4475 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"swift-ring-scripts" Dec 03 06:58:57 crc kubenswrapper[4475]: I1203 06:58:57.186751 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"c0aa43be-dd4b-4b61-ba35-1a2bde22fba4","Type":"ContainerStarted","Data":"7561b0532197a8e4858f0e53472b76edf6dd5767dc504cb2d57198f8be181856"} Dec 03 06:58:57 crc kubenswrapper[4475]: I1203 06:58:57.186782 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"c0aa43be-dd4b-4b61-ba35-1a2bde22fba4","Type":"ContainerStarted","Data":"65a09d29a2dd0e7158706c9d8b37e7d6ce8f5bb789208a5dbf2114f952686cce"} Dec 03 06:58:57 crc kubenswrapper[4475]: I1203 06:58:57.187033 4475 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-northd-0" Dec 03 06:58:57 crc kubenswrapper[4475]: I1203 06:58:57.188232 4475 generic.go:334] "Generic (PLEG): container finished" podID="87fdba6a-e11c-4bbd-becc-78999065efa8" containerID="d949ab98dfd9a323f3d5eb5ecfde99a9124e11883d30d48d6554020c1d19f95e" exitCode=0 Dec 03 06:58:57 crc kubenswrapper[4475]: I1203 06:58:57.188354 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6476ddd6b5-ppjlc" event={"ID":"87fdba6a-e11c-4bbd-becc-78999065efa8","Type":"ContainerDied","Data":"d949ab98dfd9a323f3d5eb5ecfde99a9124e11883d30d48d6554020c1d19f95e"} Dec 03 06:58:57 crc kubenswrapper[4475]: I1203 06:58:57.188422 4475 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="openstack/dnsmasq-dns-6476ddd6b5-ppjlc" event={"ID":"87fdba6a-e11c-4bbd-becc-78999065efa8","Type":"ContainerStarted","Data":"b1c7348b3dcf43bba141fa6fbd5a70c4f964c989b2fad34d59d86a04a413bc3a"} Dec 03 06:58:57 crc kubenswrapper[4475]: I1203 06:58:57.207227 4475 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-ring-rebalance-c2ch7"] Dec 03 06:58:57 crc kubenswrapper[4475]: I1203 06:58:57.275935 4475 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-northd-0" podStartSLOduration=3.112849618 podStartE2EDuration="8.275919407s" podCreationTimestamp="2025-12-03 06:58:49 +0000 UTC" firstStartedPulling="2025-12-03 06:58:51.459890765 +0000 UTC m=+816.264789098" lastFinishedPulling="2025-12-03 06:58:56.622960554 +0000 UTC m=+821.427858887" observedRunningTime="2025-12-03 06:58:57.270103527 +0000 UTC m=+822.075001861" watchObservedRunningTime="2025-12-03 06:58:57.275919407 +0000 UTC m=+822.080817740" Dec 03 06:58:57 crc kubenswrapper[4475]: I1203 06:58:57.341134 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5fpzb\" (UniqueName: \"kubernetes.io/projected/084fe7c1-e14b-4a01-8f56-5b0e545b9888-kube-api-access-5fpzb\") pod \"swift-ring-rebalance-c2ch7\" (UID: \"084fe7c1-e14b-4a01-8f56-5b0e545b9888\") " pod="openstack/swift-ring-rebalance-c2ch7" Dec 03 06:58:57 crc kubenswrapper[4475]: I1203 06:58:57.341192 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/084fe7c1-e14b-4a01-8f56-5b0e545b9888-dispersionconf\") pod \"swift-ring-rebalance-c2ch7\" (UID: \"084fe7c1-e14b-4a01-8f56-5b0e545b9888\") " pod="openstack/swift-ring-rebalance-c2ch7" Dec 03 06:58:57 crc kubenswrapper[4475]: I1203 06:58:57.341212 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: 
\"kubernetes.io/empty-dir/084fe7c1-e14b-4a01-8f56-5b0e545b9888-etc-swift\") pod \"swift-ring-rebalance-c2ch7\" (UID: \"084fe7c1-e14b-4a01-8f56-5b0e545b9888\") " pod="openstack/swift-ring-rebalance-c2ch7" Dec 03 06:58:57 crc kubenswrapper[4475]: I1203 06:58:57.341238 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/084fe7c1-e14b-4a01-8f56-5b0e545b9888-scripts\") pod \"swift-ring-rebalance-c2ch7\" (UID: \"084fe7c1-e14b-4a01-8f56-5b0e545b9888\") " pod="openstack/swift-ring-rebalance-c2ch7" Dec 03 06:58:57 crc kubenswrapper[4475]: I1203 06:58:57.341291 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/084fe7c1-e14b-4a01-8f56-5b0e545b9888-combined-ca-bundle\") pod \"swift-ring-rebalance-c2ch7\" (UID: \"084fe7c1-e14b-4a01-8f56-5b0e545b9888\") " pod="openstack/swift-ring-rebalance-c2ch7" Dec 03 06:58:57 crc kubenswrapper[4475]: I1203 06:58:57.341331 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/084fe7c1-e14b-4a01-8f56-5b0e545b9888-swiftconf\") pod \"swift-ring-rebalance-c2ch7\" (UID: \"084fe7c1-e14b-4a01-8f56-5b0e545b9888\") " pod="openstack/swift-ring-rebalance-c2ch7" Dec 03 06:58:57 crc kubenswrapper[4475]: I1203 06:58:57.341386 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/084fe7c1-e14b-4a01-8f56-5b0e545b9888-ring-data-devices\") pod \"swift-ring-rebalance-c2ch7\" (UID: \"084fe7c1-e14b-4a01-8f56-5b0e545b9888\") " pod="openstack/swift-ring-rebalance-c2ch7" Dec 03 06:58:57 crc kubenswrapper[4475]: I1203 06:58:57.442582 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5fpzb\" (UniqueName: 
\"kubernetes.io/projected/084fe7c1-e14b-4a01-8f56-5b0e545b9888-kube-api-access-5fpzb\") pod \"swift-ring-rebalance-c2ch7\" (UID: \"084fe7c1-e14b-4a01-8f56-5b0e545b9888\") " pod="openstack/swift-ring-rebalance-c2ch7" Dec 03 06:58:57 crc kubenswrapper[4475]: I1203 06:58:57.442649 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/084fe7c1-e14b-4a01-8f56-5b0e545b9888-dispersionconf\") pod \"swift-ring-rebalance-c2ch7\" (UID: \"084fe7c1-e14b-4a01-8f56-5b0e545b9888\") " pod="openstack/swift-ring-rebalance-c2ch7" Dec 03 06:58:57 crc kubenswrapper[4475]: I1203 06:58:57.442669 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/084fe7c1-e14b-4a01-8f56-5b0e545b9888-etc-swift\") pod \"swift-ring-rebalance-c2ch7\" (UID: \"084fe7c1-e14b-4a01-8f56-5b0e545b9888\") " pod="openstack/swift-ring-rebalance-c2ch7" Dec 03 06:58:57 crc kubenswrapper[4475]: I1203 06:58:57.442702 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/084fe7c1-e14b-4a01-8f56-5b0e545b9888-scripts\") pod \"swift-ring-rebalance-c2ch7\" (UID: \"084fe7c1-e14b-4a01-8f56-5b0e545b9888\") " pod="openstack/swift-ring-rebalance-c2ch7" Dec 03 06:58:57 crc kubenswrapper[4475]: I1203 06:58:57.442720 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/4b95df68-1a9d-403e-ab8f-87335fd821fe-etc-swift\") pod \"swift-storage-0\" (UID: \"4b95df68-1a9d-403e-ab8f-87335fd821fe\") " pod="openstack/swift-storage-0" Dec 03 06:58:57 crc kubenswrapper[4475]: I1203 06:58:57.442748 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/084fe7c1-e14b-4a01-8f56-5b0e545b9888-combined-ca-bundle\") pod \"swift-ring-rebalance-c2ch7\" (UID: 
\"084fe7c1-e14b-4a01-8f56-5b0e545b9888\") " pod="openstack/swift-ring-rebalance-c2ch7" Dec 03 06:58:57 crc kubenswrapper[4475]: I1203 06:58:57.442795 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/084fe7c1-e14b-4a01-8f56-5b0e545b9888-swiftconf\") pod \"swift-ring-rebalance-c2ch7\" (UID: \"084fe7c1-e14b-4a01-8f56-5b0e545b9888\") " pod="openstack/swift-ring-rebalance-c2ch7" Dec 03 06:58:57 crc kubenswrapper[4475]: I1203 06:58:57.442826 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/084fe7c1-e14b-4a01-8f56-5b0e545b9888-ring-data-devices\") pod \"swift-ring-rebalance-c2ch7\" (UID: \"084fe7c1-e14b-4a01-8f56-5b0e545b9888\") " pod="openstack/swift-ring-rebalance-c2ch7" Dec 03 06:58:57 crc kubenswrapper[4475]: E1203 06:58:57.443254 4475 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Dec 03 06:58:57 crc kubenswrapper[4475]: E1203 06:58:57.443283 4475 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Dec 03 06:58:57 crc kubenswrapper[4475]: E1203 06:58:57.443334 4475 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/4b95df68-1a9d-403e-ab8f-87335fd821fe-etc-swift podName:4b95df68-1a9d-403e-ab8f-87335fd821fe nodeName:}" failed. No retries permitted until 2025-12-03 06:58:58.443317132 +0000 UTC m=+823.248215466 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/4b95df68-1a9d-403e-ab8f-87335fd821fe-etc-swift") pod "swift-storage-0" (UID: "4b95df68-1a9d-403e-ab8f-87335fd821fe") : configmap "swift-ring-files" not found Dec 03 06:58:57 crc kubenswrapper[4475]: I1203 06:58:57.443481 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/084fe7c1-e14b-4a01-8f56-5b0e545b9888-etc-swift\") pod \"swift-ring-rebalance-c2ch7\" (UID: \"084fe7c1-e14b-4a01-8f56-5b0e545b9888\") " pod="openstack/swift-ring-rebalance-c2ch7" Dec 03 06:58:57 crc kubenswrapper[4475]: I1203 06:58:57.443592 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/084fe7c1-e14b-4a01-8f56-5b0e545b9888-ring-data-devices\") pod \"swift-ring-rebalance-c2ch7\" (UID: \"084fe7c1-e14b-4a01-8f56-5b0e545b9888\") " pod="openstack/swift-ring-rebalance-c2ch7" Dec 03 06:58:57 crc kubenswrapper[4475]: I1203 06:58:57.443592 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/084fe7c1-e14b-4a01-8f56-5b0e545b9888-scripts\") pod \"swift-ring-rebalance-c2ch7\" (UID: \"084fe7c1-e14b-4a01-8f56-5b0e545b9888\") " pod="openstack/swift-ring-rebalance-c2ch7" Dec 03 06:58:57 crc kubenswrapper[4475]: I1203 06:58:57.446732 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/084fe7c1-e14b-4a01-8f56-5b0e545b9888-dispersionconf\") pod \"swift-ring-rebalance-c2ch7\" (UID: \"084fe7c1-e14b-4a01-8f56-5b0e545b9888\") " pod="openstack/swift-ring-rebalance-c2ch7" Dec 03 06:58:57 crc kubenswrapper[4475]: I1203 06:58:57.446831 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/084fe7c1-e14b-4a01-8f56-5b0e545b9888-swiftconf\") pod 
\"swift-ring-rebalance-c2ch7\" (UID: \"084fe7c1-e14b-4a01-8f56-5b0e545b9888\") " pod="openstack/swift-ring-rebalance-c2ch7" Dec 03 06:58:57 crc kubenswrapper[4475]: I1203 06:58:57.447023 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/084fe7c1-e14b-4a01-8f56-5b0e545b9888-combined-ca-bundle\") pod \"swift-ring-rebalance-c2ch7\" (UID: \"084fe7c1-e14b-4a01-8f56-5b0e545b9888\") " pod="openstack/swift-ring-rebalance-c2ch7" Dec 03 06:58:57 crc kubenswrapper[4475]: I1203 06:58:57.456030 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5fpzb\" (UniqueName: \"kubernetes.io/projected/084fe7c1-e14b-4a01-8f56-5b0e545b9888-kube-api-access-5fpzb\") pod \"swift-ring-rebalance-c2ch7\" (UID: \"084fe7c1-e14b-4a01-8f56-5b0e545b9888\") " pod="openstack/swift-ring-rebalance-c2ch7" Dec 03 06:58:57 crc kubenswrapper[4475]: I1203 06:58:57.500274 4475 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-ring-rebalance-c2ch7" Dec 03 06:58:57 crc kubenswrapper[4475]: I1203 06:58:57.859161 4475 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-ring-rebalance-c2ch7"] Dec 03 06:58:58 crc kubenswrapper[4475]: I1203 06:58:58.194303 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-c2ch7" event={"ID":"084fe7c1-e14b-4a01-8f56-5b0e545b9888","Type":"ContainerStarted","Data":"19e2ceea10ca96be27b9fbe95a8c77cc3c23b2581d1a53757347fd45c26e6edd"} Dec 03 06:58:58 crc kubenswrapper[4475]: I1203 06:58:58.196360 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6476ddd6b5-ppjlc" event={"ID":"87fdba6a-e11c-4bbd-becc-78999065efa8","Type":"ContainerStarted","Data":"c9d9f24e460f40acbdabd6a1c9925db16f56c53012d32344327cd5e2a4943c14"} Dec 03 06:58:58 crc kubenswrapper[4475]: I1203 06:58:58.211706 4475 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-6476ddd6b5-ppjlc" podStartSLOduration=3.21169379 podStartE2EDuration="3.21169379s" podCreationTimestamp="2025-12-03 06:58:55 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 06:58:58.206236415 +0000 UTC m=+823.011134749" watchObservedRunningTime="2025-12-03 06:58:58.21169379 +0000 UTC m=+823.016592124" Dec 03 06:58:58 crc kubenswrapper[4475]: I1203 06:58:58.455671 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/4b95df68-1a9d-403e-ab8f-87335fd821fe-etc-swift\") pod \"swift-storage-0\" (UID: \"4b95df68-1a9d-403e-ab8f-87335fd821fe\") " pod="openstack/swift-storage-0" Dec 03 06:58:58 crc kubenswrapper[4475]: E1203 06:58:58.455797 4475 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Dec 03 06:58:58 crc kubenswrapper[4475]: E1203 
06:58:58.455817 4475 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Dec 03 06:58:58 crc kubenswrapper[4475]: E1203 06:58:58.455859 4475 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/4b95df68-1a9d-403e-ab8f-87335fd821fe-etc-swift podName:4b95df68-1a9d-403e-ab8f-87335fd821fe nodeName:}" failed. No retries permitted until 2025-12-03 06:59:00.455846981 +0000 UTC m=+825.260745315 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/4b95df68-1a9d-403e-ab8f-87335fd821fe-etc-swift") pod "swift-storage-0" (UID: "4b95df68-1a9d-403e-ab8f-87335fd821fe") : configmap "swift-ring-files" not found Dec 03 06:58:58 crc kubenswrapper[4475]: I1203 06:58:58.933856 4475 patch_prober.go:28] interesting pod/machine-config-daemon-tjbzg container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 03 06:58:58 crc kubenswrapper[4475]: I1203 06:58:58.934086 4475 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-tjbzg" podUID="91aee7be-4a52-4598-803f-2deebe0674de" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 03 06:58:58 crc kubenswrapper[4475]: I1203 06:58:58.943945 4475 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-db-create-nvtnr"] Dec 03 06:58:58 crc kubenswrapper[4475]: I1203 06:58:58.944822 4475 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-db-create-nvtnr" Dec 03 06:58:58 crc kubenswrapper[4475]: I1203 06:58:58.954316 4475 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-create-nvtnr"] Dec 03 06:58:59 crc kubenswrapper[4475]: I1203 06:58:59.062143 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n6p4t\" (UniqueName: \"kubernetes.io/projected/95381607-f11e-478e-b1fc-7ea77035f03f-kube-api-access-n6p4t\") pod \"glance-db-create-nvtnr\" (UID: \"95381607-f11e-478e-b1fc-7ea77035f03f\") " pod="openstack/glance-db-create-nvtnr" Dec 03 06:58:59 crc kubenswrapper[4475]: I1203 06:58:59.062665 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/95381607-f11e-478e-b1fc-7ea77035f03f-operator-scripts\") pod \"glance-db-create-nvtnr\" (UID: \"95381607-f11e-478e-b1fc-7ea77035f03f\") " pod="openstack/glance-db-create-nvtnr" Dec 03 06:58:59 crc kubenswrapper[4475]: I1203 06:58:59.067504 4475 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-b67d-account-create-update-xx754"] Dec 03 06:58:59 crc kubenswrapper[4475]: I1203 06:58:59.068290 4475 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-b67d-account-create-update-xx754" Dec 03 06:58:59 crc kubenswrapper[4475]: I1203 06:58:59.070929 4475 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-db-secret" Dec 03 06:58:59 crc kubenswrapper[4475]: I1203 06:58:59.085279 4475 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-b67d-account-create-update-xx754"] Dec 03 06:58:59 crc kubenswrapper[4475]: I1203 06:58:59.164282 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/95381607-f11e-478e-b1fc-7ea77035f03f-operator-scripts\") pod \"glance-db-create-nvtnr\" (UID: \"95381607-f11e-478e-b1fc-7ea77035f03f\") " pod="openstack/glance-db-create-nvtnr" Dec 03 06:58:59 crc kubenswrapper[4475]: I1203 06:58:59.164571 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-snrzp\" (UniqueName: \"kubernetes.io/projected/45c3aef8-81fe-4320-afbd-c83eb00e861a-kube-api-access-snrzp\") pod \"glance-b67d-account-create-update-xx754\" (UID: \"45c3aef8-81fe-4320-afbd-c83eb00e861a\") " pod="openstack/glance-b67d-account-create-update-xx754" Dec 03 06:58:59 crc kubenswrapper[4475]: I1203 06:58:59.164664 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-n6p4t\" (UniqueName: \"kubernetes.io/projected/95381607-f11e-478e-b1fc-7ea77035f03f-kube-api-access-n6p4t\") pod \"glance-db-create-nvtnr\" (UID: \"95381607-f11e-478e-b1fc-7ea77035f03f\") " pod="openstack/glance-db-create-nvtnr" Dec 03 06:58:59 crc kubenswrapper[4475]: I1203 06:58:59.164696 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/45c3aef8-81fe-4320-afbd-c83eb00e861a-operator-scripts\") pod \"glance-b67d-account-create-update-xx754\" (UID: 
\"45c3aef8-81fe-4320-afbd-c83eb00e861a\") " pod="openstack/glance-b67d-account-create-update-xx754" Dec 03 06:58:59 crc kubenswrapper[4475]: I1203 06:58:59.164971 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/95381607-f11e-478e-b1fc-7ea77035f03f-operator-scripts\") pod \"glance-db-create-nvtnr\" (UID: \"95381607-f11e-478e-b1fc-7ea77035f03f\") " pod="openstack/glance-db-create-nvtnr" Dec 03 06:58:59 crc kubenswrapper[4475]: I1203 06:58:59.183873 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-n6p4t\" (UniqueName: \"kubernetes.io/projected/95381607-f11e-478e-b1fc-7ea77035f03f-kube-api-access-n6p4t\") pod \"glance-db-create-nvtnr\" (UID: \"95381607-f11e-478e-b1fc-7ea77035f03f\") " pod="openstack/glance-db-create-nvtnr" Dec 03 06:58:59 crc kubenswrapper[4475]: I1203 06:58:59.203962 4475 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-6476ddd6b5-ppjlc" Dec 03 06:58:59 crc kubenswrapper[4475]: I1203 06:58:59.258361 4475 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-db-create-nvtnr" Dec 03 06:58:59 crc kubenswrapper[4475]: I1203 06:58:59.265487 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/45c3aef8-81fe-4320-afbd-c83eb00e861a-operator-scripts\") pod \"glance-b67d-account-create-update-xx754\" (UID: \"45c3aef8-81fe-4320-afbd-c83eb00e861a\") " pod="openstack/glance-b67d-account-create-update-xx754" Dec 03 06:58:59 crc kubenswrapper[4475]: I1203 06:58:59.265583 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-snrzp\" (UniqueName: \"kubernetes.io/projected/45c3aef8-81fe-4320-afbd-c83eb00e861a-kube-api-access-snrzp\") pod \"glance-b67d-account-create-update-xx754\" (UID: \"45c3aef8-81fe-4320-afbd-c83eb00e861a\") " pod="openstack/glance-b67d-account-create-update-xx754" Dec 03 06:58:59 crc kubenswrapper[4475]: I1203 06:58:59.266383 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/45c3aef8-81fe-4320-afbd-c83eb00e861a-operator-scripts\") pod \"glance-b67d-account-create-update-xx754\" (UID: \"45c3aef8-81fe-4320-afbd-c83eb00e861a\") " pod="openstack/glance-b67d-account-create-update-xx754" Dec 03 06:58:59 crc kubenswrapper[4475]: I1203 06:58:59.278503 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-snrzp\" (UniqueName: \"kubernetes.io/projected/45c3aef8-81fe-4320-afbd-c83eb00e861a-kube-api-access-snrzp\") pod \"glance-b67d-account-create-update-xx754\" (UID: \"45c3aef8-81fe-4320-afbd-c83eb00e861a\") " pod="openstack/glance-b67d-account-create-update-xx754" Dec 03 06:58:59 crc kubenswrapper[4475]: I1203 06:58:59.379559 4475 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-b67d-account-create-update-xx754" Dec 03 06:58:59 crc kubenswrapper[4475]: I1203 06:58:59.633585 4475 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-create-nvtnr"] Dec 03 06:58:59 crc kubenswrapper[4475]: W1203 06:58:59.641571 4475 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod95381607_f11e_478e_b1fc_7ea77035f03f.slice/crio-ea227f075f6f8d1de59c4e4d936490639a2f75f784f05305f179809cb71ab46c WatchSource:0}: Error finding container ea227f075f6f8d1de59c4e4d936490639a2f75f784f05305f179809cb71ab46c: Status 404 returned error can't find the container with id ea227f075f6f8d1de59c4e4d936490639a2f75f784f05305f179809cb71ab46c Dec 03 06:58:59 crc kubenswrapper[4475]: I1203 06:58:59.775600 4475 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-b67d-account-create-update-xx754"] Dec 03 06:58:59 crc kubenswrapper[4475]: W1203 06:58:59.778012 4475 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod45c3aef8_81fe_4320_afbd_c83eb00e861a.slice/crio-7aa446904e4ebe67d0f56252ff43d85581ee89001d40ebd469b74da709fcee8a WatchSource:0}: Error finding container 7aa446904e4ebe67d0f56252ff43d85581ee89001d40ebd469b74da709fcee8a: Status 404 returned error can't find the container with id 7aa446904e4ebe67d0f56252ff43d85581ee89001d40ebd469b74da709fcee8a Dec 03 06:59:00 crc kubenswrapper[4475]: I1203 06:59:00.211516 4475 generic.go:334] "Generic (PLEG): container finished" podID="45c3aef8-81fe-4320-afbd-c83eb00e861a" containerID="7504efc45e27104247e8090104bccba3a0eb453488cfda69abdd899d8b13c4bc" exitCode=0 Dec 03 06:59:00 crc kubenswrapper[4475]: I1203 06:59:00.211557 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-b67d-account-create-update-xx754" 
event={"ID":"45c3aef8-81fe-4320-afbd-c83eb00e861a","Type":"ContainerDied","Data":"7504efc45e27104247e8090104bccba3a0eb453488cfda69abdd899d8b13c4bc"} Dec 03 06:59:00 crc kubenswrapper[4475]: I1203 06:59:00.211594 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-b67d-account-create-update-xx754" event={"ID":"45c3aef8-81fe-4320-afbd-c83eb00e861a","Type":"ContainerStarted","Data":"7aa446904e4ebe67d0f56252ff43d85581ee89001d40ebd469b74da709fcee8a"} Dec 03 06:59:00 crc kubenswrapper[4475]: I1203 06:59:00.214488 4475 generic.go:334] "Generic (PLEG): container finished" podID="95381607-f11e-478e-b1fc-7ea77035f03f" containerID="3a98eb016a766e7141bf0c937f80c29768bca8ce334972bcb076f1ee0d04a8c5" exitCode=0 Dec 03 06:59:00 crc kubenswrapper[4475]: I1203 06:59:00.215277 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-create-nvtnr" event={"ID":"95381607-f11e-478e-b1fc-7ea77035f03f","Type":"ContainerDied","Data":"3a98eb016a766e7141bf0c937f80c29768bca8ce334972bcb076f1ee0d04a8c5"} Dec 03 06:59:00 crc kubenswrapper[4475]: I1203 06:59:00.215302 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-create-nvtnr" event={"ID":"95381607-f11e-478e-b1fc-7ea77035f03f","Type":"ContainerStarted","Data":"ea227f075f6f8d1de59c4e4d936490639a2f75f784f05305f179809cb71ab46c"} Dec 03 06:59:00 crc kubenswrapper[4475]: I1203 06:59:00.487330 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/4b95df68-1a9d-403e-ab8f-87335fd821fe-etc-swift\") pod \"swift-storage-0\" (UID: \"4b95df68-1a9d-403e-ab8f-87335fd821fe\") " pod="openstack/swift-storage-0" Dec 03 06:59:00 crc kubenswrapper[4475]: E1203 06:59:00.487508 4475 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Dec 03 06:59:00 crc kubenswrapper[4475]: E1203 06:59:00.487529 4475 projected.go:194] Error preparing data for projected 
volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Dec 03 06:59:00 crc kubenswrapper[4475]: E1203 06:59:00.487568 4475 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/4b95df68-1a9d-403e-ab8f-87335fd821fe-etc-swift podName:4b95df68-1a9d-403e-ab8f-87335fd821fe nodeName:}" failed. No retries permitted until 2025-12-03 06:59:04.487555518 +0000 UTC m=+829.292453852 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/4b95df68-1a9d-403e-ab8f-87335fd821fe-etc-swift") pod "swift-storage-0" (UID: "4b95df68-1a9d-403e-ab8f-87335fd821fe") : configmap "swift-ring-files" not found Dec 03 06:59:02 crc kubenswrapper[4475]: I1203 06:59:02.171126 4475 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-create-nvtnr" Dec 03 06:59:02 crc kubenswrapper[4475]: I1203 06:59:02.189028 4475 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-b67d-account-create-update-xx754" Dec 03 06:59:02 crc kubenswrapper[4475]: I1203 06:59:02.212695 4475 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/45c3aef8-81fe-4320-afbd-c83eb00e861a-operator-scripts\") pod \"45c3aef8-81fe-4320-afbd-c83eb00e861a\" (UID: \"45c3aef8-81fe-4320-afbd-c83eb00e861a\") " Dec 03 06:59:02 crc kubenswrapper[4475]: I1203 06:59:02.212733 4475 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/95381607-f11e-478e-b1fc-7ea77035f03f-operator-scripts\") pod \"95381607-f11e-478e-b1fc-7ea77035f03f\" (UID: \"95381607-f11e-478e-b1fc-7ea77035f03f\") " Dec 03 06:59:02 crc kubenswrapper[4475]: I1203 06:59:02.212761 4475 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-n6p4t\" (UniqueName: \"kubernetes.io/projected/95381607-f11e-478e-b1fc-7ea77035f03f-kube-api-access-n6p4t\") pod \"95381607-f11e-478e-b1fc-7ea77035f03f\" (UID: \"95381607-f11e-478e-b1fc-7ea77035f03f\") " Dec 03 06:59:02 crc kubenswrapper[4475]: I1203 06:59:02.212790 4475 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-snrzp\" (UniqueName: \"kubernetes.io/projected/45c3aef8-81fe-4320-afbd-c83eb00e861a-kube-api-access-snrzp\") pod \"45c3aef8-81fe-4320-afbd-c83eb00e861a\" (UID: \"45c3aef8-81fe-4320-afbd-c83eb00e861a\") " Dec 03 06:59:02 crc kubenswrapper[4475]: I1203 06:59:02.213186 4475 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/45c3aef8-81fe-4320-afbd-c83eb00e861a-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "45c3aef8-81fe-4320-afbd-c83eb00e861a" (UID: "45c3aef8-81fe-4320-afbd-c83eb00e861a"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 06:59:02 crc kubenswrapper[4475]: I1203 06:59:02.213219 4475 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/95381607-f11e-478e-b1fc-7ea77035f03f-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "95381607-f11e-478e-b1fc-7ea77035f03f" (UID: "95381607-f11e-478e-b1fc-7ea77035f03f"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 06:59:02 crc kubenswrapper[4475]: I1203 06:59:02.218599 4475 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/45c3aef8-81fe-4320-afbd-c83eb00e861a-kube-api-access-snrzp" (OuterVolumeSpecName: "kube-api-access-snrzp") pod "45c3aef8-81fe-4320-afbd-c83eb00e861a" (UID: "45c3aef8-81fe-4320-afbd-c83eb00e861a"). InnerVolumeSpecName "kube-api-access-snrzp". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 06:59:02 crc kubenswrapper[4475]: I1203 06:59:02.221752 4475 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/95381607-f11e-478e-b1fc-7ea77035f03f-kube-api-access-n6p4t" (OuterVolumeSpecName: "kube-api-access-n6p4t") pod "95381607-f11e-478e-b1fc-7ea77035f03f" (UID: "95381607-f11e-478e-b1fc-7ea77035f03f"). InnerVolumeSpecName "kube-api-access-n6p4t". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 06:59:02 crc kubenswrapper[4475]: I1203 06:59:02.241288 4475 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-b67d-account-create-update-xx754" Dec 03 06:59:02 crc kubenswrapper[4475]: I1203 06:59:02.241326 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-b67d-account-create-update-xx754" event={"ID":"45c3aef8-81fe-4320-afbd-c83eb00e861a","Type":"ContainerDied","Data":"7aa446904e4ebe67d0f56252ff43d85581ee89001d40ebd469b74da709fcee8a"} Dec 03 06:59:02 crc kubenswrapper[4475]: I1203 06:59:02.241354 4475 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="7aa446904e4ebe67d0f56252ff43d85581ee89001d40ebd469b74da709fcee8a" Dec 03 06:59:02 crc kubenswrapper[4475]: I1203 06:59:02.242750 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-create-nvtnr" event={"ID":"95381607-f11e-478e-b1fc-7ea77035f03f","Type":"ContainerDied","Data":"ea227f075f6f8d1de59c4e4d936490639a2f75f784f05305f179809cb71ab46c"} Dec 03 06:59:02 crc kubenswrapper[4475]: I1203 06:59:02.242768 4475 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ea227f075f6f8d1de59c4e4d936490639a2f75f784f05305f179809cb71ab46c" Dec 03 06:59:02 crc kubenswrapper[4475]: I1203 06:59:02.242834 4475 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-db-create-nvtnr" Dec 03 06:59:02 crc kubenswrapper[4475]: I1203 06:59:02.314715 4475 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/45c3aef8-81fe-4320-afbd-c83eb00e861a-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 03 06:59:02 crc kubenswrapper[4475]: I1203 06:59:02.314739 4475 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/95381607-f11e-478e-b1fc-7ea77035f03f-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 03 06:59:02 crc kubenswrapper[4475]: I1203 06:59:02.314751 4475 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-n6p4t\" (UniqueName: \"kubernetes.io/projected/95381607-f11e-478e-b1fc-7ea77035f03f-kube-api-access-n6p4t\") on node \"crc\" DevicePath \"\"" Dec 03 06:59:02 crc kubenswrapper[4475]: I1203 06:59:02.314759 4475 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-snrzp\" (UniqueName: \"kubernetes.io/projected/45c3aef8-81fe-4320-afbd-c83eb00e861a-kube-api-access-snrzp\") on node \"crc\" DevicePath \"\"" Dec 03 06:59:03 crc kubenswrapper[4475]: I1203 06:59:03.248165 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-c2ch7" event={"ID":"084fe7c1-e14b-4a01-8f56-5b0e545b9888","Type":"ContainerStarted","Data":"5a215c949522c8310ad42d865f0c169e2f63dc357cca84cf0df7ddcd4873ffbe"} Dec 03 06:59:03 crc kubenswrapper[4475]: I1203 06:59:03.263554 4475 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/swift-ring-rebalance-c2ch7" podStartSLOduration=2.11790945 podStartE2EDuration="6.263542791s" podCreationTimestamp="2025-12-03 06:58:57 +0000 UTC" firstStartedPulling="2025-12-03 06:58:57.878750745 +0000 UTC m=+822.683649079" lastFinishedPulling="2025-12-03 06:59:02.024384085 +0000 UTC m=+826.829282420" observedRunningTime="2025-12-03 06:59:03.258206293 +0000 
UTC m=+828.063104627" watchObservedRunningTime="2025-12-03 06:59:03.263542791 +0000 UTC m=+828.068441125" Dec 03 06:59:03 crc kubenswrapper[4475]: I1203 06:59:03.274381 4475 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-db-create-5b8fz"] Dec 03 06:59:03 crc kubenswrapper[4475]: E1203 06:59:03.274650 4475 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="95381607-f11e-478e-b1fc-7ea77035f03f" containerName="mariadb-database-create" Dec 03 06:59:03 crc kubenswrapper[4475]: I1203 06:59:03.274667 4475 state_mem.go:107] "Deleted CPUSet assignment" podUID="95381607-f11e-478e-b1fc-7ea77035f03f" containerName="mariadb-database-create" Dec 03 06:59:03 crc kubenswrapper[4475]: E1203 06:59:03.274683 4475 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="45c3aef8-81fe-4320-afbd-c83eb00e861a" containerName="mariadb-account-create-update" Dec 03 06:59:03 crc kubenswrapper[4475]: I1203 06:59:03.274688 4475 state_mem.go:107] "Deleted CPUSet assignment" podUID="45c3aef8-81fe-4320-afbd-c83eb00e861a" containerName="mariadb-account-create-update" Dec 03 06:59:03 crc kubenswrapper[4475]: I1203 06:59:03.274842 4475 memory_manager.go:354] "RemoveStaleState removing state" podUID="45c3aef8-81fe-4320-afbd-c83eb00e861a" containerName="mariadb-account-create-update" Dec 03 06:59:03 crc kubenswrapper[4475]: I1203 06:59:03.274856 4475 memory_manager.go:354] "RemoveStaleState removing state" podUID="95381607-f11e-478e-b1fc-7ea77035f03f" containerName="mariadb-database-create" Dec 03 06:59:03 crc kubenswrapper[4475]: I1203 06:59:03.275274 4475 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-db-create-5b8fz" Dec 03 06:59:03 crc kubenswrapper[4475]: I1203 06:59:03.281368 4475 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-create-5b8fz"] Dec 03 06:59:03 crc kubenswrapper[4475]: I1203 06:59:03.328242 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z9gm8\" (UniqueName: \"kubernetes.io/projected/e76aadcc-0222-442c-8c9a-d0b197c92978-kube-api-access-z9gm8\") pod \"keystone-db-create-5b8fz\" (UID: \"e76aadcc-0222-442c-8c9a-d0b197c92978\") " pod="openstack/keystone-db-create-5b8fz" Dec 03 06:59:03 crc kubenswrapper[4475]: I1203 06:59:03.328339 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e76aadcc-0222-442c-8c9a-d0b197c92978-operator-scripts\") pod \"keystone-db-create-5b8fz\" (UID: \"e76aadcc-0222-442c-8c9a-d0b197c92978\") " pod="openstack/keystone-db-create-5b8fz" Dec 03 06:59:03 crc kubenswrapper[4475]: I1203 06:59:03.379813 4475 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-3d6e-account-create-update-gtfgv"] Dec 03 06:59:03 crc kubenswrapper[4475]: I1203 06:59:03.381216 4475 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-3d6e-account-create-update-gtfgv" Dec 03 06:59:03 crc kubenswrapper[4475]: I1203 06:59:03.382838 4475 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-db-secret" Dec 03 06:59:03 crc kubenswrapper[4475]: I1203 06:59:03.405471 4475 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-3d6e-account-create-update-gtfgv"] Dec 03 06:59:03 crc kubenswrapper[4475]: I1203 06:59:03.429189 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f131782f-4d8b-48fb-9eff-1a2c1f7b859b-operator-scripts\") pod \"keystone-3d6e-account-create-update-gtfgv\" (UID: \"f131782f-4d8b-48fb-9eff-1a2c1f7b859b\") " pod="openstack/keystone-3d6e-account-create-update-gtfgv" Dec 03 06:59:03 crc kubenswrapper[4475]: I1203 06:59:03.429302 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-z9gm8\" (UniqueName: \"kubernetes.io/projected/e76aadcc-0222-442c-8c9a-d0b197c92978-kube-api-access-z9gm8\") pod \"keystone-db-create-5b8fz\" (UID: \"e76aadcc-0222-442c-8c9a-d0b197c92978\") " pod="openstack/keystone-db-create-5b8fz" Dec 03 06:59:03 crc kubenswrapper[4475]: I1203 06:59:03.429405 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lrv7p\" (UniqueName: \"kubernetes.io/projected/f131782f-4d8b-48fb-9eff-1a2c1f7b859b-kube-api-access-lrv7p\") pod \"keystone-3d6e-account-create-update-gtfgv\" (UID: \"f131782f-4d8b-48fb-9eff-1a2c1f7b859b\") " pod="openstack/keystone-3d6e-account-create-update-gtfgv" Dec 03 06:59:03 crc kubenswrapper[4475]: I1203 06:59:03.429560 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e76aadcc-0222-442c-8c9a-d0b197c92978-operator-scripts\") pod \"keystone-db-create-5b8fz\" 
(UID: \"e76aadcc-0222-442c-8c9a-d0b197c92978\") " pod="openstack/keystone-db-create-5b8fz" Dec 03 06:59:03 crc kubenswrapper[4475]: I1203 06:59:03.430310 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e76aadcc-0222-442c-8c9a-d0b197c92978-operator-scripts\") pod \"keystone-db-create-5b8fz\" (UID: \"e76aadcc-0222-442c-8c9a-d0b197c92978\") " pod="openstack/keystone-db-create-5b8fz" Dec 03 06:59:03 crc kubenswrapper[4475]: I1203 06:59:03.448505 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-z9gm8\" (UniqueName: \"kubernetes.io/projected/e76aadcc-0222-442c-8c9a-d0b197c92978-kube-api-access-z9gm8\") pod \"keystone-db-create-5b8fz\" (UID: \"e76aadcc-0222-442c-8c9a-d0b197c92978\") " pod="openstack/keystone-db-create-5b8fz" Dec 03 06:59:03 crc kubenswrapper[4475]: I1203 06:59:03.531536 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f131782f-4d8b-48fb-9eff-1a2c1f7b859b-operator-scripts\") pod \"keystone-3d6e-account-create-update-gtfgv\" (UID: \"f131782f-4d8b-48fb-9eff-1a2c1f7b859b\") " pod="openstack/keystone-3d6e-account-create-update-gtfgv" Dec 03 06:59:03 crc kubenswrapper[4475]: I1203 06:59:03.531843 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lrv7p\" (UniqueName: \"kubernetes.io/projected/f131782f-4d8b-48fb-9eff-1a2c1f7b859b-kube-api-access-lrv7p\") pod \"keystone-3d6e-account-create-update-gtfgv\" (UID: \"f131782f-4d8b-48fb-9eff-1a2c1f7b859b\") " pod="openstack/keystone-3d6e-account-create-update-gtfgv" Dec 03 06:59:03 crc kubenswrapper[4475]: I1203 06:59:03.533220 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f131782f-4d8b-48fb-9eff-1a2c1f7b859b-operator-scripts\") pod 
\"keystone-3d6e-account-create-update-gtfgv\" (UID: \"f131782f-4d8b-48fb-9eff-1a2c1f7b859b\") " pod="openstack/keystone-3d6e-account-create-update-gtfgv" Dec 03 06:59:03 crc kubenswrapper[4475]: I1203 06:59:03.562079 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lrv7p\" (UniqueName: \"kubernetes.io/projected/f131782f-4d8b-48fb-9eff-1a2c1f7b859b-kube-api-access-lrv7p\") pod \"keystone-3d6e-account-create-update-gtfgv\" (UID: \"f131782f-4d8b-48fb-9eff-1a2c1f7b859b\") " pod="openstack/keystone-3d6e-account-create-update-gtfgv" Dec 03 06:59:03 crc kubenswrapper[4475]: I1203 06:59:03.588365 4475 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-create-5b8fz" Dec 03 06:59:03 crc kubenswrapper[4475]: I1203 06:59:03.634418 4475 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-db-create-6dfhs"] Dec 03 06:59:03 crc kubenswrapper[4475]: I1203 06:59:03.635717 4475 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-create-6dfhs" Dec 03 06:59:03 crc kubenswrapper[4475]: I1203 06:59:03.652863 4475 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-create-6dfhs"] Dec 03 06:59:03 crc kubenswrapper[4475]: I1203 06:59:03.706088 4475 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-3d6e-account-create-update-gtfgv" Dec 03 06:59:03 crc kubenswrapper[4475]: I1203 06:59:03.738963 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/9276d039-20eb-4651-b987-97f393cbc59a-operator-scripts\") pod \"placement-db-create-6dfhs\" (UID: \"9276d039-20eb-4651-b987-97f393cbc59a\") " pod="openstack/placement-db-create-6dfhs" Dec 03 06:59:03 crc kubenswrapper[4475]: I1203 06:59:03.739140 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kpwqt\" (UniqueName: \"kubernetes.io/projected/9276d039-20eb-4651-b987-97f393cbc59a-kube-api-access-kpwqt\") pod \"placement-db-create-6dfhs\" (UID: \"9276d039-20eb-4651-b987-97f393cbc59a\") " pod="openstack/placement-db-create-6dfhs" Dec 03 06:59:03 crc kubenswrapper[4475]: I1203 06:59:03.749942 4475 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-33d5-account-create-update-h4hzd"] Dec 03 06:59:03 crc kubenswrapper[4475]: I1203 06:59:03.751351 4475 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-33d5-account-create-update-h4hzd" Dec 03 06:59:03 crc kubenswrapper[4475]: I1203 06:59:03.753709 4475 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-db-secret" Dec 03 06:59:03 crc kubenswrapper[4475]: I1203 06:59:03.757197 4475 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-33d5-account-create-update-h4hzd"] Dec 03 06:59:03 crc kubenswrapper[4475]: I1203 06:59:03.841126 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/1b649c00-045b-474a-aa22-71e06f5d454f-operator-scripts\") pod \"placement-33d5-account-create-update-h4hzd\" (UID: \"1b649c00-045b-474a-aa22-71e06f5d454f\") " pod="openstack/placement-33d5-account-create-update-h4hzd" Dec 03 06:59:03 crc kubenswrapper[4475]: I1203 06:59:03.841462 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9txsk\" (UniqueName: \"kubernetes.io/projected/1b649c00-045b-474a-aa22-71e06f5d454f-kube-api-access-9txsk\") pod \"placement-33d5-account-create-update-h4hzd\" (UID: \"1b649c00-045b-474a-aa22-71e06f5d454f\") " pod="openstack/placement-33d5-account-create-update-h4hzd" Dec 03 06:59:03 crc kubenswrapper[4475]: I1203 06:59:03.841514 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kpwqt\" (UniqueName: \"kubernetes.io/projected/9276d039-20eb-4651-b987-97f393cbc59a-kube-api-access-kpwqt\") pod \"placement-db-create-6dfhs\" (UID: \"9276d039-20eb-4651-b987-97f393cbc59a\") " pod="openstack/placement-db-create-6dfhs" Dec 03 06:59:03 crc kubenswrapper[4475]: I1203 06:59:03.841729 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/9276d039-20eb-4651-b987-97f393cbc59a-operator-scripts\") pod 
\"placement-db-create-6dfhs\" (UID: \"9276d039-20eb-4651-b987-97f393cbc59a\") " pod="openstack/placement-db-create-6dfhs" Dec 03 06:59:03 crc kubenswrapper[4475]: I1203 06:59:03.842361 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/9276d039-20eb-4651-b987-97f393cbc59a-operator-scripts\") pod \"placement-db-create-6dfhs\" (UID: \"9276d039-20eb-4651-b987-97f393cbc59a\") " pod="openstack/placement-db-create-6dfhs" Dec 03 06:59:03 crc kubenswrapper[4475]: I1203 06:59:03.856184 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kpwqt\" (UniqueName: \"kubernetes.io/projected/9276d039-20eb-4651-b987-97f393cbc59a-kube-api-access-kpwqt\") pod \"placement-db-create-6dfhs\" (UID: \"9276d039-20eb-4651-b987-97f393cbc59a\") " pod="openstack/placement-db-create-6dfhs" Dec 03 06:59:03 crc kubenswrapper[4475]: I1203 06:59:03.945612 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/1b649c00-045b-474a-aa22-71e06f5d454f-operator-scripts\") pod \"placement-33d5-account-create-update-h4hzd\" (UID: \"1b649c00-045b-474a-aa22-71e06f5d454f\") " pod="openstack/placement-33d5-account-create-update-h4hzd" Dec 03 06:59:03 crc kubenswrapper[4475]: I1203 06:59:03.945706 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9txsk\" (UniqueName: \"kubernetes.io/projected/1b649c00-045b-474a-aa22-71e06f5d454f-kube-api-access-9txsk\") pod \"placement-33d5-account-create-update-h4hzd\" (UID: \"1b649c00-045b-474a-aa22-71e06f5d454f\") " pod="openstack/placement-33d5-account-create-update-h4hzd" Dec 03 06:59:03 crc kubenswrapper[4475]: I1203 06:59:03.946602 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/1b649c00-045b-474a-aa22-71e06f5d454f-operator-scripts\") pod 
\"placement-33d5-account-create-update-h4hzd\" (UID: \"1b649c00-045b-474a-aa22-71e06f5d454f\") " pod="openstack/placement-33d5-account-create-update-h4hzd" Dec 03 06:59:03 crc kubenswrapper[4475]: I1203 06:59:03.961878 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9txsk\" (UniqueName: \"kubernetes.io/projected/1b649c00-045b-474a-aa22-71e06f5d454f-kube-api-access-9txsk\") pod \"placement-33d5-account-create-update-h4hzd\" (UID: \"1b649c00-045b-474a-aa22-71e06f5d454f\") " pod="openstack/placement-33d5-account-create-update-h4hzd" Dec 03 06:59:03 crc kubenswrapper[4475]: I1203 06:59:03.973587 4475 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-create-6dfhs" Dec 03 06:59:04 crc kubenswrapper[4475]: I1203 06:59:04.015391 4475 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-create-5b8fz"] Dec 03 06:59:04 crc kubenswrapper[4475]: W1203 06:59:04.021978 4475 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode76aadcc_0222_442c_8c9a_d0b197c92978.slice/crio-a691d6c78330ed7b84e15661f25e1b7f3f50a8edece09b1cf5d3ddb4ceba80d1 WatchSource:0}: Error finding container a691d6c78330ed7b84e15661f25e1b7f3f50a8edece09b1cf5d3ddb4ceba80d1: Status 404 returned error can't find the container with id a691d6c78330ed7b84e15661f25e1b7f3f50a8edece09b1cf5d3ddb4ceba80d1 Dec 03 06:59:04 crc kubenswrapper[4475]: I1203 06:59:04.075232 4475 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-33d5-account-create-update-h4hzd" Dec 03 06:59:04 crc kubenswrapper[4475]: I1203 06:59:04.113422 4475 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-3d6e-account-create-update-gtfgv"] Dec 03 06:59:04 crc kubenswrapper[4475]: W1203 06:59:04.118444 4475 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf131782f_4d8b_48fb_9eff_1a2c1f7b859b.slice/crio-e4bce4960e59b7008a5c2e2e7b704c5352bcaebac87e6da63dc18ad62a7eae4d WatchSource:0}: Error finding container e4bce4960e59b7008a5c2e2e7b704c5352bcaebac87e6da63dc18ad62a7eae4d: Status 404 returned error can't find the container with id e4bce4960e59b7008a5c2e2e7b704c5352bcaebac87e6da63dc18ad62a7eae4d Dec 03 06:59:04 crc kubenswrapper[4475]: I1203 06:59:04.217967 4475 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-db-sync-psvbz"] Dec 03 06:59:04 crc kubenswrapper[4475]: I1203 06:59:04.218800 4475 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-db-sync-psvbz" Dec 03 06:59:04 crc kubenswrapper[4475]: I1203 06:59:04.224209 4475 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-glance-dockercfg-wkbtp" Dec 03 06:59:04 crc kubenswrapper[4475]: I1203 06:59:04.224368 4475 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-config-data" Dec 03 06:59:04 crc kubenswrapper[4475]: I1203 06:59:04.236047 4475 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-sync-psvbz"] Dec 03 06:59:04 crc kubenswrapper[4475]: I1203 06:59:04.260915 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-3d6e-account-create-update-gtfgv" event={"ID":"f131782f-4d8b-48fb-9eff-1a2c1f7b859b","Type":"ContainerStarted","Data":"e4bce4960e59b7008a5c2e2e7b704c5352bcaebac87e6da63dc18ad62a7eae4d"} Dec 03 06:59:04 crc kubenswrapper[4475]: I1203 06:59:04.280642 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-create-5b8fz" event={"ID":"e76aadcc-0222-442c-8c9a-d0b197c92978","Type":"ContainerStarted","Data":"a691d6c78330ed7b84e15661f25e1b7f3f50a8edece09b1cf5d3ddb4ceba80d1"} Dec 03 06:59:04 crc kubenswrapper[4475]: I1203 06:59:04.353046 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3e136d3d-bbe5-44b0-ad48-4e560507aeac-config-data\") pod \"glance-db-sync-psvbz\" (UID: \"3e136d3d-bbe5-44b0-ad48-4e560507aeac\") " pod="openstack/glance-db-sync-psvbz" Dec 03 06:59:04 crc kubenswrapper[4475]: I1203 06:59:04.353098 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3e136d3d-bbe5-44b0-ad48-4e560507aeac-combined-ca-bundle\") pod \"glance-db-sync-psvbz\" (UID: \"3e136d3d-bbe5-44b0-ad48-4e560507aeac\") " pod="openstack/glance-db-sync-psvbz" Dec 03 06:59:04 crc 
kubenswrapper[4475]: I1203 06:59:04.353119 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/3e136d3d-bbe5-44b0-ad48-4e560507aeac-db-sync-config-data\") pod \"glance-db-sync-psvbz\" (UID: \"3e136d3d-bbe5-44b0-ad48-4e560507aeac\") " pod="openstack/glance-db-sync-psvbz" Dec 03 06:59:04 crc kubenswrapper[4475]: I1203 06:59:04.353583 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5jrtc\" (UniqueName: \"kubernetes.io/projected/3e136d3d-bbe5-44b0-ad48-4e560507aeac-kube-api-access-5jrtc\") pod \"glance-db-sync-psvbz\" (UID: \"3e136d3d-bbe5-44b0-ad48-4e560507aeac\") " pod="openstack/glance-db-sync-psvbz" Dec 03 06:59:04 crc kubenswrapper[4475]: I1203 06:59:04.363636 4475 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-create-6dfhs"] Dec 03 06:59:04 crc kubenswrapper[4475]: W1203 06:59:04.365901 4475 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod9276d039_20eb_4651_b987_97f393cbc59a.slice/crio-8626344add073cd68020ca4ce784a20120f59cc699060a958810c90fecfdc181 WatchSource:0}: Error finding container 8626344add073cd68020ca4ce784a20120f59cc699060a958810c90fecfdc181: Status 404 returned error can't find the container with id 8626344add073cd68020ca4ce784a20120f59cc699060a958810c90fecfdc181 Dec 03 06:59:04 crc kubenswrapper[4475]: I1203 06:59:04.455664 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5jrtc\" (UniqueName: \"kubernetes.io/projected/3e136d3d-bbe5-44b0-ad48-4e560507aeac-kube-api-access-5jrtc\") pod \"glance-db-sync-psvbz\" (UID: \"3e136d3d-bbe5-44b0-ad48-4e560507aeac\") " pod="openstack/glance-db-sync-psvbz" Dec 03 06:59:04 crc kubenswrapper[4475]: I1203 06:59:04.455751 4475 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3e136d3d-bbe5-44b0-ad48-4e560507aeac-config-data\") pod \"glance-db-sync-psvbz\" (UID: \"3e136d3d-bbe5-44b0-ad48-4e560507aeac\") " pod="openstack/glance-db-sync-psvbz" Dec 03 06:59:04 crc kubenswrapper[4475]: I1203 06:59:04.455792 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3e136d3d-bbe5-44b0-ad48-4e560507aeac-combined-ca-bundle\") pod \"glance-db-sync-psvbz\" (UID: \"3e136d3d-bbe5-44b0-ad48-4e560507aeac\") " pod="openstack/glance-db-sync-psvbz" Dec 03 06:59:04 crc kubenswrapper[4475]: I1203 06:59:04.455808 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/3e136d3d-bbe5-44b0-ad48-4e560507aeac-db-sync-config-data\") pod \"glance-db-sync-psvbz\" (UID: \"3e136d3d-bbe5-44b0-ad48-4e560507aeac\") " pod="openstack/glance-db-sync-psvbz" Dec 03 06:59:04 crc kubenswrapper[4475]: I1203 06:59:04.460192 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3e136d3d-bbe5-44b0-ad48-4e560507aeac-config-data\") pod \"glance-db-sync-psvbz\" (UID: \"3e136d3d-bbe5-44b0-ad48-4e560507aeac\") " pod="openstack/glance-db-sync-psvbz" Dec 03 06:59:04 crc kubenswrapper[4475]: I1203 06:59:04.460445 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3e136d3d-bbe5-44b0-ad48-4e560507aeac-combined-ca-bundle\") pod \"glance-db-sync-psvbz\" (UID: \"3e136d3d-bbe5-44b0-ad48-4e560507aeac\") " pod="openstack/glance-db-sync-psvbz" Dec 03 06:59:04 crc kubenswrapper[4475]: I1203 06:59:04.460517 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/3e136d3d-bbe5-44b0-ad48-4e560507aeac-db-sync-config-data\") pod 
\"glance-db-sync-psvbz\" (UID: \"3e136d3d-bbe5-44b0-ad48-4e560507aeac\") " pod="openstack/glance-db-sync-psvbz" Dec 03 06:59:04 crc kubenswrapper[4475]: I1203 06:59:04.468912 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5jrtc\" (UniqueName: \"kubernetes.io/projected/3e136d3d-bbe5-44b0-ad48-4e560507aeac-kube-api-access-5jrtc\") pod \"glance-db-sync-psvbz\" (UID: \"3e136d3d-bbe5-44b0-ad48-4e560507aeac\") " pod="openstack/glance-db-sync-psvbz" Dec 03 06:59:04 crc kubenswrapper[4475]: I1203 06:59:04.501035 4475 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-33d5-account-create-update-h4hzd"] Dec 03 06:59:04 crc kubenswrapper[4475]: W1203 06:59:04.510674 4475 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod1b649c00_045b_474a_aa22_71e06f5d454f.slice/crio-9ac7ced501074f6ee9b6201a32337643b06ccb7321512c7767b063ad1abaac58 WatchSource:0}: Error finding container 9ac7ced501074f6ee9b6201a32337643b06ccb7321512c7767b063ad1abaac58: Status 404 returned error can't find the container with id 9ac7ced501074f6ee9b6201a32337643b06ccb7321512c7767b063ad1abaac58 Dec 03 06:59:04 crc kubenswrapper[4475]: I1203 06:59:04.543572 4475 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-db-sync-psvbz" Dec 03 06:59:04 crc kubenswrapper[4475]: I1203 06:59:04.556561 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/4b95df68-1a9d-403e-ab8f-87335fd821fe-etc-swift\") pod \"swift-storage-0\" (UID: \"4b95df68-1a9d-403e-ab8f-87335fd821fe\") " pod="openstack/swift-storage-0" Dec 03 06:59:04 crc kubenswrapper[4475]: E1203 06:59:04.557174 4475 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Dec 03 06:59:04 crc kubenswrapper[4475]: E1203 06:59:04.557191 4475 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Dec 03 06:59:04 crc kubenswrapper[4475]: E1203 06:59:04.557220 4475 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/4b95df68-1a9d-403e-ab8f-87335fd821fe-etc-swift podName:4b95df68-1a9d-403e-ab8f-87335fd821fe nodeName:}" failed. No retries permitted until 2025-12-03 06:59:12.557209031 +0000 UTC m=+837.362107365 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/4b95df68-1a9d-403e-ab8f-87335fd821fe-etc-swift") pod "swift-storage-0" (UID: "4b95df68-1a9d-403e-ab8f-87335fd821fe") : configmap "swift-ring-files" not found Dec 03 06:59:04 crc kubenswrapper[4475]: I1203 06:59:04.983131 4475 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-sync-psvbz"] Dec 03 06:59:04 crc kubenswrapper[4475]: W1203 06:59:04.986406 4475 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod3e136d3d_bbe5_44b0_ad48_4e560507aeac.slice/crio-9d104dfbfd7cd99ea765aa264ec557518ee58f2f83635d4b51cba4519dc73092 WatchSource:0}: Error finding container 9d104dfbfd7cd99ea765aa264ec557518ee58f2f83635d4b51cba4519dc73092: Status 404 returned error can't find the container with id 9d104dfbfd7cd99ea765aa264ec557518ee58f2f83635d4b51cba4519dc73092 Dec 03 06:59:05 crc kubenswrapper[4475]: I1203 06:59:05.287401 4475 generic.go:334] "Generic (PLEG): container finished" podID="f131782f-4d8b-48fb-9eff-1a2c1f7b859b" containerID="78e434cfebb1c0da4cc0193ccaced0ca0fea26f9446299013209707b8e70b844" exitCode=0 Dec 03 06:59:05 crc kubenswrapper[4475]: I1203 06:59:05.287474 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-3d6e-account-create-update-gtfgv" event={"ID":"f131782f-4d8b-48fb-9eff-1a2c1f7b859b","Type":"ContainerDied","Data":"78e434cfebb1c0da4cc0193ccaced0ca0fea26f9446299013209707b8e70b844"} Dec 03 06:59:05 crc kubenswrapper[4475]: I1203 06:59:05.288829 4475 generic.go:334] "Generic (PLEG): container finished" podID="e76aadcc-0222-442c-8c9a-d0b197c92978" containerID="a3c176e5a040dee9ac021ab10fa5deec4f496ebd9e227bcb62c68a7ef95d6624" exitCode=0 Dec 03 06:59:05 crc kubenswrapper[4475]: I1203 06:59:05.288886 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-create-5b8fz" 
event={"ID":"e76aadcc-0222-442c-8c9a-d0b197c92978","Type":"ContainerDied","Data":"a3c176e5a040dee9ac021ab10fa5deec4f496ebd9e227bcb62c68a7ef95d6624"} Dec 03 06:59:05 crc kubenswrapper[4475]: I1203 06:59:05.290083 4475 generic.go:334] "Generic (PLEG): container finished" podID="9276d039-20eb-4651-b987-97f393cbc59a" containerID="215c348b1f650cdac200c94f2e35071fba83509aa1b351f56ccc2e99fefc45d8" exitCode=0 Dec 03 06:59:05 crc kubenswrapper[4475]: I1203 06:59:05.290119 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-create-6dfhs" event={"ID":"9276d039-20eb-4651-b987-97f393cbc59a","Type":"ContainerDied","Data":"215c348b1f650cdac200c94f2e35071fba83509aa1b351f56ccc2e99fefc45d8"} Dec 03 06:59:05 crc kubenswrapper[4475]: I1203 06:59:05.290161 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-create-6dfhs" event={"ID":"9276d039-20eb-4651-b987-97f393cbc59a","Type":"ContainerStarted","Data":"8626344add073cd68020ca4ce784a20120f59cc699060a958810c90fecfdc181"} Dec 03 06:59:05 crc kubenswrapper[4475]: I1203 06:59:05.291055 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-psvbz" event={"ID":"3e136d3d-bbe5-44b0-ad48-4e560507aeac","Type":"ContainerStarted","Data":"9d104dfbfd7cd99ea765aa264ec557518ee58f2f83635d4b51cba4519dc73092"} Dec 03 06:59:05 crc kubenswrapper[4475]: I1203 06:59:05.292461 4475 generic.go:334] "Generic (PLEG): container finished" podID="1b649c00-045b-474a-aa22-71e06f5d454f" containerID="6e7a6fc7215fff9a2ce501076749a575f1b754686798d3d087892b7aa8a70f01" exitCode=0 Dec 03 06:59:05 crc kubenswrapper[4475]: I1203 06:59:05.292488 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-33d5-account-create-update-h4hzd" event={"ID":"1b649c00-045b-474a-aa22-71e06f5d454f","Type":"ContainerDied","Data":"6e7a6fc7215fff9a2ce501076749a575f1b754686798d3d087892b7aa8a70f01"} Dec 03 06:59:05 crc kubenswrapper[4475]: I1203 06:59:05.292507 4475 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openstack/placement-33d5-account-create-update-h4hzd" event={"ID":"1b649c00-045b-474a-aa22-71e06f5d454f","Type":"ContainerStarted","Data":"9ac7ced501074f6ee9b6201a32337643b06ccb7321512c7767b063ad1abaac58"} Dec 03 06:59:05 crc kubenswrapper[4475]: I1203 06:59:05.847590 4475 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-6476ddd6b5-ppjlc" Dec 03 06:59:05 crc kubenswrapper[4475]: I1203 06:59:05.893285 4475 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5849c4cb99-2bndp"] Dec 03 06:59:05 crc kubenswrapper[4475]: I1203 06:59:05.893505 4475 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-5849c4cb99-2bndp" podUID="da48be59-9479-477c-a208-d623edd61159" containerName="dnsmasq-dns" containerID="cri-o://1f74715d797f43c5a5bd3f449aad818ecf4cc03fb14c64717837b4d4f5726966" gracePeriod=10 Dec 03 06:59:06 crc kubenswrapper[4475]: I1203 06:59:06.149363 4475 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-northd-0" Dec 03 06:59:06 crc kubenswrapper[4475]: I1203 06:59:06.291004 4475 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5849c4cb99-2bndp" Dec 03 06:59:06 crc kubenswrapper[4475]: I1203 06:59:06.311771 4475 generic.go:334] "Generic (PLEG): container finished" podID="da48be59-9479-477c-a208-d623edd61159" containerID="1f74715d797f43c5a5bd3f449aad818ecf4cc03fb14c64717837b4d4f5726966" exitCode=0 Dec 03 06:59:06 crc kubenswrapper[4475]: I1203 06:59:06.311932 4475 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5849c4cb99-2bndp" Dec 03 06:59:06 crc kubenswrapper[4475]: I1203 06:59:06.312396 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5849c4cb99-2bndp" event={"ID":"da48be59-9479-477c-a208-d623edd61159","Type":"ContainerDied","Data":"1f74715d797f43c5a5bd3f449aad818ecf4cc03fb14c64717837b4d4f5726966"} Dec 03 06:59:06 crc kubenswrapper[4475]: I1203 06:59:06.312422 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5849c4cb99-2bndp" event={"ID":"da48be59-9479-477c-a208-d623edd61159","Type":"ContainerDied","Data":"d04c4d1ac4c3c94a525f3e9e5a763892a63b40e7f9ec7b5aa8f67a44b7ee7ff6"} Dec 03 06:59:06 crc kubenswrapper[4475]: I1203 06:59:06.312437 4475 scope.go:117] "RemoveContainer" containerID="1f74715d797f43c5a5bd3f449aad818ecf4cc03fb14c64717837b4d4f5726966" Dec 03 06:59:06 crc kubenswrapper[4475]: I1203 06:59:06.338781 4475 scope.go:117] "RemoveContainer" containerID="060804bad0b84efc5b1c52bc0c58a60ce6de4e2fdb49d2833ec582a384108738" Dec 03 06:59:06 crc kubenswrapper[4475]: I1203 06:59:06.360229 4475 scope.go:117] "RemoveContainer" containerID="1f74715d797f43c5a5bd3f449aad818ecf4cc03fb14c64717837b4d4f5726966" Dec 03 06:59:06 crc kubenswrapper[4475]: E1203 06:59:06.366605 4475 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1f74715d797f43c5a5bd3f449aad818ecf4cc03fb14c64717837b4d4f5726966\": container with ID starting with 1f74715d797f43c5a5bd3f449aad818ecf4cc03fb14c64717837b4d4f5726966 not found: ID does not exist" containerID="1f74715d797f43c5a5bd3f449aad818ecf4cc03fb14c64717837b4d4f5726966" Dec 03 06:59:06 crc kubenswrapper[4475]: I1203 06:59:06.366642 4475 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1f74715d797f43c5a5bd3f449aad818ecf4cc03fb14c64717837b4d4f5726966"} err="failed to get container status 
\"1f74715d797f43c5a5bd3f449aad818ecf4cc03fb14c64717837b4d4f5726966\": rpc error: code = NotFound desc = could not find container \"1f74715d797f43c5a5bd3f449aad818ecf4cc03fb14c64717837b4d4f5726966\": container with ID starting with 1f74715d797f43c5a5bd3f449aad818ecf4cc03fb14c64717837b4d4f5726966 not found: ID does not exist" Dec 03 06:59:06 crc kubenswrapper[4475]: I1203 06:59:06.366663 4475 scope.go:117] "RemoveContainer" containerID="060804bad0b84efc5b1c52bc0c58a60ce6de4e2fdb49d2833ec582a384108738" Dec 03 06:59:06 crc kubenswrapper[4475]: E1203 06:59:06.367597 4475 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"060804bad0b84efc5b1c52bc0c58a60ce6de4e2fdb49d2833ec582a384108738\": container with ID starting with 060804bad0b84efc5b1c52bc0c58a60ce6de4e2fdb49d2833ec582a384108738 not found: ID does not exist" containerID="060804bad0b84efc5b1c52bc0c58a60ce6de4e2fdb49d2833ec582a384108738" Dec 03 06:59:06 crc kubenswrapper[4475]: I1203 06:59:06.367620 4475 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"060804bad0b84efc5b1c52bc0c58a60ce6de4e2fdb49d2833ec582a384108738"} err="failed to get container status \"060804bad0b84efc5b1c52bc0c58a60ce6de4e2fdb49d2833ec582a384108738\": rpc error: code = NotFound desc = could not find container \"060804bad0b84efc5b1c52bc0c58a60ce6de4e2fdb49d2833ec582a384108738\": container with ID starting with 060804bad0b84efc5b1c52bc0c58a60ce6de4e2fdb49d2833ec582a384108738 not found: ID does not exist" Dec 03 06:59:06 crc kubenswrapper[4475]: I1203 06:59:06.382217 4475 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gwqxb\" (UniqueName: \"kubernetes.io/projected/da48be59-9479-477c-a208-d623edd61159-kube-api-access-gwqxb\") pod \"da48be59-9479-477c-a208-d623edd61159\" (UID: \"da48be59-9479-477c-a208-d623edd61159\") " Dec 03 06:59:06 crc kubenswrapper[4475]: I1203 06:59:06.382252 4475 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/da48be59-9479-477c-a208-d623edd61159-dns-svc\") pod \"da48be59-9479-477c-a208-d623edd61159\" (UID: \"da48be59-9479-477c-a208-d623edd61159\") " Dec 03 06:59:06 crc kubenswrapper[4475]: I1203 06:59:06.382286 4475 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/da48be59-9479-477c-a208-d623edd61159-config\") pod \"da48be59-9479-477c-a208-d623edd61159\" (UID: \"da48be59-9479-477c-a208-d623edd61159\") " Dec 03 06:59:06 crc kubenswrapper[4475]: I1203 06:59:06.382348 4475 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/da48be59-9479-477c-a208-d623edd61159-ovsdbserver-nb\") pod \"da48be59-9479-477c-a208-d623edd61159\" (UID: \"da48be59-9479-477c-a208-d623edd61159\") " Dec 03 06:59:06 crc kubenswrapper[4475]: I1203 06:59:06.382411 4475 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/da48be59-9479-477c-a208-d623edd61159-ovsdbserver-sb\") pod \"da48be59-9479-477c-a208-d623edd61159\" (UID: \"da48be59-9479-477c-a208-d623edd61159\") " Dec 03 06:59:06 crc kubenswrapper[4475]: I1203 06:59:06.403348 4475 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/da48be59-9479-477c-a208-d623edd61159-kube-api-access-gwqxb" (OuterVolumeSpecName: "kube-api-access-gwqxb") pod "da48be59-9479-477c-a208-d623edd61159" (UID: "da48be59-9479-477c-a208-d623edd61159"). InnerVolumeSpecName "kube-api-access-gwqxb". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 06:59:06 crc kubenswrapper[4475]: I1203 06:59:06.437054 4475 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/da48be59-9479-477c-a208-d623edd61159-config" (OuterVolumeSpecName: "config") pod "da48be59-9479-477c-a208-d623edd61159" (UID: "da48be59-9479-477c-a208-d623edd61159"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 06:59:06 crc kubenswrapper[4475]: I1203 06:59:06.439792 4475 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/da48be59-9479-477c-a208-d623edd61159-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "da48be59-9479-477c-a208-d623edd61159" (UID: "da48be59-9479-477c-a208-d623edd61159"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 06:59:06 crc kubenswrapper[4475]: I1203 06:59:06.444721 4475 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/da48be59-9479-477c-a208-d623edd61159-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "da48be59-9479-477c-a208-d623edd61159" (UID: "da48be59-9479-477c-a208-d623edd61159"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 06:59:06 crc kubenswrapper[4475]: I1203 06:59:06.462391 4475 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/da48be59-9479-477c-a208-d623edd61159-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "da48be59-9479-477c-a208-d623edd61159" (UID: "da48be59-9479-477c-a208-d623edd61159"). InnerVolumeSpecName "ovsdbserver-sb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 06:59:06 crc kubenswrapper[4475]: I1203 06:59:06.484067 4475 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gwqxb\" (UniqueName: \"kubernetes.io/projected/da48be59-9479-477c-a208-d623edd61159-kube-api-access-gwqxb\") on node \"crc\" DevicePath \"\"" Dec 03 06:59:06 crc kubenswrapper[4475]: I1203 06:59:06.484105 4475 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/da48be59-9479-477c-a208-d623edd61159-dns-svc\") on node \"crc\" DevicePath \"\"" Dec 03 06:59:06 crc kubenswrapper[4475]: I1203 06:59:06.484116 4475 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/da48be59-9479-477c-a208-d623edd61159-config\") on node \"crc\" DevicePath \"\"" Dec 03 06:59:06 crc kubenswrapper[4475]: I1203 06:59:06.484124 4475 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/da48be59-9479-477c-a208-d623edd61159-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Dec 03 06:59:06 crc kubenswrapper[4475]: I1203 06:59:06.484132 4475 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/da48be59-9479-477c-a208-d623edd61159-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Dec 03 06:59:06 crc kubenswrapper[4475]: I1203 06:59:06.586802 4475 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-db-create-6dfhs" Dec 03 06:59:06 crc kubenswrapper[4475]: I1203 06:59:06.636615 4475 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5849c4cb99-2bndp"] Dec 03 06:59:06 crc kubenswrapper[4475]: I1203 06:59:06.643125 4475 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-5849c4cb99-2bndp"] Dec 03 06:59:06 crc kubenswrapper[4475]: I1203 06:59:06.698690 4475 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-create-5b8fz" Dec 03 06:59:06 crc kubenswrapper[4475]: I1203 06:59:06.722211 4475 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-3d6e-account-create-update-gtfgv" Dec 03 06:59:06 crc kubenswrapper[4475]: I1203 06:59:06.731624 4475 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-33d5-account-create-update-h4hzd" Dec 03 06:59:06 crc kubenswrapper[4475]: I1203 06:59:06.786539 4475 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/9276d039-20eb-4651-b987-97f393cbc59a-operator-scripts\") pod \"9276d039-20eb-4651-b987-97f393cbc59a\" (UID: \"9276d039-20eb-4651-b987-97f393cbc59a\") " Dec 03 06:59:06 crc kubenswrapper[4475]: I1203 06:59:06.786573 4475 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kpwqt\" (UniqueName: \"kubernetes.io/projected/9276d039-20eb-4651-b987-97f393cbc59a-kube-api-access-kpwqt\") pod \"9276d039-20eb-4651-b987-97f393cbc59a\" (UID: \"9276d039-20eb-4651-b987-97f393cbc59a\") " Dec 03 06:59:06 crc kubenswrapper[4475]: I1203 06:59:06.786996 4475 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9276d039-20eb-4651-b987-97f393cbc59a-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod 
"9276d039-20eb-4651-b987-97f393cbc59a" (UID: "9276d039-20eb-4651-b987-97f393cbc59a"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 06:59:06 crc kubenswrapper[4475]: I1203 06:59:06.789054 4475 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9276d039-20eb-4651-b987-97f393cbc59a-kube-api-access-kpwqt" (OuterVolumeSpecName: "kube-api-access-kpwqt") pod "9276d039-20eb-4651-b987-97f393cbc59a" (UID: "9276d039-20eb-4651-b987-97f393cbc59a"). InnerVolumeSpecName "kube-api-access-kpwqt". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 06:59:06 crc kubenswrapper[4475]: I1203 06:59:06.887729 4475 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/1b649c00-045b-474a-aa22-71e06f5d454f-operator-scripts\") pod \"1b649c00-045b-474a-aa22-71e06f5d454f\" (UID: \"1b649c00-045b-474a-aa22-71e06f5d454f\") " Dec 03 06:59:06 crc kubenswrapper[4475]: I1203 06:59:06.887765 4475 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-z9gm8\" (UniqueName: \"kubernetes.io/projected/e76aadcc-0222-442c-8c9a-d0b197c92978-kube-api-access-z9gm8\") pod \"e76aadcc-0222-442c-8c9a-d0b197c92978\" (UID: \"e76aadcc-0222-442c-8c9a-d0b197c92978\") " Dec 03 06:59:06 crc kubenswrapper[4475]: I1203 06:59:06.887833 4475 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e76aadcc-0222-442c-8c9a-d0b197c92978-operator-scripts\") pod \"e76aadcc-0222-442c-8c9a-d0b197c92978\" (UID: \"e76aadcc-0222-442c-8c9a-d0b197c92978\") " Dec 03 06:59:06 crc kubenswrapper[4475]: I1203 06:59:06.887932 4475 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lrv7p\" (UniqueName: 
\"kubernetes.io/projected/f131782f-4d8b-48fb-9eff-1a2c1f7b859b-kube-api-access-lrv7p\") pod \"f131782f-4d8b-48fb-9eff-1a2c1f7b859b\" (UID: \"f131782f-4d8b-48fb-9eff-1a2c1f7b859b\") " Dec 03 06:59:06 crc kubenswrapper[4475]: I1203 06:59:06.887958 4475 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f131782f-4d8b-48fb-9eff-1a2c1f7b859b-operator-scripts\") pod \"f131782f-4d8b-48fb-9eff-1a2c1f7b859b\" (UID: \"f131782f-4d8b-48fb-9eff-1a2c1f7b859b\") " Dec 03 06:59:06 crc kubenswrapper[4475]: I1203 06:59:06.888038 4475 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9txsk\" (UniqueName: \"kubernetes.io/projected/1b649c00-045b-474a-aa22-71e06f5d454f-kube-api-access-9txsk\") pod \"1b649c00-045b-474a-aa22-71e06f5d454f\" (UID: \"1b649c00-045b-474a-aa22-71e06f5d454f\") " Dec 03 06:59:06 crc kubenswrapper[4475]: I1203 06:59:06.888373 4475 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/9276d039-20eb-4651-b987-97f393cbc59a-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 03 06:59:06 crc kubenswrapper[4475]: I1203 06:59:06.888406 4475 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kpwqt\" (UniqueName: \"kubernetes.io/projected/9276d039-20eb-4651-b987-97f393cbc59a-kube-api-access-kpwqt\") on node \"crc\" DevicePath \"\"" Dec 03 06:59:06 crc kubenswrapper[4475]: I1203 06:59:06.888558 4475 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1b649c00-045b-474a-aa22-71e06f5d454f-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "1b649c00-045b-474a-aa22-71e06f5d454f" (UID: "1b649c00-045b-474a-aa22-71e06f5d454f"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 06:59:06 crc kubenswrapper[4475]: I1203 06:59:06.889024 4475 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e76aadcc-0222-442c-8c9a-d0b197c92978-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "e76aadcc-0222-442c-8c9a-d0b197c92978" (UID: "e76aadcc-0222-442c-8c9a-d0b197c92978"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 06:59:06 crc kubenswrapper[4475]: I1203 06:59:06.889108 4475 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f131782f-4d8b-48fb-9eff-1a2c1f7b859b-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "f131782f-4d8b-48fb-9eff-1a2c1f7b859b" (UID: "f131782f-4d8b-48fb-9eff-1a2c1f7b859b"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 06:59:06 crc kubenswrapper[4475]: I1203 06:59:06.890336 4475 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e76aadcc-0222-442c-8c9a-d0b197c92978-kube-api-access-z9gm8" (OuterVolumeSpecName: "kube-api-access-z9gm8") pod "e76aadcc-0222-442c-8c9a-d0b197c92978" (UID: "e76aadcc-0222-442c-8c9a-d0b197c92978"). InnerVolumeSpecName "kube-api-access-z9gm8". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 06:59:06 crc kubenswrapper[4475]: I1203 06:59:06.890838 4475 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1b649c00-045b-474a-aa22-71e06f5d454f-kube-api-access-9txsk" (OuterVolumeSpecName: "kube-api-access-9txsk") pod "1b649c00-045b-474a-aa22-71e06f5d454f" (UID: "1b649c00-045b-474a-aa22-71e06f5d454f"). InnerVolumeSpecName "kube-api-access-9txsk". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 06:59:06 crc kubenswrapper[4475]: I1203 06:59:06.890966 4475 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f131782f-4d8b-48fb-9eff-1a2c1f7b859b-kube-api-access-lrv7p" (OuterVolumeSpecName: "kube-api-access-lrv7p") pod "f131782f-4d8b-48fb-9eff-1a2c1f7b859b" (UID: "f131782f-4d8b-48fb-9eff-1a2c1f7b859b"). InnerVolumeSpecName "kube-api-access-lrv7p". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 06:59:06 crc kubenswrapper[4475]: I1203 06:59:06.989770 4475 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9txsk\" (UniqueName: \"kubernetes.io/projected/1b649c00-045b-474a-aa22-71e06f5d454f-kube-api-access-9txsk\") on node \"crc\" DevicePath \"\"" Dec 03 06:59:06 crc kubenswrapper[4475]: I1203 06:59:06.989794 4475 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/1b649c00-045b-474a-aa22-71e06f5d454f-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 03 06:59:06 crc kubenswrapper[4475]: I1203 06:59:06.989804 4475 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-z9gm8\" (UniqueName: \"kubernetes.io/projected/e76aadcc-0222-442c-8c9a-d0b197c92978-kube-api-access-z9gm8\") on node \"crc\" DevicePath \"\"" Dec 03 06:59:06 crc kubenswrapper[4475]: I1203 06:59:06.989812 4475 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e76aadcc-0222-442c-8c9a-d0b197c92978-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 03 06:59:06 crc kubenswrapper[4475]: I1203 06:59:06.989826 4475 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lrv7p\" (UniqueName: \"kubernetes.io/projected/f131782f-4d8b-48fb-9eff-1a2c1f7b859b-kube-api-access-lrv7p\") on node \"crc\" DevicePath \"\"" Dec 03 06:59:06 crc kubenswrapper[4475]: I1203 06:59:06.989834 4475 
reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f131782f-4d8b-48fb-9eff-1a2c1f7b859b-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 03 06:59:07 crc kubenswrapper[4475]: I1203 06:59:07.318919 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-create-6dfhs" event={"ID":"9276d039-20eb-4651-b987-97f393cbc59a","Type":"ContainerDied","Data":"8626344add073cd68020ca4ce784a20120f59cc699060a958810c90fecfdc181"} Dec 03 06:59:07 crc kubenswrapper[4475]: I1203 06:59:07.318952 4475 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="8626344add073cd68020ca4ce784a20120f59cc699060a958810c90fecfdc181" Dec 03 06:59:07 crc kubenswrapper[4475]: I1203 06:59:07.319001 4475 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-create-6dfhs" Dec 03 06:59:07 crc kubenswrapper[4475]: I1203 06:59:07.325746 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-33d5-account-create-update-h4hzd" event={"ID":"1b649c00-045b-474a-aa22-71e06f5d454f","Type":"ContainerDied","Data":"9ac7ced501074f6ee9b6201a32337643b06ccb7321512c7767b063ad1abaac58"} Dec 03 06:59:07 crc kubenswrapper[4475]: I1203 06:59:07.325811 4475 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="9ac7ced501074f6ee9b6201a32337643b06ccb7321512c7767b063ad1abaac58" Dec 03 06:59:07 crc kubenswrapper[4475]: I1203 06:59:07.325861 4475 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-33d5-account-create-update-h4hzd" Dec 03 06:59:07 crc kubenswrapper[4475]: I1203 06:59:07.329933 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-3d6e-account-create-update-gtfgv" event={"ID":"f131782f-4d8b-48fb-9eff-1a2c1f7b859b","Type":"ContainerDied","Data":"e4bce4960e59b7008a5c2e2e7b704c5352bcaebac87e6da63dc18ad62a7eae4d"} Dec 03 06:59:07 crc kubenswrapper[4475]: I1203 06:59:07.329981 4475 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e4bce4960e59b7008a5c2e2e7b704c5352bcaebac87e6da63dc18ad62a7eae4d" Dec 03 06:59:07 crc kubenswrapper[4475]: I1203 06:59:07.330016 4475 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-3d6e-account-create-update-gtfgv" Dec 03 06:59:07 crc kubenswrapper[4475]: I1203 06:59:07.336815 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-create-5b8fz" event={"ID":"e76aadcc-0222-442c-8c9a-d0b197c92978","Type":"ContainerDied","Data":"a691d6c78330ed7b84e15661f25e1b7f3f50a8edece09b1cf5d3ddb4ceba80d1"} Dec 03 06:59:07 crc kubenswrapper[4475]: I1203 06:59:07.336856 4475 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a691d6c78330ed7b84e15661f25e1b7f3f50a8edece09b1cf5d3ddb4ceba80d1" Dec 03 06:59:07 crc kubenswrapper[4475]: I1203 06:59:07.336869 4475 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-db-create-5b8fz" Dec 03 06:59:07 crc kubenswrapper[4475]: I1203 06:59:07.498475 4475 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="da48be59-9479-477c-a208-d623edd61159" path="/var/lib/kubelet/pods/da48be59-9479-477c-a208-d623edd61159/volumes" Dec 03 06:59:08 crc kubenswrapper[4475]: I1203 06:59:08.343865 4475 generic.go:334] "Generic (PLEG): container finished" podID="084fe7c1-e14b-4a01-8f56-5b0e545b9888" containerID="5a215c949522c8310ad42d865f0c169e2f63dc357cca84cf0df7ddcd4873ffbe" exitCode=0 Dec 03 06:59:08 crc kubenswrapper[4475]: I1203 06:59:08.343942 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-c2ch7" event={"ID":"084fe7c1-e14b-4a01-8f56-5b0e545b9888","Type":"ContainerDied","Data":"5a215c949522c8310ad42d865f0c169e2f63dc357cca84cf0df7ddcd4873ffbe"} Dec 03 06:59:09 crc kubenswrapper[4475]: I1203 06:59:09.592737 4475 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-ring-rebalance-c2ch7" Dec 03 06:59:09 crc kubenswrapper[4475]: I1203 06:59:09.630613 4475 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/084fe7c1-e14b-4a01-8f56-5b0e545b9888-swiftconf\") pod \"084fe7c1-e14b-4a01-8f56-5b0e545b9888\" (UID: \"084fe7c1-e14b-4a01-8f56-5b0e545b9888\") " Dec 03 06:59:09 crc kubenswrapper[4475]: I1203 06:59:09.630649 4475 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/084fe7c1-e14b-4a01-8f56-5b0e545b9888-scripts\") pod \"084fe7c1-e14b-4a01-8f56-5b0e545b9888\" (UID: \"084fe7c1-e14b-4a01-8f56-5b0e545b9888\") " Dec 03 06:59:09 crc kubenswrapper[4475]: I1203 06:59:09.630672 4475 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/084fe7c1-e14b-4a01-8f56-5b0e545b9888-ring-data-devices\") pod \"084fe7c1-e14b-4a01-8f56-5b0e545b9888\" (UID: \"084fe7c1-e14b-4a01-8f56-5b0e545b9888\") " Dec 03 06:59:09 crc kubenswrapper[4475]: I1203 06:59:09.630692 4475 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/084fe7c1-e14b-4a01-8f56-5b0e545b9888-etc-swift\") pod \"084fe7c1-e14b-4a01-8f56-5b0e545b9888\" (UID: \"084fe7c1-e14b-4a01-8f56-5b0e545b9888\") " Dec 03 06:59:09 crc kubenswrapper[4475]: I1203 06:59:09.630746 4475 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/084fe7c1-e14b-4a01-8f56-5b0e545b9888-combined-ca-bundle\") pod \"084fe7c1-e14b-4a01-8f56-5b0e545b9888\" (UID: \"084fe7c1-e14b-4a01-8f56-5b0e545b9888\") " Dec 03 06:59:09 crc kubenswrapper[4475]: I1203 06:59:09.630790 4475 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dispersionconf\" (UniqueName: 
\"kubernetes.io/secret/084fe7c1-e14b-4a01-8f56-5b0e545b9888-dispersionconf\") pod \"084fe7c1-e14b-4a01-8f56-5b0e545b9888\" (UID: \"084fe7c1-e14b-4a01-8f56-5b0e545b9888\") " Dec 03 06:59:09 crc kubenswrapper[4475]: I1203 06:59:09.630830 4475 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5fpzb\" (UniqueName: \"kubernetes.io/projected/084fe7c1-e14b-4a01-8f56-5b0e545b9888-kube-api-access-5fpzb\") pod \"084fe7c1-e14b-4a01-8f56-5b0e545b9888\" (UID: \"084fe7c1-e14b-4a01-8f56-5b0e545b9888\") " Dec 03 06:59:09 crc kubenswrapper[4475]: I1203 06:59:09.631654 4475 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/084fe7c1-e14b-4a01-8f56-5b0e545b9888-ring-data-devices" (OuterVolumeSpecName: "ring-data-devices") pod "084fe7c1-e14b-4a01-8f56-5b0e545b9888" (UID: "084fe7c1-e14b-4a01-8f56-5b0e545b9888"). InnerVolumeSpecName "ring-data-devices". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 06:59:09 crc kubenswrapper[4475]: I1203 06:59:09.632008 4475 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/084fe7c1-e14b-4a01-8f56-5b0e545b9888-etc-swift" (OuterVolumeSpecName: "etc-swift") pod "084fe7c1-e14b-4a01-8f56-5b0e545b9888" (UID: "084fe7c1-e14b-4a01-8f56-5b0e545b9888"). InnerVolumeSpecName "etc-swift". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 06:59:09 crc kubenswrapper[4475]: I1203 06:59:09.639862 4475 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/084fe7c1-e14b-4a01-8f56-5b0e545b9888-dispersionconf" (OuterVolumeSpecName: "dispersionconf") pod "084fe7c1-e14b-4a01-8f56-5b0e545b9888" (UID: "084fe7c1-e14b-4a01-8f56-5b0e545b9888"). InnerVolumeSpecName "dispersionconf". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 06:59:09 crc kubenswrapper[4475]: I1203 06:59:09.640726 4475 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/084fe7c1-e14b-4a01-8f56-5b0e545b9888-kube-api-access-5fpzb" (OuterVolumeSpecName: "kube-api-access-5fpzb") pod "084fe7c1-e14b-4a01-8f56-5b0e545b9888" (UID: "084fe7c1-e14b-4a01-8f56-5b0e545b9888"). InnerVolumeSpecName "kube-api-access-5fpzb". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 06:59:09 crc kubenswrapper[4475]: I1203 06:59:09.652073 4475 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/084fe7c1-e14b-4a01-8f56-5b0e545b9888-swiftconf" (OuterVolumeSpecName: "swiftconf") pod "084fe7c1-e14b-4a01-8f56-5b0e545b9888" (UID: "084fe7c1-e14b-4a01-8f56-5b0e545b9888"). InnerVolumeSpecName "swiftconf". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 06:59:09 crc kubenswrapper[4475]: I1203 06:59:09.652467 4475 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/084fe7c1-e14b-4a01-8f56-5b0e545b9888-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "084fe7c1-e14b-4a01-8f56-5b0e545b9888" (UID: "084fe7c1-e14b-4a01-8f56-5b0e545b9888"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 06:59:09 crc kubenswrapper[4475]: I1203 06:59:09.664910 4475 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/084fe7c1-e14b-4a01-8f56-5b0e545b9888-scripts" (OuterVolumeSpecName: "scripts") pod "084fe7c1-e14b-4a01-8f56-5b0e545b9888" (UID: "084fe7c1-e14b-4a01-8f56-5b0e545b9888"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 06:59:09 crc kubenswrapper[4475]: I1203 06:59:09.732381 4475 reconciler_common.go:293] "Volume detached for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/084fe7c1-e14b-4a01-8f56-5b0e545b9888-etc-swift\") on node \"crc\" DevicePath \"\"" Dec 03 06:59:09 crc kubenswrapper[4475]: I1203 06:59:09.732539 4475 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/084fe7c1-e14b-4a01-8f56-5b0e545b9888-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 03 06:59:09 crc kubenswrapper[4475]: I1203 06:59:09.732550 4475 reconciler_common.go:293] "Volume detached for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/084fe7c1-e14b-4a01-8f56-5b0e545b9888-dispersionconf\") on node \"crc\" DevicePath \"\"" Dec 03 06:59:09 crc kubenswrapper[4475]: I1203 06:59:09.732558 4475 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5fpzb\" (UniqueName: \"kubernetes.io/projected/084fe7c1-e14b-4a01-8f56-5b0e545b9888-kube-api-access-5fpzb\") on node \"crc\" DevicePath \"\"" Dec 03 06:59:09 crc kubenswrapper[4475]: I1203 06:59:09.732566 4475 reconciler_common.go:293] "Volume detached for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/084fe7c1-e14b-4a01-8f56-5b0e545b9888-swiftconf\") on node \"crc\" DevicePath \"\"" Dec 03 06:59:09 crc kubenswrapper[4475]: I1203 06:59:09.732573 4475 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/084fe7c1-e14b-4a01-8f56-5b0e545b9888-scripts\") on node \"crc\" DevicePath \"\"" Dec 03 06:59:09 crc kubenswrapper[4475]: I1203 06:59:09.732580 4475 reconciler_common.go:293] "Volume detached for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/084fe7c1-e14b-4a01-8f56-5b0e545b9888-ring-data-devices\") on node \"crc\" DevicePath \"\"" Dec 03 06:59:10 crc kubenswrapper[4475]: I1203 06:59:10.357706 4475 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-c2ch7" event={"ID":"084fe7c1-e14b-4a01-8f56-5b0e545b9888","Type":"ContainerDied","Data":"19e2ceea10ca96be27b9fbe95a8c77cc3c23b2581d1a53757347fd45c26e6edd"} Dec 03 06:59:10 crc kubenswrapper[4475]: I1203 06:59:10.357737 4475 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="19e2ceea10ca96be27b9fbe95a8c77cc3c23b2581d1a53757347fd45c26e6edd" Dec 03 06:59:10 crc kubenswrapper[4475]: I1203 06:59:10.357779 4475 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/swift-ring-rebalance-c2ch7" Dec 03 06:59:12 crc kubenswrapper[4475]: I1203 06:59:12.567916 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/4b95df68-1a9d-403e-ab8f-87335fd821fe-etc-swift\") pod \"swift-storage-0\" (UID: \"4b95df68-1a9d-403e-ab8f-87335fd821fe\") " pod="openstack/swift-storage-0" Dec 03 06:59:12 crc kubenswrapper[4475]: I1203 06:59:12.574543 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/4b95df68-1a9d-403e-ab8f-87335fd821fe-etc-swift\") pod \"swift-storage-0\" (UID: \"4b95df68-1a9d-403e-ab8f-87335fd821fe\") " pod="openstack/swift-storage-0" Dec 03 06:59:12 crc kubenswrapper[4475]: I1203 06:59:12.866792 4475 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-storage-0" Dec 03 06:59:13 crc kubenswrapper[4475]: I1203 06:59:13.295009 4475 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-storage-0"] Dec 03 06:59:13 crc kubenswrapper[4475]: I1203 06:59:13.374879 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"4b95df68-1a9d-403e-ab8f-87335fd821fe","Type":"ContainerStarted","Data":"bf8af41c8bf1a7a5b629e9b1cf6e2c3feac42b17d42f53e8956c6338ad2c1740"} Dec 03 06:59:14 crc kubenswrapper[4475]: I1203 06:59:14.013416 4475 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/ovn-controller-hcblc" podUID="462c6048-51ec-46dd-8eda-64398e53ce5b" containerName="ovn-controller" probeResult="failure" output=< Dec 03 06:59:14 crc kubenswrapper[4475]: ERROR - ovn-controller connection status is 'not connected', expecting 'connected' status Dec 03 06:59:14 crc kubenswrapper[4475]: > Dec 03 06:59:14 crc kubenswrapper[4475]: I1203 06:59:14.061526 4475 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-controller-ovs-blpfh" Dec 03 06:59:14 crc kubenswrapper[4475]: I1203 06:59:14.062400 4475 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-controller-ovs-blpfh" Dec 03 06:59:14 crc kubenswrapper[4475]: I1203 06:59:14.251324 4475 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-hcblc-config-kzj9v"] Dec 03 06:59:14 crc kubenswrapper[4475]: E1203 06:59:14.251626 4475 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f131782f-4d8b-48fb-9eff-1a2c1f7b859b" containerName="mariadb-account-create-update" Dec 03 06:59:14 crc kubenswrapper[4475]: I1203 06:59:14.251642 4475 state_mem.go:107] "Deleted CPUSet assignment" podUID="f131782f-4d8b-48fb-9eff-1a2c1f7b859b" containerName="mariadb-account-create-update" Dec 03 06:59:14 crc kubenswrapper[4475]: E1203 06:59:14.251652 4475 cpu_manager.go:410] 
"RemoveStaleState: removing container" podUID="da48be59-9479-477c-a208-d623edd61159" containerName="dnsmasq-dns" Dec 03 06:59:14 crc kubenswrapper[4475]: I1203 06:59:14.251658 4475 state_mem.go:107] "Deleted CPUSet assignment" podUID="da48be59-9479-477c-a208-d623edd61159" containerName="dnsmasq-dns" Dec 03 06:59:14 crc kubenswrapper[4475]: E1203 06:59:14.251672 4475 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9276d039-20eb-4651-b987-97f393cbc59a" containerName="mariadb-database-create" Dec 03 06:59:14 crc kubenswrapper[4475]: I1203 06:59:14.251679 4475 state_mem.go:107] "Deleted CPUSet assignment" podUID="9276d039-20eb-4651-b987-97f393cbc59a" containerName="mariadb-database-create" Dec 03 06:59:14 crc kubenswrapper[4475]: E1203 06:59:14.251688 4475 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="084fe7c1-e14b-4a01-8f56-5b0e545b9888" containerName="swift-ring-rebalance" Dec 03 06:59:14 crc kubenswrapper[4475]: I1203 06:59:14.251694 4475 state_mem.go:107] "Deleted CPUSet assignment" podUID="084fe7c1-e14b-4a01-8f56-5b0e545b9888" containerName="swift-ring-rebalance" Dec 03 06:59:14 crc kubenswrapper[4475]: E1203 06:59:14.251713 4475 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="da48be59-9479-477c-a208-d623edd61159" containerName="init" Dec 03 06:59:14 crc kubenswrapper[4475]: I1203 06:59:14.251719 4475 state_mem.go:107] "Deleted CPUSet assignment" podUID="da48be59-9479-477c-a208-d623edd61159" containerName="init" Dec 03 06:59:14 crc kubenswrapper[4475]: E1203 06:59:14.251725 4475 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1b649c00-045b-474a-aa22-71e06f5d454f" containerName="mariadb-account-create-update" Dec 03 06:59:14 crc kubenswrapper[4475]: I1203 06:59:14.251731 4475 state_mem.go:107] "Deleted CPUSet assignment" podUID="1b649c00-045b-474a-aa22-71e06f5d454f" containerName="mariadb-account-create-update" Dec 03 06:59:14 crc kubenswrapper[4475]: E1203 06:59:14.251741 4475 
cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e76aadcc-0222-442c-8c9a-d0b197c92978" containerName="mariadb-database-create" Dec 03 06:59:14 crc kubenswrapper[4475]: I1203 06:59:14.251746 4475 state_mem.go:107] "Deleted CPUSet assignment" podUID="e76aadcc-0222-442c-8c9a-d0b197c92978" containerName="mariadb-database-create" Dec 03 06:59:14 crc kubenswrapper[4475]: I1203 06:59:14.251871 4475 memory_manager.go:354] "RemoveStaleState removing state" podUID="9276d039-20eb-4651-b987-97f393cbc59a" containerName="mariadb-database-create" Dec 03 06:59:14 crc kubenswrapper[4475]: I1203 06:59:14.251882 4475 memory_manager.go:354] "RemoveStaleState removing state" podUID="f131782f-4d8b-48fb-9eff-1a2c1f7b859b" containerName="mariadb-account-create-update" Dec 03 06:59:14 crc kubenswrapper[4475]: I1203 06:59:14.251887 4475 memory_manager.go:354] "RemoveStaleState removing state" podUID="1b649c00-045b-474a-aa22-71e06f5d454f" containerName="mariadb-account-create-update" Dec 03 06:59:14 crc kubenswrapper[4475]: I1203 06:59:14.251898 4475 memory_manager.go:354] "RemoveStaleState removing state" podUID="da48be59-9479-477c-a208-d623edd61159" containerName="dnsmasq-dns" Dec 03 06:59:14 crc kubenswrapper[4475]: I1203 06:59:14.251945 4475 memory_manager.go:354] "RemoveStaleState removing state" podUID="e76aadcc-0222-442c-8c9a-d0b197c92978" containerName="mariadb-database-create" Dec 03 06:59:14 crc kubenswrapper[4475]: I1203 06:59:14.251957 4475 memory_manager.go:354] "RemoveStaleState removing state" podUID="084fe7c1-e14b-4a01-8f56-5b0e545b9888" containerName="swift-ring-rebalance" Dec 03 06:59:14 crc kubenswrapper[4475]: I1203 06:59:14.252484 4475 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-hcblc-config-kzj9v" Dec 03 06:59:14 crc kubenswrapper[4475]: I1203 06:59:14.253784 4475 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-extra-scripts" Dec 03 06:59:14 crc kubenswrapper[4475]: I1203 06:59:14.260995 4475 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-hcblc-config-kzj9v"] Dec 03 06:59:14 crc kubenswrapper[4475]: I1203 06:59:14.394598 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/d4b3c5a7-09c6-4730-84e7-452e5e191028-var-run-ovn\") pod \"ovn-controller-hcblc-config-kzj9v\" (UID: \"d4b3c5a7-09c6-4730-84e7-452e5e191028\") " pod="openstack/ovn-controller-hcblc-config-kzj9v" Dec 03 06:59:14 crc kubenswrapper[4475]: I1203 06:59:14.394646 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/d4b3c5a7-09c6-4730-84e7-452e5e191028-additional-scripts\") pod \"ovn-controller-hcblc-config-kzj9v\" (UID: \"d4b3c5a7-09c6-4730-84e7-452e5e191028\") " pod="openstack/ovn-controller-hcblc-config-kzj9v" Dec 03 06:59:14 crc kubenswrapper[4475]: I1203 06:59:14.394671 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/d4b3c5a7-09c6-4730-84e7-452e5e191028-var-run\") pod \"ovn-controller-hcblc-config-kzj9v\" (UID: \"d4b3c5a7-09c6-4730-84e7-452e5e191028\") " pod="openstack/ovn-controller-hcblc-config-kzj9v" Dec 03 06:59:14 crc kubenswrapper[4475]: I1203 06:59:14.394748 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/d4b3c5a7-09c6-4730-84e7-452e5e191028-var-log-ovn\") pod \"ovn-controller-hcblc-config-kzj9v\" (UID: 
\"d4b3c5a7-09c6-4730-84e7-452e5e191028\") " pod="openstack/ovn-controller-hcblc-config-kzj9v" Dec 03 06:59:14 crc kubenswrapper[4475]: I1203 06:59:14.394838 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/d4b3c5a7-09c6-4730-84e7-452e5e191028-scripts\") pod \"ovn-controller-hcblc-config-kzj9v\" (UID: \"d4b3c5a7-09c6-4730-84e7-452e5e191028\") " pod="openstack/ovn-controller-hcblc-config-kzj9v" Dec 03 06:59:14 crc kubenswrapper[4475]: I1203 06:59:14.394873 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nmk7h\" (UniqueName: \"kubernetes.io/projected/d4b3c5a7-09c6-4730-84e7-452e5e191028-kube-api-access-nmk7h\") pod \"ovn-controller-hcblc-config-kzj9v\" (UID: \"d4b3c5a7-09c6-4730-84e7-452e5e191028\") " pod="openstack/ovn-controller-hcblc-config-kzj9v" Dec 03 06:59:14 crc kubenswrapper[4475]: I1203 06:59:14.495483 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/d4b3c5a7-09c6-4730-84e7-452e5e191028-var-log-ovn\") pod \"ovn-controller-hcblc-config-kzj9v\" (UID: \"d4b3c5a7-09c6-4730-84e7-452e5e191028\") " pod="openstack/ovn-controller-hcblc-config-kzj9v" Dec 03 06:59:14 crc kubenswrapper[4475]: I1203 06:59:14.495808 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/d4b3c5a7-09c6-4730-84e7-452e5e191028-var-log-ovn\") pod \"ovn-controller-hcblc-config-kzj9v\" (UID: \"d4b3c5a7-09c6-4730-84e7-452e5e191028\") " pod="openstack/ovn-controller-hcblc-config-kzj9v" Dec 03 06:59:14 crc kubenswrapper[4475]: I1203 06:59:14.496011 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/d4b3c5a7-09c6-4730-84e7-452e5e191028-scripts\") pod \"ovn-controller-hcblc-config-kzj9v\" (UID: 
\"d4b3c5a7-09c6-4730-84e7-452e5e191028\") " pod="openstack/ovn-controller-hcblc-config-kzj9v" Dec 03 06:59:14 crc kubenswrapper[4475]: I1203 06:59:14.498566 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nmk7h\" (UniqueName: \"kubernetes.io/projected/d4b3c5a7-09c6-4730-84e7-452e5e191028-kube-api-access-nmk7h\") pod \"ovn-controller-hcblc-config-kzj9v\" (UID: \"d4b3c5a7-09c6-4730-84e7-452e5e191028\") " pod="openstack/ovn-controller-hcblc-config-kzj9v" Dec 03 06:59:14 crc kubenswrapper[4475]: I1203 06:59:14.498717 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/d4b3c5a7-09c6-4730-84e7-452e5e191028-var-run-ovn\") pod \"ovn-controller-hcblc-config-kzj9v\" (UID: \"d4b3c5a7-09c6-4730-84e7-452e5e191028\") " pod="openstack/ovn-controller-hcblc-config-kzj9v" Dec 03 06:59:14 crc kubenswrapper[4475]: I1203 06:59:14.498735 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/d4b3c5a7-09c6-4730-84e7-452e5e191028-additional-scripts\") pod \"ovn-controller-hcblc-config-kzj9v\" (UID: \"d4b3c5a7-09c6-4730-84e7-452e5e191028\") " pod="openstack/ovn-controller-hcblc-config-kzj9v" Dec 03 06:59:14 crc kubenswrapper[4475]: I1203 06:59:14.498758 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/d4b3c5a7-09c6-4730-84e7-452e5e191028-var-run\") pod \"ovn-controller-hcblc-config-kzj9v\" (UID: \"d4b3c5a7-09c6-4730-84e7-452e5e191028\") " pod="openstack/ovn-controller-hcblc-config-kzj9v" Dec 03 06:59:14 crc kubenswrapper[4475]: I1203 06:59:14.498848 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/d4b3c5a7-09c6-4730-84e7-452e5e191028-var-run\") pod \"ovn-controller-hcblc-config-kzj9v\" (UID: 
\"d4b3c5a7-09c6-4730-84e7-452e5e191028\") " pod="openstack/ovn-controller-hcblc-config-kzj9v" Dec 03 06:59:14 crc kubenswrapper[4475]: I1203 06:59:14.498849 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/d4b3c5a7-09c6-4730-84e7-452e5e191028-scripts\") pod \"ovn-controller-hcblc-config-kzj9v\" (UID: \"d4b3c5a7-09c6-4730-84e7-452e5e191028\") " pod="openstack/ovn-controller-hcblc-config-kzj9v" Dec 03 06:59:14 crc kubenswrapper[4475]: I1203 06:59:14.498924 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/d4b3c5a7-09c6-4730-84e7-452e5e191028-var-run-ovn\") pod \"ovn-controller-hcblc-config-kzj9v\" (UID: \"d4b3c5a7-09c6-4730-84e7-452e5e191028\") " pod="openstack/ovn-controller-hcblc-config-kzj9v" Dec 03 06:59:14 crc kubenswrapper[4475]: I1203 06:59:14.499424 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/d4b3c5a7-09c6-4730-84e7-452e5e191028-additional-scripts\") pod \"ovn-controller-hcblc-config-kzj9v\" (UID: \"d4b3c5a7-09c6-4730-84e7-452e5e191028\") " pod="openstack/ovn-controller-hcblc-config-kzj9v" Dec 03 06:59:14 crc kubenswrapper[4475]: I1203 06:59:14.514015 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nmk7h\" (UniqueName: \"kubernetes.io/projected/d4b3c5a7-09c6-4730-84e7-452e5e191028-kube-api-access-nmk7h\") pod \"ovn-controller-hcblc-config-kzj9v\" (UID: \"d4b3c5a7-09c6-4730-84e7-452e5e191028\") " pod="openstack/ovn-controller-hcblc-config-kzj9v" Dec 03 06:59:14 crc kubenswrapper[4475]: I1203 06:59:14.571846 4475 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-hcblc-config-kzj9v" Dec 03 06:59:15 crc kubenswrapper[4475]: I1203 06:59:15.075192 4475 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-hcblc-config-kzj9v"] Dec 03 06:59:15 crc kubenswrapper[4475]: W1203 06:59:15.089285 4475 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd4b3c5a7_09c6_4730_84e7_452e5e191028.slice/crio-ab164d8c4b1df15d3d4ac176cfc6d3be3d2f47df2b7f6dea3a6bdb31f668be31 WatchSource:0}: Error finding container ab164d8c4b1df15d3d4ac176cfc6d3be3d2f47df2b7f6dea3a6bdb31f668be31: Status 404 returned error can't find the container with id ab164d8c4b1df15d3d4ac176cfc6d3be3d2f47df2b7f6dea3a6bdb31f668be31 Dec 03 06:59:15 crc kubenswrapper[4475]: I1203 06:59:15.387123 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-hcblc-config-kzj9v" event={"ID":"d4b3c5a7-09c6-4730-84e7-452e5e191028","Type":"ContainerStarted","Data":"36757543935b64ce2011e8269e6c8c33fcefe33820f887496e8233a2cbb685bf"} Dec 03 06:59:15 crc kubenswrapper[4475]: I1203 06:59:15.387161 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-hcblc-config-kzj9v" event={"ID":"d4b3c5a7-09c6-4730-84e7-452e5e191028","Type":"ContainerStarted","Data":"ab164d8c4b1df15d3d4ac176cfc6d3be3d2f47df2b7f6dea3a6bdb31f668be31"} Dec 03 06:59:15 crc kubenswrapper[4475]: I1203 06:59:15.389058 4475 generic.go:334] "Generic (PLEG): container finished" podID="386645cd-74e5-45bc-b3e4-0a326e5349f1" containerID="ee598506c3416b4517e59d2ed9d555835bc5aaa108285dc206bf5b914fcfd214" exitCode=0 Dec 03 06:59:15 crc kubenswrapper[4475]: I1203 06:59:15.389124 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"386645cd-74e5-45bc-b3e4-0a326e5349f1","Type":"ContainerDied","Data":"ee598506c3416b4517e59d2ed9d555835bc5aaa108285dc206bf5b914fcfd214"} Dec 03 06:59:15 
crc kubenswrapper[4475]: I1203 06:59:15.392084 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"4b95df68-1a9d-403e-ab8f-87335fd821fe","Type":"ContainerStarted","Data":"ff9f910f3f7f19664677ec992333cff7b424fffe35f8ec287f62a7d909e1fc7d"} Dec 03 06:59:15 crc kubenswrapper[4475]: I1203 06:59:15.392119 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"4b95df68-1a9d-403e-ab8f-87335fd821fe","Type":"ContainerStarted","Data":"552da58887428164816f293747a17ff5adb9c440bd3543dd88a6a6e53ac0b331"} Dec 03 06:59:15 crc kubenswrapper[4475]: I1203 06:59:15.392130 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"4b95df68-1a9d-403e-ab8f-87335fd821fe","Type":"ContainerStarted","Data":"0ac4c37e1a253540896c856252d0a5e8f7505ef4cf7142072161fa149db83eb6"} Dec 03 06:59:15 crc kubenswrapper[4475]: I1203 06:59:15.392138 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"4b95df68-1a9d-403e-ab8f-87335fd821fe","Type":"ContainerStarted","Data":"4b2cb5f531ed28e6fec33a67382a3104bdf73e5e1e4dac708f43dfdb56b9d34c"} Dec 03 06:59:15 crc kubenswrapper[4475]: I1203 06:59:15.398805 4475 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-controller-hcblc-config-kzj9v" podStartSLOduration=1.398797391 podStartE2EDuration="1.398797391s" podCreationTimestamp="2025-12-03 06:59:14 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 06:59:15.396928627 +0000 UTC m=+840.201826960" watchObservedRunningTime="2025-12-03 06:59:15.398797391 +0000 UTC m=+840.203695726" Dec 03 06:59:16 crc kubenswrapper[4475]: I1203 06:59:16.399054 4475 generic.go:334] "Generic (PLEG): container finished" podID="6447be14-8b0d-4514-a7c2-53da228c70c2" 
containerID="814640df107b4a181b7a3e64b7ebd3309204ef4f37192d2b3e5488422cc1410d" exitCode=0 Dec 03 06:59:16 crc kubenswrapper[4475]: I1203 06:59:16.399119 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"6447be14-8b0d-4514-a7c2-53da228c70c2","Type":"ContainerDied","Data":"814640df107b4a181b7a3e64b7ebd3309204ef4f37192d2b3e5488422cc1410d"} Dec 03 06:59:16 crc kubenswrapper[4475]: I1203 06:59:16.402595 4475 generic.go:334] "Generic (PLEG): container finished" podID="d4b3c5a7-09c6-4730-84e7-452e5e191028" containerID="36757543935b64ce2011e8269e6c8c33fcefe33820f887496e8233a2cbb685bf" exitCode=0 Dec 03 06:59:16 crc kubenswrapper[4475]: I1203 06:59:16.402691 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-hcblc-config-kzj9v" event={"ID":"d4b3c5a7-09c6-4730-84e7-452e5e191028","Type":"ContainerDied","Data":"36757543935b64ce2011e8269e6c8c33fcefe33820f887496e8233a2cbb685bf"} Dec 03 06:59:16 crc kubenswrapper[4475]: I1203 06:59:16.406678 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"386645cd-74e5-45bc-b3e4-0a326e5349f1","Type":"ContainerStarted","Data":"ea4744fd3cea2ebbc5ee016691ec2d3fafcd209271fae422863e376aa08ff040"} Dec 03 06:59:16 crc kubenswrapper[4475]: I1203 06:59:16.407287 4475 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-server-0" Dec 03 06:59:16 crc kubenswrapper[4475]: I1203 06:59:16.464580 4475 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-server-0" podStartSLOduration=35.640034539 podStartE2EDuration="1m17.464566971s" podCreationTimestamp="2025-12-03 06:57:59 +0000 UTC" firstStartedPulling="2025-12-03 06:58:00.894168132 +0000 UTC m=+765.699066465" lastFinishedPulling="2025-12-03 06:58:42.718700562 +0000 UTC m=+807.523598897" observedRunningTime="2025-12-03 06:59:16.462472732 +0000 UTC m=+841.267371076" 
watchObservedRunningTime="2025-12-03 06:59:16.464566971 +0000 UTC m=+841.269465306" Dec 03 06:59:17 crc kubenswrapper[4475]: I1203 06:59:17.459332 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"6447be14-8b0d-4514-a7c2-53da228c70c2","Type":"ContainerStarted","Data":"96852b33430a63bdead797314fa5d22c06e41066c7f9fd702bba19528ed39843"} Dec 03 06:59:17 crc kubenswrapper[4475]: I1203 06:59:17.460200 4475 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-cell1-server-0" Dec 03 06:59:17 crc kubenswrapper[4475]: I1203 06:59:17.467644 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"4b95df68-1a9d-403e-ab8f-87335fd821fe","Type":"ContainerStarted","Data":"da04f1c9b266152a63bff6f2526d7b3fe3f3e153ff19a8b4873ca75b2f19267f"} Dec 03 06:59:17 crc kubenswrapper[4475]: I1203 06:59:17.467707 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"4b95df68-1a9d-403e-ab8f-87335fd821fe","Type":"ContainerStarted","Data":"21b9ab835631c3ad9addcc1cfbaae9d315ea66e207ac3d4ebbf795988b426716"} Dec 03 06:59:17 crc kubenswrapper[4475]: I1203 06:59:17.467718 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"4b95df68-1a9d-403e-ab8f-87335fd821fe","Type":"ContainerStarted","Data":"a80ab633257b7f2631e125c26640fc333a27634b31a19904fbd4de26f71cf75d"} Dec 03 06:59:17 crc kubenswrapper[4475]: I1203 06:59:17.485657 4475 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-cell1-server-0" podStartSLOduration=36.916631312 podStartE2EDuration="1m18.485648478s" podCreationTimestamp="2025-12-03 06:57:59 +0000 UTC" firstStartedPulling="2025-12-03 06:58:01.207320699 +0000 UTC m=+766.012219033" lastFinishedPulling="2025-12-03 06:58:42.776337865 +0000 UTC m=+807.581236199" observedRunningTime="2025-12-03 06:59:17.481773448 +0000 UTC 
m=+842.286671782" watchObservedRunningTime="2025-12-03 06:59:17.485648478 +0000 UTC m=+842.290546811" Dec 03 06:59:17 crc kubenswrapper[4475]: I1203 06:59:17.750327 4475 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-hcblc-config-kzj9v" Dec 03 06:59:17 crc kubenswrapper[4475]: I1203 06:59:17.847729 4475 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/d4b3c5a7-09c6-4730-84e7-452e5e191028-var-run-ovn\") pod \"d4b3c5a7-09c6-4730-84e7-452e5e191028\" (UID: \"d4b3c5a7-09c6-4730-84e7-452e5e191028\") " Dec 03 06:59:17 crc kubenswrapper[4475]: I1203 06:59:17.847833 4475 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/d4b3c5a7-09c6-4730-84e7-452e5e191028-scripts\") pod \"d4b3c5a7-09c6-4730-84e7-452e5e191028\" (UID: \"d4b3c5a7-09c6-4730-84e7-452e5e191028\") " Dec 03 06:59:17 crc kubenswrapper[4475]: I1203 06:59:17.847875 4475 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/d4b3c5a7-09c6-4730-84e7-452e5e191028-additional-scripts\") pod \"d4b3c5a7-09c6-4730-84e7-452e5e191028\" (UID: \"d4b3c5a7-09c6-4730-84e7-452e5e191028\") " Dec 03 06:59:17 crc kubenswrapper[4475]: I1203 06:59:17.847898 4475 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nmk7h\" (UniqueName: \"kubernetes.io/projected/d4b3c5a7-09c6-4730-84e7-452e5e191028-kube-api-access-nmk7h\") pod \"d4b3c5a7-09c6-4730-84e7-452e5e191028\" (UID: \"d4b3c5a7-09c6-4730-84e7-452e5e191028\") " Dec 03 06:59:17 crc kubenswrapper[4475]: I1203 06:59:17.847924 4475 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/d4b3c5a7-09c6-4730-84e7-452e5e191028-var-run-ovn" (OuterVolumeSpecName: "var-run-ovn") pod 
"d4b3c5a7-09c6-4730-84e7-452e5e191028" (UID: "d4b3c5a7-09c6-4730-84e7-452e5e191028"). InnerVolumeSpecName "var-run-ovn". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 03 06:59:17 crc kubenswrapper[4475]: I1203 06:59:17.847980 4475 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/d4b3c5a7-09c6-4730-84e7-452e5e191028-var-log-ovn\") pod \"d4b3c5a7-09c6-4730-84e7-452e5e191028\" (UID: \"d4b3c5a7-09c6-4730-84e7-452e5e191028\") " Dec 03 06:59:17 crc kubenswrapper[4475]: I1203 06:59:17.848012 4475 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/d4b3c5a7-09c6-4730-84e7-452e5e191028-var-run\") pod \"d4b3c5a7-09c6-4730-84e7-452e5e191028\" (UID: \"d4b3c5a7-09c6-4730-84e7-452e5e191028\") " Dec 03 06:59:17 crc kubenswrapper[4475]: I1203 06:59:17.848338 4475 reconciler_common.go:293] "Volume detached for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/d4b3c5a7-09c6-4730-84e7-452e5e191028-var-run-ovn\") on node \"crc\" DevicePath \"\"" Dec 03 06:59:17 crc kubenswrapper[4475]: I1203 06:59:17.848375 4475 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/d4b3c5a7-09c6-4730-84e7-452e5e191028-var-run" (OuterVolumeSpecName: "var-run") pod "d4b3c5a7-09c6-4730-84e7-452e5e191028" (UID: "d4b3c5a7-09c6-4730-84e7-452e5e191028"). InnerVolumeSpecName "var-run". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 03 06:59:17 crc kubenswrapper[4475]: I1203 06:59:17.848602 4475 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/d4b3c5a7-09c6-4730-84e7-452e5e191028-var-log-ovn" (OuterVolumeSpecName: "var-log-ovn") pod "d4b3c5a7-09c6-4730-84e7-452e5e191028" (UID: "d4b3c5a7-09c6-4730-84e7-452e5e191028"). InnerVolumeSpecName "var-log-ovn". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 03 06:59:17 crc kubenswrapper[4475]: I1203 06:59:17.848803 4475 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d4b3c5a7-09c6-4730-84e7-452e5e191028-additional-scripts" (OuterVolumeSpecName: "additional-scripts") pod "d4b3c5a7-09c6-4730-84e7-452e5e191028" (UID: "d4b3c5a7-09c6-4730-84e7-452e5e191028"). InnerVolumeSpecName "additional-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 06:59:17 crc kubenswrapper[4475]: I1203 06:59:17.848899 4475 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d4b3c5a7-09c6-4730-84e7-452e5e191028-scripts" (OuterVolumeSpecName: "scripts") pod "d4b3c5a7-09c6-4730-84e7-452e5e191028" (UID: "d4b3c5a7-09c6-4730-84e7-452e5e191028"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 06:59:17 crc kubenswrapper[4475]: I1203 06:59:17.853298 4475 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d4b3c5a7-09c6-4730-84e7-452e5e191028-kube-api-access-nmk7h" (OuterVolumeSpecName: "kube-api-access-nmk7h") pod "d4b3c5a7-09c6-4730-84e7-452e5e191028" (UID: "d4b3c5a7-09c6-4730-84e7-452e5e191028"). InnerVolumeSpecName "kube-api-access-nmk7h". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 06:59:17 crc kubenswrapper[4475]: I1203 06:59:17.949597 4475 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/d4b3c5a7-09c6-4730-84e7-452e5e191028-scripts\") on node \"crc\" DevicePath \"\"" Dec 03 06:59:17 crc kubenswrapper[4475]: I1203 06:59:17.949623 4475 reconciler_common.go:293] "Volume detached for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/d4b3c5a7-09c6-4730-84e7-452e5e191028-additional-scripts\") on node \"crc\" DevicePath \"\"" Dec 03 06:59:17 crc kubenswrapper[4475]: I1203 06:59:17.949640 4475 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nmk7h\" (UniqueName: \"kubernetes.io/projected/d4b3c5a7-09c6-4730-84e7-452e5e191028-kube-api-access-nmk7h\") on node \"crc\" DevicePath \"\"" Dec 03 06:59:17 crc kubenswrapper[4475]: I1203 06:59:17.949649 4475 reconciler_common.go:293] "Volume detached for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/d4b3c5a7-09c6-4730-84e7-452e5e191028-var-log-ovn\") on node \"crc\" DevicePath \"\"" Dec 03 06:59:17 crc kubenswrapper[4475]: I1203 06:59:17.949660 4475 reconciler_common.go:293] "Volume detached for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/d4b3c5a7-09c6-4730-84e7-452e5e191028-var-run\") on node \"crc\" DevicePath \"\"" Dec 03 06:59:18 crc kubenswrapper[4475]: I1203 06:59:18.476245 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-hcblc-config-kzj9v" event={"ID":"d4b3c5a7-09c6-4730-84e7-452e5e191028","Type":"ContainerDied","Data":"ab164d8c4b1df15d3d4ac176cfc6d3be3d2f47df2b7f6dea3a6bdb31f668be31"} Dec 03 06:59:18 crc kubenswrapper[4475]: I1203 06:59:18.476279 4475 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ab164d8c4b1df15d3d4ac176cfc6d3be3d2f47df2b7f6dea3a6bdb31f668be31" Dec 03 06:59:18 crc kubenswrapper[4475]: I1203 06:59:18.476255 4475 util.go:48] 
"No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-hcblc-config-kzj9v" Dec 03 06:59:18 crc kubenswrapper[4475]: I1203 06:59:18.479816 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"4b95df68-1a9d-403e-ab8f-87335fd821fe","Type":"ContainerStarted","Data":"c133d5a14dd7c3495a6c1a54a27abb87cc7c3f43cc18ce1648eee645a02eeb33"} Dec 03 06:59:18 crc kubenswrapper[4475]: I1203 06:59:18.490903 4475 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovn-controller-hcblc-config-kzj9v"] Dec 03 06:59:18 crc kubenswrapper[4475]: I1203 06:59:18.494134 4475 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ovn-controller-hcblc-config-kzj9v"] Dec 03 06:59:19 crc kubenswrapper[4475]: I1203 06:59:19.020836 4475 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-controller-hcblc" Dec 03 06:59:19 crc kubenswrapper[4475]: I1203 06:59:19.499540 4475 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d4b3c5a7-09c6-4730-84e7-452e5e191028" path="/var/lib/kubelet/pods/d4b3c5a7-09c6-4730-84e7-452e5e191028/volumes" Dec 03 06:59:26 crc kubenswrapper[4475]: I1203 06:59:26.535335 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"4b95df68-1a9d-403e-ab8f-87335fd821fe","Type":"ContainerStarted","Data":"c7374be9398d8c5ae7a662fa269e6e3b1bc974cf34deaa3c6fac9e6a16ed2671"} Dec 03 06:59:26 crc kubenswrapper[4475]: I1203 06:59:26.535798 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"4b95df68-1a9d-403e-ab8f-87335fd821fe","Type":"ContainerStarted","Data":"e81505952aacaa5ba689b669ba4ae00249eb4b3a511e4a1a11ab517168830078"} Dec 03 06:59:26 crc kubenswrapper[4475]: I1203 06:59:26.535808 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" 
event={"ID":"4b95df68-1a9d-403e-ab8f-87335fd821fe","Type":"ContainerStarted","Data":"b2639857551eaf0b4581462dca1ecc021cdee1d64973de101ec3dfa97c821a96"} Dec 03 06:59:26 crc kubenswrapper[4475]: I1203 06:59:26.535817 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"4b95df68-1a9d-403e-ab8f-87335fd821fe","Type":"ContainerStarted","Data":"b2224e157676bae5fd565397a893a7592337bdda6d31da0ad8415fecffae9f70"} Dec 03 06:59:26 crc kubenswrapper[4475]: I1203 06:59:26.535825 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"4b95df68-1a9d-403e-ab8f-87335fd821fe","Type":"ContainerStarted","Data":"8224040ce9062d1e6f7b50e1ab9012f69c75823cdf305231c46b36cdee34e996"} Dec 03 06:59:26 crc kubenswrapper[4475]: I1203 06:59:26.535833 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"4b95df68-1a9d-403e-ab8f-87335fd821fe","Type":"ContainerStarted","Data":"4560a5d866063264fc54cc9e9c35faa9b978b867ba43ba520258f6615ad3a943"} Dec 03 06:59:26 crc kubenswrapper[4475]: I1203 06:59:26.535839 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"4b95df68-1a9d-403e-ab8f-87335fd821fe","Type":"ContainerStarted","Data":"4835b9c27cff97f77d201c751fbca7068702e2a89f6088810ff9340a91f0561d"} Dec 03 06:59:26 crc kubenswrapper[4475]: I1203 06:59:26.539412 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-psvbz" event={"ID":"3e136d3d-bbe5-44b0-ad48-4e560507aeac","Type":"ContainerStarted","Data":"94fa3e8d211ebc4a7c67b8c8b7a3e9967b0c98327ed8147162979b3a8935e113"} Dec 03 06:59:26 crc kubenswrapper[4475]: I1203 06:59:26.563438 4475 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/swift-storage-0" podStartSLOduration=19.174864751 podStartE2EDuration="31.563423683s" podCreationTimestamp="2025-12-03 06:58:55 +0000 UTC" firstStartedPulling="2025-12-03 
06:59:13.302182319 +0000 UTC m=+838.107080652" lastFinishedPulling="2025-12-03 06:59:25.69074125 +0000 UTC m=+850.495639584" observedRunningTime="2025-12-03 06:59:26.558126088 +0000 UTC m=+851.363024422" watchObservedRunningTime="2025-12-03 06:59:26.563423683 +0000 UTC m=+851.368322018" Dec 03 06:59:26 crc kubenswrapper[4475]: I1203 06:59:26.578699 4475 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-db-sync-psvbz" podStartSLOduration=1.855990426 podStartE2EDuration="22.578688486s" podCreationTimestamp="2025-12-03 06:59:04 +0000 UTC" firstStartedPulling="2025-12-03 06:59:04.988782142 +0000 UTC m=+829.793680475" lastFinishedPulling="2025-12-03 06:59:25.711480201 +0000 UTC m=+850.516378535" observedRunningTime="2025-12-03 06:59:26.574185436 +0000 UTC m=+851.379083771" watchObservedRunningTime="2025-12-03 06:59:26.578688486 +0000 UTC m=+851.383586821" Dec 03 06:59:26 crc kubenswrapper[4475]: I1203 06:59:26.802582 4475 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-8cbcccb95-whk2c"] Dec 03 06:59:26 crc kubenswrapper[4475]: E1203 06:59:26.803061 4475 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d4b3c5a7-09c6-4730-84e7-452e5e191028" containerName="ovn-config" Dec 03 06:59:26 crc kubenswrapper[4475]: I1203 06:59:26.803136 4475 state_mem.go:107] "Deleted CPUSet assignment" podUID="d4b3c5a7-09c6-4730-84e7-452e5e191028" containerName="ovn-config" Dec 03 06:59:26 crc kubenswrapper[4475]: I1203 06:59:26.803351 4475 memory_manager.go:354] "RemoveStaleState removing state" podUID="d4b3c5a7-09c6-4730-84e7-452e5e191028" containerName="ovn-config" Dec 03 06:59:26 crc kubenswrapper[4475]: I1203 06:59:26.804182 4475 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-8cbcccb95-whk2c" Dec 03 06:59:26 crc kubenswrapper[4475]: I1203 06:59:26.806366 4475 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"dns-swift-storage-0" Dec 03 06:59:26 crc kubenswrapper[4475]: I1203 06:59:26.815946 4475 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-8cbcccb95-whk2c"] Dec 03 06:59:26 crc kubenswrapper[4475]: I1203 06:59:26.864632 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/fee5b066-7e4f-4c02-a0de-0eeb152c9887-dns-svc\") pod \"dnsmasq-dns-8cbcccb95-whk2c\" (UID: \"fee5b066-7e4f-4c02-a0de-0eeb152c9887\") " pod="openstack/dnsmasq-dns-8cbcccb95-whk2c" Dec 03 06:59:26 crc kubenswrapper[4475]: I1203 06:59:26.864821 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/fee5b066-7e4f-4c02-a0de-0eeb152c9887-dns-swift-storage-0\") pod \"dnsmasq-dns-8cbcccb95-whk2c\" (UID: \"fee5b066-7e4f-4c02-a0de-0eeb152c9887\") " pod="openstack/dnsmasq-dns-8cbcccb95-whk2c" Dec 03 06:59:26 crc kubenswrapper[4475]: I1203 06:59:26.864908 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/fee5b066-7e4f-4c02-a0de-0eeb152c9887-ovsdbserver-nb\") pod \"dnsmasq-dns-8cbcccb95-whk2c\" (UID: \"fee5b066-7e4f-4c02-a0de-0eeb152c9887\") " pod="openstack/dnsmasq-dns-8cbcccb95-whk2c" Dec 03 06:59:26 crc kubenswrapper[4475]: I1203 06:59:26.865182 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fee5b066-7e4f-4c02-a0de-0eeb152c9887-config\") pod \"dnsmasq-dns-8cbcccb95-whk2c\" (UID: \"fee5b066-7e4f-4c02-a0de-0eeb152c9887\") " 
pod="openstack/dnsmasq-dns-8cbcccb95-whk2c" Dec 03 06:59:26 crc kubenswrapper[4475]: I1203 06:59:26.865323 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q7cqr\" (UniqueName: \"kubernetes.io/projected/fee5b066-7e4f-4c02-a0de-0eeb152c9887-kube-api-access-q7cqr\") pod \"dnsmasq-dns-8cbcccb95-whk2c\" (UID: \"fee5b066-7e4f-4c02-a0de-0eeb152c9887\") " pod="openstack/dnsmasq-dns-8cbcccb95-whk2c" Dec 03 06:59:26 crc kubenswrapper[4475]: I1203 06:59:26.865636 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/fee5b066-7e4f-4c02-a0de-0eeb152c9887-ovsdbserver-sb\") pod \"dnsmasq-dns-8cbcccb95-whk2c\" (UID: \"fee5b066-7e4f-4c02-a0de-0eeb152c9887\") " pod="openstack/dnsmasq-dns-8cbcccb95-whk2c" Dec 03 06:59:26 crc kubenswrapper[4475]: I1203 06:59:26.967042 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-q7cqr\" (UniqueName: \"kubernetes.io/projected/fee5b066-7e4f-4c02-a0de-0eeb152c9887-kube-api-access-q7cqr\") pod \"dnsmasq-dns-8cbcccb95-whk2c\" (UID: \"fee5b066-7e4f-4c02-a0de-0eeb152c9887\") " pod="openstack/dnsmasq-dns-8cbcccb95-whk2c" Dec 03 06:59:26 crc kubenswrapper[4475]: I1203 06:59:26.967143 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/fee5b066-7e4f-4c02-a0de-0eeb152c9887-ovsdbserver-sb\") pod \"dnsmasq-dns-8cbcccb95-whk2c\" (UID: \"fee5b066-7e4f-4c02-a0de-0eeb152c9887\") " pod="openstack/dnsmasq-dns-8cbcccb95-whk2c" Dec 03 06:59:26 crc kubenswrapper[4475]: I1203 06:59:26.967168 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/fee5b066-7e4f-4c02-a0de-0eeb152c9887-dns-svc\") pod \"dnsmasq-dns-8cbcccb95-whk2c\" (UID: \"fee5b066-7e4f-4c02-a0de-0eeb152c9887\") " 
pod="openstack/dnsmasq-dns-8cbcccb95-whk2c" Dec 03 06:59:26 crc kubenswrapper[4475]: I1203 06:59:26.967188 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/fee5b066-7e4f-4c02-a0de-0eeb152c9887-ovsdbserver-nb\") pod \"dnsmasq-dns-8cbcccb95-whk2c\" (UID: \"fee5b066-7e4f-4c02-a0de-0eeb152c9887\") " pod="openstack/dnsmasq-dns-8cbcccb95-whk2c" Dec 03 06:59:26 crc kubenswrapper[4475]: I1203 06:59:26.967205 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/fee5b066-7e4f-4c02-a0de-0eeb152c9887-dns-swift-storage-0\") pod \"dnsmasq-dns-8cbcccb95-whk2c\" (UID: \"fee5b066-7e4f-4c02-a0de-0eeb152c9887\") " pod="openstack/dnsmasq-dns-8cbcccb95-whk2c" Dec 03 06:59:26 crc kubenswrapper[4475]: I1203 06:59:26.967223 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fee5b066-7e4f-4c02-a0de-0eeb152c9887-config\") pod \"dnsmasq-dns-8cbcccb95-whk2c\" (UID: \"fee5b066-7e4f-4c02-a0de-0eeb152c9887\") " pod="openstack/dnsmasq-dns-8cbcccb95-whk2c" Dec 03 06:59:26 crc kubenswrapper[4475]: I1203 06:59:26.968095 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fee5b066-7e4f-4c02-a0de-0eeb152c9887-config\") pod \"dnsmasq-dns-8cbcccb95-whk2c\" (UID: \"fee5b066-7e4f-4c02-a0de-0eeb152c9887\") " pod="openstack/dnsmasq-dns-8cbcccb95-whk2c" Dec 03 06:59:26 crc kubenswrapper[4475]: I1203 06:59:26.968164 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/fee5b066-7e4f-4c02-a0de-0eeb152c9887-dns-svc\") pod \"dnsmasq-dns-8cbcccb95-whk2c\" (UID: \"fee5b066-7e4f-4c02-a0de-0eeb152c9887\") " pod="openstack/dnsmasq-dns-8cbcccb95-whk2c" Dec 03 06:59:26 crc kubenswrapper[4475]: I1203 06:59:26.968302 4475 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/fee5b066-7e4f-4c02-a0de-0eeb152c9887-ovsdbserver-nb\") pod \"dnsmasq-dns-8cbcccb95-whk2c\" (UID: \"fee5b066-7e4f-4c02-a0de-0eeb152c9887\") " pod="openstack/dnsmasq-dns-8cbcccb95-whk2c" Dec 03 06:59:26 crc kubenswrapper[4475]: I1203 06:59:26.968491 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/fee5b066-7e4f-4c02-a0de-0eeb152c9887-dns-swift-storage-0\") pod \"dnsmasq-dns-8cbcccb95-whk2c\" (UID: \"fee5b066-7e4f-4c02-a0de-0eeb152c9887\") " pod="openstack/dnsmasq-dns-8cbcccb95-whk2c" Dec 03 06:59:26 crc kubenswrapper[4475]: I1203 06:59:26.969004 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/fee5b066-7e4f-4c02-a0de-0eeb152c9887-ovsdbserver-sb\") pod \"dnsmasq-dns-8cbcccb95-whk2c\" (UID: \"fee5b066-7e4f-4c02-a0de-0eeb152c9887\") " pod="openstack/dnsmasq-dns-8cbcccb95-whk2c" Dec 03 06:59:27 crc kubenswrapper[4475]: I1203 06:59:27.002772 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-q7cqr\" (UniqueName: \"kubernetes.io/projected/fee5b066-7e4f-4c02-a0de-0eeb152c9887-kube-api-access-q7cqr\") pod \"dnsmasq-dns-8cbcccb95-whk2c\" (UID: \"fee5b066-7e4f-4c02-a0de-0eeb152c9887\") " pod="openstack/dnsmasq-dns-8cbcccb95-whk2c" Dec 03 06:59:27 crc kubenswrapper[4475]: I1203 06:59:27.117267 4475 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-8cbcccb95-whk2c" Dec 03 06:59:27 crc kubenswrapper[4475]: I1203 06:59:27.512740 4475 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-8cbcccb95-whk2c"] Dec 03 06:59:27 crc kubenswrapper[4475]: W1203 06:59:27.518839 4475 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podfee5b066_7e4f_4c02_a0de_0eeb152c9887.slice/crio-39ae9cd084c02b964d0bb2ce007a9c638678767b073487e9a71ae6778829665a WatchSource:0}: Error finding container 39ae9cd084c02b964d0bb2ce007a9c638678767b073487e9a71ae6778829665a: Status 404 returned error can't find the container with id 39ae9cd084c02b964d0bb2ce007a9c638678767b073487e9a71ae6778829665a Dec 03 06:59:27 crc kubenswrapper[4475]: I1203 06:59:27.552887 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-8cbcccb95-whk2c" event={"ID":"fee5b066-7e4f-4c02-a0de-0eeb152c9887","Type":"ContainerStarted","Data":"39ae9cd084c02b964d0bb2ce007a9c638678767b073487e9a71ae6778829665a"} Dec 03 06:59:28 crc kubenswrapper[4475]: I1203 06:59:28.559381 4475 generic.go:334] "Generic (PLEG): container finished" podID="fee5b066-7e4f-4c02-a0de-0eeb152c9887" containerID="5df999a1ad2c20bdaba7f94db0ce9d528245cbb73536724ec5d7ccea3dd19b0a" exitCode=0 Dec 03 06:59:28 crc kubenswrapper[4475]: I1203 06:59:28.559445 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-8cbcccb95-whk2c" event={"ID":"fee5b066-7e4f-4c02-a0de-0eeb152c9887","Type":"ContainerDied","Data":"5df999a1ad2c20bdaba7f94db0ce9d528245cbb73536724ec5d7ccea3dd19b0a"} Dec 03 06:59:28 crc kubenswrapper[4475]: I1203 06:59:28.933230 4475 patch_prober.go:28] interesting pod/machine-config-daemon-tjbzg container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" 
start-of-body= Dec 03 06:59:28 crc kubenswrapper[4475]: I1203 06:59:28.933285 4475 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-tjbzg" podUID="91aee7be-4a52-4598-803f-2deebe0674de" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 03 06:59:28 crc kubenswrapper[4475]: I1203 06:59:28.933328 4475 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-tjbzg" Dec 03 06:59:28 crc kubenswrapper[4475]: I1203 06:59:28.933811 4475 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"9e442459db76920abc97188abe20663d3f8869ff7e3f567458064e516a3ad52c"} pod="openshift-machine-config-operator/machine-config-daemon-tjbzg" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Dec 03 06:59:28 crc kubenswrapper[4475]: I1203 06:59:28.933864 4475 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-tjbzg" podUID="91aee7be-4a52-4598-803f-2deebe0674de" containerName="machine-config-daemon" containerID="cri-o://9e442459db76920abc97188abe20663d3f8869ff7e3f567458064e516a3ad52c" gracePeriod=600 Dec 03 06:59:29 crc kubenswrapper[4475]: I1203 06:59:29.572987 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-8cbcccb95-whk2c" event={"ID":"fee5b066-7e4f-4c02-a0de-0eeb152c9887","Type":"ContainerStarted","Data":"614b87f38bf22f7f1ad9f68624a1b7aed6d72b61ff96d4fc49ea40bf8367762b"} Dec 03 06:59:29 crc kubenswrapper[4475]: I1203 06:59:29.573605 4475 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-8cbcccb95-whk2c" Dec 03 06:59:29 crc kubenswrapper[4475]: I1203 06:59:29.575835 4475 
generic.go:334] "Generic (PLEG): container finished" podID="91aee7be-4a52-4598-803f-2deebe0674de" containerID="9e442459db76920abc97188abe20663d3f8869ff7e3f567458064e516a3ad52c" exitCode=0 Dec 03 06:59:29 crc kubenswrapper[4475]: I1203 06:59:29.575869 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-tjbzg" event={"ID":"91aee7be-4a52-4598-803f-2deebe0674de","Type":"ContainerDied","Data":"9e442459db76920abc97188abe20663d3f8869ff7e3f567458064e516a3ad52c"} Dec 03 06:59:29 crc kubenswrapper[4475]: I1203 06:59:29.575889 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-tjbzg" event={"ID":"91aee7be-4a52-4598-803f-2deebe0674de","Type":"ContainerStarted","Data":"001eb8a40dd541fdfa62c93940e55ef947928ce582f2778a9f17df66253e35b4"} Dec 03 06:59:29 crc kubenswrapper[4475]: I1203 06:59:29.575905 4475 scope.go:117] "RemoveContainer" containerID="0a13f575406937575ec2819856647e11d9c4ccb5a9ec17bf4568eec9af01a7ba" Dec 03 06:59:29 crc kubenswrapper[4475]: I1203 06:59:29.590649 4475 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-8cbcccb95-whk2c" podStartSLOduration=3.590636681 podStartE2EDuration="3.590636681s" podCreationTimestamp="2025-12-03 06:59:26 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 06:59:29.589968043 +0000 UTC m=+854.394866376" watchObservedRunningTime="2025-12-03 06:59:29.590636681 +0000 UTC m=+854.395535015" Dec 03 06:59:30 crc kubenswrapper[4475]: I1203 06:59:30.390342 4475 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-server-0" Dec 03 06:59:30 crc kubenswrapper[4475]: I1203 06:59:30.600182 4475 generic.go:334] "Generic (PLEG): container finished" podID="3e136d3d-bbe5-44b0-ad48-4e560507aeac" 
containerID="94fa3e8d211ebc4a7c67b8c8b7a3e9967b0c98327ed8147162979b3a8935e113" exitCode=0 Dec 03 06:59:30 crc kubenswrapper[4475]: I1203 06:59:30.601945 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-psvbz" event={"ID":"3e136d3d-bbe5-44b0-ad48-4e560507aeac","Type":"ContainerDied","Data":"94fa3e8d211ebc4a7c67b8c8b7a3e9967b0c98327ed8147162979b3a8935e113"} Dec 03 06:59:30 crc kubenswrapper[4475]: I1203 06:59:30.779963 4475 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-cell1-server-0" Dec 03 06:59:30 crc kubenswrapper[4475]: I1203 06:59:30.814004 4475 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-lszxw"] Dec 03 06:59:30 crc kubenswrapper[4475]: I1203 06:59:30.815722 4475 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-lszxw" Dec 03 06:59:30 crc kubenswrapper[4475]: I1203 06:59:30.833759 4475 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-db-create-qqg9j"] Dec 03 06:59:30 crc kubenswrapper[4475]: I1203 06:59:30.835227 4475 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-db-create-qqg9j" Dec 03 06:59:30 crc kubenswrapper[4475]: I1203 06:59:30.836259 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7whtk\" (UniqueName: \"kubernetes.io/projected/d3affe8e-b674-46c4-bf4b-b6fc0d092df7-kube-api-access-7whtk\") pod \"community-operators-lszxw\" (UID: \"d3affe8e-b674-46c4-bf4b-b6fc0d092df7\") " pod="openshift-marketplace/community-operators-lszxw" Dec 03 06:59:30 crc kubenswrapper[4475]: I1203 06:59:30.836493 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d3affe8e-b674-46c4-bf4b-b6fc0d092df7-catalog-content\") pod \"community-operators-lszxw\" (UID: \"d3affe8e-b674-46c4-bf4b-b6fc0d092df7\") " pod="openshift-marketplace/community-operators-lszxw" Dec 03 06:59:30 crc kubenswrapper[4475]: I1203 06:59:30.836572 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d3affe8e-b674-46c4-bf4b-b6fc0d092df7-utilities\") pod \"community-operators-lszxw\" (UID: \"d3affe8e-b674-46c4-bf4b-b6fc0d092df7\") " pod="openshift-marketplace/community-operators-lszxw" Dec 03 06:59:30 crc kubenswrapper[4475]: I1203 06:59:30.840939 4475 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/heat-1b56-account-create-update-t9h78"] Dec 03 06:59:30 crc kubenswrapper[4475]: I1203 06:59:30.846436 4475 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/heat-1b56-account-create-update-t9h78" Dec 03 06:59:30 crc kubenswrapper[4475]: I1203 06:59:30.858556 4475 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"heat-db-secret" Dec 03 06:59:30 crc kubenswrapper[4475]: I1203 06:59:30.870981 4475 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-1b56-account-create-update-t9h78"] Dec 03 06:59:30 crc kubenswrapper[4475]: I1203 06:59:30.884006 4475 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-lszxw"] Dec 03 06:59:30 crc kubenswrapper[4475]: I1203 06:59:30.897710 4475 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-create-qqg9j"] Dec 03 06:59:30 crc kubenswrapper[4475]: I1203 06:59:30.937405 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t2sms\" (UniqueName: \"kubernetes.io/projected/790b2af7-c661-4e6e-9579-f338835ff45a-kube-api-access-t2sms\") pod \"cinder-db-create-qqg9j\" (UID: \"790b2af7-c661-4e6e-9579-f338835ff45a\") " pod="openstack/cinder-db-create-qqg9j" Dec 03 06:59:30 crc kubenswrapper[4475]: I1203 06:59:30.937468 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c92b6e31-02a9-4c39-8505-d8c3a9224862-operator-scripts\") pod \"heat-1b56-account-create-update-t9h78\" (UID: \"c92b6e31-02a9-4c39-8505-d8c3a9224862\") " pod="openstack/heat-1b56-account-create-update-t9h78" Dec 03 06:59:30 crc kubenswrapper[4475]: I1203 06:59:30.937581 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dkhb2\" (UniqueName: \"kubernetes.io/projected/c92b6e31-02a9-4c39-8505-d8c3a9224862-kube-api-access-dkhb2\") pod \"heat-1b56-account-create-update-t9h78\" (UID: \"c92b6e31-02a9-4c39-8505-d8c3a9224862\") " 
pod="openstack/heat-1b56-account-create-update-t9h78" Dec 03 06:59:30 crc kubenswrapper[4475]: I1203 06:59:30.937667 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7whtk\" (UniqueName: \"kubernetes.io/projected/d3affe8e-b674-46c4-bf4b-b6fc0d092df7-kube-api-access-7whtk\") pod \"community-operators-lszxw\" (UID: \"d3affe8e-b674-46c4-bf4b-b6fc0d092df7\") " pod="openshift-marketplace/community-operators-lszxw" Dec 03 06:59:30 crc kubenswrapper[4475]: I1203 06:59:30.937785 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/790b2af7-c661-4e6e-9579-f338835ff45a-operator-scripts\") pod \"cinder-db-create-qqg9j\" (UID: \"790b2af7-c661-4e6e-9579-f338835ff45a\") " pod="openstack/cinder-db-create-qqg9j" Dec 03 06:59:30 crc kubenswrapper[4475]: I1203 06:59:30.937818 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d3affe8e-b674-46c4-bf4b-b6fc0d092df7-catalog-content\") pod \"community-operators-lszxw\" (UID: \"d3affe8e-b674-46c4-bf4b-b6fc0d092df7\") " pod="openshift-marketplace/community-operators-lszxw" Dec 03 06:59:30 crc kubenswrapper[4475]: I1203 06:59:30.937868 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d3affe8e-b674-46c4-bf4b-b6fc0d092df7-utilities\") pod \"community-operators-lszxw\" (UID: \"d3affe8e-b674-46c4-bf4b-b6fc0d092df7\") " pod="openshift-marketplace/community-operators-lszxw" Dec 03 06:59:30 crc kubenswrapper[4475]: I1203 06:59:30.938402 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d3affe8e-b674-46c4-bf4b-b6fc0d092df7-utilities\") pod \"community-operators-lszxw\" (UID: \"d3affe8e-b674-46c4-bf4b-b6fc0d092df7\") " 
pod="openshift-marketplace/community-operators-lszxw" Dec 03 06:59:30 crc kubenswrapper[4475]: I1203 06:59:30.938532 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d3affe8e-b674-46c4-bf4b-b6fc0d092df7-catalog-content\") pod \"community-operators-lszxw\" (UID: \"d3affe8e-b674-46c4-bf4b-b6fc0d092df7\") " pod="openshift-marketplace/community-operators-lszxw" Dec 03 06:59:30 crc kubenswrapper[4475]: I1203 06:59:30.972252 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7whtk\" (UniqueName: \"kubernetes.io/projected/d3affe8e-b674-46c4-bf4b-b6fc0d092df7-kube-api-access-7whtk\") pod \"community-operators-lszxw\" (UID: \"d3affe8e-b674-46c4-bf4b-b6fc0d092df7\") " pod="openshift-marketplace/community-operators-lszxw" Dec 03 06:59:31 crc kubenswrapper[4475]: I1203 06:59:31.030559 4475 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-db-create-vtkm6"] Dec 03 06:59:31 crc kubenswrapper[4475]: I1203 06:59:31.031647 4475 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-db-create-vtkm6" Dec 03 06:59:31 crc kubenswrapper[4475]: I1203 06:59:31.034513 4475 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-create-vtkm6"] Dec 03 06:59:31 crc kubenswrapper[4475]: I1203 06:59:31.042671 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-krxsk\" (UniqueName: \"kubernetes.io/projected/f50cb109-7030-4a9a-9401-78f0296c1d4e-kube-api-access-krxsk\") pod \"barbican-db-create-vtkm6\" (UID: \"f50cb109-7030-4a9a-9401-78f0296c1d4e\") " pod="openstack/barbican-db-create-vtkm6" Dec 03 06:59:31 crc kubenswrapper[4475]: I1203 06:59:31.042719 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/790b2af7-c661-4e6e-9579-f338835ff45a-operator-scripts\") pod \"cinder-db-create-qqg9j\" (UID: \"790b2af7-c661-4e6e-9579-f338835ff45a\") " pod="openstack/cinder-db-create-qqg9j" Dec 03 06:59:31 crc kubenswrapper[4475]: I1203 06:59:31.042778 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-t2sms\" (UniqueName: \"kubernetes.io/projected/790b2af7-c661-4e6e-9579-f338835ff45a-kube-api-access-t2sms\") pod \"cinder-db-create-qqg9j\" (UID: \"790b2af7-c661-4e6e-9579-f338835ff45a\") " pod="openstack/cinder-db-create-qqg9j" Dec 03 06:59:31 crc kubenswrapper[4475]: I1203 06:59:31.042820 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c92b6e31-02a9-4c39-8505-d8c3a9224862-operator-scripts\") pod \"heat-1b56-account-create-update-t9h78\" (UID: \"c92b6e31-02a9-4c39-8505-d8c3a9224862\") " pod="openstack/heat-1b56-account-create-update-t9h78" Dec 03 06:59:31 crc kubenswrapper[4475]: I1203 06:59:31.042851 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dkhb2\" 
(UniqueName: \"kubernetes.io/projected/c92b6e31-02a9-4c39-8505-d8c3a9224862-kube-api-access-dkhb2\") pod \"heat-1b56-account-create-update-t9h78\" (UID: \"c92b6e31-02a9-4c39-8505-d8c3a9224862\") " pod="openstack/heat-1b56-account-create-update-t9h78" Dec 03 06:59:31 crc kubenswrapper[4475]: I1203 06:59:31.042886 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f50cb109-7030-4a9a-9401-78f0296c1d4e-operator-scripts\") pod \"barbican-db-create-vtkm6\" (UID: \"f50cb109-7030-4a9a-9401-78f0296c1d4e\") " pod="openstack/barbican-db-create-vtkm6" Dec 03 06:59:31 crc kubenswrapper[4475]: I1203 06:59:31.043835 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/790b2af7-c661-4e6e-9579-f338835ff45a-operator-scripts\") pod \"cinder-db-create-qqg9j\" (UID: \"790b2af7-c661-4e6e-9579-f338835ff45a\") " pod="openstack/cinder-db-create-qqg9j" Dec 03 06:59:31 crc kubenswrapper[4475]: I1203 06:59:31.046790 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c92b6e31-02a9-4c39-8505-d8c3a9224862-operator-scripts\") pod \"heat-1b56-account-create-update-t9h78\" (UID: \"c92b6e31-02a9-4c39-8505-d8c3a9224862\") " pod="openstack/heat-1b56-account-create-update-t9h78" Dec 03 06:59:31 crc kubenswrapper[4475]: I1203 06:59:31.071774 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dkhb2\" (UniqueName: \"kubernetes.io/projected/c92b6e31-02a9-4c39-8505-d8c3a9224862-kube-api-access-dkhb2\") pod \"heat-1b56-account-create-update-t9h78\" (UID: \"c92b6e31-02a9-4c39-8505-d8c3a9224862\") " pod="openstack/heat-1b56-account-create-update-t9h78" Dec 03 06:59:31 crc kubenswrapper[4475]: I1203 06:59:31.071826 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-t2sms\" (UniqueName: \"kubernetes.io/projected/790b2af7-c661-4e6e-9579-f338835ff45a-kube-api-access-t2sms\") pod \"cinder-db-create-qqg9j\" (UID: \"790b2af7-c661-4e6e-9579-f338835ff45a\") " pod="openstack/cinder-db-create-qqg9j" Dec 03 06:59:31 crc kubenswrapper[4475]: I1203 06:59:31.128604 4475 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-db-sync-qvcvf"] Dec 03 06:59:31 crc kubenswrapper[4475]: I1203 06:59:31.129485 4475 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-sync-qvcvf" Dec 03 06:59:31 crc kubenswrapper[4475]: I1203 06:59:31.131078 4475 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-lszxw" Dec 03 06:59:31 crc kubenswrapper[4475]: I1203 06:59:31.138113 4475 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-ccmjt" Dec 03 06:59:31 crc kubenswrapper[4475]: I1203 06:59:31.138277 4475 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone" Dec 03 06:59:31 crc kubenswrapper[4475]: I1203 06:59:31.138341 4475 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data" Dec 03 06:59:31 crc kubenswrapper[4475]: I1203 06:59:31.138536 4475 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts" Dec 03 06:59:31 crc kubenswrapper[4475]: I1203 06:59:31.145088 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bdcn7\" (UniqueName: \"kubernetes.io/projected/6ab43dfd-1c80-4922-aab7-93dc3d3b7d27-kube-api-access-bdcn7\") pod \"keystone-db-sync-qvcvf\" (UID: \"6ab43dfd-1c80-4922-aab7-93dc3d3b7d27\") " pod="openstack/keystone-db-sync-qvcvf" Dec 03 06:59:31 crc kubenswrapper[4475]: I1203 06:59:31.145221 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f50cb109-7030-4a9a-9401-78f0296c1d4e-operator-scripts\") pod \"barbican-db-create-vtkm6\" (UID: \"f50cb109-7030-4a9a-9401-78f0296c1d4e\") " pod="openstack/barbican-db-create-vtkm6" Dec 03 06:59:31 crc kubenswrapper[4475]: I1203 06:59:31.145303 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-krxsk\" (UniqueName: \"kubernetes.io/projected/f50cb109-7030-4a9a-9401-78f0296c1d4e-kube-api-access-krxsk\") pod \"barbican-db-create-vtkm6\" (UID: \"f50cb109-7030-4a9a-9401-78f0296c1d4e\") " pod="openstack/barbican-db-create-vtkm6" Dec 03 06:59:31 crc kubenswrapper[4475]: I1203 06:59:31.145366 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6ab43dfd-1c80-4922-aab7-93dc3d3b7d27-config-data\") pod \"keystone-db-sync-qvcvf\" (UID: \"6ab43dfd-1c80-4922-aab7-93dc3d3b7d27\") " pod="openstack/keystone-db-sync-qvcvf" Dec 03 06:59:31 crc kubenswrapper[4475]: I1203 06:59:31.145415 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6ab43dfd-1c80-4922-aab7-93dc3d3b7d27-combined-ca-bundle\") pod \"keystone-db-sync-qvcvf\" (UID: \"6ab43dfd-1c80-4922-aab7-93dc3d3b7d27\") " pod="openstack/keystone-db-sync-qvcvf" Dec 03 06:59:31 crc kubenswrapper[4475]: I1203 06:59:31.147592 4475 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-db-create-qqg9j" Dec 03 06:59:31 crc kubenswrapper[4475]: I1203 06:59:31.150659 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f50cb109-7030-4a9a-9401-78f0296c1d4e-operator-scripts\") pod \"barbican-db-create-vtkm6\" (UID: \"f50cb109-7030-4a9a-9401-78f0296c1d4e\") " pod="openstack/barbican-db-create-vtkm6" Dec 03 06:59:31 crc kubenswrapper[4475]: I1203 06:59:31.151147 4475 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-sync-qvcvf"] Dec 03 06:59:31 crc kubenswrapper[4475]: I1203 06:59:31.158645 4475 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/heat-1b56-account-create-update-t9h78" Dec 03 06:59:31 crc kubenswrapper[4475]: I1203 06:59:31.199565 4475 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-9a21-account-create-update-jxlth"] Dec 03 06:59:31 crc kubenswrapper[4475]: I1203 06:59:31.200951 4475 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-9a21-account-create-update-jxlth" Dec 03 06:59:31 crc kubenswrapper[4475]: I1203 06:59:31.207125 4475 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-9a21-account-create-update-jxlth"] Dec 03 06:59:31 crc kubenswrapper[4475]: I1203 06:59:31.210717 4475 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-db-secret" Dec 03 06:59:31 crc kubenswrapper[4475]: I1203 06:59:31.217792 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-krxsk\" (UniqueName: \"kubernetes.io/projected/f50cb109-7030-4a9a-9401-78f0296c1d4e-kube-api-access-krxsk\") pod \"barbican-db-create-vtkm6\" (UID: \"f50cb109-7030-4a9a-9401-78f0296c1d4e\") " pod="openstack/barbican-db-create-vtkm6" Dec 03 06:59:31 crc kubenswrapper[4475]: I1203 06:59:31.248914 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6ab43dfd-1c80-4922-aab7-93dc3d3b7d27-combined-ca-bundle\") pod \"keystone-db-sync-qvcvf\" (UID: \"6ab43dfd-1c80-4922-aab7-93dc3d3b7d27\") " pod="openstack/keystone-db-sync-qvcvf" Dec 03 06:59:31 crc kubenswrapper[4475]: I1203 06:59:31.267609 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bdcn7\" (UniqueName: \"kubernetes.io/projected/6ab43dfd-1c80-4922-aab7-93dc3d3b7d27-kube-api-access-bdcn7\") pod \"keystone-db-sync-qvcvf\" (UID: \"6ab43dfd-1c80-4922-aab7-93dc3d3b7d27\") " pod="openstack/keystone-db-sync-qvcvf" Dec 03 06:59:31 crc kubenswrapper[4475]: I1203 06:59:31.267978 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6ab43dfd-1c80-4922-aab7-93dc3d3b7d27-config-data\") pod \"keystone-db-sync-qvcvf\" (UID: \"6ab43dfd-1c80-4922-aab7-93dc3d3b7d27\") " pod="openstack/keystone-db-sync-qvcvf" Dec 03 06:59:31 crc kubenswrapper[4475]: I1203 
06:59:31.279269 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6ab43dfd-1c80-4922-aab7-93dc3d3b7d27-config-data\") pod \"keystone-db-sync-qvcvf\" (UID: \"6ab43dfd-1c80-4922-aab7-93dc3d3b7d27\") " pod="openstack/keystone-db-sync-qvcvf" Dec 03 06:59:31 crc kubenswrapper[4475]: I1203 06:59:31.292173 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bdcn7\" (UniqueName: \"kubernetes.io/projected/6ab43dfd-1c80-4922-aab7-93dc3d3b7d27-kube-api-access-bdcn7\") pod \"keystone-db-sync-qvcvf\" (UID: \"6ab43dfd-1c80-4922-aab7-93dc3d3b7d27\") " pod="openstack/keystone-db-sync-qvcvf" Dec 03 06:59:31 crc kubenswrapper[4475]: I1203 06:59:31.305143 4475 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/heat-db-create-zdkhj"] Dec 03 06:59:31 crc kubenswrapper[4475]: I1203 06:59:31.308746 4475 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/heat-db-create-zdkhj" Dec 03 06:59:31 crc kubenswrapper[4475]: I1203 06:59:31.330882 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6ab43dfd-1c80-4922-aab7-93dc3d3b7d27-combined-ca-bundle\") pod \"keystone-db-sync-qvcvf\" (UID: \"6ab43dfd-1c80-4922-aab7-93dc3d3b7d27\") " pod="openstack/keystone-db-sync-qvcvf" Dec 03 06:59:31 crc kubenswrapper[4475]: I1203 06:59:31.358020 4475 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-db-create-vtkm6" Dec 03 06:59:31 crc kubenswrapper[4475]: I1203 06:59:31.370931 4475 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-db-create-zdkhj"] Dec 03 06:59:31 crc kubenswrapper[4475]: I1203 06:59:31.390861 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wgtwn\" (UniqueName: \"kubernetes.io/projected/5f4dac41-4d1c-4ad1-a5ee-802ce88143d3-kube-api-access-wgtwn\") pod \"heat-db-create-zdkhj\" (UID: \"5f4dac41-4d1c-4ad1-a5ee-802ce88143d3\") " pod="openstack/heat-db-create-zdkhj" Dec 03 06:59:31 crc kubenswrapper[4475]: I1203 06:59:31.390930 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/5f4dac41-4d1c-4ad1-a5ee-802ce88143d3-operator-scripts\") pod \"heat-db-create-zdkhj\" (UID: \"5f4dac41-4d1c-4ad1-a5ee-802ce88143d3\") " pod="openstack/heat-db-create-zdkhj" Dec 03 06:59:31 crc kubenswrapper[4475]: I1203 06:59:31.391196 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vmw5m\" (UniqueName: \"kubernetes.io/projected/e4684c84-5da0-44d3-a47e-0cd3e2cba943-kube-api-access-vmw5m\") pod \"cinder-9a21-account-create-update-jxlth\" (UID: \"e4684c84-5da0-44d3-a47e-0cd3e2cba943\") " pod="openstack/cinder-9a21-account-create-update-jxlth" Dec 03 06:59:31 crc kubenswrapper[4475]: I1203 06:59:31.391242 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e4684c84-5da0-44d3-a47e-0cd3e2cba943-operator-scripts\") pod \"cinder-9a21-account-create-update-jxlth\" (UID: \"e4684c84-5da0-44d3-a47e-0cd3e2cba943\") " pod="openstack/cinder-9a21-account-create-update-jxlth" Dec 03 06:59:31 crc kubenswrapper[4475]: I1203 06:59:31.403238 4475 kubelet.go:2421] "SyncLoop 
ADD" source="api" pods=["openstack/barbican-df9c-account-create-update-hsg55"] Dec 03 06:59:31 crc kubenswrapper[4475]: I1203 06:59:31.404739 4475 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-df9c-account-create-update-hsg55" Dec 03 06:59:31 crc kubenswrapper[4475]: I1203 06:59:31.410497 4475 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-db-secret" Dec 03 06:59:31 crc kubenswrapper[4475]: I1203 06:59:31.451760 4475 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-df9c-account-create-update-hsg55"] Dec 03 06:59:31 crc kubenswrapper[4475]: I1203 06:59:31.452846 4475 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-sync-qvcvf" Dec 03 06:59:31 crc kubenswrapper[4475]: I1203 06:59:31.466869 4475 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-db-create-nkqhw"] Dec 03 06:59:31 crc kubenswrapper[4475]: I1203 06:59:31.468201 4475 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-db-create-nkqhw" Dec 03 06:59:31 crc kubenswrapper[4475]: I1203 06:59:31.492228 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wgtwn\" (UniqueName: \"kubernetes.io/projected/5f4dac41-4d1c-4ad1-a5ee-802ce88143d3-kube-api-access-wgtwn\") pod \"heat-db-create-zdkhj\" (UID: \"5f4dac41-4d1c-4ad1-a5ee-802ce88143d3\") " pod="openstack/heat-db-create-zdkhj" Dec 03 06:59:31 crc kubenswrapper[4475]: I1203 06:59:31.492265 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/5f4dac41-4d1c-4ad1-a5ee-802ce88143d3-operator-scripts\") pod \"heat-db-create-zdkhj\" (UID: \"5f4dac41-4d1c-4ad1-a5ee-802ce88143d3\") " pod="openstack/heat-db-create-zdkhj" Dec 03 06:59:31 crc kubenswrapper[4475]: I1203 06:59:31.492331 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dvz22\" (UniqueName: \"kubernetes.io/projected/6a4c017b-d333-4421-be59-552865c2b025-kube-api-access-dvz22\") pod \"barbican-df9c-account-create-update-hsg55\" (UID: \"6a4c017b-d333-4421-be59-552865c2b025\") " pod="openstack/barbican-df9c-account-create-update-hsg55" Dec 03 06:59:31 crc kubenswrapper[4475]: I1203 06:59:31.492364 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s2hg6\" (UniqueName: \"kubernetes.io/projected/5045b615-e537-4115-9af6-764a3969ac1b-kube-api-access-s2hg6\") pod \"neutron-db-create-nkqhw\" (UID: \"5045b615-e537-4115-9af6-764a3969ac1b\") " pod="openstack/neutron-db-create-nkqhw" Dec 03 06:59:31 crc kubenswrapper[4475]: I1203 06:59:31.492400 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6a4c017b-d333-4421-be59-552865c2b025-operator-scripts\") pod 
\"barbican-df9c-account-create-update-hsg55\" (UID: \"6a4c017b-d333-4421-be59-552865c2b025\") " pod="openstack/barbican-df9c-account-create-update-hsg55" Dec 03 06:59:31 crc kubenswrapper[4475]: I1203 06:59:31.492425 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vmw5m\" (UniqueName: \"kubernetes.io/projected/e4684c84-5da0-44d3-a47e-0cd3e2cba943-kube-api-access-vmw5m\") pod \"cinder-9a21-account-create-update-jxlth\" (UID: \"e4684c84-5da0-44d3-a47e-0cd3e2cba943\") " pod="openstack/cinder-9a21-account-create-update-jxlth" Dec 03 06:59:31 crc kubenswrapper[4475]: I1203 06:59:31.492724 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e4684c84-5da0-44d3-a47e-0cd3e2cba943-operator-scripts\") pod \"cinder-9a21-account-create-update-jxlth\" (UID: \"e4684c84-5da0-44d3-a47e-0cd3e2cba943\") " pod="openstack/cinder-9a21-account-create-update-jxlth" Dec 03 06:59:31 crc kubenswrapper[4475]: I1203 06:59:31.492752 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/5045b615-e537-4115-9af6-764a3969ac1b-operator-scripts\") pod \"neutron-db-create-nkqhw\" (UID: \"5045b615-e537-4115-9af6-764a3969ac1b\") " pod="openstack/neutron-db-create-nkqhw" Dec 03 06:59:31 crc kubenswrapper[4475]: I1203 06:59:31.493129 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/5f4dac41-4d1c-4ad1-a5ee-802ce88143d3-operator-scripts\") pod \"heat-db-create-zdkhj\" (UID: \"5f4dac41-4d1c-4ad1-a5ee-802ce88143d3\") " pod="openstack/heat-db-create-zdkhj" Dec 03 06:59:31 crc kubenswrapper[4475]: I1203 06:59:31.493714 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: 
\"kubernetes.io/configmap/e4684c84-5da0-44d3-a47e-0cd3e2cba943-operator-scripts\") pod \"cinder-9a21-account-create-update-jxlth\" (UID: \"e4684c84-5da0-44d3-a47e-0cd3e2cba943\") " pod="openstack/cinder-9a21-account-create-update-jxlth" Dec 03 06:59:31 crc kubenswrapper[4475]: I1203 06:59:31.507076 4475 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-create-nkqhw"] Dec 03 06:59:31 crc kubenswrapper[4475]: I1203 06:59:31.526866 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wgtwn\" (UniqueName: \"kubernetes.io/projected/5f4dac41-4d1c-4ad1-a5ee-802ce88143d3-kube-api-access-wgtwn\") pod \"heat-db-create-zdkhj\" (UID: \"5f4dac41-4d1c-4ad1-a5ee-802ce88143d3\") " pod="openstack/heat-db-create-zdkhj" Dec 03 06:59:31 crc kubenswrapper[4475]: I1203 06:59:31.533159 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vmw5m\" (UniqueName: \"kubernetes.io/projected/e4684c84-5da0-44d3-a47e-0cd3e2cba943-kube-api-access-vmw5m\") pod \"cinder-9a21-account-create-update-jxlth\" (UID: \"e4684c84-5da0-44d3-a47e-0cd3e2cba943\") " pod="openstack/cinder-9a21-account-create-update-jxlth" Dec 03 06:59:31 crc kubenswrapper[4475]: I1203 06:59:31.551981 4475 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-9a21-account-create-update-jxlth" Dec 03 06:59:31 crc kubenswrapper[4475]: I1203 06:59:31.603161 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dvz22\" (UniqueName: \"kubernetes.io/projected/6a4c017b-d333-4421-be59-552865c2b025-kube-api-access-dvz22\") pod \"barbican-df9c-account-create-update-hsg55\" (UID: \"6a4c017b-d333-4421-be59-552865c2b025\") " pod="openstack/barbican-df9c-account-create-update-hsg55" Dec 03 06:59:31 crc kubenswrapper[4475]: I1203 06:59:31.603209 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2hg6\" (UniqueName: \"kubernetes.io/projected/5045b615-e537-4115-9af6-764a3969ac1b-kube-api-access-s2hg6\") pod \"neutron-db-create-nkqhw\" (UID: \"5045b615-e537-4115-9af6-764a3969ac1b\") " pod="openstack/neutron-db-create-nkqhw" Dec 03 06:59:31 crc kubenswrapper[4475]: I1203 06:59:31.603246 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6a4c017b-d333-4421-be59-552865c2b025-operator-scripts\") pod \"barbican-df9c-account-create-update-hsg55\" (UID: \"6a4c017b-d333-4421-be59-552865c2b025\") " pod="openstack/barbican-df9c-account-create-update-hsg55" Dec 03 06:59:31 crc kubenswrapper[4475]: I1203 06:59:31.603299 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/5045b615-e537-4115-9af6-764a3969ac1b-operator-scripts\") pod \"neutron-db-create-nkqhw\" (UID: \"5045b615-e537-4115-9af6-764a3969ac1b\") " pod="openstack/neutron-db-create-nkqhw" Dec 03 06:59:31 crc kubenswrapper[4475]: I1203 06:59:31.605173 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6a4c017b-d333-4421-be59-552865c2b025-operator-scripts\") pod 
\"barbican-df9c-account-create-update-hsg55\" (UID: \"6a4c017b-d333-4421-be59-552865c2b025\") " pod="openstack/barbican-df9c-account-create-update-hsg55" Dec 03 06:59:31 crc kubenswrapper[4475]: I1203 06:59:31.605683 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/5045b615-e537-4115-9af6-764a3969ac1b-operator-scripts\") pod \"neutron-db-create-nkqhw\" (UID: \"5045b615-e537-4115-9af6-764a3969ac1b\") " pod="openstack/neutron-db-create-nkqhw" Dec 03 06:59:31 crc kubenswrapper[4475]: I1203 06:59:31.636250 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s2hg6\" (UniqueName: \"kubernetes.io/projected/5045b615-e537-4115-9af6-764a3969ac1b-kube-api-access-s2hg6\") pod \"neutron-db-create-nkqhw\" (UID: \"5045b615-e537-4115-9af6-764a3969ac1b\") " pod="openstack/neutron-db-create-nkqhw" Dec 03 06:59:31 crc kubenswrapper[4475]: I1203 06:59:31.639237 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dvz22\" (UniqueName: \"kubernetes.io/projected/6a4c017b-d333-4421-be59-552865c2b025-kube-api-access-dvz22\") pod \"barbican-df9c-account-create-update-hsg55\" (UID: \"6a4c017b-d333-4421-be59-552865c2b025\") " pod="openstack/barbican-df9c-account-create-update-hsg55" Dec 03 06:59:31 crc kubenswrapper[4475]: I1203 06:59:31.644651 4475 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/heat-db-create-zdkhj" Dec 03 06:59:31 crc kubenswrapper[4475]: I1203 06:59:31.726551 4475 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-df9c-account-create-update-hsg55" Dec 03 06:59:31 crc kubenswrapper[4475]: I1203 06:59:31.786340 4475 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-c2f3-account-create-update-r8pw9"] Dec 03 06:59:31 crc kubenswrapper[4475]: I1203 06:59:31.787687 4475 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-c2f3-account-create-update-r8pw9" Dec 03 06:59:31 crc kubenswrapper[4475]: I1203 06:59:31.800287 4475 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-create-nkqhw" Dec 03 06:59:31 crc kubenswrapper[4475]: I1203 06:59:31.801058 4475 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-db-secret" Dec 03 06:59:31 crc kubenswrapper[4475]: I1203 06:59:31.817048 4475 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-c2f3-account-create-update-r8pw9"] Dec 03 06:59:31 crc kubenswrapper[4475]: I1203 06:59:31.915847 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f5915d05-922b-4df4-b6da-beadb7537e57-operator-scripts\") pod \"neutron-c2f3-account-create-update-r8pw9\" (UID: \"f5915d05-922b-4df4-b6da-beadb7537e57\") " pod="openstack/neutron-c2f3-account-create-update-r8pw9" Dec 03 06:59:31 crc kubenswrapper[4475]: I1203 06:59:31.916255 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7h7gk\" (UniqueName: \"kubernetes.io/projected/f5915d05-922b-4df4-b6da-beadb7537e57-kube-api-access-7h7gk\") pod \"neutron-c2f3-account-create-update-r8pw9\" (UID: \"f5915d05-922b-4df4-b6da-beadb7537e57\") " pod="openstack/neutron-c2f3-account-create-update-r8pw9" Dec 03 06:59:31 crc kubenswrapper[4475]: I1203 06:59:31.936668 4475 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-lszxw"] Dec 03 06:59:31 crc kubenswrapper[4475]: I1203 06:59:31.987600 4475 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-create-qqg9j"] Dec 03 06:59:32 crc kubenswrapper[4475]: I1203 06:59:32.018236 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: 
\"kubernetes.io/configmap/f5915d05-922b-4df4-b6da-beadb7537e57-operator-scripts\") pod \"neutron-c2f3-account-create-update-r8pw9\" (UID: \"f5915d05-922b-4df4-b6da-beadb7537e57\") " pod="openstack/neutron-c2f3-account-create-update-r8pw9" Dec 03 06:59:32 crc kubenswrapper[4475]: I1203 06:59:32.018633 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7h7gk\" (UniqueName: \"kubernetes.io/projected/f5915d05-922b-4df4-b6da-beadb7537e57-kube-api-access-7h7gk\") pod \"neutron-c2f3-account-create-update-r8pw9\" (UID: \"f5915d05-922b-4df4-b6da-beadb7537e57\") " pod="openstack/neutron-c2f3-account-create-update-r8pw9" Dec 03 06:59:32 crc kubenswrapper[4475]: I1203 06:59:32.019534 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f5915d05-922b-4df4-b6da-beadb7537e57-operator-scripts\") pod \"neutron-c2f3-account-create-update-r8pw9\" (UID: \"f5915d05-922b-4df4-b6da-beadb7537e57\") " pod="openstack/neutron-c2f3-account-create-update-r8pw9" Dec 03 06:59:32 crc kubenswrapper[4475]: I1203 06:59:32.044441 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7h7gk\" (UniqueName: \"kubernetes.io/projected/f5915d05-922b-4df4-b6da-beadb7537e57-kube-api-access-7h7gk\") pod \"neutron-c2f3-account-create-update-r8pw9\" (UID: \"f5915d05-922b-4df4-b6da-beadb7537e57\") " pod="openstack/neutron-c2f3-account-create-update-r8pw9" Dec 03 06:59:32 crc kubenswrapper[4475]: I1203 06:59:32.142896 4475 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-c2f3-account-create-update-r8pw9" Dec 03 06:59:32 crc kubenswrapper[4475]: I1203 06:59:32.230429 4475 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-1b56-account-create-update-t9h78"] Dec 03 06:59:32 crc kubenswrapper[4475]: I1203 06:59:32.402442 4475 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-sync-qvcvf"] Dec 03 06:59:32 crc kubenswrapper[4475]: I1203 06:59:32.441065 4475 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-create-vtkm6"] Dec 03 06:59:32 crc kubenswrapper[4475]: W1203 06:59:32.463235 4475 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod6ab43dfd_1c80_4922_aab7_93dc3d3b7d27.slice/crio-9a29f6deda89c9ec1e835a0fc75287e37c91ad6e3ad2b8a0607d20c9f594efa8 WatchSource:0}: Error finding container 9a29f6deda89c9ec1e835a0fc75287e37c91ad6e3ad2b8a0607d20c9f594efa8: Status 404 returned error can't find the container with id 9a29f6deda89c9ec1e835a0fc75287e37c91ad6e3ad2b8a0607d20c9f594efa8 Dec 03 06:59:32 crc kubenswrapper[4475]: I1203 06:59:32.600883 4475 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-create-nkqhw"] Dec 03 06:59:32 crc kubenswrapper[4475]: I1203 06:59:32.649545 4475 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-df9c-account-create-update-hsg55"] Dec 03 06:59:32 crc kubenswrapper[4475]: I1203 06:59:32.650030 4475 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-9a21-account-create-update-jxlth"] Dec 03 06:59:32 crc kubenswrapper[4475]: I1203 06:59:32.657747 4475 generic.go:334] "Generic (PLEG): container finished" podID="d3affe8e-b674-46c4-bf4b-b6fc0d092df7" containerID="34df3fb27c46f2845cdaaeea17f1dc63255e3f2787e7e2c80255093fa4845705" exitCode=0 Dec 03 06:59:32 crc kubenswrapper[4475]: I1203 06:59:32.658405 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/community-operators-lszxw" event={"ID":"d3affe8e-b674-46c4-bf4b-b6fc0d092df7","Type":"ContainerDied","Data":"34df3fb27c46f2845cdaaeea17f1dc63255e3f2787e7e2c80255093fa4845705"} Dec 03 06:59:32 crc kubenswrapper[4475]: I1203 06:59:32.658434 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-lszxw" event={"ID":"d3affe8e-b674-46c4-bf4b-b6fc0d092df7","Type":"ContainerStarted","Data":"40e0b4f81f7a967345f4cdcf4e811aae953376b0c647ce65fd815da6846490c8"} Dec 03 06:59:32 crc kubenswrapper[4475]: I1203 06:59:32.675266 4475 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-sync-psvbz" Dec 03 06:59:32 crc kubenswrapper[4475]: I1203 06:59:32.678742 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-create-qqg9j" event={"ID":"790b2af7-c661-4e6e-9579-f338835ff45a","Type":"ContainerStarted","Data":"aa5215452a088711ead5b20bfab48d5e3e864635f2bcb584e50859ca1223fe0a"} Dec 03 06:59:32 crc kubenswrapper[4475]: I1203 06:59:32.691419 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-qvcvf" event={"ID":"6ab43dfd-1c80-4922-aab7-93dc3d3b7d27","Type":"ContainerStarted","Data":"9a29f6deda89c9ec1e835a0fc75287e37c91ad6e3ad2b8a0607d20c9f594efa8"} Dec 03 06:59:32 crc kubenswrapper[4475]: I1203 06:59:32.702827 4475 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-db-create-zdkhj"] Dec 03 06:59:32 crc kubenswrapper[4475]: I1203 06:59:32.709652 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-1b56-account-create-update-t9h78" event={"ID":"c92b6e31-02a9-4c39-8505-d8c3a9224862","Type":"ContainerStarted","Data":"c654899e46c04c451cbd1a4f04a1d68ce1ab69ffceef0a0ed72e1fda7edf324f"} Dec 03 06:59:32 crc kubenswrapper[4475]: I1203 06:59:32.732630 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-psvbz" 
event={"ID":"3e136d3d-bbe5-44b0-ad48-4e560507aeac","Type":"ContainerDied","Data":"9d104dfbfd7cd99ea765aa264ec557518ee58f2f83635d4b51cba4519dc73092"} Dec 03 06:59:32 crc kubenswrapper[4475]: I1203 06:59:32.732661 4475 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="9d104dfbfd7cd99ea765aa264ec557518ee58f2f83635d4b51cba4519dc73092" Dec 03 06:59:32 crc kubenswrapper[4475]: I1203 06:59:32.732720 4475 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-sync-psvbz" Dec 03 06:59:32 crc kubenswrapper[4475]: I1203 06:59:32.744483 4475 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/3e136d3d-bbe5-44b0-ad48-4e560507aeac-db-sync-config-data\") pod \"3e136d3d-bbe5-44b0-ad48-4e560507aeac\" (UID: \"3e136d3d-bbe5-44b0-ad48-4e560507aeac\") " Dec 03 06:59:32 crc kubenswrapper[4475]: I1203 06:59:32.744602 4475 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5jrtc\" (UniqueName: \"kubernetes.io/projected/3e136d3d-bbe5-44b0-ad48-4e560507aeac-kube-api-access-5jrtc\") pod \"3e136d3d-bbe5-44b0-ad48-4e560507aeac\" (UID: \"3e136d3d-bbe5-44b0-ad48-4e560507aeac\") " Dec 03 06:59:32 crc kubenswrapper[4475]: I1203 06:59:32.744708 4475 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3e136d3d-bbe5-44b0-ad48-4e560507aeac-combined-ca-bundle\") pod \"3e136d3d-bbe5-44b0-ad48-4e560507aeac\" (UID: \"3e136d3d-bbe5-44b0-ad48-4e560507aeac\") " Dec 03 06:59:32 crc kubenswrapper[4475]: I1203 06:59:32.744748 4475 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3e136d3d-bbe5-44b0-ad48-4e560507aeac-config-data\") pod \"3e136d3d-bbe5-44b0-ad48-4e560507aeac\" (UID: \"3e136d3d-bbe5-44b0-ad48-4e560507aeac\") " Dec 03 06:59:32 
crc kubenswrapper[4475]: I1203 06:59:32.758663 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-create-vtkm6" event={"ID":"f50cb109-7030-4a9a-9401-78f0296c1d4e","Type":"ContainerStarted","Data":"352b4208da89e3b9638b0f43328bd78868bcc14e11b33b726e93c5595f0186b1"} Dec 03 06:59:32 crc kubenswrapper[4475]: I1203 06:59:32.760499 4475 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3e136d3d-bbe5-44b0-ad48-4e560507aeac-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "3e136d3d-bbe5-44b0-ad48-4e560507aeac" (UID: "3e136d3d-bbe5-44b0-ad48-4e560507aeac"). InnerVolumeSpecName "db-sync-config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 06:59:32 crc kubenswrapper[4475]: I1203 06:59:32.773026 4475 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3e136d3d-bbe5-44b0-ad48-4e560507aeac-kube-api-access-5jrtc" (OuterVolumeSpecName: "kube-api-access-5jrtc") pod "3e136d3d-bbe5-44b0-ad48-4e560507aeac" (UID: "3e136d3d-bbe5-44b0-ad48-4e560507aeac"). InnerVolumeSpecName "kube-api-access-5jrtc". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 06:59:32 crc kubenswrapper[4475]: I1203 06:59:32.815988 4475 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-c2f3-account-create-update-r8pw9"] Dec 03 06:59:32 crc kubenswrapper[4475]: I1203 06:59:32.826235 4475 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3e136d3d-bbe5-44b0-ad48-4e560507aeac-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "3e136d3d-bbe5-44b0-ad48-4e560507aeac" (UID: "3e136d3d-bbe5-44b0-ad48-4e560507aeac"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 06:59:32 crc kubenswrapper[4475]: I1203 06:59:32.846714 4475 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3e136d3d-bbe5-44b0-ad48-4e560507aeac-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 03 06:59:32 crc kubenswrapper[4475]: I1203 06:59:32.846745 4475 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/3e136d3d-bbe5-44b0-ad48-4e560507aeac-db-sync-config-data\") on node \"crc\" DevicePath \"\"" Dec 03 06:59:32 crc kubenswrapper[4475]: I1203 06:59:32.846776 4475 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5jrtc\" (UniqueName: \"kubernetes.io/projected/3e136d3d-bbe5-44b0-ad48-4e560507aeac-kube-api-access-5jrtc\") on node \"crc\" DevicePath \"\"" Dec 03 06:59:32 crc kubenswrapper[4475]: I1203 06:59:32.975599 4475 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3e136d3d-bbe5-44b0-ad48-4e560507aeac-config-data" (OuterVolumeSpecName: "config-data") pod "3e136d3d-bbe5-44b0-ad48-4e560507aeac" (UID: "3e136d3d-bbe5-44b0-ad48-4e560507aeac"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 06:59:33 crc kubenswrapper[4475]: I1203 06:59:33.056586 4475 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3e136d3d-bbe5-44b0-ad48-4e560507aeac-config-data\") on node \"crc\" DevicePath \"\"" Dec 03 06:59:33 crc kubenswrapper[4475]: I1203 06:59:33.768693 4475 generic.go:334] "Generic (PLEG): container finished" podID="6a4c017b-d333-4421-be59-552865c2b025" containerID="ae3b4b724cb3f2d7bb3629dc55c6a6fe81ddcd07954065b00cf111595fe2b905" exitCode=0 Dec 03 06:59:33 crc kubenswrapper[4475]: I1203 06:59:33.769659 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-df9c-account-create-update-hsg55" event={"ID":"6a4c017b-d333-4421-be59-552865c2b025","Type":"ContainerDied","Data":"ae3b4b724cb3f2d7bb3629dc55c6a6fe81ddcd07954065b00cf111595fe2b905"} Dec 03 06:59:33 crc kubenswrapper[4475]: I1203 06:59:33.769693 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-df9c-account-create-update-hsg55" event={"ID":"6a4c017b-d333-4421-be59-552865c2b025","Type":"ContainerStarted","Data":"d4e3da973fe8a1dec8c5ef1f596648c80aa008389ad4f9d7d92ea0a4bcec3664"} Dec 03 06:59:33 crc kubenswrapper[4475]: I1203 06:59:33.771198 4475 generic.go:334] "Generic (PLEG): container finished" podID="f5915d05-922b-4df4-b6da-beadb7537e57" containerID="4710f6a47201f53eb06128e76c9a9e68c7031c7d594239172a8e1090b4ec8184" exitCode=0 Dec 03 06:59:33 crc kubenswrapper[4475]: I1203 06:59:33.771243 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-c2f3-account-create-update-r8pw9" event={"ID":"f5915d05-922b-4df4-b6da-beadb7537e57","Type":"ContainerDied","Data":"4710f6a47201f53eb06128e76c9a9e68c7031c7d594239172a8e1090b4ec8184"} Dec 03 06:59:33 crc kubenswrapper[4475]: I1203 06:59:33.771292 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-c2f3-account-create-update-r8pw9" 
event={"ID":"f5915d05-922b-4df4-b6da-beadb7537e57","Type":"ContainerStarted","Data":"b7ebc6c84a69e698fd7c576b5b284416e4f78a0b0ded4e69a2afc97c8eb595ec"} Dec 03 06:59:33 crc kubenswrapper[4475]: I1203 06:59:33.772361 4475 generic.go:334] "Generic (PLEG): container finished" podID="c92b6e31-02a9-4c39-8505-d8c3a9224862" containerID="65e7abf5c94ef3ae2ade2a98b65eb05e19378e09ecd86c0233a0cc46629fb33f" exitCode=0 Dec 03 06:59:33 crc kubenswrapper[4475]: I1203 06:59:33.772663 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-1b56-account-create-update-t9h78" event={"ID":"c92b6e31-02a9-4c39-8505-d8c3a9224862","Type":"ContainerDied","Data":"65e7abf5c94ef3ae2ade2a98b65eb05e19378e09ecd86c0233a0cc46629fb33f"} Dec 03 06:59:33 crc kubenswrapper[4475]: I1203 06:59:33.774567 4475 generic.go:334] "Generic (PLEG): container finished" podID="e4684c84-5da0-44d3-a47e-0cd3e2cba943" containerID="6e4bffe4be48739788a9819685203738ee99fe18002f69aefa2e25639ad8edf3" exitCode=0 Dec 03 06:59:33 crc kubenswrapper[4475]: I1203 06:59:33.774628 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-9a21-account-create-update-jxlth" event={"ID":"e4684c84-5da0-44d3-a47e-0cd3e2cba943","Type":"ContainerDied","Data":"6e4bffe4be48739788a9819685203738ee99fe18002f69aefa2e25639ad8edf3"} Dec 03 06:59:33 crc kubenswrapper[4475]: I1203 06:59:33.774648 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-9a21-account-create-update-jxlth" event={"ID":"e4684c84-5da0-44d3-a47e-0cd3e2cba943","Type":"ContainerStarted","Data":"9b159a8633cb47f2a430d5ade096b425a37efcae66b7bf0a53a13fbfaf13a2be"} Dec 03 06:59:33 crc kubenswrapper[4475]: I1203 06:59:33.776122 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-lszxw" event={"ID":"d3affe8e-b674-46c4-bf4b-b6fc0d092df7","Type":"ContainerStarted","Data":"6786e4d0920435fefcbc526e8d9e7a22ec41877e153ff4fcc74af9d9809a1cdc"} Dec 03 06:59:33 crc kubenswrapper[4475]: 
I1203 06:59:33.783220 4475 generic.go:334] "Generic (PLEG): container finished" podID="790b2af7-c661-4e6e-9579-f338835ff45a" containerID="90100f7a1425b499192d4527a35a5d3de7bd153303b05fb114235c18870b6a98" exitCode=0 Dec 03 06:59:33 crc kubenswrapper[4475]: I1203 06:59:33.783256 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-create-qqg9j" event={"ID":"790b2af7-c661-4e6e-9579-f338835ff45a","Type":"ContainerDied","Data":"90100f7a1425b499192d4527a35a5d3de7bd153303b05fb114235c18870b6a98"} Dec 03 06:59:33 crc kubenswrapper[4475]: I1203 06:59:33.784555 4475 generic.go:334] "Generic (PLEG): container finished" podID="5f4dac41-4d1c-4ad1-a5ee-802ce88143d3" containerID="bb62d6c562cddec704fa5803e771661ac36eadf00ca764823d7d39fc6b47aa5e" exitCode=0 Dec 03 06:59:33 crc kubenswrapper[4475]: I1203 06:59:33.784597 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-db-create-zdkhj" event={"ID":"5f4dac41-4d1c-4ad1-a5ee-802ce88143d3","Type":"ContainerDied","Data":"bb62d6c562cddec704fa5803e771661ac36eadf00ca764823d7d39fc6b47aa5e"} Dec 03 06:59:33 crc kubenswrapper[4475]: I1203 06:59:33.784631 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-db-create-zdkhj" event={"ID":"5f4dac41-4d1c-4ad1-a5ee-802ce88143d3","Type":"ContainerStarted","Data":"26305cc5ce0c845faf831eb6dc146460a9f3ccfe88e0bb143d914d905cfe1aee"} Dec 03 06:59:33 crc kubenswrapper[4475]: I1203 06:59:33.786205 4475 generic.go:334] "Generic (PLEG): container finished" podID="5045b615-e537-4115-9af6-764a3969ac1b" containerID="409ac52d52ca48240fcbb42875bd144f05939dd9a77cc52eee561d5b28656b00" exitCode=0 Dec 03 06:59:33 crc kubenswrapper[4475]: I1203 06:59:33.786246 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-create-nkqhw" event={"ID":"5045b615-e537-4115-9af6-764a3969ac1b","Type":"ContainerDied","Data":"409ac52d52ca48240fcbb42875bd144f05939dd9a77cc52eee561d5b28656b00"} Dec 03 06:59:33 crc kubenswrapper[4475]: I1203 
06:59:33.786281 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-create-nkqhw" event={"ID":"5045b615-e537-4115-9af6-764a3969ac1b","Type":"ContainerStarted","Data":"6d2e9604bcddc6de3656723b6e64afa6e26949897ad11cd31658e3c2866cb9cb"} Dec 03 06:59:33 crc kubenswrapper[4475]: I1203 06:59:33.790589 4475 generic.go:334] "Generic (PLEG): container finished" podID="f50cb109-7030-4a9a-9401-78f0296c1d4e" containerID="69b72a6166c127d3fabe2dd5a580369f0ef9a9aae7532f8873500d08d4b089b0" exitCode=0 Dec 03 06:59:33 crc kubenswrapper[4475]: I1203 06:59:33.790620 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-create-vtkm6" event={"ID":"f50cb109-7030-4a9a-9401-78f0296c1d4e","Type":"ContainerDied","Data":"69b72a6166c127d3fabe2dd5a580369f0ef9a9aae7532f8873500d08d4b089b0"} Dec 03 06:59:34 crc kubenswrapper[4475]: E1203 06:59:34.114505 4475 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd3affe8e_b674_46c4_bf4b_b6fc0d092df7.slice/crio-conmon-6786e4d0920435fefcbc526e8d9e7a22ec41877e153ff4fcc74af9d9809a1cdc.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd3affe8e_b674_46c4_bf4b_b6fc0d092df7.slice/crio-6786e4d0920435fefcbc526e8d9e7a22ec41877e153ff4fcc74af9d9809a1cdc.scope\": RecentStats: unable to find data in memory cache]" Dec 03 06:59:34 crc kubenswrapper[4475]: I1203 06:59:34.423729 4475 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-8cbcccb95-whk2c"] Dec 03 06:59:34 crc kubenswrapper[4475]: I1203 06:59:34.424043 4475 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-8cbcccb95-whk2c" podUID="fee5b066-7e4f-4c02-a0de-0eeb152c9887" containerName="dnsmasq-dns" containerID="cri-o://614b87f38bf22f7f1ad9f68624a1b7aed6d72b61ff96d4fc49ea40bf8367762b" 
gracePeriod=10 Dec 03 06:59:34 crc kubenswrapper[4475]: I1203 06:59:34.425662 4475 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-8cbcccb95-whk2c" Dec 03 06:59:34 crc kubenswrapper[4475]: I1203 06:59:34.467495 4475 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-5f67cb5455-r956l"] Dec 03 06:59:34 crc kubenswrapper[4475]: E1203 06:59:34.467819 4475 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3e136d3d-bbe5-44b0-ad48-4e560507aeac" containerName="glance-db-sync" Dec 03 06:59:34 crc kubenswrapper[4475]: I1203 06:59:34.467836 4475 state_mem.go:107] "Deleted CPUSet assignment" podUID="3e136d3d-bbe5-44b0-ad48-4e560507aeac" containerName="glance-db-sync" Dec 03 06:59:34 crc kubenswrapper[4475]: I1203 06:59:34.468019 4475 memory_manager.go:354] "RemoveStaleState removing state" podUID="3e136d3d-bbe5-44b0-ad48-4e560507aeac" containerName="glance-db-sync" Dec 03 06:59:34 crc kubenswrapper[4475]: I1203 06:59:34.468771 4475 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5f67cb5455-r956l" Dec 03 06:59:34 crc kubenswrapper[4475]: I1203 06:59:34.506722 4475 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5f67cb5455-r956l"] Dec 03 06:59:34 crc kubenswrapper[4475]: I1203 06:59:34.591584 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m2m89\" (UniqueName: \"kubernetes.io/projected/20f60917-1d39-46b7-a37a-d33efdb6de0f-kube-api-access-m2m89\") pod \"dnsmasq-dns-5f67cb5455-r956l\" (UID: \"20f60917-1d39-46b7-a37a-d33efdb6de0f\") " pod="openstack/dnsmasq-dns-5f67cb5455-r956l" Dec 03 06:59:34 crc kubenswrapper[4475]: I1203 06:59:34.591650 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/20f60917-1d39-46b7-a37a-d33efdb6de0f-ovsdbserver-sb\") pod \"dnsmasq-dns-5f67cb5455-r956l\" (UID: \"20f60917-1d39-46b7-a37a-d33efdb6de0f\") " pod="openstack/dnsmasq-dns-5f67cb5455-r956l" Dec 03 06:59:34 crc kubenswrapper[4475]: I1203 06:59:34.591674 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/20f60917-1d39-46b7-a37a-d33efdb6de0f-dns-swift-storage-0\") pod \"dnsmasq-dns-5f67cb5455-r956l\" (UID: \"20f60917-1d39-46b7-a37a-d33efdb6de0f\") " pod="openstack/dnsmasq-dns-5f67cb5455-r956l" Dec 03 06:59:34 crc kubenswrapper[4475]: I1203 06:59:34.591703 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/20f60917-1d39-46b7-a37a-d33efdb6de0f-config\") pod \"dnsmasq-dns-5f67cb5455-r956l\" (UID: \"20f60917-1d39-46b7-a37a-d33efdb6de0f\") " pod="openstack/dnsmasq-dns-5f67cb5455-r956l" Dec 03 06:59:34 crc kubenswrapper[4475]: I1203 06:59:34.591758 4475 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/20f60917-1d39-46b7-a37a-d33efdb6de0f-dns-svc\") pod \"dnsmasq-dns-5f67cb5455-r956l\" (UID: \"20f60917-1d39-46b7-a37a-d33efdb6de0f\") " pod="openstack/dnsmasq-dns-5f67cb5455-r956l" Dec 03 06:59:34 crc kubenswrapper[4475]: I1203 06:59:34.591791 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/20f60917-1d39-46b7-a37a-d33efdb6de0f-ovsdbserver-nb\") pod \"dnsmasq-dns-5f67cb5455-r956l\" (UID: \"20f60917-1d39-46b7-a37a-d33efdb6de0f\") " pod="openstack/dnsmasq-dns-5f67cb5455-r956l" Dec 03 06:59:34 crc kubenswrapper[4475]: I1203 06:59:34.693441 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/20f60917-1d39-46b7-a37a-d33efdb6de0f-dns-swift-storage-0\") pod \"dnsmasq-dns-5f67cb5455-r956l\" (UID: \"20f60917-1d39-46b7-a37a-d33efdb6de0f\") " pod="openstack/dnsmasq-dns-5f67cb5455-r956l" Dec 03 06:59:34 crc kubenswrapper[4475]: I1203 06:59:34.693529 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/20f60917-1d39-46b7-a37a-d33efdb6de0f-config\") pod \"dnsmasq-dns-5f67cb5455-r956l\" (UID: \"20f60917-1d39-46b7-a37a-d33efdb6de0f\") " pod="openstack/dnsmasq-dns-5f67cb5455-r956l" Dec 03 06:59:34 crc kubenswrapper[4475]: I1203 06:59:34.693616 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/20f60917-1d39-46b7-a37a-d33efdb6de0f-dns-svc\") pod \"dnsmasq-dns-5f67cb5455-r956l\" (UID: \"20f60917-1d39-46b7-a37a-d33efdb6de0f\") " pod="openstack/dnsmasq-dns-5f67cb5455-r956l" Dec 03 06:59:34 crc kubenswrapper[4475]: I1203 06:59:34.693677 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/20f60917-1d39-46b7-a37a-d33efdb6de0f-ovsdbserver-nb\") pod \"dnsmasq-dns-5f67cb5455-r956l\" (UID: \"20f60917-1d39-46b7-a37a-d33efdb6de0f\") " pod="openstack/dnsmasq-dns-5f67cb5455-r956l" Dec 03 06:59:34 crc kubenswrapper[4475]: I1203 06:59:34.693755 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-m2m89\" (UniqueName: \"kubernetes.io/projected/20f60917-1d39-46b7-a37a-d33efdb6de0f-kube-api-access-m2m89\") pod \"dnsmasq-dns-5f67cb5455-r956l\" (UID: \"20f60917-1d39-46b7-a37a-d33efdb6de0f\") " pod="openstack/dnsmasq-dns-5f67cb5455-r956l" Dec 03 06:59:34 crc kubenswrapper[4475]: I1203 06:59:34.693799 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/20f60917-1d39-46b7-a37a-d33efdb6de0f-ovsdbserver-sb\") pod \"dnsmasq-dns-5f67cb5455-r956l\" (UID: \"20f60917-1d39-46b7-a37a-d33efdb6de0f\") " pod="openstack/dnsmasq-dns-5f67cb5455-r956l" Dec 03 06:59:34 crc kubenswrapper[4475]: I1203 06:59:34.694754 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/20f60917-1d39-46b7-a37a-d33efdb6de0f-dns-swift-storage-0\") pod \"dnsmasq-dns-5f67cb5455-r956l\" (UID: \"20f60917-1d39-46b7-a37a-d33efdb6de0f\") " pod="openstack/dnsmasq-dns-5f67cb5455-r956l" Dec 03 06:59:34 crc kubenswrapper[4475]: I1203 06:59:34.694785 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/20f60917-1d39-46b7-a37a-d33efdb6de0f-config\") pod \"dnsmasq-dns-5f67cb5455-r956l\" (UID: \"20f60917-1d39-46b7-a37a-d33efdb6de0f\") " pod="openstack/dnsmasq-dns-5f67cb5455-r956l" Dec 03 06:59:34 crc kubenswrapper[4475]: I1203 06:59:34.695144 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: 
\"kubernetes.io/configmap/20f60917-1d39-46b7-a37a-d33efdb6de0f-dns-svc\") pod \"dnsmasq-dns-5f67cb5455-r956l\" (UID: \"20f60917-1d39-46b7-a37a-d33efdb6de0f\") " pod="openstack/dnsmasq-dns-5f67cb5455-r956l" Dec 03 06:59:34 crc kubenswrapper[4475]: I1203 06:59:34.695161 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/20f60917-1d39-46b7-a37a-d33efdb6de0f-ovsdbserver-nb\") pod \"dnsmasq-dns-5f67cb5455-r956l\" (UID: \"20f60917-1d39-46b7-a37a-d33efdb6de0f\") " pod="openstack/dnsmasq-dns-5f67cb5455-r956l" Dec 03 06:59:34 crc kubenswrapper[4475]: I1203 06:59:34.696793 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/20f60917-1d39-46b7-a37a-d33efdb6de0f-ovsdbserver-sb\") pod \"dnsmasq-dns-5f67cb5455-r956l\" (UID: \"20f60917-1d39-46b7-a37a-d33efdb6de0f\") " pod="openstack/dnsmasq-dns-5f67cb5455-r956l" Dec 03 06:59:34 crc kubenswrapper[4475]: I1203 06:59:34.716673 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-m2m89\" (UniqueName: \"kubernetes.io/projected/20f60917-1d39-46b7-a37a-d33efdb6de0f-kube-api-access-m2m89\") pod \"dnsmasq-dns-5f67cb5455-r956l\" (UID: \"20f60917-1d39-46b7-a37a-d33efdb6de0f\") " pod="openstack/dnsmasq-dns-5f67cb5455-r956l" Dec 03 06:59:34 crc kubenswrapper[4475]: I1203 06:59:34.799533 4475 generic.go:334] "Generic (PLEG): container finished" podID="fee5b066-7e4f-4c02-a0de-0eeb152c9887" containerID="614b87f38bf22f7f1ad9f68624a1b7aed6d72b61ff96d4fc49ea40bf8367762b" exitCode=0 Dec 03 06:59:34 crc kubenswrapper[4475]: I1203 06:59:34.799636 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-8cbcccb95-whk2c" event={"ID":"fee5b066-7e4f-4c02-a0de-0eeb152c9887","Type":"ContainerDied","Data":"614b87f38bf22f7f1ad9f68624a1b7aed6d72b61ff96d4fc49ea40bf8367762b"} Dec 03 06:59:34 crc kubenswrapper[4475]: I1203 06:59:34.802189 4475 
generic.go:334] "Generic (PLEG): container finished" podID="d3affe8e-b674-46c4-bf4b-b6fc0d092df7" containerID="6786e4d0920435fefcbc526e8d9e7a22ec41877e153ff4fcc74af9d9809a1cdc" exitCode=0 Dec 03 06:59:34 crc kubenswrapper[4475]: I1203 06:59:34.802285 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-lszxw" event={"ID":"d3affe8e-b674-46c4-bf4b-b6fc0d092df7","Type":"ContainerDied","Data":"6786e4d0920435fefcbc526e8d9e7a22ec41877e153ff4fcc74af9d9809a1cdc"} Dec 03 06:59:34 crc kubenswrapper[4475]: I1203 06:59:34.824315 4475 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5f67cb5455-r956l" Dec 03 06:59:35 crc kubenswrapper[4475]: I1203 06:59:35.002833 4475 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-8cbcccb95-whk2c" Dec 03 06:59:35 crc kubenswrapper[4475]: I1203 06:59:35.102510 4475 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/fee5b066-7e4f-4c02-a0de-0eeb152c9887-dns-swift-storage-0\") pod \"fee5b066-7e4f-4c02-a0de-0eeb152c9887\" (UID: \"fee5b066-7e4f-4c02-a0de-0eeb152c9887\") " Dec 03 06:59:35 crc kubenswrapper[4475]: I1203 06:59:35.102649 4475 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/fee5b066-7e4f-4c02-a0de-0eeb152c9887-ovsdbserver-sb\") pod \"fee5b066-7e4f-4c02-a0de-0eeb152c9887\" (UID: \"fee5b066-7e4f-4c02-a0de-0eeb152c9887\") " Dec 03 06:59:35 crc kubenswrapper[4475]: I1203 06:59:35.102722 4475 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/fee5b066-7e4f-4c02-a0de-0eeb152c9887-dns-svc\") pod \"fee5b066-7e4f-4c02-a0de-0eeb152c9887\" (UID: \"fee5b066-7e4f-4c02-a0de-0eeb152c9887\") " Dec 03 06:59:35 crc kubenswrapper[4475]: I1203 
06:59:35.102754 4475 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-q7cqr\" (UniqueName: \"kubernetes.io/projected/fee5b066-7e4f-4c02-a0de-0eeb152c9887-kube-api-access-q7cqr\") pod \"fee5b066-7e4f-4c02-a0de-0eeb152c9887\" (UID: \"fee5b066-7e4f-4c02-a0de-0eeb152c9887\") " Dec 03 06:59:35 crc kubenswrapper[4475]: I1203 06:59:35.102772 4475 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fee5b066-7e4f-4c02-a0de-0eeb152c9887-config\") pod \"fee5b066-7e4f-4c02-a0de-0eeb152c9887\" (UID: \"fee5b066-7e4f-4c02-a0de-0eeb152c9887\") " Dec 03 06:59:35 crc kubenswrapper[4475]: I1203 06:59:35.102817 4475 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/fee5b066-7e4f-4c02-a0de-0eeb152c9887-ovsdbserver-nb\") pod \"fee5b066-7e4f-4c02-a0de-0eeb152c9887\" (UID: \"fee5b066-7e4f-4c02-a0de-0eeb152c9887\") " Dec 03 06:59:35 crc kubenswrapper[4475]: I1203 06:59:35.116094 4475 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fee5b066-7e4f-4c02-a0de-0eeb152c9887-kube-api-access-q7cqr" (OuterVolumeSpecName: "kube-api-access-q7cqr") pod "fee5b066-7e4f-4c02-a0de-0eeb152c9887" (UID: "fee5b066-7e4f-4c02-a0de-0eeb152c9887"). InnerVolumeSpecName "kube-api-access-q7cqr". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 06:59:35 crc kubenswrapper[4475]: I1203 06:59:35.142737 4475 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fee5b066-7e4f-4c02-a0de-0eeb152c9887-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "fee5b066-7e4f-4c02-a0de-0eeb152c9887" (UID: "fee5b066-7e4f-4c02-a0de-0eeb152c9887"). InnerVolumeSpecName "dns-swift-storage-0". 
PluginName "kubernetes.io/configmap", VolumeGidValue ""
Dec 03 06:59:35 crc kubenswrapper[4475]: I1203 06:59:35.154162 4475 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fee5b066-7e4f-4c02-a0de-0eeb152c9887-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "fee5b066-7e4f-4c02-a0de-0eeb152c9887" (UID: "fee5b066-7e4f-4c02-a0de-0eeb152c9887"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Dec 03 06:59:35 crc kubenswrapper[4475]: I1203 06:59:35.159252 4475 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fee5b066-7e4f-4c02-a0de-0eeb152c9887-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "fee5b066-7e4f-4c02-a0de-0eeb152c9887" (UID: "fee5b066-7e4f-4c02-a0de-0eeb152c9887"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Dec 03 06:59:35 crc kubenswrapper[4475]: I1203 06:59:35.163793 4475 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fee5b066-7e4f-4c02-a0de-0eeb152c9887-config" (OuterVolumeSpecName: "config") pod "fee5b066-7e4f-4c02-a0de-0eeb152c9887" (UID: "fee5b066-7e4f-4c02-a0de-0eeb152c9887"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Dec 03 06:59:35 crc kubenswrapper[4475]: I1203 06:59:35.205225 4475 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-q7cqr\" (UniqueName: \"kubernetes.io/projected/fee5b066-7e4f-4c02-a0de-0eeb152c9887-kube-api-access-q7cqr\") on node \"crc\" DevicePath \"\""
Dec 03 06:59:35 crc kubenswrapper[4475]: I1203 06:59:35.205254 4475 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fee5b066-7e4f-4c02-a0de-0eeb152c9887-config\") on node \"crc\" DevicePath \"\""
Dec 03 06:59:35 crc kubenswrapper[4475]: I1203 06:59:35.205263 4475 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/fee5b066-7e4f-4c02-a0de-0eeb152c9887-ovsdbserver-nb\") on node \"crc\" DevicePath \"\""
Dec 03 06:59:35 crc kubenswrapper[4475]: I1203 06:59:35.205272 4475 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/fee5b066-7e4f-4c02-a0de-0eeb152c9887-dns-swift-storage-0\") on node \"crc\" DevicePath \"\""
Dec 03 06:59:35 crc kubenswrapper[4475]: I1203 06:59:35.205280 4475 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/fee5b066-7e4f-4c02-a0de-0eeb152c9887-ovsdbserver-sb\") on node \"crc\" DevicePath \"\""
Dec 03 06:59:35 crc kubenswrapper[4475]: I1203 06:59:35.209296 4475 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fee5b066-7e4f-4c02-a0de-0eeb152c9887-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "fee5b066-7e4f-4c02-a0de-0eeb152c9887" (UID: "fee5b066-7e4f-4c02-a0de-0eeb152c9887"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Dec 03 06:59:35 crc kubenswrapper[4475]: I1203 06:59:35.307509 4475 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/fee5b066-7e4f-4c02-a0de-0eeb152c9887-dns-svc\") on node \"crc\" DevicePath \"\""
Dec 03 06:59:35 crc kubenswrapper[4475]: I1203 06:59:35.386675 4475 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-create-qqg9j"
Dec 03 06:59:35 crc kubenswrapper[4475]: I1203 06:59:35.394217 4475 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/heat-db-create-zdkhj"
Dec 03 06:59:35 crc kubenswrapper[4475]: I1203 06:59:35.408292 4475 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/790b2af7-c661-4e6e-9579-f338835ff45a-operator-scripts\") pod \"790b2af7-c661-4e6e-9579-f338835ff45a\" (UID: \"790b2af7-c661-4e6e-9579-f338835ff45a\") "
Dec 03 06:59:35 crc kubenswrapper[4475]: I1203 06:59:35.408419 4475 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/5f4dac41-4d1c-4ad1-a5ee-802ce88143d3-operator-scripts\") pod \"5f4dac41-4d1c-4ad1-a5ee-802ce88143d3\" (UID: \"5f4dac41-4d1c-4ad1-a5ee-802ce88143d3\") "
Dec 03 06:59:35 crc kubenswrapper[4475]: I1203 06:59:35.408497 4475 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-t2sms\" (UniqueName: \"kubernetes.io/projected/790b2af7-c661-4e6e-9579-f338835ff45a-kube-api-access-t2sms\") pod \"790b2af7-c661-4e6e-9579-f338835ff45a\" (UID: \"790b2af7-c661-4e6e-9579-f338835ff45a\") "
Dec 03 06:59:35 crc kubenswrapper[4475]: I1203 06:59:35.408544 4475 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wgtwn\" (UniqueName: \"kubernetes.io/projected/5f4dac41-4d1c-4ad1-a5ee-802ce88143d3-kube-api-access-wgtwn\") pod \"5f4dac41-4d1c-4ad1-a5ee-802ce88143d3\" (UID: \"5f4dac41-4d1c-4ad1-a5ee-802ce88143d3\") "
Dec 03 06:59:35 crc kubenswrapper[4475]: I1203 06:59:35.409377 4475 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5f4dac41-4d1c-4ad1-a5ee-802ce88143d3-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "5f4dac41-4d1c-4ad1-a5ee-802ce88143d3" (UID: "5f4dac41-4d1c-4ad1-a5ee-802ce88143d3"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Dec 03 06:59:35 crc kubenswrapper[4475]: I1203 06:59:35.409662 4475 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/790b2af7-c661-4e6e-9579-f338835ff45a-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "790b2af7-c661-4e6e-9579-f338835ff45a" (UID: "790b2af7-c661-4e6e-9579-f338835ff45a"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Dec 03 06:59:35 crc kubenswrapper[4475]: I1203 06:59:35.421069 4475 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5f4dac41-4d1c-4ad1-a5ee-802ce88143d3-kube-api-access-wgtwn" (OuterVolumeSpecName: "kube-api-access-wgtwn") pod "5f4dac41-4d1c-4ad1-a5ee-802ce88143d3" (UID: "5f4dac41-4d1c-4ad1-a5ee-802ce88143d3"). InnerVolumeSpecName "kube-api-access-wgtwn". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 03 06:59:35 crc kubenswrapper[4475]: I1203 06:59:35.422824 4475 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/790b2af7-c661-4e6e-9579-f338835ff45a-kube-api-access-t2sms" (OuterVolumeSpecName: "kube-api-access-t2sms") pod "790b2af7-c661-4e6e-9579-f338835ff45a" (UID: "790b2af7-c661-4e6e-9579-f338835ff45a"). InnerVolumeSpecName "kube-api-access-t2sms". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 03 06:59:35 crc kubenswrapper[4475]: I1203 06:59:35.512222 4475 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/790b2af7-c661-4e6e-9579-f338835ff45a-operator-scripts\") on node \"crc\" DevicePath \"\""
Dec 03 06:59:35 crc kubenswrapper[4475]: I1203 06:59:35.512248 4475 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/5f4dac41-4d1c-4ad1-a5ee-802ce88143d3-operator-scripts\") on node \"crc\" DevicePath \"\""
Dec 03 06:59:35 crc kubenswrapper[4475]: I1203 06:59:35.512259 4475 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-t2sms\" (UniqueName: \"kubernetes.io/projected/790b2af7-c661-4e6e-9579-f338835ff45a-kube-api-access-t2sms\") on node \"crc\" DevicePath \"\""
Dec 03 06:59:35 crc kubenswrapper[4475]: I1203 06:59:35.512368 4475 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wgtwn\" (UniqueName: \"kubernetes.io/projected/5f4dac41-4d1c-4ad1-a5ee-802ce88143d3-kube-api-access-wgtwn\") on node \"crc\" DevicePath \"\""
Dec 03 06:59:35 crc kubenswrapper[4475]: I1203 06:59:35.812880 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-create-qqg9j" event={"ID":"790b2af7-c661-4e6e-9579-f338835ff45a","Type":"ContainerDied","Data":"aa5215452a088711ead5b20bfab48d5e3e864635f2bcb584e50859ca1223fe0a"}
Dec 03 06:59:35 crc kubenswrapper[4475]: I1203 06:59:35.813145 4475 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="aa5215452a088711ead5b20bfab48d5e3e864635f2bcb584e50859ca1223fe0a"
Dec 03 06:59:35 crc kubenswrapper[4475]: I1203 06:59:35.813092 4475 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-create-qqg9j"
Dec 03 06:59:35 crc kubenswrapper[4475]: I1203 06:59:35.814916 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-1b56-account-create-update-t9h78" event={"ID":"c92b6e31-02a9-4c39-8505-d8c3a9224862","Type":"ContainerDied","Data":"c654899e46c04c451cbd1a4f04a1d68ce1ab69ffceef0a0ed72e1fda7edf324f"}
Dec 03 06:59:35 crc kubenswrapper[4475]: I1203 06:59:35.814949 4475 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c654899e46c04c451cbd1a4f04a1d68ce1ab69ffceef0a0ed72e1fda7edf324f"
Dec 03 06:59:35 crc kubenswrapper[4475]: I1203 06:59:35.821311 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-c2f3-account-create-update-r8pw9" event={"ID":"f5915d05-922b-4df4-b6da-beadb7537e57","Type":"ContainerDied","Data":"b7ebc6c84a69e698fd7c576b5b284416e4f78a0b0ded4e69a2afc97c8eb595ec"}
Dec 03 06:59:35 crc kubenswrapper[4475]: I1203 06:59:35.821361 4475 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b7ebc6c84a69e698fd7c576b5b284416e4f78a0b0ded4e69a2afc97c8eb595ec"
Dec 03 06:59:35 crc kubenswrapper[4475]: I1203 06:59:35.828267 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-db-create-zdkhj" event={"ID":"5f4dac41-4d1c-4ad1-a5ee-802ce88143d3","Type":"ContainerDied","Data":"26305cc5ce0c845faf831eb6dc146460a9f3ccfe88e0bb143d914d905cfe1aee"}
Dec 03 06:59:35 crc kubenswrapper[4475]: I1203 06:59:35.828294 4475 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="26305cc5ce0c845faf831eb6dc146460a9f3ccfe88e0bb143d914d905cfe1aee"
Dec 03 06:59:35 crc kubenswrapper[4475]: I1203 06:59:35.828320 4475 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/heat-db-create-zdkhj"
Dec 03 06:59:35 crc kubenswrapper[4475]: I1203 06:59:35.830061 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-create-nkqhw" event={"ID":"5045b615-e537-4115-9af6-764a3969ac1b","Type":"ContainerDied","Data":"6d2e9604bcddc6de3656723b6e64afa6e26949897ad11cd31658e3c2866cb9cb"}
Dec 03 06:59:35 crc kubenswrapper[4475]: I1203 06:59:35.830084 4475 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="6d2e9604bcddc6de3656723b6e64afa6e26949897ad11cd31658e3c2866cb9cb"
Dec 03 06:59:35 crc kubenswrapper[4475]: I1203 06:59:35.831133 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-9a21-account-create-update-jxlth" event={"ID":"e4684c84-5da0-44d3-a47e-0cd3e2cba943","Type":"ContainerDied","Data":"9b159a8633cb47f2a430d5ade096b425a37efcae66b7bf0a53a13fbfaf13a2be"}
Dec 03 06:59:35 crc kubenswrapper[4475]: I1203 06:59:35.831224 4475 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="9b159a8633cb47f2a430d5ade096b425a37efcae66b7bf0a53a13fbfaf13a2be"
Dec 03 06:59:35 crc kubenswrapper[4475]: I1203 06:59:35.833207 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-df9c-account-create-update-hsg55" event={"ID":"6a4c017b-d333-4421-be59-552865c2b025","Type":"ContainerDied","Data":"d4e3da973fe8a1dec8c5ef1f596648c80aa008389ad4f9d7d92ea0a4bcec3664"}
Dec 03 06:59:35 crc kubenswrapper[4475]: I1203 06:59:35.833232 4475 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d4e3da973fe8a1dec8c5ef1f596648c80aa008389ad4f9d7d92ea0a4bcec3664"
Dec 03 06:59:35 crc kubenswrapper[4475]: I1203 06:59:35.834116 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-create-vtkm6" event={"ID":"f50cb109-7030-4a9a-9401-78f0296c1d4e","Type":"ContainerDied","Data":"352b4208da89e3b9638b0f43328bd78868bcc14e11b33b726e93c5595f0186b1"}
Dec 03 06:59:35 crc kubenswrapper[4475]: I1203 06:59:35.834139 4475 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="352b4208da89e3b9638b0f43328bd78868bcc14e11b33b726e93c5595f0186b1"
Dec 03 06:59:35 crc kubenswrapper[4475]: I1203 06:59:35.842622 4475 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-8cbcccb95-whk2c"
Dec 03 06:59:35 crc kubenswrapper[4475]: I1203 06:59:35.842684 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-8cbcccb95-whk2c" event={"ID":"fee5b066-7e4f-4c02-a0de-0eeb152c9887","Type":"ContainerDied","Data":"39ae9cd084c02b964d0bb2ce007a9c638678767b073487e9a71ae6778829665a"}
Dec 03 06:59:35 crc kubenswrapper[4475]: I1203 06:59:35.842744 4475 scope.go:117] "RemoveContainer" containerID="614b87f38bf22f7f1ad9f68624a1b7aed6d72b61ff96d4fc49ea40bf8367762b"
Dec 03 06:59:35 crc kubenswrapper[4475]: I1203 06:59:35.843288 4475 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-create-vtkm6"
Dec 03 06:59:35 crc kubenswrapper[4475]: I1203 06:59:35.852397 4475 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/heat-1b56-account-create-update-t9h78"
Dec 03 06:59:35 crc kubenswrapper[4475]: I1203 06:59:35.887029 4475 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-create-nkqhw"
Dec 03 06:59:35 crc kubenswrapper[4475]: I1203 06:59:35.891387 4475 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-df9c-account-create-update-hsg55"
Dec 03 06:59:35 crc kubenswrapper[4475]: I1203 06:59:35.902023 4475 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-9a21-account-create-update-jxlth"
Dec 03 06:59:35 crc kubenswrapper[4475]: I1203 06:59:35.905692 4475 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-c2f3-account-create-update-r8pw9"
Dec 03 06:59:35 crc kubenswrapper[4475]: I1203 06:59:35.913145 4475 scope.go:117] "RemoveContainer" containerID="5df999a1ad2c20bdaba7f94db0ce9d528245cbb73536724ec5d7ccea3dd19b0a"
Dec 03 06:59:35 crc kubenswrapper[4475]: I1203 06:59:35.928968 4475 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e4684c84-5da0-44d3-a47e-0cd3e2cba943-operator-scripts\") pod \"e4684c84-5da0-44d3-a47e-0cd3e2cba943\" (UID: \"e4684c84-5da0-44d3-a47e-0cd3e2cba943\") "
Dec 03 06:59:35 crc kubenswrapper[4475]: I1203 06:59:35.930094 4475 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e4684c84-5da0-44d3-a47e-0cd3e2cba943-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "e4684c84-5da0-44d3-a47e-0cd3e2cba943" (UID: "e4684c84-5da0-44d3-a47e-0cd3e2cba943"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Dec 03 06:59:35 crc kubenswrapper[4475]: I1203 06:59:35.932485 4475 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7h7gk\" (UniqueName: \"kubernetes.io/projected/f5915d05-922b-4df4-b6da-beadb7537e57-kube-api-access-7h7gk\") pod \"f5915d05-922b-4df4-b6da-beadb7537e57\" (UID: \"f5915d05-922b-4df4-b6da-beadb7537e57\") "
Dec 03 06:59:35 crc kubenswrapper[4475]: I1203 06:59:35.932513 4475 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-8cbcccb95-whk2c"]
Dec 03 06:59:35 crc kubenswrapper[4475]: I1203 06:59:35.933863 4475 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6a4c017b-d333-4421-be59-552865c2b025-operator-scripts\") pod \"6a4c017b-d333-4421-be59-552865c2b025\" (UID: \"6a4c017b-d333-4421-be59-552865c2b025\") "
Dec 03 06:59:35 crc kubenswrapper[4475]: I1203 06:59:35.933914 4475 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-krxsk\" (UniqueName: \"kubernetes.io/projected/f50cb109-7030-4a9a-9401-78f0296c1d4e-kube-api-access-krxsk\") pod \"f50cb109-7030-4a9a-9401-78f0296c1d4e\" (UID: \"f50cb109-7030-4a9a-9401-78f0296c1d4e\") "
Dec 03 06:59:35 crc kubenswrapper[4475]: I1203 06:59:35.933986 4475 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f5915d05-922b-4df4-b6da-beadb7537e57-operator-scripts\") pod \"f5915d05-922b-4df4-b6da-beadb7537e57\" (UID: \"f5915d05-922b-4df4-b6da-beadb7537e57\") "
Dec 03 06:59:35 crc kubenswrapper[4475]: I1203 06:59:35.934009 4475 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f50cb109-7030-4a9a-9401-78f0296c1d4e-operator-scripts\") pod \"f50cb109-7030-4a9a-9401-78f0296c1d4e\" (UID: \"f50cb109-7030-4a9a-9401-78f0296c1d4e\") "
Dec 03 06:59:35 crc kubenswrapper[4475]: I1203 06:59:35.934029 4475 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dkhb2\" (UniqueName: \"kubernetes.io/projected/c92b6e31-02a9-4c39-8505-d8c3a9224862-kube-api-access-dkhb2\") pod \"c92b6e31-02a9-4c39-8505-d8c3a9224862\" (UID: \"c92b6e31-02a9-4c39-8505-d8c3a9224862\") "
Dec 03 06:59:35 crc kubenswrapper[4475]: I1203 06:59:35.934045 4475 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vmw5m\" (UniqueName: \"kubernetes.io/projected/e4684c84-5da0-44d3-a47e-0cd3e2cba943-kube-api-access-vmw5m\") pod \"e4684c84-5da0-44d3-a47e-0cd3e2cba943\" (UID: \"e4684c84-5da0-44d3-a47e-0cd3e2cba943\") "
Dec 03 06:59:35 crc kubenswrapper[4475]: I1203 06:59:35.934120 4475 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dvz22\" (UniqueName: \"kubernetes.io/projected/6a4c017b-d333-4421-be59-552865c2b025-kube-api-access-dvz22\") pod \"6a4c017b-d333-4421-be59-552865c2b025\" (UID: \"6a4c017b-d333-4421-be59-552865c2b025\") "
Dec 03 06:59:35 crc kubenswrapper[4475]: I1203 06:59:35.934340 4475 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-s2hg6\" (UniqueName: \"kubernetes.io/projected/5045b615-e537-4115-9af6-764a3969ac1b-kube-api-access-s2hg6\") pod \"5045b615-e537-4115-9af6-764a3969ac1b\" (UID: \"5045b615-e537-4115-9af6-764a3969ac1b\") "
Dec 03 06:59:35 crc kubenswrapper[4475]: I1203 06:59:35.934353 4475 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6a4c017b-d333-4421-be59-552865c2b025-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "6a4c017b-d333-4421-be59-552865c2b025" (UID: "6a4c017b-d333-4421-be59-552865c2b025"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Dec 03 06:59:35 crc kubenswrapper[4475]: I1203 06:59:35.934385 4475 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c92b6e31-02a9-4c39-8505-d8c3a9224862-operator-scripts\") pod \"c92b6e31-02a9-4c39-8505-d8c3a9224862\" (UID: \"c92b6e31-02a9-4c39-8505-d8c3a9224862\") "
Dec 03 06:59:35 crc kubenswrapper[4475]: I1203 06:59:35.934406 4475 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/5045b615-e537-4115-9af6-764a3969ac1b-operator-scripts\") pod \"5045b615-e537-4115-9af6-764a3969ac1b\" (UID: \"5045b615-e537-4115-9af6-764a3969ac1b\") "
Dec 03 06:59:35 crc kubenswrapper[4475]: I1203 06:59:35.934750 4475 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f50cb109-7030-4a9a-9401-78f0296c1d4e-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "f50cb109-7030-4a9a-9401-78f0296c1d4e" (UID: "f50cb109-7030-4a9a-9401-78f0296c1d4e"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Dec 03 06:59:35 crc kubenswrapper[4475]: I1203 06:59:35.935159 4475 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e4684c84-5da0-44d3-a47e-0cd3e2cba943-operator-scripts\") on node \"crc\" DevicePath \"\""
Dec 03 06:59:35 crc kubenswrapper[4475]: I1203 06:59:35.935184 4475 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6a4c017b-d333-4421-be59-552865c2b025-operator-scripts\") on node \"crc\" DevicePath \"\""
Dec 03 06:59:35 crc kubenswrapper[4475]: I1203 06:59:35.935194 4475 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f50cb109-7030-4a9a-9401-78f0296c1d4e-operator-scripts\") on node \"crc\" DevicePath \"\""
Dec 03 06:59:35 crc kubenswrapper[4475]: I1203 06:59:35.937046 4475 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f5915d05-922b-4df4-b6da-beadb7537e57-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "f5915d05-922b-4df4-b6da-beadb7537e57" (UID: "f5915d05-922b-4df4-b6da-beadb7537e57"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Dec 03 06:59:35 crc kubenswrapper[4475]: I1203 06:59:35.937727 4475 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c92b6e31-02a9-4c39-8505-d8c3a9224862-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "c92b6e31-02a9-4c39-8505-d8c3a9224862" (UID: "c92b6e31-02a9-4c39-8505-d8c3a9224862"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Dec 03 06:59:35 crc kubenswrapper[4475]: I1203 06:59:35.940159 4475 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f5915d05-922b-4df4-b6da-beadb7537e57-kube-api-access-7h7gk" (OuterVolumeSpecName: "kube-api-access-7h7gk") pod "f5915d05-922b-4df4-b6da-beadb7537e57" (UID: "f5915d05-922b-4df4-b6da-beadb7537e57"). InnerVolumeSpecName "kube-api-access-7h7gk". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 03 06:59:35 crc kubenswrapper[4475]: I1203 06:59:35.942229 4475 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5045b615-e537-4115-9af6-764a3969ac1b-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "5045b615-e537-4115-9af6-764a3969ac1b" (UID: "5045b615-e537-4115-9af6-764a3969ac1b"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Dec 03 06:59:35 crc kubenswrapper[4475]: I1203 06:59:35.946801 4475 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c92b6e31-02a9-4c39-8505-d8c3a9224862-kube-api-access-dkhb2" (OuterVolumeSpecName: "kube-api-access-dkhb2") pod "c92b6e31-02a9-4c39-8505-d8c3a9224862" (UID: "c92b6e31-02a9-4c39-8505-d8c3a9224862"). InnerVolumeSpecName "kube-api-access-dkhb2". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 03 06:59:35 crc kubenswrapper[4475]: I1203 06:59:35.947657 4475 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6a4c017b-d333-4421-be59-552865c2b025-kube-api-access-dvz22" (OuterVolumeSpecName: "kube-api-access-dvz22") pod "6a4c017b-d333-4421-be59-552865c2b025" (UID: "6a4c017b-d333-4421-be59-552865c2b025"). InnerVolumeSpecName "kube-api-access-dvz22". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 03 06:59:35 crc kubenswrapper[4475]: I1203 06:59:35.948250 4475 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f50cb109-7030-4a9a-9401-78f0296c1d4e-kube-api-access-krxsk" (OuterVolumeSpecName: "kube-api-access-krxsk") pod "f50cb109-7030-4a9a-9401-78f0296c1d4e" (UID: "f50cb109-7030-4a9a-9401-78f0296c1d4e"). InnerVolumeSpecName "kube-api-access-krxsk". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 03 06:59:35 crc kubenswrapper[4475]: I1203 06:59:35.949026 4475 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5045b615-e537-4115-9af6-764a3969ac1b-kube-api-access-s2hg6" (OuterVolumeSpecName: "kube-api-access-s2hg6") pod "5045b615-e537-4115-9af6-764a3969ac1b" (UID: "5045b615-e537-4115-9af6-764a3969ac1b"). InnerVolumeSpecName "kube-api-access-s2hg6". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 03 06:59:35 crc kubenswrapper[4475]: I1203 06:59:35.958043 4475 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-8cbcccb95-whk2c"]
Dec 03 06:59:35 crc kubenswrapper[4475]: I1203 06:59:35.967696 4475 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e4684c84-5da0-44d3-a47e-0cd3e2cba943-kube-api-access-vmw5m" (OuterVolumeSpecName: "kube-api-access-vmw5m") pod "e4684c84-5da0-44d3-a47e-0cd3e2cba943" (UID: "e4684c84-5da0-44d3-a47e-0cd3e2cba943"). InnerVolumeSpecName "kube-api-access-vmw5m". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 03 06:59:36 crc kubenswrapper[4475]: I1203 06:59:36.028544 4475 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5f67cb5455-r956l"]
Dec 03 06:59:36 crc kubenswrapper[4475]: I1203 06:59:36.038919 4475 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-s2hg6\" (UniqueName: \"kubernetes.io/projected/5045b615-e537-4115-9af6-764a3969ac1b-kube-api-access-s2hg6\") on node \"crc\" DevicePath \"\""
Dec 03 06:59:36 crc kubenswrapper[4475]: I1203 06:59:36.039083 4475 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c92b6e31-02a9-4c39-8505-d8c3a9224862-operator-scripts\") on node \"crc\" DevicePath \"\""
Dec 03 06:59:36 crc kubenswrapper[4475]: I1203 06:59:36.039164 4475 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/5045b615-e537-4115-9af6-764a3969ac1b-operator-scripts\") on node \"crc\" DevicePath \"\""
Dec 03 06:59:36 crc kubenswrapper[4475]: I1203 06:59:36.039239 4475 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7h7gk\" (UniqueName: \"kubernetes.io/projected/f5915d05-922b-4df4-b6da-beadb7537e57-kube-api-access-7h7gk\") on node \"crc\" DevicePath \"\""
Dec 03 06:59:36 crc kubenswrapper[4475]: I1203 06:59:36.039356 4475 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-krxsk\" (UniqueName: \"kubernetes.io/projected/f50cb109-7030-4a9a-9401-78f0296c1d4e-kube-api-access-krxsk\") on node \"crc\" DevicePath \"\""
Dec 03 06:59:36 crc kubenswrapper[4475]: I1203 06:59:36.039418 4475 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f5915d05-922b-4df4-b6da-beadb7537e57-operator-scripts\") on node \"crc\" DevicePath \"\""
Dec 03 06:59:36 crc kubenswrapper[4475]: I1203 06:59:36.039515 4475 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dkhb2\" (UniqueName: \"kubernetes.io/projected/c92b6e31-02a9-4c39-8505-d8c3a9224862-kube-api-access-dkhb2\") on node \"crc\" DevicePath \"\""
Dec 03 06:59:36 crc kubenswrapper[4475]: I1203 06:59:36.039572 4475 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vmw5m\" (UniqueName: \"kubernetes.io/projected/e4684c84-5da0-44d3-a47e-0cd3e2cba943-kube-api-access-vmw5m\") on node \"crc\" DevicePath \"\""
Dec 03 06:59:36 crc kubenswrapper[4475]: I1203 06:59:36.039625 4475 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dvz22\" (UniqueName: \"kubernetes.io/projected/6a4c017b-d333-4421-be59-552865c2b025-kube-api-access-dvz22\") on node \"crc\" DevicePath \"\""
Dec 03 06:59:36 crc kubenswrapper[4475]: I1203 06:59:36.851990 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-lszxw" event={"ID":"d3affe8e-b674-46c4-bf4b-b6fc0d092df7","Type":"ContainerStarted","Data":"4affa86553330d26a24207ded8fde03413664328301fb8a08827e1b303791ada"}
Dec 03 06:59:36 crc kubenswrapper[4475]: I1203 06:59:36.859415 4475 generic.go:334] "Generic (PLEG): container finished" podID="20f60917-1d39-46b7-a37a-d33efdb6de0f" containerID="b6e37450f59ce53b45e922826254dd6bc43013922a71ebd01b069cb90c22ed28" exitCode=0
Dec 03 06:59:36 crc kubenswrapper[4475]: I1203 06:59:36.859507 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5f67cb5455-r956l" event={"ID":"20f60917-1d39-46b7-a37a-d33efdb6de0f","Type":"ContainerDied","Data":"b6e37450f59ce53b45e922826254dd6bc43013922a71ebd01b069cb90c22ed28"}
Dec 03 06:59:36 crc kubenswrapper[4475]: I1203 06:59:36.859534 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5f67cb5455-r956l" event={"ID":"20f60917-1d39-46b7-a37a-d33efdb6de0f","Type":"ContainerStarted","Data":"b4bf54c0262e46240a1ab05a5b4efa5bbe6700791ba2b88e04c08fe099aa4fad"}
Dec 03 06:59:36 crc kubenswrapper[4475]: I1203 06:59:36.863363 4475 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-c2f3-account-create-update-r8pw9"
Dec 03 06:59:36 crc kubenswrapper[4475]: I1203 06:59:36.863407 4475 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-9a21-account-create-update-jxlth"
Dec 03 06:59:36 crc kubenswrapper[4475]: I1203 06:59:36.863486 4475 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-create-vtkm6"
Dec 03 06:59:36 crc kubenswrapper[4475]: I1203 06:59:36.863374 4475 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-df9c-account-create-update-hsg55"
Dec 03 06:59:36 crc kubenswrapper[4475]: I1203 06:59:36.863539 4475 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-create-nkqhw"
Dec 03 06:59:36 crc kubenswrapper[4475]: I1203 06:59:36.863570 4475 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/heat-1b56-account-create-update-t9h78"
Dec 03 06:59:36 crc kubenswrapper[4475]: I1203 06:59:36.895151 4475 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-lszxw" podStartSLOduration=4.091001609 podStartE2EDuration="6.895114908s" podCreationTimestamp="2025-12-03 06:59:30 +0000 UTC" firstStartedPulling="2025-12-03 06:59:32.669676927 +0000 UTC m=+857.474575261" lastFinishedPulling="2025-12-03 06:59:35.473790226 +0000 UTC m=+860.278688560" observedRunningTime="2025-12-03 06:59:36.876846184 +0000 UTC m=+861.681744538" watchObservedRunningTime="2025-12-03 06:59:36.895114908 +0000 UTC m=+861.700013232"
Dec 03 06:59:37 crc kubenswrapper[4475]: I1203 06:59:37.391287 4475 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-rn44z"]
Dec 03 06:59:37 crc kubenswrapper[4475]: E1203 06:59:37.391803 4475 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fee5b066-7e4f-4c02-a0de-0eeb152c9887" containerName="init"
Dec 03 06:59:37 crc kubenswrapper[4475]: I1203 06:59:37.391824 4475 state_mem.go:107] "Deleted CPUSet assignment" podUID="fee5b066-7e4f-4c02-a0de-0eeb152c9887" containerName="init"
Dec 03 06:59:37 crc kubenswrapper[4475]: E1203 06:59:37.391836 4475 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fee5b066-7e4f-4c02-a0de-0eeb152c9887" containerName="dnsmasq-dns"
Dec 03 06:59:37 crc kubenswrapper[4475]: I1203 06:59:37.391843 4475 state_mem.go:107] "Deleted CPUSet assignment" podUID="fee5b066-7e4f-4c02-a0de-0eeb152c9887" containerName="dnsmasq-dns"
Dec 03 06:59:37 crc kubenswrapper[4475]: E1203 06:59:37.391852 4475 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f50cb109-7030-4a9a-9401-78f0296c1d4e" containerName="mariadb-database-create"
Dec 03 06:59:37 crc kubenswrapper[4475]: I1203 06:59:37.391858 4475 state_mem.go:107] "Deleted CPUSet assignment" podUID="f50cb109-7030-4a9a-9401-78f0296c1d4e" containerName="mariadb-database-create"
Dec 03 06:59:37 crc kubenswrapper[4475]: E1203 06:59:37.391883 4475 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="790b2af7-c661-4e6e-9579-f338835ff45a" containerName="mariadb-database-create"
Dec 03 06:59:37 crc kubenswrapper[4475]: I1203 06:59:37.391889 4475 state_mem.go:107] "Deleted CPUSet assignment" podUID="790b2af7-c661-4e6e-9579-f338835ff45a" containerName="mariadb-database-create"
Dec 03 06:59:37 crc kubenswrapper[4475]: E1203 06:59:37.391895 4475 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c92b6e31-02a9-4c39-8505-d8c3a9224862" containerName="mariadb-account-create-update"
Dec 03 06:59:37 crc kubenswrapper[4475]: I1203 06:59:37.391901 4475 state_mem.go:107] "Deleted CPUSet assignment" podUID="c92b6e31-02a9-4c39-8505-d8c3a9224862" containerName="mariadb-account-create-update"
Dec 03 06:59:37 crc kubenswrapper[4475]: E1203 06:59:37.391914 4475 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5045b615-e537-4115-9af6-764a3969ac1b" containerName="mariadb-database-create"
Dec 03 06:59:37 crc kubenswrapper[4475]: I1203 06:59:37.391922 4475 state_mem.go:107] "Deleted CPUSet assignment" podUID="5045b615-e537-4115-9af6-764a3969ac1b" containerName="mariadb-database-create"
Dec 03 06:59:37 crc kubenswrapper[4475]: E1203 06:59:37.391940 4475 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6a4c017b-d333-4421-be59-552865c2b025" containerName="mariadb-account-create-update"
Dec 03 06:59:37 crc kubenswrapper[4475]: I1203 06:59:37.391945 4475 state_mem.go:107] "Deleted CPUSet assignment" podUID="6a4c017b-d333-4421-be59-552865c2b025" containerName="mariadb-account-create-update"
Dec 03 06:59:37 crc kubenswrapper[4475]: E1203 06:59:37.391960 4475 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f5915d05-922b-4df4-b6da-beadb7537e57" containerName="mariadb-account-create-update"
Dec 03 06:59:37 crc kubenswrapper[4475]: I1203 06:59:37.391965 4475 state_mem.go:107] "Deleted CPUSet assignment" podUID="f5915d05-922b-4df4-b6da-beadb7537e57" containerName="mariadb-account-create-update"
Dec 03 06:59:37 crc kubenswrapper[4475]: E1203 06:59:37.391973 4475 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e4684c84-5da0-44d3-a47e-0cd3e2cba943" containerName="mariadb-account-create-update"
Dec 03 06:59:37 crc kubenswrapper[4475]: I1203 06:59:37.391978 4475 state_mem.go:107] "Deleted CPUSet assignment" podUID="e4684c84-5da0-44d3-a47e-0cd3e2cba943" containerName="mariadb-account-create-update"
Dec 03 06:59:37 crc kubenswrapper[4475]: E1203 06:59:37.391999 4475 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5f4dac41-4d1c-4ad1-a5ee-802ce88143d3" containerName="mariadb-database-create"
Dec 03 06:59:37 crc kubenswrapper[4475]: I1203 06:59:37.392004 4475 state_mem.go:107] "Deleted CPUSet assignment" podUID="5f4dac41-4d1c-4ad1-a5ee-802ce88143d3" containerName="mariadb-database-create"
Dec 03 06:59:37 crc kubenswrapper[4475]: I1203 06:59:37.392173 4475 memory_manager.go:354] "RemoveStaleState removing state" podUID="790b2af7-c661-4e6e-9579-f338835ff45a" containerName="mariadb-database-create"
Dec 03 06:59:37 crc kubenswrapper[4475]: I1203 06:59:37.392188 4475 memory_manager.go:354] "RemoveStaleState removing state" podUID="f50cb109-7030-4a9a-9401-78f0296c1d4e" containerName="mariadb-database-create"
Dec 03 06:59:37 crc kubenswrapper[4475]: I1203 06:59:37.392197 4475 memory_manager.go:354] "RemoveStaleState removing state" podUID="5045b615-e537-4115-9af6-764a3969ac1b" containerName="mariadb-database-create"
Dec 03 06:59:37 crc kubenswrapper[4475]: I1203 06:59:37.392206 4475 memory_manager.go:354] "RemoveStaleState removing state" podUID="f5915d05-922b-4df4-b6da-beadb7537e57" containerName="mariadb-account-create-update"
Dec 03 06:59:37 crc kubenswrapper[4475]: I1203 06:59:37.392218 4475 memory_manager.go:354] "RemoveStaleState removing state" podUID="6a4c017b-d333-4421-be59-552865c2b025" containerName="mariadb-account-create-update"
Dec 03 06:59:37 crc kubenswrapper[4475]: I1203 06:59:37.392230 4475 memory_manager.go:354] "RemoveStaleState removing state" podUID="fee5b066-7e4f-4c02-a0de-0eeb152c9887" containerName="dnsmasq-dns"
Dec 03 06:59:37 crc kubenswrapper[4475]: I1203 06:59:37.392236 4475 memory_manager.go:354] "RemoveStaleState removing state" podUID="c92b6e31-02a9-4c39-8505-d8c3a9224862" containerName="mariadb-account-create-update"
Dec 03 06:59:37 crc kubenswrapper[4475]: I1203 06:59:37.392247 4475 memory_manager.go:354] "RemoveStaleState removing state" podUID="5f4dac41-4d1c-4ad1-a5ee-802ce88143d3" containerName="mariadb-database-create"
Dec 03 06:59:37 crc kubenswrapper[4475]: I1203 06:59:37.392256 4475 memory_manager.go:354] "RemoveStaleState removing state" podUID="e4684c84-5da0-44d3-a47e-0cd3e2cba943" containerName="mariadb-account-create-update"
Dec 03 06:59:37 crc kubenswrapper[4475]: I1203 06:59:37.393595 4475 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-rn44z"
Dec 03 06:59:37 crc kubenswrapper[4475]: I1203 06:59:37.400362 4475 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-rn44z"]
Dec 03 06:59:37 crc kubenswrapper[4475]: I1203 06:59:37.476021 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-msmhp\" (UniqueName: \"kubernetes.io/projected/7a25a115-16d3-44f7-82b9-5c47efbd3e47-kube-api-access-msmhp\") pod \"redhat-marketplace-rn44z\" (UID: \"7a25a115-16d3-44f7-82b9-5c47efbd3e47\") " pod="openshift-marketplace/redhat-marketplace-rn44z"
Dec 03 06:59:37 crc kubenswrapper[4475]: I1203 06:59:37.476237 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7a25a115-16d3-44f7-82b9-5c47efbd3e47-utilities\") pod \"redhat-marketplace-rn44z\" (UID: \"7a25a115-16d3-44f7-82b9-5c47efbd3e47\") " pod="openshift-marketplace/redhat-marketplace-rn44z"
Dec 03 06:59:37 crc kubenswrapper[4475]: I1203 06:59:37.476331 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7a25a115-16d3-44f7-82b9-5c47efbd3e47-catalog-content\") pod \"redhat-marketplace-rn44z\" (UID: \"7a25a115-16d3-44f7-82b9-5c47efbd3e47\") " pod="openshift-marketplace/redhat-marketplace-rn44z"
Dec 03 06:59:37 crc kubenswrapper[4475]: I1203 06:59:37.500921 4475 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fee5b066-7e4f-4c02-a0de-0eeb152c9887" path="/var/lib/kubelet/pods/fee5b066-7e4f-4c02-a0de-0eeb152c9887/volumes"
Dec 03 06:59:37 crc kubenswrapper[4475]: I1203 06:59:37.579306 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName:
\"kubernetes.io/empty-dir/7a25a115-16d3-44f7-82b9-5c47efbd3e47-catalog-content\") pod \"redhat-marketplace-rn44z\" (UID: \"7a25a115-16d3-44f7-82b9-5c47efbd3e47\") " pod="openshift-marketplace/redhat-marketplace-rn44z" Dec 03 06:59:37 crc kubenswrapper[4475]: I1203 06:59:37.579446 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-msmhp\" (UniqueName: \"kubernetes.io/projected/7a25a115-16d3-44f7-82b9-5c47efbd3e47-kube-api-access-msmhp\") pod \"redhat-marketplace-rn44z\" (UID: \"7a25a115-16d3-44f7-82b9-5c47efbd3e47\") " pod="openshift-marketplace/redhat-marketplace-rn44z" Dec 03 06:59:37 crc kubenswrapper[4475]: I1203 06:59:37.579575 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7a25a115-16d3-44f7-82b9-5c47efbd3e47-utilities\") pod \"redhat-marketplace-rn44z\" (UID: \"7a25a115-16d3-44f7-82b9-5c47efbd3e47\") " pod="openshift-marketplace/redhat-marketplace-rn44z" Dec 03 06:59:37 crc kubenswrapper[4475]: I1203 06:59:37.579815 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7a25a115-16d3-44f7-82b9-5c47efbd3e47-catalog-content\") pod \"redhat-marketplace-rn44z\" (UID: \"7a25a115-16d3-44f7-82b9-5c47efbd3e47\") " pod="openshift-marketplace/redhat-marketplace-rn44z" Dec 03 06:59:37 crc kubenswrapper[4475]: I1203 06:59:37.580012 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7a25a115-16d3-44f7-82b9-5c47efbd3e47-utilities\") pod \"redhat-marketplace-rn44z\" (UID: \"7a25a115-16d3-44f7-82b9-5c47efbd3e47\") " pod="openshift-marketplace/redhat-marketplace-rn44z" Dec 03 06:59:37 crc kubenswrapper[4475]: I1203 06:59:37.600174 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-msmhp\" (UniqueName: 
\"kubernetes.io/projected/7a25a115-16d3-44f7-82b9-5c47efbd3e47-kube-api-access-msmhp\") pod \"redhat-marketplace-rn44z\" (UID: \"7a25a115-16d3-44f7-82b9-5c47efbd3e47\") " pod="openshift-marketplace/redhat-marketplace-rn44z" Dec 03 06:59:37 crc kubenswrapper[4475]: I1203 06:59:37.706966 4475 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-rn44z" Dec 03 06:59:40 crc kubenswrapper[4475]: I1203 06:59:40.259375 4475 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-rn44z"] Dec 03 06:59:40 crc kubenswrapper[4475]: W1203 06:59:40.265855 4475 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7a25a115_16d3_44f7_82b9_5c47efbd3e47.slice/crio-d974d3cf1659358e6b700fc82454e77f35fa678cc2917f5c45dd601060838758 WatchSource:0}: Error finding container d974d3cf1659358e6b700fc82454e77f35fa678cc2917f5c45dd601060838758: Status 404 returned error can't find the container with id d974d3cf1659358e6b700fc82454e77f35fa678cc2917f5c45dd601060838758 Dec 03 06:59:40 crc kubenswrapper[4475]: I1203 06:59:40.899395 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5f67cb5455-r956l" event={"ID":"20f60917-1d39-46b7-a37a-d33efdb6de0f","Type":"ContainerStarted","Data":"74bc9df23fc5e94aa8f857357de9f3077a787e0df089f57f48d7d360e8b4f910"} Dec 03 06:59:40 crc kubenswrapper[4475]: I1203 06:59:40.899712 4475 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-5f67cb5455-r956l" Dec 03 06:59:40 crc kubenswrapper[4475]: I1203 06:59:40.902238 4475 generic.go:334] "Generic (PLEG): container finished" podID="7a25a115-16d3-44f7-82b9-5c47efbd3e47" containerID="137279b8a133d080a65a22bbad9e4876361da777397b27fedec669e879cfff9c" exitCode=0 Dec 03 06:59:40 crc kubenswrapper[4475]: I1203 06:59:40.902307 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/redhat-marketplace-rn44z" event={"ID":"7a25a115-16d3-44f7-82b9-5c47efbd3e47","Type":"ContainerDied","Data":"137279b8a133d080a65a22bbad9e4876361da777397b27fedec669e879cfff9c"} Dec 03 06:59:40 crc kubenswrapper[4475]: I1203 06:59:40.902331 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-rn44z" event={"ID":"7a25a115-16d3-44f7-82b9-5c47efbd3e47","Type":"ContainerStarted","Data":"d974d3cf1659358e6b700fc82454e77f35fa678cc2917f5c45dd601060838758"} Dec 03 06:59:40 crc kubenswrapper[4475]: I1203 06:59:40.904113 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-qvcvf" event={"ID":"6ab43dfd-1c80-4922-aab7-93dc3d3b7d27","Type":"ContainerStarted","Data":"9a183ceea336a168b469e2f45b5e8914344ea1fd13ea594f0f0b576d57f8457c"} Dec 03 06:59:40 crc kubenswrapper[4475]: I1203 06:59:40.919604 4475 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-5f67cb5455-r956l" podStartSLOduration=6.919591201 podStartE2EDuration="6.919591201s" podCreationTimestamp="2025-12-03 06:59:34 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 06:59:40.914106103 +0000 UTC m=+865.719004437" watchObservedRunningTime="2025-12-03 06:59:40.919591201 +0000 UTC m=+865.724489536" Dec 03 06:59:40 crc kubenswrapper[4475]: I1203 06:59:40.947982 4475 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-db-sync-qvcvf" podStartSLOduration=2.509161895 podStartE2EDuration="9.947967578s" podCreationTimestamp="2025-12-03 06:59:31 +0000 UTC" firstStartedPulling="2025-12-03 06:59:32.473084109 +0000 UTC m=+857.277982432" lastFinishedPulling="2025-12-03 06:59:39.911889782 +0000 UTC m=+864.716788115" observedRunningTime="2025-12-03 06:59:40.943692235 +0000 UTC m=+865.748590570" watchObservedRunningTime="2025-12-03 06:59:40.947967578 +0000 UTC 
m=+865.752865902" Dec 03 06:59:41 crc kubenswrapper[4475]: I1203 06:59:41.132097 4475 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-lszxw" Dec 03 06:59:41 crc kubenswrapper[4475]: I1203 06:59:41.132432 4475 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-lszxw" Dec 03 06:59:41 crc kubenswrapper[4475]: I1203 06:59:41.170163 4475 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-lszxw" Dec 03 06:59:41 crc kubenswrapper[4475]: I1203 06:59:41.911356 4475 generic.go:334] "Generic (PLEG): container finished" podID="7a25a115-16d3-44f7-82b9-5c47efbd3e47" containerID="dd6c849ce555df78d9aa7a776b4cc4c95185b9649b2b470d06b9c416d0fbf1c0" exitCode=0 Dec 03 06:59:41 crc kubenswrapper[4475]: I1203 06:59:41.911550 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-rn44z" event={"ID":"7a25a115-16d3-44f7-82b9-5c47efbd3e47","Type":"ContainerDied","Data":"dd6c849ce555df78d9aa7a776b4cc4c95185b9649b2b470d06b9c416d0fbf1c0"} Dec 03 06:59:41 crc kubenswrapper[4475]: I1203 06:59:41.953944 4475 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-lszxw" Dec 03 06:59:42 crc kubenswrapper[4475]: I1203 06:59:42.922415 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-rn44z" event={"ID":"7a25a115-16d3-44f7-82b9-5c47efbd3e47","Type":"ContainerStarted","Data":"a236f7232ea9142d50103d9a683f5568c1dae4b36298c67ca32bd40db43df93e"} Dec 03 06:59:42 crc kubenswrapper[4475]: I1203 06:59:42.927056 4475 generic.go:334] "Generic (PLEG): container finished" podID="6ab43dfd-1c80-4922-aab7-93dc3d3b7d27" containerID="9a183ceea336a168b469e2f45b5e8914344ea1fd13ea594f0f0b576d57f8457c" exitCode=0 Dec 03 06:59:42 crc kubenswrapper[4475]: I1203 06:59:42.927661 
4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-qvcvf" event={"ID":"6ab43dfd-1c80-4922-aab7-93dc3d3b7d27","Type":"ContainerDied","Data":"9a183ceea336a168b469e2f45b5e8914344ea1fd13ea594f0f0b576d57f8457c"} Dec 03 06:59:42 crc kubenswrapper[4475]: I1203 06:59:42.943192 4475 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-rn44z" podStartSLOduration=4.463758634 podStartE2EDuration="5.943177794s" podCreationTimestamp="2025-12-03 06:59:37 +0000 UTC" firstStartedPulling="2025-12-03 06:59:40.905218445 +0000 UTC m=+865.710116779" lastFinishedPulling="2025-12-03 06:59:42.384637606 +0000 UTC m=+867.189535939" observedRunningTime="2025-12-03 06:59:42.940409867 +0000 UTC m=+867.745308201" watchObservedRunningTime="2025-12-03 06:59:42.943177794 +0000 UTC m=+867.748076128" Dec 03 06:59:43 crc kubenswrapper[4475]: I1203 06:59:43.569142 4475 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-lszxw"] Dec 03 06:59:44 crc kubenswrapper[4475]: I1203 06:59:44.208067 4475 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-db-sync-qvcvf" Dec 03 06:59:44 crc kubenswrapper[4475]: I1203 06:59:44.292834 4475 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bdcn7\" (UniqueName: \"kubernetes.io/projected/6ab43dfd-1c80-4922-aab7-93dc3d3b7d27-kube-api-access-bdcn7\") pod \"6ab43dfd-1c80-4922-aab7-93dc3d3b7d27\" (UID: \"6ab43dfd-1c80-4922-aab7-93dc3d3b7d27\") " Dec 03 06:59:44 crc kubenswrapper[4475]: I1203 06:59:44.293003 4475 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6ab43dfd-1c80-4922-aab7-93dc3d3b7d27-config-data\") pod \"6ab43dfd-1c80-4922-aab7-93dc3d3b7d27\" (UID: \"6ab43dfd-1c80-4922-aab7-93dc3d3b7d27\") " Dec 03 06:59:44 crc kubenswrapper[4475]: I1203 06:59:44.293177 4475 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6ab43dfd-1c80-4922-aab7-93dc3d3b7d27-combined-ca-bundle\") pod \"6ab43dfd-1c80-4922-aab7-93dc3d3b7d27\" (UID: \"6ab43dfd-1c80-4922-aab7-93dc3d3b7d27\") " Dec 03 06:59:44 crc kubenswrapper[4475]: I1203 06:59:44.312530 4475 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6ab43dfd-1c80-4922-aab7-93dc3d3b7d27-kube-api-access-bdcn7" (OuterVolumeSpecName: "kube-api-access-bdcn7") pod "6ab43dfd-1c80-4922-aab7-93dc3d3b7d27" (UID: "6ab43dfd-1c80-4922-aab7-93dc3d3b7d27"). InnerVolumeSpecName "kube-api-access-bdcn7". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 06:59:44 crc kubenswrapper[4475]: I1203 06:59:44.316383 4475 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6ab43dfd-1c80-4922-aab7-93dc3d3b7d27-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "6ab43dfd-1c80-4922-aab7-93dc3d3b7d27" (UID: "6ab43dfd-1c80-4922-aab7-93dc3d3b7d27"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 06:59:44 crc kubenswrapper[4475]: I1203 06:59:44.334655 4475 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6ab43dfd-1c80-4922-aab7-93dc3d3b7d27-config-data" (OuterVolumeSpecName: "config-data") pod "6ab43dfd-1c80-4922-aab7-93dc3d3b7d27" (UID: "6ab43dfd-1c80-4922-aab7-93dc3d3b7d27"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 06:59:44 crc kubenswrapper[4475]: I1203 06:59:44.394941 4475 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6ab43dfd-1c80-4922-aab7-93dc3d3b7d27-config-data\") on node \"crc\" DevicePath \"\"" Dec 03 06:59:44 crc kubenswrapper[4475]: I1203 06:59:44.394965 4475 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6ab43dfd-1c80-4922-aab7-93dc3d3b7d27-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 03 06:59:44 crc kubenswrapper[4475]: I1203 06:59:44.394977 4475 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bdcn7\" (UniqueName: \"kubernetes.io/projected/6ab43dfd-1c80-4922-aab7-93dc3d3b7d27-kube-api-access-bdcn7\") on node \"crc\" DevicePath \"\"" Dec 03 06:59:44 crc kubenswrapper[4475]: I1203 06:59:44.949613 4475 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-db-sync-qvcvf" Dec 03 06:59:44 crc kubenswrapper[4475]: I1203 06:59:44.949589 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-qvcvf" event={"ID":"6ab43dfd-1c80-4922-aab7-93dc3d3b7d27","Type":"ContainerDied","Data":"9a29f6deda89c9ec1e835a0fc75287e37c91ad6e3ad2b8a0607d20c9f594efa8"} Dec 03 06:59:44 crc kubenswrapper[4475]: I1203 06:59:44.950564 4475 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="9a29f6deda89c9ec1e835a0fc75287e37c91ad6e3ad2b8a0607d20c9f594efa8" Dec 03 06:59:44 crc kubenswrapper[4475]: I1203 06:59:44.949757 4475 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-lszxw" podUID="d3affe8e-b674-46c4-bf4b-b6fc0d092df7" containerName="registry-server" containerID="cri-o://4affa86553330d26a24207ded8fde03413664328301fb8a08827e1b303791ada" gracePeriod=2 Dec 03 06:59:45 crc kubenswrapper[4475]: I1203 06:59:45.222203 4475 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5f67cb5455-r956l"] Dec 03 06:59:45 crc kubenswrapper[4475]: I1203 06:59:45.222763 4475 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-5f67cb5455-r956l" podUID="20f60917-1d39-46b7-a37a-d33efdb6de0f" containerName="dnsmasq-dns" containerID="cri-o://74bc9df23fc5e94aa8f857357de9f3077a787e0df089f57f48d7d360e8b4f910" gracePeriod=10 Dec 03 06:59:45 crc kubenswrapper[4475]: I1203 06:59:45.224323 4475 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-5f67cb5455-r956l" Dec 03 06:59:45 crc kubenswrapper[4475]: I1203 06:59:45.252781 4475 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-bc7f9874c-qp4nh"] Dec 03 06:59:45 crc kubenswrapper[4475]: E1203 06:59:45.253202 4475 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6ab43dfd-1c80-4922-aab7-93dc3d3b7d27" 
containerName="keystone-db-sync" Dec 03 06:59:45 crc kubenswrapper[4475]: I1203 06:59:45.253221 4475 state_mem.go:107] "Deleted CPUSet assignment" podUID="6ab43dfd-1c80-4922-aab7-93dc3d3b7d27" containerName="keystone-db-sync" Dec 03 06:59:45 crc kubenswrapper[4475]: I1203 06:59:45.253371 4475 memory_manager.go:354] "RemoveStaleState removing state" podUID="6ab43dfd-1c80-4922-aab7-93dc3d3b7d27" containerName="keystone-db-sync" Dec 03 06:59:45 crc kubenswrapper[4475]: I1203 06:59:45.254149 4475 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-bc7f9874c-qp4nh" Dec 03 06:59:45 crc kubenswrapper[4475]: I1203 06:59:45.294953 4475 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-bc7f9874c-qp4nh"] Dec 03 06:59:45 crc kubenswrapper[4475]: I1203 06:59:45.310849 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/45a837ca-f13a-4df1-bcad-62d1ce76225d-dns-swift-storage-0\") pod \"dnsmasq-dns-bc7f9874c-qp4nh\" (UID: \"45a837ca-f13a-4df1-bcad-62d1ce76225d\") " pod="openstack/dnsmasq-dns-bc7f9874c-qp4nh" Dec 03 06:59:45 crc kubenswrapper[4475]: I1203 06:59:45.310896 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/45a837ca-f13a-4df1-bcad-62d1ce76225d-config\") pod \"dnsmasq-dns-bc7f9874c-qp4nh\" (UID: \"45a837ca-f13a-4df1-bcad-62d1ce76225d\") " pod="openstack/dnsmasq-dns-bc7f9874c-qp4nh" Dec 03 06:59:45 crc kubenswrapper[4475]: I1203 06:59:45.310940 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/45a837ca-f13a-4df1-bcad-62d1ce76225d-dns-svc\") pod \"dnsmasq-dns-bc7f9874c-qp4nh\" (UID: \"45a837ca-f13a-4df1-bcad-62d1ce76225d\") " pod="openstack/dnsmasq-dns-bc7f9874c-qp4nh" Dec 03 06:59:45 crc 
kubenswrapper[4475]: I1203 06:59:45.310965 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/45a837ca-f13a-4df1-bcad-62d1ce76225d-ovsdbserver-sb\") pod \"dnsmasq-dns-bc7f9874c-qp4nh\" (UID: \"45a837ca-f13a-4df1-bcad-62d1ce76225d\") " pod="openstack/dnsmasq-dns-bc7f9874c-qp4nh" Dec 03 06:59:45 crc kubenswrapper[4475]: I1203 06:59:45.311002 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/45a837ca-f13a-4df1-bcad-62d1ce76225d-ovsdbserver-nb\") pod \"dnsmasq-dns-bc7f9874c-qp4nh\" (UID: \"45a837ca-f13a-4df1-bcad-62d1ce76225d\") " pod="openstack/dnsmasq-dns-bc7f9874c-qp4nh" Dec 03 06:59:45 crc kubenswrapper[4475]: I1203 06:59:45.311040 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t9nxg\" (UniqueName: \"kubernetes.io/projected/45a837ca-f13a-4df1-bcad-62d1ce76225d-kube-api-access-t9nxg\") pod \"dnsmasq-dns-bc7f9874c-qp4nh\" (UID: \"45a837ca-f13a-4df1-bcad-62d1ce76225d\") " pod="openstack/dnsmasq-dns-bc7f9874c-qp4nh" Dec 03 06:59:45 crc kubenswrapper[4475]: I1203 06:59:45.313412 4475 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-bootstrap-qrzmb"] Dec 03 06:59:45 crc kubenswrapper[4475]: I1203 06:59:45.314398 4475 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-bootstrap-qrzmb" Dec 03 06:59:45 crc kubenswrapper[4475]: I1203 06:59:45.316834 4475 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts" Dec 03 06:59:45 crc kubenswrapper[4475]: I1203 06:59:45.316988 4475 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone" Dec 03 06:59:45 crc kubenswrapper[4475]: I1203 06:59:45.321199 4475 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-ccmjt" Dec 03 06:59:45 crc kubenswrapper[4475]: I1203 06:59:45.321898 4475 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"osp-secret" Dec 03 06:59:45 crc kubenswrapper[4475]: I1203 06:59:45.322512 4475 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data" Dec 03 06:59:45 crc kubenswrapper[4475]: I1203 06:59:45.352484 4475 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-qrzmb"] Dec 03 06:59:45 crc kubenswrapper[4475]: I1203 06:59:45.412154 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/875f3e6c-4ccd-4b70-8ea6-c5d76eedc268-credential-keys\") pod \"keystone-bootstrap-qrzmb\" (UID: \"875f3e6c-4ccd-4b70-8ea6-c5d76eedc268\") " pod="openstack/keystone-bootstrap-qrzmb" Dec 03 06:59:45 crc kubenswrapper[4475]: I1203 06:59:45.412199 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/45a837ca-f13a-4df1-bcad-62d1ce76225d-dns-svc\") pod \"dnsmasq-dns-bc7f9874c-qp4nh\" (UID: \"45a837ca-f13a-4df1-bcad-62d1ce76225d\") " pod="openstack/dnsmasq-dns-bc7f9874c-qp4nh" Dec 03 06:59:45 crc kubenswrapper[4475]: I1203 06:59:45.412229 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: 
\"kubernetes.io/configmap/45a837ca-f13a-4df1-bcad-62d1ce76225d-ovsdbserver-sb\") pod \"dnsmasq-dns-bc7f9874c-qp4nh\" (UID: \"45a837ca-f13a-4df1-bcad-62d1ce76225d\") " pod="openstack/dnsmasq-dns-bc7f9874c-qp4nh" Dec 03 06:59:45 crc kubenswrapper[4475]: I1203 06:59:45.412265 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/875f3e6c-4ccd-4b70-8ea6-c5d76eedc268-config-data\") pod \"keystone-bootstrap-qrzmb\" (UID: \"875f3e6c-4ccd-4b70-8ea6-c5d76eedc268\") " pod="openstack/keystone-bootstrap-qrzmb" Dec 03 06:59:45 crc kubenswrapper[4475]: I1203 06:59:45.412296 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/45a837ca-f13a-4df1-bcad-62d1ce76225d-ovsdbserver-nb\") pod \"dnsmasq-dns-bc7f9874c-qp4nh\" (UID: \"45a837ca-f13a-4df1-bcad-62d1ce76225d\") " pod="openstack/dnsmasq-dns-bc7f9874c-qp4nh" Dec 03 06:59:45 crc kubenswrapper[4475]: I1203 06:59:45.412331 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-t9nxg\" (UniqueName: \"kubernetes.io/projected/45a837ca-f13a-4df1-bcad-62d1ce76225d-kube-api-access-t9nxg\") pod \"dnsmasq-dns-bc7f9874c-qp4nh\" (UID: \"45a837ca-f13a-4df1-bcad-62d1ce76225d\") " pod="openstack/dnsmasq-dns-bc7f9874c-qp4nh" Dec 03 06:59:45 crc kubenswrapper[4475]: I1203 06:59:45.412348 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/875f3e6c-4ccd-4b70-8ea6-c5d76eedc268-combined-ca-bundle\") pod \"keystone-bootstrap-qrzmb\" (UID: \"875f3e6c-4ccd-4b70-8ea6-c5d76eedc268\") " pod="openstack/keystone-bootstrap-qrzmb" Dec 03 06:59:45 crc kubenswrapper[4475]: I1203 06:59:45.412398 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: 
\"kubernetes.io/secret/875f3e6c-4ccd-4b70-8ea6-c5d76eedc268-fernet-keys\") pod \"keystone-bootstrap-qrzmb\" (UID: \"875f3e6c-4ccd-4b70-8ea6-c5d76eedc268\") " pod="openstack/keystone-bootstrap-qrzmb" Dec 03 06:59:45 crc kubenswrapper[4475]: I1203 06:59:45.412425 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6xddj\" (UniqueName: \"kubernetes.io/projected/875f3e6c-4ccd-4b70-8ea6-c5d76eedc268-kube-api-access-6xddj\") pod \"keystone-bootstrap-qrzmb\" (UID: \"875f3e6c-4ccd-4b70-8ea6-c5d76eedc268\") " pod="openstack/keystone-bootstrap-qrzmb" Dec 03 06:59:45 crc kubenswrapper[4475]: I1203 06:59:45.412444 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/875f3e6c-4ccd-4b70-8ea6-c5d76eedc268-scripts\") pod \"keystone-bootstrap-qrzmb\" (UID: \"875f3e6c-4ccd-4b70-8ea6-c5d76eedc268\") " pod="openstack/keystone-bootstrap-qrzmb" Dec 03 06:59:45 crc kubenswrapper[4475]: I1203 06:59:45.412481 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/45a837ca-f13a-4df1-bcad-62d1ce76225d-dns-swift-storage-0\") pod \"dnsmasq-dns-bc7f9874c-qp4nh\" (UID: \"45a837ca-f13a-4df1-bcad-62d1ce76225d\") " pod="openstack/dnsmasq-dns-bc7f9874c-qp4nh" Dec 03 06:59:45 crc kubenswrapper[4475]: I1203 06:59:45.412502 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/45a837ca-f13a-4df1-bcad-62d1ce76225d-config\") pod \"dnsmasq-dns-bc7f9874c-qp4nh\" (UID: \"45a837ca-f13a-4df1-bcad-62d1ce76225d\") " pod="openstack/dnsmasq-dns-bc7f9874c-qp4nh" Dec 03 06:59:45 crc kubenswrapper[4475]: I1203 06:59:45.413608 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/45a837ca-f13a-4df1-bcad-62d1ce76225d-dns-svc\") 
pod \"dnsmasq-dns-bc7f9874c-qp4nh\" (UID: \"45a837ca-f13a-4df1-bcad-62d1ce76225d\") " pod="openstack/dnsmasq-dns-bc7f9874c-qp4nh" Dec 03 06:59:45 crc kubenswrapper[4475]: I1203 06:59:45.413640 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/45a837ca-f13a-4df1-bcad-62d1ce76225d-ovsdbserver-sb\") pod \"dnsmasq-dns-bc7f9874c-qp4nh\" (UID: \"45a837ca-f13a-4df1-bcad-62d1ce76225d\") " pod="openstack/dnsmasq-dns-bc7f9874c-qp4nh" Dec 03 06:59:45 crc kubenswrapper[4475]: I1203 06:59:45.414000 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/45a837ca-f13a-4df1-bcad-62d1ce76225d-config\") pod \"dnsmasq-dns-bc7f9874c-qp4nh\" (UID: \"45a837ca-f13a-4df1-bcad-62d1ce76225d\") " pod="openstack/dnsmasq-dns-bc7f9874c-qp4nh" Dec 03 06:59:45 crc kubenswrapper[4475]: I1203 06:59:45.413998 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/45a837ca-f13a-4df1-bcad-62d1ce76225d-ovsdbserver-nb\") pod \"dnsmasq-dns-bc7f9874c-qp4nh\" (UID: \"45a837ca-f13a-4df1-bcad-62d1ce76225d\") " pod="openstack/dnsmasq-dns-bc7f9874c-qp4nh" Dec 03 06:59:45 crc kubenswrapper[4475]: I1203 06:59:45.414187 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/45a837ca-f13a-4df1-bcad-62d1ce76225d-dns-swift-storage-0\") pod \"dnsmasq-dns-bc7f9874c-qp4nh\" (UID: \"45a837ca-f13a-4df1-bcad-62d1ce76225d\") " pod="openstack/dnsmasq-dns-bc7f9874c-qp4nh" Dec 03 06:59:45 crc kubenswrapper[4475]: I1203 06:59:45.452724 4475 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/heat-db-sync-bxx54"] Dec 03 06:59:45 crc kubenswrapper[4475]: I1203 06:59:45.471324 4475 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/heat-db-sync-bxx54" Dec 03 06:59:45 crc kubenswrapper[4475]: I1203 06:59:45.476877 4475 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"heat-heat-dockercfg-ld5mr" Dec 03 06:59:45 crc kubenswrapper[4475]: I1203 06:59:45.477183 4475 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"heat-config-data" Dec 03 06:59:45 crc kubenswrapper[4475]: I1203 06:59:45.515493 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-t9nxg\" (UniqueName: \"kubernetes.io/projected/45a837ca-f13a-4df1-bcad-62d1ce76225d-kube-api-access-t9nxg\") pod \"dnsmasq-dns-bc7f9874c-qp4nh\" (UID: \"45a837ca-f13a-4df1-bcad-62d1ce76225d\") " pod="openstack/dnsmasq-dns-bc7f9874c-qp4nh" Dec 03 06:59:45 crc kubenswrapper[4475]: I1203 06:59:45.520276 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/875f3e6c-4ccd-4b70-8ea6-c5d76eedc268-credential-keys\") pod \"keystone-bootstrap-qrzmb\" (UID: \"875f3e6c-4ccd-4b70-8ea6-c5d76eedc268\") " pod="openstack/keystone-bootstrap-qrzmb" Dec 03 06:59:45 crc kubenswrapper[4475]: I1203 06:59:45.521659 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/875f3e6c-4ccd-4b70-8ea6-c5d76eedc268-config-data\") pod \"keystone-bootstrap-qrzmb\" (UID: \"875f3e6c-4ccd-4b70-8ea6-c5d76eedc268\") " pod="openstack/keystone-bootstrap-qrzmb" Dec 03 06:59:45 crc kubenswrapper[4475]: I1203 06:59:45.521779 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dktz8\" (UniqueName: \"kubernetes.io/projected/7105b12e-7df5-42e5-b0cc-27ea52ea7b1c-kube-api-access-dktz8\") pod \"heat-db-sync-bxx54\" (UID: \"7105b12e-7df5-42e5-b0cc-27ea52ea7b1c\") " pod="openstack/heat-db-sync-bxx54" Dec 03 06:59:45 crc kubenswrapper[4475]: I1203 
06:59:45.521860 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/875f3e6c-4ccd-4b70-8ea6-c5d76eedc268-combined-ca-bundle\") pod \"keystone-bootstrap-qrzmb\" (UID: \"875f3e6c-4ccd-4b70-8ea6-c5d76eedc268\") " pod="openstack/keystone-bootstrap-qrzmb" Dec 03 06:59:45 crc kubenswrapper[4475]: I1203 06:59:45.521930 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7105b12e-7df5-42e5-b0cc-27ea52ea7b1c-config-data\") pod \"heat-db-sync-bxx54\" (UID: \"7105b12e-7df5-42e5-b0cc-27ea52ea7b1c\") " pod="openstack/heat-db-sync-bxx54" Dec 03 06:59:45 crc kubenswrapper[4475]: I1203 06:59:45.522023 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7105b12e-7df5-42e5-b0cc-27ea52ea7b1c-combined-ca-bundle\") pod \"heat-db-sync-bxx54\" (UID: \"7105b12e-7df5-42e5-b0cc-27ea52ea7b1c\") " pod="openstack/heat-db-sync-bxx54" Dec 03 06:59:45 crc kubenswrapper[4475]: I1203 06:59:45.522104 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/875f3e6c-4ccd-4b70-8ea6-c5d76eedc268-fernet-keys\") pod \"keystone-bootstrap-qrzmb\" (UID: \"875f3e6c-4ccd-4b70-8ea6-c5d76eedc268\") " pod="openstack/keystone-bootstrap-qrzmb" Dec 03 06:59:45 crc kubenswrapper[4475]: I1203 06:59:45.522171 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6xddj\" (UniqueName: \"kubernetes.io/projected/875f3e6c-4ccd-4b70-8ea6-c5d76eedc268-kube-api-access-6xddj\") pod \"keystone-bootstrap-qrzmb\" (UID: \"875f3e6c-4ccd-4b70-8ea6-c5d76eedc268\") " pod="openstack/keystone-bootstrap-qrzmb" Dec 03 06:59:45 crc kubenswrapper[4475]: I1203 06:59:45.522236 4475 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/875f3e6c-4ccd-4b70-8ea6-c5d76eedc268-scripts\") pod \"keystone-bootstrap-qrzmb\" (UID: \"875f3e6c-4ccd-4b70-8ea6-c5d76eedc268\") " pod="openstack/keystone-bootstrap-qrzmb" Dec 03 06:59:45 crc kubenswrapper[4475]: I1203 06:59:45.532865 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/875f3e6c-4ccd-4b70-8ea6-c5d76eedc268-config-data\") pod \"keystone-bootstrap-qrzmb\" (UID: \"875f3e6c-4ccd-4b70-8ea6-c5d76eedc268\") " pod="openstack/keystone-bootstrap-qrzmb" Dec 03 06:59:45 crc kubenswrapper[4475]: I1203 06:59:45.545865 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/875f3e6c-4ccd-4b70-8ea6-c5d76eedc268-scripts\") pod \"keystone-bootstrap-qrzmb\" (UID: \"875f3e6c-4ccd-4b70-8ea6-c5d76eedc268\") " pod="openstack/keystone-bootstrap-qrzmb" Dec 03 06:59:45 crc kubenswrapper[4475]: I1203 06:59:45.550094 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/875f3e6c-4ccd-4b70-8ea6-c5d76eedc268-fernet-keys\") pod \"keystone-bootstrap-qrzmb\" (UID: \"875f3e6c-4ccd-4b70-8ea6-c5d76eedc268\") " pod="openstack/keystone-bootstrap-qrzmb" Dec 03 06:59:45 crc kubenswrapper[4475]: I1203 06:59:45.560269 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/875f3e6c-4ccd-4b70-8ea6-c5d76eedc268-credential-keys\") pod \"keystone-bootstrap-qrzmb\" (UID: \"875f3e6c-4ccd-4b70-8ea6-c5d76eedc268\") " pod="openstack/keystone-bootstrap-qrzmb" Dec 03 06:59:45 crc kubenswrapper[4475]: I1203 06:59:45.597927 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/875f3e6c-4ccd-4b70-8ea6-c5d76eedc268-combined-ca-bundle\") pod 
\"keystone-bootstrap-qrzmb\" (UID: \"875f3e6c-4ccd-4b70-8ea6-c5d76eedc268\") " pod="openstack/keystone-bootstrap-qrzmb" Dec 03 06:59:45 crc kubenswrapper[4475]: I1203 06:59:45.599509 4475 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-db-sync-bxx54"] Dec 03 06:59:45 crc kubenswrapper[4475]: I1203 06:59:45.606918 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6xddj\" (UniqueName: \"kubernetes.io/projected/875f3e6c-4ccd-4b70-8ea6-c5d76eedc268-kube-api-access-6xddj\") pod \"keystone-bootstrap-qrzmb\" (UID: \"875f3e6c-4ccd-4b70-8ea6-c5d76eedc268\") " pod="openstack/keystone-bootstrap-qrzmb" Dec 03 06:59:45 crc kubenswrapper[4475]: I1203 06:59:45.628053 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dktz8\" (UniqueName: \"kubernetes.io/projected/7105b12e-7df5-42e5-b0cc-27ea52ea7b1c-kube-api-access-dktz8\") pod \"heat-db-sync-bxx54\" (UID: \"7105b12e-7df5-42e5-b0cc-27ea52ea7b1c\") " pod="openstack/heat-db-sync-bxx54" Dec 03 06:59:45 crc kubenswrapper[4475]: I1203 06:59:45.628166 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7105b12e-7df5-42e5-b0cc-27ea52ea7b1c-config-data\") pod \"heat-db-sync-bxx54\" (UID: \"7105b12e-7df5-42e5-b0cc-27ea52ea7b1c\") " pod="openstack/heat-db-sync-bxx54" Dec 03 06:59:45 crc kubenswrapper[4475]: I1203 06:59:45.628245 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7105b12e-7df5-42e5-b0cc-27ea52ea7b1c-combined-ca-bundle\") pod \"heat-db-sync-bxx54\" (UID: \"7105b12e-7df5-42e5-b0cc-27ea52ea7b1c\") " pod="openstack/heat-db-sync-bxx54" Dec 03 06:59:45 crc kubenswrapper[4475]: I1203 06:59:45.635568 4475 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-bc7f9874c-qp4nh" Dec 03 06:59:45 crc kubenswrapper[4475]: I1203 06:59:45.637299 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7105b12e-7df5-42e5-b0cc-27ea52ea7b1c-combined-ca-bundle\") pod \"heat-db-sync-bxx54\" (UID: \"7105b12e-7df5-42e5-b0cc-27ea52ea7b1c\") " pod="openstack/heat-db-sync-bxx54" Dec 03 06:59:45 crc kubenswrapper[4475]: I1203 06:59:45.638373 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7105b12e-7df5-42e5-b0cc-27ea52ea7b1c-config-data\") pod \"heat-db-sync-bxx54\" (UID: \"7105b12e-7df5-42e5-b0cc-27ea52ea7b1c\") " pod="openstack/heat-db-sync-bxx54" Dec 03 06:59:45 crc kubenswrapper[4475]: I1203 06:59:45.667278 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dktz8\" (UniqueName: \"kubernetes.io/projected/7105b12e-7df5-42e5-b0cc-27ea52ea7b1c-kube-api-access-dktz8\") pod \"heat-db-sync-bxx54\" (UID: \"7105b12e-7df5-42e5-b0cc-27ea52ea7b1c\") " pod="openstack/heat-db-sync-bxx54" Dec 03 06:59:45 crc kubenswrapper[4475]: I1203 06:59:45.681202 4475 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-qrzmb" Dec 03 06:59:45 crc kubenswrapper[4475]: I1203 06:59:45.684821 4475 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/horizon-77ccb6ff57-mkx4c"] Dec 03 06:59:45 crc kubenswrapper[4475]: I1203 06:59:45.699623 4475 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-77ccb6ff57-mkx4c" Dec 03 06:59:45 crc kubenswrapper[4475]: I1203 06:59:45.711206 4475 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-77ccb6ff57-mkx4c"] Dec 03 06:59:45 crc kubenswrapper[4475]: I1203 06:59:45.717306 4475 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Dec 03 06:59:45 crc kubenswrapper[4475]: I1203 06:59:45.719465 4475 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Dec 03 06:59:45 crc kubenswrapper[4475]: I1203 06:59:45.796980 4475 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/heat-db-sync-bxx54" Dec 03 06:59:45 crc kubenswrapper[4475]: I1203 06:59:45.802983 4475 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-db-sync-6ntz5"] Dec 03 06:59:45 crc kubenswrapper[4475]: I1203 06:59:45.804036 4475 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-sync-6ntz5" Dec 03 06:59:45 crc kubenswrapper[4475]: I1203 06:59:45.805735 4475 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"horizon-horizon-dockercfg-kskz8" Dec 03 06:59:45 crc kubenswrapper[4475]: I1203 06:59:45.824681 4475 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Dec 03 06:59:45 crc kubenswrapper[4475]: I1203 06:59:45.828911 4475 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"horizon-scripts" Dec 03 06:59:45 crc kubenswrapper[4475]: I1203 06:59:45.831408 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d6182da1-5b12-4bb3-af9b-3c8cb8d7ddc8-logs\") pod \"horizon-77ccb6ff57-mkx4c\" (UID: \"d6182da1-5b12-4bb3-af9b-3c8cb8d7ddc8\") " pod="openstack/horizon-77ccb6ff57-mkx4c" Dec 03 06:59:45 crc kubenswrapper[4475]: I1203 06:59:45.831882 4475 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v2w6g\" (UniqueName: \"kubernetes.io/projected/d6182da1-5b12-4bb3-af9b-3c8cb8d7ddc8-kube-api-access-v2w6g\") pod \"horizon-77ccb6ff57-mkx4c\" (UID: \"d6182da1-5b12-4bb3-af9b-3c8cb8d7ddc8\") " pod="openstack/horizon-77ccb6ff57-mkx4c" Dec 03 06:59:45 crc kubenswrapper[4475]: I1203 06:59:45.831967 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/12f47969-3169-43bc-8d07-cbd3952d81cf-log-httpd\") pod \"ceilometer-0\" (UID: \"12f47969-3169-43bc-8d07-cbd3952d81cf\") " pod="openstack/ceilometer-0" Dec 03 06:59:45 crc kubenswrapper[4475]: I1203 06:59:45.832028 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/12f47969-3169-43bc-8d07-cbd3952d81cf-scripts\") pod \"ceilometer-0\" (UID: \"12f47969-3169-43bc-8d07-cbd3952d81cf\") " pod="openstack/ceilometer-0" Dec 03 06:59:45 crc kubenswrapper[4475]: I1203 06:59:45.832182 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kn5xv\" (UniqueName: \"kubernetes.io/projected/12f47969-3169-43bc-8d07-cbd3952d81cf-kube-api-access-kn5xv\") pod \"ceilometer-0\" (UID: \"12f47969-3169-43bc-8d07-cbd3952d81cf\") " pod="openstack/ceilometer-0" Dec 03 06:59:45 crc kubenswrapper[4475]: I1203 06:59:45.832280 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/d6182da1-5b12-4bb3-af9b-3c8cb8d7ddc8-scripts\") pod \"horizon-77ccb6ff57-mkx4c\" (UID: \"d6182da1-5b12-4bb3-af9b-3c8cb8d7ddc8\") " pod="openstack/horizon-77ccb6ff57-mkx4c" Dec 03 06:59:45 crc kubenswrapper[4475]: I1203 06:59:45.832410 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started 
for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/d6182da1-5b12-4bb3-af9b-3c8cb8d7ddc8-config-data\") pod \"horizon-77ccb6ff57-mkx4c\" (UID: \"d6182da1-5b12-4bb3-af9b-3c8cb8d7ddc8\") " pod="openstack/horizon-77ccb6ff57-mkx4c" Dec 03 06:59:45 crc kubenswrapper[4475]: I1203 06:59:45.832499 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/12f47969-3169-43bc-8d07-cbd3952d81cf-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"12f47969-3169-43bc-8d07-cbd3952d81cf\") " pod="openstack/ceilometer-0" Dec 03 06:59:45 crc kubenswrapper[4475]: I1203 06:59:45.832608 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/d6182da1-5b12-4bb3-af9b-3c8cb8d7ddc8-horizon-secret-key\") pod \"horizon-77ccb6ff57-mkx4c\" (UID: \"d6182da1-5b12-4bb3-af9b-3c8cb8d7ddc8\") " pod="openstack/horizon-77ccb6ff57-mkx4c" Dec 03 06:59:45 crc kubenswrapper[4475]: I1203 06:59:45.832675 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/12f47969-3169-43bc-8d07-cbd3952d81cf-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"12f47969-3169-43bc-8d07-cbd3952d81cf\") " pod="openstack/ceilometer-0" Dec 03 06:59:45 crc kubenswrapper[4475]: I1203 06:59:45.832739 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/12f47969-3169-43bc-8d07-cbd3952d81cf-run-httpd\") pod \"ceilometer-0\" (UID: \"12f47969-3169-43bc-8d07-cbd3952d81cf\") " pod="openstack/ceilometer-0" Dec 03 06:59:45 crc kubenswrapper[4475]: I1203 06:59:45.832802 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/12f47969-3169-43bc-8d07-cbd3952d81cf-config-data\") pod \"ceilometer-0\" (UID: \"12f47969-3169-43bc-8d07-cbd3952d81cf\") " pod="openstack/ceilometer-0" Dec 03 06:59:45 crc kubenswrapper[4475]: I1203 06:59:45.828970 4475 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"horizon" Dec 03 06:59:45 crc kubenswrapper[4475]: I1203 06:59:45.848861 4475 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Dec 03 06:59:45 crc kubenswrapper[4475]: I1203 06:59:45.849316 4475 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Dec 03 06:59:45 crc kubenswrapper[4475]: I1203 06:59:45.849398 4475 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-cinder-dockercfg-ppndb" Dec 03 06:59:45 crc kubenswrapper[4475]: I1203 06:59:45.849603 4475 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scripts" Dec 03 06:59:45 crc kubenswrapper[4475]: I1203 06:59:45.865854 4475 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-config-data" Dec 03 06:59:45 crc kubenswrapper[4475]: I1203 06:59:45.866057 4475 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"horizon-config-data" Dec 03 06:59:45 crc kubenswrapper[4475]: I1203 06:59:45.875031 4475 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-sync-6ntz5"] Dec 03 06:59:45 crc kubenswrapper[4475]: I1203 06:59:45.937492 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kn5xv\" (UniqueName: \"kubernetes.io/projected/12f47969-3169-43bc-8d07-cbd3952d81cf-kube-api-access-kn5xv\") pod \"ceilometer-0\" (UID: \"12f47969-3169-43bc-8d07-cbd3952d81cf\") " pod="openstack/ceilometer-0" Dec 03 06:59:45 crc kubenswrapper[4475]: I1203 06:59:45.937859 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started 
for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2a298c73-a9bf-496a-9192-dcbf3e2417cd-combined-ca-bundle\") pod \"cinder-db-sync-6ntz5\" (UID: \"2a298c73-a9bf-496a-9192-dcbf3e2417cd\") " pod="openstack/cinder-db-sync-6ntz5" Dec 03 06:59:45 crc kubenswrapper[4475]: I1203 06:59:45.937895 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/d6182da1-5b12-4bb3-af9b-3c8cb8d7ddc8-scripts\") pod \"horizon-77ccb6ff57-mkx4c\" (UID: \"d6182da1-5b12-4bb3-af9b-3c8cb8d7ddc8\") " pod="openstack/horizon-77ccb6ff57-mkx4c" Dec 03 06:59:45 crc kubenswrapper[4475]: I1203 06:59:45.937917 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hbqln\" (UniqueName: \"kubernetes.io/projected/2a298c73-a9bf-496a-9192-dcbf3e2417cd-kube-api-access-hbqln\") pod \"cinder-db-sync-6ntz5\" (UID: \"2a298c73-a9bf-496a-9192-dcbf3e2417cd\") " pod="openstack/cinder-db-sync-6ntz5" Dec 03 06:59:45 crc kubenswrapper[4475]: I1203 06:59:45.937993 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/d6182da1-5b12-4bb3-af9b-3c8cb8d7ddc8-config-data\") pod \"horizon-77ccb6ff57-mkx4c\" (UID: \"d6182da1-5b12-4bb3-af9b-3c8cb8d7ddc8\") " pod="openstack/horizon-77ccb6ff57-mkx4c" Dec 03 06:59:45 crc kubenswrapper[4475]: I1203 06:59:45.938020 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/12f47969-3169-43bc-8d07-cbd3952d81cf-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"12f47969-3169-43bc-8d07-cbd3952d81cf\") " pod="openstack/ceilometer-0" Dec 03 06:59:45 crc kubenswrapper[4475]: I1203 06:59:45.938044 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: 
\"kubernetes.io/host-path/2a298c73-a9bf-496a-9192-dcbf3e2417cd-etc-machine-id\") pod \"cinder-db-sync-6ntz5\" (UID: \"2a298c73-a9bf-496a-9192-dcbf3e2417cd\") " pod="openstack/cinder-db-sync-6ntz5" Dec 03 06:59:45 crc kubenswrapper[4475]: I1203 06:59:45.938060 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/2a298c73-a9bf-496a-9192-dcbf3e2417cd-db-sync-config-data\") pod \"cinder-db-sync-6ntz5\" (UID: \"2a298c73-a9bf-496a-9192-dcbf3e2417cd\") " pod="openstack/cinder-db-sync-6ntz5" Dec 03 06:59:45 crc kubenswrapper[4475]: I1203 06:59:45.938081 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2a298c73-a9bf-496a-9192-dcbf3e2417cd-config-data\") pod \"cinder-db-sync-6ntz5\" (UID: \"2a298c73-a9bf-496a-9192-dcbf3e2417cd\") " pod="openstack/cinder-db-sync-6ntz5" Dec 03 06:59:45 crc kubenswrapper[4475]: I1203 06:59:45.938101 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2a298c73-a9bf-496a-9192-dcbf3e2417cd-scripts\") pod \"cinder-db-sync-6ntz5\" (UID: \"2a298c73-a9bf-496a-9192-dcbf3e2417cd\") " pod="openstack/cinder-db-sync-6ntz5" Dec 03 06:59:45 crc kubenswrapper[4475]: I1203 06:59:45.938173 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/d6182da1-5b12-4bb3-af9b-3c8cb8d7ddc8-horizon-secret-key\") pod \"horizon-77ccb6ff57-mkx4c\" (UID: \"d6182da1-5b12-4bb3-af9b-3c8cb8d7ddc8\") " pod="openstack/horizon-77ccb6ff57-mkx4c" Dec 03 06:59:45 crc kubenswrapper[4475]: I1203 06:59:45.938205 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/12f47969-3169-43bc-8d07-cbd3952d81cf-sg-core-conf-yaml\") 
pod \"ceilometer-0\" (UID: \"12f47969-3169-43bc-8d07-cbd3952d81cf\") " pod="openstack/ceilometer-0" Dec 03 06:59:45 crc kubenswrapper[4475]: I1203 06:59:45.938237 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/12f47969-3169-43bc-8d07-cbd3952d81cf-run-httpd\") pod \"ceilometer-0\" (UID: \"12f47969-3169-43bc-8d07-cbd3952d81cf\") " pod="openstack/ceilometer-0" Dec 03 06:59:45 crc kubenswrapper[4475]: I1203 06:59:45.938263 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/12f47969-3169-43bc-8d07-cbd3952d81cf-config-data\") pod \"ceilometer-0\" (UID: \"12f47969-3169-43bc-8d07-cbd3952d81cf\") " pod="openstack/ceilometer-0" Dec 03 06:59:45 crc kubenswrapper[4475]: I1203 06:59:45.938289 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d6182da1-5b12-4bb3-af9b-3c8cb8d7ddc8-logs\") pod \"horizon-77ccb6ff57-mkx4c\" (UID: \"d6182da1-5b12-4bb3-af9b-3c8cb8d7ddc8\") " pod="openstack/horizon-77ccb6ff57-mkx4c" Dec 03 06:59:45 crc kubenswrapper[4475]: I1203 06:59:45.938316 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-v2w6g\" (UniqueName: \"kubernetes.io/projected/d6182da1-5b12-4bb3-af9b-3c8cb8d7ddc8-kube-api-access-v2w6g\") pod \"horizon-77ccb6ff57-mkx4c\" (UID: \"d6182da1-5b12-4bb3-af9b-3c8cb8d7ddc8\") " pod="openstack/horizon-77ccb6ff57-mkx4c" Dec 03 06:59:45 crc kubenswrapper[4475]: I1203 06:59:45.938332 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/12f47969-3169-43bc-8d07-cbd3952d81cf-log-httpd\") pod \"ceilometer-0\" (UID: \"12f47969-3169-43bc-8d07-cbd3952d81cf\") " pod="openstack/ceilometer-0" Dec 03 06:59:45 crc kubenswrapper[4475]: I1203 06:59:45.938345 4475 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/12f47969-3169-43bc-8d07-cbd3952d81cf-scripts\") pod \"ceilometer-0\" (UID: \"12f47969-3169-43bc-8d07-cbd3952d81cf\") " pod="openstack/ceilometer-0" Dec 03 06:59:45 crc kubenswrapper[4475]: I1203 06:59:45.939057 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/d6182da1-5b12-4bb3-af9b-3c8cb8d7ddc8-scripts\") pod \"horizon-77ccb6ff57-mkx4c\" (UID: \"d6182da1-5b12-4bb3-af9b-3c8cb8d7ddc8\") " pod="openstack/horizon-77ccb6ff57-mkx4c" Dec 03 06:59:45 crc kubenswrapper[4475]: I1203 06:59:45.942067 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/d6182da1-5b12-4bb3-af9b-3c8cb8d7ddc8-config-data\") pod \"horizon-77ccb6ff57-mkx4c\" (UID: \"d6182da1-5b12-4bb3-af9b-3c8cb8d7ddc8\") " pod="openstack/horizon-77ccb6ff57-mkx4c" Dec 03 06:59:45 crc kubenswrapper[4475]: I1203 06:59:45.950922 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/12f47969-3169-43bc-8d07-cbd3952d81cf-run-httpd\") pod \"ceilometer-0\" (UID: \"12f47969-3169-43bc-8d07-cbd3952d81cf\") " pod="openstack/ceilometer-0" Dec 03 06:59:45 crc kubenswrapper[4475]: I1203 06:59:45.951616 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/12f47969-3169-43bc-8d07-cbd3952d81cf-log-httpd\") pod \"ceilometer-0\" (UID: \"12f47969-3169-43bc-8d07-cbd3952d81cf\") " pod="openstack/ceilometer-0" Dec 03 06:59:45 crc kubenswrapper[4475]: I1203 06:59:45.951701 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d6182da1-5b12-4bb3-af9b-3c8cb8d7ddc8-logs\") pod \"horizon-77ccb6ff57-mkx4c\" (UID: \"d6182da1-5b12-4bb3-af9b-3c8cb8d7ddc8\") " 
pod="openstack/horizon-77ccb6ff57-mkx4c" Dec 03 06:59:45 crc kubenswrapper[4475]: I1203 06:59:45.953965 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/d6182da1-5b12-4bb3-af9b-3c8cb8d7ddc8-horizon-secret-key\") pod \"horizon-77ccb6ff57-mkx4c\" (UID: \"d6182da1-5b12-4bb3-af9b-3c8cb8d7ddc8\") " pod="openstack/horizon-77ccb6ff57-mkx4c" Dec 03 06:59:45 crc kubenswrapper[4475]: I1203 06:59:45.956570 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/12f47969-3169-43bc-8d07-cbd3952d81cf-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"12f47969-3169-43bc-8d07-cbd3952d81cf\") " pod="openstack/ceilometer-0" Dec 03 06:59:45 crc kubenswrapper[4475]: I1203 06:59:45.957493 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/12f47969-3169-43bc-8d07-cbd3952d81cf-config-data\") pod \"ceilometer-0\" (UID: \"12f47969-3169-43bc-8d07-cbd3952d81cf\") " pod="openstack/ceilometer-0" Dec 03 06:59:45 crc kubenswrapper[4475]: I1203 06:59:45.980922 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/12f47969-3169-43bc-8d07-cbd3952d81cf-scripts\") pod \"ceilometer-0\" (UID: \"12f47969-3169-43bc-8d07-cbd3952d81cf\") " pod="openstack/ceilometer-0" Dec 03 06:59:45 crc kubenswrapper[4475]: I1203 06:59:45.986886 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/12f47969-3169-43bc-8d07-cbd3952d81cf-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"12f47969-3169-43bc-8d07-cbd3952d81cf\") " pod="openstack/ceilometer-0" Dec 03 06:59:46 crc kubenswrapper[4475]: I1203 06:59:46.000804 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kn5xv\" (UniqueName: 
\"kubernetes.io/projected/12f47969-3169-43bc-8d07-cbd3952d81cf-kube-api-access-kn5xv\") pod \"ceilometer-0\" (UID: \"12f47969-3169-43bc-8d07-cbd3952d81cf\") " pod="openstack/ceilometer-0" Dec 03 06:59:46 crc kubenswrapper[4475]: I1203 06:59:46.030542 4475 generic.go:334] "Generic (PLEG): container finished" podID="20f60917-1d39-46b7-a37a-d33efdb6de0f" containerID="74bc9df23fc5e94aa8f857357de9f3077a787e0df089f57f48d7d360e8b4f910" exitCode=0 Dec 03 06:59:46 crc kubenswrapper[4475]: I1203 06:59:46.030634 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5f67cb5455-r956l" event={"ID":"20f60917-1d39-46b7-a37a-d33efdb6de0f","Type":"ContainerDied","Data":"74bc9df23fc5e94aa8f857357de9f3077a787e0df089f57f48d7d360e8b4f910"} Dec 03 06:59:46 crc kubenswrapper[4475]: I1203 06:59:46.033803 4475 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/horizon-dd8b5f859-x5jv6"] Dec 03 06:59:46 crc kubenswrapper[4475]: I1203 06:59:46.035044 4475 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-dd8b5f859-x5jv6" Dec 03 06:59:46 crc kubenswrapper[4475]: I1203 06:59:46.047572 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/2a298c73-a9bf-496a-9192-dcbf3e2417cd-etc-machine-id\") pod \"cinder-db-sync-6ntz5\" (UID: \"2a298c73-a9bf-496a-9192-dcbf3e2417cd\") " pod="openstack/cinder-db-sync-6ntz5" Dec 03 06:59:46 crc kubenswrapper[4475]: I1203 06:59:46.047674 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/2a298c73-a9bf-496a-9192-dcbf3e2417cd-db-sync-config-data\") pod \"cinder-db-sync-6ntz5\" (UID: \"2a298c73-a9bf-496a-9192-dcbf3e2417cd\") " pod="openstack/cinder-db-sync-6ntz5" Dec 03 06:59:46 crc kubenswrapper[4475]: I1203 06:59:46.047742 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2a298c73-a9bf-496a-9192-dcbf3e2417cd-config-data\") pod \"cinder-db-sync-6ntz5\" (UID: \"2a298c73-a9bf-496a-9192-dcbf3e2417cd\") " pod="openstack/cinder-db-sync-6ntz5" Dec 03 06:59:46 crc kubenswrapper[4475]: I1203 06:59:46.047808 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2a298c73-a9bf-496a-9192-dcbf3e2417cd-scripts\") pod \"cinder-db-sync-6ntz5\" (UID: \"2a298c73-a9bf-496a-9192-dcbf3e2417cd\") " pod="openstack/cinder-db-sync-6ntz5" Dec 03 06:59:46 crc kubenswrapper[4475]: I1203 06:59:46.048093 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2a298c73-a9bf-496a-9192-dcbf3e2417cd-combined-ca-bundle\") pod \"cinder-db-sync-6ntz5\" (UID: \"2a298c73-a9bf-496a-9192-dcbf3e2417cd\") " pod="openstack/cinder-db-sync-6ntz5" Dec 03 06:59:46 crc kubenswrapper[4475]: I1203 06:59:46.048173 4475 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hbqln\" (UniqueName: \"kubernetes.io/projected/2a298c73-a9bf-496a-9192-dcbf3e2417cd-kube-api-access-hbqln\") pod \"cinder-db-sync-6ntz5\" (UID: \"2a298c73-a9bf-496a-9192-dcbf3e2417cd\") " pod="openstack/cinder-db-sync-6ntz5" Dec 03 06:59:46 crc kubenswrapper[4475]: I1203 06:59:46.048554 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/2a298c73-a9bf-496a-9192-dcbf3e2417cd-etc-machine-id\") pod \"cinder-db-sync-6ntz5\" (UID: \"2a298c73-a9bf-496a-9192-dcbf3e2417cd\") " pod="openstack/cinder-db-sync-6ntz5" Dec 03 06:59:46 crc kubenswrapper[4475]: I1203 06:59:46.060039 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/2a298c73-a9bf-496a-9192-dcbf3e2417cd-db-sync-config-data\") pod \"cinder-db-sync-6ntz5\" (UID: \"2a298c73-a9bf-496a-9192-dcbf3e2417cd\") " pod="openstack/cinder-db-sync-6ntz5" Dec 03 06:59:46 crc kubenswrapper[4475]: I1203 06:59:46.064976 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2a298c73-a9bf-496a-9192-dcbf3e2417cd-scripts\") pod \"cinder-db-sync-6ntz5\" (UID: \"2a298c73-a9bf-496a-9192-dcbf3e2417cd\") " pod="openstack/cinder-db-sync-6ntz5" Dec 03 06:59:46 crc kubenswrapper[4475]: I1203 06:59:46.065522 4475 generic.go:334] "Generic (PLEG): container finished" podID="d3affe8e-b674-46c4-bf4b-b6fc0d092df7" containerID="4affa86553330d26a24207ded8fde03413664328301fb8a08827e1b303791ada" exitCode=0 Dec 03 06:59:46 crc kubenswrapper[4475]: I1203 06:59:46.065663 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-lszxw" event={"ID":"d3affe8e-b674-46c4-bf4b-b6fc0d092df7","Type":"ContainerDied","Data":"4affa86553330d26a24207ded8fde03413664328301fb8a08827e1b303791ada"} Dec 03 06:59:46 
crc kubenswrapper[4475]: I1203 06:59:46.066278 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-v2w6g\" (UniqueName: \"kubernetes.io/projected/d6182da1-5b12-4bb3-af9b-3c8cb8d7ddc8-kube-api-access-v2w6g\") pod \"horizon-77ccb6ff57-mkx4c\" (UID: \"d6182da1-5b12-4bb3-af9b-3c8cb8d7ddc8\") " pod="openstack/horizon-77ccb6ff57-mkx4c" Dec 03 06:59:46 crc kubenswrapper[4475]: I1203 06:59:46.066318 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2a298c73-a9bf-496a-9192-dcbf3e2417cd-config-data\") pod \"cinder-db-sync-6ntz5\" (UID: \"2a298c73-a9bf-496a-9192-dcbf3e2417cd\") " pod="openstack/cinder-db-sync-6ntz5" Dec 03 06:59:46 crc kubenswrapper[4475]: I1203 06:59:46.069983 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2a298c73-a9bf-496a-9192-dcbf3e2417cd-combined-ca-bundle\") pod \"cinder-db-sync-6ntz5\" (UID: \"2a298c73-a9bf-496a-9192-dcbf3e2417cd\") " pod="openstack/cinder-db-sync-6ntz5" Dec 03 06:59:46 crc kubenswrapper[4475]: I1203 06:59:46.073141 4475 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Dec 03 06:59:46 crc kubenswrapper[4475]: I1203 06:59:46.144967 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hbqln\" (UniqueName: \"kubernetes.io/projected/2a298c73-a9bf-496a-9192-dcbf3e2417cd-kube-api-access-hbqln\") pod \"cinder-db-sync-6ntz5\" (UID: \"2a298c73-a9bf-496a-9192-dcbf3e2417cd\") " pod="openstack/cinder-db-sync-6ntz5" Dec 03 06:59:46 crc kubenswrapper[4475]: I1203 06:59:46.150585 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d172b2a3-c6bd-424e-8f33-25a45263d546-logs\") pod \"horizon-dd8b5f859-x5jv6\" (UID: \"d172b2a3-c6bd-424e-8f33-25a45263d546\") " pod="openstack/horizon-dd8b5f859-x5jv6" Dec 03 06:59:46 crc kubenswrapper[4475]: I1203 06:59:46.150718 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/d172b2a3-c6bd-424e-8f33-25a45263d546-config-data\") pod \"horizon-dd8b5f859-x5jv6\" (UID: \"d172b2a3-c6bd-424e-8f33-25a45263d546\") " pod="openstack/horizon-dd8b5f859-x5jv6" Dec 03 06:59:46 crc kubenswrapper[4475]: I1203 06:59:46.150828 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/d172b2a3-c6bd-424e-8f33-25a45263d546-horizon-secret-key\") pod \"horizon-dd8b5f859-x5jv6\" (UID: \"d172b2a3-c6bd-424e-8f33-25a45263d546\") " pod="openstack/horizon-dd8b5f859-x5jv6" Dec 03 06:59:46 crc kubenswrapper[4475]: I1203 06:59:46.150897 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bvc89\" (UniqueName: \"kubernetes.io/projected/d172b2a3-c6bd-424e-8f33-25a45263d546-kube-api-access-bvc89\") pod \"horizon-dd8b5f859-x5jv6\" (UID: \"d172b2a3-c6bd-424e-8f33-25a45263d546\") " 
pod="openstack/horizon-dd8b5f859-x5jv6" Dec 03 06:59:46 crc kubenswrapper[4475]: I1203 06:59:46.151079 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/d172b2a3-c6bd-424e-8f33-25a45263d546-scripts\") pod \"horizon-dd8b5f859-x5jv6\" (UID: \"d172b2a3-c6bd-424e-8f33-25a45263d546\") " pod="openstack/horizon-dd8b5f859-x5jv6" Dec 03 06:59:46 crc kubenswrapper[4475]: I1203 06:59:46.218942 4475 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-sync-6ntz5" Dec 03 06:59:46 crc kubenswrapper[4475]: I1203 06:59:46.243012 4475 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-dd8b5f859-x5jv6"] Dec 03 06:59:46 crc kubenswrapper[4475]: I1203 06:59:46.252326 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bvc89\" (UniqueName: \"kubernetes.io/projected/d172b2a3-c6bd-424e-8f33-25a45263d546-kube-api-access-bvc89\") pod \"horizon-dd8b5f859-x5jv6\" (UID: \"d172b2a3-c6bd-424e-8f33-25a45263d546\") " pod="openstack/horizon-dd8b5f859-x5jv6" Dec 03 06:59:46 crc kubenswrapper[4475]: I1203 06:59:46.252541 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/d172b2a3-c6bd-424e-8f33-25a45263d546-scripts\") pod \"horizon-dd8b5f859-x5jv6\" (UID: \"d172b2a3-c6bd-424e-8f33-25a45263d546\") " pod="openstack/horizon-dd8b5f859-x5jv6" Dec 03 06:59:46 crc kubenswrapper[4475]: I1203 06:59:46.253291 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/d172b2a3-c6bd-424e-8f33-25a45263d546-scripts\") pod \"horizon-dd8b5f859-x5jv6\" (UID: \"d172b2a3-c6bd-424e-8f33-25a45263d546\") " pod="openstack/horizon-dd8b5f859-x5jv6" Dec 03 06:59:46 crc kubenswrapper[4475]: I1203 06:59:46.253359 4475 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d172b2a3-c6bd-424e-8f33-25a45263d546-logs\") pod \"horizon-dd8b5f859-x5jv6\" (UID: \"d172b2a3-c6bd-424e-8f33-25a45263d546\") " pod="openstack/horizon-dd8b5f859-x5jv6" Dec 03 06:59:46 crc kubenswrapper[4475]: I1203 06:59:46.253410 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/d172b2a3-c6bd-424e-8f33-25a45263d546-config-data\") pod \"horizon-dd8b5f859-x5jv6\" (UID: \"d172b2a3-c6bd-424e-8f33-25a45263d546\") " pod="openstack/horizon-dd8b5f859-x5jv6" Dec 03 06:59:46 crc kubenswrapper[4475]: I1203 06:59:46.253491 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/d172b2a3-c6bd-424e-8f33-25a45263d546-horizon-secret-key\") pod \"horizon-dd8b5f859-x5jv6\" (UID: \"d172b2a3-c6bd-424e-8f33-25a45263d546\") " pod="openstack/horizon-dd8b5f859-x5jv6" Dec 03 06:59:46 crc kubenswrapper[4475]: I1203 06:59:46.255009 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d172b2a3-c6bd-424e-8f33-25a45263d546-logs\") pod \"horizon-dd8b5f859-x5jv6\" (UID: \"d172b2a3-c6bd-424e-8f33-25a45263d546\") " pod="openstack/horizon-dd8b5f859-x5jv6" Dec 03 06:59:46 crc kubenswrapper[4475]: I1203 06:59:46.255914 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/d172b2a3-c6bd-424e-8f33-25a45263d546-config-data\") pod \"horizon-dd8b5f859-x5jv6\" (UID: \"d172b2a3-c6bd-424e-8f33-25a45263d546\") " pod="openstack/horizon-dd8b5f859-x5jv6" Dec 03 06:59:46 crc kubenswrapper[4475]: I1203 06:59:46.266856 4475 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-db-sync-pqss2"] Dec 03 06:59:46 crc kubenswrapper[4475]: I1203 06:59:46.267892 4475 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-db-sync-pqss2" Dec 03 06:59:46 crc kubenswrapper[4475]: I1203 06:59:46.278083 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/d172b2a3-c6bd-424e-8f33-25a45263d546-horizon-secret-key\") pod \"horizon-dd8b5f859-x5jv6\" (UID: \"d172b2a3-c6bd-424e-8f33-25a45263d546\") " pod="openstack/horizon-dd8b5f859-x5jv6" Dec 03 06:59:46 crc kubenswrapper[4475]: I1203 06:59:46.278443 4475 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-config-data" Dec 03 06:59:46 crc kubenswrapper[4475]: I1203 06:59:46.278620 4475 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-scripts" Dec 03 06:59:46 crc kubenswrapper[4475]: I1203 06:59:46.278725 4475 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-placement-dockercfg-nwnxp" Dec 03 06:59:46 crc kubenswrapper[4475]: I1203 06:59:46.297236 4475 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-db-sync-zch8h"] Dec 03 06:59:46 crc kubenswrapper[4475]: I1203 06:59:46.299092 4475 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-sync-zch8h" Dec 03 06:59:46 crc kubenswrapper[4475]: I1203 06:59:46.319149 4475 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-sync-pqss2"] Dec 03 06:59:46 crc kubenswrapper[4475]: I1203 06:59:46.319952 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bvc89\" (UniqueName: \"kubernetes.io/projected/d172b2a3-c6bd-424e-8f33-25a45263d546-kube-api-access-bvc89\") pod \"horizon-dd8b5f859-x5jv6\" (UID: \"d172b2a3-c6bd-424e-8f33-25a45263d546\") " pod="openstack/horizon-dd8b5f859-x5jv6" Dec 03 06:59:46 crc kubenswrapper[4475]: I1203 06:59:46.329682 4475 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-77ccb6ff57-mkx4c" Dec 03 06:59:46 crc kubenswrapper[4475]: I1203 06:59:46.343743 4475 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-barbican-dockercfg-gghvg" Dec 03 06:59:46 crc kubenswrapper[4475]: I1203 06:59:46.343950 4475 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-config-data" Dec 03 06:59:46 crc kubenswrapper[4475]: I1203 06:59:46.355422 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8c7df369-c49c-4d2a-842a-a8bd41944f1b-combined-ca-bundle\") pod \"barbican-db-sync-zch8h\" (UID: \"8c7df369-c49c-4d2a-842a-a8bd41944f1b\") " pod="openstack/barbican-db-sync-zch8h" Dec 03 06:59:46 crc kubenswrapper[4475]: I1203 06:59:46.355510 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/50149f3f-08c3-4fd9-9590-b13fcd787897-combined-ca-bundle\") pod \"placement-db-sync-pqss2\" (UID: \"50149f3f-08c3-4fd9-9590-b13fcd787897\") " pod="openstack/placement-db-sync-pqss2" Dec 03 06:59:46 crc kubenswrapper[4475]: I1203 06:59:46.355561 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lggs4\" (UniqueName: \"kubernetes.io/projected/50149f3f-08c3-4fd9-9590-b13fcd787897-kube-api-access-lggs4\") pod \"placement-db-sync-pqss2\" (UID: \"50149f3f-08c3-4fd9-9590-b13fcd787897\") " pod="openstack/placement-db-sync-pqss2" Dec 03 06:59:46 crc kubenswrapper[4475]: I1203 06:59:46.355588 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l67b7\" (UniqueName: \"kubernetes.io/projected/8c7df369-c49c-4d2a-842a-a8bd41944f1b-kube-api-access-l67b7\") pod \"barbican-db-sync-zch8h\" (UID: \"8c7df369-c49c-4d2a-842a-a8bd41944f1b\") " 
pod="openstack/barbican-db-sync-zch8h" Dec 03 06:59:46 crc kubenswrapper[4475]: I1203 06:59:46.355613 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/50149f3f-08c3-4fd9-9590-b13fcd787897-scripts\") pod \"placement-db-sync-pqss2\" (UID: \"50149f3f-08c3-4fd9-9590-b13fcd787897\") " pod="openstack/placement-db-sync-pqss2" Dec 03 06:59:46 crc kubenswrapper[4475]: I1203 06:59:46.355628 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/50149f3f-08c3-4fd9-9590-b13fcd787897-logs\") pod \"placement-db-sync-pqss2\" (UID: \"50149f3f-08c3-4fd9-9590-b13fcd787897\") " pod="openstack/placement-db-sync-pqss2" Dec 03 06:59:46 crc kubenswrapper[4475]: I1203 06:59:46.355682 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/50149f3f-08c3-4fd9-9590-b13fcd787897-config-data\") pod \"placement-db-sync-pqss2\" (UID: \"50149f3f-08c3-4fd9-9590-b13fcd787897\") " pod="openstack/placement-db-sync-pqss2" Dec 03 06:59:46 crc kubenswrapper[4475]: I1203 06:59:46.355702 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/8c7df369-c49c-4d2a-842a-a8bd41944f1b-db-sync-config-data\") pod \"barbican-db-sync-zch8h\" (UID: \"8c7df369-c49c-4d2a-842a-a8bd41944f1b\") " pod="openstack/barbican-db-sync-zch8h" Dec 03 06:59:46 crc kubenswrapper[4475]: I1203 06:59:46.366024 4475 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-bc7f9874c-qp4nh"] Dec 03 06:59:46 crc kubenswrapper[4475]: I1203 06:59:46.368704 4475 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-db-sync-hk2ck"] Dec 03 06:59:46 crc kubenswrapper[4475]: I1203 06:59:46.369691 4475 util.go:30] "No sandbox 
for pod can be found. Need to start a new one" pod="openstack/neutron-db-sync-hk2ck" Dec 03 06:59:46 crc kubenswrapper[4475]: I1203 06:59:46.375150 4475 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-httpd-config" Dec 03 06:59:46 crc kubenswrapper[4475]: I1203 06:59:46.375371 4475 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-neutron-dockercfg-9fx2m" Dec 03 06:59:46 crc kubenswrapper[4475]: I1203 06:59:46.377589 4475 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-sync-zch8h"] Dec 03 06:59:46 crc kubenswrapper[4475]: I1203 06:59:46.375790 4475 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-config" Dec 03 06:59:46 crc kubenswrapper[4475]: I1203 06:59:46.404569 4475 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-dd8b5f859-x5jv6" Dec 03 06:59:46 crc kubenswrapper[4475]: I1203 06:59:46.404981 4475 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-57dbb9b85f-9wjp5"] Dec 03 06:59:46 crc kubenswrapper[4475]: I1203 06:59:46.406180 4475 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-57dbb9b85f-9wjp5" Dec 03 06:59:46 crc kubenswrapper[4475]: I1203 06:59:46.416015 4475 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-sync-hk2ck"] Dec 03 06:59:46 crc kubenswrapper[4475]: I1203 06:59:46.425512 4475 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-external-api-0"] Dec 03 06:59:46 crc kubenswrapper[4475]: I1203 06:59:46.451013 4475 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Dec 03 06:59:46 crc kubenswrapper[4475]: I1203 06:59:46.459513 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hs7h6\" (UniqueName: \"kubernetes.io/projected/26e8d9b9-9ab9-428f-9b0c-78a50bd71e7a-kube-api-access-hs7h6\") pod \"neutron-db-sync-hk2ck\" (UID: \"26e8d9b9-9ab9-428f-9b0c-78a50bd71e7a\") " pod="openstack/neutron-db-sync-hk2ck" Dec 03 06:59:46 crc kubenswrapper[4475]: I1203 06:59:46.459570 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lggs4\" (UniqueName: \"kubernetes.io/projected/50149f3f-08c3-4fd9-9590-b13fcd787897-kube-api-access-lggs4\") pod \"placement-db-sync-pqss2\" (UID: \"50149f3f-08c3-4fd9-9590-b13fcd787897\") " pod="openstack/placement-db-sync-pqss2" Dec 03 06:59:46 crc kubenswrapper[4475]: I1203 06:59:46.459608 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/26e8d9b9-9ab9-428f-9b0c-78a50bd71e7a-config\") pod \"neutron-db-sync-hk2ck\" (UID: \"26e8d9b9-9ab9-428f-9b0c-78a50bd71e7a\") " pod="openstack/neutron-db-sync-hk2ck" Dec 03 06:59:46 crc kubenswrapper[4475]: I1203 06:59:46.459629 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-l67b7\" (UniqueName: \"kubernetes.io/projected/8c7df369-c49c-4d2a-842a-a8bd41944f1b-kube-api-access-l67b7\") pod \"barbican-db-sync-zch8h\" (UID: \"8c7df369-c49c-4d2a-842a-a8bd41944f1b\") " pod="openstack/barbican-db-sync-zch8h" Dec 03 06:59:46 crc kubenswrapper[4475]: I1203 06:59:46.459666 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/50149f3f-08c3-4fd9-9590-b13fcd787897-scripts\") pod \"placement-db-sync-pqss2\" (UID: \"50149f3f-08c3-4fd9-9590-b13fcd787897\") " pod="openstack/placement-db-sync-pqss2" 
Dec 03 06:59:46 crc kubenswrapper[4475]: I1203 06:59:46.459680 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/50149f3f-08c3-4fd9-9590-b13fcd787897-logs\") pod \"placement-db-sync-pqss2\" (UID: \"50149f3f-08c3-4fd9-9590-b13fcd787897\") " pod="openstack/placement-db-sync-pqss2" Dec 03 06:59:46 crc kubenswrapper[4475]: I1203 06:59:46.459729 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/26e8d9b9-9ab9-428f-9b0c-78a50bd71e7a-combined-ca-bundle\") pod \"neutron-db-sync-hk2ck\" (UID: \"26e8d9b9-9ab9-428f-9b0c-78a50bd71e7a\") " pod="openstack/neutron-db-sync-hk2ck" Dec 03 06:59:46 crc kubenswrapper[4475]: I1203 06:59:46.459771 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/50149f3f-08c3-4fd9-9590-b13fcd787897-config-data\") pod \"placement-db-sync-pqss2\" (UID: \"50149f3f-08c3-4fd9-9590-b13fcd787897\") " pod="openstack/placement-db-sync-pqss2" Dec 03 06:59:46 crc kubenswrapper[4475]: I1203 06:59:46.459799 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/8c7df369-c49c-4d2a-842a-a8bd41944f1b-db-sync-config-data\") pod \"barbican-db-sync-zch8h\" (UID: \"8c7df369-c49c-4d2a-842a-a8bd41944f1b\") " pod="openstack/barbican-db-sync-zch8h" Dec 03 06:59:46 crc kubenswrapper[4475]: I1203 06:59:46.459853 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8c7df369-c49c-4d2a-842a-a8bd41944f1b-combined-ca-bundle\") pod \"barbican-db-sync-zch8h\" (UID: \"8c7df369-c49c-4d2a-842a-a8bd41944f1b\") " pod="openstack/barbican-db-sync-zch8h" Dec 03 06:59:46 crc kubenswrapper[4475]: I1203 06:59:46.459915 4475 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/50149f3f-08c3-4fd9-9590-b13fcd787897-combined-ca-bundle\") pod \"placement-db-sync-pqss2\" (UID: \"50149f3f-08c3-4fd9-9590-b13fcd787897\") " pod="openstack/placement-db-sync-pqss2" Dec 03 06:59:46 crc kubenswrapper[4475]: I1203 06:59:46.462385 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/50149f3f-08c3-4fd9-9590-b13fcd787897-logs\") pod \"placement-db-sync-pqss2\" (UID: \"50149f3f-08c3-4fd9-9590-b13fcd787897\") " pod="openstack/placement-db-sync-pqss2" Dec 03 06:59:46 crc kubenswrapper[4475]: I1203 06:59:46.463499 4475 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-scripts" Dec 03 06:59:46 crc kubenswrapper[4475]: I1203 06:59:46.468759 4475 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-public-svc" Dec 03 06:59:46 crc kubenswrapper[4475]: I1203 06:59:46.468962 4475 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-external-config-data" Dec 03 06:59:46 crc kubenswrapper[4475]: I1203 06:59:46.469158 4475 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-glance-dockercfg-wkbtp" Dec 03 06:59:46 crc kubenswrapper[4475]: I1203 06:59:46.474506 4475 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-57dbb9b85f-9wjp5"] Dec 03 06:59:46 crc kubenswrapper[4475]: I1203 06:59:46.475105 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/50149f3f-08c3-4fd9-9590-b13fcd787897-config-data\") pod \"placement-db-sync-pqss2\" (UID: \"50149f3f-08c3-4fd9-9590-b13fcd787897\") " pod="openstack/placement-db-sync-pqss2" Dec 03 06:59:46 crc kubenswrapper[4475]: I1203 06:59:46.475230 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/8c7df369-c49c-4d2a-842a-a8bd41944f1b-db-sync-config-data\") pod \"barbican-db-sync-zch8h\" (UID: \"8c7df369-c49c-4d2a-842a-a8bd41944f1b\") " pod="openstack/barbican-db-sync-zch8h" Dec 03 06:59:46 crc kubenswrapper[4475]: I1203 06:59:46.475229 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8c7df369-c49c-4d2a-842a-a8bd41944f1b-combined-ca-bundle\") pod \"barbican-db-sync-zch8h\" (UID: \"8c7df369-c49c-4d2a-842a-a8bd41944f1b\") " pod="openstack/barbican-db-sync-zch8h" Dec 03 06:59:46 crc kubenswrapper[4475]: I1203 06:59:46.476419 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/50149f3f-08c3-4fd9-9590-b13fcd787897-combined-ca-bundle\") pod \"placement-db-sync-pqss2\" (UID: \"50149f3f-08c3-4fd9-9590-b13fcd787897\") " pod="openstack/placement-db-sync-pqss2" Dec 03 06:59:46 crc kubenswrapper[4475]: I1203 06:59:46.481307 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/50149f3f-08c3-4fd9-9590-b13fcd787897-scripts\") pod \"placement-db-sync-pqss2\" (UID: \"50149f3f-08c3-4fd9-9590-b13fcd787897\") " pod="openstack/placement-db-sync-pqss2" Dec 03 06:59:46 crc kubenswrapper[4475]: I1203 06:59:46.572074 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d5gz2\" (UniqueName: \"kubernetes.io/projected/65d0b806-aeac-4b82-9792-32e5b25e3c3e-kube-api-access-d5gz2\") pod \"dnsmasq-dns-57dbb9b85f-9wjp5\" (UID: \"65d0b806-aeac-4b82-9792-32e5b25e3c3e\") " pod="openstack/dnsmasq-dns-57dbb9b85f-9wjp5" Dec 03 06:59:46 crc kubenswrapper[4475]: I1203 06:59:46.574514 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lggs4\" (UniqueName: 
\"kubernetes.io/projected/50149f3f-08c3-4fd9-9590-b13fcd787897-kube-api-access-lggs4\") pod \"placement-db-sync-pqss2\" (UID: \"50149f3f-08c3-4fd9-9590-b13fcd787897\") " pod="openstack/placement-db-sync-pqss2" Dec 03 06:59:46 crc kubenswrapper[4475]: I1203 06:59:46.594574 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-l67b7\" (UniqueName: \"kubernetes.io/projected/8c7df369-c49c-4d2a-842a-a8bd41944f1b-kube-api-access-l67b7\") pod \"barbican-db-sync-zch8h\" (UID: \"8c7df369-c49c-4d2a-842a-a8bd41944f1b\") " pod="openstack/barbican-db-sync-zch8h" Dec 03 06:59:46 crc kubenswrapper[4475]: I1203 06:59:46.617352 4475 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Dec 03 06:59:46 crc kubenswrapper[4475]: I1203 06:59:46.638157 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/65d0b806-aeac-4b82-9792-32e5b25e3c3e-dns-svc\") pod \"dnsmasq-dns-57dbb9b85f-9wjp5\" (UID: \"65d0b806-aeac-4b82-9792-32e5b25e3c3e\") " pod="openstack/dnsmasq-dns-57dbb9b85f-9wjp5" Dec 03 06:59:46 crc kubenswrapper[4475]: I1203 06:59:46.638199 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/65d0b806-aeac-4b82-9792-32e5b25e3c3e-ovsdbserver-sb\") pod \"dnsmasq-dns-57dbb9b85f-9wjp5\" (UID: \"65d0b806-aeac-4b82-9792-32e5b25e3c3e\") " pod="openstack/dnsmasq-dns-57dbb9b85f-9wjp5" Dec 03 06:59:46 crc kubenswrapper[4475]: I1203 06:59:46.638282 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/65d0b806-aeac-4b82-9792-32e5b25e3c3e-dns-swift-storage-0\") pod \"dnsmasq-dns-57dbb9b85f-9wjp5\" (UID: \"65d0b806-aeac-4b82-9792-32e5b25e3c3e\") " pod="openstack/dnsmasq-dns-57dbb9b85f-9wjp5" Dec 03 06:59:46 crc 
kubenswrapper[4475]: I1203 06:59:46.638361 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hs7h6\" (UniqueName: \"kubernetes.io/projected/26e8d9b9-9ab9-428f-9b0c-78a50bd71e7a-kube-api-access-hs7h6\") pod \"neutron-db-sync-hk2ck\" (UID: \"26e8d9b9-9ab9-428f-9b0c-78a50bd71e7a\") " pod="openstack/neutron-db-sync-hk2ck" Dec 03 06:59:46 crc kubenswrapper[4475]: I1203 06:59:46.638426 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/26e8d9b9-9ab9-428f-9b0c-78a50bd71e7a-config\") pod \"neutron-db-sync-hk2ck\" (UID: \"26e8d9b9-9ab9-428f-9b0c-78a50bd71e7a\") " pod="openstack/neutron-db-sync-hk2ck" Dec 03 06:59:46 crc kubenswrapper[4475]: I1203 06:59:46.638506 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/65d0b806-aeac-4b82-9792-32e5b25e3c3e-ovsdbserver-nb\") pod \"dnsmasq-dns-57dbb9b85f-9wjp5\" (UID: \"65d0b806-aeac-4b82-9792-32e5b25e3c3e\") " pod="openstack/dnsmasq-dns-57dbb9b85f-9wjp5" Dec 03 06:59:46 crc kubenswrapper[4475]: I1203 06:59:46.638576 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/26e8d9b9-9ab9-428f-9b0c-78a50bd71e7a-combined-ca-bundle\") pod \"neutron-db-sync-hk2ck\" (UID: \"26e8d9b9-9ab9-428f-9b0c-78a50bd71e7a\") " pod="openstack/neutron-db-sync-hk2ck" Dec 03 06:59:46 crc kubenswrapper[4475]: I1203 06:59:46.638592 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/65d0b806-aeac-4b82-9792-32e5b25e3c3e-config\") pod \"dnsmasq-dns-57dbb9b85f-9wjp5\" (UID: \"65d0b806-aeac-4b82-9792-32e5b25e3c3e\") " pod="openstack/dnsmasq-dns-57dbb9b85f-9wjp5" Dec 03 06:59:46 crc kubenswrapper[4475]: I1203 06:59:46.663034 4475 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/26e8d9b9-9ab9-428f-9b0c-78a50bd71e7a-config\") pod \"neutron-db-sync-hk2ck\" (UID: \"26e8d9b9-9ab9-428f-9b0c-78a50bd71e7a\") " pod="openstack/neutron-db-sync-hk2ck" Dec 03 06:59:46 crc kubenswrapper[4475]: I1203 06:59:46.667288 4475 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5f67cb5455-r956l" Dec 03 06:59:46 crc kubenswrapper[4475]: I1203 06:59:46.667544 4475 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-lszxw" Dec 03 06:59:46 crc kubenswrapper[4475]: I1203 06:59:46.670410 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/26e8d9b9-9ab9-428f-9b0c-78a50bd71e7a-combined-ca-bundle\") pod \"neutron-db-sync-hk2ck\" (UID: \"26e8d9b9-9ab9-428f-9b0c-78a50bd71e7a\") " pod="openstack/neutron-db-sync-hk2ck" Dec 03 06:59:46 crc kubenswrapper[4475]: I1203 06:59:46.675527 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hs7h6\" (UniqueName: \"kubernetes.io/projected/26e8d9b9-9ab9-428f-9b0c-78a50bd71e7a-kube-api-access-hs7h6\") pod \"neutron-db-sync-hk2ck\" (UID: \"26e8d9b9-9ab9-428f-9b0c-78a50bd71e7a\") " pod="openstack/neutron-db-sync-hk2ck" Dec 03 06:59:46 crc kubenswrapper[4475]: I1203 06:59:46.709708 4475 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-db-sync-pqss2" Dec 03 06:59:46 crc kubenswrapper[4475]: I1203 06:59:46.741763 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/65d0b806-aeac-4b82-9792-32e5b25e3c3e-dns-swift-storage-0\") pod \"dnsmasq-dns-57dbb9b85f-9wjp5\" (UID: \"65d0b806-aeac-4b82-9792-32e5b25e3c3e\") " pod="openstack/dnsmasq-dns-57dbb9b85f-9wjp5" Dec 03 06:59:46 crc kubenswrapper[4475]: I1203 06:59:46.742090 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/69bddfea-c4e4-4776-a5e1-2150121b98a4-scripts\") pod \"glance-default-external-api-0\" (UID: \"69bddfea-c4e4-4776-a5e1-2150121b98a4\") " pod="openstack/glance-default-external-api-0" Dec 03 06:59:46 crc kubenswrapper[4475]: I1203 06:59:46.742154 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/69bddfea-c4e4-4776-a5e1-2150121b98a4-logs\") pod \"glance-default-external-api-0\" (UID: \"69bddfea-c4e4-4776-a5e1-2150121b98a4\") " pod="openstack/glance-default-external-api-0" Dec 03 06:59:46 crc kubenswrapper[4475]: I1203 06:59:46.742212 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/69bddfea-c4e4-4776-a5e1-2150121b98a4-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"69bddfea-c4e4-4776-a5e1-2150121b98a4\") " pod="openstack/glance-default-external-api-0" Dec 03 06:59:46 crc kubenswrapper[4475]: I1203 06:59:46.742234 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/65d0b806-aeac-4b82-9792-32e5b25e3c3e-ovsdbserver-nb\") pod \"dnsmasq-dns-57dbb9b85f-9wjp5\" (UID: 
\"65d0b806-aeac-4b82-9792-32e5b25e3c3e\") " pod="openstack/dnsmasq-dns-57dbb9b85f-9wjp5" Dec 03 06:59:46 crc kubenswrapper[4475]: I1203 06:59:46.742281 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"glance-default-external-api-0\" (UID: \"69bddfea-c4e4-4776-a5e1-2150121b98a4\") " pod="openstack/glance-default-external-api-0" Dec 03 06:59:46 crc kubenswrapper[4475]: I1203 06:59:46.742325 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/69bddfea-c4e4-4776-a5e1-2150121b98a4-config-data\") pod \"glance-default-external-api-0\" (UID: \"69bddfea-c4e4-4776-a5e1-2150121b98a4\") " pod="openstack/glance-default-external-api-0" Dec 03 06:59:46 crc kubenswrapper[4475]: I1203 06:59:46.742371 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/65d0b806-aeac-4b82-9792-32e5b25e3c3e-config\") pod \"dnsmasq-dns-57dbb9b85f-9wjp5\" (UID: \"65d0b806-aeac-4b82-9792-32e5b25e3c3e\") " pod="openstack/dnsmasq-dns-57dbb9b85f-9wjp5" Dec 03 06:59:46 crc kubenswrapper[4475]: I1203 06:59:46.742720 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-d5gz2\" (UniqueName: \"kubernetes.io/projected/65d0b806-aeac-4b82-9792-32e5b25e3c3e-kube-api-access-d5gz2\") pod \"dnsmasq-dns-57dbb9b85f-9wjp5\" (UID: \"65d0b806-aeac-4b82-9792-32e5b25e3c3e\") " pod="openstack/dnsmasq-dns-57dbb9b85f-9wjp5" Dec 03 06:59:46 crc kubenswrapper[4475]: I1203 06:59:46.742793 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/69bddfea-c4e4-4776-a5e1-2150121b98a4-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: 
\"69bddfea-c4e4-4776-a5e1-2150121b98a4\") " pod="openstack/glance-default-external-api-0" Dec 03 06:59:46 crc kubenswrapper[4475]: I1203 06:59:46.742843 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lzgbw\" (UniqueName: \"kubernetes.io/projected/69bddfea-c4e4-4776-a5e1-2150121b98a4-kube-api-access-lzgbw\") pod \"glance-default-external-api-0\" (UID: \"69bddfea-c4e4-4776-a5e1-2150121b98a4\") " pod="openstack/glance-default-external-api-0" Dec 03 06:59:46 crc kubenswrapper[4475]: I1203 06:59:46.742895 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/69bddfea-c4e4-4776-a5e1-2150121b98a4-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"69bddfea-c4e4-4776-a5e1-2150121b98a4\") " pod="openstack/glance-default-external-api-0" Dec 03 06:59:46 crc kubenswrapper[4475]: I1203 06:59:46.742946 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/65d0b806-aeac-4b82-9792-32e5b25e3c3e-dns-svc\") pod \"dnsmasq-dns-57dbb9b85f-9wjp5\" (UID: \"65d0b806-aeac-4b82-9792-32e5b25e3c3e\") " pod="openstack/dnsmasq-dns-57dbb9b85f-9wjp5" Dec 03 06:59:46 crc kubenswrapper[4475]: I1203 06:59:46.742972 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/65d0b806-aeac-4b82-9792-32e5b25e3c3e-ovsdbserver-sb\") pod \"dnsmasq-dns-57dbb9b85f-9wjp5\" (UID: \"65d0b806-aeac-4b82-9792-32e5b25e3c3e\") " pod="openstack/dnsmasq-dns-57dbb9b85f-9wjp5" Dec 03 06:59:46 crc kubenswrapper[4475]: I1203 06:59:46.743813 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/65d0b806-aeac-4b82-9792-32e5b25e3c3e-ovsdbserver-sb\") pod \"dnsmasq-dns-57dbb9b85f-9wjp5\" (UID: 
\"65d0b806-aeac-4b82-9792-32e5b25e3c3e\") " pod="openstack/dnsmasq-dns-57dbb9b85f-9wjp5" Dec 03 06:59:46 crc kubenswrapper[4475]: I1203 06:59:46.744336 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/65d0b806-aeac-4b82-9792-32e5b25e3c3e-dns-swift-storage-0\") pod \"dnsmasq-dns-57dbb9b85f-9wjp5\" (UID: \"65d0b806-aeac-4b82-9792-32e5b25e3c3e\") " pod="openstack/dnsmasq-dns-57dbb9b85f-9wjp5" Dec 03 06:59:46 crc kubenswrapper[4475]: I1203 06:59:46.745774 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/65d0b806-aeac-4b82-9792-32e5b25e3c3e-config\") pod \"dnsmasq-dns-57dbb9b85f-9wjp5\" (UID: \"65d0b806-aeac-4b82-9792-32e5b25e3c3e\") " pod="openstack/dnsmasq-dns-57dbb9b85f-9wjp5" Dec 03 06:59:46 crc kubenswrapper[4475]: I1203 06:59:46.746381 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/65d0b806-aeac-4b82-9792-32e5b25e3c3e-dns-svc\") pod \"dnsmasq-dns-57dbb9b85f-9wjp5\" (UID: \"65d0b806-aeac-4b82-9792-32e5b25e3c3e\") " pod="openstack/dnsmasq-dns-57dbb9b85f-9wjp5" Dec 03 06:59:46 crc kubenswrapper[4475]: I1203 06:59:46.753626 4475 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-db-sync-zch8h" Dec 03 06:59:46 crc kubenswrapper[4475]: I1203 06:59:46.767181 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/65d0b806-aeac-4b82-9792-32e5b25e3c3e-ovsdbserver-nb\") pod \"dnsmasq-dns-57dbb9b85f-9wjp5\" (UID: \"65d0b806-aeac-4b82-9792-32e5b25e3c3e\") " pod="openstack/dnsmasq-dns-57dbb9b85f-9wjp5" Dec 03 06:59:46 crc kubenswrapper[4475]: I1203 06:59:46.778147 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-d5gz2\" (UniqueName: \"kubernetes.io/projected/65d0b806-aeac-4b82-9792-32e5b25e3c3e-kube-api-access-d5gz2\") pod \"dnsmasq-dns-57dbb9b85f-9wjp5\" (UID: \"65d0b806-aeac-4b82-9792-32e5b25e3c3e\") " pod="openstack/dnsmasq-dns-57dbb9b85f-9wjp5" Dec 03 06:59:46 crc kubenswrapper[4475]: I1203 06:59:46.784969 4475 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-sync-hk2ck" Dec 03 06:59:46 crc kubenswrapper[4475]: I1203 06:59:46.838493 4475 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-internal-api-0"] Dec 03 06:59:46 crc kubenswrapper[4475]: E1203 06:59:46.838865 4475 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="20f60917-1d39-46b7-a37a-d33efdb6de0f" containerName="init" Dec 03 06:59:46 crc kubenswrapper[4475]: I1203 06:59:46.838885 4475 state_mem.go:107] "Deleted CPUSet assignment" podUID="20f60917-1d39-46b7-a37a-d33efdb6de0f" containerName="init" Dec 03 06:59:46 crc kubenswrapper[4475]: E1203 06:59:46.838896 4475 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d3affe8e-b674-46c4-bf4b-b6fc0d092df7" containerName="registry-server" Dec 03 06:59:46 crc kubenswrapper[4475]: I1203 06:59:46.838902 4475 state_mem.go:107] "Deleted CPUSet assignment" podUID="d3affe8e-b674-46c4-bf4b-b6fc0d092df7" containerName="registry-server" Dec 03 06:59:46 crc kubenswrapper[4475]: E1203 
06:59:46.838925 4475 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d3affe8e-b674-46c4-bf4b-b6fc0d092df7" containerName="extract-content" Dec 03 06:59:46 crc kubenswrapper[4475]: I1203 06:59:46.838952 4475 state_mem.go:107] "Deleted CPUSet assignment" podUID="d3affe8e-b674-46c4-bf4b-b6fc0d092df7" containerName="extract-content" Dec 03 06:59:46 crc kubenswrapper[4475]: E1203 06:59:46.838964 4475 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="20f60917-1d39-46b7-a37a-d33efdb6de0f" containerName="dnsmasq-dns" Dec 03 06:59:46 crc kubenswrapper[4475]: I1203 06:59:46.838969 4475 state_mem.go:107] "Deleted CPUSet assignment" podUID="20f60917-1d39-46b7-a37a-d33efdb6de0f" containerName="dnsmasq-dns" Dec 03 06:59:46 crc kubenswrapper[4475]: E1203 06:59:46.838980 4475 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d3affe8e-b674-46c4-bf4b-b6fc0d092df7" containerName="extract-utilities" Dec 03 06:59:46 crc kubenswrapper[4475]: I1203 06:59:46.838986 4475 state_mem.go:107] "Deleted CPUSet assignment" podUID="d3affe8e-b674-46c4-bf4b-b6fc0d092df7" containerName="extract-utilities" Dec 03 06:59:46 crc kubenswrapper[4475]: I1203 06:59:46.839155 4475 memory_manager.go:354] "RemoveStaleState removing state" podUID="20f60917-1d39-46b7-a37a-d33efdb6de0f" containerName="dnsmasq-dns" Dec 03 06:59:46 crc kubenswrapper[4475]: I1203 06:59:46.839181 4475 memory_manager.go:354] "RemoveStaleState removing state" podUID="d3affe8e-b674-46c4-bf4b-b6fc0d092df7" containerName="registry-server" Dec 03 06:59:46 crc kubenswrapper[4475]: I1203 06:59:46.840024 4475 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Dec 03 06:59:46 crc kubenswrapper[4475]: I1203 06:59:46.844419 4475 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d3affe8e-b674-46c4-bf4b-b6fc0d092df7-catalog-content\") pod \"d3affe8e-b674-46c4-bf4b-b6fc0d092df7\" (UID: \"d3affe8e-b674-46c4-bf4b-b6fc0d092df7\") " Dec 03 06:59:46 crc kubenswrapper[4475]: I1203 06:59:46.844530 4475 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d3affe8e-b674-46c4-bf4b-b6fc0d092df7-utilities\") pod \"d3affe8e-b674-46c4-bf4b-b6fc0d092df7\" (UID: \"d3affe8e-b674-46c4-bf4b-b6fc0d092df7\") " Dec 03 06:59:46 crc kubenswrapper[4475]: I1203 06:59:46.844549 4475 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-m2m89\" (UniqueName: \"kubernetes.io/projected/20f60917-1d39-46b7-a37a-d33efdb6de0f-kube-api-access-m2m89\") pod \"20f60917-1d39-46b7-a37a-d33efdb6de0f\" (UID: \"20f60917-1d39-46b7-a37a-d33efdb6de0f\") " Dec 03 06:59:46 crc kubenswrapper[4475]: I1203 06:59:46.844615 4475 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/20f60917-1d39-46b7-a37a-d33efdb6de0f-ovsdbserver-sb\") pod \"20f60917-1d39-46b7-a37a-d33efdb6de0f\" (UID: \"20f60917-1d39-46b7-a37a-d33efdb6de0f\") " Dec 03 06:59:46 crc kubenswrapper[4475]: I1203 06:59:46.844665 4475 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/20f60917-1d39-46b7-a37a-d33efdb6de0f-dns-swift-storage-0\") pod \"20f60917-1d39-46b7-a37a-d33efdb6de0f\" (UID: \"20f60917-1d39-46b7-a37a-d33efdb6de0f\") " Dec 03 06:59:46 crc kubenswrapper[4475]: I1203 06:59:46.844690 4475 reconciler_common.go:159] "operationExecutor.UnmountVolume started for 
volume \"kube-api-access-7whtk\" (UniqueName: \"kubernetes.io/projected/d3affe8e-b674-46c4-bf4b-b6fc0d092df7-kube-api-access-7whtk\") pod \"d3affe8e-b674-46c4-bf4b-b6fc0d092df7\" (UID: \"d3affe8e-b674-46c4-bf4b-b6fc0d092df7\") " Dec 03 06:59:46 crc kubenswrapper[4475]: I1203 06:59:46.844718 4475 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/20f60917-1d39-46b7-a37a-d33efdb6de0f-config\") pod \"20f60917-1d39-46b7-a37a-d33efdb6de0f\" (UID: \"20f60917-1d39-46b7-a37a-d33efdb6de0f\") " Dec 03 06:59:46 crc kubenswrapper[4475]: I1203 06:59:46.844751 4475 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/20f60917-1d39-46b7-a37a-d33efdb6de0f-dns-svc\") pod \"20f60917-1d39-46b7-a37a-d33efdb6de0f\" (UID: \"20f60917-1d39-46b7-a37a-d33efdb6de0f\") " Dec 03 06:59:46 crc kubenswrapper[4475]: I1203 06:59:46.844771 4475 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/20f60917-1d39-46b7-a37a-d33efdb6de0f-ovsdbserver-nb\") pod \"20f60917-1d39-46b7-a37a-d33efdb6de0f\" (UID: \"20f60917-1d39-46b7-a37a-d33efdb6de0f\") " Dec 03 06:59:46 crc kubenswrapper[4475]: I1203 06:59:46.845088 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/69bddfea-c4e4-4776-a5e1-2150121b98a4-scripts\") pod \"glance-default-external-api-0\" (UID: \"69bddfea-c4e4-4776-a5e1-2150121b98a4\") " pod="openstack/glance-default-external-api-0" Dec 03 06:59:46 crc kubenswrapper[4475]: I1203 06:59:46.845121 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/69bddfea-c4e4-4776-a5e1-2150121b98a4-logs\") pod \"glance-default-external-api-0\" (UID: \"69bddfea-c4e4-4776-a5e1-2150121b98a4\") " 
pod="openstack/glance-default-external-api-0" Dec 03 06:59:46 crc kubenswrapper[4475]: I1203 06:59:46.845175 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/69bddfea-c4e4-4776-a5e1-2150121b98a4-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"69bddfea-c4e4-4776-a5e1-2150121b98a4\") " pod="openstack/glance-default-external-api-0" Dec 03 06:59:46 crc kubenswrapper[4475]: I1203 06:59:46.845200 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"glance-default-external-api-0\" (UID: \"69bddfea-c4e4-4776-a5e1-2150121b98a4\") " pod="openstack/glance-default-external-api-0" Dec 03 06:59:46 crc kubenswrapper[4475]: I1203 06:59:46.845220 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/69bddfea-c4e4-4776-a5e1-2150121b98a4-config-data\") pod \"glance-default-external-api-0\" (UID: \"69bddfea-c4e4-4776-a5e1-2150121b98a4\") " pod="openstack/glance-default-external-api-0" Dec 03 06:59:46 crc kubenswrapper[4475]: I1203 06:59:46.845306 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/69bddfea-c4e4-4776-a5e1-2150121b98a4-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"69bddfea-c4e4-4776-a5e1-2150121b98a4\") " pod="openstack/glance-default-external-api-0" Dec 03 06:59:46 crc kubenswrapper[4475]: I1203 06:59:46.845345 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lzgbw\" (UniqueName: \"kubernetes.io/projected/69bddfea-c4e4-4776-a5e1-2150121b98a4-kube-api-access-lzgbw\") pod \"glance-default-external-api-0\" (UID: \"69bddfea-c4e4-4776-a5e1-2150121b98a4\") " pod="openstack/glance-default-external-api-0" Dec 03 
06:59:46 crc kubenswrapper[4475]: I1203 06:59:46.845382 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/69bddfea-c4e4-4776-a5e1-2150121b98a4-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"69bddfea-c4e4-4776-a5e1-2150121b98a4\") " pod="openstack/glance-default-external-api-0" Dec 03 06:59:46 crc kubenswrapper[4475]: I1203 06:59:46.845856 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/69bddfea-c4e4-4776-a5e1-2150121b98a4-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"69bddfea-c4e4-4776-a5e1-2150121b98a4\") " pod="openstack/glance-default-external-api-0" Dec 03 06:59:46 crc kubenswrapper[4475]: I1203 06:59:46.854953 4475 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-internal-config-data" Dec 03 06:59:46 crc kubenswrapper[4475]: I1203 06:59:46.855146 4475 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-internal-svc" Dec 03 06:59:46 crc kubenswrapper[4475]: I1203 06:59:46.874271 4475 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-57dbb9b85f-9wjp5" Dec 03 06:59:46 crc kubenswrapper[4475]: I1203 06:59:46.889023 4475 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d3affe8e-b674-46c4-bf4b-b6fc0d092df7-utilities" (OuterVolumeSpecName: "utilities") pod "d3affe8e-b674-46c4-bf4b-b6fc0d092df7" (UID: "d3affe8e-b674-46c4-bf4b-b6fc0d092df7"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 06:59:46 crc kubenswrapper[4475]: I1203 06:59:46.891408 4475 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/20f60917-1d39-46b7-a37a-d33efdb6de0f-kube-api-access-m2m89" (OuterVolumeSpecName: "kube-api-access-m2m89") pod "20f60917-1d39-46b7-a37a-d33efdb6de0f" (UID: "20f60917-1d39-46b7-a37a-d33efdb6de0f"). InnerVolumeSpecName "kube-api-access-m2m89". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 06:59:46 crc kubenswrapper[4475]: I1203 06:59:46.902599 4475 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"glance-default-external-api-0\" (UID: \"69bddfea-c4e4-4776-a5e1-2150121b98a4\") device mount path \"/mnt/openstack/pv05\"" pod="openstack/glance-default-external-api-0" Dec 03 06:59:46 crc kubenswrapper[4475]: I1203 06:59:46.907297 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/69bddfea-c4e4-4776-a5e1-2150121b98a4-logs\") pod \"glance-default-external-api-0\" (UID: \"69bddfea-c4e4-4776-a5e1-2150121b98a4\") " pod="openstack/glance-default-external-api-0" Dec 03 06:59:46 crc kubenswrapper[4475]: I1203 06:59:46.941615 4475 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d3affe8e-b674-46c4-bf4b-b6fc0d092df7-kube-api-access-7whtk" (OuterVolumeSpecName: "kube-api-access-7whtk") pod "d3affe8e-b674-46c4-bf4b-b6fc0d092df7" (UID: "d3affe8e-b674-46c4-bf4b-b6fc0d092df7"). InnerVolumeSpecName "kube-api-access-7whtk". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 06:59:46 crc kubenswrapper[4475]: I1203 06:59:46.941974 4475 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Dec 03 06:59:46 crc kubenswrapper[4475]: I1203 06:59:46.948860 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/bd5f5f2c-acd2-49b4-9b5c-5e35d57ed85c-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"bd5f5f2c-acd2-49b4-9b5c-5e35d57ed85c\") " pod="openstack/glance-default-internal-api-0" Dec 03 06:59:46 crc kubenswrapper[4475]: I1203 06:59:46.948945 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bd5f5f2c-acd2-49b4-9b5c-5e35d57ed85c-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"bd5f5f2c-acd2-49b4-9b5c-5e35d57ed85c\") " pod="openstack/glance-default-internal-api-0" Dec 03 06:59:46 crc kubenswrapper[4475]: I1203 06:59:46.948987 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/bd5f5f2c-acd2-49b4-9b5c-5e35d57ed85c-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"bd5f5f2c-acd2-49b4-9b5c-5e35d57ed85c\") " pod="openstack/glance-default-internal-api-0" Dec 03 06:59:46 crc kubenswrapper[4475]: I1203 06:59:46.949067 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bd5f5f2c-acd2-49b4-9b5c-5e35d57ed85c-config-data\") pod \"glance-default-internal-api-0\" (UID: \"bd5f5f2c-acd2-49b4-9b5c-5e35d57ed85c\") " pod="openstack/glance-default-internal-api-0" Dec 03 06:59:46 crc kubenswrapper[4475]: I1203 06:59:46.949113 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"glance-default-internal-api-0\" (UID: \"bd5f5f2c-acd2-49b4-9b5c-5e35d57ed85c\") " pod="openstack/glance-default-internal-api-0" Dec 03 06:59:46 crc kubenswrapper[4475]: I1203 06:59:46.949141 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/bd5f5f2c-acd2-49b4-9b5c-5e35d57ed85c-logs\") pod \"glance-default-internal-api-0\" (UID: \"bd5f5f2c-acd2-49b4-9b5c-5e35d57ed85c\") " pod="openstack/glance-default-internal-api-0" Dec 03 06:59:46 crc kubenswrapper[4475]: I1203 06:59:46.949183 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sbr9g\" (UniqueName: \"kubernetes.io/projected/bd5f5f2c-acd2-49b4-9b5c-5e35d57ed85c-kube-api-access-sbr9g\") pod \"glance-default-internal-api-0\" (UID: \"bd5f5f2c-acd2-49b4-9b5c-5e35d57ed85c\") " pod="openstack/glance-default-internal-api-0" Dec 03 06:59:46 crc kubenswrapper[4475]: I1203 06:59:46.949209 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/bd5f5f2c-acd2-49b4-9b5c-5e35d57ed85c-scripts\") pod \"glance-default-internal-api-0\" (UID: \"bd5f5f2c-acd2-49b4-9b5c-5e35d57ed85c\") " pod="openstack/glance-default-internal-api-0" Dec 03 06:59:46 crc kubenswrapper[4475]: I1203 06:59:46.955743 4475 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d3affe8e-b674-46c4-bf4b-b6fc0d092df7-utilities\") on node \"crc\" DevicePath \"\"" Dec 03 06:59:46 crc kubenswrapper[4475]: I1203 06:59:46.955836 4475 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-m2m89\" (UniqueName: \"kubernetes.io/projected/20f60917-1d39-46b7-a37a-d33efdb6de0f-kube-api-access-m2m89\") on node \"crc\" DevicePath \"\"" Dec 03 
06:59:46 crc kubenswrapper[4475]: I1203 06:59:46.955854 4475 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7whtk\" (UniqueName: \"kubernetes.io/projected/d3affe8e-b674-46c4-bf4b-b6fc0d092df7-kube-api-access-7whtk\") on node \"crc\" DevicePath \"\"" Dec 03 06:59:46 crc kubenswrapper[4475]: I1203 06:59:46.956147 4475 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d3affe8e-b674-46c4-bf4b-b6fc0d092df7-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "d3affe8e-b674-46c4-bf4b-b6fc0d092df7" (UID: "d3affe8e-b674-46c4-bf4b-b6fc0d092df7"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 06:59:46 crc kubenswrapper[4475]: I1203 06:59:46.961670 4475 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-bc7f9874c-qp4nh"] Dec 03 06:59:47 crc kubenswrapper[4475]: I1203 06:59:47.022400 4475 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/20f60917-1d39-46b7-a37a-d33efdb6de0f-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "20f60917-1d39-46b7-a37a-d33efdb6de0f" (UID: "20f60917-1d39-46b7-a37a-d33efdb6de0f"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 06:59:47 crc kubenswrapper[4475]: I1203 06:59:47.022535 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"glance-default-external-api-0\" (UID: \"69bddfea-c4e4-4776-a5e1-2150121b98a4\") " pod="openstack/glance-default-external-api-0" Dec 03 06:59:47 crc kubenswrapper[4475]: I1203 06:59:47.023343 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/69bddfea-c4e4-4776-a5e1-2150121b98a4-scripts\") pod \"glance-default-external-api-0\" (UID: \"69bddfea-c4e4-4776-a5e1-2150121b98a4\") " pod="openstack/glance-default-external-api-0" Dec 03 06:59:47 crc kubenswrapper[4475]: I1203 06:59:47.023779 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/69bddfea-c4e4-4776-a5e1-2150121b98a4-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"69bddfea-c4e4-4776-a5e1-2150121b98a4\") " pod="openstack/glance-default-external-api-0" Dec 03 06:59:47 crc kubenswrapper[4475]: I1203 06:59:47.023997 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/69bddfea-c4e4-4776-a5e1-2150121b98a4-config-data\") pod \"glance-default-external-api-0\" (UID: \"69bddfea-c4e4-4776-a5e1-2150121b98a4\") " pod="openstack/glance-default-external-api-0" Dec 03 06:59:47 crc kubenswrapper[4475]: I1203 06:59:47.027619 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/69bddfea-c4e4-4776-a5e1-2150121b98a4-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"69bddfea-c4e4-4776-a5e1-2150121b98a4\") " pod="openstack/glance-default-external-api-0" Dec 03 06:59:47 crc kubenswrapper[4475]: I1203 06:59:47.058149 4475 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/bd5f5f2c-acd2-49b4-9b5c-5e35d57ed85c-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"bd5f5f2c-acd2-49b4-9b5c-5e35d57ed85c\") " pod="openstack/glance-default-internal-api-0" Dec 03 06:59:47 crc kubenswrapper[4475]: I1203 06:59:47.058424 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bd5f5f2c-acd2-49b4-9b5c-5e35d57ed85c-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"bd5f5f2c-acd2-49b4-9b5c-5e35d57ed85c\") " pod="openstack/glance-default-internal-api-0" Dec 03 06:59:47 crc kubenswrapper[4475]: I1203 06:59:47.058544 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/bd5f5f2c-acd2-49b4-9b5c-5e35d57ed85c-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"bd5f5f2c-acd2-49b4-9b5c-5e35d57ed85c\") " pod="openstack/glance-default-internal-api-0" Dec 03 06:59:47 crc kubenswrapper[4475]: I1203 06:59:47.058637 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bd5f5f2c-acd2-49b4-9b5c-5e35d57ed85c-config-data\") pod \"glance-default-internal-api-0\" (UID: \"bd5f5f2c-acd2-49b4-9b5c-5e35d57ed85c\") " pod="openstack/glance-default-internal-api-0" Dec 03 06:59:47 crc kubenswrapper[4475]: I1203 06:59:47.058710 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"glance-default-internal-api-0\" (UID: \"bd5f5f2c-acd2-49b4-9b5c-5e35d57ed85c\") " pod="openstack/glance-default-internal-api-0" Dec 03 06:59:47 crc kubenswrapper[4475]: I1203 06:59:47.058783 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"logs\" (UniqueName: \"kubernetes.io/empty-dir/bd5f5f2c-acd2-49b4-9b5c-5e35d57ed85c-logs\") pod \"glance-default-internal-api-0\" (UID: \"bd5f5f2c-acd2-49b4-9b5c-5e35d57ed85c\") " pod="openstack/glance-default-internal-api-0" Dec 03 06:59:47 crc kubenswrapper[4475]: I1203 06:59:47.058859 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sbr9g\" (UniqueName: \"kubernetes.io/projected/bd5f5f2c-acd2-49b4-9b5c-5e35d57ed85c-kube-api-access-sbr9g\") pod \"glance-default-internal-api-0\" (UID: \"bd5f5f2c-acd2-49b4-9b5c-5e35d57ed85c\") " pod="openstack/glance-default-internal-api-0" Dec 03 06:59:47 crc kubenswrapper[4475]: I1203 06:59:47.058929 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/bd5f5f2c-acd2-49b4-9b5c-5e35d57ed85c-scripts\") pod \"glance-default-internal-api-0\" (UID: \"bd5f5f2c-acd2-49b4-9b5c-5e35d57ed85c\") " pod="openstack/glance-default-internal-api-0" Dec 03 06:59:47 crc kubenswrapper[4475]: I1203 06:59:47.059843 4475 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/20f60917-1d39-46b7-a37a-d33efdb6de0f-dns-svc\") on node \"crc\" DevicePath \"\"" Dec 03 06:59:47 crc kubenswrapper[4475]: I1203 06:59:47.059912 4475 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d3affe8e-b674-46c4-bf4b-b6fc0d092df7-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 03 06:59:47 crc kubenswrapper[4475]: I1203 06:59:47.061673 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/bd5f5f2c-acd2-49b4-9b5c-5e35d57ed85c-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"bd5f5f2c-acd2-49b4-9b5c-5e35d57ed85c\") " pod="openstack/glance-default-internal-api-0" Dec 03 06:59:47 crc kubenswrapper[4475]: I1203 06:59:47.061894 4475 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/bd5f5f2c-acd2-49b4-9b5c-5e35d57ed85c-logs\") pod \"glance-default-internal-api-0\" (UID: \"bd5f5f2c-acd2-49b4-9b5c-5e35d57ed85c\") " pod="openstack/glance-default-internal-api-0" Dec 03 06:59:47 crc kubenswrapper[4475]: I1203 06:59:47.061989 4475 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"glance-default-internal-api-0\" (UID: \"bd5f5f2c-acd2-49b4-9b5c-5e35d57ed85c\") device mount path \"/mnt/openstack/pv09\"" pod="openstack/glance-default-internal-api-0" Dec 03 06:59:47 crc kubenswrapper[4475]: I1203 06:59:47.065094 4475 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/20f60917-1d39-46b7-a37a-d33efdb6de0f-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "20f60917-1d39-46b7-a37a-d33efdb6de0f" (UID: "20f60917-1d39-46b7-a37a-d33efdb6de0f"). InnerVolumeSpecName "ovsdbserver-sb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 06:59:47 crc kubenswrapper[4475]: I1203 06:59:47.070119 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lzgbw\" (UniqueName: \"kubernetes.io/projected/69bddfea-c4e4-4776-a5e1-2150121b98a4-kube-api-access-lzgbw\") pod \"glance-default-external-api-0\" (UID: \"69bddfea-c4e4-4776-a5e1-2150121b98a4\") " pod="openstack/glance-default-external-api-0" Dec 03 06:59:47 crc kubenswrapper[4475]: I1203 06:59:47.078689 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bd5f5f2c-acd2-49b4-9b5c-5e35d57ed85c-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"bd5f5f2c-acd2-49b4-9b5c-5e35d57ed85c\") " pod="openstack/glance-default-internal-api-0" Dec 03 06:59:47 crc kubenswrapper[4475]: I1203 06:59:47.079843 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/bd5f5f2c-acd2-49b4-9b5c-5e35d57ed85c-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"bd5f5f2c-acd2-49b4-9b5c-5e35d57ed85c\") " pod="openstack/glance-default-internal-api-0" Dec 03 06:59:47 crc kubenswrapper[4475]: I1203 06:59:47.079990 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bd5f5f2c-acd2-49b4-9b5c-5e35d57ed85c-config-data\") pod \"glance-default-internal-api-0\" (UID: \"bd5f5f2c-acd2-49b4-9b5c-5e35d57ed85c\") " pod="openstack/glance-default-internal-api-0" Dec 03 06:59:47 crc kubenswrapper[4475]: I1203 06:59:47.080622 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/bd5f5f2c-acd2-49b4-9b5c-5e35d57ed85c-scripts\") pod \"glance-default-internal-api-0\" (UID: \"bd5f5f2c-acd2-49b4-9b5c-5e35d57ed85c\") " pod="openstack/glance-default-internal-api-0" Dec 03 06:59:47 crc 
kubenswrapper[4475]: I1203 06:59:47.086042 4475 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/20f60917-1d39-46b7-a37a-d33efdb6de0f-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "20f60917-1d39-46b7-a37a-d33efdb6de0f" (UID: "20f60917-1d39-46b7-a37a-d33efdb6de0f"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 06:59:47 crc kubenswrapper[4475]: I1203 06:59:47.094894 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5f67cb5455-r956l" event={"ID":"20f60917-1d39-46b7-a37a-d33efdb6de0f","Type":"ContainerDied","Data":"b4bf54c0262e46240a1ab05a5b4efa5bbe6700791ba2b88e04c08fe099aa4fad"} Dec 03 06:59:47 crc kubenswrapper[4475]: I1203 06:59:47.094942 4475 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5f67cb5455-r956l" Dec 03 06:59:47 crc kubenswrapper[4475]: I1203 06:59:47.094955 4475 scope.go:117] "RemoveContainer" containerID="74bc9df23fc5e94aa8f857357de9f3077a787e0df089f57f48d7d360e8b4f910" Dec 03 06:59:47 crc kubenswrapper[4475]: I1203 06:59:47.098914 4475 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/20f60917-1d39-46b7-a37a-d33efdb6de0f-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "20f60917-1d39-46b7-a37a-d33efdb6de0f" (UID: "20f60917-1d39-46b7-a37a-d33efdb6de0f"). InnerVolumeSpecName "dns-swift-storage-0". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 06:59:47 crc kubenswrapper[4475]: I1203 06:59:47.113161 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-lszxw" event={"ID":"d3affe8e-b674-46c4-bf4b-b6fc0d092df7","Type":"ContainerDied","Data":"40e0b4f81f7a967345f4cdcf4e811aae953376b0c647ce65fd815da6846490c8"} Dec 03 06:59:47 crc kubenswrapper[4475]: I1203 06:59:47.113251 4475 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-lszxw" Dec 03 06:59:47 crc kubenswrapper[4475]: I1203 06:59:47.120391 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sbr9g\" (UniqueName: \"kubernetes.io/projected/bd5f5f2c-acd2-49b4-9b5c-5e35d57ed85c-kube-api-access-sbr9g\") pod \"glance-default-internal-api-0\" (UID: \"bd5f5f2c-acd2-49b4-9b5c-5e35d57ed85c\") " pod="openstack/glance-default-internal-api-0" Dec 03 06:59:47 crc kubenswrapper[4475]: I1203 06:59:47.128911 4475 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/20f60917-1d39-46b7-a37a-d33efdb6de0f-config" (OuterVolumeSpecName: "config") pod "20f60917-1d39-46b7-a37a-d33efdb6de0f" (UID: "20f60917-1d39-46b7-a37a-d33efdb6de0f"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 06:59:47 crc kubenswrapper[4475]: I1203 06:59:47.140987 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-bc7f9874c-qp4nh" event={"ID":"45a837ca-f13a-4df1-bcad-62d1ce76225d","Type":"ContainerStarted","Data":"8a7db8b6d0de43049985ad708fbd0fa3c22190fa214ebec4008e1daf87b6825b"} Dec 03 06:59:47 crc kubenswrapper[4475]: I1203 06:59:47.145664 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"glance-default-internal-api-0\" (UID: \"bd5f5f2c-acd2-49b4-9b5c-5e35d57ed85c\") " pod="openstack/glance-default-internal-api-0" Dec 03 06:59:47 crc kubenswrapper[4475]: I1203 06:59:47.165151 4475 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/20f60917-1d39-46b7-a37a-d33efdb6de0f-config\") on node \"crc\" DevicePath \"\"" Dec 03 06:59:47 crc kubenswrapper[4475]: I1203 06:59:47.165209 4475 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/20f60917-1d39-46b7-a37a-d33efdb6de0f-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Dec 03 06:59:47 crc kubenswrapper[4475]: I1203 06:59:47.165223 4475 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/20f60917-1d39-46b7-a37a-d33efdb6de0f-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Dec 03 06:59:47 crc kubenswrapper[4475]: I1203 06:59:47.165235 4475 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/20f60917-1d39-46b7-a37a-d33efdb6de0f-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Dec 03 06:59:47 crc kubenswrapper[4475]: I1203 06:59:47.170047 4475 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Dec 03 06:59:47 crc kubenswrapper[4475]: I1203 06:59:47.250824 4475 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Dec 03 06:59:47 crc kubenswrapper[4475]: I1203 06:59:47.276796 4475 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-qrzmb"] Dec 03 06:59:47 crc kubenswrapper[4475]: I1203 06:59:47.326930 4475 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-lszxw"] Dec 03 06:59:47 crc kubenswrapper[4475]: W1203 06:59:47.344574 4475 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod875f3e6c_4ccd_4b70_8ea6_c5d76eedc268.slice/crio-526743ac788fc5b1b11bdac104f8a899bdc0193c200479fb94be3b3b445431a4 WatchSource:0}: Error finding container 526743ac788fc5b1b11bdac104f8a899bdc0193c200479fb94be3b3b445431a4: Status 404 returned error can't find the container with id 526743ac788fc5b1b11bdac104f8a899bdc0193c200479fb94be3b3b445431a4 Dec 03 06:59:47 crc kubenswrapper[4475]: I1203 06:59:47.362028 4475 scope.go:117] "RemoveContainer" containerID="b6e37450f59ce53b45e922826254dd6bc43013922a71ebd01b069cb90c22ed28" Dec 03 06:59:47 crc kubenswrapper[4475]: I1203 06:59:47.370187 4475 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-lszxw"] Dec 03 06:59:47 crc kubenswrapper[4475]: I1203 06:59:47.514194 4475 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d3affe8e-b674-46c4-bf4b-b6fc0d092df7" path="/var/lib/kubelet/pods/d3affe8e-b674-46c4-bf4b-b6fc0d092df7/volumes" Dec 03 06:59:47 crc kubenswrapper[4475]: I1203 06:59:47.520885 4475 scope.go:117] "RemoveContainer" containerID="4affa86553330d26a24207ded8fde03413664328301fb8a08827e1b303791ada" Dec 03 06:59:47 crc kubenswrapper[4475]: I1203 06:59:47.522520 4475 kubelet.go:2437] 
"SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5f67cb5455-r956l"] Dec 03 06:59:47 crc kubenswrapper[4475]: I1203 06:59:47.531619 4475 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-5f67cb5455-r956l"] Dec 03 06:59:47 crc kubenswrapper[4475]: I1203 06:59:47.562810 4475 scope.go:117] "RemoveContainer" containerID="6786e4d0920435fefcbc526e8d9e7a22ec41877e153ff4fcc74af9d9809a1cdc" Dec 03 06:59:47 crc kubenswrapper[4475]: I1203 06:59:47.622274 4475 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-sync-6ntz5"] Dec 03 06:59:47 crc kubenswrapper[4475]: I1203 06:59:47.668784 4475 scope.go:117] "RemoveContainer" containerID="34df3fb27c46f2845cdaaeea17f1dc63255e3f2787e7e2c80255093fa4845705" Dec 03 06:59:47 crc kubenswrapper[4475]: I1203 06:59:47.669919 4475 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-77ccb6ff57-mkx4c"] Dec 03 06:59:47 crc kubenswrapper[4475]: I1203 06:59:47.684424 4475 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-dd8b5f859-x5jv6"] Dec 03 06:59:47 crc kubenswrapper[4475]: I1203 06:59:47.692907 4475 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-db-sync-bxx54"] Dec 03 06:59:47 crc kubenswrapper[4475]: W1203 06:59:47.704379 4475 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd6182da1_5b12_4bb3_af9b_3c8cb8d7ddc8.slice/crio-c65d12684da5d871613d9cf414e7387b85e668d7136a65bb2a3ee2b2babe9268 WatchSource:0}: Error finding container c65d12684da5d871613d9cf414e7387b85e668d7136a65bb2a3ee2b2babe9268: Status 404 returned error can't find the container with id c65d12684da5d871613d9cf414e7387b85e668d7136a65bb2a3ee2b2babe9268 Dec 03 06:59:47 crc kubenswrapper[4475]: I1203 06:59:47.710162 4475 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-rn44z" Dec 03 06:59:47 crc kubenswrapper[4475]: 
W1203 06:59:47.711611 4475 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod7105b12e_7df5_42e5_b0cc_27ea52ea7b1c.slice/crio-5eca5fd38acb9ac324ce3db8ce13f397d6e0945b55aea483a2245e0c85156aa5 WatchSource:0}: Error finding container 5eca5fd38acb9ac324ce3db8ce13f397d6e0945b55aea483a2245e0c85156aa5: Status 404 returned error can't find the container with id 5eca5fd38acb9ac324ce3db8ce13f397d6e0945b55aea483a2245e0c85156aa5 Dec 03 06:59:47 crc kubenswrapper[4475]: I1203 06:59:47.711626 4475 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-rn44z" Dec 03 06:59:47 crc kubenswrapper[4475]: I1203 06:59:47.787347 4475 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-sync-pqss2"] Dec 03 06:59:47 crc kubenswrapper[4475]: I1203 06:59:47.809875 4475 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Dec 03 06:59:47 crc kubenswrapper[4475]: I1203 06:59:47.825420 4475 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-rn44z" Dec 03 06:59:47 crc kubenswrapper[4475]: I1203 06:59:47.983407 4475 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-sync-hk2ck"] Dec 03 06:59:48 crc kubenswrapper[4475]: I1203 06:59:48.049746 4475 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Dec 03 06:59:48 crc kubenswrapper[4475]: I1203 06:59:48.073972 4475 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-77ccb6ff57-mkx4c"] Dec 03 06:59:48 crc kubenswrapper[4475]: I1203 06:59:48.155619 4475 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Dec 03 06:59:48 crc kubenswrapper[4475]: I1203 06:59:48.169030 4475 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/horizon-5db4d85d89-t96xs"] Dec 03 
06:59:48 crc kubenswrapper[4475]: I1203 06:59:48.170285 4475 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-5db4d85d89-t96xs" Dec 03 06:59:48 crc kubenswrapper[4475]: I1203 06:59:48.178922 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-77ccb6ff57-mkx4c" event={"ID":"d6182da1-5b12-4bb3-af9b-3c8cb8d7ddc8","Type":"ContainerStarted","Data":"c65d12684da5d871613d9cf414e7387b85e668d7136a65bb2a3ee2b2babe9268"} Dec 03 06:59:48 crc kubenswrapper[4475]: I1203 06:59:48.180829 4475 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Dec 03 06:59:48 crc kubenswrapper[4475]: I1203 06:59:48.192024 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-db-sync-bxx54" event={"ID":"7105b12e-7df5-42e5-b0cc-27ea52ea7b1c","Type":"ContainerStarted","Data":"5eca5fd38acb9ac324ce3db8ce13f397d6e0945b55aea483a2245e0c85156aa5"} Dec 03 06:59:48 crc kubenswrapper[4475]: I1203 06:59:48.197092 4475 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-5db4d85d89-t96xs"] Dec 03 06:59:48 crc kubenswrapper[4475]: I1203 06:59:48.213585 4475 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-57dbb9b85f-9wjp5"] Dec 03 06:59:48 crc kubenswrapper[4475]: I1203 06:59:48.227139 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-qrzmb" event={"ID":"875f3e6c-4ccd-4b70-8ea6-c5d76eedc268","Type":"ContainerStarted","Data":"cd64158ca58245973db7aeaca00a78a4ddf0872adbe2a1047c5ce26228debe0d"} Dec 03 06:59:48 crc kubenswrapper[4475]: I1203 06:59:48.227168 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-qrzmb" event={"ID":"875f3e6c-4ccd-4b70-8ea6-c5d76eedc268","Type":"ContainerStarted","Data":"526743ac788fc5b1b11bdac104f8a899bdc0193c200479fb94be3b3b445431a4"} Dec 03 06:59:48 crc kubenswrapper[4475]: I1203 06:59:48.237085 4475 kubelet.go:2453] "SyncLoop (PLEG): event for 
pod" pod="openstack/cinder-db-sync-6ntz5" event={"ID":"2a298c73-a9bf-496a-9192-dcbf3e2417cd","Type":"ContainerStarted","Data":"c6a240ea2e6d6cd50729781f184b1e40c6dec36978a085321933384500438fba"} Dec 03 06:59:48 crc kubenswrapper[4475]: I1203 06:59:48.250255 4475 generic.go:334] "Generic (PLEG): container finished" podID="45a837ca-f13a-4df1-bcad-62d1ce76225d" containerID="3f9a15721fa637bf5c940175ff01b3a19c463bb1c046badfdb037bbd3bced5f1" exitCode=0 Dec 03 06:59:48 crc kubenswrapper[4475]: I1203 06:59:48.250314 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-bc7f9874c-qp4nh" event={"ID":"45a837ca-f13a-4df1-bcad-62d1ce76225d","Type":"ContainerDied","Data":"3f9a15721fa637bf5c940175ff01b3a19c463bb1c046badfdb037bbd3bced5f1"} Dec 03 06:59:48 crc kubenswrapper[4475]: I1203 06:59:48.269599 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-dd8b5f859-x5jv6" event={"ID":"d172b2a3-c6bd-424e-8f33-25a45263d546","Type":"ContainerStarted","Data":"99c2bcf8740825692a6187b5055a337b74d164691d907c9d22d3dafef643f26f"} Dec 03 06:59:48 crc kubenswrapper[4475]: I1203 06:59:48.298692 4475 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-sync-zch8h"] Dec 03 06:59:48 crc kubenswrapper[4475]: I1203 06:59:48.299001 4475 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-bootstrap-qrzmb" podStartSLOduration=3.298992541 podStartE2EDuration="3.298992541s" podCreationTimestamp="2025-12-03 06:59:45 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 06:59:48.263226224 +0000 UTC m=+873.068124558" watchObservedRunningTime="2025-12-03 06:59:48.298992541 +0000 UTC m=+873.103890876" Dec 03 06:59:48 crc kubenswrapper[4475]: I1203 06:59:48.326363 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-pqss2" 
event={"ID":"50149f3f-08c3-4fd9-9590-b13fcd787897","Type":"ContainerStarted","Data":"a9caee36d7d17b77bd1ddce1592d22d4fa501c63bad4f8f88dbeef6455bae716"} Dec 03 06:59:48 crc kubenswrapper[4475]: I1203 06:59:48.341543 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/e786a238-51fe-464f-bcc8-54d35b24e9cf-scripts\") pod \"horizon-5db4d85d89-t96xs\" (UID: \"e786a238-51fe-464f-bcc8-54d35b24e9cf\") " pod="openstack/horizon-5db4d85d89-t96xs" Dec 03 06:59:48 crc kubenswrapper[4475]: I1203 06:59:48.341638 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g9lnc\" (UniqueName: \"kubernetes.io/projected/e786a238-51fe-464f-bcc8-54d35b24e9cf-kube-api-access-g9lnc\") pod \"horizon-5db4d85d89-t96xs\" (UID: \"e786a238-51fe-464f-bcc8-54d35b24e9cf\") " pod="openstack/horizon-5db4d85d89-t96xs" Dec 03 06:59:48 crc kubenswrapper[4475]: I1203 06:59:48.341660 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e786a238-51fe-464f-bcc8-54d35b24e9cf-logs\") pod \"horizon-5db4d85d89-t96xs\" (UID: \"e786a238-51fe-464f-bcc8-54d35b24e9cf\") " pod="openstack/horizon-5db4d85d89-t96xs" Dec 03 06:59:48 crc kubenswrapper[4475]: I1203 06:59:48.341707 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/e786a238-51fe-464f-bcc8-54d35b24e9cf-config-data\") pod \"horizon-5db4d85d89-t96xs\" (UID: \"e786a238-51fe-464f-bcc8-54d35b24e9cf\") " pod="openstack/horizon-5db4d85d89-t96xs" Dec 03 06:59:48 crc kubenswrapper[4475]: I1203 06:59:48.341742 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-secret-key\" (UniqueName: 
\"kubernetes.io/secret/e786a238-51fe-464f-bcc8-54d35b24e9cf-horizon-secret-key\") pod \"horizon-5db4d85d89-t96xs\" (UID: \"e786a238-51fe-464f-bcc8-54d35b24e9cf\") " pod="openstack/horizon-5db4d85d89-t96xs" Dec 03 06:59:48 crc kubenswrapper[4475]: I1203 06:59:48.352579 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"12f47969-3169-43bc-8d07-cbd3952d81cf","Type":"ContainerStarted","Data":"3bf00b5dcf3a8d6ce59e1bf97626dae6e3be34fadab64da4a55617de512b9a49"} Dec 03 06:59:48 crc kubenswrapper[4475]: I1203 06:59:48.360249 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-hk2ck" event={"ID":"26e8d9b9-9ab9-428f-9b0c-78a50bd71e7a","Type":"ContainerStarted","Data":"29658f15194016e755a595f0a614f2026e9cbc8a90e70eab2d9e5709acb8b414"} Dec 03 06:59:48 crc kubenswrapper[4475]: I1203 06:59:48.432161 4475 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Dec 03 06:59:48 crc kubenswrapper[4475]: I1203 06:59:48.445904 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/e786a238-51fe-464f-bcc8-54d35b24e9cf-config-data\") pod \"horizon-5db4d85d89-t96xs\" (UID: \"e786a238-51fe-464f-bcc8-54d35b24e9cf\") " pod="openstack/horizon-5db4d85d89-t96xs" Dec 03 06:59:48 crc kubenswrapper[4475]: I1203 06:59:48.446565 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/e786a238-51fe-464f-bcc8-54d35b24e9cf-horizon-secret-key\") pod \"horizon-5db4d85d89-t96xs\" (UID: \"e786a238-51fe-464f-bcc8-54d35b24e9cf\") " pod="openstack/horizon-5db4d85d89-t96xs" Dec 03 06:59:48 crc kubenswrapper[4475]: I1203 06:59:48.446909 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/e786a238-51fe-464f-bcc8-54d35b24e9cf-scripts\") pod 
\"horizon-5db4d85d89-t96xs\" (UID: \"e786a238-51fe-464f-bcc8-54d35b24e9cf\") " pod="openstack/horizon-5db4d85d89-t96xs" Dec 03 06:59:48 crc kubenswrapper[4475]: I1203 06:59:48.453328 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/e786a238-51fe-464f-bcc8-54d35b24e9cf-config-data\") pod \"horizon-5db4d85d89-t96xs\" (UID: \"e786a238-51fe-464f-bcc8-54d35b24e9cf\") " pod="openstack/horizon-5db4d85d89-t96xs" Dec 03 06:59:48 crc kubenswrapper[4475]: I1203 06:59:48.453328 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/e786a238-51fe-464f-bcc8-54d35b24e9cf-scripts\") pod \"horizon-5db4d85d89-t96xs\" (UID: \"e786a238-51fe-464f-bcc8-54d35b24e9cf\") " pod="openstack/horizon-5db4d85d89-t96xs" Dec 03 06:59:48 crc kubenswrapper[4475]: I1203 06:59:48.457764 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-g9lnc\" (UniqueName: \"kubernetes.io/projected/e786a238-51fe-464f-bcc8-54d35b24e9cf-kube-api-access-g9lnc\") pod \"horizon-5db4d85d89-t96xs\" (UID: \"e786a238-51fe-464f-bcc8-54d35b24e9cf\") " pod="openstack/horizon-5db4d85d89-t96xs" Dec 03 06:59:48 crc kubenswrapper[4475]: I1203 06:59:48.458524 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e786a238-51fe-464f-bcc8-54d35b24e9cf-logs\") pod \"horizon-5db4d85d89-t96xs\" (UID: \"e786a238-51fe-464f-bcc8-54d35b24e9cf\") " pod="openstack/horizon-5db4d85d89-t96xs" Dec 03 06:59:48 crc kubenswrapper[4475]: I1203 06:59:48.458778 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e786a238-51fe-464f-bcc8-54d35b24e9cf-logs\") pod \"horizon-5db4d85d89-t96xs\" (UID: \"e786a238-51fe-464f-bcc8-54d35b24e9cf\") " pod="openstack/horizon-5db4d85d89-t96xs" Dec 03 06:59:48 crc kubenswrapper[4475]: 
I1203 06:59:48.464505 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/e786a238-51fe-464f-bcc8-54d35b24e9cf-horizon-secret-key\") pod \"horizon-5db4d85d89-t96xs\" (UID: \"e786a238-51fe-464f-bcc8-54d35b24e9cf\") " pod="openstack/horizon-5db4d85d89-t96xs" Dec 03 06:59:48 crc kubenswrapper[4475]: I1203 06:59:48.478360 4475 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-rn44z" Dec 03 06:59:48 crc kubenswrapper[4475]: I1203 06:59:48.497009 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-g9lnc\" (UniqueName: \"kubernetes.io/projected/e786a238-51fe-464f-bcc8-54d35b24e9cf-kube-api-access-g9lnc\") pod \"horizon-5db4d85d89-t96xs\" (UID: \"e786a238-51fe-464f-bcc8-54d35b24e9cf\") " pod="openstack/horizon-5db4d85d89-t96xs" Dec 03 06:59:48 crc kubenswrapper[4475]: I1203 06:59:48.770489 4475 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-bc7f9874c-qp4nh" Dec 03 06:59:48 crc kubenswrapper[4475]: I1203 06:59:48.786219 4475 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-5db4d85d89-t96xs" Dec 03 06:59:48 crc kubenswrapper[4475]: I1203 06:59:48.873279 4475 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/45a837ca-f13a-4df1-bcad-62d1ce76225d-dns-svc\") pod \"45a837ca-f13a-4df1-bcad-62d1ce76225d\" (UID: \"45a837ca-f13a-4df1-bcad-62d1ce76225d\") " Dec 03 06:59:48 crc kubenswrapper[4475]: I1203 06:59:48.876498 4475 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/45a837ca-f13a-4df1-bcad-62d1ce76225d-config\") pod \"45a837ca-f13a-4df1-bcad-62d1ce76225d\" (UID: \"45a837ca-f13a-4df1-bcad-62d1ce76225d\") " Dec 03 06:59:48 crc kubenswrapper[4475]: I1203 06:59:48.876659 4475 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/45a837ca-f13a-4df1-bcad-62d1ce76225d-dns-swift-storage-0\") pod \"45a837ca-f13a-4df1-bcad-62d1ce76225d\" (UID: \"45a837ca-f13a-4df1-bcad-62d1ce76225d\") " Dec 03 06:59:48 crc kubenswrapper[4475]: I1203 06:59:48.876758 4475 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-t9nxg\" (UniqueName: \"kubernetes.io/projected/45a837ca-f13a-4df1-bcad-62d1ce76225d-kube-api-access-t9nxg\") pod \"45a837ca-f13a-4df1-bcad-62d1ce76225d\" (UID: \"45a837ca-f13a-4df1-bcad-62d1ce76225d\") " Dec 03 06:59:48 crc kubenswrapper[4475]: I1203 06:59:48.876805 4475 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/45a837ca-f13a-4df1-bcad-62d1ce76225d-ovsdbserver-sb\") pod \"45a837ca-f13a-4df1-bcad-62d1ce76225d\" (UID: \"45a837ca-f13a-4df1-bcad-62d1ce76225d\") " Dec 03 06:59:48 crc kubenswrapper[4475]: I1203 06:59:48.876871 4475 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" 
(UniqueName: \"kubernetes.io/configmap/45a837ca-f13a-4df1-bcad-62d1ce76225d-ovsdbserver-nb\") pod \"45a837ca-f13a-4df1-bcad-62d1ce76225d\" (UID: \"45a837ca-f13a-4df1-bcad-62d1ce76225d\") " Dec 03 06:59:48 crc kubenswrapper[4475]: I1203 06:59:48.884044 4475 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/45a837ca-f13a-4df1-bcad-62d1ce76225d-kube-api-access-t9nxg" (OuterVolumeSpecName: "kube-api-access-t9nxg") pod "45a837ca-f13a-4df1-bcad-62d1ce76225d" (UID: "45a837ca-f13a-4df1-bcad-62d1ce76225d"). InnerVolumeSpecName "kube-api-access-t9nxg". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 06:59:48 crc kubenswrapper[4475]: I1203 06:59:48.908615 4475 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/45a837ca-f13a-4df1-bcad-62d1ce76225d-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "45a837ca-f13a-4df1-bcad-62d1ce76225d" (UID: "45a837ca-f13a-4df1-bcad-62d1ce76225d"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 06:59:48 crc kubenswrapper[4475]: I1203 06:59:48.914113 4475 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/45a837ca-f13a-4df1-bcad-62d1ce76225d-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "45a837ca-f13a-4df1-bcad-62d1ce76225d" (UID: "45a837ca-f13a-4df1-bcad-62d1ce76225d"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 06:59:48 crc kubenswrapper[4475]: I1203 06:59:48.916773 4475 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/45a837ca-f13a-4df1-bcad-62d1ce76225d-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "45a837ca-f13a-4df1-bcad-62d1ce76225d" (UID: "45a837ca-f13a-4df1-bcad-62d1ce76225d"). InnerVolumeSpecName "ovsdbserver-sb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 06:59:48 crc kubenswrapper[4475]: I1203 06:59:48.929011 4475 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/45a837ca-f13a-4df1-bcad-62d1ce76225d-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "45a837ca-f13a-4df1-bcad-62d1ce76225d" (UID: "45a837ca-f13a-4df1-bcad-62d1ce76225d"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 06:59:48 crc kubenswrapper[4475]: I1203 06:59:48.935631 4475 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/45a837ca-f13a-4df1-bcad-62d1ce76225d-config" (OuterVolumeSpecName: "config") pod "45a837ca-f13a-4df1-bcad-62d1ce76225d" (UID: "45a837ca-f13a-4df1-bcad-62d1ce76225d"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 06:59:48 crc kubenswrapper[4475]: I1203 06:59:48.979546 4475 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-t9nxg\" (UniqueName: \"kubernetes.io/projected/45a837ca-f13a-4df1-bcad-62d1ce76225d-kube-api-access-t9nxg\") on node \"crc\" DevicePath \"\"" Dec 03 06:59:48 crc kubenswrapper[4475]: I1203 06:59:48.979577 4475 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/45a837ca-f13a-4df1-bcad-62d1ce76225d-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Dec 03 06:59:48 crc kubenswrapper[4475]: I1203 06:59:48.979592 4475 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/45a837ca-f13a-4df1-bcad-62d1ce76225d-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Dec 03 06:59:48 crc kubenswrapper[4475]: I1203 06:59:48.979604 4475 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/45a837ca-f13a-4df1-bcad-62d1ce76225d-dns-svc\") on node \"crc\" DevicePath 
\"\"" Dec 03 06:59:48 crc kubenswrapper[4475]: I1203 06:59:48.979612 4475 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/45a837ca-f13a-4df1-bcad-62d1ce76225d-config\") on node \"crc\" DevicePath \"\"" Dec 03 06:59:48 crc kubenswrapper[4475]: I1203 06:59:48.979621 4475 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/45a837ca-f13a-4df1-bcad-62d1ce76225d-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Dec 03 06:59:49 crc kubenswrapper[4475]: I1203 06:59:49.422672 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-hk2ck" event={"ID":"26e8d9b9-9ab9-428f-9b0c-78a50bd71e7a","Type":"ContainerStarted","Data":"f252b0908f155e49923598575a58960b2eb53adf2a97b6290a841f2180c1da50"} Dec 03 06:59:49 crc kubenswrapper[4475]: I1203 06:59:49.429735 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-57dbb9b85f-9wjp5" event={"ID":"65d0b806-aeac-4b82-9792-32e5b25e3c3e","Type":"ContainerDied","Data":"8454666941e5d042776b8af56b644bdf0ad79cd2839e124a01703c12b23f9c31"} Dec 03 06:59:49 crc kubenswrapper[4475]: I1203 06:59:49.429836 4475 generic.go:334] "Generic (PLEG): container finished" podID="65d0b806-aeac-4b82-9792-32e5b25e3c3e" containerID="8454666941e5d042776b8af56b644bdf0ad79cd2839e124a01703c12b23f9c31" exitCode=0 Dec 03 06:59:49 crc kubenswrapper[4475]: I1203 06:59:49.430053 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-57dbb9b85f-9wjp5" event={"ID":"65d0b806-aeac-4b82-9792-32e5b25e3c3e","Type":"ContainerStarted","Data":"742b10b4188d98fb51e426dda1fbf34e8427dcf4895353b15b524e08fbed911e"} Dec 03 06:59:49 crc kubenswrapper[4475]: I1203 06:59:49.466824 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" 
event={"ID":"bd5f5f2c-acd2-49b4-9b5c-5e35d57ed85c","Type":"ContainerStarted","Data":"5081c979b9562fec63d9fab3adbc9bb495281c054a663e1fca45b0e3ee4c713a"} Dec 03 06:59:49 crc kubenswrapper[4475]: I1203 06:59:49.529198 4475 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-bc7f9874c-qp4nh" Dec 03 06:59:49 crc kubenswrapper[4475]: I1203 06:59:49.617189 4475 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-db-sync-hk2ck" podStartSLOduration=3.617172094 podStartE2EDuration="3.617172094s" podCreationTimestamp="2025-12-03 06:59:46 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 06:59:49.464145334 +0000 UTC m=+874.269043678" watchObservedRunningTime="2025-12-03 06:59:49.617172094 +0000 UTC m=+874.422070428" Dec 03 06:59:49 crc kubenswrapper[4475]: I1203 06:59:49.665753 4475 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="20f60917-1d39-46b7-a37a-d33efdb6de0f" path="/var/lib/kubelet/pods/20f60917-1d39-46b7-a37a-d33efdb6de0f/volumes" Dec 03 06:59:49 crc kubenswrapper[4475]: I1203 06:59:49.668628 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-bc7f9874c-qp4nh" event={"ID":"45a837ca-f13a-4df1-bcad-62d1ce76225d","Type":"ContainerDied","Data":"8a7db8b6d0de43049985ad708fbd0fa3c22190fa214ebec4008e1daf87b6825b"} Dec 03 06:59:49 crc kubenswrapper[4475]: I1203 06:59:49.668668 4475 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-5db4d85d89-t96xs"] Dec 03 06:59:49 crc kubenswrapper[4475]: I1203 06:59:49.668686 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-zch8h" event={"ID":"8c7df369-c49c-4d2a-842a-a8bd41944f1b","Type":"ContainerStarted","Data":"f1110063056f9aaa08bd65eee60b1677157f130caaa2ec412ba699c1f03516ea"} Dec 03 06:59:49 crc kubenswrapper[4475]: I1203 06:59:49.668713 4475 
scope.go:117] "RemoveContainer" containerID="3f9a15721fa637bf5c940175ff01b3a19c463bb1c046badfdb037bbd3bced5f1" Dec 03 06:59:49 crc kubenswrapper[4475]: I1203 06:59:49.709314 4475 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Dec 03 06:59:49 crc kubenswrapper[4475]: I1203 06:59:49.748497 4475 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-bc7f9874c-qp4nh"] Dec 03 06:59:49 crc kubenswrapper[4475]: I1203 06:59:49.752985 4475 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-bc7f9874c-qp4nh"] Dec 03 06:59:49 crc kubenswrapper[4475]: I1203 06:59:49.988496 4475 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-rn44z"] Dec 03 06:59:50 crc kubenswrapper[4475]: I1203 06:59:50.588132 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-57dbb9b85f-9wjp5" event={"ID":"65d0b806-aeac-4b82-9792-32e5b25e3c3e","Type":"ContainerStarted","Data":"ca8cb57248784eb1f8be0620f71695deeada457e8a3591dfe847ccf06804ae7a"} Dec 03 06:59:50 crc kubenswrapper[4475]: I1203 06:59:50.588491 4475 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-57dbb9b85f-9wjp5" Dec 03 06:59:50 crc kubenswrapper[4475]: I1203 06:59:50.611516 4475 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-57dbb9b85f-9wjp5" podStartSLOduration=4.61150177 podStartE2EDuration="4.61150177s" podCreationTimestamp="2025-12-03 06:59:46 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 06:59:50.607855472 +0000 UTC m=+875.412753805" watchObservedRunningTime="2025-12-03 06:59:50.61150177 +0000 UTC m=+875.416400104" Dec 03 06:59:50 crc kubenswrapper[4475]: I1203 06:59:50.618952 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" 
event={"ID":"bd5f5f2c-acd2-49b4-9b5c-5e35d57ed85c","Type":"ContainerStarted","Data":"fb8f6692a60c358b7df3307b291d4ffc720d0f301301e4da1766386f06ffe6cd"} Dec 03 06:59:50 crc kubenswrapper[4475]: I1203 06:59:50.627377 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"69bddfea-c4e4-4776-a5e1-2150121b98a4","Type":"ContainerStarted","Data":"8f08539b7331bef348b087043e37b42a72597b23f69c631fa35e6e210b021a8a"} Dec 03 06:59:50 crc kubenswrapper[4475]: I1203 06:59:50.637444 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-5db4d85d89-t96xs" event={"ID":"e786a238-51fe-464f-bcc8-54d35b24e9cf","Type":"ContainerStarted","Data":"600ab82ac7c0904a342765ac63ab9049e1a51766557f63ff93fc473c645ab592"} Dec 03 06:59:51 crc kubenswrapper[4475]: I1203 06:59:51.506088 4475 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="45a837ca-f13a-4df1-bcad-62d1ce76225d" path="/var/lib/kubelet/pods/45a837ca-f13a-4df1-bcad-62d1ce76225d/volumes" Dec 03 06:59:51 crc kubenswrapper[4475]: I1203 06:59:51.697235 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"69bddfea-c4e4-4776-a5e1-2150121b98a4","Type":"ContainerStarted","Data":"11147d68d46c01522e43ee435bcfcdeaa53c9bf1705ddff27409d7f60fcf3505"} Dec 03 06:59:51 crc kubenswrapper[4475]: I1203 06:59:51.697340 4475 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-rn44z" podUID="7a25a115-16d3-44f7-82b9-5c47efbd3e47" containerName="registry-server" containerID="cri-o://a236f7232ea9142d50103d9a683f5568c1dae4b36298c67ca32bd40db43df93e" gracePeriod=2 Dec 03 06:59:52 crc kubenswrapper[4475]: I1203 06:59:52.387782 4475 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-rn44z" Dec 03 06:59:52 crc kubenswrapper[4475]: I1203 06:59:52.491631 4475 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7a25a115-16d3-44f7-82b9-5c47efbd3e47-catalog-content\") pod \"7a25a115-16d3-44f7-82b9-5c47efbd3e47\" (UID: \"7a25a115-16d3-44f7-82b9-5c47efbd3e47\") " Dec 03 06:59:52 crc kubenswrapper[4475]: I1203 06:59:52.491842 4475 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7a25a115-16d3-44f7-82b9-5c47efbd3e47-utilities\") pod \"7a25a115-16d3-44f7-82b9-5c47efbd3e47\" (UID: \"7a25a115-16d3-44f7-82b9-5c47efbd3e47\") " Dec 03 06:59:52 crc kubenswrapper[4475]: I1203 06:59:52.491869 4475 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-msmhp\" (UniqueName: \"kubernetes.io/projected/7a25a115-16d3-44f7-82b9-5c47efbd3e47-kube-api-access-msmhp\") pod \"7a25a115-16d3-44f7-82b9-5c47efbd3e47\" (UID: \"7a25a115-16d3-44f7-82b9-5c47efbd3e47\") " Dec 03 06:59:52 crc kubenswrapper[4475]: I1203 06:59:52.494898 4475 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7a25a115-16d3-44f7-82b9-5c47efbd3e47-utilities" (OuterVolumeSpecName: "utilities") pod "7a25a115-16d3-44f7-82b9-5c47efbd3e47" (UID: "7a25a115-16d3-44f7-82b9-5c47efbd3e47"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 06:59:52 crc kubenswrapper[4475]: I1203 06:59:52.518808 4475 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7a25a115-16d3-44f7-82b9-5c47efbd3e47-kube-api-access-msmhp" (OuterVolumeSpecName: "kube-api-access-msmhp") pod "7a25a115-16d3-44f7-82b9-5c47efbd3e47" (UID: "7a25a115-16d3-44f7-82b9-5c47efbd3e47"). InnerVolumeSpecName "kube-api-access-msmhp". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 06:59:52 crc kubenswrapper[4475]: I1203 06:59:52.555845 4475 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7a25a115-16d3-44f7-82b9-5c47efbd3e47-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "7a25a115-16d3-44f7-82b9-5c47efbd3e47" (UID: "7a25a115-16d3-44f7-82b9-5c47efbd3e47"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 06:59:52 crc kubenswrapper[4475]: I1203 06:59:52.594752 4475 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7a25a115-16d3-44f7-82b9-5c47efbd3e47-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 03 06:59:52 crc kubenswrapper[4475]: I1203 06:59:52.594778 4475 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7a25a115-16d3-44f7-82b9-5c47efbd3e47-utilities\") on node \"crc\" DevicePath \"\"" Dec 03 06:59:52 crc kubenswrapper[4475]: I1203 06:59:52.594788 4475 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-msmhp\" (UniqueName: \"kubernetes.io/projected/7a25a115-16d3-44f7-82b9-5c47efbd3e47-kube-api-access-msmhp\") on node \"crc\" DevicePath \"\"" Dec 03 06:59:52 crc kubenswrapper[4475]: I1203 06:59:52.710771 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"bd5f5f2c-acd2-49b4-9b5c-5e35d57ed85c","Type":"ContainerStarted","Data":"6c27013f31c8380282d2d698ddb61be42754d789fa71070c4819fc27dfd15eff"} Dec 03 06:59:52 crc kubenswrapper[4475]: I1203 06:59:52.710847 4475 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="bd5f5f2c-acd2-49b4-9b5c-5e35d57ed85c" containerName="glance-log" containerID="cri-o://fb8f6692a60c358b7df3307b291d4ffc720d0f301301e4da1766386f06ffe6cd" gracePeriod=30 Dec 03 
06:59:52 crc kubenswrapper[4475]: I1203 06:59:52.710953 4475 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="bd5f5f2c-acd2-49b4-9b5c-5e35d57ed85c" containerName="glance-httpd" containerID="cri-o://6c27013f31c8380282d2d698ddb61be42754d789fa71070c4819fc27dfd15eff" gracePeriod=30 Dec 03 06:59:52 crc kubenswrapper[4475]: I1203 06:59:52.746269 4475 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-internal-api-0" podStartSLOduration=6.746251166 podStartE2EDuration="6.746251166s" podCreationTimestamp="2025-12-03 06:59:46 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 06:59:52.741813498 +0000 UTC m=+877.546711832" watchObservedRunningTime="2025-12-03 06:59:52.746251166 +0000 UTC m=+877.551149500" Dec 03 06:59:52 crc kubenswrapper[4475]: I1203 06:59:52.757847 4475 generic.go:334] "Generic (PLEG): container finished" podID="7a25a115-16d3-44f7-82b9-5c47efbd3e47" containerID="a236f7232ea9142d50103d9a683f5568c1dae4b36298c67ca32bd40db43df93e" exitCode=0 Dec 03 06:59:52 crc kubenswrapper[4475]: I1203 06:59:52.757926 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-rn44z" event={"ID":"7a25a115-16d3-44f7-82b9-5c47efbd3e47","Type":"ContainerDied","Data":"a236f7232ea9142d50103d9a683f5568c1dae4b36298c67ca32bd40db43df93e"} Dec 03 06:59:52 crc kubenswrapper[4475]: I1203 06:59:52.757989 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-rn44z" event={"ID":"7a25a115-16d3-44f7-82b9-5c47efbd3e47","Type":"ContainerDied","Data":"d974d3cf1659358e6b700fc82454e77f35fa678cc2917f5c45dd601060838758"} Dec 03 06:59:52 crc kubenswrapper[4475]: I1203 06:59:52.758011 4475 scope.go:117] "RemoveContainer" containerID="a236f7232ea9142d50103d9a683f5568c1dae4b36298c67ca32bd40db43df93e" 
Dec 03 06:59:52 crc kubenswrapper[4475]: I1203 06:59:52.763925 4475 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-rn44z" Dec 03 06:59:52 crc kubenswrapper[4475]: I1203 06:59:52.827383 4475 scope.go:117] "RemoveContainer" containerID="dd6c849ce555df78d9aa7a776b4cc4c95185b9649b2b470d06b9c416d0fbf1c0" Dec 03 06:59:52 crc kubenswrapper[4475]: I1203 06:59:52.846713 4475 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-rn44z"] Dec 03 06:59:52 crc kubenswrapper[4475]: I1203 06:59:52.854019 4475 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-rn44z"] Dec 03 06:59:52 crc kubenswrapper[4475]: I1203 06:59:52.926197 4475 scope.go:117] "RemoveContainer" containerID="137279b8a133d080a65a22bbad9e4876361da777397b27fedec669e879cfff9c" Dec 03 06:59:52 crc kubenswrapper[4475]: I1203 06:59:52.982991 4475 scope.go:117] "RemoveContainer" containerID="a236f7232ea9142d50103d9a683f5568c1dae4b36298c67ca32bd40db43df93e" Dec 03 06:59:52 crc kubenswrapper[4475]: E1203 06:59:52.984487 4475 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a236f7232ea9142d50103d9a683f5568c1dae4b36298c67ca32bd40db43df93e\": container with ID starting with a236f7232ea9142d50103d9a683f5568c1dae4b36298c67ca32bd40db43df93e not found: ID does not exist" containerID="a236f7232ea9142d50103d9a683f5568c1dae4b36298c67ca32bd40db43df93e" Dec 03 06:59:52 crc kubenswrapper[4475]: I1203 06:59:52.984590 4475 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a236f7232ea9142d50103d9a683f5568c1dae4b36298c67ca32bd40db43df93e"} err="failed to get container status \"a236f7232ea9142d50103d9a683f5568c1dae4b36298c67ca32bd40db43df93e\": rpc error: code = NotFound desc = could not find container 
\"a236f7232ea9142d50103d9a683f5568c1dae4b36298c67ca32bd40db43df93e\": container with ID starting with a236f7232ea9142d50103d9a683f5568c1dae4b36298c67ca32bd40db43df93e not found: ID does not exist" Dec 03 06:59:52 crc kubenswrapper[4475]: I1203 06:59:52.984670 4475 scope.go:117] "RemoveContainer" containerID="dd6c849ce555df78d9aa7a776b4cc4c95185b9649b2b470d06b9c416d0fbf1c0" Dec 03 06:59:52 crc kubenswrapper[4475]: E1203 06:59:52.985375 4475 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"dd6c849ce555df78d9aa7a776b4cc4c95185b9649b2b470d06b9c416d0fbf1c0\": container with ID starting with dd6c849ce555df78d9aa7a776b4cc4c95185b9649b2b470d06b9c416d0fbf1c0 not found: ID does not exist" containerID="dd6c849ce555df78d9aa7a776b4cc4c95185b9649b2b470d06b9c416d0fbf1c0" Dec 03 06:59:52 crc kubenswrapper[4475]: I1203 06:59:52.985414 4475 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"dd6c849ce555df78d9aa7a776b4cc4c95185b9649b2b470d06b9c416d0fbf1c0"} err="failed to get container status \"dd6c849ce555df78d9aa7a776b4cc4c95185b9649b2b470d06b9c416d0fbf1c0\": rpc error: code = NotFound desc = could not find container \"dd6c849ce555df78d9aa7a776b4cc4c95185b9649b2b470d06b9c416d0fbf1c0\": container with ID starting with dd6c849ce555df78d9aa7a776b4cc4c95185b9649b2b470d06b9c416d0fbf1c0 not found: ID does not exist" Dec 03 06:59:52 crc kubenswrapper[4475]: I1203 06:59:52.985464 4475 scope.go:117] "RemoveContainer" containerID="137279b8a133d080a65a22bbad9e4876361da777397b27fedec669e879cfff9c" Dec 03 06:59:52 crc kubenswrapper[4475]: E1203 06:59:52.986023 4475 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"137279b8a133d080a65a22bbad9e4876361da777397b27fedec669e879cfff9c\": container with ID starting with 137279b8a133d080a65a22bbad9e4876361da777397b27fedec669e879cfff9c not found: ID does not exist" 
containerID="137279b8a133d080a65a22bbad9e4876361da777397b27fedec669e879cfff9c" Dec 03 06:59:52 crc kubenswrapper[4475]: I1203 06:59:52.986105 4475 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"137279b8a133d080a65a22bbad9e4876361da777397b27fedec669e879cfff9c"} err="failed to get container status \"137279b8a133d080a65a22bbad9e4876361da777397b27fedec669e879cfff9c\": rpc error: code = NotFound desc = could not find container \"137279b8a133d080a65a22bbad9e4876361da777397b27fedec669e879cfff9c\": container with ID starting with 137279b8a133d080a65a22bbad9e4876361da777397b27fedec669e879cfff9c not found: ID does not exist" Dec 03 06:59:53 crc kubenswrapper[4475]: I1203 06:59:53.502200 4475 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7a25a115-16d3-44f7-82b9-5c47efbd3e47" path="/var/lib/kubelet/pods/7a25a115-16d3-44f7-82b9-5c47efbd3e47/volumes" Dec 03 06:59:53 crc kubenswrapper[4475]: I1203 06:59:53.755493 4475 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Dec 03 06:59:53 crc kubenswrapper[4475]: I1203 06:59:53.774055 4475 generic.go:334] "Generic (PLEG): container finished" podID="875f3e6c-4ccd-4b70-8ea6-c5d76eedc268" containerID="cd64158ca58245973db7aeaca00a78a4ddf0872adbe2a1047c5ce26228debe0d" exitCode=0 Dec 03 06:59:53 crc kubenswrapper[4475]: I1203 06:59:53.774168 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-qrzmb" event={"ID":"875f3e6c-4ccd-4b70-8ea6-c5d76eedc268","Type":"ContainerDied","Data":"cd64158ca58245973db7aeaca00a78a4ddf0872adbe2a1047c5ce26228debe0d"} Dec 03 06:59:53 crc kubenswrapper[4475]: I1203 06:59:53.789929 4475 generic.go:334] "Generic (PLEG): container finished" podID="bd5f5f2c-acd2-49b4-9b5c-5e35d57ed85c" containerID="6c27013f31c8380282d2d698ddb61be42754d789fa71070c4819fc27dfd15eff" exitCode=0 Dec 03 06:59:53 crc kubenswrapper[4475]: I1203 06:59:53.789985 4475 generic.go:334] "Generic (PLEG): container finished" podID="bd5f5f2c-acd2-49b4-9b5c-5e35d57ed85c" containerID="fb8f6692a60c358b7df3307b291d4ffc720d0f301301e4da1766386f06ffe6cd" exitCode=143 Dec 03 06:59:53 crc kubenswrapper[4475]: I1203 06:59:53.790058 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"bd5f5f2c-acd2-49b4-9b5c-5e35d57ed85c","Type":"ContainerDied","Data":"6c27013f31c8380282d2d698ddb61be42754d789fa71070c4819fc27dfd15eff"} Dec 03 06:59:53 crc kubenswrapper[4475]: I1203 06:59:53.790108 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"bd5f5f2c-acd2-49b4-9b5c-5e35d57ed85c","Type":"ContainerDied","Data":"fb8f6692a60c358b7df3307b291d4ffc720d0f301301e4da1766386f06ffe6cd"} Dec 03 06:59:53 crc kubenswrapper[4475]: I1203 06:59:53.790127 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" 
event={"ID":"bd5f5f2c-acd2-49b4-9b5c-5e35d57ed85c","Type":"ContainerDied","Data":"5081c979b9562fec63d9fab3adbc9bb495281c054a663e1fca45b0e3ee4c713a"} Dec 03 06:59:53 crc kubenswrapper[4475]: I1203 06:59:53.790153 4475 scope.go:117] "RemoveContainer" containerID="6c27013f31c8380282d2d698ddb61be42754d789fa71070c4819fc27dfd15eff" Dec 03 06:59:53 crc kubenswrapper[4475]: I1203 06:59:53.790180 4475 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Dec 03 06:59:53 crc kubenswrapper[4475]: I1203 06:59:53.798621 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"69bddfea-c4e4-4776-a5e1-2150121b98a4","Type":"ContainerStarted","Data":"fda1c00350ac41bf075275986f1ff9d6e26e93a690741e896f741ab14e567728"} Dec 03 06:59:53 crc kubenswrapper[4475]: I1203 06:59:53.798755 4475 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="69bddfea-c4e4-4776-a5e1-2150121b98a4" containerName="glance-log" containerID="cri-o://11147d68d46c01522e43ee435bcfcdeaa53c9bf1705ddff27409d7f60fcf3505" gracePeriod=30 Dec 03 06:59:53 crc kubenswrapper[4475]: I1203 06:59:53.798995 4475 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="69bddfea-c4e4-4776-a5e1-2150121b98a4" containerName="glance-httpd" containerID="cri-o://fda1c00350ac41bf075275986f1ff9d6e26e93a690741e896f741ab14e567728" gracePeriod=30 Dec 03 06:59:53 crc kubenswrapper[4475]: I1203 06:59:53.835921 4475 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-external-api-0" podStartSLOduration=7.835903578 podStartE2EDuration="7.835903578s" podCreationTimestamp="2025-12-03 06:59:46 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 
06:59:53.824423464 +0000 UTC m=+878.629321818" watchObservedRunningTime="2025-12-03 06:59:53.835903578 +0000 UTC m=+878.640801902" Dec 03 06:59:53 crc kubenswrapper[4475]: I1203 06:59:53.865108 4475 scope.go:117] "RemoveContainer" containerID="fb8f6692a60c358b7df3307b291d4ffc720d0f301301e4da1766386f06ffe6cd" Dec 03 06:59:53 crc kubenswrapper[4475]: I1203 06:59:53.874179 4475 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sbr9g\" (UniqueName: \"kubernetes.io/projected/bd5f5f2c-acd2-49b4-9b5c-5e35d57ed85c-kube-api-access-sbr9g\") pod \"bd5f5f2c-acd2-49b4-9b5c-5e35d57ed85c\" (UID: \"bd5f5f2c-acd2-49b4-9b5c-5e35d57ed85c\") " Dec 03 06:59:53 crc kubenswrapper[4475]: I1203 06:59:53.874259 4475 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/bd5f5f2c-acd2-49b4-9b5c-5e35d57ed85c-logs\") pod \"bd5f5f2c-acd2-49b4-9b5c-5e35d57ed85c\" (UID: \"bd5f5f2c-acd2-49b4-9b5c-5e35d57ed85c\") " Dec 03 06:59:53 crc kubenswrapper[4475]: I1203 06:59:53.874294 4475 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/bd5f5f2c-acd2-49b4-9b5c-5e35d57ed85c-httpd-run\") pod \"bd5f5f2c-acd2-49b4-9b5c-5e35d57ed85c\" (UID: \"bd5f5f2c-acd2-49b4-9b5c-5e35d57ed85c\") " Dec 03 06:59:53 crc kubenswrapper[4475]: I1203 06:59:53.874355 4475 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"bd5f5f2c-acd2-49b4-9b5c-5e35d57ed85c\" (UID: \"bd5f5f2c-acd2-49b4-9b5c-5e35d57ed85c\") " Dec 03 06:59:53 crc kubenswrapper[4475]: I1203 06:59:53.874385 4475 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bd5f5f2c-acd2-49b4-9b5c-5e35d57ed85c-combined-ca-bundle\") pod \"bd5f5f2c-acd2-49b4-9b5c-5e35d57ed85c\" (UID: 
\"bd5f5f2c-acd2-49b4-9b5c-5e35d57ed85c\") " Dec 03 06:59:53 crc kubenswrapper[4475]: I1203 06:59:53.874478 4475 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bd5f5f2c-acd2-49b4-9b5c-5e35d57ed85c-config-data\") pod \"bd5f5f2c-acd2-49b4-9b5c-5e35d57ed85c\" (UID: \"bd5f5f2c-acd2-49b4-9b5c-5e35d57ed85c\") " Dec 03 06:59:53 crc kubenswrapper[4475]: I1203 06:59:53.874735 4475 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/bd5f5f2c-acd2-49b4-9b5c-5e35d57ed85c-scripts\") pod \"bd5f5f2c-acd2-49b4-9b5c-5e35d57ed85c\" (UID: \"bd5f5f2c-acd2-49b4-9b5c-5e35d57ed85c\") " Dec 03 06:59:53 crc kubenswrapper[4475]: I1203 06:59:53.874854 4475 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/bd5f5f2c-acd2-49b4-9b5c-5e35d57ed85c-internal-tls-certs\") pod \"bd5f5f2c-acd2-49b4-9b5c-5e35d57ed85c\" (UID: \"bd5f5f2c-acd2-49b4-9b5c-5e35d57ed85c\") " Dec 03 06:59:53 crc kubenswrapper[4475]: I1203 06:59:53.875849 4475 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bd5f5f2c-acd2-49b4-9b5c-5e35d57ed85c-logs" (OuterVolumeSpecName: "logs") pod "bd5f5f2c-acd2-49b4-9b5c-5e35d57ed85c" (UID: "bd5f5f2c-acd2-49b4-9b5c-5e35d57ed85c"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 06:59:53 crc kubenswrapper[4475]: I1203 06:59:53.875865 4475 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bd5f5f2c-acd2-49b4-9b5c-5e35d57ed85c-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "bd5f5f2c-acd2-49b4-9b5c-5e35d57ed85c" (UID: "bd5f5f2c-acd2-49b4-9b5c-5e35d57ed85c"). InnerVolumeSpecName "httpd-run". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 06:59:53 crc kubenswrapper[4475]: I1203 06:59:53.881727 4475 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bd5f5f2c-acd2-49b4-9b5c-5e35d57ed85c-kube-api-access-sbr9g" (OuterVolumeSpecName: "kube-api-access-sbr9g") pod "bd5f5f2c-acd2-49b4-9b5c-5e35d57ed85c" (UID: "bd5f5f2c-acd2-49b4-9b5c-5e35d57ed85c"). InnerVolumeSpecName "kube-api-access-sbr9g". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 06:59:53 crc kubenswrapper[4475]: I1203 06:59:53.888638 4475 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage09-crc" (OuterVolumeSpecName: "glance") pod "bd5f5f2c-acd2-49b4-9b5c-5e35d57ed85c" (UID: "bd5f5f2c-acd2-49b4-9b5c-5e35d57ed85c"). InnerVolumeSpecName "local-storage09-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Dec 03 06:59:53 crc kubenswrapper[4475]: I1203 06:59:53.905409 4475 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bd5f5f2c-acd2-49b4-9b5c-5e35d57ed85c-scripts" (OuterVolumeSpecName: "scripts") pod "bd5f5f2c-acd2-49b4-9b5c-5e35d57ed85c" (UID: "bd5f5f2c-acd2-49b4-9b5c-5e35d57ed85c"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 06:59:53 crc kubenswrapper[4475]: I1203 06:59:53.910661 4475 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bd5f5f2c-acd2-49b4-9b5c-5e35d57ed85c-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "bd5f5f2c-acd2-49b4-9b5c-5e35d57ed85c" (UID: "bd5f5f2c-acd2-49b4-9b5c-5e35d57ed85c"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 06:59:53 crc kubenswrapper[4475]: I1203 06:59:53.954278 4475 scope.go:117] "RemoveContainer" containerID="6c27013f31c8380282d2d698ddb61be42754d789fa71070c4819fc27dfd15eff" Dec 03 06:59:53 crc kubenswrapper[4475]: E1203 06:59:53.957195 4475 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6c27013f31c8380282d2d698ddb61be42754d789fa71070c4819fc27dfd15eff\": container with ID starting with 6c27013f31c8380282d2d698ddb61be42754d789fa71070c4819fc27dfd15eff not found: ID does not exist" containerID="6c27013f31c8380282d2d698ddb61be42754d789fa71070c4819fc27dfd15eff" Dec 03 06:59:53 crc kubenswrapper[4475]: I1203 06:59:53.957250 4475 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6c27013f31c8380282d2d698ddb61be42754d789fa71070c4819fc27dfd15eff"} err="failed to get container status \"6c27013f31c8380282d2d698ddb61be42754d789fa71070c4819fc27dfd15eff\": rpc error: code = NotFound desc = could not find container \"6c27013f31c8380282d2d698ddb61be42754d789fa71070c4819fc27dfd15eff\": container with ID starting with 6c27013f31c8380282d2d698ddb61be42754d789fa71070c4819fc27dfd15eff not found: ID does not exist" Dec 03 06:59:53 crc kubenswrapper[4475]: I1203 06:59:53.957283 4475 scope.go:117] "RemoveContainer" containerID="fb8f6692a60c358b7df3307b291d4ffc720d0f301301e4da1766386f06ffe6cd" Dec 03 06:59:53 crc kubenswrapper[4475]: E1203 06:59:53.961532 4475 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"fb8f6692a60c358b7df3307b291d4ffc720d0f301301e4da1766386f06ffe6cd\": container with ID starting with fb8f6692a60c358b7df3307b291d4ffc720d0f301301e4da1766386f06ffe6cd not found: ID does not exist" containerID="fb8f6692a60c358b7df3307b291d4ffc720d0f301301e4da1766386f06ffe6cd" Dec 03 06:59:53 crc kubenswrapper[4475]: I1203 06:59:53.961570 
4475 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fb8f6692a60c358b7df3307b291d4ffc720d0f301301e4da1766386f06ffe6cd"} err="failed to get container status \"fb8f6692a60c358b7df3307b291d4ffc720d0f301301e4da1766386f06ffe6cd\": rpc error: code = NotFound desc = could not find container \"fb8f6692a60c358b7df3307b291d4ffc720d0f301301e4da1766386f06ffe6cd\": container with ID starting with fb8f6692a60c358b7df3307b291d4ffc720d0f301301e4da1766386f06ffe6cd not found: ID does not exist" Dec 03 06:59:53 crc kubenswrapper[4475]: I1203 06:59:53.961591 4475 scope.go:117] "RemoveContainer" containerID="6c27013f31c8380282d2d698ddb61be42754d789fa71070c4819fc27dfd15eff" Dec 03 06:59:53 crc kubenswrapper[4475]: I1203 06:59:53.963512 4475 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6c27013f31c8380282d2d698ddb61be42754d789fa71070c4819fc27dfd15eff"} err="failed to get container status \"6c27013f31c8380282d2d698ddb61be42754d789fa71070c4819fc27dfd15eff\": rpc error: code = NotFound desc = could not find container \"6c27013f31c8380282d2d698ddb61be42754d789fa71070c4819fc27dfd15eff\": container with ID starting with 6c27013f31c8380282d2d698ddb61be42754d789fa71070c4819fc27dfd15eff not found: ID does not exist" Dec 03 06:59:53 crc kubenswrapper[4475]: I1203 06:59:53.963538 4475 scope.go:117] "RemoveContainer" containerID="fb8f6692a60c358b7df3307b291d4ffc720d0f301301e4da1766386f06ffe6cd" Dec 03 06:59:53 crc kubenswrapper[4475]: I1203 06:59:53.963712 4475 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bd5f5f2c-acd2-49b4-9b5c-5e35d57ed85c-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "bd5f5f2c-acd2-49b4-9b5c-5e35d57ed85c" (UID: "bd5f5f2c-acd2-49b4-9b5c-5e35d57ed85c"). InnerVolumeSpecName "internal-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 06:59:53 crc kubenswrapper[4475]: I1203 06:59:53.967761 4475 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fb8f6692a60c358b7df3307b291d4ffc720d0f301301e4da1766386f06ffe6cd"} err="failed to get container status \"fb8f6692a60c358b7df3307b291d4ffc720d0f301301e4da1766386f06ffe6cd\": rpc error: code = NotFound desc = could not find container \"fb8f6692a60c358b7df3307b291d4ffc720d0f301301e4da1766386f06ffe6cd\": container with ID starting with fb8f6692a60c358b7df3307b291d4ffc720d0f301301e4da1766386f06ffe6cd not found: ID does not exist" Dec 03 06:59:53 crc kubenswrapper[4475]: I1203 06:59:53.973017 4475 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bd5f5f2c-acd2-49b4-9b5c-5e35d57ed85c-config-data" (OuterVolumeSpecName: "config-data") pod "bd5f5f2c-acd2-49b4-9b5c-5e35d57ed85c" (UID: "bd5f5f2c-acd2-49b4-9b5c-5e35d57ed85c"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 06:59:53 crc kubenswrapper[4475]: I1203 06:59:53.977846 4475 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/bd5f5f2c-acd2-49b4-9b5c-5e35d57ed85c-scripts\") on node \"crc\" DevicePath \"\"" Dec 03 06:59:53 crc kubenswrapper[4475]: I1203 06:59:53.977919 4475 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/bd5f5f2c-acd2-49b4-9b5c-5e35d57ed85c-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Dec 03 06:59:53 crc kubenswrapper[4475]: I1203 06:59:53.977983 4475 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sbr9g\" (UniqueName: \"kubernetes.io/projected/bd5f5f2c-acd2-49b4-9b5c-5e35d57ed85c-kube-api-access-sbr9g\") on node \"crc\" DevicePath \"\"" Dec 03 06:59:53 crc kubenswrapper[4475]: I1203 06:59:53.978054 4475 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/bd5f5f2c-acd2-49b4-9b5c-5e35d57ed85c-logs\") on node \"crc\" DevicePath \"\"" Dec 03 06:59:53 crc kubenswrapper[4475]: I1203 06:59:53.978064 4475 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/bd5f5f2c-acd2-49b4-9b5c-5e35d57ed85c-httpd-run\") on node \"crc\" DevicePath \"\"" Dec 03 06:59:53 crc kubenswrapper[4475]: I1203 06:59:53.978127 4475 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") on node \"crc\" " Dec 03 06:59:53 crc kubenswrapper[4475]: I1203 06:59:53.978138 4475 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bd5f5f2c-acd2-49b4-9b5c-5e35d57ed85c-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 03 06:59:53 crc kubenswrapper[4475]: I1203 06:59:53.978147 4475 reconciler_common.go:293] "Volume 
detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bd5f5f2c-acd2-49b4-9b5c-5e35d57ed85c-config-data\") on node \"crc\" DevicePath \"\"" Dec 03 06:59:53 crc kubenswrapper[4475]: I1203 06:59:53.997370 4475 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage09-crc" (UniqueName: "kubernetes.io/local-volume/local-storage09-crc") on node "crc" Dec 03 06:59:54 crc kubenswrapper[4475]: I1203 06:59:54.088751 4475 reconciler_common.go:293] "Volume detached for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") on node \"crc\" DevicePath \"\"" Dec 03 06:59:54 crc kubenswrapper[4475]: I1203 06:59:54.193011 4475 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Dec 03 06:59:54 crc kubenswrapper[4475]: I1203 06:59:54.205440 4475 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-internal-api-0"] Dec 03 06:59:54 crc kubenswrapper[4475]: I1203 06:59:54.222382 4475 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-internal-api-0"] Dec 03 06:59:54 crc kubenswrapper[4475]: E1203 06:59:54.222814 4475 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7a25a115-16d3-44f7-82b9-5c47efbd3e47" containerName="extract-content" Dec 03 06:59:54 crc kubenswrapper[4475]: I1203 06:59:54.222829 4475 state_mem.go:107] "Deleted CPUSet assignment" podUID="7a25a115-16d3-44f7-82b9-5c47efbd3e47" containerName="extract-content" Dec 03 06:59:54 crc kubenswrapper[4475]: E1203 06:59:54.222858 4475 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="45a837ca-f13a-4df1-bcad-62d1ce76225d" containerName="init" Dec 03 06:59:54 crc kubenswrapper[4475]: I1203 06:59:54.222864 4475 state_mem.go:107] "Deleted CPUSet assignment" podUID="45a837ca-f13a-4df1-bcad-62d1ce76225d" containerName="init" Dec 03 06:59:54 crc kubenswrapper[4475]: E1203 06:59:54.222875 4475 cpu_manager.go:410] 
"RemoveStaleState: removing container" podUID="bd5f5f2c-acd2-49b4-9b5c-5e35d57ed85c" containerName="glance-httpd" Dec 03 06:59:54 crc kubenswrapper[4475]: I1203 06:59:54.222880 4475 state_mem.go:107] "Deleted CPUSet assignment" podUID="bd5f5f2c-acd2-49b4-9b5c-5e35d57ed85c" containerName="glance-httpd" Dec 03 06:59:54 crc kubenswrapper[4475]: E1203 06:59:54.222889 4475 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7a25a115-16d3-44f7-82b9-5c47efbd3e47" containerName="extract-utilities" Dec 03 06:59:54 crc kubenswrapper[4475]: I1203 06:59:54.222895 4475 state_mem.go:107] "Deleted CPUSet assignment" podUID="7a25a115-16d3-44f7-82b9-5c47efbd3e47" containerName="extract-utilities" Dec 03 06:59:54 crc kubenswrapper[4475]: E1203 06:59:54.222918 4475 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7a25a115-16d3-44f7-82b9-5c47efbd3e47" containerName="registry-server" Dec 03 06:59:54 crc kubenswrapper[4475]: I1203 06:59:54.222924 4475 state_mem.go:107] "Deleted CPUSet assignment" podUID="7a25a115-16d3-44f7-82b9-5c47efbd3e47" containerName="registry-server" Dec 03 06:59:54 crc kubenswrapper[4475]: E1203 06:59:54.222932 4475 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bd5f5f2c-acd2-49b4-9b5c-5e35d57ed85c" containerName="glance-log" Dec 03 06:59:54 crc kubenswrapper[4475]: I1203 06:59:54.222938 4475 state_mem.go:107] "Deleted CPUSet assignment" podUID="bd5f5f2c-acd2-49b4-9b5c-5e35d57ed85c" containerName="glance-log" Dec 03 06:59:54 crc kubenswrapper[4475]: I1203 06:59:54.223111 4475 memory_manager.go:354] "RemoveStaleState removing state" podUID="45a837ca-f13a-4df1-bcad-62d1ce76225d" containerName="init" Dec 03 06:59:54 crc kubenswrapper[4475]: I1203 06:59:54.223127 4475 memory_manager.go:354] "RemoveStaleState removing state" podUID="bd5f5f2c-acd2-49b4-9b5c-5e35d57ed85c" containerName="glance-log" Dec 03 06:59:54 crc kubenswrapper[4475]: I1203 06:59:54.223137 4475 memory_manager.go:354] "RemoveStaleState removing 
state" podUID="bd5f5f2c-acd2-49b4-9b5c-5e35d57ed85c" containerName="glance-httpd" Dec 03 06:59:54 crc kubenswrapper[4475]: I1203 06:59:54.223146 4475 memory_manager.go:354] "RemoveStaleState removing state" podUID="7a25a115-16d3-44f7-82b9-5c47efbd3e47" containerName="registry-server" Dec 03 06:59:54 crc kubenswrapper[4475]: I1203 06:59:54.224141 4475 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Dec 03 06:59:54 crc kubenswrapper[4475]: I1203 06:59:54.227067 4475 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-internal-config-data" Dec 03 06:59:54 crc kubenswrapper[4475]: I1203 06:59:54.227292 4475 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-internal-svc" Dec 03 06:59:54 crc kubenswrapper[4475]: I1203 06:59:54.255368 4475 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Dec 03 06:59:54 crc kubenswrapper[4475]: I1203 06:59:54.396342 4475 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-dd8b5f859-x5jv6"] Dec 03 06:59:54 crc kubenswrapper[4475]: I1203 06:59:54.400673 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e87567b5-4d6e-4f8a-be68-25d4c857ad19-logs\") pod \"glance-default-internal-api-0\" (UID: \"e87567b5-4d6e-4f8a-be68-25d4c857ad19\") " pod="openstack/glance-default-internal-api-0" Dec 03 06:59:54 crc kubenswrapper[4475]: I1203 06:59:54.400704 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/e87567b5-4d6e-4f8a-be68-25d4c857ad19-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"e87567b5-4d6e-4f8a-be68-25d4c857ad19\") " pod="openstack/glance-default-internal-api-0" Dec 03 06:59:54 crc kubenswrapper[4475]: I1203 
06:59:54.400740 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v7p72\" (UniqueName: \"kubernetes.io/projected/e87567b5-4d6e-4f8a-be68-25d4c857ad19-kube-api-access-v7p72\") pod \"glance-default-internal-api-0\" (UID: \"e87567b5-4d6e-4f8a-be68-25d4c857ad19\") " pod="openstack/glance-default-internal-api-0" Dec 03 06:59:54 crc kubenswrapper[4475]: I1203 06:59:54.400765 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"glance-default-internal-api-0\" (UID: \"e87567b5-4d6e-4f8a-be68-25d4c857ad19\") " pod="openstack/glance-default-internal-api-0" Dec 03 06:59:54 crc kubenswrapper[4475]: I1203 06:59:54.400778 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e87567b5-4d6e-4f8a-be68-25d4c857ad19-scripts\") pod \"glance-default-internal-api-0\" (UID: \"e87567b5-4d6e-4f8a-be68-25d4c857ad19\") " pod="openstack/glance-default-internal-api-0" Dec 03 06:59:54 crc kubenswrapper[4475]: I1203 06:59:54.400804 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e87567b5-4d6e-4f8a-be68-25d4c857ad19-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"e87567b5-4d6e-4f8a-be68-25d4c857ad19\") " pod="openstack/glance-default-internal-api-0" Dec 03 06:59:54 crc kubenswrapper[4475]: I1203 06:59:54.400821 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e87567b5-4d6e-4f8a-be68-25d4c857ad19-config-data\") pod \"glance-default-internal-api-0\" (UID: \"e87567b5-4d6e-4f8a-be68-25d4c857ad19\") " pod="openstack/glance-default-internal-api-0" Dec 03 06:59:54 crc 
kubenswrapper[4475]: I1203 06:59:54.400853 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/e87567b5-4d6e-4f8a-be68-25d4c857ad19-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"e87567b5-4d6e-4f8a-be68-25d4c857ad19\") " pod="openstack/glance-default-internal-api-0" Dec 03 06:59:54 crc kubenswrapper[4475]: I1203 06:59:54.427148 4475 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/horizon-7fc4d79b88-s8hhg"] Dec 03 06:59:54 crc kubenswrapper[4475]: I1203 06:59:54.428367 4475 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-7fc4d79b88-s8hhg" Dec 03 06:59:54 crc kubenswrapper[4475]: I1203 06:59:54.444923 4475 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-horizon-svc" Dec 03 06:59:54 crc kubenswrapper[4475]: I1203 06:59:54.464398 4475 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Dec 03 06:59:54 crc kubenswrapper[4475]: E1203 06:59:54.494012 4475 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[combined-ca-bundle config-data glance httpd-run internal-tls-certs kube-api-access-v7p72 logs scripts], unattached volumes=[], failed to process volumes=[]: context canceled" pod="openstack/glance-default-internal-api-0" podUID="e87567b5-4d6e-4f8a-be68-25d4c857ad19" Dec 03 06:59:54 crc kubenswrapper[4475]: I1203 06:59:54.508162 4475 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-7fc4d79b88-s8hhg"] Dec 03 06:59:54 crc kubenswrapper[4475]: I1203 06:59:54.510279 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e87567b5-4d6e-4f8a-be68-25d4c857ad19-config-data\") pod \"glance-default-internal-api-0\" (UID: \"e87567b5-4d6e-4f8a-be68-25d4c857ad19\") " 
pod="openstack/glance-default-internal-api-0" Dec 03 06:59:54 crc kubenswrapper[4475]: I1203 06:59:54.510372 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/e87567b5-4d6e-4f8a-be68-25d4c857ad19-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"e87567b5-4d6e-4f8a-be68-25d4c857ad19\") " pod="openstack/glance-default-internal-api-0" Dec 03 06:59:54 crc kubenswrapper[4475]: I1203 06:59:54.510415 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/2401beb9-38b8-4581-b9a2-8bb16e15e6c1-horizon-secret-key\") pod \"horizon-7fc4d79b88-s8hhg\" (UID: \"2401beb9-38b8-4581-b9a2-8bb16e15e6c1\") " pod="openstack/horizon-7fc4d79b88-s8hhg" Dec 03 06:59:54 crc kubenswrapper[4475]: I1203 06:59:54.510482 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2401beb9-38b8-4581-b9a2-8bb16e15e6c1-combined-ca-bundle\") pod \"horizon-7fc4d79b88-s8hhg\" (UID: \"2401beb9-38b8-4581-b9a2-8bb16e15e6c1\") " pod="openstack/horizon-7fc4d79b88-s8hhg" Dec 03 06:59:54 crc kubenswrapper[4475]: I1203 06:59:54.510505 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2401beb9-38b8-4581-b9a2-8bb16e15e6c1-logs\") pod \"horizon-7fc4d79b88-s8hhg\" (UID: \"2401beb9-38b8-4581-b9a2-8bb16e15e6c1\") " pod="openstack/horizon-7fc4d79b88-s8hhg" Dec 03 06:59:54 crc kubenswrapper[4475]: I1203 06:59:54.510528 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/2401beb9-38b8-4581-b9a2-8bb16e15e6c1-horizon-tls-certs\") pod \"horizon-7fc4d79b88-s8hhg\" (UID: \"2401beb9-38b8-4581-b9a2-8bb16e15e6c1\") " 
pod="openstack/horizon-7fc4d79b88-s8hhg" Dec 03 06:59:54 crc kubenswrapper[4475]: I1203 06:59:54.510571 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/2401beb9-38b8-4581-b9a2-8bb16e15e6c1-scripts\") pod \"horizon-7fc4d79b88-s8hhg\" (UID: \"2401beb9-38b8-4581-b9a2-8bb16e15e6c1\") " pod="openstack/horizon-7fc4d79b88-s8hhg" Dec 03 06:59:54 crc kubenswrapper[4475]: I1203 06:59:54.510596 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-248ht\" (UniqueName: \"kubernetes.io/projected/2401beb9-38b8-4581-b9a2-8bb16e15e6c1-kube-api-access-248ht\") pod \"horizon-7fc4d79b88-s8hhg\" (UID: \"2401beb9-38b8-4581-b9a2-8bb16e15e6c1\") " pod="openstack/horizon-7fc4d79b88-s8hhg" Dec 03 06:59:54 crc kubenswrapper[4475]: I1203 06:59:54.510640 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e87567b5-4d6e-4f8a-be68-25d4c857ad19-logs\") pod \"glance-default-internal-api-0\" (UID: \"e87567b5-4d6e-4f8a-be68-25d4c857ad19\") " pod="openstack/glance-default-internal-api-0" Dec 03 06:59:54 crc kubenswrapper[4475]: I1203 06:59:54.510669 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/e87567b5-4d6e-4f8a-be68-25d4c857ad19-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"e87567b5-4d6e-4f8a-be68-25d4c857ad19\") " pod="openstack/glance-default-internal-api-0" Dec 03 06:59:54 crc kubenswrapper[4475]: I1203 06:59:54.514071 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-v7p72\" (UniqueName: \"kubernetes.io/projected/e87567b5-4d6e-4f8a-be68-25d4c857ad19-kube-api-access-v7p72\") pod \"glance-default-internal-api-0\" (UID: \"e87567b5-4d6e-4f8a-be68-25d4c857ad19\") " pod="openstack/glance-default-internal-api-0" 
Dec 03 06:59:54 crc kubenswrapper[4475]: I1203 06:59:54.514136 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"glance-default-internal-api-0\" (UID: \"e87567b5-4d6e-4f8a-be68-25d4c857ad19\") " pod="openstack/glance-default-internal-api-0" Dec 03 06:59:54 crc kubenswrapper[4475]: I1203 06:59:54.514159 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e87567b5-4d6e-4f8a-be68-25d4c857ad19-scripts\") pod \"glance-default-internal-api-0\" (UID: \"e87567b5-4d6e-4f8a-be68-25d4c857ad19\") " pod="openstack/glance-default-internal-api-0" Dec 03 06:59:54 crc kubenswrapper[4475]: I1203 06:59:54.514218 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/2401beb9-38b8-4581-b9a2-8bb16e15e6c1-config-data\") pod \"horizon-7fc4d79b88-s8hhg\" (UID: \"2401beb9-38b8-4581-b9a2-8bb16e15e6c1\") " pod="openstack/horizon-7fc4d79b88-s8hhg" Dec 03 06:59:54 crc kubenswrapper[4475]: I1203 06:59:54.514253 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e87567b5-4d6e-4f8a-be68-25d4c857ad19-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"e87567b5-4d6e-4f8a-be68-25d4c857ad19\") " pod="openstack/glance-default-internal-api-0" Dec 03 06:59:54 crc kubenswrapper[4475]: I1203 06:59:54.526424 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e87567b5-4d6e-4f8a-be68-25d4c857ad19-logs\") pod \"glance-default-internal-api-0\" (UID: \"e87567b5-4d6e-4f8a-be68-25d4c857ad19\") " pod="openstack/glance-default-internal-api-0" Dec 03 06:59:54 crc kubenswrapper[4475]: I1203 06:59:54.526684 4475 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/e87567b5-4d6e-4f8a-be68-25d4c857ad19-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"e87567b5-4d6e-4f8a-be68-25d4c857ad19\") " pod="openstack/glance-default-internal-api-0" Dec 03 06:59:54 crc kubenswrapper[4475]: I1203 06:59:54.538950 4475 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"glance-default-internal-api-0\" (UID: \"e87567b5-4d6e-4f8a-be68-25d4c857ad19\") device mount path \"/mnt/openstack/pv09\"" pod="openstack/glance-default-internal-api-0" Dec 03 06:59:54 crc kubenswrapper[4475]: I1203 06:59:54.545650 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e87567b5-4d6e-4f8a-be68-25d4c857ad19-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"e87567b5-4d6e-4f8a-be68-25d4c857ad19\") " pod="openstack/glance-default-internal-api-0" Dec 03 06:59:54 crc kubenswrapper[4475]: I1203 06:59:54.568113 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-v7p72\" (UniqueName: \"kubernetes.io/projected/e87567b5-4d6e-4f8a-be68-25d4c857ad19-kube-api-access-v7p72\") pod \"glance-default-internal-api-0\" (UID: \"e87567b5-4d6e-4f8a-be68-25d4c857ad19\") " pod="openstack/glance-default-internal-api-0" Dec 03 06:59:54 crc kubenswrapper[4475]: I1203 06:59:54.609388 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/e87567b5-4d6e-4f8a-be68-25d4c857ad19-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"e87567b5-4d6e-4f8a-be68-25d4c857ad19\") " pod="openstack/glance-default-internal-api-0" Dec 03 06:59:54 crc kubenswrapper[4475]: I1203 06:59:54.614039 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"config-data\" (UniqueName: \"kubernetes.io/secret/e87567b5-4d6e-4f8a-be68-25d4c857ad19-config-data\") pod \"glance-default-internal-api-0\" (UID: \"e87567b5-4d6e-4f8a-be68-25d4c857ad19\") " pod="openstack/glance-default-internal-api-0" Dec 03 06:59:54 crc kubenswrapper[4475]: I1203 06:59:54.616666 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/2401beb9-38b8-4581-b9a2-8bb16e15e6c1-config-data\") pod \"horizon-7fc4d79b88-s8hhg\" (UID: \"2401beb9-38b8-4581-b9a2-8bb16e15e6c1\") " pod="openstack/horizon-7fc4d79b88-s8hhg" Dec 03 06:59:54 crc kubenswrapper[4475]: I1203 06:59:54.616860 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/2401beb9-38b8-4581-b9a2-8bb16e15e6c1-horizon-secret-key\") pod \"horizon-7fc4d79b88-s8hhg\" (UID: \"2401beb9-38b8-4581-b9a2-8bb16e15e6c1\") " pod="openstack/horizon-7fc4d79b88-s8hhg" Dec 03 06:59:54 crc kubenswrapper[4475]: I1203 06:59:54.616953 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2401beb9-38b8-4581-b9a2-8bb16e15e6c1-combined-ca-bundle\") pod \"horizon-7fc4d79b88-s8hhg\" (UID: \"2401beb9-38b8-4581-b9a2-8bb16e15e6c1\") " pod="openstack/horizon-7fc4d79b88-s8hhg" Dec 03 06:59:54 crc kubenswrapper[4475]: I1203 06:59:54.616982 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2401beb9-38b8-4581-b9a2-8bb16e15e6c1-logs\") pod \"horizon-7fc4d79b88-s8hhg\" (UID: \"2401beb9-38b8-4581-b9a2-8bb16e15e6c1\") " pod="openstack/horizon-7fc4d79b88-s8hhg" Dec 03 06:59:54 crc kubenswrapper[4475]: I1203 06:59:54.617013 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/2401beb9-38b8-4581-b9a2-8bb16e15e6c1-horizon-tls-certs\") pod \"horizon-7fc4d79b88-s8hhg\" (UID: \"2401beb9-38b8-4581-b9a2-8bb16e15e6c1\") " pod="openstack/horizon-7fc4d79b88-s8hhg" Dec 03 06:59:54 crc kubenswrapper[4475]: I1203 06:59:54.617088 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/2401beb9-38b8-4581-b9a2-8bb16e15e6c1-scripts\") pod \"horizon-7fc4d79b88-s8hhg\" (UID: \"2401beb9-38b8-4581-b9a2-8bb16e15e6c1\") " pod="openstack/horizon-7fc4d79b88-s8hhg" Dec 03 06:59:54 crc kubenswrapper[4475]: I1203 06:59:54.617110 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-248ht\" (UniqueName: \"kubernetes.io/projected/2401beb9-38b8-4581-b9a2-8bb16e15e6c1-kube-api-access-248ht\") pod \"horizon-7fc4d79b88-s8hhg\" (UID: \"2401beb9-38b8-4581-b9a2-8bb16e15e6c1\") " pod="openstack/horizon-7fc4d79b88-s8hhg" Dec 03 06:59:54 crc kubenswrapper[4475]: I1203 06:59:54.622631 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2401beb9-38b8-4581-b9a2-8bb16e15e6c1-logs\") pod \"horizon-7fc4d79b88-s8hhg\" (UID: \"2401beb9-38b8-4581-b9a2-8bb16e15e6c1\") " pod="openstack/horizon-7fc4d79b88-s8hhg" Dec 03 06:59:54 crc kubenswrapper[4475]: I1203 06:59:54.623131 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/2401beb9-38b8-4581-b9a2-8bb16e15e6c1-config-data\") pod \"horizon-7fc4d79b88-s8hhg\" (UID: \"2401beb9-38b8-4581-b9a2-8bb16e15e6c1\") " pod="openstack/horizon-7fc4d79b88-s8hhg" Dec 03 06:59:54 crc kubenswrapper[4475]: I1203 06:59:54.623994 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/2401beb9-38b8-4581-b9a2-8bb16e15e6c1-scripts\") pod \"horizon-7fc4d79b88-s8hhg\" (UID: 
\"2401beb9-38b8-4581-b9a2-8bb16e15e6c1\") " pod="openstack/horizon-7fc4d79b88-s8hhg" Dec 03 06:59:54 crc kubenswrapper[4475]: I1203 06:59:54.640424 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/2401beb9-38b8-4581-b9a2-8bb16e15e6c1-horizon-tls-certs\") pod \"horizon-7fc4d79b88-s8hhg\" (UID: \"2401beb9-38b8-4581-b9a2-8bb16e15e6c1\") " pod="openstack/horizon-7fc4d79b88-s8hhg" Dec 03 06:59:54 crc kubenswrapper[4475]: I1203 06:59:54.644510 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-248ht\" (UniqueName: \"kubernetes.io/projected/2401beb9-38b8-4581-b9a2-8bb16e15e6c1-kube-api-access-248ht\") pod \"horizon-7fc4d79b88-s8hhg\" (UID: \"2401beb9-38b8-4581-b9a2-8bb16e15e6c1\") " pod="openstack/horizon-7fc4d79b88-s8hhg" Dec 03 06:59:54 crc kubenswrapper[4475]: I1203 06:59:54.645417 4475 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-5db4d85d89-t96xs"] Dec 03 06:59:54 crc kubenswrapper[4475]: I1203 06:59:54.659089 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e87567b5-4d6e-4f8a-be68-25d4c857ad19-scripts\") pod \"glance-default-internal-api-0\" (UID: \"e87567b5-4d6e-4f8a-be68-25d4c857ad19\") " pod="openstack/glance-default-internal-api-0" Dec 03 06:59:54 crc kubenswrapper[4475]: I1203 06:59:54.667594 4475 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/horizon-695fd7c4bb-h85zh"] Dec 03 06:59:54 crc kubenswrapper[4475]: I1203 06:59:54.680000 4475 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-695fd7c4bb-h85zh" Dec 03 06:59:54 crc kubenswrapper[4475]: I1203 06:59:54.723031 4475 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-695fd7c4bb-h85zh"] Dec 03 06:59:54 crc kubenswrapper[4475]: I1203 06:59:54.741829 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/2401beb9-38b8-4581-b9a2-8bb16e15e6c1-horizon-secret-key\") pod \"horizon-7fc4d79b88-s8hhg\" (UID: \"2401beb9-38b8-4581-b9a2-8bb16e15e6c1\") " pod="openstack/horizon-7fc4d79b88-s8hhg" Dec 03 06:59:54 crc kubenswrapper[4475]: I1203 06:59:54.743776 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/1f10037d-d109-4863-b89f-c33f39b0848a-horizon-secret-key\") pod \"horizon-695fd7c4bb-h85zh\" (UID: \"1f10037d-d109-4863-b89f-c33f39b0848a\") " pod="openstack/horizon-695fd7c4bb-h85zh" Dec 03 06:59:54 crc kubenswrapper[4475]: I1203 06:59:54.743991 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1f10037d-d109-4863-b89f-c33f39b0848a-combined-ca-bundle\") pod \"horizon-695fd7c4bb-h85zh\" (UID: \"1f10037d-d109-4863-b89f-c33f39b0848a\") " pod="openstack/horizon-695fd7c4bb-h85zh" Dec 03 06:59:54 crc kubenswrapper[4475]: I1203 06:59:54.744117 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/1f10037d-d109-4863-b89f-c33f39b0848a-config-data\") pod \"horizon-695fd7c4bb-h85zh\" (UID: \"1f10037d-d109-4863-b89f-c33f39b0848a\") " pod="openstack/horizon-695fd7c4bb-h85zh" Dec 03 06:59:54 crc kubenswrapper[4475]: I1203 06:59:54.744153 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-tls-certs\" 
(UniqueName: \"kubernetes.io/secret/1f10037d-d109-4863-b89f-c33f39b0848a-horizon-tls-certs\") pod \"horizon-695fd7c4bb-h85zh\" (UID: \"1f10037d-d109-4863-b89f-c33f39b0848a\") " pod="openstack/horizon-695fd7c4bb-h85zh" Dec 03 06:59:54 crc kubenswrapper[4475]: I1203 06:59:54.744160 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2401beb9-38b8-4581-b9a2-8bb16e15e6c1-combined-ca-bundle\") pod \"horizon-7fc4d79b88-s8hhg\" (UID: \"2401beb9-38b8-4581-b9a2-8bb16e15e6c1\") " pod="openstack/horizon-7fc4d79b88-s8hhg" Dec 03 06:59:54 crc kubenswrapper[4475]: I1203 06:59:54.744197 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pjkgf\" (UniqueName: \"kubernetes.io/projected/1f10037d-d109-4863-b89f-c33f39b0848a-kube-api-access-pjkgf\") pod \"horizon-695fd7c4bb-h85zh\" (UID: \"1f10037d-d109-4863-b89f-c33f39b0848a\") " pod="openstack/horizon-695fd7c4bb-h85zh" Dec 03 06:59:54 crc kubenswrapper[4475]: I1203 06:59:54.744572 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1f10037d-d109-4863-b89f-c33f39b0848a-logs\") pod \"horizon-695fd7c4bb-h85zh\" (UID: \"1f10037d-d109-4863-b89f-c33f39b0848a\") " pod="openstack/horizon-695fd7c4bb-h85zh" Dec 03 06:59:54 crc kubenswrapper[4475]: I1203 06:59:54.747308 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/1f10037d-d109-4863-b89f-c33f39b0848a-scripts\") pod \"horizon-695fd7c4bb-h85zh\" (UID: \"1f10037d-d109-4863-b89f-c33f39b0848a\") " pod="openstack/horizon-695fd7c4bb-h85zh" Dec 03 06:59:54 crc kubenswrapper[4475]: I1203 06:59:54.757087 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage09-crc\" (UniqueName: 
\"kubernetes.io/local-volume/local-storage09-crc\") pod \"glance-default-internal-api-0\" (UID: \"e87567b5-4d6e-4f8a-be68-25d4c857ad19\") " pod="openstack/glance-default-internal-api-0" Dec 03 06:59:54 crc kubenswrapper[4475]: I1203 06:59:54.823785 4475 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-7fc4d79b88-s8hhg" Dec 03 06:59:54 crc kubenswrapper[4475]: I1203 06:59:54.841379 4475 generic.go:334] "Generic (PLEG): container finished" podID="69bddfea-c4e4-4776-a5e1-2150121b98a4" containerID="fda1c00350ac41bf075275986f1ff9d6e26e93a690741e896f741ab14e567728" exitCode=0 Dec 03 06:59:54 crc kubenswrapper[4475]: I1203 06:59:54.841403 4475 generic.go:334] "Generic (PLEG): container finished" podID="69bddfea-c4e4-4776-a5e1-2150121b98a4" containerID="11147d68d46c01522e43ee435bcfcdeaa53c9bf1705ddff27409d7f60fcf3505" exitCode=143 Dec 03 06:59:54 crc kubenswrapper[4475]: I1203 06:59:54.841461 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"69bddfea-c4e4-4776-a5e1-2150121b98a4","Type":"ContainerDied","Data":"fda1c00350ac41bf075275986f1ff9d6e26e93a690741e896f741ab14e567728"} Dec 03 06:59:54 crc kubenswrapper[4475]: I1203 06:59:54.841487 4475 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Dec 03 06:59:54 crc kubenswrapper[4475]: I1203 06:59:54.841510 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"69bddfea-c4e4-4776-a5e1-2150121b98a4","Type":"ContainerDied","Data":"11147d68d46c01522e43ee435bcfcdeaa53c9bf1705ddff27409d7f60fcf3505"} Dec 03 06:59:54 crc kubenswrapper[4475]: I1203 06:59:54.848029 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/1f10037d-d109-4863-b89f-c33f39b0848a-config-data\") pod \"horizon-695fd7c4bb-h85zh\" (UID: \"1f10037d-d109-4863-b89f-c33f39b0848a\") " pod="openstack/horizon-695fd7c4bb-h85zh" Dec 03 06:59:54 crc kubenswrapper[4475]: I1203 06:59:54.848057 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/1f10037d-d109-4863-b89f-c33f39b0848a-horizon-tls-certs\") pod \"horizon-695fd7c4bb-h85zh\" (UID: \"1f10037d-d109-4863-b89f-c33f39b0848a\") " pod="openstack/horizon-695fd7c4bb-h85zh" Dec 03 06:59:54 crc kubenswrapper[4475]: I1203 06:59:54.848076 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pjkgf\" (UniqueName: \"kubernetes.io/projected/1f10037d-d109-4863-b89f-c33f39b0848a-kube-api-access-pjkgf\") pod \"horizon-695fd7c4bb-h85zh\" (UID: \"1f10037d-d109-4863-b89f-c33f39b0848a\") " pod="openstack/horizon-695fd7c4bb-h85zh" Dec 03 06:59:54 crc kubenswrapper[4475]: I1203 06:59:54.848109 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1f10037d-d109-4863-b89f-c33f39b0848a-logs\") pod \"horizon-695fd7c4bb-h85zh\" (UID: \"1f10037d-d109-4863-b89f-c33f39b0848a\") " pod="openstack/horizon-695fd7c4bb-h85zh" Dec 03 06:59:54 crc kubenswrapper[4475]: I1203 06:59:54.848153 4475 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/1f10037d-d109-4863-b89f-c33f39b0848a-scripts\") pod \"horizon-695fd7c4bb-h85zh\" (UID: \"1f10037d-d109-4863-b89f-c33f39b0848a\") " pod="openstack/horizon-695fd7c4bb-h85zh" Dec 03 06:59:54 crc kubenswrapper[4475]: I1203 06:59:54.848172 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/1f10037d-d109-4863-b89f-c33f39b0848a-horizon-secret-key\") pod \"horizon-695fd7c4bb-h85zh\" (UID: \"1f10037d-d109-4863-b89f-c33f39b0848a\") " pod="openstack/horizon-695fd7c4bb-h85zh" Dec 03 06:59:54 crc kubenswrapper[4475]: I1203 06:59:54.848214 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1f10037d-d109-4863-b89f-c33f39b0848a-combined-ca-bundle\") pod \"horizon-695fd7c4bb-h85zh\" (UID: \"1f10037d-d109-4863-b89f-c33f39b0848a\") " pod="openstack/horizon-695fd7c4bb-h85zh" Dec 03 06:59:54 crc kubenswrapper[4475]: I1203 06:59:54.849051 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/1f10037d-d109-4863-b89f-c33f39b0848a-config-data\") pod \"horizon-695fd7c4bb-h85zh\" (UID: \"1f10037d-d109-4863-b89f-c33f39b0848a\") " pod="openstack/horizon-695fd7c4bb-h85zh" Dec 03 06:59:54 crc kubenswrapper[4475]: I1203 06:59:54.849079 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1f10037d-d109-4863-b89f-c33f39b0848a-logs\") pod \"horizon-695fd7c4bb-h85zh\" (UID: \"1f10037d-d109-4863-b89f-c33f39b0848a\") " pod="openstack/horizon-695fd7c4bb-h85zh" Dec 03 06:59:54 crc kubenswrapper[4475]: I1203 06:59:54.849421 4475 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Dec 03 06:59:54 crc kubenswrapper[4475]: I1203 06:59:54.849814 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/1f10037d-d109-4863-b89f-c33f39b0848a-scripts\") pod \"horizon-695fd7c4bb-h85zh\" (UID: \"1f10037d-d109-4863-b89f-c33f39b0848a\") " pod="openstack/horizon-695fd7c4bb-h85zh" Dec 03 06:59:54 crc kubenswrapper[4475]: I1203 06:59:54.852316 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/1f10037d-d109-4863-b89f-c33f39b0848a-horizon-secret-key\") pod \"horizon-695fd7c4bb-h85zh\" (UID: \"1f10037d-d109-4863-b89f-c33f39b0848a\") " pod="openstack/horizon-695fd7c4bb-h85zh" Dec 03 06:59:54 crc kubenswrapper[4475]: I1203 06:59:54.852842 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/1f10037d-d109-4863-b89f-c33f39b0848a-horizon-tls-certs\") pod \"horizon-695fd7c4bb-h85zh\" (UID: \"1f10037d-d109-4863-b89f-c33f39b0848a\") " pod="openstack/horizon-695fd7c4bb-h85zh" Dec 03 06:59:54 crc kubenswrapper[4475]: I1203 06:59:54.855110 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1f10037d-d109-4863-b89f-c33f39b0848a-combined-ca-bundle\") pod \"horizon-695fd7c4bb-h85zh\" (UID: \"1f10037d-d109-4863-b89f-c33f39b0848a\") " pod="openstack/horizon-695fd7c4bb-h85zh" Dec 03 06:59:54 crc kubenswrapper[4475]: I1203 06:59:54.864920 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pjkgf\" (UniqueName: \"kubernetes.io/projected/1f10037d-d109-4863-b89f-c33f39b0848a-kube-api-access-pjkgf\") pod \"horizon-695fd7c4bb-h85zh\" (UID: \"1f10037d-d109-4863-b89f-c33f39b0848a\") " pod="openstack/horizon-695fd7c4bb-h85zh" Dec 03 06:59:54 crc kubenswrapper[4475]: I1203 
06:59:54.949497 4475 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/e87567b5-4d6e-4f8a-be68-25d4c857ad19-httpd-run\") pod \"e87567b5-4d6e-4f8a-be68-25d4c857ad19\" (UID: \"e87567b5-4d6e-4f8a-be68-25d4c857ad19\") " Dec 03 06:59:54 crc kubenswrapper[4475]: I1203 06:59:54.949538 4475 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-v7p72\" (UniqueName: \"kubernetes.io/projected/e87567b5-4d6e-4f8a-be68-25d4c857ad19-kube-api-access-v7p72\") pod \"e87567b5-4d6e-4f8a-be68-25d4c857ad19\" (UID: \"e87567b5-4d6e-4f8a-be68-25d4c857ad19\") " Dec 03 06:59:54 crc kubenswrapper[4475]: I1203 06:59:54.949573 4475 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e87567b5-4d6e-4f8a-be68-25d4c857ad19-logs\") pod \"e87567b5-4d6e-4f8a-be68-25d4c857ad19\" (UID: \"e87567b5-4d6e-4f8a-be68-25d4c857ad19\") " Dec 03 06:59:54 crc kubenswrapper[4475]: I1203 06:59:54.949623 4475 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e87567b5-4d6e-4f8a-be68-25d4c857ad19-combined-ca-bundle\") pod \"e87567b5-4d6e-4f8a-be68-25d4c857ad19\" (UID: \"e87567b5-4d6e-4f8a-be68-25d4c857ad19\") " Dec 03 06:59:54 crc kubenswrapper[4475]: I1203 06:59:54.949667 4475 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/e87567b5-4d6e-4f8a-be68-25d4c857ad19-internal-tls-certs\") pod \"e87567b5-4d6e-4f8a-be68-25d4c857ad19\" (UID: \"e87567b5-4d6e-4f8a-be68-25d4c857ad19\") " Dec 03 06:59:54 crc kubenswrapper[4475]: I1203 06:59:54.949716 4475 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e87567b5-4d6e-4f8a-be68-25d4c857ad19-config-data\") pod 
\"e87567b5-4d6e-4f8a-be68-25d4c857ad19\" (UID: \"e87567b5-4d6e-4f8a-be68-25d4c857ad19\") " Dec 03 06:59:54 crc kubenswrapper[4475]: I1203 06:59:54.949834 4475 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e87567b5-4d6e-4f8a-be68-25d4c857ad19-scripts\") pod \"e87567b5-4d6e-4f8a-be68-25d4c857ad19\" (UID: \"e87567b5-4d6e-4f8a-be68-25d4c857ad19\") " Dec 03 06:59:54 crc kubenswrapper[4475]: I1203 06:59:54.949902 4475 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"e87567b5-4d6e-4f8a-be68-25d4c857ad19\" (UID: \"e87567b5-4d6e-4f8a-be68-25d4c857ad19\") " Dec 03 06:59:54 crc kubenswrapper[4475]: I1203 06:59:54.950589 4475 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e87567b5-4d6e-4f8a-be68-25d4c857ad19-logs" (OuterVolumeSpecName: "logs") pod "e87567b5-4d6e-4f8a-be68-25d4c857ad19" (UID: "e87567b5-4d6e-4f8a-be68-25d4c857ad19"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 06:59:54 crc kubenswrapper[4475]: I1203 06:59:54.951137 4475 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e87567b5-4d6e-4f8a-be68-25d4c857ad19-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "e87567b5-4d6e-4f8a-be68-25d4c857ad19" (UID: "e87567b5-4d6e-4f8a-be68-25d4c857ad19"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 06:59:54 crc kubenswrapper[4475]: I1203 06:59:54.956563 4475 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e87567b5-4d6e-4f8a-be68-25d4c857ad19-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "e87567b5-4d6e-4f8a-be68-25d4c857ad19" (UID: "e87567b5-4d6e-4f8a-be68-25d4c857ad19"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 06:59:54 crc kubenswrapper[4475]: I1203 06:59:54.957401 4475 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e87567b5-4d6e-4f8a-be68-25d4c857ad19-config-data" (OuterVolumeSpecName: "config-data") pod "e87567b5-4d6e-4f8a-be68-25d4c857ad19" (UID: "e87567b5-4d6e-4f8a-be68-25d4c857ad19"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 06:59:54 crc kubenswrapper[4475]: I1203 06:59:54.958979 4475 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e87567b5-4d6e-4f8a-be68-25d4c857ad19-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "e87567b5-4d6e-4f8a-be68-25d4c857ad19" (UID: "e87567b5-4d6e-4f8a-be68-25d4c857ad19"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 06:59:54 crc kubenswrapper[4475]: I1203 06:59:54.958923 4475 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e87567b5-4d6e-4f8a-be68-25d4c857ad19-kube-api-access-v7p72" (OuterVolumeSpecName: "kube-api-access-v7p72") pod "e87567b5-4d6e-4f8a-be68-25d4c857ad19" (UID: "e87567b5-4d6e-4f8a-be68-25d4c857ad19"). InnerVolumeSpecName "kube-api-access-v7p72". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 06:59:54 crc kubenswrapper[4475]: I1203 06:59:54.960703 4475 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e87567b5-4d6e-4f8a-be68-25d4c857ad19-scripts" (OuterVolumeSpecName: "scripts") pod "e87567b5-4d6e-4f8a-be68-25d4c857ad19" (UID: "e87567b5-4d6e-4f8a-be68-25d4c857ad19"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 06:59:54 crc kubenswrapper[4475]: I1203 06:59:54.981604 4475 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage09-crc" (OuterVolumeSpecName: "glance") pod "e87567b5-4d6e-4f8a-be68-25d4c857ad19" (UID: "e87567b5-4d6e-4f8a-be68-25d4c857ad19"). InnerVolumeSpecName "local-storage09-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Dec 03 06:59:55 crc kubenswrapper[4475]: I1203 06:59:55.032695 4475 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-695fd7c4bb-h85zh" Dec 03 06:59:55 crc kubenswrapper[4475]: I1203 06:59:55.051581 4475 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e87567b5-4d6e-4f8a-be68-25d4c857ad19-scripts\") on node \"crc\" DevicePath \"\"" Dec 03 06:59:55 crc kubenswrapper[4475]: I1203 06:59:55.051635 4475 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") on node \"crc\" " Dec 03 06:59:55 crc kubenswrapper[4475]: I1203 06:59:55.051650 4475 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/e87567b5-4d6e-4f8a-be68-25d4c857ad19-httpd-run\") on node \"crc\" DevicePath \"\"" Dec 03 06:59:55 crc kubenswrapper[4475]: I1203 06:59:55.051662 4475 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-v7p72\" (UniqueName: \"kubernetes.io/projected/e87567b5-4d6e-4f8a-be68-25d4c857ad19-kube-api-access-v7p72\") on node \"crc\" DevicePath \"\"" Dec 03 06:59:55 crc kubenswrapper[4475]: I1203 06:59:55.051674 4475 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e87567b5-4d6e-4f8a-be68-25d4c857ad19-logs\") on node \"crc\" DevicePath \"\"" Dec 03 06:59:55 crc kubenswrapper[4475]: I1203 
06:59:55.051683 4475 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e87567b5-4d6e-4f8a-be68-25d4c857ad19-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 03 06:59:55 crc kubenswrapper[4475]: I1203 06:59:55.051692 4475 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/e87567b5-4d6e-4f8a-be68-25d4c857ad19-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Dec 03 06:59:55 crc kubenswrapper[4475]: I1203 06:59:55.051701 4475 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e87567b5-4d6e-4f8a-be68-25d4c857ad19-config-data\") on node \"crc\" DevicePath \"\"" Dec 03 06:59:55 crc kubenswrapper[4475]: I1203 06:59:55.069794 4475 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage09-crc" (UniqueName: "kubernetes.io/local-volume/local-storage09-crc") on node "crc" Dec 03 06:59:55 crc kubenswrapper[4475]: I1203 06:59:55.153222 4475 reconciler_common.go:293] "Volume detached for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") on node \"crc\" DevicePath \"\"" Dec 03 06:59:55 crc kubenswrapper[4475]: I1203 06:59:55.507864 4475 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bd5f5f2c-acd2-49b4-9b5c-5e35d57ed85c" path="/var/lib/kubelet/pods/bd5f5f2c-acd2-49b4-9b5c-5e35d57ed85c/volumes" Dec 03 06:59:55 crc kubenswrapper[4475]: I1203 06:59:55.586344 4475 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-bk4n9"] Dec 03 06:59:55 crc kubenswrapper[4475]: I1203 06:59:55.592890 4475 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-bk4n9"] Dec 03 06:59:55 crc kubenswrapper[4475]: I1203 06:59:55.593071 4475 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-bk4n9" Dec 03 06:59:55 crc kubenswrapper[4475]: I1203 06:59:55.673871 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/cd709367-66b2-4586-b3ba-424d4c1532ee-utilities\") pod \"redhat-operators-bk4n9\" (UID: \"cd709367-66b2-4586-b3ba-424d4c1532ee\") " pod="openshift-marketplace/redhat-operators-bk4n9" Dec 03 06:59:55 crc kubenswrapper[4475]: I1203 06:59:55.674198 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/cd709367-66b2-4586-b3ba-424d4c1532ee-catalog-content\") pod \"redhat-operators-bk4n9\" (UID: \"cd709367-66b2-4586-b3ba-424d4c1532ee\") " pod="openshift-marketplace/redhat-operators-bk4n9" Dec 03 06:59:55 crc kubenswrapper[4475]: I1203 06:59:55.674396 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nr2x4\" (UniqueName: \"kubernetes.io/projected/cd709367-66b2-4586-b3ba-424d4c1532ee-kube-api-access-nr2x4\") pod \"redhat-operators-bk4n9\" (UID: \"cd709367-66b2-4586-b3ba-424d4c1532ee\") " pod="openshift-marketplace/redhat-operators-bk4n9" Dec 03 06:59:55 crc kubenswrapper[4475]: I1203 06:59:55.777324 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/cd709367-66b2-4586-b3ba-424d4c1532ee-utilities\") pod \"redhat-operators-bk4n9\" (UID: \"cd709367-66b2-4586-b3ba-424d4c1532ee\") " pod="openshift-marketplace/redhat-operators-bk4n9" Dec 03 06:59:55 crc kubenswrapper[4475]: I1203 06:59:55.777613 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/cd709367-66b2-4586-b3ba-424d4c1532ee-catalog-content\") pod \"redhat-operators-bk4n9\" (UID: 
\"cd709367-66b2-4586-b3ba-424d4c1532ee\") " pod="openshift-marketplace/redhat-operators-bk4n9" Dec 03 06:59:55 crc kubenswrapper[4475]: I1203 06:59:55.777729 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nr2x4\" (UniqueName: \"kubernetes.io/projected/cd709367-66b2-4586-b3ba-424d4c1532ee-kube-api-access-nr2x4\") pod \"redhat-operators-bk4n9\" (UID: \"cd709367-66b2-4586-b3ba-424d4c1532ee\") " pod="openshift-marketplace/redhat-operators-bk4n9" Dec 03 06:59:55 crc kubenswrapper[4475]: I1203 06:59:55.778398 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/cd709367-66b2-4586-b3ba-424d4c1532ee-utilities\") pod \"redhat-operators-bk4n9\" (UID: \"cd709367-66b2-4586-b3ba-424d4c1532ee\") " pod="openshift-marketplace/redhat-operators-bk4n9" Dec 03 06:59:55 crc kubenswrapper[4475]: I1203 06:59:55.778444 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/cd709367-66b2-4586-b3ba-424d4c1532ee-catalog-content\") pod \"redhat-operators-bk4n9\" (UID: \"cd709367-66b2-4586-b3ba-424d4c1532ee\") " pod="openshift-marketplace/redhat-operators-bk4n9" Dec 03 06:59:55 crc kubenswrapper[4475]: I1203 06:59:55.795021 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nr2x4\" (UniqueName: \"kubernetes.io/projected/cd709367-66b2-4586-b3ba-424d4c1532ee-kube-api-access-nr2x4\") pod \"redhat-operators-bk4n9\" (UID: \"cd709367-66b2-4586-b3ba-424d4c1532ee\") " pod="openshift-marketplace/redhat-operators-bk4n9" Dec 03 06:59:55 crc kubenswrapper[4475]: I1203 06:59:55.849973 4475 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Dec 03 06:59:55 crc kubenswrapper[4475]: I1203 06:59:55.896026 4475 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Dec 03 06:59:55 crc kubenswrapper[4475]: I1203 06:59:55.904818 4475 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-internal-api-0"] Dec 03 06:59:55 crc kubenswrapper[4475]: I1203 06:59:55.919919 4475 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-bk4n9" Dec 03 06:59:55 crc kubenswrapper[4475]: I1203 06:59:55.923345 4475 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-internal-api-0"] Dec 03 06:59:55 crc kubenswrapper[4475]: I1203 06:59:55.925529 4475 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Dec 03 06:59:55 crc kubenswrapper[4475]: I1203 06:59:55.927995 4475 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-internal-config-data" Dec 03 06:59:55 crc kubenswrapper[4475]: I1203 06:59:55.928128 4475 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-internal-svc" Dec 03 06:59:55 crc kubenswrapper[4475]: I1203 06:59:55.960589 4475 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Dec 03 06:59:56 crc kubenswrapper[4475]: I1203 06:59:56.094674 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2604ecb1-3054-4b63-9a8d-3880ee519c58-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"2604ecb1-3054-4b63-9a8d-3880ee519c58\") " pod="openstack/glance-default-internal-api-0" Dec 03 06:59:56 crc kubenswrapper[4475]: I1203 06:59:56.094738 4475 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2604ecb1-3054-4b63-9a8d-3880ee519c58-config-data\") pod \"glance-default-internal-api-0\" (UID: \"2604ecb1-3054-4b63-9a8d-3880ee519c58\") " pod="openstack/glance-default-internal-api-0" Dec 03 06:59:56 crc kubenswrapper[4475]: I1203 06:59:56.094816 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2604ecb1-3054-4b63-9a8d-3880ee519c58-scripts\") pod \"glance-default-internal-api-0\" (UID: \"2604ecb1-3054-4b63-9a8d-3880ee519c58\") " pod="openstack/glance-default-internal-api-0" Dec 03 06:59:56 crc kubenswrapper[4475]: I1203 06:59:56.094887 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2604ecb1-3054-4b63-9a8d-3880ee519c58-logs\") pod \"glance-default-internal-api-0\" (UID: \"2604ecb1-3054-4b63-9a8d-3880ee519c58\") " pod="openstack/glance-default-internal-api-0" Dec 03 06:59:56 crc kubenswrapper[4475]: I1203 06:59:56.094974 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/2604ecb1-3054-4b63-9a8d-3880ee519c58-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"2604ecb1-3054-4b63-9a8d-3880ee519c58\") " pod="openstack/glance-default-internal-api-0" Dec 03 06:59:56 crc kubenswrapper[4475]: I1203 06:59:56.095059 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nxnd5\" (UniqueName: \"kubernetes.io/projected/2604ecb1-3054-4b63-9a8d-3880ee519c58-kube-api-access-nxnd5\") pod \"glance-default-internal-api-0\" (UID: \"2604ecb1-3054-4b63-9a8d-3880ee519c58\") " pod="openstack/glance-default-internal-api-0" Dec 03 06:59:56 crc kubenswrapper[4475]: I1203 06:59:56.095093 4475 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/2604ecb1-3054-4b63-9a8d-3880ee519c58-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"2604ecb1-3054-4b63-9a8d-3880ee519c58\") " pod="openstack/glance-default-internal-api-0" Dec 03 06:59:56 crc kubenswrapper[4475]: I1203 06:59:56.095126 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"glance-default-internal-api-0\" (UID: \"2604ecb1-3054-4b63-9a8d-3880ee519c58\") " pod="openstack/glance-default-internal-api-0" Dec 03 06:59:56 crc kubenswrapper[4475]: I1203 06:59:56.196863 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/2604ecb1-3054-4b63-9a8d-3880ee519c58-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"2604ecb1-3054-4b63-9a8d-3880ee519c58\") " pod="openstack/glance-default-internal-api-0" Dec 03 06:59:56 crc kubenswrapper[4475]: I1203 06:59:56.196958 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nxnd5\" (UniqueName: \"kubernetes.io/projected/2604ecb1-3054-4b63-9a8d-3880ee519c58-kube-api-access-nxnd5\") pod \"glance-default-internal-api-0\" (UID: \"2604ecb1-3054-4b63-9a8d-3880ee519c58\") " pod="openstack/glance-default-internal-api-0" Dec 03 06:59:56 crc kubenswrapper[4475]: I1203 06:59:56.196985 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/2604ecb1-3054-4b63-9a8d-3880ee519c58-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"2604ecb1-3054-4b63-9a8d-3880ee519c58\") " pod="openstack/glance-default-internal-api-0" Dec 03 06:59:56 crc kubenswrapper[4475]: I1203 06:59:56.197025 4475 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"glance-default-internal-api-0\" (UID: \"2604ecb1-3054-4b63-9a8d-3880ee519c58\") " pod="openstack/glance-default-internal-api-0" Dec 03 06:59:56 crc kubenswrapper[4475]: I1203 06:59:56.197043 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2604ecb1-3054-4b63-9a8d-3880ee519c58-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"2604ecb1-3054-4b63-9a8d-3880ee519c58\") " pod="openstack/glance-default-internal-api-0" Dec 03 06:59:56 crc kubenswrapper[4475]: I1203 06:59:56.197066 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2604ecb1-3054-4b63-9a8d-3880ee519c58-config-data\") pod \"glance-default-internal-api-0\" (UID: \"2604ecb1-3054-4b63-9a8d-3880ee519c58\") " pod="openstack/glance-default-internal-api-0" Dec 03 06:59:56 crc kubenswrapper[4475]: I1203 06:59:56.197116 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2604ecb1-3054-4b63-9a8d-3880ee519c58-scripts\") pod \"glance-default-internal-api-0\" (UID: \"2604ecb1-3054-4b63-9a8d-3880ee519c58\") " pod="openstack/glance-default-internal-api-0" Dec 03 06:59:56 crc kubenswrapper[4475]: I1203 06:59:56.197195 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2604ecb1-3054-4b63-9a8d-3880ee519c58-logs\") pod \"glance-default-internal-api-0\" (UID: \"2604ecb1-3054-4b63-9a8d-3880ee519c58\") " pod="openstack/glance-default-internal-api-0" Dec 03 06:59:56 crc kubenswrapper[4475]: I1203 06:59:56.197824 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: 
\"kubernetes.io/empty-dir/2604ecb1-3054-4b63-9a8d-3880ee519c58-logs\") pod \"glance-default-internal-api-0\" (UID: \"2604ecb1-3054-4b63-9a8d-3880ee519c58\") " pod="openstack/glance-default-internal-api-0" Dec 03 06:59:56 crc kubenswrapper[4475]: I1203 06:59:56.198116 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/2604ecb1-3054-4b63-9a8d-3880ee519c58-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"2604ecb1-3054-4b63-9a8d-3880ee519c58\") " pod="openstack/glance-default-internal-api-0" Dec 03 06:59:56 crc kubenswrapper[4475]: I1203 06:59:56.199430 4475 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"glance-default-internal-api-0\" (UID: \"2604ecb1-3054-4b63-9a8d-3880ee519c58\") device mount path \"/mnt/openstack/pv09\"" pod="openstack/glance-default-internal-api-0" Dec 03 06:59:56 crc kubenswrapper[4475]: I1203 06:59:56.206959 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2604ecb1-3054-4b63-9a8d-3880ee519c58-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"2604ecb1-3054-4b63-9a8d-3880ee519c58\") " pod="openstack/glance-default-internal-api-0" Dec 03 06:59:56 crc kubenswrapper[4475]: I1203 06:59:56.207396 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/2604ecb1-3054-4b63-9a8d-3880ee519c58-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"2604ecb1-3054-4b63-9a8d-3880ee519c58\") " pod="openstack/glance-default-internal-api-0" Dec 03 06:59:56 crc kubenswrapper[4475]: I1203 06:59:56.212180 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2604ecb1-3054-4b63-9a8d-3880ee519c58-scripts\") pod 
\"glance-default-internal-api-0\" (UID: \"2604ecb1-3054-4b63-9a8d-3880ee519c58\") " pod="openstack/glance-default-internal-api-0" Dec 03 06:59:56 crc kubenswrapper[4475]: I1203 06:59:56.212899 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nxnd5\" (UniqueName: \"kubernetes.io/projected/2604ecb1-3054-4b63-9a8d-3880ee519c58-kube-api-access-nxnd5\") pod \"glance-default-internal-api-0\" (UID: \"2604ecb1-3054-4b63-9a8d-3880ee519c58\") " pod="openstack/glance-default-internal-api-0" Dec 03 06:59:56 crc kubenswrapper[4475]: I1203 06:59:56.224394 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2604ecb1-3054-4b63-9a8d-3880ee519c58-config-data\") pod \"glance-default-internal-api-0\" (UID: \"2604ecb1-3054-4b63-9a8d-3880ee519c58\") " pod="openstack/glance-default-internal-api-0" Dec 03 06:59:56 crc kubenswrapper[4475]: I1203 06:59:56.231651 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"glance-default-internal-api-0\" (UID: \"2604ecb1-3054-4b63-9a8d-3880ee519c58\") " pod="openstack/glance-default-internal-api-0" Dec 03 06:59:56 crc kubenswrapper[4475]: I1203 06:59:56.261405 4475 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Dec 03 06:59:56 crc kubenswrapper[4475]: I1203 06:59:56.895293 4475 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-57dbb9b85f-9wjp5" Dec 03 06:59:56 crc kubenswrapper[4475]: I1203 06:59:56.968833 4475 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6476ddd6b5-ppjlc"] Dec 03 06:59:56 crc kubenswrapper[4475]: I1203 06:59:56.969038 4475 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-6476ddd6b5-ppjlc" podUID="87fdba6a-e11c-4bbd-becc-78999065efa8" containerName="dnsmasq-dns" containerID="cri-o://c9d9f24e460f40acbdabd6a1c9925db16f56c53012d32344327cd5e2a4943c14" gracePeriod=10 Dec 03 06:59:57 crc kubenswrapper[4475]: I1203 06:59:57.498841 4475 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e87567b5-4d6e-4f8a-be68-25d4c857ad19" path="/var/lib/kubelet/pods/e87567b5-4d6e-4f8a-be68-25d4c857ad19/volumes" Dec 03 06:59:57 crc kubenswrapper[4475]: I1203 06:59:57.871251 4475 generic.go:334] "Generic (PLEG): container finished" podID="87fdba6a-e11c-4bbd-becc-78999065efa8" containerID="c9d9f24e460f40acbdabd6a1c9925db16f56c53012d32344327cd5e2a4943c14" exitCode=0 Dec 03 06:59:57 crc kubenswrapper[4475]: I1203 06:59:57.871285 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6476ddd6b5-ppjlc" event={"ID":"87fdba6a-e11c-4bbd-becc-78999065efa8","Type":"ContainerDied","Data":"c9d9f24e460f40acbdabd6a1c9925db16f56c53012d32344327cd5e2a4943c14"} Dec 03 06:59:59 crc kubenswrapper[4475]: I1203 06:59:59.885893 4475 generic.go:334] "Generic (PLEG): container finished" podID="26e8d9b9-9ab9-428f-9b0c-78a50bd71e7a" containerID="f252b0908f155e49923598575a58960b2eb53adf2a97b6290a841f2180c1da50" exitCode=0 Dec 03 06:59:59 crc kubenswrapper[4475]: I1203 06:59:59.886058 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/neutron-db-sync-hk2ck" event={"ID":"26e8d9b9-9ab9-428f-9b0c-78a50bd71e7a","Type":"ContainerDied","Data":"f252b0908f155e49923598575a58960b2eb53adf2a97b6290a841f2180c1da50"} Dec 03 07:00:00 crc kubenswrapper[4475]: I1203 07:00:00.143919 4475 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29412420-2tfcl"] Dec 03 07:00:00 crc kubenswrapper[4475]: I1203 07:00:00.144983 4475 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29412420-2tfcl" Dec 03 07:00:00 crc kubenswrapper[4475]: I1203 07:00:00.146975 4475 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Dec 03 07:00:00 crc kubenswrapper[4475]: I1203 07:00:00.148594 4475 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Dec 03 07:00:00 crc kubenswrapper[4475]: I1203 07:00:00.162610 4475 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29412420-2tfcl"] Dec 03 07:00:00 crc kubenswrapper[4475]: I1203 07:00:00.179218 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/9de8472e-1891-4e1c-8db7-fb458c212969-config-volume\") pod \"collect-profiles-29412420-2tfcl\" (UID: \"9de8472e-1891-4e1c-8db7-fb458c212969\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29412420-2tfcl" Dec 03 07:00:00 crc kubenswrapper[4475]: I1203 07:00:00.179263 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/9de8472e-1891-4e1c-8db7-fb458c212969-secret-volume\") pod \"collect-profiles-29412420-2tfcl\" (UID: \"9de8472e-1891-4e1c-8db7-fb458c212969\") " 
pod="openshift-operator-lifecycle-manager/collect-profiles-29412420-2tfcl" Dec 03 07:00:00 crc kubenswrapper[4475]: I1203 07:00:00.179299 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9cg2h\" (UniqueName: \"kubernetes.io/projected/9de8472e-1891-4e1c-8db7-fb458c212969-kube-api-access-9cg2h\") pod \"collect-profiles-29412420-2tfcl\" (UID: \"9de8472e-1891-4e1c-8db7-fb458c212969\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29412420-2tfcl" Dec 03 07:00:00 crc kubenswrapper[4475]: I1203 07:00:00.280430 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/9de8472e-1891-4e1c-8db7-fb458c212969-config-volume\") pod \"collect-profiles-29412420-2tfcl\" (UID: \"9de8472e-1891-4e1c-8db7-fb458c212969\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29412420-2tfcl" Dec 03 07:00:00 crc kubenswrapper[4475]: I1203 07:00:00.281215 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/9de8472e-1891-4e1c-8db7-fb458c212969-secret-volume\") pod \"collect-profiles-29412420-2tfcl\" (UID: \"9de8472e-1891-4e1c-8db7-fb458c212969\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29412420-2tfcl" Dec 03 07:00:00 crc kubenswrapper[4475]: I1203 07:00:00.281167 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/9de8472e-1891-4e1c-8db7-fb458c212969-config-volume\") pod \"collect-profiles-29412420-2tfcl\" (UID: \"9de8472e-1891-4e1c-8db7-fb458c212969\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29412420-2tfcl" Dec 03 07:00:00 crc kubenswrapper[4475]: I1203 07:00:00.281380 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9cg2h\" (UniqueName: 
\"kubernetes.io/projected/9de8472e-1891-4e1c-8db7-fb458c212969-kube-api-access-9cg2h\") pod \"collect-profiles-29412420-2tfcl\" (UID: \"9de8472e-1891-4e1c-8db7-fb458c212969\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29412420-2tfcl" Dec 03 07:00:00 crc kubenswrapper[4475]: I1203 07:00:00.296124 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9cg2h\" (UniqueName: \"kubernetes.io/projected/9de8472e-1891-4e1c-8db7-fb458c212969-kube-api-access-9cg2h\") pod \"collect-profiles-29412420-2tfcl\" (UID: \"9de8472e-1891-4e1c-8db7-fb458c212969\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29412420-2tfcl" Dec 03 07:00:00 crc kubenswrapper[4475]: I1203 07:00:00.302583 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/9de8472e-1891-4e1c-8db7-fb458c212969-secret-volume\") pod \"collect-profiles-29412420-2tfcl\" (UID: \"9de8472e-1891-4e1c-8db7-fb458c212969\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29412420-2tfcl" Dec 03 07:00:00 crc kubenswrapper[4475]: I1203 07:00:00.464105 4475 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29412420-2tfcl" Dec 03 07:00:00 crc kubenswrapper[4475]: I1203 07:00:00.847065 4475 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-6476ddd6b5-ppjlc" podUID="87fdba6a-e11c-4bbd-becc-78999065efa8" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.107:5353: connect: connection refused" Dec 03 07:00:03 crc kubenswrapper[4475]: I1203 07:00:03.921074 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-hk2ck" event={"ID":"26e8d9b9-9ab9-428f-9b0c-78a50bd71e7a","Type":"ContainerDied","Data":"29658f15194016e755a595f0a614f2026e9cbc8a90e70eab2d9e5709acb8b414"} Dec 03 07:00:03 crc kubenswrapper[4475]: I1203 07:00:03.921484 4475 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="29658f15194016e755a595f0a614f2026e9cbc8a90e70eab2d9e5709acb8b414" Dec 03 07:00:03 crc kubenswrapper[4475]: I1203 07:00:03.947208 4475 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-db-sync-hk2ck" Dec 03 07:00:04 crc kubenswrapper[4475]: I1203 07:00:04.038415 4475 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/26e8d9b9-9ab9-428f-9b0c-78a50bd71e7a-combined-ca-bundle\") pod \"26e8d9b9-9ab9-428f-9b0c-78a50bd71e7a\" (UID: \"26e8d9b9-9ab9-428f-9b0c-78a50bd71e7a\") " Dec 03 07:00:04 crc kubenswrapper[4475]: I1203 07:00:04.038555 4475 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/26e8d9b9-9ab9-428f-9b0c-78a50bd71e7a-config\") pod \"26e8d9b9-9ab9-428f-9b0c-78a50bd71e7a\" (UID: \"26e8d9b9-9ab9-428f-9b0c-78a50bd71e7a\") " Dec 03 07:00:04 crc kubenswrapper[4475]: I1203 07:00:04.038581 4475 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hs7h6\" (UniqueName: \"kubernetes.io/projected/26e8d9b9-9ab9-428f-9b0c-78a50bd71e7a-kube-api-access-hs7h6\") pod \"26e8d9b9-9ab9-428f-9b0c-78a50bd71e7a\" (UID: \"26e8d9b9-9ab9-428f-9b0c-78a50bd71e7a\") " Dec 03 07:00:04 crc kubenswrapper[4475]: I1203 07:00:04.053708 4475 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/26e8d9b9-9ab9-428f-9b0c-78a50bd71e7a-kube-api-access-hs7h6" (OuterVolumeSpecName: "kube-api-access-hs7h6") pod "26e8d9b9-9ab9-428f-9b0c-78a50bd71e7a" (UID: "26e8d9b9-9ab9-428f-9b0c-78a50bd71e7a"). InnerVolumeSpecName "kube-api-access-hs7h6". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 07:00:04 crc kubenswrapper[4475]: I1203 07:00:04.084209 4475 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/26e8d9b9-9ab9-428f-9b0c-78a50bd71e7a-config" (OuterVolumeSpecName: "config") pod "26e8d9b9-9ab9-428f-9b0c-78a50bd71e7a" (UID: "26e8d9b9-9ab9-428f-9b0c-78a50bd71e7a"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 07:00:04 crc kubenswrapper[4475]: I1203 07:00:04.087616 4475 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/26e8d9b9-9ab9-428f-9b0c-78a50bd71e7a-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "26e8d9b9-9ab9-428f-9b0c-78a50bd71e7a" (UID: "26e8d9b9-9ab9-428f-9b0c-78a50bd71e7a"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 07:00:04 crc kubenswrapper[4475]: I1203 07:00:04.140496 4475 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/26e8d9b9-9ab9-428f-9b0c-78a50bd71e7a-config\") on node \"crc\" DevicePath \"\"" Dec 03 07:00:04 crc kubenswrapper[4475]: I1203 07:00:04.140531 4475 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hs7h6\" (UniqueName: \"kubernetes.io/projected/26e8d9b9-9ab9-428f-9b0c-78a50bd71e7a-kube-api-access-hs7h6\") on node \"crc\" DevicePath \"\"" Dec 03 07:00:04 crc kubenswrapper[4475]: I1203 07:00:04.140547 4475 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/26e8d9b9-9ab9-428f-9b0c-78a50bd71e7a-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 03 07:00:04 crc kubenswrapper[4475]: E1203 07:00:04.269676 4475 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.rdoproject.org/podified-antelope-centos9/openstack-ceilometer-central:65066e8ca260a75886ae57f157049605" Dec 03 07:00:04 crc kubenswrapper[4475]: E1203 07:00:04.269753 4475 kuberuntime_image.go:55] "Failed to pull image" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.rdoproject.org/podified-antelope-centos9/openstack-ceilometer-central:65066e8ca260a75886ae57f157049605" Dec 03 07:00:04 crc kubenswrapper[4475]: E1203 07:00:04.269975 4475 
kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:ceilometer-central-agent,Image:quay.rdoproject.org/podified-antelope-centos9/openstack-ceilometer-central:65066e8ca260a75886ae57f157049605,Command:[/bin/bash],Args:[-c /usr/local/bin/kolla_start],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:n677h596h97h688h6bhfbh84hcdh54bh689h65fh565h5d9h5ddh5bhdhc8hb9h5f8h5ffh5c4h59dhd9h5c8h676h58ch58fh547h5dbhd8hf7h5f5q,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:scripts,ReadOnly:true,MountPath:/var/lib/openstack/bin,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/openstack/config,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/kolla/config_files/config.json,SubPath:ceilometer-central-config.json,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:combined-ca-bundle,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-kn5xv,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:&ExecAction{Command:[/usr/bin/python3 
/var/lib/openstack/bin/centralhealth.py],},HTTPGet:nil,TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:300,TimeoutSeconds:5,PeriodSeconds:5,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod ceilometer-0_openstack(12f47969-3169-43bc-8d07-cbd3952d81cf): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError"
Dec 03 07:00:04 crc kubenswrapper[4475]: I1203 07:00:04.927237 4475 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-sync-hk2ck"
Dec 03 07:00:05 crc kubenswrapper[4475]: I1203 07:00:05.128160 4475 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-6c7bf79b95-knv75"]
Dec 03 07:00:05 crc kubenswrapper[4475]: E1203 07:00:05.131834 4475 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="26e8d9b9-9ab9-428f-9b0c-78a50bd71e7a" containerName="neutron-db-sync"
Dec 03 07:00:05 crc kubenswrapper[4475]: I1203 07:00:05.131871 4475 state_mem.go:107] "Deleted CPUSet assignment" podUID="26e8d9b9-9ab9-428f-9b0c-78a50bd71e7a" containerName="neutron-db-sync"
Dec 03 07:00:05 crc kubenswrapper[4475]: I1203 07:00:05.132910 4475 memory_manager.go:354] "RemoveStaleState removing state" podUID="26e8d9b9-9ab9-428f-9b0c-78a50bd71e7a" containerName="neutron-db-sync"
Dec 03 07:00:05 crc kubenswrapper[4475]: I1203 07:00:05.137294 4475 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6c7bf79b95-knv75"
Dec 03 07:00:05 crc kubenswrapper[4475]: E1203 07:00:05.144551 4475 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod26e8d9b9_9ab9_428f_9b0c_78a50bd71e7a.slice/crio-29658f15194016e755a595f0a614f2026e9cbc8a90e70eab2d9e5709acb8b414\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod26e8d9b9_9ab9_428f_9b0c_78a50bd71e7a.slice\": RecentStats: unable to find data in memory cache]"
Dec 03 07:00:05 crc kubenswrapper[4475]: I1203 07:00:05.146063 4475 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-6c7bf79b95-knv75"]
Dec 03 07:00:05 crc kubenswrapper[4475]: I1203 07:00:05.161490 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/f1be585f-cb41-49eb-83b7-9c25c757c739-dns-swift-storage-0\") pod \"dnsmasq-dns-6c7bf79b95-knv75\" (UID: \"f1be585f-cb41-49eb-83b7-9c25c757c739\") " pod="openstack/dnsmasq-dns-6c7bf79b95-knv75"
Dec 03 07:00:05 crc kubenswrapper[4475]: I1203 07:00:05.161571 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nq7m2\" (UniqueName: \"kubernetes.io/projected/f1be585f-cb41-49eb-83b7-9c25c757c739-kube-api-access-nq7m2\") pod \"dnsmasq-dns-6c7bf79b95-knv75\" (UID: \"f1be585f-cb41-49eb-83b7-9c25c757c739\") " pod="openstack/dnsmasq-dns-6c7bf79b95-knv75"
Dec 03 07:00:05 crc kubenswrapper[4475]: I1203 07:00:05.161610 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f1be585f-cb41-49eb-83b7-9c25c757c739-config\") pod \"dnsmasq-dns-6c7bf79b95-knv75\" (UID: \"f1be585f-cb41-49eb-83b7-9c25c757c739\") " pod="openstack/dnsmasq-dns-6c7bf79b95-knv75"
Dec 03 07:00:05 crc kubenswrapper[4475]: I1203 07:00:05.161651 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/f1be585f-cb41-49eb-83b7-9c25c757c739-ovsdbserver-sb\") pod \"dnsmasq-dns-6c7bf79b95-knv75\" (UID: \"f1be585f-cb41-49eb-83b7-9c25c757c739\") " pod="openstack/dnsmasq-dns-6c7bf79b95-knv75"
Dec 03 07:00:05 crc kubenswrapper[4475]: I1203 07:00:05.161672 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/f1be585f-cb41-49eb-83b7-9c25c757c739-dns-svc\") pod \"dnsmasq-dns-6c7bf79b95-knv75\" (UID: \"f1be585f-cb41-49eb-83b7-9c25c757c739\") " pod="openstack/dnsmasq-dns-6c7bf79b95-knv75"
Dec 03 07:00:05 crc kubenswrapper[4475]: I1203 07:00:05.162222 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/f1be585f-cb41-49eb-83b7-9c25c757c739-ovsdbserver-nb\") pod \"dnsmasq-dns-6c7bf79b95-knv75\" (UID: \"f1be585f-cb41-49eb-83b7-9c25c757c739\") " pod="openstack/dnsmasq-dns-6c7bf79b95-knv75"
Dec 03 07:00:05 crc kubenswrapper[4475]: I1203 07:00:05.255957 4475 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-7546fbf47b-6g4gw"]
Dec 03 07:00:05 crc kubenswrapper[4475]: I1203 07:00:05.257267 4475 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-7546fbf47b-6g4gw"
Dec 03 07:00:05 crc kubenswrapper[4475]: I1203 07:00:05.259572 4475 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-config"
Dec 03 07:00:05 crc kubenswrapper[4475]: I1203 07:00:05.261719 4475 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-httpd-config"
Dec 03 07:00:05 crc kubenswrapper[4475]: I1203 07:00:05.261837 4475 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-neutron-ovndbs"
Dec 03 07:00:05 crc kubenswrapper[4475]: I1203 07:00:05.262009 4475 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-neutron-dockercfg-9fx2m"
Dec 03 07:00:05 crc kubenswrapper[4475]: I1203 07:00:05.264053 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/f1be585f-cb41-49eb-83b7-9c25c757c739-ovsdbserver-nb\") pod \"dnsmasq-dns-6c7bf79b95-knv75\" (UID: \"f1be585f-cb41-49eb-83b7-9c25c757c739\") " pod="openstack/dnsmasq-dns-6c7bf79b95-knv75"
Dec 03 07:00:05 crc kubenswrapper[4475]: I1203 07:00:05.264263 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/d620fdbb-7e93-46f8-95e1-18c9f9aab8f0-config\") pod \"neutron-7546fbf47b-6g4gw\" (UID: \"d620fdbb-7e93-46f8-95e1-18c9f9aab8f0\") " pod="openstack/neutron-7546fbf47b-6g4gw"
Dec 03 07:00:05 crc kubenswrapper[4475]: I1203 07:00:05.264409 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/f1be585f-cb41-49eb-83b7-9c25c757c739-dns-swift-storage-0\") pod \"dnsmasq-dns-6c7bf79b95-knv75\" (UID: \"f1be585f-cb41-49eb-83b7-9c25c757c739\") " pod="openstack/dnsmasq-dns-6c7bf79b95-knv75"
Dec 03 07:00:05 crc kubenswrapper[4475]: I1203 07:00:05.264537 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nq7m2\" (UniqueName: \"kubernetes.io/projected/f1be585f-cb41-49eb-83b7-9c25c757c739-kube-api-access-nq7m2\") pod \"dnsmasq-dns-6c7bf79b95-knv75\" (UID: \"f1be585f-cb41-49eb-83b7-9c25c757c739\") " pod="openstack/dnsmasq-dns-6c7bf79b95-knv75"
Dec 03 07:00:05 crc kubenswrapper[4475]: I1203 07:00:05.264639 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d620fdbb-7e93-46f8-95e1-18c9f9aab8f0-combined-ca-bundle\") pod \"neutron-7546fbf47b-6g4gw\" (UID: \"d620fdbb-7e93-46f8-95e1-18c9f9aab8f0\") " pod="openstack/neutron-7546fbf47b-6g4gw"
Dec 03 07:00:05 crc kubenswrapper[4475]: I1203 07:00:05.264707 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/d620fdbb-7e93-46f8-95e1-18c9f9aab8f0-ovndb-tls-certs\") pod \"neutron-7546fbf47b-6g4gw\" (UID: \"d620fdbb-7e93-46f8-95e1-18c9f9aab8f0\") " pod="openstack/neutron-7546fbf47b-6g4gw"
Dec 03 07:00:05 crc kubenswrapper[4475]: I1203 07:00:05.264784 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f1be585f-cb41-49eb-83b7-9c25c757c739-config\") pod \"dnsmasq-dns-6c7bf79b95-knv75\" (UID: \"f1be585f-cb41-49eb-83b7-9c25c757c739\") " pod="openstack/dnsmasq-dns-6c7bf79b95-knv75"
Dec 03 07:00:05 crc kubenswrapper[4475]: I1203 07:00:05.264851 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/d620fdbb-7e93-46f8-95e1-18c9f9aab8f0-httpd-config\") pod \"neutron-7546fbf47b-6g4gw\" (UID: \"d620fdbb-7e93-46f8-95e1-18c9f9aab8f0\") " pod="openstack/neutron-7546fbf47b-6g4gw"
Dec 03 07:00:05 crc kubenswrapper[4475]: I1203 07:00:05.264965 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z57rd\" (UniqueName: \"kubernetes.io/projected/d620fdbb-7e93-46f8-95e1-18c9f9aab8f0-kube-api-access-z57rd\") pod \"neutron-7546fbf47b-6g4gw\" (UID: \"d620fdbb-7e93-46f8-95e1-18c9f9aab8f0\") " pod="openstack/neutron-7546fbf47b-6g4gw"
Dec 03 07:00:05 crc kubenswrapper[4475]: I1203 07:00:05.265055 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/f1be585f-cb41-49eb-83b7-9c25c757c739-ovsdbserver-sb\") pod \"dnsmasq-dns-6c7bf79b95-knv75\" (UID: \"f1be585f-cb41-49eb-83b7-9c25c757c739\") " pod="openstack/dnsmasq-dns-6c7bf79b95-knv75"
Dec 03 07:00:05 crc kubenswrapper[4475]: I1203 07:00:05.265142 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/f1be585f-cb41-49eb-83b7-9c25c757c739-dns-svc\") pod \"dnsmasq-dns-6c7bf79b95-knv75\" (UID: \"f1be585f-cb41-49eb-83b7-9c25c757c739\") " pod="openstack/dnsmasq-dns-6c7bf79b95-knv75"
Dec 03 07:00:05 crc kubenswrapper[4475]: I1203 07:00:05.265816 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/f1be585f-cb41-49eb-83b7-9c25c757c739-ovsdbserver-nb\") pod \"dnsmasq-dns-6c7bf79b95-knv75\" (UID: \"f1be585f-cb41-49eb-83b7-9c25c757c739\") " pod="openstack/dnsmasq-dns-6c7bf79b95-knv75"
Dec 03 07:00:05 crc kubenswrapper[4475]: I1203 07:00:05.265839 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f1be585f-cb41-49eb-83b7-9c25c757c739-config\") pod \"dnsmasq-dns-6c7bf79b95-knv75\" (UID: \"f1be585f-cb41-49eb-83b7-9c25c757c739\") " pod="openstack/dnsmasq-dns-6c7bf79b95-knv75"
Dec 03 07:00:05 crc kubenswrapper[4475]: I1203 07:00:05.266247 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/f1be585f-cb41-49eb-83b7-9c25c757c739-dns-svc\") pod \"dnsmasq-dns-6c7bf79b95-knv75\" (UID: \"f1be585f-cb41-49eb-83b7-9c25c757c739\") " pod="openstack/dnsmasq-dns-6c7bf79b95-knv75"
Dec 03 07:00:05 crc kubenswrapper[4475]: I1203 07:00:05.270383 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/f1be585f-cb41-49eb-83b7-9c25c757c739-ovsdbserver-sb\") pod \"dnsmasq-dns-6c7bf79b95-knv75\" (UID: \"f1be585f-cb41-49eb-83b7-9c25c757c739\") " pod="openstack/dnsmasq-dns-6c7bf79b95-knv75"
Dec 03 07:00:05 crc kubenswrapper[4475]: I1203 07:00:05.270857 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/f1be585f-cb41-49eb-83b7-9c25c757c739-dns-swift-storage-0\") pod \"dnsmasq-dns-6c7bf79b95-knv75\" (UID: \"f1be585f-cb41-49eb-83b7-9c25c757c739\") " pod="openstack/dnsmasq-dns-6c7bf79b95-knv75"
Dec 03 07:00:05 crc kubenswrapper[4475]: I1203 07:00:05.276514 4475 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-7546fbf47b-6g4gw"]
Dec 03 07:00:05 crc kubenswrapper[4475]: I1203 07:00:05.290462 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nq7m2\" (UniqueName: \"kubernetes.io/projected/f1be585f-cb41-49eb-83b7-9c25c757c739-kube-api-access-nq7m2\") pod \"dnsmasq-dns-6c7bf79b95-knv75\" (UID: \"f1be585f-cb41-49eb-83b7-9c25c757c739\") " pod="openstack/dnsmasq-dns-6c7bf79b95-knv75"
Dec 03 07:00:05 crc kubenswrapper[4475]: I1203 07:00:05.366241 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/d620fdbb-7e93-46f8-95e1-18c9f9aab8f0-config\") pod \"neutron-7546fbf47b-6g4gw\" (UID: \"d620fdbb-7e93-46f8-95e1-18c9f9aab8f0\") " pod="openstack/neutron-7546fbf47b-6g4gw"
Dec 03 07:00:05 crc kubenswrapper[4475]: I1203 07:00:05.366322 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d620fdbb-7e93-46f8-95e1-18c9f9aab8f0-combined-ca-bundle\") pod \"neutron-7546fbf47b-6g4gw\" (UID: \"d620fdbb-7e93-46f8-95e1-18c9f9aab8f0\") " pod="openstack/neutron-7546fbf47b-6g4gw"
Dec 03 07:00:05 crc kubenswrapper[4475]: I1203 07:00:05.366344 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/d620fdbb-7e93-46f8-95e1-18c9f9aab8f0-ovndb-tls-certs\") pod \"neutron-7546fbf47b-6g4gw\" (UID: \"d620fdbb-7e93-46f8-95e1-18c9f9aab8f0\") " pod="openstack/neutron-7546fbf47b-6g4gw"
Dec 03 07:00:05 crc kubenswrapper[4475]: I1203 07:00:05.366365 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/d620fdbb-7e93-46f8-95e1-18c9f9aab8f0-httpd-config\") pod \"neutron-7546fbf47b-6g4gw\" (UID: \"d620fdbb-7e93-46f8-95e1-18c9f9aab8f0\") " pod="openstack/neutron-7546fbf47b-6g4gw"
Dec 03 07:00:05 crc kubenswrapper[4475]: I1203 07:00:05.366390 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-z57rd\" (UniqueName: \"kubernetes.io/projected/d620fdbb-7e93-46f8-95e1-18c9f9aab8f0-kube-api-access-z57rd\") pod \"neutron-7546fbf47b-6g4gw\" (UID: \"d620fdbb-7e93-46f8-95e1-18c9f9aab8f0\") " pod="openstack/neutron-7546fbf47b-6g4gw"
Dec 03 07:00:05 crc kubenswrapper[4475]: I1203 07:00:05.371036 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d620fdbb-7e93-46f8-95e1-18c9f9aab8f0-combined-ca-bundle\") pod \"neutron-7546fbf47b-6g4gw\" (UID: \"d620fdbb-7e93-46f8-95e1-18c9f9aab8f0\") " pod="openstack/neutron-7546fbf47b-6g4gw"
Dec 03 07:00:05 crc kubenswrapper[4475]: I1203 07:00:05.372313 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/d620fdbb-7e93-46f8-95e1-18c9f9aab8f0-httpd-config\") pod \"neutron-7546fbf47b-6g4gw\" (UID: \"d620fdbb-7e93-46f8-95e1-18c9f9aab8f0\") " pod="openstack/neutron-7546fbf47b-6g4gw"
Dec 03 07:00:05 crc kubenswrapper[4475]: I1203 07:00:05.372405 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/d620fdbb-7e93-46f8-95e1-18c9f9aab8f0-ovndb-tls-certs\") pod \"neutron-7546fbf47b-6g4gw\" (UID: \"d620fdbb-7e93-46f8-95e1-18c9f9aab8f0\") " pod="openstack/neutron-7546fbf47b-6g4gw"
Dec 03 07:00:05 crc kubenswrapper[4475]: I1203 07:00:05.383598 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-z57rd\" (UniqueName: \"kubernetes.io/projected/d620fdbb-7e93-46f8-95e1-18c9f9aab8f0-kube-api-access-z57rd\") pod \"neutron-7546fbf47b-6g4gw\" (UID: \"d620fdbb-7e93-46f8-95e1-18c9f9aab8f0\") " pod="openstack/neutron-7546fbf47b-6g4gw"
Dec 03 07:00:05 crc kubenswrapper[4475]: I1203 07:00:05.385658 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/d620fdbb-7e93-46f8-95e1-18c9f9aab8f0-config\") pod \"neutron-7546fbf47b-6g4gw\" (UID: \"d620fdbb-7e93-46f8-95e1-18c9f9aab8f0\") " pod="openstack/neutron-7546fbf47b-6g4gw"
Dec 03 07:00:05 crc kubenswrapper[4475]: I1203 07:00:05.462842 4475 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6c7bf79b95-knv75"
Dec 03 07:00:05 crc kubenswrapper[4475]: I1203 07:00:05.575939 4475 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-7546fbf47b-6g4gw"
Dec 03 07:00:05 crc kubenswrapper[4475]: I1203 07:00:05.847026 4475 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-6476ddd6b5-ppjlc" podUID="87fdba6a-e11c-4bbd-becc-78999065efa8" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.107:5353: connect: connection refused"
Dec 03 07:00:06 crc kubenswrapper[4475]: I1203 07:00:06.422643 4475 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0"
Dec 03 07:00:06 crc kubenswrapper[4475]: I1203 07:00:06.427174 4475 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-qrzmb"
Dec 03 07:00:06 crc kubenswrapper[4475]: I1203 07:00:06.596873 4475 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/69bddfea-c4e4-4776-a5e1-2150121b98a4-scripts\") pod \"69bddfea-c4e4-4776-a5e1-2150121b98a4\" (UID: \"69bddfea-c4e4-4776-a5e1-2150121b98a4\") "
Dec 03 07:00:06 crc kubenswrapper[4475]: I1203 07:00:06.596969 4475 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/875f3e6c-4ccd-4b70-8ea6-c5d76eedc268-scripts\") pod \"875f3e6c-4ccd-4b70-8ea6-c5d76eedc268\" (UID: \"875f3e6c-4ccd-4b70-8ea6-c5d76eedc268\") "
Dec 03 07:00:06 crc kubenswrapper[4475]: I1203 07:00:06.597054 4475 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/69bddfea-c4e4-4776-a5e1-2150121b98a4-httpd-run\") pod \"69bddfea-c4e4-4776-a5e1-2150121b98a4\" (UID: \"69bddfea-c4e4-4776-a5e1-2150121b98a4\") "
Dec 03 07:00:06 crc kubenswrapper[4475]: I1203 07:00:06.597080 4475 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/875f3e6c-4ccd-4b70-8ea6-c5d76eedc268-fernet-keys\") pod \"875f3e6c-4ccd-4b70-8ea6-c5d76eedc268\" (UID: \"875f3e6c-4ccd-4b70-8ea6-c5d76eedc268\") "
Dec 03 07:00:06 crc kubenswrapper[4475]: I1203 07:00:06.597112 4475 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6xddj\" (UniqueName: \"kubernetes.io/projected/875f3e6c-4ccd-4b70-8ea6-c5d76eedc268-kube-api-access-6xddj\") pod \"875f3e6c-4ccd-4b70-8ea6-c5d76eedc268\" (UID: \"875f3e6c-4ccd-4b70-8ea6-c5d76eedc268\") "
Dec 03 07:00:06 crc kubenswrapper[4475]: I1203 07:00:06.597133 4475 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/69bddfea-c4e4-4776-a5e1-2150121b98a4-logs\") pod \"69bddfea-c4e4-4776-a5e1-2150121b98a4\" (UID: \"69bddfea-c4e4-4776-a5e1-2150121b98a4\") "
Dec 03 07:00:06 crc kubenswrapper[4475]: I1203 07:00:06.597638 4475 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/69bddfea-c4e4-4776-a5e1-2150121b98a4-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "69bddfea-c4e4-4776-a5e1-2150121b98a4" (UID: "69bddfea-c4e4-4776-a5e1-2150121b98a4"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Dec 03 07:00:06 crc kubenswrapper[4475]: I1203 07:00:06.597272 4475 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"69bddfea-c4e4-4776-a5e1-2150121b98a4\" (UID: \"69bddfea-c4e4-4776-a5e1-2150121b98a4\") "
Dec 03 07:00:06 crc kubenswrapper[4475]: I1203 07:00:06.598118 4475 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/69bddfea-c4e4-4776-a5e1-2150121b98a4-logs" (OuterVolumeSpecName: "logs") pod "69bddfea-c4e4-4776-a5e1-2150121b98a4" (UID: "69bddfea-c4e4-4776-a5e1-2150121b98a4"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Dec 03 07:00:06 crc kubenswrapper[4475]: I1203 07:00:06.598128 4475 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/875f3e6c-4ccd-4b70-8ea6-c5d76eedc268-combined-ca-bundle\") pod \"875f3e6c-4ccd-4b70-8ea6-c5d76eedc268\" (UID: \"875f3e6c-4ccd-4b70-8ea6-c5d76eedc268\") "
Dec 03 07:00:06 crc kubenswrapper[4475]: I1203 07:00:06.598204 4475 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/69bddfea-c4e4-4776-a5e1-2150121b98a4-public-tls-certs\") pod \"69bddfea-c4e4-4776-a5e1-2150121b98a4\" (UID: \"69bddfea-c4e4-4776-a5e1-2150121b98a4\") "
Dec 03 07:00:06 crc kubenswrapper[4475]: I1203 07:00:06.598269 4475 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/69bddfea-c4e4-4776-a5e1-2150121b98a4-combined-ca-bundle\") pod \"69bddfea-c4e4-4776-a5e1-2150121b98a4\" (UID: \"69bddfea-c4e4-4776-a5e1-2150121b98a4\") "
Dec 03 07:00:06 crc kubenswrapper[4475]: I1203 07:00:06.598304 4475 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lzgbw\" (UniqueName: \"kubernetes.io/projected/69bddfea-c4e4-4776-a5e1-2150121b98a4-kube-api-access-lzgbw\") pod \"69bddfea-c4e4-4776-a5e1-2150121b98a4\" (UID: \"69bddfea-c4e4-4776-a5e1-2150121b98a4\") "
Dec 03 07:00:06 crc kubenswrapper[4475]: I1203 07:00:06.598367 4475 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/69bddfea-c4e4-4776-a5e1-2150121b98a4-config-data\") pod \"69bddfea-c4e4-4776-a5e1-2150121b98a4\" (UID: \"69bddfea-c4e4-4776-a5e1-2150121b98a4\") "
Dec 03 07:00:06 crc kubenswrapper[4475]: I1203 07:00:06.598437 4475 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/875f3e6c-4ccd-4b70-8ea6-c5d76eedc268-credential-keys\") pod \"875f3e6c-4ccd-4b70-8ea6-c5d76eedc268\" (UID: \"875f3e6c-4ccd-4b70-8ea6-c5d76eedc268\") "
Dec 03 07:00:06 crc kubenswrapper[4475]: I1203 07:00:06.598487 4475 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/875f3e6c-4ccd-4b70-8ea6-c5d76eedc268-config-data\") pod \"875f3e6c-4ccd-4b70-8ea6-c5d76eedc268\" (UID: \"875f3e6c-4ccd-4b70-8ea6-c5d76eedc268\") "
Dec 03 07:00:06 crc kubenswrapper[4475]: I1203 07:00:06.599232 4475 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/69bddfea-c4e4-4776-a5e1-2150121b98a4-httpd-run\") on node \"crc\" DevicePath \"\""
Dec 03 07:00:06 crc kubenswrapper[4475]: I1203 07:00:06.599249 4475 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/69bddfea-c4e4-4776-a5e1-2150121b98a4-logs\") on node \"crc\" DevicePath \"\""
Dec 03 07:00:06 crc kubenswrapper[4475]: I1203 07:00:06.603973 4475 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/69bddfea-c4e4-4776-a5e1-2150121b98a4-kube-api-access-lzgbw" (OuterVolumeSpecName: "kube-api-access-lzgbw") pod "69bddfea-c4e4-4776-a5e1-2150121b98a4" (UID: "69bddfea-c4e4-4776-a5e1-2150121b98a4"). InnerVolumeSpecName "kube-api-access-lzgbw". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 03 07:00:06 crc kubenswrapper[4475]: I1203 07:00:06.610496 4475 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/875f3e6c-4ccd-4b70-8ea6-c5d76eedc268-scripts" (OuterVolumeSpecName: "scripts") pod "875f3e6c-4ccd-4b70-8ea6-c5d76eedc268" (UID: "875f3e6c-4ccd-4b70-8ea6-c5d76eedc268"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 03 07:00:06 crc kubenswrapper[4475]: I1203 07:00:06.610511 4475 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage05-crc" (OuterVolumeSpecName: "glance") pod "69bddfea-c4e4-4776-a5e1-2150121b98a4" (UID: "69bddfea-c4e4-4776-a5e1-2150121b98a4"). InnerVolumeSpecName "local-storage05-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue ""
Dec 03 07:00:06 crc kubenswrapper[4475]: I1203 07:00:06.610548 4475 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/875f3e6c-4ccd-4b70-8ea6-c5d76eedc268-kube-api-access-6xddj" (OuterVolumeSpecName: "kube-api-access-6xddj") pod "875f3e6c-4ccd-4b70-8ea6-c5d76eedc268" (UID: "875f3e6c-4ccd-4b70-8ea6-c5d76eedc268"). InnerVolumeSpecName "kube-api-access-6xddj". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 03 07:00:06 crc kubenswrapper[4475]: I1203 07:00:06.610579 4475 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/69bddfea-c4e4-4776-a5e1-2150121b98a4-scripts" (OuterVolumeSpecName: "scripts") pod "69bddfea-c4e4-4776-a5e1-2150121b98a4" (UID: "69bddfea-c4e4-4776-a5e1-2150121b98a4"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 03 07:00:06 crc kubenswrapper[4475]: I1203 07:00:06.610583 4475 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/875f3e6c-4ccd-4b70-8ea6-c5d76eedc268-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "875f3e6c-4ccd-4b70-8ea6-c5d76eedc268" (UID: "875f3e6c-4ccd-4b70-8ea6-c5d76eedc268"). InnerVolumeSpecName "fernet-keys". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 03 07:00:06 crc kubenswrapper[4475]: I1203 07:00:06.617602 4475 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/875f3e6c-4ccd-4b70-8ea6-c5d76eedc268-credential-keys" (OuterVolumeSpecName: "credential-keys") pod "875f3e6c-4ccd-4b70-8ea6-c5d76eedc268" (UID: "875f3e6c-4ccd-4b70-8ea6-c5d76eedc268"). InnerVolumeSpecName "credential-keys". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 03 07:00:06 crc kubenswrapper[4475]: I1203 07:00:06.631012 4475 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/69bddfea-c4e4-4776-a5e1-2150121b98a4-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "69bddfea-c4e4-4776-a5e1-2150121b98a4" (UID: "69bddfea-c4e4-4776-a5e1-2150121b98a4"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 03 07:00:06 crc kubenswrapper[4475]: I1203 07:00:06.634309 4475 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/875f3e6c-4ccd-4b70-8ea6-c5d76eedc268-config-data" (OuterVolumeSpecName: "config-data") pod "875f3e6c-4ccd-4b70-8ea6-c5d76eedc268" (UID: "875f3e6c-4ccd-4b70-8ea6-c5d76eedc268"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 03 07:00:06 crc kubenswrapper[4475]: I1203 07:00:06.638151 4475 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/875f3e6c-4ccd-4b70-8ea6-c5d76eedc268-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "875f3e6c-4ccd-4b70-8ea6-c5d76eedc268" (UID: "875f3e6c-4ccd-4b70-8ea6-c5d76eedc268"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 03 07:00:06 crc kubenswrapper[4475]: I1203 07:00:06.645613 4475 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/69bddfea-c4e4-4776-a5e1-2150121b98a4-config-data" (OuterVolumeSpecName: "config-data") pod "69bddfea-c4e4-4776-a5e1-2150121b98a4" (UID: "69bddfea-c4e4-4776-a5e1-2150121b98a4"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 03 07:00:06 crc kubenswrapper[4475]: I1203 07:00:06.658864 4475 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/69bddfea-c4e4-4776-a5e1-2150121b98a4-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "69bddfea-c4e4-4776-a5e1-2150121b98a4" (UID: "69bddfea-c4e4-4776-a5e1-2150121b98a4"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 03 07:00:06 crc kubenswrapper[4475]: I1203 07:00:06.701575 4475 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/69bddfea-c4e4-4776-a5e1-2150121b98a4-scripts\") on node \"crc\" DevicePath \"\""
Dec 03 07:00:06 crc kubenswrapper[4475]: I1203 07:00:06.701609 4475 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/875f3e6c-4ccd-4b70-8ea6-c5d76eedc268-scripts\") on node \"crc\" DevicePath \"\""
Dec 03 07:00:06 crc kubenswrapper[4475]: I1203 07:00:06.701651 4475 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/875f3e6c-4ccd-4b70-8ea6-c5d76eedc268-fernet-keys\") on node \"crc\" DevicePath \"\""
Dec 03 07:00:06 crc kubenswrapper[4475]: I1203 07:00:06.701662 4475 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6xddj\" (UniqueName: \"kubernetes.io/projected/875f3e6c-4ccd-4b70-8ea6-c5d76eedc268-kube-api-access-6xddj\") on node \"crc\" DevicePath \"\""
Dec 03 07:00:06 crc kubenswrapper[4475]: I1203 07:00:06.701706 4475 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") on node \"crc\" "
Dec 03 07:00:06 crc kubenswrapper[4475]: I1203 07:00:06.701715 4475 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/875f3e6c-4ccd-4b70-8ea6-c5d76eedc268-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Dec 03 07:00:06 crc kubenswrapper[4475]: I1203 07:00:06.701723 4475 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/69bddfea-c4e4-4776-a5e1-2150121b98a4-public-tls-certs\") on node \"crc\" DevicePath \"\""
Dec 03 07:00:06 crc kubenswrapper[4475]: I1203 07:00:06.701827 4475 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/69bddfea-c4e4-4776-a5e1-2150121b98a4-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Dec 03 07:00:06 crc kubenswrapper[4475]: I1203 07:00:06.701839 4475 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lzgbw\" (UniqueName: \"kubernetes.io/projected/69bddfea-c4e4-4776-a5e1-2150121b98a4-kube-api-access-lzgbw\") on node \"crc\" DevicePath \"\""
Dec 03 07:00:06 crc kubenswrapper[4475]: I1203 07:00:06.701847 4475 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/69bddfea-c4e4-4776-a5e1-2150121b98a4-config-data\") on node \"crc\" DevicePath \"\""
Dec 03 07:00:06 crc kubenswrapper[4475]: I1203 07:00:06.701855 4475 reconciler_common.go:293] "Volume detached for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/875f3e6c-4ccd-4b70-8ea6-c5d76eedc268-credential-keys\") on node \"crc\" DevicePath \"\""
Dec 03 07:00:06 crc kubenswrapper[4475]: I1203 07:00:06.701863 4475 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/875f3e6c-4ccd-4b70-8ea6-c5d76eedc268-config-data\") on node \"crc\" DevicePath \"\""
Dec 03 07:00:06 crc kubenswrapper[4475]: I1203 07:00:06.719759 4475 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage05-crc" (UniqueName: "kubernetes.io/local-volume/local-storage05-crc") on node "crc"
Dec 03 07:00:06 crc kubenswrapper[4475]: I1203 07:00:06.803757 4475 reconciler_common.go:293] "Volume detached for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") on node \"crc\" DevicePath \"\""
Dec 03 07:00:06 crc kubenswrapper[4475]: I1203 07:00:06.947881 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-qrzmb" event={"ID":"875f3e6c-4ccd-4b70-8ea6-c5d76eedc268","Type":"ContainerDied","Data":"526743ac788fc5b1b11bdac104f8a899bdc0193c200479fb94be3b3b445431a4"}
Dec 03 07:00:06 crc kubenswrapper[4475]: I1203 07:00:06.947920 4475 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="526743ac788fc5b1b11bdac104f8a899bdc0193c200479fb94be3b3b445431a4"
Dec 03 07:00:06 crc kubenswrapper[4475]: I1203 07:00:06.947977 4475 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-qrzmb"
Dec 03 07:00:06 crc kubenswrapper[4475]: I1203 07:00:06.955640 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"69bddfea-c4e4-4776-a5e1-2150121b98a4","Type":"ContainerDied","Data":"8f08539b7331bef348b087043e37b42a72597b23f69c631fa35e6e210b021a8a"}
Dec 03 07:00:06 crc kubenswrapper[4475]: I1203 07:00:06.955706 4475 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0"
Dec 03 07:00:06 crc kubenswrapper[4475]: I1203 07:00:06.955722 4475 scope.go:117] "RemoveContainer" containerID="fda1c00350ac41bf075275986f1ff9d6e26e93a690741e896f741ab14e567728"
Dec 03 07:00:06 crc kubenswrapper[4475]: I1203 07:00:06.986931 4475 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"]
Dec 03 07:00:06 crc kubenswrapper[4475]: I1203 07:00:06.996856 4475 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-external-api-0"]
Dec 03 07:00:07 crc kubenswrapper[4475]: I1203 07:00:07.012968 4475 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-external-api-0"]
Dec 03 07:00:07 crc kubenswrapper[4475]: E1203 07:00:07.013232 4475 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="69bddfea-c4e4-4776-a5e1-2150121b98a4" containerName="glance-log"
Dec 03 07:00:07 crc kubenswrapper[4475]: I1203 07:00:07.013246 4475 state_mem.go:107] "Deleted CPUSet assignment" podUID="69bddfea-c4e4-4776-a5e1-2150121b98a4" containerName="glance-log"
Dec 03 07:00:07 crc kubenswrapper[4475]: E1203 07:00:07.013275 4475 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="875f3e6c-4ccd-4b70-8ea6-c5d76eedc268" containerName="keystone-bootstrap"
Dec 03 07:00:07 crc kubenswrapper[4475]: I1203 07:00:07.013281 4475 state_mem.go:107] "Deleted CPUSet assignment" podUID="875f3e6c-4ccd-4b70-8ea6-c5d76eedc268" containerName="keystone-bootstrap"
Dec 03 07:00:07 crc kubenswrapper[4475]: E1203 07:00:07.013295 4475 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="69bddfea-c4e4-4776-a5e1-2150121b98a4" containerName="glance-httpd"
Dec 03 07:00:07 crc kubenswrapper[4475]: I1203 07:00:07.013300 4475 state_mem.go:107] "Deleted CPUSet assignment" podUID="69bddfea-c4e4-4776-a5e1-2150121b98a4" containerName="glance-httpd"
Dec 03 07:00:07 crc kubenswrapper[4475]: I1203 07:00:07.013446 4475 memory_manager.go:354] "RemoveStaleState removing state" podUID="69bddfea-c4e4-4776-a5e1-2150121b98a4" containerName="glance-log"
Dec 03 07:00:07 crc kubenswrapper[4475]: I1203 07:00:07.013518 4475 memory_manager.go:354] "RemoveStaleState removing state" podUID="875f3e6c-4ccd-4b70-8ea6-c5d76eedc268" containerName="keystone-bootstrap"
Dec 03 07:00:07 crc kubenswrapper[4475]: I1203 07:00:07.013532 4475 memory_manager.go:354] "RemoveStaleState removing state" podUID="69bddfea-c4e4-4776-a5e1-2150121b98a4" containerName="glance-httpd"
Dec 03 07:00:07 crc kubenswrapper[4475]: I1203 07:00:07.014294 4475 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0"
Dec 03 07:00:07 crc kubenswrapper[4475]: I1203 07:00:07.016311 4475 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-public-svc"
Dec 03 07:00:07 crc kubenswrapper[4475]: I1203 07:00:07.022397 4475 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-external-config-data"
Dec 03 07:00:07 crc kubenswrapper[4475]: I1203 07:00:07.031070 4475 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"]
Dec 03 07:00:07 crc kubenswrapper[4475]: I1203 07:00:07.213102 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5d024ccd-9b9c-4656-a6c9-88c6d524960c-config-data\") pod \"glance-default-external-api-0\" (UID: \"5d024ccd-9b9c-4656-a6c9-88c6d524960c\") " pod="openstack/glance-default-external-api-0"
Dec 03 07:00:07 crc kubenswrapper[4475]: I1203 07:00:07.213332 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/5d024ccd-9b9c-4656-a6c9-88c6d524960c-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"5d024ccd-9b9c-4656-a6c9-88c6d524960c\") " pod="openstack/glance-default-external-api-0"
Dec 03 07:00:07 crc kubenswrapper[4475]: I1203 07:00:07.213524 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/5d024ccd-9b9c-4656-a6c9-88c6d524960c-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"5d024ccd-9b9c-4656-a6c9-88c6d524960c\") " pod="openstack/glance-default-external-api-0"
Dec 03 07:00:07 crc kubenswrapper[4475]: I1203 07:00:07.213723 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\"
(UniqueName: \"kubernetes.io/secret/5d024ccd-9b9c-4656-a6c9-88c6d524960c-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"5d024ccd-9b9c-4656-a6c9-88c6d524960c\") " pod="openstack/glance-default-external-api-0" Dec 03 07:00:07 crc kubenswrapper[4475]: I1203 07:00:07.213828 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"glance-default-external-api-0\" (UID: \"5d024ccd-9b9c-4656-a6c9-88c6d524960c\") " pod="openstack/glance-default-external-api-0" Dec 03 07:00:07 crc kubenswrapper[4475]: I1203 07:00:07.213885 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nt4vd\" (UniqueName: \"kubernetes.io/projected/5d024ccd-9b9c-4656-a6c9-88c6d524960c-kube-api-access-nt4vd\") pod \"glance-default-external-api-0\" (UID: \"5d024ccd-9b9c-4656-a6c9-88c6d524960c\") " pod="openstack/glance-default-external-api-0" Dec 03 07:00:07 crc kubenswrapper[4475]: I1203 07:00:07.213944 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5d024ccd-9b9c-4656-a6c9-88c6d524960c-scripts\") pod \"glance-default-external-api-0\" (UID: \"5d024ccd-9b9c-4656-a6c9-88c6d524960c\") " pod="openstack/glance-default-external-api-0" Dec 03 07:00:07 crc kubenswrapper[4475]: I1203 07:00:07.214002 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5d024ccd-9b9c-4656-a6c9-88c6d524960c-logs\") pod \"glance-default-external-api-0\" (UID: \"5d024ccd-9b9c-4656-a6c9-88c6d524960c\") " pod="openstack/glance-default-external-api-0" Dec 03 07:00:07 crc kubenswrapper[4475]: I1203 07:00:07.315866 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/5d024ccd-9b9c-4656-a6c9-88c6d524960c-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"5d024ccd-9b9c-4656-a6c9-88c6d524960c\") " pod="openstack/glance-default-external-api-0" Dec 03 07:00:07 crc kubenswrapper[4475]: I1203 07:00:07.316011 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5d024ccd-9b9c-4656-a6c9-88c6d524960c-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"5d024ccd-9b9c-4656-a6c9-88c6d524960c\") " pod="openstack/glance-default-external-api-0" Dec 03 07:00:07 crc kubenswrapper[4475]: I1203 07:00:07.316060 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"glance-default-external-api-0\" (UID: \"5d024ccd-9b9c-4656-a6c9-88c6d524960c\") " pod="openstack/glance-default-external-api-0" Dec 03 07:00:07 crc kubenswrapper[4475]: I1203 07:00:07.316105 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nt4vd\" (UniqueName: \"kubernetes.io/projected/5d024ccd-9b9c-4656-a6c9-88c6d524960c-kube-api-access-nt4vd\") pod \"glance-default-external-api-0\" (UID: \"5d024ccd-9b9c-4656-a6c9-88c6d524960c\") " pod="openstack/glance-default-external-api-0" Dec 03 07:00:07 crc kubenswrapper[4475]: I1203 07:00:07.316156 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5d024ccd-9b9c-4656-a6c9-88c6d524960c-scripts\") pod \"glance-default-external-api-0\" (UID: \"5d024ccd-9b9c-4656-a6c9-88c6d524960c\") " pod="openstack/glance-default-external-api-0" Dec 03 07:00:07 crc kubenswrapper[4475]: I1203 07:00:07.316177 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5d024ccd-9b9c-4656-a6c9-88c6d524960c-logs\") pod 
\"glance-default-external-api-0\" (UID: \"5d024ccd-9b9c-4656-a6c9-88c6d524960c\") " pod="openstack/glance-default-external-api-0" Dec 03 07:00:07 crc kubenswrapper[4475]: I1203 07:00:07.316224 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5d024ccd-9b9c-4656-a6c9-88c6d524960c-config-data\") pod \"glance-default-external-api-0\" (UID: \"5d024ccd-9b9c-4656-a6c9-88c6d524960c\") " pod="openstack/glance-default-external-api-0" Dec 03 07:00:07 crc kubenswrapper[4475]: I1203 07:00:07.316301 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/5d024ccd-9b9c-4656-a6c9-88c6d524960c-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"5d024ccd-9b9c-4656-a6c9-88c6d524960c\") " pod="openstack/glance-default-external-api-0" Dec 03 07:00:07 crc kubenswrapper[4475]: I1203 07:00:07.317183 4475 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"glance-default-external-api-0\" (UID: \"5d024ccd-9b9c-4656-a6c9-88c6d524960c\") device mount path \"/mnt/openstack/pv05\"" pod="openstack/glance-default-external-api-0" Dec 03 07:00:07 crc kubenswrapper[4475]: I1203 07:00:07.318342 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5d024ccd-9b9c-4656-a6c9-88c6d524960c-logs\") pod \"glance-default-external-api-0\" (UID: \"5d024ccd-9b9c-4656-a6c9-88c6d524960c\") " pod="openstack/glance-default-external-api-0" Dec 03 07:00:07 crc kubenswrapper[4475]: I1203 07:00:07.318832 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/5d024ccd-9b9c-4656-a6c9-88c6d524960c-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"5d024ccd-9b9c-4656-a6c9-88c6d524960c\") " 
pod="openstack/glance-default-external-api-0" Dec 03 07:00:07 crc kubenswrapper[4475]: I1203 07:00:07.324572 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/5d024ccd-9b9c-4656-a6c9-88c6d524960c-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"5d024ccd-9b9c-4656-a6c9-88c6d524960c\") " pod="openstack/glance-default-external-api-0" Dec 03 07:00:07 crc kubenswrapper[4475]: I1203 07:00:07.325813 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5d024ccd-9b9c-4656-a6c9-88c6d524960c-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"5d024ccd-9b9c-4656-a6c9-88c6d524960c\") " pod="openstack/glance-default-external-api-0" Dec 03 07:00:07 crc kubenswrapper[4475]: I1203 07:00:07.329257 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5d024ccd-9b9c-4656-a6c9-88c6d524960c-scripts\") pod \"glance-default-external-api-0\" (UID: \"5d024ccd-9b9c-4656-a6c9-88c6d524960c\") " pod="openstack/glance-default-external-api-0" Dec 03 07:00:07 crc kubenswrapper[4475]: I1203 07:00:07.332536 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5d024ccd-9b9c-4656-a6c9-88c6d524960c-config-data\") pod \"glance-default-external-api-0\" (UID: \"5d024ccd-9b9c-4656-a6c9-88c6d524960c\") " pod="openstack/glance-default-external-api-0" Dec 03 07:00:07 crc kubenswrapper[4475]: I1203 07:00:07.334395 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nt4vd\" (UniqueName: \"kubernetes.io/projected/5d024ccd-9b9c-4656-a6c9-88c6d524960c-kube-api-access-nt4vd\") pod \"glance-default-external-api-0\" (UID: \"5d024ccd-9b9c-4656-a6c9-88c6d524960c\") " pod="openstack/glance-default-external-api-0" Dec 03 07:00:07 crc kubenswrapper[4475]: 
I1203 07:00:07.342749 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"glance-default-external-api-0\" (UID: \"5d024ccd-9b9c-4656-a6c9-88c6d524960c\") " pod="openstack/glance-default-external-api-0" Dec 03 07:00:07 crc kubenswrapper[4475]: I1203 07:00:07.436478 4475 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-8798f5df-qtg6w"] Dec 03 07:00:07 crc kubenswrapper[4475]: I1203 07:00:07.438164 4475 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-8798f5df-qtg6w" Dec 03 07:00:07 crc kubenswrapper[4475]: I1203 07:00:07.443309 4475 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-neutron-internal-svc" Dec 03 07:00:07 crc kubenswrapper[4475]: I1203 07:00:07.443683 4475 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-neutron-public-svc" Dec 03 07:00:07 crc kubenswrapper[4475]: I1203 07:00:07.466856 4475 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-8798f5df-qtg6w"] Dec 03 07:00:07 crc kubenswrapper[4475]: I1203 07:00:07.507312 4475 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="69bddfea-c4e4-4776-a5e1-2150121b98a4" path="/var/lib/kubelet/pods/69bddfea-c4e4-4776-a5e1-2150121b98a4/volumes" Dec 03 07:00:07 crc kubenswrapper[4475]: I1203 07:00:07.531203 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/cf14cca6-0927-4a49-9c3b-70dc49f21c47-ovndb-tls-certs\") pod \"neutron-8798f5df-qtg6w\" (UID: \"cf14cca6-0927-4a49-9c3b-70dc49f21c47\") " pod="openstack/neutron-8798f5df-qtg6w" Dec 03 07:00:07 crc kubenswrapper[4475]: I1203 07:00:07.531250 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-config\" (UniqueName: 
\"kubernetes.io/secret/cf14cca6-0927-4a49-9c3b-70dc49f21c47-httpd-config\") pod \"neutron-8798f5df-qtg6w\" (UID: \"cf14cca6-0927-4a49-9c3b-70dc49f21c47\") " pod="openstack/neutron-8798f5df-qtg6w" Dec 03 07:00:07 crc kubenswrapper[4475]: I1203 07:00:07.531297 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8hckp\" (UniqueName: \"kubernetes.io/projected/cf14cca6-0927-4a49-9c3b-70dc49f21c47-kube-api-access-8hckp\") pod \"neutron-8798f5df-qtg6w\" (UID: \"cf14cca6-0927-4a49-9c3b-70dc49f21c47\") " pod="openstack/neutron-8798f5df-qtg6w" Dec 03 07:00:07 crc kubenswrapper[4475]: I1203 07:00:07.531527 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cf14cca6-0927-4a49-9c3b-70dc49f21c47-combined-ca-bundle\") pod \"neutron-8798f5df-qtg6w\" (UID: \"cf14cca6-0927-4a49-9c3b-70dc49f21c47\") " pod="openstack/neutron-8798f5df-qtg6w" Dec 03 07:00:07 crc kubenswrapper[4475]: I1203 07:00:07.531657 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/cf14cca6-0927-4a49-9c3b-70dc49f21c47-config\") pod \"neutron-8798f5df-qtg6w\" (UID: \"cf14cca6-0927-4a49-9c3b-70dc49f21c47\") " pod="openstack/neutron-8798f5df-qtg6w" Dec 03 07:00:07 crc kubenswrapper[4475]: I1203 07:00:07.531709 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/cf14cca6-0927-4a49-9c3b-70dc49f21c47-public-tls-certs\") pod \"neutron-8798f5df-qtg6w\" (UID: \"cf14cca6-0927-4a49-9c3b-70dc49f21c47\") " pod="openstack/neutron-8798f5df-qtg6w" Dec 03 07:00:07 crc kubenswrapper[4475]: I1203 07:00:07.531834 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/cf14cca6-0927-4a49-9c3b-70dc49f21c47-internal-tls-certs\") pod \"neutron-8798f5df-qtg6w\" (UID: \"cf14cca6-0927-4a49-9c3b-70dc49f21c47\") " pod="openstack/neutron-8798f5df-qtg6w" Dec 03 07:00:07 crc kubenswrapper[4475]: I1203 07:00:07.605661 4475 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-bootstrap-qrzmb"] Dec 03 07:00:07 crc kubenswrapper[4475]: I1203 07:00:07.614430 4475 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-bootstrap-qrzmb"] Dec 03 07:00:07 crc kubenswrapper[4475]: I1203 07:00:07.629239 4475 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Dec 03 07:00:07 crc kubenswrapper[4475]: I1203 07:00:07.634255 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/cf14cca6-0927-4a49-9c3b-70dc49f21c47-config\") pod \"neutron-8798f5df-qtg6w\" (UID: \"cf14cca6-0927-4a49-9c3b-70dc49f21c47\") " pod="openstack/neutron-8798f5df-qtg6w" Dec 03 07:00:07 crc kubenswrapper[4475]: I1203 07:00:07.634306 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/cf14cca6-0927-4a49-9c3b-70dc49f21c47-public-tls-certs\") pod \"neutron-8798f5df-qtg6w\" (UID: \"cf14cca6-0927-4a49-9c3b-70dc49f21c47\") " pod="openstack/neutron-8798f5df-qtg6w" Dec 03 07:00:07 crc kubenswrapper[4475]: I1203 07:00:07.634363 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/cf14cca6-0927-4a49-9c3b-70dc49f21c47-internal-tls-certs\") pod \"neutron-8798f5df-qtg6w\" (UID: \"cf14cca6-0927-4a49-9c3b-70dc49f21c47\") " pod="openstack/neutron-8798f5df-qtg6w" Dec 03 07:00:07 crc kubenswrapper[4475]: I1203 07:00:07.634416 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovndb-tls-certs\" 
(UniqueName: \"kubernetes.io/secret/cf14cca6-0927-4a49-9c3b-70dc49f21c47-ovndb-tls-certs\") pod \"neutron-8798f5df-qtg6w\" (UID: \"cf14cca6-0927-4a49-9c3b-70dc49f21c47\") " pod="openstack/neutron-8798f5df-qtg6w" Dec 03 07:00:07 crc kubenswrapper[4475]: I1203 07:00:07.634471 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/cf14cca6-0927-4a49-9c3b-70dc49f21c47-httpd-config\") pod \"neutron-8798f5df-qtg6w\" (UID: \"cf14cca6-0927-4a49-9c3b-70dc49f21c47\") " pod="openstack/neutron-8798f5df-qtg6w" Dec 03 07:00:07 crc kubenswrapper[4475]: I1203 07:00:07.634504 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8hckp\" (UniqueName: \"kubernetes.io/projected/cf14cca6-0927-4a49-9c3b-70dc49f21c47-kube-api-access-8hckp\") pod \"neutron-8798f5df-qtg6w\" (UID: \"cf14cca6-0927-4a49-9c3b-70dc49f21c47\") " pod="openstack/neutron-8798f5df-qtg6w" Dec 03 07:00:07 crc kubenswrapper[4475]: I1203 07:00:07.634526 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cf14cca6-0927-4a49-9c3b-70dc49f21c47-combined-ca-bundle\") pod \"neutron-8798f5df-qtg6w\" (UID: \"cf14cca6-0927-4a49-9c3b-70dc49f21c47\") " pod="openstack/neutron-8798f5df-qtg6w" Dec 03 07:00:07 crc kubenswrapper[4475]: I1203 07:00:07.638229 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/cf14cca6-0927-4a49-9c3b-70dc49f21c47-ovndb-tls-certs\") pod \"neutron-8798f5df-qtg6w\" (UID: \"cf14cca6-0927-4a49-9c3b-70dc49f21c47\") " pod="openstack/neutron-8798f5df-qtg6w" Dec 03 07:00:07 crc kubenswrapper[4475]: I1203 07:00:07.638418 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/cf14cca6-0927-4a49-9c3b-70dc49f21c47-httpd-config\") pod \"neutron-8798f5df-qtg6w\" 
(UID: \"cf14cca6-0927-4a49-9c3b-70dc49f21c47\") " pod="openstack/neutron-8798f5df-qtg6w" Dec 03 07:00:07 crc kubenswrapper[4475]: I1203 07:00:07.639235 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cf14cca6-0927-4a49-9c3b-70dc49f21c47-combined-ca-bundle\") pod \"neutron-8798f5df-qtg6w\" (UID: \"cf14cca6-0927-4a49-9c3b-70dc49f21c47\") " pod="openstack/neutron-8798f5df-qtg6w" Dec 03 07:00:07 crc kubenswrapper[4475]: I1203 07:00:07.642414 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/cf14cca6-0927-4a49-9c3b-70dc49f21c47-public-tls-certs\") pod \"neutron-8798f5df-qtg6w\" (UID: \"cf14cca6-0927-4a49-9c3b-70dc49f21c47\") " pod="openstack/neutron-8798f5df-qtg6w" Dec 03 07:00:07 crc kubenswrapper[4475]: I1203 07:00:07.647036 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/cf14cca6-0927-4a49-9c3b-70dc49f21c47-internal-tls-certs\") pod \"neutron-8798f5df-qtg6w\" (UID: \"cf14cca6-0927-4a49-9c3b-70dc49f21c47\") " pod="openstack/neutron-8798f5df-qtg6w" Dec 03 07:00:07 crc kubenswrapper[4475]: I1203 07:00:07.647593 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/cf14cca6-0927-4a49-9c3b-70dc49f21c47-config\") pod \"neutron-8798f5df-qtg6w\" (UID: \"cf14cca6-0927-4a49-9c3b-70dc49f21c47\") " pod="openstack/neutron-8798f5df-qtg6w" Dec 03 07:00:07 crc kubenswrapper[4475]: I1203 07:00:07.648886 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8hckp\" (UniqueName: \"kubernetes.io/projected/cf14cca6-0927-4a49-9c3b-70dc49f21c47-kube-api-access-8hckp\") pod \"neutron-8798f5df-qtg6w\" (UID: \"cf14cca6-0927-4a49-9c3b-70dc49f21c47\") " pod="openstack/neutron-8798f5df-qtg6w" Dec 03 07:00:07 crc kubenswrapper[4475]: I1203 
07:00:07.710905 4475 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-bootstrap-65f94"] Dec 03 07:00:07 crc kubenswrapper[4475]: I1203 07:00:07.712377 4475 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-65f94" Dec 03 07:00:07 crc kubenswrapper[4475]: I1203 07:00:07.717152 4475 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-ccmjt" Dec 03 07:00:07 crc kubenswrapper[4475]: I1203 07:00:07.717352 4475 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts" Dec 03 07:00:07 crc kubenswrapper[4475]: I1203 07:00:07.717567 4475 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data" Dec 03 07:00:07 crc kubenswrapper[4475]: I1203 07:00:07.727217 4475 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"osp-secret" Dec 03 07:00:07 crc kubenswrapper[4475]: I1203 07:00:07.727888 4475 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone" Dec 03 07:00:07 crc kubenswrapper[4475]: I1203 07:00:07.733939 4475 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-65f94"] Dec 03 07:00:07 crc kubenswrapper[4475]: I1203 07:00:07.734988 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/18fecb56-f151-4f5a-aac3-30785def9653-config-data\") pod \"keystone-bootstrap-65f94\" (UID: \"18fecb56-f151-4f5a-aac3-30785def9653\") " pod="openstack/keystone-bootstrap-65f94" Dec 03 07:00:07 crc kubenswrapper[4475]: I1203 07:00:07.735016 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/18fecb56-f151-4f5a-aac3-30785def9653-fernet-keys\") pod \"keystone-bootstrap-65f94\" (UID: 
\"18fecb56-f151-4f5a-aac3-30785def9653\") " pod="openstack/keystone-bootstrap-65f94" Dec 03 07:00:07 crc kubenswrapper[4475]: I1203 07:00:07.735096 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/18fecb56-f151-4f5a-aac3-30785def9653-scripts\") pod \"keystone-bootstrap-65f94\" (UID: \"18fecb56-f151-4f5a-aac3-30785def9653\") " pod="openstack/keystone-bootstrap-65f94" Dec 03 07:00:07 crc kubenswrapper[4475]: I1203 07:00:07.735137 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mccqb\" (UniqueName: \"kubernetes.io/projected/18fecb56-f151-4f5a-aac3-30785def9653-kube-api-access-mccqb\") pod \"keystone-bootstrap-65f94\" (UID: \"18fecb56-f151-4f5a-aac3-30785def9653\") " pod="openstack/keystone-bootstrap-65f94" Dec 03 07:00:07 crc kubenswrapper[4475]: I1203 07:00:07.735160 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/18fecb56-f151-4f5a-aac3-30785def9653-credential-keys\") pod \"keystone-bootstrap-65f94\" (UID: \"18fecb56-f151-4f5a-aac3-30785def9653\") " pod="openstack/keystone-bootstrap-65f94" Dec 03 07:00:07 crc kubenswrapper[4475]: I1203 07:00:07.735354 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/18fecb56-f151-4f5a-aac3-30785def9653-combined-ca-bundle\") pod \"keystone-bootstrap-65f94\" (UID: \"18fecb56-f151-4f5a-aac3-30785def9653\") " pod="openstack/keystone-bootstrap-65f94" Dec 03 07:00:07 crc kubenswrapper[4475]: I1203 07:00:07.758668 4475 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-8798f5df-qtg6w" Dec 03 07:00:07 crc kubenswrapper[4475]: I1203 07:00:07.835878 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mccqb\" (UniqueName: \"kubernetes.io/projected/18fecb56-f151-4f5a-aac3-30785def9653-kube-api-access-mccqb\") pod \"keystone-bootstrap-65f94\" (UID: \"18fecb56-f151-4f5a-aac3-30785def9653\") " pod="openstack/keystone-bootstrap-65f94" Dec 03 07:00:07 crc kubenswrapper[4475]: I1203 07:00:07.835922 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/18fecb56-f151-4f5a-aac3-30785def9653-credential-keys\") pod \"keystone-bootstrap-65f94\" (UID: \"18fecb56-f151-4f5a-aac3-30785def9653\") " pod="openstack/keystone-bootstrap-65f94" Dec 03 07:00:07 crc kubenswrapper[4475]: I1203 07:00:07.835987 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/18fecb56-f151-4f5a-aac3-30785def9653-combined-ca-bundle\") pod \"keystone-bootstrap-65f94\" (UID: \"18fecb56-f151-4f5a-aac3-30785def9653\") " pod="openstack/keystone-bootstrap-65f94" Dec 03 07:00:07 crc kubenswrapper[4475]: I1203 07:00:07.836010 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/18fecb56-f151-4f5a-aac3-30785def9653-config-data\") pod \"keystone-bootstrap-65f94\" (UID: \"18fecb56-f151-4f5a-aac3-30785def9653\") " pod="openstack/keystone-bootstrap-65f94" Dec 03 07:00:07 crc kubenswrapper[4475]: I1203 07:00:07.836024 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/18fecb56-f151-4f5a-aac3-30785def9653-fernet-keys\") pod \"keystone-bootstrap-65f94\" (UID: \"18fecb56-f151-4f5a-aac3-30785def9653\") " pod="openstack/keystone-bootstrap-65f94" Dec 03 07:00:07 crc 
kubenswrapper[4475]: I1203 07:00:07.836079 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/18fecb56-f151-4f5a-aac3-30785def9653-scripts\") pod \"keystone-bootstrap-65f94\" (UID: \"18fecb56-f151-4f5a-aac3-30785def9653\") " pod="openstack/keystone-bootstrap-65f94" Dec 03 07:00:07 crc kubenswrapper[4475]: I1203 07:00:07.840321 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/18fecb56-f151-4f5a-aac3-30785def9653-config-data\") pod \"keystone-bootstrap-65f94\" (UID: \"18fecb56-f151-4f5a-aac3-30785def9653\") " pod="openstack/keystone-bootstrap-65f94" Dec 03 07:00:07 crc kubenswrapper[4475]: I1203 07:00:07.843403 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/18fecb56-f151-4f5a-aac3-30785def9653-fernet-keys\") pod \"keystone-bootstrap-65f94\" (UID: \"18fecb56-f151-4f5a-aac3-30785def9653\") " pod="openstack/keystone-bootstrap-65f94" Dec 03 07:00:07 crc kubenswrapper[4475]: I1203 07:00:07.843718 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/18fecb56-f151-4f5a-aac3-30785def9653-credential-keys\") pod \"keystone-bootstrap-65f94\" (UID: \"18fecb56-f151-4f5a-aac3-30785def9653\") " pod="openstack/keystone-bootstrap-65f94" Dec 03 07:00:07 crc kubenswrapper[4475]: I1203 07:00:07.849167 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/18fecb56-f151-4f5a-aac3-30785def9653-combined-ca-bundle\") pod \"keystone-bootstrap-65f94\" (UID: \"18fecb56-f151-4f5a-aac3-30785def9653\") " pod="openstack/keystone-bootstrap-65f94" Dec 03 07:00:07 crc kubenswrapper[4475]: I1203 07:00:07.854189 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mccqb\" (UniqueName: 
\"kubernetes.io/projected/18fecb56-f151-4f5a-aac3-30785def9653-kube-api-access-mccqb\") pod \"keystone-bootstrap-65f94\" (UID: \"18fecb56-f151-4f5a-aac3-30785def9653\") " pod="openstack/keystone-bootstrap-65f94" Dec 03 07:00:07 crc kubenswrapper[4475]: I1203 07:00:07.856033 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/18fecb56-f151-4f5a-aac3-30785def9653-scripts\") pod \"keystone-bootstrap-65f94\" (UID: \"18fecb56-f151-4f5a-aac3-30785def9653\") " pod="openstack/keystone-bootstrap-65f94" Dec 03 07:00:08 crc kubenswrapper[4475]: I1203 07:00:08.039578 4475 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-65f94" Dec 03 07:00:08 crc kubenswrapper[4475]: E1203 07:00:08.609473 4475 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.rdoproject.org/podified-antelope-centos9/openstack-placement-api:65066e8ca260a75886ae57f157049605" Dec 03 07:00:08 crc kubenswrapper[4475]: E1203 07:00:08.609812 4475 kuberuntime_image.go:55] "Failed to pull image" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.rdoproject.org/podified-antelope-centos9/openstack-placement-api:65066e8ca260a75886ae57f157049605" Dec 03 07:00:08 crc kubenswrapper[4475]: E1203 07:00:08.609927 4475 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:placement-db-sync,Image:quay.rdoproject.org/podified-antelope-centos9/openstack-placement-api:65066e8ca260a75886ae57f157049605,Command:[/bin/bash],Args:[-c 
/usr/local/bin/kolla_start],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:KOLLA_BOOTSTRAP,Value:true,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:scripts,ReadOnly:true,MountPath:/usr/local/bin/container-scripts,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:logs,ReadOnly:false,MountPath:/var/log/placement,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:false,MountPath:/var/lib/openstack/config,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/kolla/config_files/config.json,SubPath:placement-dbsync-config.json,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:combined-ca-bundle,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-lggs4,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*42482,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod 
placement-db-sync-pqss2_openstack(50149f3f-08c3-4fd9-9590-b13fcd787897): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Dec 03 07:00:08 crc kubenswrapper[4475]: E1203 07:00:08.611555 4475 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"placement-db-sync\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/placement-db-sync-pqss2" podUID="50149f3f-08c3-4fd9-9590-b13fcd787897" Dec 03 07:00:08 crc kubenswrapper[4475]: E1203 07:00:08.974316 4475 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"placement-db-sync\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.rdoproject.org/podified-antelope-centos9/openstack-placement-api:65066e8ca260a75886ae57f157049605\\\"\"" pod="openstack/placement-db-sync-pqss2" podUID="50149f3f-08c3-4fd9-9590-b13fcd787897" Dec 03 07:00:09 crc kubenswrapper[4475]: I1203 07:00:09.502306 4475 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="875f3e6c-4ccd-4b70-8ea6-c5d76eedc268" path="/var/lib/kubelet/pods/875f3e6c-4ccd-4b70-8ea6-c5d76eedc268/volumes" Dec 03 07:00:12 crc kubenswrapper[4475]: E1203 07:00:12.003055 4475 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.rdoproject.org/podified-antelope-centos9/openstack-horizon:65066e8ca260a75886ae57f157049605" Dec 03 07:00:12 crc kubenswrapper[4475]: E1203 07:00:12.003397 4475 kuberuntime_image.go:55] "Failed to pull image" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.rdoproject.org/podified-antelope-centos9/openstack-horizon:65066e8ca260a75886ae57f157049605" Dec 03 07:00:12 crc kubenswrapper[4475]: E1203 07:00:12.003519 4475 kuberuntime_manager.go:1274] "Unhandled Error" err="container 
&Container{Name:horizon-log,Image:quay.rdoproject.org/podified-antelope-centos9/openstack-horizon:65066e8ca260a75886ae57f157049605,Command:[/bin/bash],Args:[-c tail -n+1 -F /var/log/horizon/horizon.log],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:n687h5h5b9h65h8dh5b9h54fhcchc9h54dh5c5h66h5bchfbhd9h548hb9hfch95h7dhfchdfh9bhd6hb9h74h698h95hddh665h5h554q,ValueFrom:nil,},EnvVar{Name:ENABLE_DESIGNATE,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_HEAT,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_IRONIC,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_MANILA,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_OCTAVIA,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_WATCHER,Value:no,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},EnvVar{Name:UNPACK_THEME,Value:true,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:logs,ReadOnly:false,MountPath:/var/log/horizon,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-bvc89,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*48,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*true,RunAsGroup:*42400,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod horizon-dd8b5f859-x5jv6_openstack(d172b2a3-c6bd-424e-8f33-25a45263d546): 
ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Dec 03 07:00:12 crc kubenswrapper[4475]: E1203 07:00:12.007555 4475 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.rdoproject.org/podified-antelope-centos9/openstack-horizon:65066e8ca260a75886ae57f157049605" Dec 03 07:00:12 crc kubenswrapper[4475]: E1203 07:00:12.007580 4475 kuberuntime_image.go:55] "Failed to pull image" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.rdoproject.org/podified-antelope-centos9/openstack-horizon:65066e8ca260a75886ae57f157049605" Dec 03 07:00:12 crc kubenswrapper[4475]: E1203 07:00:12.007655 4475 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:horizon-log,Image:quay.rdoproject.org/podified-antelope-centos9/openstack-horizon:65066e8ca260a75886ae57f157049605,Command:[/bin/bash],Args:[-c tail -n+1 -F /var/log/horizon/horizon.log],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:n575h75hb5hf5h654h549h8hb5h57fh598h698h5c5h5b5h68h676h699h675hd5h655h64fh67dh589h544hcfh669h668h9h556h5c8h648h574h54dq,ValueFrom:nil,},EnvVar{Name:ENABLE_DESIGNATE,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_HEAT,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_IRONIC,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_MANILA,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_OCTAVIA,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_WATCHER,Value:no,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},EnvVar{Name:UNPACK_THEME,Value:true,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:logs,ReadOnly:false,MountPath:/var/log/horizon,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-v2w6g,ReadOnly:true,MountPa
th:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*48,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*true,RunAsGroup:*42400,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod horizon-77ccb6ff57-mkx4c_openstack(d6182da1-5b12-4bb3-af9b-3c8cb8d7ddc8): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Dec 03 07:00:12 crc kubenswrapper[4475]: E1203 07:00:12.009505 4475 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"horizon-log\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\", failed to \"StartContainer\" for \"horizon\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.rdoproject.org/podified-antelope-centos9/openstack-horizon:65066e8ca260a75886ae57f157049605\\\"\"]" pod="openstack/horizon-dd8b5f859-x5jv6" podUID="d172b2a3-c6bd-424e-8f33-25a45263d546" Dec 03 07:00:12 crc kubenswrapper[4475]: E1203 07:00:12.009575 4475 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"horizon-log\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\", failed to \"StartContainer\" for \"horizon\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.rdoproject.org/podified-antelope-centos9/openstack-horizon:65066e8ca260a75886ae57f157049605\\\"\"]" 
pod="openstack/horizon-77ccb6ff57-mkx4c" podUID="d6182da1-5b12-4bb3-af9b-3c8cb8d7ddc8" Dec 03 07:00:15 crc kubenswrapper[4475]: I1203 07:00:15.846871 4475 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-6476ddd6b5-ppjlc" podUID="87fdba6a-e11c-4bbd-becc-78999065efa8" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.107:5353: i/o timeout" Dec 03 07:00:15 crc kubenswrapper[4475]: I1203 07:00:15.847416 4475 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-6476ddd6b5-ppjlc" Dec 03 07:00:19 crc kubenswrapper[4475]: I1203 07:00:19.501317 4475 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-glm7p"] Dec 03 07:00:19 crc kubenswrapper[4475]: I1203 07:00:19.503320 4475 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-glm7p" Dec 03 07:00:19 crc kubenswrapper[4475]: I1203 07:00:19.514461 4475 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-glm7p"] Dec 03 07:00:19 crc kubenswrapper[4475]: I1203 07:00:19.547830 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/905141fd-3de4-45c4-bffb-45934f8ea6d3-utilities\") pod \"certified-operators-glm7p\" (UID: \"905141fd-3de4-45c4-bffb-45934f8ea6d3\") " pod="openshift-marketplace/certified-operators-glm7p" Dec 03 07:00:19 crc kubenswrapper[4475]: I1203 07:00:19.547892 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zsdkd\" (UniqueName: \"kubernetes.io/projected/905141fd-3de4-45c4-bffb-45934f8ea6d3-kube-api-access-zsdkd\") pod \"certified-operators-glm7p\" (UID: \"905141fd-3de4-45c4-bffb-45934f8ea6d3\") " pod="openshift-marketplace/certified-operators-glm7p" Dec 03 07:00:19 crc kubenswrapper[4475]: I1203 
07:00:19.548136 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/905141fd-3de4-45c4-bffb-45934f8ea6d3-catalog-content\") pod \"certified-operators-glm7p\" (UID: \"905141fd-3de4-45c4-bffb-45934f8ea6d3\") " pod="openshift-marketplace/certified-operators-glm7p" Dec 03 07:00:19 crc kubenswrapper[4475]: I1203 07:00:19.649366 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/905141fd-3de4-45c4-bffb-45934f8ea6d3-catalog-content\") pod \"certified-operators-glm7p\" (UID: \"905141fd-3de4-45c4-bffb-45934f8ea6d3\") " pod="openshift-marketplace/certified-operators-glm7p" Dec 03 07:00:19 crc kubenswrapper[4475]: I1203 07:00:19.649463 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/905141fd-3de4-45c4-bffb-45934f8ea6d3-utilities\") pod \"certified-operators-glm7p\" (UID: \"905141fd-3de4-45c4-bffb-45934f8ea6d3\") " pod="openshift-marketplace/certified-operators-glm7p" Dec 03 07:00:19 crc kubenswrapper[4475]: I1203 07:00:19.649495 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zsdkd\" (UniqueName: \"kubernetes.io/projected/905141fd-3de4-45c4-bffb-45934f8ea6d3-kube-api-access-zsdkd\") pod \"certified-operators-glm7p\" (UID: \"905141fd-3de4-45c4-bffb-45934f8ea6d3\") " pod="openshift-marketplace/certified-operators-glm7p" Dec 03 07:00:19 crc kubenswrapper[4475]: I1203 07:00:19.649885 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/905141fd-3de4-45c4-bffb-45934f8ea6d3-utilities\") pod \"certified-operators-glm7p\" (UID: \"905141fd-3de4-45c4-bffb-45934f8ea6d3\") " pod="openshift-marketplace/certified-operators-glm7p" Dec 03 07:00:19 crc kubenswrapper[4475]: I1203 07:00:19.649920 
4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/905141fd-3de4-45c4-bffb-45934f8ea6d3-catalog-content\") pod \"certified-operators-glm7p\" (UID: \"905141fd-3de4-45c4-bffb-45934f8ea6d3\") " pod="openshift-marketplace/certified-operators-glm7p" Dec 03 07:00:19 crc kubenswrapper[4475]: I1203 07:00:19.664399 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zsdkd\" (UniqueName: \"kubernetes.io/projected/905141fd-3de4-45c4-bffb-45934f8ea6d3-kube-api-access-zsdkd\") pod \"certified-operators-glm7p\" (UID: \"905141fd-3de4-45c4-bffb-45934f8ea6d3\") " pod="openshift-marketplace/certified-operators-glm7p" Dec 03 07:00:19 crc kubenswrapper[4475]: I1203 07:00:19.816581 4475 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-glm7p" Dec 03 07:00:20 crc kubenswrapper[4475]: E1203 07:00:20.731477 4475 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.rdoproject.org/podified-antelope-centos9/openstack-barbican-api:65066e8ca260a75886ae57f157049605" Dec 03 07:00:20 crc kubenswrapper[4475]: E1203 07:00:20.731522 4475 kuberuntime_image.go:55] "Failed to pull image" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.rdoproject.org/podified-antelope-centos9/openstack-barbican-api:65066e8ca260a75886ae57f157049605" Dec 03 07:00:20 crc kubenswrapper[4475]: E1203 07:00:20.731611 4475 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:barbican-db-sync,Image:quay.rdoproject.org/podified-antelope-centos9/openstack-barbican-api:65066e8ca260a75886ae57f157049605,Command:[/bin/bash],Args:[-c barbican-manage db 
upgrade],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:KOLLA_BOOTSTRAP,Value:TRUE,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:db-sync-config-data,ReadOnly:true,MountPath:/etc/barbican/barbican.conf.d,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:combined-ca-bundle,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-l67b7,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*42403,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:*42403,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod barbican-db-sync-zch8h_openstack(8c7df369-c49c-4d2a-842a-a8bd41944f1b): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Dec 03 07:00:20 crc kubenswrapper[4475]: E1203 07:00:20.732764 4475 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"barbican-db-sync\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/barbican-db-sync-zch8h" 
podUID="8c7df369-c49c-4d2a-842a-a8bd41944f1b" Dec 03 07:00:20 crc kubenswrapper[4475]: I1203 07:00:20.803021 4475 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6476ddd6b5-ppjlc" Dec 03 07:00:20 crc kubenswrapper[4475]: I1203 07:00:20.848269 4475 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-6476ddd6b5-ppjlc" podUID="87fdba6a-e11c-4bbd-becc-78999065efa8" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.107:5353: i/o timeout" Dec 03 07:00:20 crc kubenswrapper[4475]: I1203 07:00:20.967829 4475 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xftn2\" (UniqueName: \"kubernetes.io/projected/87fdba6a-e11c-4bbd-becc-78999065efa8-kube-api-access-xftn2\") pod \"87fdba6a-e11c-4bbd-becc-78999065efa8\" (UID: \"87fdba6a-e11c-4bbd-becc-78999065efa8\") " Dec 03 07:00:20 crc kubenswrapper[4475]: I1203 07:00:20.967866 4475 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/87fdba6a-e11c-4bbd-becc-78999065efa8-ovsdbserver-sb\") pod \"87fdba6a-e11c-4bbd-becc-78999065efa8\" (UID: \"87fdba6a-e11c-4bbd-becc-78999065efa8\") " Dec 03 07:00:20 crc kubenswrapper[4475]: I1203 07:00:20.967888 4475 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/87fdba6a-e11c-4bbd-becc-78999065efa8-config\") pod \"87fdba6a-e11c-4bbd-becc-78999065efa8\" (UID: \"87fdba6a-e11c-4bbd-becc-78999065efa8\") " Dec 03 07:00:20 crc kubenswrapper[4475]: I1203 07:00:20.967934 4475 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/87fdba6a-e11c-4bbd-becc-78999065efa8-dns-svc\") pod \"87fdba6a-e11c-4bbd-becc-78999065efa8\" (UID: \"87fdba6a-e11c-4bbd-becc-78999065efa8\") " Dec 03 07:00:20 crc kubenswrapper[4475]: I1203 
07:00:20.967956 4475 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/87fdba6a-e11c-4bbd-becc-78999065efa8-ovsdbserver-nb\") pod \"87fdba6a-e11c-4bbd-becc-78999065efa8\" (UID: \"87fdba6a-e11c-4bbd-becc-78999065efa8\") " Dec 03 07:00:20 crc kubenswrapper[4475]: I1203 07:00:20.980479 4475 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/87fdba6a-e11c-4bbd-becc-78999065efa8-kube-api-access-xftn2" (OuterVolumeSpecName: "kube-api-access-xftn2") pod "87fdba6a-e11c-4bbd-becc-78999065efa8" (UID: "87fdba6a-e11c-4bbd-becc-78999065efa8"). InnerVolumeSpecName "kube-api-access-xftn2". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 07:00:21 crc kubenswrapper[4475]: I1203 07:00:21.000955 4475 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/87fdba6a-e11c-4bbd-becc-78999065efa8-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "87fdba6a-e11c-4bbd-becc-78999065efa8" (UID: "87fdba6a-e11c-4bbd-becc-78999065efa8"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 07:00:21 crc kubenswrapper[4475]: I1203 07:00:21.001175 4475 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/87fdba6a-e11c-4bbd-becc-78999065efa8-config" (OuterVolumeSpecName: "config") pod "87fdba6a-e11c-4bbd-becc-78999065efa8" (UID: "87fdba6a-e11c-4bbd-becc-78999065efa8"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 07:00:21 crc kubenswrapper[4475]: I1203 07:00:21.002325 4475 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/87fdba6a-e11c-4bbd-becc-78999065efa8-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "87fdba6a-e11c-4bbd-becc-78999065efa8" (UID: "87fdba6a-e11c-4bbd-becc-78999065efa8"). 
InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 07:00:21 crc kubenswrapper[4475]: I1203 07:00:21.005124 4475 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/87fdba6a-e11c-4bbd-becc-78999065efa8-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "87fdba6a-e11c-4bbd-becc-78999065efa8" (UID: "87fdba6a-e11c-4bbd-becc-78999065efa8"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 07:00:21 crc kubenswrapper[4475]: E1203 07:00:21.016194 4475 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.rdoproject.org/podified-antelope-centos9/openstack-heat-engine:65066e8ca260a75886ae57f157049605" Dec 03 07:00:21 crc kubenswrapper[4475]: E1203 07:00:21.016233 4475 kuberuntime_image.go:55] "Failed to pull image" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.rdoproject.org/podified-antelope-centos9/openstack-heat-engine:65066e8ca260a75886ae57f157049605" Dec 03 07:00:21 crc kubenswrapper[4475]: E1203 07:00:21.016317 4475 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:heat-db-sync,Image:quay.rdoproject.org/podified-antelope-centos9/openstack-heat-engine:65066e8ca260a75886ae57f157049605,Command:[/bin/bash],Args:[-c /usr/bin/heat-manage --config-dir /etc/heat/heat.conf.d 
db_sync],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:KOLLA_BOOTSTRAP,Value:true,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config-data,ReadOnly:true,MountPath:/etc/heat/heat.conf.d/00-default.conf,SubPath:00-default.conf,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:false,MountPath:/etc/heat/heat.conf.d/01-custom.conf,SubPath:01-custom.conf,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/etc/my.cnf,SubPath:my.cnf,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:combined-ca-bundle,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-dktz8,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*42418,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*42418,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod heat-db-sync-bxx54_openstack(7105b12e-7df5-42e5-b0cc-27ea52ea7b1c): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Dec 
03 07:00:21 crc kubenswrapper[4475]: E1203 07:00:21.017587 4475 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"heat-db-sync\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/heat-db-sync-bxx54" podUID="7105b12e-7df5-42e5-b0cc-27ea52ea7b1c" Dec 03 07:00:21 crc kubenswrapper[4475]: I1203 07:00:21.069526 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-77ccb6ff57-mkx4c" event={"ID":"d6182da1-5b12-4bb3-af9b-3c8cb8d7ddc8","Type":"ContainerDied","Data":"c65d12684da5d871613d9cf414e7387b85e668d7136a65bb2a3ee2b2babe9268"} Dec 03 07:00:21 crc kubenswrapper[4475]: I1203 07:00:21.069560 4475 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c65d12684da5d871613d9cf414e7387b85e668d7136a65bb2a3ee2b2babe9268" Dec 03 07:00:21 crc kubenswrapper[4475]: I1203 07:00:21.070104 4475 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xftn2\" (UniqueName: \"kubernetes.io/projected/87fdba6a-e11c-4bbd-becc-78999065efa8-kube-api-access-xftn2\") on node \"crc\" DevicePath \"\"" Dec 03 07:00:21 crc kubenswrapper[4475]: I1203 07:00:21.070120 4475 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/87fdba6a-e11c-4bbd-becc-78999065efa8-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Dec 03 07:00:21 crc kubenswrapper[4475]: I1203 07:00:21.070129 4475 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/87fdba6a-e11c-4bbd-becc-78999065efa8-config\") on node \"crc\" DevicePath \"\"" Dec 03 07:00:21 crc kubenswrapper[4475]: I1203 07:00:21.070136 4475 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/87fdba6a-e11c-4bbd-becc-78999065efa8-dns-svc\") on node \"crc\" DevicePath \"\"" Dec 03 07:00:21 crc kubenswrapper[4475]: I1203 07:00:21.070143 4475 
reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/87fdba6a-e11c-4bbd-becc-78999065efa8-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Dec 03 07:00:21 crc kubenswrapper[4475]: I1203 07:00:21.071373 4475 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6476ddd6b5-ppjlc" Dec 03 07:00:21 crc kubenswrapper[4475]: I1203 07:00:21.071422 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6476ddd6b5-ppjlc" event={"ID":"87fdba6a-e11c-4bbd-becc-78999065efa8","Type":"ContainerDied","Data":"b1c7348b3dcf43bba141fa6fbd5a70c4f964c989b2fad34d59d86a04a413bc3a"} Dec 03 07:00:21 crc kubenswrapper[4475]: I1203 07:00:21.072036 4475 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-77ccb6ff57-mkx4c" Dec 03 07:00:21 crc kubenswrapper[4475]: E1203 07:00:21.072317 4475 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"heat-db-sync\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.rdoproject.org/podified-antelope-centos9/openstack-heat-engine:65066e8ca260a75886ae57f157049605\\\"\"" pod="openstack/heat-db-sync-bxx54" podUID="7105b12e-7df5-42e5-b0cc-27ea52ea7b1c" Dec 03 07:00:21 crc kubenswrapper[4475]: E1203 07:00:21.072522 4475 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"barbican-db-sync\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.rdoproject.org/podified-antelope-centos9/openstack-barbican-api:65066e8ca260a75886ae57f157049605\\\"\"" pod="openstack/barbican-db-sync-zch8h" podUID="8c7df369-c49c-4d2a-842a-a8bd41944f1b" Dec 03 07:00:21 crc kubenswrapper[4475]: I1203 07:00:21.125885 4475 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6476ddd6b5-ppjlc"] Dec 03 07:00:21 crc kubenswrapper[4475]: I1203 07:00:21.131576 4475 kubelet.go:2431] "SyncLoop REMOVE" 
source="api" pods=["openstack/dnsmasq-dns-6476ddd6b5-ppjlc"] Dec 03 07:00:21 crc kubenswrapper[4475]: I1203 07:00:21.273337 4475 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-v2w6g\" (UniqueName: \"kubernetes.io/projected/d6182da1-5b12-4bb3-af9b-3c8cb8d7ddc8-kube-api-access-v2w6g\") pod \"d6182da1-5b12-4bb3-af9b-3c8cb8d7ddc8\" (UID: \"d6182da1-5b12-4bb3-af9b-3c8cb8d7ddc8\") " Dec 03 07:00:21 crc kubenswrapper[4475]: I1203 07:00:21.273386 4475 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/d6182da1-5b12-4bb3-af9b-3c8cb8d7ddc8-scripts\") pod \"d6182da1-5b12-4bb3-af9b-3c8cb8d7ddc8\" (UID: \"d6182da1-5b12-4bb3-af9b-3c8cb8d7ddc8\") " Dec 03 07:00:21 crc kubenswrapper[4475]: I1203 07:00:21.273424 4475 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/d6182da1-5b12-4bb3-af9b-3c8cb8d7ddc8-config-data\") pod \"d6182da1-5b12-4bb3-af9b-3c8cb8d7ddc8\" (UID: \"d6182da1-5b12-4bb3-af9b-3c8cb8d7ddc8\") " Dec 03 07:00:21 crc kubenswrapper[4475]: I1203 07:00:21.273492 4475 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/d6182da1-5b12-4bb3-af9b-3c8cb8d7ddc8-horizon-secret-key\") pod \"d6182da1-5b12-4bb3-af9b-3c8cb8d7ddc8\" (UID: \"d6182da1-5b12-4bb3-af9b-3c8cb8d7ddc8\") " Dec 03 07:00:21 crc kubenswrapper[4475]: I1203 07:00:21.273668 4475 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d6182da1-5b12-4bb3-af9b-3c8cb8d7ddc8-logs\") pod \"d6182da1-5b12-4bb3-af9b-3c8cb8d7ddc8\" (UID: \"d6182da1-5b12-4bb3-af9b-3c8cb8d7ddc8\") " Dec 03 07:00:21 crc kubenswrapper[4475]: I1203 07:00:21.274009 4475 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/configmap/d6182da1-5b12-4bb3-af9b-3c8cb8d7ddc8-scripts" (OuterVolumeSpecName: "scripts") pod "d6182da1-5b12-4bb3-af9b-3c8cb8d7ddc8" (UID: "d6182da1-5b12-4bb3-af9b-3c8cb8d7ddc8"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 07:00:21 crc kubenswrapper[4475]: I1203 07:00:21.274224 4475 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d6182da1-5b12-4bb3-af9b-3c8cb8d7ddc8-logs" (OuterVolumeSpecName: "logs") pod "d6182da1-5b12-4bb3-af9b-3c8cb8d7ddc8" (UID: "d6182da1-5b12-4bb3-af9b-3c8cb8d7ddc8"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 07:00:21 crc kubenswrapper[4475]: I1203 07:00:21.274380 4475 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d6182da1-5b12-4bb3-af9b-3c8cb8d7ddc8-config-data" (OuterVolumeSpecName: "config-data") pod "d6182da1-5b12-4bb3-af9b-3c8cb8d7ddc8" (UID: "d6182da1-5b12-4bb3-af9b-3c8cb8d7ddc8"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 07:00:21 crc kubenswrapper[4475]: I1203 07:00:21.276703 4475 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d6182da1-5b12-4bb3-af9b-3c8cb8d7ddc8-kube-api-access-v2w6g" (OuterVolumeSpecName: "kube-api-access-v2w6g") pod "d6182da1-5b12-4bb3-af9b-3c8cb8d7ddc8" (UID: "d6182da1-5b12-4bb3-af9b-3c8cb8d7ddc8"). InnerVolumeSpecName "kube-api-access-v2w6g". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 07:00:21 crc kubenswrapper[4475]: I1203 07:00:21.276883 4475 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d6182da1-5b12-4bb3-af9b-3c8cb8d7ddc8-horizon-secret-key" (OuterVolumeSpecName: "horizon-secret-key") pod "d6182da1-5b12-4bb3-af9b-3c8cb8d7ddc8" (UID: "d6182da1-5b12-4bb3-af9b-3c8cb8d7ddc8"). InnerVolumeSpecName "horizon-secret-key". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 07:00:21 crc kubenswrapper[4475]: I1203 07:00:21.374936 4475 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d6182da1-5b12-4bb3-af9b-3c8cb8d7ddc8-logs\") on node \"crc\" DevicePath \"\"" Dec 03 07:00:21 crc kubenswrapper[4475]: I1203 07:00:21.374972 4475 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-v2w6g\" (UniqueName: \"kubernetes.io/projected/d6182da1-5b12-4bb3-af9b-3c8cb8d7ddc8-kube-api-access-v2w6g\") on node \"crc\" DevicePath \"\"" Dec 03 07:00:21 crc kubenswrapper[4475]: I1203 07:00:21.374984 4475 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/d6182da1-5b12-4bb3-af9b-3c8cb8d7ddc8-scripts\") on node \"crc\" DevicePath \"\"" Dec 03 07:00:21 crc kubenswrapper[4475]: I1203 07:00:21.374992 4475 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/d6182da1-5b12-4bb3-af9b-3c8cb8d7ddc8-config-data\") on node \"crc\" DevicePath \"\"" Dec 03 07:00:21 crc kubenswrapper[4475]: I1203 07:00:21.375001 4475 reconciler_common.go:293] "Volume detached for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/d6182da1-5b12-4bb3-af9b-3c8cb8d7ddc8-horizon-secret-key\") on node \"crc\" DevicePath \"\"" Dec 03 07:00:21 crc kubenswrapper[4475]: I1203 07:00:21.502856 4475 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="87fdba6a-e11c-4bbd-becc-78999065efa8" path="/var/lib/kubelet/pods/87fdba6a-e11c-4bbd-becc-78999065efa8/volumes" Dec 03 07:00:21 crc kubenswrapper[4475]: E1203 07:00:21.962093 4475 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.rdoproject.org/podified-antelope-centos9/openstack-cinder-api:65066e8ca260a75886ae57f157049605" Dec 03 07:00:21 crc kubenswrapper[4475]: E1203 07:00:21.962302 4475 
kuberuntime_image.go:55] "Failed to pull image" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.rdoproject.org/podified-antelope-centos9/openstack-cinder-api:65066e8ca260a75886ae57f157049605" Dec 03 07:00:21 crc kubenswrapper[4475]: E1203 07:00:21.962438 4475 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:cinder-db-sync,Image:quay.rdoproject.org/podified-antelope-centos9/openstack-cinder-api:65066e8ca260a75886ae57f157049605,Command:[/bin/bash],Args:[-c /usr/local/bin/kolla_set_configs && /usr/local/bin/kolla_start],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:KOLLA_BOOTSTRAP,Value:TRUE,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:etc-machine-id,ReadOnly:true,MountPath:/etc/machine-id,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:scripts,ReadOnly:true,MountPath:/usr/local/bin/container-scripts,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/config-data/merged,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/etc/my.cnf,SubPath:my.cnf,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:db-sync-config-data,ReadOnly:true,MountPath:/etc/cinder/cinder.conf.d,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/kolla/config_files/config.json,SubPath:db-sync-config.json,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:combined-ca-bundle,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},Volum
eMount{Name:kube-api-access-hbqln,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:nil,Privileged:nil,SELinuxOptions:nil,RunAsUser:*0,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod cinder-db-sync-6ntz5_openstack(2a298c73-a9bf-496a-9192-dcbf3e2417cd): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Dec 03 07:00:21 crc kubenswrapper[4475]: E1203 07:00:21.963688 4475 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"cinder-db-sync\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/cinder-db-sync-6ntz5" podUID="2a298c73-a9bf-496a-9192-dcbf3e2417cd" Dec 03 07:00:21 crc kubenswrapper[4475]: I1203 07:00:21.990502 4475 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-dd8b5f859-x5jv6" Dec 03 07:00:22 crc kubenswrapper[4475]: I1203 07:00:22.080246 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-dd8b5f859-x5jv6" event={"ID":"d172b2a3-c6bd-424e-8f33-25a45263d546","Type":"ContainerDied","Data":"99c2bcf8740825692a6187b5055a337b74d164691d907c9d22d3dafef643f26f"} Dec 03 07:00:22 crc kubenswrapper[4475]: I1203 07:00:22.080338 4475 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-dd8b5f859-x5jv6" Dec 03 07:00:22 crc kubenswrapper[4475]: I1203 07:00:22.080363 4475 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-77ccb6ff57-mkx4c" Dec 03 07:00:22 crc kubenswrapper[4475]: E1203 07:00:22.085267 4475 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"cinder-db-sync\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.rdoproject.org/podified-antelope-centos9/openstack-cinder-api:65066e8ca260a75886ae57f157049605\\\"\"" pod="openstack/cinder-db-sync-6ntz5" podUID="2a298c73-a9bf-496a-9192-dcbf3e2417cd" Dec 03 07:00:22 crc kubenswrapper[4475]: I1203 07:00:22.132761 4475 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-77ccb6ff57-mkx4c"] Dec 03 07:00:22 crc kubenswrapper[4475]: I1203 07:00:22.139176 4475 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/horizon-77ccb6ff57-mkx4c"] Dec 03 07:00:22 crc kubenswrapper[4475]: E1203 07:00:22.164998 4475 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.rdoproject.org/podified-antelope-centos9/openstack-ceilometer-notification:65066e8ca260a75886ae57f157049605" Dec 03 07:00:22 crc kubenswrapper[4475]: E1203 07:00:22.165036 4475 kuberuntime_image.go:55] "Failed to pull image" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.rdoproject.org/podified-antelope-centos9/openstack-ceilometer-notification:65066e8ca260a75886ae57f157049605" Dec 03 07:00:22 crc kubenswrapper[4475]: E1203 07:00:22.165144 4475 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:ceilometer-notification-agent,Image:quay.rdoproject.org/podified-antelope-centos9/openstack-ceilometer-notification:65066e8ca260a75886ae57f157049605,Command:[/bin/bash],Args:[-c 
/usr/local/bin/kolla_start],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:n677h596h97h688h6bhfbh84hcdh54bh689h65fh565h5d9h5ddh5bhdhc8hb9h5f8h5ffh5c4h59dhd9h5c8h676h58ch58fh547h5dbhd8hf7h5f5q,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:scripts,ReadOnly:true,MountPath:/var/lib/openstack/bin,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/openstack/config,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/kolla/config_files/config.json,SubPath:ceilometer-notification-config.json,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:combined-ca-bundle,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-kn5xv,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:&ExecAction{Command:[/usr/bin/python3 
/var/lib/openstack/bin/notificationhealth.py],},HTTPGet:nil,TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:300,TimeoutSeconds:5,PeriodSeconds:30,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod ceilometer-0_openstack(12f47969-3169-43bc-8d07-cbd3952d81cf): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Dec 03 07:00:22 crc kubenswrapper[4475]: I1203 07:00:22.186534 4475 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bvc89\" (UniqueName: \"kubernetes.io/projected/d172b2a3-c6bd-424e-8f33-25a45263d546-kube-api-access-bvc89\") pod \"d172b2a3-c6bd-424e-8f33-25a45263d546\" (UID: \"d172b2a3-c6bd-424e-8f33-25a45263d546\") " Dec 03 07:00:22 crc kubenswrapper[4475]: I1203 07:00:22.186584 4475 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d172b2a3-c6bd-424e-8f33-25a45263d546-logs\") pod \"d172b2a3-c6bd-424e-8f33-25a45263d546\" (UID: \"d172b2a3-c6bd-424e-8f33-25a45263d546\") " Dec 03 07:00:22 crc kubenswrapper[4475]: I1203 07:00:22.186705 4475 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/d172b2a3-c6bd-424e-8f33-25a45263d546-horizon-secret-key\") pod 
\"d172b2a3-c6bd-424e-8f33-25a45263d546\" (UID: \"d172b2a3-c6bd-424e-8f33-25a45263d546\") " Dec 03 07:00:22 crc kubenswrapper[4475]: I1203 07:00:22.186820 4475 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/d172b2a3-c6bd-424e-8f33-25a45263d546-scripts\") pod \"d172b2a3-c6bd-424e-8f33-25a45263d546\" (UID: \"d172b2a3-c6bd-424e-8f33-25a45263d546\") " Dec 03 07:00:22 crc kubenswrapper[4475]: I1203 07:00:22.186854 4475 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/d172b2a3-c6bd-424e-8f33-25a45263d546-config-data\") pod \"d172b2a3-c6bd-424e-8f33-25a45263d546\" (UID: \"d172b2a3-c6bd-424e-8f33-25a45263d546\") " Dec 03 07:00:22 crc kubenswrapper[4475]: I1203 07:00:22.187742 4475 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d172b2a3-c6bd-424e-8f33-25a45263d546-logs" (OuterVolumeSpecName: "logs") pod "d172b2a3-c6bd-424e-8f33-25a45263d546" (UID: "d172b2a3-c6bd-424e-8f33-25a45263d546"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 07:00:22 crc kubenswrapper[4475]: I1203 07:00:22.187877 4475 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d172b2a3-c6bd-424e-8f33-25a45263d546-logs\") on node \"crc\" DevicePath \"\"" Dec 03 07:00:22 crc kubenswrapper[4475]: I1203 07:00:22.188400 4475 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d172b2a3-c6bd-424e-8f33-25a45263d546-config-data" (OuterVolumeSpecName: "config-data") pod "d172b2a3-c6bd-424e-8f33-25a45263d546" (UID: "d172b2a3-c6bd-424e-8f33-25a45263d546"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 07:00:22 crc kubenswrapper[4475]: I1203 07:00:22.189309 4475 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d172b2a3-c6bd-424e-8f33-25a45263d546-scripts" (OuterVolumeSpecName: "scripts") pod "d172b2a3-c6bd-424e-8f33-25a45263d546" (UID: "d172b2a3-c6bd-424e-8f33-25a45263d546"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 07:00:22 crc kubenswrapper[4475]: I1203 07:00:22.190579 4475 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d172b2a3-c6bd-424e-8f33-25a45263d546-kube-api-access-bvc89" (OuterVolumeSpecName: "kube-api-access-bvc89") pod "d172b2a3-c6bd-424e-8f33-25a45263d546" (UID: "d172b2a3-c6bd-424e-8f33-25a45263d546"). InnerVolumeSpecName "kube-api-access-bvc89". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 07:00:22 crc kubenswrapper[4475]: I1203 07:00:22.192018 4475 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d172b2a3-c6bd-424e-8f33-25a45263d546-horizon-secret-key" (OuterVolumeSpecName: "horizon-secret-key") pod "d172b2a3-c6bd-424e-8f33-25a45263d546" (UID: "d172b2a3-c6bd-424e-8f33-25a45263d546"). InnerVolumeSpecName "horizon-secret-key". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 07:00:22 crc kubenswrapper[4475]: I1203 07:00:22.193640 4475 scope.go:117] "RemoveContainer" containerID="11147d68d46c01522e43ee435bcfcdeaa53c9bf1705ddff27409d7f60fcf3505" Dec 03 07:00:22 crc kubenswrapper[4475]: I1203 07:00:22.297535 4475 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bvc89\" (UniqueName: \"kubernetes.io/projected/d172b2a3-c6bd-424e-8f33-25a45263d546-kube-api-access-bvc89\") on node \"crc\" DevicePath \"\"" Dec 03 07:00:22 crc kubenswrapper[4475]: I1203 07:00:22.297711 4475 reconciler_common.go:293] "Volume detached for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/d172b2a3-c6bd-424e-8f33-25a45263d546-horizon-secret-key\") on node \"crc\" DevicePath \"\"" Dec 03 07:00:22 crc kubenswrapper[4475]: I1203 07:00:22.297722 4475 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/d172b2a3-c6bd-424e-8f33-25a45263d546-scripts\") on node \"crc\" DevicePath \"\"" Dec 03 07:00:22 crc kubenswrapper[4475]: I1203 07:00:22.297729 4475 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/d172b2a3-c6bd-424e-8f33-25a45263d546-config-data\") on node \"crc\" DevicePath \"\"" Dec 03 07:00:22 crc kubenswrapper[4475]: I1203 07:00:22.368909 4475 scope.go:117] "RemoveContainer" containerID="c9d9f24e460f40acbdabd6a1c9925db16f56c53012d32344327cd5e2a4943c14" Dec 03 07:00:22 crc kubenswrapper[4475]: I1203 07:00:22.447809 4475 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-dd8b5f859-x5jv6"] Dec 03 07:00:22 crc kubenswrapper[4475]: I1203 07:00:22.497406 4475 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/horizon-dd8b5f859-x5jv6"] Dec 03 07:00:22 crc kubenswrapper[4475]: I1203 07:00:22.516818 4475 scope.go:117] "RemoveContainer" containerID="d949ab98dfd9a323f3d5eb5ecfde99a9124e11883d30d48d6554020c1d19f95e" Dec 03 07:00:22 crc 
kubenswrapper[4475]: I1203 07:00:22.677652 4475 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-695fd7c4bb-h85zh"] Dec 03 07:00:22 crc kubenswrapper[4475]: I1203 07:00:22.682609 4475 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-7fc4d79b88-s8hhg"] Dec 03 07:00:22 crc kubenswrapper[4475]: I1203 07:00:22.902730 4475 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29412420-2tfcl"] Dec 03 07:00:22 crc kubenswrapper[4475]: I1203 07:00:22.975805 4475 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Dec 03 07:00:23 crc kubenswrapper[4475]: I1203 07:00:23.017872 4475 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-bk4n9"] Dec 03 07:00:23 crc kubenswrapper[4475]: I1203 07:00:23.031589 4475 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-6c7bf79b95-knv75"] Dec 03 07:00:23 crc kubenswrapper[4475]: W1203 07:00:23.059353 4475 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podcd709367_66b2_4586_b3ba_424d4c1532ee.slice/crio-7156319f63f2ea4fbb7acd862764bb6b088a47ff06d9e5d9c4af50d1d1a7939c WatchSource:0}: Error finding container 7156319f63f2ea4fbb7acd862764bb6b088a47ff06d9e5d9c4af50d1d1a7939c: Status 404 returned error can't find the container with id 7156319f63f2ea4fbb7acd862764bb6b088a47ff06d9e5d9c4af50d1d1a7939c Dec 03 07:00:23 crc kubenswrapper[4475]: I1203 07:00:23.095189 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6c7bf79b95-knv75" event={"ID":"f1be585f-cb41-49eb-83b7-9c25c757c739","Type":"ContainerStarted","Data":"ff9956979f7edeb7e5276a5a737ccca2b4a4b60a38a8ef787caee182c2db46d2"} Dec 03 07:00:23 crc kubenswrapper[4475]: I1203 07:00:23.099520 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/horizon-5db4d85d89-t96xs" event={"ID":"e786a238-51fe-464f-bcc8-54d35b24e9cf","Type":"ContainerStarted","Data":"ba8127af1ef75e1512a821af33fc5ef2098d06e9a5ce6b02882490f4c4a3eabb"} Dec 03 07:00:23 crc kubenswrapper[4475]: I1203 07:00:23.099558 4475 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Dec 03 07:00:23 crc kubenswrapper[4475]: I1203 07:00:23.099575 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-5db4d85d89-t96xs" event={"ID":"e786a238-51fe-464f-bcc8-54d35b24e9cf","Type":"ContainerStarted","Data":"5f7071580d3dd8c9404a7e72e85f7852838a7bb54f8158e86de328bc128c65db"} Dec 03 07:00:23 crc kubenswrapper[4475]: I1203 07:00:23.099571 4475 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/horizon-5db4d85d89-t96xs" podUID="e786a238-51fe-464f-bcc8-54d35b24e9cf" containerName="horizon-log" containerID="cri-o://5f7071580d3dd8c9404a7e72e85f7852838a7bb54f8158e86de328bc128c65db" gracePeriod=30 Dec 03 07:00:23 crc kubenswrapper[4475]: I1203 07:00:23.099641 4475 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/horizon-5db4d85d89-t96xs" podUID="e786a238-51fe-464f-bcc8-54d35b24e9cf" containerName="horizon" containerID="cri-o://ba8127af1ef75e1512a821af33fc5ef2098d06e9a5ce6b02882490f4c4a3eabb" gracePeriod=30 Dec 03 07:00:23 crc kubenswrapper[4475]: I1203 07:00:23.106564 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-7fc4d79b88-s8hhg" event={"ID":"2401beb9-38b8-4581-b9a2-8bb16e15e6c1","Type":"ContainerStarted","Data":"73f4e1b7c5c20603207f1f81c578b3f50cf378e84d17d25eb6f5c11f11d4ea02"} Dec 03 07:00:23 crc kubenswrapper[4475]: I1203 07:00:23.106590 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-7fc4d79b88-s8hhg" event={"ID":"2401beb9-38b8-4581-b9a2-8bb16e15e6c1","Type":"ContainerStarted","Data":"9e9d970ff8e874ce94e3c25e67e9d19e801b3a9b04bd0023ba3c2afbbc072fa6"} 
Dec 03 07:00:23 crc kubenswrapper[4475]: I1203 07:00:23.110917 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-695fd7c4bb-h85zh" event={"ID":"1f10037d-d109-4863-b89f-c33f39b0848a","Type":"ContainerStarted","Data":"12d71d60d32df97c1bf16bc943ff61a4eed0438f40a71325111f5e7fe725b5cc"} Dec 03 07:00:23 crc kubenswrapper[4475]: I1203 07:00:23.110957 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-695fd7c4bb-h85zh" event={"ID":"1f10037d-d109-4863-b89f-c33f39b0848a","Type":"ContainerStarted","Data":"466f91eb63d61f67d90e86d2514eea81dd9f71744053d0a1fbc4d798390379c1"} Dec 03 07:00:23 crc kubenswrapper[4475]: I1203 07:00:23.113213 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-pqss2" event={"ID":"50149f3f-08c3-4fd9-9590-b13fcd787897","Type":"ContainerStarted","Data":"4c9e9074106173f250a545c8cbd9265aa037db239d83cf6168e0bcc58ac22e8b"} Dec 03 07:00:23 crc kubenswrapper[4475]: I1203 07:00:23.115536 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"2604ecb1-3054-4b63-9a8d-3880ee519c58","Type":"ContainerStarted","Data":"fdf05e9ea21bd26a7cc37d8980e2c33353d700d2d33bfde896506aa85d2d16c3"} Dec 03 07:00:23 crc kubenswrapper[4475]: I1203 07:00:23.116447 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29412420-2tfcl" event={"ID":"9de8472e-1891-4e1c-8db7-fb458c212969","Type":"ContainerStarted","Data":"ebccf5e2166f646a553ecbfbe6ab23e12d9384a517b1b93070397caa4add0df7"} Dec 03 07:00:23 crc kubenswrapper[4475]: I1203 07:00:23.135913 4475 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/horizon-5db4d85d89-t96xs" podStartSLOduration=4.136059509 podStartE2EDuration="35.135887885s" podCreationTimestamp="2025-12-03 06:59:48 +0000 UTC" firstStartedPulling="2025-12-03 06:59:49.75318511 +0000 UTC m=+874.558083444" 
lastFinishedPulling="2025-12-03 07:00:20.753013475 +0000 UTC m=+905.557911820" observedRunningTime="2025-12-03 07:00:23.12910703 +0000 UTC m=+907.934005364" watchObservedRunningTime="2025-12-03 07:00:23.135887885 +0000 UTC m=+907.940786219" Dec 03 07:00:23 crc kubenswrapper[4475]: I1203 07:00:23.156226 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-bk4n9" event={"ID":"cd709367-66b2-4586-b3ba-424d4c1532ee","Type":"ContainerStarted","Data":"7156319f63f2ea4fbb7acd862764bb6b088a47ff06d9e5d9c4af50d1d1a7939c"} Dec 03 07:00:23 crc kubenswrapper[4475]: I1203 07:00:23.158007 4475 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/placement-db-sync-pqss2" podStartSLOduration=2.454842824 podStartE2EDuration="37.157988958s" podCreationTimestamp="2025-12-03 06:59:46 +0000 UTC" firstStartedPulling="2025-12-03 06:59:47.799776365 +0000 UTC m=+872.604674699" lastFinishedPulling="2025-12-03 07:00:22.5029225 +0000 UTC m=+907.307820833" observedRunningTime="2025-12-03 07:00:23.150283163 +0000 UTC m=+907.955181497" watchObservedRunningTime="2025-12-03 07:00:23.157988958 +0000 UTC m=+907.962887292" Dec 03 07:00:23 crc kubenswrapper[4475]: W1203 07:00:23.168664 4475 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod5d024ccd_9b9c_4656_a6c9_88c6d524960c.slice/crio-e9acde908258dc1084654ed92bb44f89acd97a97f817a787c21b6bab6e21d48e WatchSource:0}: Error finding container e9acde908258dc1084654ed92bb44f89acd97a97f817a787c21b6bab6e21d48e: Status 404 returned error can't find the container with id e9acde908258dc1084654ed92bb44f89acd97a97f817a787c21b6bab6e21d48e Dec 03 07:00:23 crc kubenswrapper[4475]: I1203 07:00:23.194936 4475 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-7546fbf47b-6g4gw"] Dec 03 07:00:23 crc kubenswrapper[4475]: W1203 07:00:23.236042 4475 manager.go:1169] Failed to process watch event 
{EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd620fdbb_7e93_46f8_95e1_18c9f9aab8f0.slice/crio-cf949f94f3434d42a7e55b3a181e0c5b1dfbe6fe92ada6cbd05bc62d576eff70 WatchSource:0}: Error finding container cf949f94f3434d42a7e55b3a181e0c5b1dfbe6fe92ada6cbd05bc62d576eff70: Status 404 returned error can't find the container with id cf949f94f3434d42a7e55b3a181e0c5b1dfbe6fe92ada6cbd05bc62d576eff70 Dec 03 07:00:23 crc kubenswrapper[4475]: I1203 07:00:23.239960 4475 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-glm7p"] Dec 03 07:00:23 crc kubenswrapper[4475]: I1203 07:00:23.253104 4475 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-65f94"] Dec 03 07:00:23 crc kubenswrapper[4475]: W1203 07:00:23.285380 4475 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod905141fd_3de4_45c4_bffb_45934f8ea6d3.slice/crio-eda29046bc4e3595e4be167a955ef42f5bb0b044e03c77fad16f08627800df10 WatchSource:0}: Error finding container eda29046bc4e3595e4be167a955ef42f5bb0b044e03c77fad16f08627800df10: Status 404 returned error can't find the container with id eda29046bc4e3595e4be167a955ef42f5bb0b044e03c77fad16f08627800df10 Dec 03 07:00:23 crc kubenswrapper[4475]: I1203 07:00:23.306924 4475 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-8798f5df-qtg6w"] Dec 03 07:00:23 crc kubenswrapper[4475]: I1203 07:00:23.320435 4475 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"osp-secret" Dec 03 07:00:23 crc kubenswrapper[4475]: I1203 07:00:23.515094 4475 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d172b2a3-c6bd-424e-8f33-25a45263d546" path="/var/lib/kubelet/pods/d172b2a3-c6bd-424e-8f33-25a45263d546/volumes" Dec 03 07:00:23 crc kubenswrapper[4475]: I1203 07:00:23.515695 4475 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" 
podUID="d6182da1-5b12-4bb3-af9b-3c8cb8d7ddc8" path="/var/lib/kubelet/pods/d6182da1-5b12-4bb3-af9b-3c8cb8d7ddc8/volumes" Dec 03 07:00:24 crc kubenswrapper[4475]: I1203 07:00:24.179713 4475 generic.go:334] "Generic (PLEG): container finished" podID="905141fd-3de4-45c4-bffb-45934f8ea6d3" containerID="c9efe999bd33dc9f26e7714db719ab40201a623e9049dc9640a740b805625359" exitCode=0 Dec 03 07:00:24 crc kubenswrapper[4475]: I1203 07:00:24.180008 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-glm7p" event={"ID":"905141fd-3de4-45c4-bffb-45934f8ea6d3","Type":"ContainerDied","Data":"c9efe999bd33dc9f26e7714db719ab40201a623e9049dc9640a740b805625359"} Dec 03 07:00:24 crc kubenswrapper[4475]: I1203 07:00:24.180301 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-glm7p" event={"ID":"905141fd-3de4-45c4-bffb-45934f8ea6d3","Type":"ContainerStarted","Data":"eda29046bc4e3595e4be167a955ef42f5bb0b044e03c77fad16f08627800df10"} Dec 03 07:00:24 crc kubenswrapper[4475]: I1203 07:00:24.191535 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-7fc4d79b88-s8hhg" event={"ID":"2401beb9-38b8-4581-b9a2-8bb16e15e6c1","Type":"ContainerStarted","Data":"7c91bf42ad95717e038e9fab87109b6eaa62ab56bd4ac92990e6946b07c3fc2f"} Dec 03 07:00:24 crc kubenswrapper[4475]: I1203 07:00:24.205753 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-8798f5df-qtg6w" event={"ID":"cf14cca6-0927-4a49-9c3b-70dc49f21c47","Type":"ContainerStarted","Data":"440dc37fef8814f783568a1d202a9a26c3ea3934e1de321caac29f8d17bc4f8a"} Dec 03 07:00:24 crc kubenswrapper[4475]: I1203 07:00:24.205826 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-8798f5df-qtg6w" event={"ID":"cf14cca6-0927-4a49-9c3b-70dc49f21c47","Type":"ContainerStarted","Data":"1c08b1cd2617a128945439f98e79d34b60a3d1a3c4d829363e8f3d30574b9b79"} Dec 03 07:00:24 crc kubenswrapper[4475]: I1203 
07:00:24.207711 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-7546fbf47b-6g4gw" event={"ID":"d620fdbb-7e93-46f8-95e1-18c9f9aab8f0","Type":"ContainerStarted","Data":"cf949f94f3434d42a7e55b3a181e0c5b1dfbe6fe92ada6cbd05bc62d576eff70"} Dec 03 07:00:24 crc kubenswrapper[4475]: I1203 07:00:24.211102 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"5d024ccd-9b9c-4656-a6c9-88c6d524960c","Type":"ContainerStarted","Data":"e9acde908258dc1084654ed92bb44f89acd97a97f817a787c21b6bab6e21d48e"} Dec 03 07:00:24 crc kubenswrapper[4475]: I1203 07:00:24.224583 4475 generic.go:334] "Generic (PLEG): container finished" podID="cd709367-66b2-4586-b3ba-424d4c1532ee" containerID="ab67e3437a2127861169da71e13cbf727667933c83585ef921517ca855c8e2c9" exitCode=0 Dec 03 07:00:24 crc kubenswrapper[4475]: I1203 07:00:24.224652 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-bk4n9" event={"ID":"cd709367-66b2-4586-b3ba-424d4c1532ee","Type":"ContainerDied","Data":"ab67e3437a2127861169da71e13cbf727667933c83585ef921517ca855c8e2c9"} Dec 03 07:00:24 crc kubenswrapper[4475]: I1203 07:00:24.226016 4475 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/horizon-7fc4d79b88-s8hhg" podStartSLOduration=30.226003149 podStartE2EDuration="30.226003149s" podCreationTimestamp="2025-12-03 06:59:54 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 07:00:24.221337663 +0000 UTC m=+909.026235997" watchObservedRunningTime="2025-12-03 07:00:24.226003149 +0000 UTC m=+909.030901483" Dec 03 07:00:24 crc kubenswrapper[4475]: I1203 07:00:24.244370 4475 generic.go:334] "Generic (PLEG): container finished" podID="f1be585f-cb41-49eb-83b7-9c25c757c739" containerID="69018c9dad70106cc7bbbad69455a45b3a59a6fa572c73a1e224fda5f46821cc" exitCode=0 Dec 03 07:00:24 crc 
kubenswrapper[4475]: I1203 07:00:24.244419 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6c7bf79b95-knv75" event={"ID":"f1be585f-cb41-49eb-83b7-9c25c757c739","Type":"ContainerDied","Data":"69018c9dad70106cc7bbbad69455a45b3a59a6fa572c73a1e224fda5f46821cc"} Dec 03 07:00:24 crc kubenswrapper[4475]: I1203 07:00:24.254131 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-65f94" event={"ID":"18fecb56-f151-4f5a-aac3-30785def9653","Type":"ContainerStarted","Data":"528ff7750f99b9123bf14682a7f8718ad6b933e2763eda799613807008d287f5"} Dec 03 07:00:24 crc kubenswrapper[4475]: I1203 07:00:24.267857 4475 generic.go:334] "Generic (PLEG): container finished" podID="9de8472e-1891-4e1c-8db7-fb458c212969" containerID="0c66f8f84c01b8f5a3385cd650dba391c2dde27470dadf4f5c132b51254c7071" exitCode=0 Dec 03 07:00:24 crc kubenswrapper[4475]: I1203 07:00:24.267953 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29412420-2tfcl" event={"ID":"9de8472e-1891-4e1c-8db7-fb458c212969","Type":"ContainerDied","Data":"0c66f8f84c01b8f5a3385cd650dba391c2dde27470dadf4f5c132b51254c7071"} Dec 03 07:00:24 crc kubenswrapper[4475]: I1203 07:00:24.282571 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-695fd7c4bb-h85zh" event={"ID":"1f10037d-d109-4863-b89f-c33f39b0848a","Type":"ContainerStarted","Data":"946030133b46f679ceb6a10d39171053643a81cb5786707a8f48ddc2e86bcdc5"} Dec 03 07:00:24 crc kubenswrapper[4475]: I1203 07:00:24.317792 4475 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/horizon-695fd7c4bb-h85zh" podStartSLOduration=30.317776387 podStartE2EDuration="30.317776387s" podCreationTimestamp="2025-12-03 06:59:54 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 07:00:24.306281205 +0000 UTC m=+909.111179539" 
watchObservedRunningTime="2025-12-03 07:00:24.317776387 +0000 UTC m=+909.122674722" Dec 03 07:00:24 crc kubenswrapper[4475]: I1203 07:00:24.824578 4475 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/horizon-7fc4d79b88-s8hhg" Dec 03 07:00:24 crc kubenswrapper[4475]: I1203 07:00:24.824867 4475 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/horizon-7fc4d79b88-s8hhg" Dec 03 07:00:25 crc kubenswrapper[4475]: I1203 07:00:25.032777 4475 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/horizon-695fd7c4bb-h85zh" Dec 03 07:00:25 crc kubenswrapper[4475]: I1203 07:00:25.033158 4475 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/horizon-695fd7c4bb-h85zh" Dec 03 07:00:25 crc kubenswrapper[4475]: I1203 07:00:25.308005 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6c7bf79b95-knv75" event={"ID":"f1be585f-cb41-49eb-83b7-9c25c757c739","Type":"ContainerStarted","Data":"1278a78e4bf596a21b64a805c5b53ce6119440830954781498d74ad20293e6ec"} Dec 03 07:00:25 crc kubenswrapper[4475]: I1203 07:00:25.309051 4475 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-6c7bf79b95-knv75" Dec 03 07:00:25 crc kubenswrapper[4475]: I1203 07:00:25.311800 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-65f94" event={"ID":"18fecb56-f151-4f5a-aac3-30785def9653","Type":"ContainerStarted","Data":"74038a224524ed26b69b79829df0410aa86f857470a3f1410e97408d05a1f23c"} Dec 03 07:00:25 crc kubenswrapper[4475]: I1203 07:00:25.315405 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-8798f5df-qtg6w" event={"ID":"cf14cca6-0927-4a49-9c3b-70dc49f21c47","Type":"ContainerStarted","Data":"3ae21905fddfd4a80f665d37ed6f4dd956f5c267d985a75828da67f6384d111e"} Dec 03 07:00:25 crc kubenswrapper[4475]: I1203 07:00:25.315907 4475 kubelet.go:2542] "SyncLoop 
(probe)" probe="readiness" status="" pod="openstack/neutron-8798f5df-qtg6w" Dec 03 07:00:25 crc kubenswrapper[4475]: I1203 07:00:25.317292 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-7546fbf47b-6g4gw" event={"ID":"d620fdbb-7e93-46f8-95e1-18c9f9aab8f0","Type":"ContainerStarted","Data":"e7c8adddfe68e490bef6ff8bf6eb5ad35a21fe979145033da3306295ded650f6"} Dec 03 07:00:25 crc kubenswrapper[4475]: I1203 07:00:25.317317 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-7546fbf47b-6g4gw" event={"ID":"d620fdbb-7e93-46f8-95e1-18c9f9aab8f0","Type":"ContainerStarted","Data":"1c4da307c0c467a21013373e22ee9707f5603d2eae92ff464829b1780262ed39"} Dec 03 07:00:25 crc kubenswrapper[4475]: I1203 07:00:25.317705 4475 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/neutron-7546fbf47b-6g4gw" Dec 03 07:00:25 crc kubenswrapper[4475]: I1203 07:00:25.338325 4475 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-6c7bf79b95-knv75" podStartSLOduration=20.338312851 podStartE2EDuration="20.338312851s" podCreationTimestamp="2025-12-03 07:00:05 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 07:00:25.326573829 +0000 UTC m=+910.131472163" watchObservedRunningTime="2025-12-03 07:00:25.338312851 +0000 UTC m=+910.143211185" Dec 03 07:00:25 crc kubenswrapper[4475]: I1203 07:00:25.348125 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"5d024ccd-9b9c-4656-a6c9-88c6d524960c","Type":"ContainerStarted","Data":"0bbd05fc0dd7e7111a65c6f7a285b3b2875035ee605e1fa046bf17584914bc7c"} Dec 03 07:00:25 crc kubenswrapper[4475]: I1203 07:00:25.353380 4475 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-8798f5df-qtg6w" podStartSLOduration=18.353371287 
podStartE2EDuration="18.353371287s" podCreationTimestamp="2025-12-03 07:00:07 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 07:00:25.341894538 +0000 UTC m=+910.146792872" watchObservedRunningTime="2025-12-03 07:00:25.353371287 +0000 UTC m=+910.158269621" Dec 03 07:00:25 crc kubenswrapper[4475]: I1203 07:00:25.370634 4475 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-bootstrap-65f94" podStartSLOduration=18.370623147 podStartE2EDuration="18.370623147s" podCreationTimestamp="2025-12-03 07:00:07 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 07:00:25.365379675 +0000 UTC m=+910.170278009" watchObservedRunningTime="2025-12-03 07:00:25.370623147 +0000 UTC m=+910.175521482" Dec 03 07:00:25 crc kubenswrapper[4475]: I1203 07:00:25.394097 4475 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-7546fbf47b-6g4gw" podStartSLOduration=20.394086813 podStartE2EDuration="20.394086813s" podCreationTimestamp="2025-12-03 07:00:05 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 07:00:25.392115566 +0000 UTC m=+910.197013910" watchObservedRunningTime="2025-12-03 07:00:25.394086813 +0000 UTC m=+910.198985148" Dec 03 07:00:25 crc kubenswrapper[4475]: I1203 07:00:25.401494 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"2604ecb1-3054-4b63-9a8d-3880ee519c58","Type":"ContainerStarted","Data":"4afaec11a8cc3ea92bfd0c41412a193645e87900cd38f199ef4aaef2eb86fa3c"} Dec 03 07:00:25 crc kubenswrapper[4475]: E1203 07:00:25.751757 4475 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: 
[\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod905141fd_3de4_45c4_bffb_45934f8ea6d3.slice/crio-4823326e9a8f847ed1956d42ba0df515a55060f14fecf121285347dba1e9db5f.scope\": RecentStats: unable to find data in memory cache]" Dec 03 07:00:25 crc kubenswrapper[4475]: I1203 07:00:25.994660 4475 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29412420-2tfcl" Dec 03 07:00:26 crc kubenswrapper[4475]: I1203 07:00:26.151430 4475 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/9de8472e-1891-4e1c-8db7-fb458c212969-secret-volume\") pod \"9de8472e-1891-4e1c-8db7-fb458c212969\" (UID: \"9de8472e-1891-4e1c-8db7-fb458c212969\") " Dec 03 07:00:26 crc kubenswrapper[4475]: I1203 07:00:26.151493 4475 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9cg2h\" (UniqueName: \"kubernetes.io/projected/9de8472e-1891-4e1c-8db7-fb458c212969-kube-api-access-9cg2h\") pod \"9de8472e-1891-4e1c-8db7-fb458c212969\" (UID: \"9de8472e-1891-4e1c-8db7-fb458c212969\") " Dec 03 07:00:26 crc kubenswrapper[4475]: I1203 07:00:26.151530 4475 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/9de8472e-1891-4e1c-8db7-fb458c212969-config-volume\") pod \"9de8472e-1891-4e1c-8db7-fb458c212969\" (UID: \"9de8472e-1891-4e1c-8db7-fb458c212969\") " Dec 03 07:00:26 crc kubenswrapper[4475]: I1203 07:00:26.152297 4475 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9de8472e-1891-4e1c-8db7-fb458c212969-config-volume" (OuterVolumeSpecName: "config-volume") pod "9de8472e-1891-4e1c-8db7-fb458c212969" (UID: "9de8472e-1891-4e1c-8db7-fb458c212969"). InnerVolumeSpecName "config-volume". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 07:00:26 crc kubenswrapper[4475]: I1203 07:00:26.160272 4475 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9de8472e-1891-4e1c-8db7-fb458c212969-kube-api-access-9cg2h" (OuterVolumeSpecName: "kube-api-access-9cg2h") pod "9de8472e-1891-4e1c-8db7-fb458c212969" (UID: "9de8472e-1891-4e1c-8db7-fb458c212969"). InnerVolumeSpecName "kube-api-access-9cg2h". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 07:00:26 crc kubenswrapper[4475]: I1203 07:00:26.175633 4475 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9de8472e-1891-4e1c-8db7-fb458c212969-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "9de8472e-1891-4e1c-8db7-fb458c212969" (UID: "9de8472e-1891-4e1c-8db7-fb458c212969"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 07:00:26 crc kubenswrapper[4475]: I1203 07:00:26.253528 4475 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/9de8472e-1891-4e1c-8db7-fb458c212969-secret-volume\") on node \"crc\" DevicePath \"\"" Dec 03 07:00:26 crc kubenswrapper[4475]: I1203 07:00:26.253564 4475 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9cg2h\" (UniqueName: \"kubernetes.io/projected/9de8472e-1891-4e1c-8db7-fb458c212969-kube-api-access-9cg2h\") on node \"crc\" DevicePath \"\"" Dec 03 07:00:26 crc kubenswrapper[4475]: I1203 07:00:26.253580 4475 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/9de8472e-1891-4e1c-8db7-fb458c212969-config-volume\") on node \"crc\" DevicePath \"\"" Dec 03 07:00:26 crc kubenswrapper[4475]: I1203 07:00:26.417050 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" 
event={"ID":"5d024ccd-9b9c-4656-a6c9-88c6d524960c","Type":"ContainerStarted","Data":"0bb900b63b700af9cf8b01415f76d8b98b1d3870a1c25b553f5514be65ff0184"} Dec 03 07:00:26 crc kubenswrapper[4475]: I1203 07:00:26.430561 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-bk4n9" event={"ID":"cd709367-66b2-4586-b3ba-424d4c1532ee","Type":"ContainerStarted","Data":"fd967aeb76da537671c4a993553a49caeb764bac88e302799f05bec146346a03"} Dec 03 07:00:26 crc kubenswrapper[4475]: I1203 07:00:26.435864 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"2604ecb1-3054-4b63-9a8d-3880ee519c58","Type":"ContainerStarted","Data":"9c1311c38911be15a3438d7290ad0bb61f6702e1c24505d9b9e9b114004963cd"} Dec 03 07:00:26 crc kubenswrapper[4475]: I1203 07:00:26.442667 4475 generic.go:334] "Generic (PLEG): container finished" podID="905141fd-3de4-45c4-bffb-45934f8ea6d3" containerID="4823326e9a8f847ed1956d42ba0df515a55060f14fecf121285347dba1e9db5f" exitCode=0 Dec 03 07:00:26 crc kubenswrapper[4475]: I1203 07:00:26.442731 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-glm7p" event={"ID":"905141fd-3de4-45c4-bffb-45934f8ea6d3","Type":"ContainerDied","Data":"4823326e9a8f847ed1956d42ba0df515a55060f14fecf121285347dba1e9db5f"} Dec 03 07:00:26 crc kubenswrapper[4475]: I1203 07:00:26.450563 4475 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-external-api-0" podStartSLOduration=20.450543023 podStartE2EDuration="20.450543023s" podCreationTimestamp="2025-12-03 07:00:06 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 07:00:26.440952233 +0000 UTC m=+911.245850567" watchObservedRunningTime="2025-12-03 07:00:26.450543023 +0000 UTC m=+911.255441357" Dec 03 07:00:26 crc kubenswrapper[4475]: I1203 07:00:26.451714 
4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29412420-2tfcl" event={"ID":"9de8472e-1891-4e1c-8db7-fb458c212969","Type":"ContainerDied","Data":"ebccf5e2166f646a553ecbfbe6ab23e12d9384a517b1b93070397caa4add0df7"} Dec 03 07:00:26 crc kubenswrapper[4475]: I1203 07:00:26.451736 4475 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ebccf5e2166f646a553ecbfbe6ab23e12d9384a517b1b93070397caa4add0df7" Dec 03 07:00:26 crc kubenswrapper[4475]: I1203 07:00:26.451759 4475 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29412420-2tfcl" Dec 03 07:00:26 crc kubenswrapper[4475]: I1203 07:00:26.459369 4475 generic.go:334] "Generic (PLEG): container finished" podID="50149f3f-08c3-4fd9-9590-b13fcd787897" containerID="4c9e9074106173f250a545c8cbd9265aa037db239d83cf6168e0bcc58ac22e8b" exitCode=0 Dec 03 07:00:26 crc kubenswrapper[4475]: I1203 07:00:26.459544 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-pqss2" event={"ID":"50149f3f-08c3-4fd9-9590-b13fcd787897","Type":"ContainerDied","Data":"4c9e9074106173f250a545c8cbd9265aa037db239d83cf6168e0bcc58ac22e8b"} Dec 03 07:00:26 crc kubenswrapper[4475]: I1203 07:00:26.470122 4475 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-internal-api-0" podStartSLOduration=31.47011014 podStartE2EDuration="31.47011014s" podCreationTimestamp="2025-12-03 06:59:55 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 07:00:26.466748857 +0000 UTC m=+911.271647191" watchObservedRunningTime="2025-12-03 07:00:26.47011014 +0000 UTC m=+911.275008474" Dec 03 07:00:27 crc kubenswrapper[4475]: I1203 07:00:27.643844 4475 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" 
pod="openstack/glance-default-external-api-0" Dec 03 07:00:27 crc kubenswrapper[4475]: I1203 07:00:27.646539 4475 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0" Dec 03 07:00:27 crc kubenswrapper[4475]: I1203 07:00:27.723562 4475 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0" Dec 03 07:00:27 crc kubenswrapper[4475]: I1203 07:00:27.743836 4475 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0" Dec 03 07:00:28 crc kubenswrapper[4475]: I1203 07:00:28.479322 4475 generic.go:334] "Generic (PLEG): container finished" podID="cd709367-66b2-4586-b3ba-424d4c1532ee" containerID="fd967aeb76da537671c4a993553a49caeb764bac88e302799f05bec146346a03" exitCode=0 Dec 03 07:00:28 crc kubenswrapper[4475]: I1203 07:00:28.479385 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-bk4n9" event={"ID":"cd709367-66b2-4586-b3ba-424d4c1532ee","Type":"ContainerDied","Data":"fd967aeb76da537671c4a993553a49caeb764bac88e302799f05bec146346a03"} Dec 03 07:00:28 crc kubenswrapper[4475]: I1203 07:00:28.479839 4475 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0" Dec 03 07:00:28 crc kubenswrapper[4475]: I1203 07:00:28.479857 4475 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0" Dec 03 07:00:28 crc kubenswrapper[4475]: I1203 07:00:28.786542 4475 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/horizon-5db4d85d89-t96xs" Dec 03 07:00:29 crc kubenswrapper[4475]: I1203 07:00:29.490292 4475 generic.go:334] "Generic (PLEG): container finished" podID="18fecb56-f151-4f5a-aac3-30785def9653" containerID="74038a224524ed26b69b79829df0410aa86f857470a3f1410e97408d05a1f23c" exitCode=0 Dec 03 07:00:29 crc kubenswrapper[4475]: 
I1203 07:00:29.490339 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-65f94" event={"ID":"18fecb56-f151-4f5a-aac3-30785def9653","Type":"ContainerDied","Data":"74038a224524ed26b69b79829df0410aa86f857470a3f1410e97408d05a1f23c"} Dec 03 07:00:30 crc kubenswrapper[4475]: I1203 07:00:30.465318 4475 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-6c7bf79b95-knv75" Dec 03 07:00:30 crc kubenswrapper[4475]: I1203 07:00:30.503254 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-glm7p" event={"ID":"905141fd-3de4-45c4-bffb-45934f8ea6d3","Type":"ContainerStarted","Data":"96855b91ce5cbe19b39e5ab377d1507d6946626aee1939449d754538b3cb8d8a"} Dec 03 07:00:30 crc kubenswrapper[4475]: I1203 07:00:30.524954 4475 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-57dbb9b85f-9wjp5"] Dec 03 07:00:30 crc kubenswrapper[4475]: I1203 07:00:30.526171 4475 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-57dbb9b85f-9wjp5" podUID="65d0b806-aeac-4b82-9792-32e5b25e3c3e" containerName="dnsmasq-dns" containerID="cri-o://ca8cb57248784eb1f8be0620f71695deeada457e8a3591dfe847ccf06804ae7a" gracePeriod=10 Dec 03 07:00:30 crc kubenswrapper[4475]: I1203 07:00:30.553538 4475 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-glm7p" podStartSLOduration=8.685249919 podStartE2EDuration="11.553522804s" podCreationTimestamp="2025-12-03 07:00:19 +0000 UTC" firstStartedPulling="2025-12-03 07:00:24.182914106 +0000 UTC m=+908.987812440" lastFinishedPulling="2025-12-03 07:00:27.051186991 +0000 UTC m=+911.856085325" observedRunningTime="2025-12-03 07:00:30.548062153 +0000 UTC m=+915.352960487" watchObservedRunningTime="2025-12-03 07:00:30.553522804 +0000 UTC m=+915.358421139" Dec 03 07:00:31 crc kubenswrapper[4475]: I1203 07:00:31.536727 4475 
generic.go:334] "Generic (PLEG): container finished" podID="65d0b806-aeac-4b82-9792-32e5b25e3c3e" containerID="ca8cb57248784eb1f8be0620f71695deeada457e8a3591dfe847ccf06804ae7a" exitCode=0 Dec 03 07:00:31 crc kubenswrapper[4475]: I1203 07:00:31.539151 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-57dbb9b85f-9wjp5" event={"ID":"65d0b806-aeac-4b82-9792-32e5b25e3c3e","Type":"ContainerDied","Data":"ca8cb57248784eb1f8be0620f71695deeada457e8a3591dfe847ccf06804ae7a"} Dec 03 07:00:31 crc kubenswrapper[4475]: I1203 07:00:31.885992 4475 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-57dbb9b85f-9wjp5" podUID="65d0b806-aeac-4b82-9792-32e5b25e3c3e" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.141:5353: connect: connection refused" Dec 03 07:00:32 crc kubenswrapper[4475]: I1203 07:00:32.362444 4475 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-65f94" Dec 03 07:00:32 crc kubenswrapper[4475]: I1203 07:00:32.465290 4475 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-db-sync-pqss2" Dec 03 07:00:32 crc kubenswrapper[4475]: I1203 07:00:32.483954 4475 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mccqb\" (UniqueName: \"kubernetes.io/projected/18fecb56-f151-4f5a-aac3-30785def9653-kube-api-access-mccqb\") pod \"18fecb56-f151-4f5a-aac3-30785def9653\" (UID: \"18fecb56-f151-4f5a-aac3-30785def9653\") " Dec 03 07:00:32 crc kubenswrapper[4475]: I1203 07:00:32.484032 4475 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/18fecb56-f151-4f5a-aac3-30785def9653-scripts\") pod \"18fecb56-f151-4f5a-aac3-30785def9653\" (UID: \"18fecb56-f151-4f5a-aac3-30785def9653\") " Dec 03 07:00:32 crc kubenswrapper[4475]: I1203 07:00:32.484078 4475 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/18fecb56-f151-4f5a-aac3-30785def9653-config-data\") pod \"18fecb56-f151-4f5a-aac3-30785def9653\" (UID: \"18fecb56-f151-4f5a-aac3-30785def9653\") " Dec 03 07:00:32 crc kubenswrapper[4475]: I1203 07:00:32.484110 4475 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/18fecb56-f151-4f5a-aac3-30785def9653-combined-ca-bundle\") pod \"18fecb56-f151-4f5a-aac3-30785def9653\" (UID: \"18fecb56-f151-4f5a-aac3-30785def9653\") " Dec 03 07:00:32 crc kubenswrapper[4475]: I1203 07:00:32.484139 4475 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/18fecb56-f151-4f5a-aac3-30785def9653-fernet-keys\") pod \"18fecb56-f151-4f5a-aac3-30785def9653\" (UID: \"18fecb56-f151-4f5a-aac3-30785def9653\") " Dec 03 07:00:32 crc kubenswrapper[4475]: I1203 07:00:32.484179 4475 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"credential-keys\" (UniqueName: 
\"kubernetes.io/secret/18fecb56-f151-4f5a-aac3-30785def9653-credential-keys\") pod \"18fecb56-f151-4f5a-aac3-30785def9653\" (UID: \"18fecb56-f151-4f5a-aac3-30785def9653\") " Dec 03 07:00:32 crc kubenswrapper[4475]: I1203 07:00:32.539679 4475 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/18fecb56-f151-4f5a-aac3-30785def9653-kube-api-access-mccqb" (OuterVolumeSpecName: "kube-api-access-mccqb") pod "18fecb56-f151-4f5a-aac3-30785def9653" (UID: "18fecb56-f151-4f5a-aac3-30785def9653"). InnerVolumeSpecName "kube-api-access-mccqb". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 07:00:32 crc kubenswrapper[4475]: I1203 07:00:32.539675 4475 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/18fecb56-f151-4f5a-aac3-30785def9653-credential-keys" (OuterVolumeSpecName: "credential-keys") pod "18fecb56-f151-4f5a-aac3-30785def9653" (UID: "18fecb56-f151-4f5a-aac3-30785def9653"). InnerVolumeSpecName "credential-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 07:00:32 crc kubenswrapper[4475]: I1203 07:00:32.559259 4475 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/18fecb56-f151-4f5a-aac3-30785def9653-scripts" (OuterVolumeSpecName: "scripts") pod "18fecb56-f151-4f5a-aac3-30785def9653" (UID: "18fecb56-f151-4f5a-aac3-30785def9653"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 07:00:32 crc kubenswrapper[4475]: I1203 07:00:32.574883 4475 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/18fecb56-f151-4f5a-aac3-30785def9653-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "18fecb56-f151-4f5a-aac3-30785def9653" (UID: "18fecb56-f151-4f5a-aac3-30785def9653"). InnerVolumeSpecName "fernet-keys". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 07:00:32 crc kubenswrapper[4475]: I1203 07:00:32.581947 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-pqss2" event={"ID":"50149f3f-08c3-4fd9-9590-b13fcd787897","Type":"ContainerDied","Data":"a9caee36d7d17b77bd1ddce1592d22d4fa501c63bad4f8f88dbeef6455bae716"} Dec 03 07:00:32 crc kubenswrapper[4475]: I1203 07:00:32.582081 4475 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a9caee36d7d17b77bd1ddce1592d22d4fa501c63bad4f8f88dbeef6455bae716" Dec 03 07:00:32 crc kubenswrapper[4475]: I1203 07:00:32.582710 4475 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-sync-pqss2" Dec 03 07:00:32 crc kubenswrapper[4475]: I1203 07:00:32.587407 4475 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/50149f3f-08c3-4fd9-9590-b13fcd787897-config-data\") pod \"50149f3f-08c3-4fd9-9590-b13fcd787897\" (UID: \"50149f3f-08c3-4fd9-9590-b13fcd787897\") " Dec 03 07:00:32 crc kubenswrapper[4475]: I1203 07:00:32.587583 4475 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/50149f3f-08c3-4fd9-9590-b13fcd787897-combined-ca-bundle\") pod \"50149f3f-08c3-4fd9-9590-b13fcd787897\" (UID: \"50149f3f-08c3-4fd9-9590-b13fcd787897\") " Dec 03 07:00:32 crc kubenswrapper[4475]: I1203 07:00:32.587669 4475 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/50149f3f-08c3-4fd9-9590-b13fcd787897-logs\") pod \"50149f3f-08c3-4fd9-9590-b13fcd787897\" (UID: \"50149f3f-08c3-4fd9-9590-b13fcd787897\") " Dec 03 07:00:32 crc kubenswrapper[4475]: I1203 07:00:32.587689 4475 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/50149f3f-08c3-4fd9-9590-b13fcd787897-scripts\") pod \"50149f3f-08c3-4fd9-9590-b13fcd787897\" (UID: \"50149f3f-08c3-4fd9-9590-b13fcd787897\") " Dec 03 07:00:32 crc kubenswrapper[4475]: I1203 07:00:32.587706 4475 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lggs4\" (UniqueName: \"kubernetes.io/projected/50149f3f-08c3-4fd9-9590-b13fcd787897-kube-api-access-lggs4\") pod \"50149f3f-08c3-4fd9-9590-b13fcd787897\" (UID: \"50149f3f-08c3-4fd9-9590-b13fcd787897\") " Dec 03 07:00:32 crc kubenswrapper[4475]: I1203 07:00:32.588102 4475 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mccqb\" (UniqueName: \"kubernetes.io/projected/18fecb56-f151-4f5a-aac3-30785def9653-kube-api-access-mccqb\") on node \"crc\" DevicePath \"\"" Dec 03 07:00:32 crc kubenswrapper[4475]: I1203 07:00:32.588119 4475 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/18fecb56-f151-4f5a-aac3-30785def9653-scripts\") on node \"crc\" DevicePath \"\"" Dec 03 07:00:32 crc kubenswrapper[4475]: I1203 07:00:32.588127 4475 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/18fecb56-f151-4f5a-aac3-30785def9653-fernet-keys\") on node \"crc\" DevicePath \"\"" Dec 03 07:00:32 crc kubenswrapper[4475]: I1203 07:00:32.588137 4475 reconciler_common.go:293] "Volume detached for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/18fecb56-f151-4f5a-aac3-30785def9653-credential-keys\") on node \"crc\" DevicePath \"\"" Dec 03 07:00:32 crc kubenswrapper[4475]: I1203 07:00:32.590216 4475 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/50149f3f-08c3-4fd9-9590-b13fcd787897-logs" (OuterVolumeSpecName: "logs") pod "50149f3f-08c3-4fd9-9590-b13fcd787897" (UID: "50149f3f-08c3-4fd9-9590-b13fcd787897"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 07:00:32 crc kubenswrapper[4475]: I1203 07:00:32.595693 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-65f94" event={"ID":"18fecb56-f151-4f5a-aac3-30785def9653","Type":"ContainerDied","Data":"528ff7750f99b9123bf14682a7f8718ad6b933e2763eda799613807008d287f5"} Dec 03 07:00:32 crc kubenswrapper[4475]: I1203 07:00:32.595725 4475 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="528ff7750f99b9123bf14682a7f8718ad6b933e2763eda799613807008d287f5" Dec 03 07:00:32 crc kubenswrapper[4475]: I1203 07:00:32.595778 4475 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-65f94" Dec 03 07:00:32 crc kubenswrapper[4475]: I1203 07:00:32.596329 4475 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/50149f3f-08c3-4fd9-9590-b13fcd787897-kube-api-access-lggs4" (OuterVolumeSpecName: "kube-api-access-lggs4") pod "50149f3f-08c3-4fd9-9590-b13fcd787897" (UID: "50149f3f-08c3-4fd9-9590-b13fcd787897"). InnerVolumeSpecName "kube-api-access-lggs4". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 07:00:32 crc kubenswrapper[4475]: I1203 07:00:32.599648 4475 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/18fecb56-f151-4f5a-aac3-30785def9653-config-data" (OuterVolumeSpecName: "config-data") pod "18fecb56-f151-4f5a-aac3-30785def9653" (UID: "18fecb56-f151-4f5a-aac3-30785def9653"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 07:00:32 crc kubenswrapper[4475]: I1203 07:00:32.605764 4475 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-57dbb9b85f-9wjp5" Dec 03 07:00:32 crc kubenswrapper[4475]: I1203 07:00:32.616501 4475 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/50149f3f-08c3-4fd9-9590-b13fcd787897-scripts" (OuterVolumeSpecName: "scripts") pod "50149f3f-08c3-4fd9-9590-b13fcd787897" (UID: "50149f3f-08c3-4fd9-9590-b13fcd787897"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 07:00:32 crc kubenswrapper[4475]: I1203 07:00:32.689821 4475 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/65d0b806-aeac-4b82-9792-32e5b25e3c3e-config\") pod \"65d0b806-aeac-4b82-9792-32e5b25e3c3e\" (UID: \"65d0b806-aeac-4b82-9792-32e5b25e3c3e\") " Dec 03 07:00:32 crc kubenswrapper[4475]: I1203 07:00:32.689883 4475 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/65d0b806-aeac-4b82-9792-32e5b25e3c3e-ovsdbserver-sb\") pod \"65d0b806-aeac-4b82-9792-32e5b25e3c3e\" (UID: \"65d0b806-aeac-4b82-9792-32e5b25e3c3e\") " Dec 03 07:00:32 crc kubenswrapper[4475]: I1203 07:00:32.690033 4475 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d5gz2\" (UniqueName: \"kubernetes.io/projected/65d0b806-aeac-4b82-9792-32e5b25e3c3e-kube-api-access-d5gz2\") pod \"65d0b806-aeac-4b82-9792-32e5b25e3c3e\" (UID: \"65d0b806-aeac-4b82-9792-32e5b25e3c3e\") " Dec 03 07:00:32 crc kubenswrapper[4475]: I1203 07:00:32.690084 4475 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/65d0b806-aeac-4b82-9792-32e5b25e3c3e-ovsdbserver-nb\") pod \"65d0b806-aeac-4b82-9792-32e5b25e3c3e\" (UID: \"65d0b806-aeac-4b82-9792-32e5b25e3c3e\") " Dec 03 07:00:32 crc kubenswrapper[4475]: I1203 07:00:32.690151 4475 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/65d0b806-aeac-4b82-9792-32e5b25e3c3e-dns-svc\") pod \"65d0b806-aeac-4b82-9792-32e5b25e3c3e\" (UID: \"65d0b806-aeac-4b82-9792-32e5b25e3c3e\") "
Dec 03 07:00:32 crc kubenswrapper[4475]: I1203 07:00:32.690220 4475 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/65d0b806-aeac-4b82-9792-32e5b25e3c3e-dns-swift-storage-0\") pod \"65d0b806-aeac-4b82-9792-32e5b25e3c3e\" (UID: \"65d0b806-aeac-4b82-9792-32e5b25e3c3e\") "
Dec 03 07:00:32 crc kubenswrapper[4475]: I1203 07:00:32.691625 4475 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/18fecb56-f151-4f5a-aac3-30785def9653-config-data\") on node \"crc\" DevicePath \"\""
Dec 03 07:00:32 crc kubenswrapper[4475]: I1203 07:00:32.691646 4475 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/50149f3f-08c3-4fd9-9590-b13fcd787897-logs\") on node \"crc\" DevicePath \"\""
Dec 03 07:00:32 crc kubenswrapper[4475]: I1203 07:00:32.691655 4475 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/50149f3f-08c3-4fd9-9590-b13fcd787897-scripts\") on node \"crc\" DevicePath \"\""
Dec 03 07:00:32 crc kubenswrapper[4475]: I1203 07:00:32.691664 4475 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lggs4\" (UniqueName: \"kubernetes.io/projected/50149f3f-08c3-4fd9-9590-b13fcd787897-kube-api-access-lggs4\") on node \"crc\" DevicePath \"\""
Dec 03 07:00:32 crc kubenswrapper[4475]: I1203 07:00:32.758053 4475 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/65d0b806-aeac-4b82-9792-32e5b25e3c3e-kube-api-access-d5gz2" (OuterVolumeSpecName: "kube-api-access-d5gz2") pod "65d0b806-aeac-4b82-9792-32e5b25e3c3e" (UID: "65d0b806-aeac-4b82-9792-32e5b25e3c3e"). InnerVolumeSpecName "kube-api-access-d5gz2". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 03 07:00:32 crc kubenswrapper[4475]: I1203 07:00:32.782811 4475 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/50149f3f-08c3-4fd9-9590-b13fcd787897-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "50149f3f-08c3-4fd9-9590-b13fcd787897" (UID: "50149f3f-08c3-4fd9-9590-b13fcd787897"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 03 07:00:32 crc kubenswrapper[4475]: I1203 07:00:32.796931 4475 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d5gz2\" (UniqueName: \"kubernetes.io/projected/65d0b806-aeac-4b82-9792-32e5b25e3c3e-kube-api-access-d5gz2\") on node \"crc\" DevicePath \"\""
Dec 03 07:00:32 crc kubenswrapper[4475]: I1203 07:00:32.796964 4475 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/50149f3f-08c3-4fd9-9590-b13fcd787897-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Dec 03 07:00:32 crc kubenswrapper[4475]: I1203 07:00:32.828468 4475 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/18fecb56-f151-4f5a-aac3-30785def9653-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "18fecb56-f151-4f5a-aac3-30785def9653" (UID: "18fecb56-f151-4f5a-aac3-30785def9653"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 03 07:00:32 crc kubenswrapper[4475]: I1203 07:00:32.844542 4475 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/50149f3f-08c3-4fd9-9590-b13fcd787897-config-data" (OuterVolumeSpecName: "config-data") pod "50149f3f-08c3-4fd9-9590-b13fcd787897" (UID: "50149f3f-08c3-4fd9-9590-b13fcd787897"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 03 07:00:32 crc kubenswrapper[4475]: I1203 07:00:32.900075 4475 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/50149f3f-08c3-4fd9-9590-b13fcd787897-config-data\") on node \"crc\" DevicePath \"\""
Dec 03 07:00:32 crc kubenswrapper[4475]: I1203 07:00:32.900106 4475 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/18fecb56-f151-4f5a-aac3-30785def9653-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Dec 03 07:00:32 crc kubenswrapper[4475]: I1203 07:00:32.909099 4475 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/65d0b806-aeac-4b82-9792-32e5b25e3c3e-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "65d0b806-aeac-4b82-9792-32e5b25e3c3e" (UID: "65d0b806-aeac-4b82-9792-32e5b25e3c3e"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Dec 03 07:00:32 crc kubenswrapper[4475]: I1203 07:00:32.909883 4475 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/65d0b806-aeac-4b82-9792-32e5b25e3c3e-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "65d0b806-aeac-4b82-9792-32e5b25e3c3e" (UID: "65d0b806-aeac-4b82-9792-32e5b25e3c3e"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Dec 03 07:00:32 crc kubenswrapper[4475]: I1203 07:00:32.912477 4475 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/65d0b806-aeac-4b82-9792-32e5b25e3c3e-config" (OuterVolumeSpecName: "config") pod "65d0b806-aeac-4b82-9792-32e5b25e3c3e" (UID: "65d0b806-aeac-4b82-9792-32e5b25e3c3e"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Dec 03 07:00:32 crc kubenswrapper[4475]: I1203 07:00:32.912583 4475 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/65d0b806-aeac-4b82-9792-32e5b25e3c3e-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "65d0b806-aeac-4b82-9792-32e5b25e3c3e" (UID: "65d0b806-aeac-4b82-9792-32e5b25e3c3e"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Dec 03 07:00:32 crc kubenswrapper[4475]: I1203 07:00:32.914011 4475 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/65d0b806-aeac-4b82-9792-32e5b25e3c3e-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "65d0b806-aeac-4b82-9792-32e5b25e3c3e" (UID: "65d0b806-aeac-4b82-9792-32e5b25e3c3e"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Dec 03 07:00:33 crc kubenswrapper[4475]: I1203 07:00:33.002506 4475 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/65d0b806-aeac-4b82-9792-32e5b25e3c3e-dns-swift-storage-0\") on node \"crc\" DevicePath \"\""
Dec 03 07:00:33 crc kubenswrapper[4475]: I1203 07:00:33.002545 4475 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/65d0b806-aeac-4b82-9792-32e5b25e3c3e-config\") on node \"crc\" DevicePath \"\""
Dec 03 07:00:33 crc kubenswrapper[4475]: I1203 07:00:33.002556 4475 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/65d0b806-aeac-4b82-9792-32e5b25e3c3e-ovsdbserver-sb\") on node \"crc\" DevicePath \"\""
Dec 03 07:00:33 crc kubenswrapper[4475]: I1203 07:00:33.002566 4475 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/65d0b806-aeac-4b82-9792-32e5b25e3c3e-ovsdbserver-nb\") on node \"crc\" DevicePath \"\""
Dec 03 07:00:33 crc kubenswrapper[4475]: I1203 07:00:33.002576 4475 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/65d0b806-aeac-4b82-9792-32e5b25e3c3e-dns-svc\") on node \"crc\" DevicePath \"\""
Dec 03 07:00:33 crc kubenswrapper[4475]: I1203 07:00:33.224844 4475 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0"
Dec 03 07:00:33 crc kubenswrapper[4475]: I1203 07:00:33.285609 4475 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0"
Dec 03 07:00:33 crc kubenswrapper[4475]: I1203 07:00:33.490229 4475 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-97b4f9d66-l5knv"]
Dec 03 07:00:33 crc kubenswrapper[4475]: E1203 07:00:33.491114 4475 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="65d0b806-aeac-4b82-9792-32e5b25e3c3e" containerName="dnsmasq-dns"
Dec 03 07:00:33 crc kubenswrapper[4475]: I1203 07:00:33.491135 4475 state_mem.go:107] "Deleted CPUSet assignment" podUID="65d0b806-aeac-4b82-9792-32e5b25e3c3e" containerName="dnsmasq-dns"
Dec 03 07:00:33 crc kubenswrapper[4475]: E1203 07:00:33.491147 4475 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="18fecb56-f151-4f5a-aac3-30785def9653" containerName="keystone-bootstrap"
Dec 03 07:00:33 crc kubenswrapper[4475]: I1203 07:00:33.491156 4475 state_mem.go:107] "Deleted CPUSet assignment" podUID="18fecb56-f151-4f5a-aac3-30785def9653" containerName="keystone-bootstrap"
Dec 03 07:00:33 crc kubenswrapper[4475]: E1203 07:00:33.491179 4475 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="87fdba6a-e11c-4bbd-becc-78999065efa8" containerName="dnsmasq-dns"
Dec 03 07:00:33 crc kubenswrapper[4475]: I1203 07:00:33.491185 4475 state_mem.go:107] "Deleted CPUSet assignment" podUID="87fdba6a-e11c-4bbd-becc-78999065efa8" containerName="dnsmasq-dns"
Dec 03 07:00:33 crc kubenswrapper[4475]: E1203 07:00:33.491195 4475 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="65d0b806-aeac-4b82-9792-32e5b25e3c3e" containerName="init"
Dec 03 07:00:33 crc kubenswrapper[4475]: I1203 07:00:33.491201 4475 state_mem.go:107] "Deleted CPUSet assignment" podUID="65d0b806-aeac-4b82-9792-32e5b25e3c3e" containerName="init"
Dec 03 07:00:33 crc kubenswrapper[4475]: E1203 07:00:33.491214 4475 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="50149f3f-08c3-4fd9-9590-b13fcd787897" containerName="placement-db-sync"
Dec 03 07:00:33 crc kubenswrapper[4475]: I1203 07:00:33.491220 4475 state_mem.go:107] "Deleted CPUSet assignment" podUID="50149f3f-08c3-4fd9-9590-b13fcd787897" containerName="placement-db-sync"
Dec 03 07:00:33 crc kubenswrapper[4475]: E1203 07:00:33.491234 4475 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9de8472e-1891-4e1c-8db7-fb458c212969" containerName="collect-profiles"
Dec 03 07:00:33 crc kubenswrapper[4475]: I1203 07:00:33.491241 4475 state_mem.go:107] "Deleted CPUSet assignment" podUID="9de8472e-1891-4e1c-8db7-fb458c212969" containerName="collect-profiles"
Dec 03 07:00:33 crc kubenswrapper[4475]: E1203 07:00:33.491263 4475 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="87fdba6a-e11c-4bbd-becc-78999065efa8" containerName="init"
Dec 03 07:00:33 crc kubenswrapper[4475]: I1203 07:00:33.491269 4475 state_mem.go:107] "Deleted CPUSet assignment" podUID="87fdba6a-e11c-4bbd-becc-78999065efa8" containerName="init"
Dec 03 07:00:33 crc kubenswrapper[4475]: I1203 07:00:33.491512 4475 memory_manager.go:354] "RemoveStaleState removing state" podUID="9de8472e-1891-4e1c-8db7-fb458c212969" containerName="collect-profiles"
Dec 03 07:00:33 crc kubenswrapper[4475]: I1203 07:00:33.491525 4475 memory_manager.go:354] "RemoveStaleState removing state" podUID="87fdba6a-e11c-4bbd-becc-78999065efa8" containerName="dnsmasq-dns"
Dec 03 07:00:33 crc kubenswrapper[4475]: I1203 07:00:33.491533 4475 memory_manager.go:354] "RemoveStaleState removing state" podUID="50149f3f-08c3-4fd9-9590-b13fcd787897" containerName="placement-db-sync"
Dec 03 07:00:33 crc kubenswrapper[4475]: I1203 07:00:33.491543 4475 memory_manager.go:354] "RemoveStaleState removing state" podUID="65d0b806-aeac-4b82-9792-32e5b25e3c3e" containerName="dnsmasq-dns"
Dec 03 07:00:33 crc kubenswrapper[4475]: I1203 07:00:33.491554 4475 memory_manager.go:354] "RemoveStaleState removing state" podUID="18fecb56-f151-4f5a-aac3-30785def9653" containerName="keystone-bootstrap"
Dec 03 07:00:33 crc kubenswrapper[4475]: I1203 07:00:33.492333 4475 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-97b4f9d66-l5knv"
Dec 03 07:00:33 crc kubenswrapper[4475]: I1203 07:00:33.496056 4475 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone"
Dec 03 07:00:33 crc kubenswrapper[4475]: I1203 07:00:33.496091 4475 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-keystone-internal-svc"
Dec 03 07:00:33 crc kubenswrapper[4475]: I1203 07:00:33.496341 4475 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data"
Dec 03 07:00:33 crc kubenswrapper[4475]: I1203 07:00:33.496489 4475 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts"
Dec 03 07:00:33 crc kubenswrapper[4475]: I1203 07:00:33.497079 4475 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-keystone-public-svc"
Dec 03 07:00:33 crc kubenswrapper[4475]: I1203 07:00:33.498361 4475 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-ccmjt"
Dec 03 07:00:33 crc kubenswrapper[4475]: I1203 07:00:33.546878 4475 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-97b4f9d66-l5knv"]
Dec 03 07:00:33 crc kubenswrapper[4475]: I1203 07:00:33.614222 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"12f47969-3169-43bc-8d07-cbd3952d81cf","Type":"ContainerStarted","Data":"f0a7feb56b6dec7c49849c0c810bfe251e8c688d3f3e7551f990b104406979ef"}
Dec 03 07:00:33 crc kubenswrapper[4475]: I1203 07:00:33.618889 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-bk4n9" event={"ID":"cd709367-66b2-4586-b3ba-424d4c1532ee","Type":"ContainerStarted","Data":"ce8ded9ff003a96e9ac2b30b683a2f4293fc745bb0ca965930fe87243b626b07"}
Dec 03 07:00:33 crc kubenswrapper[4475]: I1203 07:00:33.626711 4475 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-57dbb9b85f-9wjp5"
Dec 03 07:00:33 crc kubenswrapper[4475]: I1203 07:00:33.626867 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-57dbb9b85f-9wjp5" event={"ID":"65d0b806-aeac-4b82-9792-32e5b25e3c3e","Type":"ContainerDied","Data":"742b10b4188d98fb51e426dda1fbf34e8427dcf4895353b15b524e08fbed911e"}
Dec 03 07:00:33 crc kubenswrapper[4475]: I1203 07:00:33.626941 4475 scope.go:117] "RemoveContainer" containerID="ca8cb57248784eb1f8be0620f71695deeada457e8a3591dfe847ccf06804ae7a"
Dec 03 07:00:33 crc kubenswrapper[4475]: I1203 07:00:33.631353 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/283af06a-8c2b-45cf-b134-0403750866b1-fernet-keys\") pod \"keystone-97b4f9d66-l5knv\" (UID: \"283af06a-8c2b-45cf-b134-0403750866b1\") " pod="openstack/keystone-97b4f9d66-l5knv"
Dec 03 07:00:33 crc kubenswrapper[4475]: I1203 07:00:33.631427 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/283af06a-8c2b-45cf-b134-0403750866b1-combined-ca-bundle\") pod \"keystone-97b4f9d66-l5knv\" (UID: \"283af06a-8c2b-45cf-b134-0403750866b1\") " pod="openstack/keystone-97b4f9d66-l5knv"
Dec 03 07:00:33 crc kubenswrapper[4475]: I1203 07:00:33.631498 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/283af06a-8c2b-45cf-b134-0403750866b1-internal-tls-certs\") pod \"keystone-97b4f9d66-l5knv\" (UID: \"283af06a-8c2b-45cf-b134-0403750866b1\") " pod="openstack/keystone-97b4f9d66-l5knv"
Dec 03 07:00:33 crc kubenswrapper[4475]: I1203 07:00:33.631549 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/283af06a-8c2b-45cf-b134-0403750866b1-credential-keys\") pod \"keystone-97b4f9d66-l5knv\" (UID: \"283af06a-8c2b-45cf-b134-0403750866b1\") " pod="openstack/keystone-97b4f9d66-l5knv"
Dec 03 07:00:33 crc kubenswrapper[4475]: I1203 07:00:33.631568 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/283af06a-8c2b-45cf-b134-0403750866b1-config-data\") pod \"keystone-97b4f9d66-l5knv\" (UID: \"283af06a-8c2b-45cf-b134-0403750866b1\") " pod="openstack/keystone-97b4f9d66-l5knv"
Dec 03 07:00:33 crc kubenswrapper[4475]: I1203 07:00:33.631708 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/283af06a-8c2b-45cf-b134-0403750866b1-public-tls-certs\") pod \"keystone-97b4f9d66-l5knv\" (UID: \"283af06a-8c2b-45cf-b134-0403750866b1\") " pod="openstack/keystone-97b4f9d66-l5knv"
Dec 03 07:00:33 crc kubenswrapper[4475]: I1203 07:00:33.631757 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-24th7\" (UniqueName: \"kubernetes.io/projected/283af06a-8c2b-45cf-b134-0403750866b1-kube-api-access-24th7\") pod \"keystone-97b4f9d66-l5knv\" (UID: \"283af06a-8c2b-45cf-b134-0403750866b1\") " pod="openstack/keystone-97b4f9d66-l5knv"
Dec 03 07:00:33 crc kubenswrapper[4475]: I1203 07:00:33.631783 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/283af06a-8c2b-45cf-b134-0403750866b1-scripts\") pod \"keystone-97b4f9d66-l5knv\" (UID: \"283af06a-8c2b-45cf-b134-0403750866b1\") " pod="openstack/keystone-97b4f9d66-l5knv"
Dec 03 07:00:33 crc kubenswrapper[4475]: I1203 07:00:33.638782 4475 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-bk4n9" podStartSLOduration=30.607295519 podStartE2EDuration="38.638766921s" podCreationTimestamp="2025-12-03 06:59:55 +0000 UTC" firstStartedPulling="2025-12-03 07:00:24.226579814 +0000 UTC m=+909.031478147" lastFinishedPulling="2025-12-03 07:00:32.258051225 +0000 UTC m=+917.062949549" observedRunningTime="2025-12-03 07:00:33.634256146 +0000 UTC m=+918.439154481" watchObservedRunningTime="2025-12-03 07:00:33.638766921 +0000 UTC m=+918.443665255"
Dec 03 07:00:33 crc kubenswrapper[4475]: I1203 07:00:33.663958 4475 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-57dbb9b85f-9wjp5"]
Dec 03 07:00:33 crc kubenswrapper[4475]: I1203 07:00:33.677908 4475 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-57dbb9b85f-9wjp5"]
Dec 03 07:00:33 crc kubenswrapper[4475]: I1203 07:00:33.736109 4475 scope.go:117] "RemoveContainer" containerID="8454666941e5d042776b8af56b644bdf0ad79cd2839e124a01703c12b23f9c31"
Dec 03 07:00:33 crc kubenswrapper[4475]: I1203 07:00:33.738777 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/283af06a-8c2b-45cf-b134-0403750866b1-public-tls-certs\") pod \"keystone-97b4f9d66-l5knv\" (UID: \"283af06a-8c2b-45cf-b134-0403750866b1\") " pod="openstack/keystone-97b4f9d66-l5knv"
Dec 03 07:00:33 crc kubenswrapper[4475]: I1203 07:00:33.738850 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/283af06a-8c2b-45cf-b134-0403750866b1-scripts\") pod \"keystone-97b4f9d66-l5knv\" (UID: \"283af06a-8c2b-45cf-b134-0403750866b1\") " pod="openstack/keystone-97b4f9d66-l5knv"
Dec 03 07:00:33 crc kubenswrapper[4475]: I1203 07:00:33.738873 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-24th7\" (UniqueName: \"kubernetes.io/projected/283af06a-8c2b-45cf-b134-0403750866b1-kube-api-access-24th7\") pod \"keystone-97b4f9d66-l5knv\" (UID: \"283af06a-8c2b-45cf-b134-0403750866b1\") " pod="openstack/keystone-97b4f9d66-l5knv"
Dec 03 07:00:33 crc kubenswrapper[4475]: I1203 07:00:33.738940 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/283af06a-8c2b-45cf-b134-0403750866b1-fernet-keys\") pod \"keystone-97b4f9d66-l5knv\" (UID: \"283af06a-8c2b-45cf-b134-0403750866b1\") " pod="openstack/keystone-97b4f9d66-l5knv"
Dec 03 07:00:33 crc kubenswrapper[4475]: I1203 07:00:33.738992 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/283af06a-8c2b-45cf-b134-0403750866b1-combined-ca-bundle\") pod \"keystone-97b4f9d66-l5knv\" (UID: \"283af06a-8c2b-45cf-b134-0403750866b1\") " pod="openstack/keystone-97b4f9d66-l5knv"
Dec 03 07:00:33 crc kubenswrapper[4475]: I1203 07:00:33.739055 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/283af06a-8c2b-45cf-b134-0403750866b1-internal-tls-certs\") pod \"keystone-97b4f9d66-l5knv\" (UID: \"283af06a-8c2b-45cf-b134-0403750866b1\") " pod="openstack/keystone-97b4f9d66-l5knv"
Dec 03 07:00:33 crc kubenswrapper[4475]: I1203 07:00:33.739124 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/283af06a-8c2b-45cf-b134-0403750866b1-credential-keys\") pod \"keystone-97b4f9d66-l5knv\" (UID: \"283af06a-8c2b-45cf-b134-0403750866b1\") " pod="openstack/keystone-97b4f9d66-l5knv"
Dec 03 07:00:33 crc kubenswrapper[4475]: I1203 07:00:33.739146 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/283af06a-8c2b-45cf-b134-0403750866b1-config-data\") pod \"keystone-97b4f9d66-l5knv\" (UID: \"283af06a-8c2b-45cf-b134-0403750866b1\") " pod="openstack/keystone-97b4f9d66-l5knv"
Dec 03 07:00:33 crc kubenswrapper[4475]: I1203 07:00:33.752765 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/283af06a-8c2b-45cf-b134-0403750866b1-config-data\") pod \"keystone-97b4f9d66-l5knv\" (UID: \"283af06a-8c2b-45cf-b134-0403750866b1\") " pod="openstack/keystone-97b4f9d66-l5knv"
Dec 03 07:00:33 crc kubenswrapper[4475]: I1203 07:00:33.753144 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/283af06a-8c2b-45cf-b134-0403750866b1-public-tls-certs\") pod \"keystone-97b4f9d66-l5knv\" (UID: \"283af06a-8c2b-45cf-b134-0403750866b1\") " pod="openstack/keystone-97b4f9d66-l5knv"
Dec 03 07:00:33 crc kubenswrapper[4475]: I1203 07:00:33.777773 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-24th7\" (UniqueName: \"kubernetes.io/projected/283af06a-8c2b-45cf-b134-0403750866b1-kube-api-access-24th7\") pod \"keystone-97b4f9d66-l5knv\" (UID: \"283af06a-8c2b-45cf-b134-0403750866b1\") " pod="openstack/keystone-97b4f9d66-l5knv"
Dec 03 07:00:33 crc kubenswrapper[4475]: I1203 07:00:33.794654 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/283af06a-8c2b-45cf-b134-0403750866b1-combined-ca-bundle\") pod \"keystone-97b4f9d66-l5knv\" (UID: \"283af06a-8c2b-45cf-b134-0403750866b1\") " pod="openstack/keystone-97b4f9d66-l5knv"
Dec 03 07:00:33 crc kubenswrapper[4475]: I1203 07:00:33.796023 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/283af06a-8c2b-45cf-b134-0403750866b1-internal-tls-certs\") pod \"keystone-97b4f9d66-l5knv\" (UID: \"283af06a-8c2b-45cf-b134-0403750866b1\") " pod="openstack/keystone-97b4f9d66-l5knv"
Dec 03 07:00:33 crc kubenswrapper[4475]: I1203 07:00:33.796073 4475 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-5fbfc8dd66-9z9vj"]
Dec 03 07:00:33 crc kubenswrapper[4475]: I1203 07:00:33.804323 4475 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-5fbfc8dd66-9z9vj"
Dec 03 07:00:33 crc kubenswrapper[4475]: I1203 07:00:33.808766 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/283af06a-8c2b-45cf-b134-0403750866b1-scripts\") pod \"keystone-97b4f9d66-l5knv\" (UID: \"283af06a-8c2b-45cf-b134-0403750866b1\") " pod="openstack/keystone-97b4f9d66-l5knv"
Dec 03 07:00:33 crc kubenswrapper[4475]: I1203 07:00:33.808810 4475 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-placement-dockercfg-nwnxp"
Dec 03 07:00:33 crc kubenswrapper[4475]: I1203 07:00:33.809662 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/283af06a-8c2b-45cf-b134-0403750866b1-fernet-keys\") pod \"keystone-97b4f9d66-l5knv\" (UID: \"283af06a-8c2b-45cf-b134-0403750866b1\") " pod="openstack/keystone-97b4f9d66-l5knv"
Dec 03 07:00:33 crc kubenswrapper[4475]: I1203 07:00:33.809762 4475 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-placement-public-svc"
Dec 03 07:00:33 crc kubenswrapper[4475]: I1203 07:00:33.810049 4475 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-placement-internal-svc"
Dec 03 07:00:33 crc kubenswrapper[4475]: I1203 07:00:33.809940 4475 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-config-data"
Dec 03 07:00:33 crc kubenswrapper[4475]: I1203 07:00:33.810006 4475 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-scripts"
Dec 03 07:00:33 crc kubenswrapper[4475]: I1203 07:00:33.815874 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/283af06a-8c2b-45cf-b134-0403750866b1-credential-keys\") pod \"keystone-97b4f9d66-l5knv\" (UID: \"283af06a-8c2b-45cf-b134-0403750866b1\") " pod="openstack/keystone-97b4f9d66-l5knv"
Dec 03 07:00:33 crc kubenswrapper[4475]: I1203 07:00:33.838415 4475 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-5fbfc8dd66-9z9vj"]
Dec 03 07:00:33 crc kubenswrapper[4475]: I1203 07:00:33.838983 4475 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-97b4f9d66-l5knv"
Dec 03 07:00:33 crc kubenswrapper[4475]: I1203 07:00:33.842266 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f560bfa0-ed4a-4559-aa99-a66df536745b-combined-ca-bundle\") pod \"placement-5fbfc8dd66-9z9vj\" (UID: \"f560bfa0-ed4a-4559-aa99-a66df536745b\") " pod="openstack/placement-5fbfc8dd66-9z9vj"
Dec 03 07:00:33 crc kubenswrapper[4475]: I1203 07:00:33.842347 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/f560bfa0-ed4a-4559-aa99-a66df536745b-public-tls-certs\") pod \"placement-5fbfc8dd66-9z9vj\" (UID: \"f560bfa0-ed4a-4559-aa99-a66df536745b\") " pod="openstack/placement-5fbfc8dd66-9z9vj"
Dec 03 07:00:33 crc kubenswrapper[4475]: I1203 07:00:33.842446 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f560bfa0-ed4a-4559-aa99-a66df536745b-config-data\") pod \"placement-5fbfc8dd66-9z9vj\" (UID: \"f560bfa0-ed4a-4559-aa99-a66df536745b\") " pod="openstack/placement-5fbfc8dd66-9z9vj"
Dec 03 07:00:33 crc kubenswrapper[4475]: I1203 07:00:33.842496 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/f560bfa0-ed4a-4559-aa99-a66df536745b-internal-tls-certs\") pod \"placement-5fbfc8dd66-9z9vj\" (UID: \"f560bfa0-ed4a-4559-aa99-a66df536745b\") " pod="openstack/placement-5fbfc8dd66-9z9vj"
Dec 03 07:00:33 crc kubenswrapper[4475]: I1203 07:00:33.842548 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wjnjz\" (UniqueName: \"kubernetes.io/projected/f560bfa0-ed4a-4559-aa99-a66df536745b-kube-api-access-wjnjz\") pod \"placement-5fbfc8dd66-9z9vj\" (UID: \"f560bfa0-ed4a-4559-aa99-a66df536745b\") " pod="openstack/placement-5fbfc8dd66-9z9vj"
Dec 03 07:00:33 crc kubenswrapper[4475]: I1203 07:00:33.842573 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f560bfa0-ed4a-4559-aa99-a66df536745b-logs\") pod \"placement-5fbfc8dd66-9z9vj\" (UID: \"f560bfa0-ed4a-4559-aa99-a66df536745b\") " pod="openstack/placement-5fbfc8dd66-9z9vj"
Dec 03 07:00:33 crc kubenswrapper[4475]: I1203 07:00:33.842705 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f560bfa0-ed4a-4559-aa99-a66df536745b-scripts\") pod \"placement-5fbfc8dd66-9z9vj\" (UID: \"f560bfa0-ed4a-4559-aa99-a66df536745b\") " pod="openstack/placement-5fbfc8dd66-9z9vj"
Dec 03 07:00:33 crc kubenswrapper[4475]: I1203 07:00:33.945055 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f560bfa0-ed4a-4559-aa99-a66df536745b-scripts\") pod \"placement-5fbfc8dd66-9z9vj\" (UID: \"f560bfa0-ed4a-4559-aa99-a66df536745b\") " pod="openstack/placement-5fbfc8dd66-9z9vj"
Dec 03 07:00:33 crc kubenswrapper[4475]: I1203 07:00:33.945180 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f560bfa0-ed4a-4559-aa99-a66df536745b-combined-ca-bundle\") pod \"placement-5fbfc8dd66-9z9vj\" (UID: \"f560bfa0-ed4a-4559-aa99-a66df536745b\") " pod="openstack/placement-5fbfc8dd66-9z9vj"
Dec 03 07:00:33 crc kubenswrapper[4475]: I1203 07:00:33.945213 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/f560bfa0-ed4a-4559-aa99-a66df536745b-public-tls-certs\") pod \"placement-5fbfc8dd66-9z9vj\" (UID: \"f560bfa0-ed4a-4559-aa99-a66df536745b\") " pod="openstack/placement-5fbfc8dd66-9z9vj"
Dec 03 07:00:33 crc kubenswrapper[4475]: I1203 07:00:33.945260 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f560bfa0-ed4a-4559-aa99-a66df536745b-config-data\") pod \"placement-5fbfc8dd66-9z9vj\" (UID: \"f560bfa0-ed4a-4559-aa99-a66df536745b\") " pod="openstack/placement-5fbfc8dd66-9z9vj"
Dec 03 07:00:33 crc kubenswrapper[4475]: I1203 07:00:33.945277 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/f560bfa0-ed4a-4559-aa99-a66df536745b-internal-tls-certs\") pod \"placement-5fbfc8dd66-9z9vj\" (UID: \"f560bfa0-ed4a-4559-aa99-a66df536745b\") " pod="openstack/placement-5fbfc8dd66-9z9vj"
Dec 03 07:00:33 crc kubenswrapper[4475]: I1203 07:00:33.945305 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wjnjz\" (UniqueName: \"kubernetes.io/projected/f560bfa0-ed4a-4559-aa99-a66df536745b-kube-api-access-wjnjz\") pod \"placement-5fbfc8dd66-9z9vj\" (UID: \"f560bfa0-ed4a-4559-aa99-a66df536745b\") " pod="openstack/placement-5fbfc8dd66-9z9vj"
Dec 03 07:00:33 crc kubenswrapper[4475]: I1203 07:00:33.945322 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f560bfa0-ed4a-4559-aa99-a66df536745b-logs\") pod \"placement-5fbfc8dd66-9z9vj\" (UID: \"f560bfa0-ed4a-4559-aa99-a66df536745b\") " pod="openstack/placement-5fbfc8dd66-9z9vj"
Dec 03 07:00:33 crc kubenswrapper[4475]: I1203 07:00:33.945756 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f560bfa0-ed4a-4559-aa99-a66df536745b-logs\") pod \"placement-5fbfc8dd66-9z9vj\" (UID: \"f560bfa0-ed4a-4559-aa99-a66df536745b\") " pod="openstack/placement-5fbfc8dd66-9z9vj"
Dec 03 07:00:33 crc kubenswrapper[4475]: I1203 07:00:33.954129 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f560bfa0-ed4a-4559-aa99-a66df536745b-config-data\") pod \"placement-5fbfc8dd66-9z9vj\" (UID: \"f560bfa0-ed4a-4559-aa99-a66df536745b\") " pod="openstack/placement-5fbfc8dd66-9z9vj"
Dec 03 07:00:33 crc kubenswrapper[4475]: I1203 07:00:33.958958 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f560bfa0-ed4a-4559-aa99-a66df536745b-combined-ca-bundle\") pod \"placement-5fbfc8dd66-9z9vj\" (UID: \"f560bfa0-ed4a-4559-aa99-a66df536745b\") " pod="openstack/placement-5fbfc8dd66-9z9vj"
Dec 03 07:00:33 crc kubenswrapper[4475]: I1203 07:00:33.959730 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f560bfa0-ed4a-4559-aa99-a66df536745b-scripts\") pod \"placement-5fbfc8dd66-9z9vj\" (UID: \"f560bfa0-ed4a-4559-aa99-a66df536745b\") " pod="openstack/placement-5fbfc8dd66-9z9vj"
Dec 03 07:00:33 crc kubenswrapper[4475]: I1203 07:00:33.968597 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/f560bfa0-ed4a-4559-aa99-a66df536745b-internal-tls-certs\") pod \"placement-5fbfc8dd66-9z9vj\" (UID: \"f560bfa0-ed4a-4559-aa99-a66df536745b\") " pod="openstack/placement-5fbfc8dd66-9z9vj"
Dec 03 07:00:33 crc kubenswrapper[4475]: I1203 07:00:33.977629 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/f560bfa0-ed4a-4559-aa99-a66df536745b-public-tls-certs\") pod \"placement-5fbfc8dd66-9z9vj\" (UID: \"f560bfa0-ed4a-4559-aa99-a66df536745b\") " pod="openstack/placement-5fbfc8dd66-9z9vj"
Dec 03 07:00:33 crc kubenswrapper[4475]: I1203 07:00:33.983133 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wjnjz\" (UniqueName: \"kubernetes.io/projected/f560bfa0-ed4a-4559-aa99-a66df536745b-kube-api-access-wjnjz\") pod \"placement-5fbfc8dd66-9z9vj\" (UID: \"f560bfa0-ed4a-4559-aa99-a66df536745b\") " pod="openstack/placement-5fbfc8dd66-9z9vj"
Dec 03 07:00:34 crc kubenswrapper[4475]: I1203 07:00:34.171709 4475 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-5fbfc8dd66-9z9vj"
Dec 03 07:00:34 crc kubenswrapper[4475]: I1203 07:00:34.401314 4475 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-97b4f9d66-l5knv"]
Dec 03 07:00:34 crc kubenswrapper[4475]: W1203 07:00:34.462235 4475 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod283af06a_8c2b_45cf_b134_0403750866b1.slice/crio-d4e4de6e60a987c10316726543fef7bb7571f19523079733bf8ff39a5db1003e WatchSource:0}: Error finding container d4e4de6e60a987c10316726543fef7bb7571f19523079733bf8ff39a5db1003e: Status 404 returned error can't find the container with id d4e4de6e60a987c10316726543fef7bb7571f19523079733bf8ff39a5db1003e
Dec 03 07:00:34 crc kubenswrapper[4475]: I1203 07:00:34.654807 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-97b4f9d66-l5knv" event={"ID":"283af06a-8c2b-45cf-b134-0403750866b1","Type":"ContainerStarted","Data":"d4e4de6e60a987c10316726543fef7bb7571f19523079733bf8ff39a5db1003e"}
Dec 03 07:00:34 crc kubenswrapper[4475]: I1203 07:00:34.657098 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-zch8h" event={"ID":"8c7df369-c49c-4d2a-842a-a8bd41944f1b","Type":"ContainerStarted","Data":"f02bdef2568fe1d00657c29fe666cc8ff0d786e5650b0c96bef80a7cb8b0c4ca"}
Dec 03 07:00:34 crc kubenswrapper[4475]: I1203 07:00:34.676105 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-db-sync-bxx54" event={"ID":"7105b12e-7df5-42e5-b0cc-27ea52ea7b1c","Type":"ContainerStarted","Data":"33e25a10b7c252046437cebe3572fd74821f908db6872722174c5b44aedb9992"}
Dec 03 07:00:34 crc kubenswrapper[4475]: I1203 07:00:34.677347 4475 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-db-sync-zch8h" podStartSLOduration=2.8088123449999998 podStartE2EDuration="48.677333299s" podCreationTimestamp="2025-12-03 06:59:46 +0000 UTC" firstStartedPulling="2025-12-03 06:59:48.342085592 +0000 UTC m=+873.146983926" lastFinishedPulling="2025-12-03 07:00:34.210606546 +0000 UTC m=+919.015504880" observedRunningTime="2025-12-03 07:00:34.668564114 +0000 UTC m=+919.473462447" watchObservedRunningTime="2025-12-03 07:00:34.677333299 +0000 UTC m=+919.482231634"
Dec 03 07:00:34 crc kubenswrapper[4475]: I1203 07:00:34.696931 4475 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/heat-db-sync-bxx54" podStartSLOduration=4.249551483 podStartE2EDuration="49.696919451s" podCreationTimestamp="2025-12-03 06:59:45 +0000 UTC" firstStartedPulling="2025-12-03 06:59:47.756604636 +0000 UTC m=+872.561502971" lastFinishedPulling="2025-12-03 07:00:33.203972605 +0000 UTC m=+918.008870939" observedRunningTime="2025-12-03 07:00:34.693927293 +0000 UTC m=+919.498825627" watchObservedRunningTime="2025-12-03 07:00:34.696919451 +0000 UTC m=+919.501817785"
Dec 03 07:00:34 crc kubenswrapper[4475]: I1203 07:00:34.727055 4475 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-5fbfc8dd66-9z9vj"]
Dec 03 07:00:34 crc kubenswrapper[4475]: I1203 07:00:34.827526 4475 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/horizon-7fc4d79b88-s8hhg" podUID="2401beb9-38b8-4581-b9a2-8bb16e15e6c1" containerName="horizon" probeResult="failure" output="Get \"https://10.217.0.146:8443/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.0.146:8443: connect: connection refused"
Dec 03 07:00:35 crc kubenswrapper[4475]: I1203 07:00:35.036912 4475 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/horizon-695fd7c4bb-h85zh" podUID="1f10037d-d109-4863-b89f-c33f39b0848a" containerName="horizon" probeResult="failure" output="Get \"https://10.217.0.147:8443/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.0.147:8443: connect: connection refused"
Dec 03 07:00:35 crc kubenswrapper[4475]: I1203 07:00:35.507241 4475 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="65d0b806-aeac-4b82-9792-32e5b25e3c3e" path="/var/lib/kubelet/pods/65d0b806-aeac-4b82-9792-32e5b25e3c3e/volumes"
Dec 03 07:00:35 crc kubenswrapper[4475]: I1203 07:00:35.586644 4475 prober.go:107] "Probe failed" probeType="Liveness" pod="openstack/neutron-7546fbf47b-6g4gw" podUID="d620fdbb-7e93-46f8-95e1-18c9f9aab8f0" containerName="neutron-httpd" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Dec 03 07:00:35 crc kubenswrapper[4475]: I1203 07:00:35.586666 4475 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/neutron-7546fbf47b-6g4gw" podUID="d620fdbb-7e93-46f8-95e1-18c9f9aab8f0" containerName="neutron-httpd" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Dec 03 07:00:35 crc kubenswrapper[4475]: I1203 07:00:35.586734 4475 prober.go:107] "Probe failed" probeType="Liveness" pod="openstack/neutron-7546fbf47b-6g4gw" podUID="d620fdbb-7e93-46f8-95e1-18c9f9aab8f0" containerName="neutron-api" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Dec 03 07:00:35 crc kubenswrapper[4475]: I1203 07:00:35.695611 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-5fbfc8dd66-9z9vj" event={"ID":"f560bfa0-ed4a-4559-aa99-a66df536745b","Type":"ContainerStarted","Data":"f348cb8127c373fa29945b689b732724d6cfdcdc4fc282e428c793ba667fc2ce"}
Dec 03 07:00:35 crc kubenswrapper[4475]: I1203 07:00:35.695915 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-5fbfc8dd66-9z9vj" 
event={"ID":"f560bfa0-ed4a-4559-aa99-a66df536745b","Type":"ContainerStarted","Data":"d0f0f5a9228f9378398304d16a839c9ecbb1f9c1cec09d0f1f1366df0a094fd3"} Dec 03 07:00:35 crc kubenswrapper[4475]: I1203 07:00:35.695925 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-5fbfc8dd66-9z9vj" event={"ID":"f560bfa0-ed4a-4559-aa99-a66df536745b","Type":"ContainerStarted","Data":"f243ff53a646dacb07bd24f64eec5f7cd2e4f84cac1821205fc7b4c8d85b35a7"} Dec 03 07:00:35 crc kubenswrapper[4475]: I1203 07:00:35.696657 4475 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/placement-5fbfc8dd66-9z9vj" Dec 03 07:00:35 crc kubenswrapper[4475]: I1203 07:00:35.696676 4475 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/placement-5fbfc8dd66-9z9vj" Dec 03 07:00:35 crc kubenswrapper[4475]: I1203 07:00:35.708600 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-6ntz5" event={"ID":"2a298c73-a9bf-496a-9192-dcbf3e2417cd","Type":"ContainerStarted","Data":"9aaaa0307bdfcafb53ad99cb2f2a47626244bec1003af7412e10a590a9b7103d"} Dec 03 07:00:35 crc kubenswrapper[4475]: I1203 07:00:35.715575 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-97b4f9d66-l5knv" event={"ID":"283af06a-8c2b-45cf-b134-0403750866b1","Type":"ContainerStarted","Data":"bbbb915a38ededdde5be41f29f557c2d7d35486b69b6060ecc5fc55dbd78b97f"} Dec 03 07:00:35 crc kubenswrapper[4475]: I1203 07:00:35.716296 4475 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/keystone-97b4f9d66-l5knv" Dec 03 07:00:35 crc kubenswrapper[4475]: I1203 07:00:35.720125 4475 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/placement-5fbfc8dd66-9z9vj" podStartSLOduration=2.720114255 podStartE2EDuration="2.720114255s" podCreationTimestamp="2025-12-03 07:00:33 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 
+0000 UTC" observedRunningTime="2025-12-03 07:00:35.71331785 +0000 UTC m=+920.518216184" watchObservedRunningTime="2025-12-03 07:00:35.720114255 +0000 UTC m=+920.525012589" Dec 03 07:00:35 crc kubenswrapper[4475]: I1203 07:00:35.735876 4475 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-db-sync-6ntz5" podStartSLOduration=4.080722379 podStartE2EDuration="50.735865553s" podCreationTimestamp="2025-12-03 06:59:45 +0000 UTC" firstStartedPulling="2025-12-03 06:59:47.669055092 +0000 UTC m=+872.473953426" lastFinishedPulling="2025-12-03 07:00:34.324198267 +0000 UTC m=+919.129096600" observedRunningTime="2025-12-03 07:00:35.733532455 +0000 UTC m=+920.538430790" watchObservedRunningTime="2025-12-03 07:00:35.735865553 +0000 UTC m=+920.540763887" Dec 03 07:00:35 crc kubenswrapper[4475]: I1203 07:00:35.752316 4475 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-97b4f9d66-l5knv" podStartSLOduration=2.752303324 podStartE2EDuration="2.752303324s" podCreationTimestamp="2025-12-03 07:00:33 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 07:00:35.748559392 +0000 UTC m=+920.553457746" watchObservedRunningTime="2025-12-03 07:00:35.752303324 +0000 UTC m=+920.557201659" Dec 03 07:00:35 crc kubenswrapper[4475]: I1203 07:00:35.922550 4475 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-bk4n9" Dec 03 07:00:35 crc kubenswrapper[4475]: I1203 07:00:35.922587 4475 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-bk4n9" Dec 03 07:00:36 crc kubenswrapper[4475]: I1203 07:00:36.262158 4475 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0" Dec 03 07:00:36 crc kubenswrapper[4475]: I1203 07:00:36.262423 4475 kubelet.go:2542] 
"SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0" Dec 03 07:00:36 crc kubenswrapper[4475]: I1203 07:00:36.320906 4475 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0" Dec 03 07:00:36 crc kubenswrapper[4475]: I1203 07:00:36.321922 4475 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0" Dec 03 07:00:36 crc kubenswrapper[4475]: I1203 07:00:36.727326 4475 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0" Dec 03 07:00:36 crc kubenswrapper[4475]: I1203 07:00:36.728810 4475 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0" Dec 03 07:00:37 crc kubenswrapper[4475]: I1203 07:00:37.001282 4475 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-bk4n9" podUID="cd709367-66b2-4586-b3ba-424d4c1532ee" containerName="registry-server" probeResult="failure" output=< Dec 03 07:00:37 crc kubenswrapper[4475]: timeout: failed to connect service ":50051" within 1s Dec 03 07:00:37 crc kubenswrapper[4475]: > Dec 03 07:00:37 crc kubenswrapper[4475]: I1203 07:00:37.785013 4475 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/neutron-8798f5df-qtg6w" Dec 03 07:00:37 crc kubenswrapper[4475]: I1203 07:00:37.863481 4475 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-7546fbf47b-6g4gw"] Dec 03 07:00:37 crc kubenswrapper[4475]: I1203 07:00:37.863947 4475 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/neutron-7546fbf47b-6g4gw" podUID="d620fdbb-7e93-46f8-95e1-18c9f9aab8f0" containerName="neutron-api" containerID="cri-o://1c4da307c0c467a21013373e22ee9707f5603d2eae92ff464829b1780262ed39" gracePeriod=30 Dec 03 07:00:37 crc kubenswrapper[4475]: I1203 07:00:37.864782 4475 
kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/neutron-7546fbf47b-6g4gw" podUID="d620fdbb-7e93-46f8-95e1-18c9f9aab8f0" containerName="neutron-httpd" containerID="cri-o://e7c8adddfe68e490bef6ff8bf6eb5ad35a21fe979145033da3306295ded650f6" gracePeriod=30 Dec 03 07:00:37 crc kubenswrapper[4475]: I1203 07:00:37.893356 4475 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/neutron-7546fbf47b-6g4gw" podUID="d620fdbb-7e93-46f8-95e1-18c9f9aab8f0" containerName="neutron-httpd" probeResult="failure" output="Get \"http://10.217.0.152:9696/\": EOF" Dec 03 07:00:38 crc kubenswrapper[4475]: I1203 07:00:38.745173 4475 generic.go:334] "Generic (PLEG): container finished" podID="d620fdbb-7e93-46f8-95e1-18c9f9aab8f0" containerID="e7c8adddfe68e490bef6ff8bf6eb5ad35a21fe979145033da3306295ded650f6" exitCode=0 Dec 03 07:00:38 crc kubenswrapper[4475]: I1203 07:00:38.745555 4475 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Dec 03 07:00:38 crc kubenswrapper[4475]: I1203 07:00:38.745566 4475 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Dec 03 07:00:38 crc kubenswrapper[4475]: I1203 07:00:38.745255 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-7546fbf47b-6g4gw" event={"ID":"d620fdbb-7e93-46f8-95e1-18c9f9aab8f0","Type":"ContainerDied","Data":"e7c8adddfe68e490bef6ff8bf6eb5ad35a21fe979145033da3306295ded650f6"} Dec 03 07:00:39 crc kubenswrapper[4475]: I1203 07:00:39.757697 4475 generic.go:334] "Generic (PLEG): container finished" podID="8c7df369-c49c-4d2a-842a-a8bd41944f1b" containerID="f02bdef2568fe1d00657c29fe666cc8ff0d786e5650b0c96bef80a7cb8b0c4ca" exitCode=0 Dec 03 07:00:39 crc kubenswrapper[4475]: I1203 07:00:39.758032 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-zch8h" 
event={"ID":"8c7df369-c49c-4d2a-842a-a8bd41944f1b","Type":"ContainerDied","Data":"f02bdef2568fe1d00657c29fe666cc8ff0d786e5650b0c96bef80a7cb8b0c4ca"} Dec 03 07:00:39 crc kubenswrapper[4475]: I1203 07:00:39.817104 4475 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-glm7p" Dec 03 07:00:39 crc kubenswrapper[4475]: I1203 07:00:39.817132 4475 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-glm7p" Dec 03 07:00:39 crc kubenswrapper[4475]: I1203 07:00:39.865183 4475 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-glm7p" Dec 03 07:00:40 crc kubenswrapper[4475]: I1203 07:00:40.431763 4475 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0" Dec 03 07:00:40 crc kubenswrapper[4475]: I1203 07:00:40.431872 4475 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Dec 03 07:00:40 crc kubenswrapper[4475]: I1203 07:00:40.440141 4475 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0" Dec 03 07:00:40 crc kubenswrapper[4475]: I1203 07:00:40.779512 4475 generic.go:334] "Generic (PLEG): container finished" podID="7105b12e-7df5-42e5-b0cc-27ea52ea7b1c" containerID="33e25a10b7c252046437cebe3572fd74821f908db6872722174c5b44aedb9992" exitCode=0 Dec 03 07:00:40 crc kubenswrapper[4475]: I1203 07:00:40.779562 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-db-sync-bxx54" event={"ID":"7105b12e-7df5-42e5-b0cc-27ea52ea7b1c","Type":"ContainerDied","Data":"33e25a10b7c252046437cebe3572fd74821f908db6872722174c5b44aedb9992"} Dec 03 07:00:40 crc kubenswrapper[4475]: I1203 07:00:40.792979 4475 generic.go:334] "Generic (PLEG): container finished" podID="d620fdbb-7e93-46f8-95e1-18c9f9aab8f0" 
containerID="1c4da307c0c467a21013373e22ee9707f5603d2eae92ff464829b1780262ed39" exitCode=0 Dec 03 07:00:40 crc kubenswrapper[4475]: I1203 07:00:40.793103 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-7546fbf47b-6g4gw" event={"ID":"d620fdbb-7e93-46f8-95e1-18c9f9aab8f0","Type":"ContainerDied","Data":"1c4da307c0c467a21013373e22ee9707f5603d2eae92ff464829b1780262ed39"} Dec 03 07:00:40 crc kubenswrapper[4475]: I1203 07:00:40.861184 4475 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-glm7p" Dec 03 07:00:40 crc kubenswrapper[4475]: I1203 07:00:40.900187 4475 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-glm7p"] Dec 03 07:00:41 crc kubenswrapper[4475]: I1203 07:00:41.810712 4475 generic.go:334] "Generic (PLEG): container finished" podID="2a298c73-a9bf-496a-9192-dcbf3e2417cd" containerID="9aaaa0307bdfcafb53ad99cb2f2a47626244bec1003af7412e10a590a9b7103d" exitCode=0 Dec 03 07:00:41 crc kubenswrapper[4475]: I1203 07:00:41.810848 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-6ntz5" event={"ID":"2a298c73-a9bf-496a-9192-dcbf3e2417cd","Type":"ContainerDied","Data":"9aaaa0307bdfcafb53ad99cb2f2a47626244bec1003af7412e10a590a9b7103d"} Dec 03 07:00:42 crc kubenswrapper[4475]: I1203 07:00:42.818120 4475 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-glm7p" podUID="905141fd-3de4-45c4-bffb-45934f8ea6d3" containerName="registry-server" containerID="cri-o://96855b91ce5cbe19b39e5ab377d1507d6946626aee1939449d754538b3cb8d8a" gracePeriod=2 Dec 03 07:00:43 crc kubenswrapper[4475]: I1203 07:00:43.846058 4475 generic.go:334] "Generic (PLEG): container finished" podID="905141fd-3de4-45c4-bffb-45934f8ea6d3" containerID="96855b91ce5cbe19b39e5ab377d1507d6946626aee1939449d754538b3cb8d8a" exitCode=0 Dec 03 07:00:43 crc kubenswrapper[4475]: I1203 
07:00:43.846138 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-glm7p" event={"ID":"905141fd-3de4-45c4-bffb-45934f8ea6d3","Type":"ContainerDied","Data":"96855b91ce5cbe19b39e5ab377d1507d6946626aee1939449d754538b3cb8d8a"} Dec 03 07:00:44 crc kubenswrapper[4475]: I1203 07:00:44.824968 4475 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/horizon-7fc4d79b88-s8hhg" podUID="2401beb9-38b8-4581-b9a2-8bb16e15e6c1" containerName="horizon" probeResult="failure" output="Get \"https://10.217.0.146:8443/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.0.146:8443: connect: connection refused" Dec 03 07:00:45 crc kubenswrapper[4475]: I1203 07:00:45.034315 4475 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/horizon-695fd7c4bb-h85zh" podUID="1f10037d-d109-4863-b89f-c33f39b0848a" containerName="horizon" probeResult="failure" output="Get \"https://10.217.0.147:8443/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.0.147:8443: connect: connection refused" Dec 03 07:00:45 crc kubenswrapper[4475]: I1203 07:00:45.111193 4475 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-sync-zch8h" Dec 03 07:00:45 crc kubenswrapper[4475]: I1203 07:00:45.123565 4475 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/heat-db-sync-bxx54" Dec 03 07:00:45 crc kubenswrapper[4475]: I1203 07:00:45.125635 4475 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-db-sync-6ntz5" Dec 03 07:00:45 crc kubenswrapper[4475]: I1203 07:00:45.233906 4475 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2a298c73-a9bf-496a-9192-dcbf3e2417cd-config-data\") pod \"2a298c73-a9bf-496a-9192-dcbf3e2417cd\" (UID: \"2a298c73-a9bf-496a-9192-dcbf3e2417cd\") " Dec 03 07:00:45 crc kubenswrapper[4475]: I1203 07:00:45.234492 4475 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2a298c73-a9bf-496a-9192-dcbf3e2417cd-scripts\") pod \"2a298c73-a9bf-496a-9192-dcbf3e2417cd\" (UID: \"2a298c73-a9bf-496a-9192-dcbf3e2417cd\") " Dec 03 07:00:45 crc kubenswrapper[4475]: I1203 07:00:45.234569 4475 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dktz8\" (UniqueName: \"kubernetes.io/projected/7105b12e-7df5-42e5-b0cc-27ea52ea7b1c-kube-api-access-dktz8\") pod \"7105b12e-7df5-42e5-b0cc-27ea52ea7b1c\" (UID: \"7105b12e-7df5-42e5-b0cc-27ea52ea7b1c\") " Dec 03 07:00:45 crc kubenswrapper[4475]: I1203 07:00:45.234635 4475 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2a298c73-a9bf-496a-9192-dcbf3e2417cd-combined-ca-bundle\") pod \"2a298c73-a9bf-496a-9192-dcbf3e2417cd\" (UID: \"2a298c73-a9bf-496a-9192-dcbf3e2417cd\") " Dec 03 07:00:45 crc kubenswrapper[4475]: I1203 07:00:45.234666 4475 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8c7df369-c49c-4d2a-842a-a8bd41944f1b-combined-ca-bundle\") pod \"8c7df369-c49c-4d2a-842a-a8bd41944f1b\" (UID: \"8c7df369-c49c-4d2a-842a-a8bd41944f1b\") " Dec 03 07:00:45 crc kubenswrapper[4475]: I1203 07:00:45.234723 4475 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" 
(UniqueName: \"kubernetes.io/host-path/2a298c73-a9bf-496a-9192-dcbf3e2417cd-etc-machine-id\") pod \"2a298c73-a9bf-496a-9192-dcbf3e2417cd\" (UID: \"2a298c73-a9bf-496a-9192-dcbf3e2417cd\") " Dec 03 07:00:45 crc kubenswrapper[4475]: I1203 07:00:45.234789 4475 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/8c7df369-c49c-4d2a-842a-a8bd41944f1b-db-sync-config-data\") pod \"8c7df369-c49c-4d2a-842a-a8bd41944f1b\" (UID: \"8c7df369-c49c-4d2a-842a-a8bd41944f1b\") " Dec 03 07:00:45 crc kubenswrapper[4475]: I1203 07:00:45.234869 4475 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7105b12e-7df5-42e5-b0cc-27ea52ea7b1c-config-data\") pod \"7105b12e-7df5-42e5-b0cc-27ea52ea7b1c\" (UID: \"7105b12e-7df5-42e5-b0cc-27ea52ea7b1c\") " Dec 03 07:00:45 crc kubenswrapper[4475]: I1203 07:00:45.234956 4475 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-l67b7\" (UniqueName: \"kubernetes.io/projected/8c7df369-c49c-4d2a-842a-a8bd41944f1b-kube-api-access-l67b7\") pod \"8c7df369-c49c-4d2a-842a-a8bd41944f1b\" (UID: \"8c7df369-c49c-4d2a-842a-a8bd41944f1b\") " Dec 03 07:00:45 crc kubenswrapper[4475]: I1203 07:00:45.235128 4475 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hbqln\" (UniqueName: \"kubernetes.io/projected/2a298c73-a9bf-496a-9192-dcbf3e2417cd-kube-api-access-hbqln\") pod \"2a298c73-a9bf-496a-9192-dcbf3e2417cd\" (UID: \"2a298c73-a9bf-496a-9192-dcbf3e2417cd\") " Dec 03 07:00:45 crc kubenswrapper[4475]: I1203 07:00:45.235178 4475 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/2a298c73-a9bf-496a-9192-dcbf3e2417cd-db-sync-config-data\") pod \"2a298c73-a9bf-496a-9192-dcbf3e2417cd\" (UID: \"2a298c73-a9bf-496a-9192-dcbf3e2417cd\") " Dec 03 
07:00:45 crc kubenswrapper[4475]: I1203 07:00:45.235267 4475 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7105b12e-7df5-42e5-b0cc-27ea52ea7b1c-combined-ca-bundle\") pod \"7105b12e-7df5-42e5-b0cc-27ea52ea7b1c\" (UID: \"7105b12e-7df5-42e5-b0cc-27ea52ea7b1c\") " Dec 03 07:00:45 crc kubenswrapper[4475]: I1203 07:00:45.251517 4475 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2a298c73-a9bf-496a-9192-dcbf3e2417cd-kube-api-access-hbqln" (OuterVolumeSpecName: "kube-api-access-hbqln") pod "2a298c73-a9bf-496a-9192-dcbf3e2417cd" (UID: "2a298c73-a9bf-496a-9192-dcbf3e2417cd"). InnerVolumeSpecName "kube-api-access-hbqln". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 07:00:45 crc kubenswrapper[4475]: I1203 07:00:45.252551 4475 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/2a298c73-a9bf-496a-9192-dcbf3e2417cd-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "2a298c73-a9bf-496a-9192-dcbf3e2417cd" (UID: "2a298c73-a9bf-496a-9192-dcbf3e2417cd"). InnerVolumeSpecName "etc-machine-id". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 03 07:00:45 crc kubenswrapper[4475]: I1203 07:00:45.256927 4475 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2a298c73-a9bf-496a-9192-dcbf3e2417cd-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "2a298c73-a9bf-496a-9192-dcbf3e2417cd" (UID: "2a298c73-a9bf-496a-9192-dcbf3e2417cd"). InnerVolumeSpecName "db-sync-config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 07:00:45 crc kubenswrapper[4475]: I1203 07:00:45.260878 4475 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8c7df369-c49c-4d2a-842a-a8bd41944f1b-kube-api-access-l67b7" (OuterVolumeSpecName: "kube-api-access-l67b7") pod "8c7df369-c49c-4d2a-842a-a8bd41944f1b" (UID: "8c7df369-c49c-4d2a-842a-a8bd41944f1b"). InnerVolumeSpecName "kube-api-access-l67b7". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 07:00:45 crc kubenswrapper[4475]: I1203 07:00:45.265684 4475 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7105b12e-7df5-42e5-b0cc-27ea52ea7b1c-kube-api-access-dktz8" (OuterVolumeSpecName: "kube-api-access-dktz8") pod "7105b12e-7df5-42e5-b0cc-27ea52ea7b1c" (UID: "7105b12e-7df5-42e5-b0cc-27ea52ea7b1c"). InnerVolumeSpecName "kube-api-access-dktz8". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 07:00:45 crc kubenswrapper[4475]: I1203 07:00:45.265800 4475 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2a298c73-a9bf-496a-9192-dcbf3e2417cd-scripts" (OuterVolumeSpecName: "scripts") pod "2a298c73-a9bf-496a-9192-dcbf3e2417cd" (UID: "2a298c73-a9bf-496a-9192-dcbf3e2417cd"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 07:00:45 crc kubenswrapper[4475]: E1203 07:00:45.273096 4475 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"ceilometer-central-agent\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\", failed to \"StartContainer\" for \"ceilometer-notification-agent\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"]" pod="openstack/ceilometer-0" podUID="12f47969-3169-43bc-8d07-cbd3952d81cf" Dec 03 07:00:45 crc kubenswrapper[4475]: I1203 07:00:45.282658 4475 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8c7df369-c49c-4d2a-842a-a8bd41944f1b-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "8c7df369-c49c-4d2a-842a-a8bd41944f1b" (UID: "8c7df369-c49c-4d2a-842a-a8bd41944f1b"). InnerVolumeSpecName "db-sync-config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 07:00:45 crc kubenswrapper[4475]: I1203 07:00:45.302017 4475 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8c7df369-c49c-4d2a-842a-a8bd41944f1b-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "8c7df369-c49c-4d2a-842a-a8bd41944f1b" (UID: "8c7df369-c49c-4d2a-842a-a8bd41944f1b"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 07:00:45 crc kubenswrapper[4475]: I1203 07:00:45.328974 4475 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2a298c73-a9bf-496a-9192-dcbf3e2417cd-config-data" (OuterVolumeSpecName: "config-data") pod "2a298c73-a9bf-496a-9192-dcbf3e2417cd" (UID: "2a298c73-a9bf-496a-9192-dcbf3e2417cd"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 07:00:45 crc kubenswrapper[4475]: I1203 07:00:45.349734 4475 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/2a298c73-a9bf-496a-9192-dcbf3e2417cd-etc-machine-id\") on node \"crc\" DevicePath \"\"" Dec 03 07:00:45 crc kubenswrapper[4475]: I1203 07:00:45.350161 4475 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/8c7df369-c49c-4d2a-842a-a8bd41944f1b-db-sync-config-data\") on node \"crc\" DevicePath \"\"" Dec 03 07:00:45 crc kubenswrapper[4475]: I1203 07:00:45.350263 4475 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-l67b7\" (UniqueName: \"kubernetes.io/projected/8c7df369-c49c-4d2a-842a-a8bd41944f1b-kube-api-access-l67b7\") on node \"crc\" DevicePath \"\"" Dec 03 07:00:45 crc kubenswrapper[4475]: I1203 07:00:45.350503 4475 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hbqln\" (UniqueName: \"kubernetes.io/projected/2a298c73-a9bf-496a-9192-dcbf3e2417cd-kube-api-access-hbqln\") on node \"crc\" DevicePath \"\"" Dec 03 07:00:45 crc kubenswrapper[4475]: I1203 07:00:45.350724 4475 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/2a298c73-a9bf-496a-9192-dcbf3e2417cd-db-sync-config-data\") on node \"crc\" DevicePath \"\"" Dec 03 07:00:45 crc kubenswrapper[4475]: I1203 07:00:45.352248 4475 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2a298c73-a9bf-496a-9192-dcbf3e2417cd-config-data\") on node \"crc\" DevicePath \"\"" Dec 03 07:00:45 crc kubenswrapper[4475]: I1203 07:00:45.352355 4475 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2a298c73-a9bf-496a-9192-dcbf3e2417cd-scripts\") on node \"crc\" DevicePath \"\"" Dec 03 07:00:45 crc kubenswrapper[4475]: I1203 
07:00:45.352420 4475 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dktz8\" (UniqueName: \"kubernetes.io/projected/7105b12e-7df5-42e5-b0cc-27ea52ea7b1c-kube-api-access-dktz8\") on node \"crc\" DevicePath \"\""
Dec 03 07:00:45 crc kubenswrapper[4475]: I1203 07:00:45.352564 4475 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8c7df369-c49c-4d2a-842a-a8bd41944f1b-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Dec 03 07:00:45 crc kubenswrapper[4475]: I1203 07:00:45.359727 4475 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7105b12e-7df5-42e5-b0cc-27ea52ea7b1c-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "7105b12e-7df5-42e5-b0cc-27ea52ea7b1c" (UID: "7105b12e-7df5-42e5-b0cc-27ea52ea7b1c"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 03 07:00:45 crc kubenswrapper[4475]: I1203 07:00:45.361929 4475 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2a298c73-a9bf-496a-9192-dcbf3e2417cd-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "2a298c73-a9bf-496a-9192-dcbf3e2417cd" (UID: "2a298c73-a9bf-496a-9192-dcbf3e2417cd"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 03 07:00:45 crc kubenswrapper[4475]: I1203 07:00:45.386921 4475 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-glm7p"
Dec 03 07:00:45 crc kubenswrapper[4475]: I1203 07:00:45.413115 4475 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7105b12e-7df5-42e5-b0cc-27ea52ea7b1c-config-data" (OuterVolumeSpecName: "config-data") pod "7105b12e-7df5-42e5-b0cc-27ea52ea7b1c" (UID: "7105b12e-7df5-42e5-b0cc-27ea52ea7b1c"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 03 07:00:45 crc kubenswrapper[4475]: I1203 07:00:45.438164 4475 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-7546fbf47b-6g4gw"
Dec 03 07:00:45 crc kubenswrapper[4475]: I1203 07:00:45.454128 4475 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7105b12e-7df5-42e5-b0cc-27ea52ea7b1c-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Dec 03 07:00:45 crc kubenswrapper[4475]: I1203 07:00:45.454156 4475 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2a298c73-a9bf-496a-9192-dcbf3e2417cd-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Dec 03 07:00:45 crc kubenswrapper[4475]: I1203 07:00:45.454166 4475 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7105b12e-7df5-42e5-b0cc-27ea52ea7b1c-config-data\") on node \"crc\" DevicePath \"\""
Dec 03 07:00:45 crc kubenswrapper[4475]: I1203 07:00:45.556531 4475 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/d620fdbb-7e93-46f8-95e1-18c9f9aab8f0-httpd-config\") pod \"d620fdbb-7e93-46f8-95e1-18c9f9aab8f0\" (UID: \"d620fdbb-7e93-46f8-95e1-18c9f9aab8f0\") "
Dec 03 07:00:45 crc kubenswrapper[4475]: I1203 07:00:45.556787 4475 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zsdkd\" (UniqueName: \"kubernetes.io/projected/905141fd-3de4-45c4-bffb-45934f8ea6d3-kube-api-access-zsdkd\") pod \"905141fd-3de4-45c4-bffb-45934f8ea6d3\" (UID: \"905141fd-3de4-45c4-bffb-45934f8ea6d3\") "
Dec 03 07:00:45 crc kubenswrapper[4475]: I1203 07:00:45.557022 4475 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/905141fd-3de4-45c4-bffb-45934f8ea6d3-utilities\") pod \"905141fd-3de4-45c4-bffb-45934f8ea6d3\" (UID: \"905141fd-3de4-45c4-bffb-45934f8ea6d3\") "
Dec 03 07:00:45 crc kubenswrapper[4475]: I1203 07:00:45.557059 4475 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/905141fd-3de4-45c4-bffb-45934f8ea6d3-catalog-content\") pod \"905141fd-3de4-45c4-bffb-45934f8ea6d3\" (UID: \"905141fd-3de4-45c4-bffb-45934f8ea6d3\") "
Dec 03 07:00:45 crc kubenswrapper[4475]: I1203 07:00:45.557162 4475 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/d620fdbb-7e93-46f8-95e1-18c9f9aab8f0-config\") pod \"d620fdbb-7e93-46f8-95e1-18c9f9aab8f0\" (UID: \"d620fdbb-7e93-46f8-95e1-18c9f9aab8f0\") "
Dec 03 07:00:45 crc kubenswrapper[4475]: I1203 07:00:45.557470 4475 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d620fdbb-7e93-46f8-95e1-18c9f9aab8f0-combined-ca-bundle\") pod \"d620fdbb-7e93-46f8-95e1-18c9f9aab8f0\" (UID: \"d620fdbb-7e93-46f8-95e1-18c9f9aab8f0\") "
Dec 03 07:00:45 crc kubenswrapper[4475]: I1203 07:00:45.557614 4475 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/d620fdbb-7e93-46f8-95e1-18c9f9aab8f0-ovndb-tls-certs\") pod \"d620fdbb-7e93-46f8-95e1-18c9f9aab8f0\" (UID: \"d620fdbb-7e93-46f8-95e1-18c9f9aab8f0\") "
Dec 03 07:00:45 crc kubenswrapper[4475]: I1203 07:00:45.558392 4475 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-z57rd\" (UniqueName: \"kubernetes.io/projected/d620fdbb-7e93-46f8-95e1-18c9f9aab8f0-kube-api-access-z57rd\") pod \"d620fdbb-7e93-46f8-95e1-18c9f9aab8f0\" (UID: \"d620fdbb-7e93-46f8-95e1-18c9f9aab8f0\") "
Dec 03 07:00:45 crc kubenswrapper[4475]: I1203 07:00:45.560343 4475 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/905141fd-3de4-45c4-bffb-45934f8ea6d3-utilities" (OuterVolumeSpecName: "utilities") pod "905141fd-3de4-45c4-bffb-45934f8ea6d3" (UID: "905141fd-3de4-45c4-bffb-45934f8ea6d3"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Dec 03 07:00:45 crc kubenswrapper[4475]: I1203 07:00:45.563101 4475 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d620fdbb-7e93-46f8-95e1-18c9f9aab8f0-kube-api-access-z57rd" (OuterVolumeSpecName: "kube-api-access-z57rd") pod "d620fdbb-7e93-46f8-95e1-18c9f9aab8f0" (UID: "d620fdbb-7e93-46f8-95e1-18c9f9aab8f0"). InnerVolumeSpecName "kube-api-access-z57rd". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 03 07:00:45 crc kubenswrapper[4475]: I1203 07:00:45.563504 4475 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/905141fd-3de4-45c4-bffb-45934f8ea6d3-kube-api-access-zsdkd" (OuterVolumeSpecName: "kube-api-access-zsdkd") pod "905141fd-3de4-45c4-bffb-45934f8ea6d3" (UID: "905141fd-3de4-45c4-bffb-45934f8ea6d3"). InnerVolumeSpecName "kube-api-access-zsdkd". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 03 07:00:45 crc kubenswrapper[4475]: I1203 07:00:45.567245 4475 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d620fdbb-7e93-46f8-95e1-18c9f9aab8f0-httpd-config" (OuterVolumeSpecName: "httpd-config") pod "d620fdbb-7e93-46f8-95e1-18c9f9aab8f0" (UID: "d620fdbb-7e93-46f8-95e1-18c9f9aab8f0"). InnerVolumeSpecName "httpd-config". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 03 07:00:45 crc kubenswrapper[4475]: I1203 07:00:45.599469 4475 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/905141fd-3de4-45c4-bffb-45934f8ea6d3-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "905141fd-3de4-45c4-bffb-45934f8ea6d3" (UID: "905141fd-3de4-45c4-bffb-45934f8ea6d3"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Dec 03 07:00:45 crc kubenswrapper[4475]: I1203 07:00:45.615338 4475 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d620fdbb-7e93-46f8-95e1-18c9f9aab8f0-config" (OuterVolumeSpecName: "config") pod "d620fdbb-7e93-46f8-95e1-18c9f9aab8f0" (UID: "d620fdbb-7e93-46f8-95e1-18c9f9aab8f0"). InnerVolumeSpecName "config". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 03 07:00:45 crc kubenswrapper[4475]: I1203 07:00:45.628208 4475 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d620fdbb-7e93-46f8-95e1-18c9f9aab8f0-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "d620fdbb-7e93-46f8-95e1-18c9f9aab8f0" (UID: "d620fdbb-7e93-46f8-95e1-18c9f9aab8f0"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 03 07:00:45 crc kubenswrapper[4475]: I1203 07:00:45.630894 4475 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d620fdbb-7e93-46f8-95e1-18c9f9aab8f0-ovndb-tls-certs" (OuterVolumeSpecName: "ovndb-tls-certs") pod "d620fdbb-7e93-46f8-95e1-18c9f9aab8f0" (UID: "d620fdbb-7e93-46f8-95e1-18c9f9aab8f0"). InnerVolumeSpecName "ovndb-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 03 07:00:45 crc kubenswrapper[4475]: I1203 07:00:45.661210 4475 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zsdkd\" (UniqueName: \"kubernetes.io/projected/905141fd-3de4-45c4-bffb-45934f8ea6d3-kube-api-access-zsdkd\") on node \"crc\" DevicePath \"\""
Dec 03 07:00:45 crc kubenswrapper[4475]: I1203 07:00:45.661232 4475 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/905141fd-3de4-45c4-bffb-45934f8ea6d3-utilities\") on node \"crc\" DevicePath \"\""
Dec 03 07:00:45 crc kubenswrapper[4475]: I1203 07:00:45.661243 4475 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/905141fd-3de4-45c4-bffb-45934f8ea6d3-catalog-content\") on node \"crc\" DevicePath \"\""
Dec 03 07:00:45 crc kubenswrapper[4475]: I1203 07:00:45.661252 4475 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/d620fdbb-7e93-46f8-95e1-18c9f9aab8f0-config\") on node \"crc\" DevicePath \"\""
Dec 03 07:00:45 crc kubenswrapper[4475]: I1203 07:00:45.661260 4475 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d620fdbb-7e93-46f8-95e1-18c9f9aab8f0-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Dec 03 07:00:45 crc kubenswrapper[4475]: I1203 07:00:45.661269 4475 reconciler_common.go:293] "Volume detached for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/d620fdbb-7e93-46f8-95e1-18c9f9aab8f0-ovndb-tls-certs\") on node \"crc\" DevicePath \"\""
Dec 03 07:00:45 crc kubenswrapper[4475]: I1203 07:00:45.661278 4475 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-z57rd\" (UniqueName: \"kubernetes.io/projected/d620fdbb-7e93-46f8-95e1-18c9f9aab8f0-kube-api-access-z57rd\") on node \"crc\" DevicePath \"\""
Dec 03 07:00:45 crc kubenswrapper[4475]: I1203 07:00:45.661287 4475 reconciler_common.go:293] "Volume detached for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/d620fdbb-7e93-46f8-95e1-18c9f9aab8f0-httpd-config\") on node \"crc\" DevicePath \"\""
Dec 03 07:00:45 crc kubenswrapper[4475]: I1203 07:00:45.865410 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-7546fbf47b-6g4gw" event={"ID":"d620fdbb-7e93-46f8-95e1-18c9f9aab8f0","Type":"ContainerDied","Data":"cf949f94f3434d42a7e55b3a181e0c5b1dfbe6fe92ada6cbd05bc62d576eff70"}
Dec 03 07:00:45 crc kubenswrapper[4475]: I1203 07:00:45.865683 4475 scope.go:117] "RemoveContainer" containerID="e7c8adddfe68e490bef6ff8bf6eb5ad35a21fe979145033da3306295ded650f6"
Dec 03 07:00:45 crc kubenswrapper[4475]: I1203 07:00:45.865860 4475 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-7546fbf47b-6g4gw"
Dec 03 07:00:45 crc kubenswrapper[4475]: I1203 07:00:45.880242 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-zch8h" event={"ID":"8c7df369-c49c-4d2a-842a-a8bd41944f1b","Type":"ContainerDied","Data":"f1110063056f9aaa08bd65eee60b1677157f130caaa2ec412ba699c1f03516ea"}
Dec 03 07:00:45 crc kubenswrapper[4475]: I1203 07:00:45.880311 4475 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-sync-zch8h"
Dec 03 07:00:45 crc kubenswrapper[4475]: I1203 07:00:45.880322 4475 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f1110063056f9aaa08bd65eee60b1677157f130caaa2ec412ba699c1f03516ea"
Dec 03 07:00:45 crc kubenswrapper[4475]: I1203 07:00:45.882277 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"12f47969-3169-43bc-8d07-cbd3952d81cf","Type":"ContainerStarted","Data":"17f255f8052085fee0b8356471d6fb5227d356feabddde30f1a623a8125e1039"}
Dec 03 07:00:45 crc kubenswrapper[4475]: I1203 07:00:45.882521 4475 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="12f47969-3169-43bc-8d07-cbd3952d81cf" containerName="sg-core" containerID="cri-o://f0a7feb56b6dec7c49849c0c810bfe251e8c688d3f3e7551f990b104406979ef" gracePeriod=30
Dec 03 07:00:45 crc kubenswrapper[4475]: I1203 07:00:45.882630 4475 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0"
Dec 03 07:00:45 crc kubenswrapper[4475]: I1203 07:00:45.882684 4475 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="12f47969-3169-43bc-8d07-cbd3952d81cf" containerName="proxy-httpd" containerID="cri-o://17f255f8052085fee0b8356471d6fb5227d356feabddde30f1a623a8125e1039" gracePeriod=30
Dec 03 07:00:45 crc kubenswrapper[4475]: I1203 07:00:45.905779 4475 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-7546fbf47b-6g4gw"]
Dec 03 07:00:45 crc kubenswrapper[4475]: I1203 07:00:45.907157 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-db-sync-bxx54" event={"ID":"7105b12e-7df5-42e5-b0cc-27ea52ea7b1c","Type":"ContainerDied","Data":"5eca5fd38acb9ac324ce3db8ce13f397d6e0945b55aea483a2245e0c85156aa5"}
Dec 03 07:00:45 crc kubenswrapper[4475]: I1203 07:00:45.907185 4475 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="5eca5fd38acb9ac324ce3db8ce13f397d6e0945b55aea483a2245e0c85156aa5"
Dec 03 07:00:45 crc kubenswrapper[4475]: I1203 07:00:45.907237 4475 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/heat-db-sync-bxx54"
Dec 03 07:00:45 crc kubenswrapper[4475]: I1203 07:00:45.910978 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-glm7p" event={"ID":"905141fd-3de4-45c4-bffb-45934f8ea6d3","Type":"ContainerDied","Data":"eda29046bc4e3595e4be167a955ef42f5bb0b044e03c77fad16f08627800df10"}
Dec 03 07:00:45 crc kubenswrapper[4475]: I1203 07:00:45.911044 4475 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-glm7p"
Dec 03 07:00:45 crc kubenswrapper[4475]: I1203 07:00:45.923659 4475 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-7546fbf47b-6g4gw"]
Dec 03 07:00:45 crc kubenswrapper[4475]: I1203 07:00:45.928444 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-6ntz5" event={"ID":"2a298c73-a9bf-496a-9192-dcbf3e2417cd","Type":"ContainerDied","Data":"c6a240ea2e6d6cd50729781f184b1e40c6dec36978a085321933384500438fba"}
Dec 03 07:00:45 crc kubenswrapper[4475]: I1203 07:00:45.928514 4475 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c6a240ea2e6d6cd50729781f184b1e40c6dec36978a085321933384500438fba"
Dec 03 07:00:45 crc kubenswrapper[4475]: I1203 07:00:45.928564 4475 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-sync-6ntz5"
Dec 03 07:00:45 crc kubenswrapper[4475]: I1203 07:00:45.939850 4475 scope.go:117] "RemoveContainer" containerID="1c4da307c0c467a21013373e22ee9707f5603d2eae92ff464829b1780262ed39"
Dec 03 07:00:46 crc kubenswrapper[4475]: I1203 07:00:46.123202 4475 scope.go:117] "RemoveContainer" containerID="96855b91ce5cbe19b39e5ab377d1507d6946626aee1939449d754538b3cb8d8a"
Dec 03 07:00:46 crc kubenswrapper[4475]: I1203 07:00:46.142157 4475 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-glm7p"]
Dec 03 07:00:46 crc kubenswrapper[4475]: I1203 07:00:46.176233 4475 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-glm7p"]
Dec 03 07:00:46 crc kubenswrapper[4475]: I1203 07:00:46.222932 4475 scope.go:117] "RemoveContainer" containerID="4823326e9a8f847ed1956d42ba0df515a55060f14fecf121285347dba1e9db5f"
Dec 03 07:00:46 crc kubenswrapper[4475]: I1203 07:00:46.320739 4475 scope.go:117] "RemoveContainer" containerID="c9efe999bd33dc9f26e7714db719ab40201a623e9049dc9640a740b805625359"
Dec 03 07:00:46 crc kubenswrapper[4475]: I1203 07:00:46.350552 4475 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-keystone-listener-6ffbb57cbb-mczq9"]
Dec 03 07:00:46 crc kubenswrapper[4475]: E1203 07:00:46.350953 4475 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8c7df369-c49c-4d2a-842a-a8bd41944f1b" containerName="barbican-db-sync"
Dec 03 07:00:46 crc kubenswrapper[4475]: I1203 07:00:46.350971 4475 state_mem.go:107] "Deleted CPUSet assignment" podUID="8c7df369-c49c-4d2a-842a-a8bd41944f1b" containerName="barbican-db-sync"
Dec 03 07:00:46 crc kubenswrapper[4475]: E1203 07:00:46.350997 4475 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7105b12e-7df5-42e5-b0cc-27ea52ea7b1c" containerName="heat-db-sync"
Dec 03 07:00:46 crc kubenswrapper[4475]: I1203 07:00:46.351004 4475 state_mem.go:107] "Deleted CPUSet assignment" podUID="7105b12e-7df5-42e5-b0cc-27ea52ea7b1c" containerName="heat-db-sync"
Dec 03 07:00:46 crc kubenswrapper[4475]: E1203 07:00:46.351013 4475 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="905141fd-3de4-45c4-bffb-45934f8ea6d3" containerName="extract-content"
Dec 03 07:00:46 crc kubenswrapper[4475]: I1203 07:00:46.351018 4475 state_mem.go:107] "Deleted CPUSet assignment" podUID="905141fd-3de4-45c4-bffb-45934f8ea6d3" containerName="extract-content"
Dec 03 07:00:46 crc kubenswrapper[4475]: E1203 07:00:46.351032 4475 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2a298c73-a9bf-496a-9192-dcbf3e2417cd" containerName="cinder-db-sync"
Dec 03 07:00:46 crc kubenswrapper[4475]: I1203 07:00:46.351037 4475 state_mem.go:107] "Deleted CPUSet assignment" podUID="2a298c73-a9bf-496a-9192-dcbf3e2417cd" containerName="cinder-db-sync"
Dec 03 07:00:46 crc kubenswrapper[4475]: E1203 07:00:46.351053 4475 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d620fdbb-7e93-46f8-95e1-18c9f9aab8f0" containerName="neutron-api"
Dec 03 07:00:46 crc kubenswrapper[4475]: I1203 07:00:46.351059 4475 state_mem.go:107] "Deleted CPUSet assignment" podUID="d620fdbb-7e93-46f8-95e1-18c9f9aab8f0" containerName="neutron-api"
Dec 03 07:00:46 crc kubenswrapper[4475]: E1203 07:00:46.351072 4475 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d620fdbb-7e93-46f8-95e1-18c9f9aab8f0" containerName="neutron-httpd"
Dec 03 07:00:46 crc kubenswrapper[4475]: I1203 07:00:46.351078 4475 state_mem.go:107] "Deleted CPUSet assignment" podUID="d620fdbb-7e93-46f8-95e1-18c9f9aab8f0" containerName="neutron-httpd"
Dec 03 07:00:46 crc kubenswrapper[4475]: E1203 07:00:46.351088 4475 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="905141fd-3de4-45c4-bffb-45934f8ea6d3" containerName="extract-utilities"
Dec 03 07:00:46 crc kubenswrapper[4475]: I1203 07:00:46.351094 4475 state_mem.go:107] "Deleted CPUSet assignment" podUID="905141fd-3de4-45c4-bffb-45934f8ea6d3" containerName="extract-utilities"
Dec 03 07:00:46 crc kubenswrapper[4475]: E1203 07:00:46.351100 4475 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="905141fd-3de4-45c4-bffb-45934f8ea6d3" containerName="registry-server"
Dec 03 07:00:46 crc kubenswrapper[4475]: I1203 07:00:46.351105 4475 state_mem.go:107] "Deleted CPUSet assignment" podUID="905141fd-3de4-45c4-bffb-45934f8ea6d3" containerName="registry-server"
Dec 03 07:00:46 crc kubenswrapper[4475]: I1203 07:00:46.351270 4475 memory_manager.go:354] "RemoveStaleState removing state" podUID="d620fdbb-7e93-46f8-95e1-18c9f9aab8f0" containerName="neutron-httpd"
Dec 03 07:00:46 crc kubenswrapper[4475]: I1203 07:00:46.351282 4475 memory_manager.go:354] "RemoveStaleState removing state" podUID="7105b12e-7df5-42e5-b0cc-27ea52ea7b1c" containerName="heat-db-sync"
Dec 03 07:00:46 crc kubenswrapper[4475]: I1203 07:00:46.351294 4475 memory_manager.go:354] "RemoveStaleState removing state" podUID="2a298c73-a9bf-496a-9192-dcbf3e2417cd" containerName="cinder-db-sync"
Dec 03 07:00:46 crc kubenswrapper[4475]: I1203 07:00:46.351301 4475 memory_manager.go:354] "RemoveStaleState removing state" podUID="d620fdbb-7e93-46f8-95e1-18c9f9aab8f0" containerName="neutron-api"
Dec 03 07:00:46 crc kubenswrapper[4475]: I1203 07:00:46.351310 4475 memory_manager.go:354] "RemoveStaleState removing state" podUID="8c7df369-c49c-4d2a-842a-a8bd41944f1b" containerName="barbican-db-sync"
Dec 03 07:00:46 crc kubenswrapper[4475]: I1203 07:00:46.351324 4475 memory_manager.go:354] "RemoveStaleState removing state" podUID="905141fd-3de4-45c4-bffb-45934f8ea6d3" containerName="registry-server"
Dec 03 07:00:46 crc kubenswrapper[4475]: I1203 07:00:46.352168 4475 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-keystone-listener-6ffbb57cbb-mczq9"
Dec 03 07:00:46 crc kubenswrapper[4475]: I1203 07:00:46.370632 4475 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-keystone-listener-config-data"
Dec 03 07:00:46 crc kubenswrapper[4475]: I1203 07:00:46.370808 4475 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-barbican-dockercfg-gghvg"
Dec 03 07:00:46 crc kubenswrapper[4475]: I1203 07:00:46.370911 4475 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-config-data"
Dec 03 07:00:46 crc kubenswrapper[4475]: I1203 07:00:46.392739 4475 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-worker-7c975977bc-7bx2h"]
Dec 03 07:00:46 crc kubenswrapper[4475]: I1203 07:00:46.394275 4475 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-worker-7c975977bc-7bx2h"
Dec 03 07:00:46 crc kubenswrapper[4475]: I1203 07:00:46.398153 4475 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-worker-config-data"
Dec 03 07:00:46 crc kubenswrapper[4475]: I1203 07:00:46.433706 4475 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-keystone-listener-6ffbb57cbb-mczq9"]
Dec 03 07:00:46 crc kubenswrapper[4475]: I1203 07:00:46.462883 4475 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-worker-7c975977bc-7bx2h"]
Dec 03 07:00:46 crc kubenswrapper[4475]: I1203 07:00:46.493298 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5003beac-ecfb-4cdc-9184-b08c422848f0-combined-ca-bundle\") pod \"barbican-keystone-listener-6ffbb57cbb-mczq9\" (UID: \"5003beac-ecfb-4cdc-9184-b08c422848f0\") " pod="openstack/barbican-keystone-listener-6ffbb57cbb-mczq9"
Dec 03 07:00:46 crc kubenswrapper[4475]: I1203 07:00:46.493377 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/bf7fe140-bbb3-4c39-b19a-388108167ed8-logs\") pod \"barbican-worker-7c975977bc-7bx2h\" (UID: \"bf7fe140-bbb3-4c39-b19a-388108167ed8\") " pod="openstack/barbican-worker-7c975977bc-7bx2h"
Dec 03 07:00:46 crc kubenswrapper[4475]: I1203 07:00:46.493567 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5003beac-ecfb-4cdc-9184-b08c422848f0-config-data\") pod \"barbican-keystone-listener-6ffbb57cbb-mczq9\" (UID: \"5003beac-ecfb-4cdc-9184-b08c422848f0\") " pod="openstack/barbican-keystone-listener-6ffbb57cbb-mczq9"
Dec 03 07:00:46 crc kubenswrapper[4475]: I1203 07:00:46.493589 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5003beac-ecfb-4cdc-9184-b08c422848f0-logs\") pod \"barbican-keystone-listener-6ffbb57cbb-mczq9\" (UID: \"5003beac-ecfb-4cdc-9184-b08c422848f0\") " pod="openstack/barbican-keystone-listener-6ffbb57cbb-mczq9"
Dec 03 07:00:46 crc kubenswrapper[4475]: I1203 07:00:46.493833 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bf7fe140-bbb3-4c39-b19a-388108167ed8-combined-ca-bundle\") pod \"barbican-worker-7c975977bc-7bx2h\" (UID: \"bf7fe140-bbb3-4c39-b19a-388108167ed8\") " pod="openstack/barbican-worker-7c975977bc-7bx2h"
Dec 03 07:00:46 crc kubenswrapper[4475]: I1203 07:00:46.493866 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/bf7fe140-bbb3-4c39-b19a-388108167ed8-config-data-custom\") pod \"barbican-worker-7c975977bc-7bx2h\" (UID: \"bf7fe140-bbb3-4c39-b19a-388108167ed8\") " pod="openstack/barbican-worker-7c975977bc-7bx2h"
Dec 03 07:00:46 crc kubenswrapper[4475]: I1203 07:00:46.493947 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zs6qg\" (UniqueName: \"kubernetes.io/projected/bf7fe140-bbb3-4c39-b19a-388108167ed8-kube-api-access-zs6qg\") pod \"barbican-worker-7c975977bc-7bx2h\" (UID: \"bf7fe140-bbb3-4c39-b19a-388108167ed8\") " pod="openstack/barbican-worker-7c975977bc-7bx2h"
Dec 03 07:00:46 crc kubenswrapper[4475]: I1203 07:00:46.493998 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/5003beac-ecfb-4cdc-9184-b08c422848f0-config-data-custom\") pod \"barbican-keystone-listener-6ffbb57cbb-mczq9\" (UID: \"5003beac-ecfb-4cdc-9184-b08c422848f0\") " pod="openstack/barbican-keystone-listener-6ffbb57cbb-mczq9"
Dec 03 07:00:46 crc kubenswrapper[4475]: I1203 07:00:46.494017 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xtsq9\" (UniqueName: \"kubernetes.io/projected/5003beac-ecfb-4cdc-9184-b08c422848f0-kube-api-access-xtsq9\") pod \"barbican-keystone-listener-6ffbb57cbb-mczq9\" (UID: \"5003beac-ecfb-4cdc-9184-b08c422848f0\") " pod="openstack/barbican-keystone-listener-6ffbb57cbb-mczq9"
Dec 03 07:00:46 crc kubenswrapper[4475]: I1203 07:00:46.494037 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bf7fe140-bbb3-4c39-b19a-388108167ed8-config-data\") pod \"barbican-worker-7c975977bc-7bx2h\" (UID: \"bf7fe140-bbb3-4c39-b19a-388108167ed8\") " pod="openstack/barbican-worker-7c975977bc-7bx2h"
Dec 03 07:00:46 crc kubenswrapper[4475]: I1203 07:00:46.530354 4475 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-65d6db7cdf-klwck"]
Dec 03 07:00:46 crc kubenswrapper[4475]: I1203 07:00:46.539190 4475 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-65d6db7cdf-klwck"
Dec 03 07:00:46 crc kubenswrapper[4475]: I1203 07:00:46.558912 4475 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-65d6db7cdf-klwck"]
Dec 03 07:00:46 crc kubenswrapper[4475]: I1203 07:00:46.597116 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6001f40d-4ed3-4baa-a858-80cfbe5bcf7d-config\") pod \"dnsmasq-dns-65d6db7cdf-klwck\" (UID: \"6001f40d-4ed3-4baa-a858-80cfbe5bcf7d\") " pod="openstack/dnsmasq-dns-65d6db7cdf-klwck"
Dec 03 07:00:46 crc kubenswrapper[4475]: I1203 07:00:46.597160 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5003beac-ecfb-4cdc-9184-b08c422848f0-config-data\") pod \"barbican-keystone-listener-6ffbb57cbb-mczq9\" (UID: \"5003beac-ecfb-4cdc-9184-b08c422848f0\") " pod="openstack/barbican-keystone-listener-6ffbb57cbb-mczq9"
Dec 03 07:00:46 crc kubenswrapper[4475]: I1203 07:00:46.597180 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5003beac-ecfb-4cdc-9184-b08c422848f0-logs\") pod \"barbican-keystone-listener-6ffbb57cbb-mczq9\" (UID: \"5003beac-ecfb-4cdc-9184-b08c422848f0\") " pod="openstack/barbican-keystone-listener-6ffbb57cbb-mczq9"
Dec 03 07:00:46 crc kubenswrapper[4475]: I1203 07:00:46.597196 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/6001f40d-4ed3-4baa-a858-80cfbe5bcf7d-dns-svc\") pod \"dnsmasq-dns-65d6db7cdf-klwck\" (UID: \"6001f40d-4ed3-4baa-a858-80cfbe5bcf7d\") " pod="openstack/dnsmasq-dns-65d6db7cdf-klwck"
Dec 03 07:00:46 crc kubenswrapper[4475]: I1203 07:00:46.597265 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bf7fe140-bbb3-4c39-b19a-388108167ed8-combined-ca-bundle\") pod \"barbican-worker-7c975977bc-7bx2h\" (UID: \"bf7fe140-bbb3-4c39-b19a-388108167ed8\") " pod="openstack/barbican-worker-7c975977bc-7bx2h"
Dec 03 07:00:46 crc kubenswrapper[4475]: I1203 07:00:46.597288 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/6001f40d-4ed3-4baa-a858-80cfbe5bcf7d-dns-swift-storage-0\") pod \"dnsmasq-dns-65d6db7cdf-klwck\" (UID: \"6001f40d-4ed3-4baa-a858-80cfbe5bcf7d\") " pod="openstack/dnsmasq-dns-65d6db7cdf-klwck"
Dec 03 07:00:46 crc kubenswrapper[4475]: I1203 07:00:46.597308 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/bf7fe140-bbb3-4c39-b19a-388108167ed8-config-data-custom\") pod \"barbican-worker-7c975977bc-7bx2h\" (UID: \"bf7fe140-bbb3-4c39-b19a-388108167ed8\") " pod="openstack/barbican-worker-7c975977bc-7bx2h"
Dec 03 07:00:46 crc kubenswrapper[4475]: I1203 07:00:46.597342 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zs6qg\" (UniqueName: \"kubernetes.io/projected/bf7fe140-bbb3-4c39-b19a-388108167ed8-kube-api-access-zs6qg\") pod \"barbican-worker-7c975977bc-7bx2h\" (UID: \"bf7fe140-bbb3-4c39-b19a-388108167ed8\") " pod="openstack/barbican-worker-7c975977bc-7bx2h"
Dec 03 07:00:46 crc kubenswrapper[4475]: I1203 07:00:46.597356 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-znjn4\" (UniqueName: \"kubernetes.io/projected/6001f40d-4ed3-4baa-a858-80cfbe5bcf7d-kube-api-access-znjn4\") pod \"dnsmasq-dns-65d6db7cdf-klwck\" (UID: \"6001f40d-4ed3-4baa-a858-80cfbe5bcf7d\") " pod="openstack/dnsmasq-dns-65d6db7cdf-klwck"
Dec 03 07:00:46 crc kubenswrapper[4475]: I1203 07:00:46.597377 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/5003beac-ecfb-4cdc-9184-b08c422848f0-config-data-custom\") pod \"barbican-keystone-listener-6ffbb57cbb-mczq9\" (UID: \"5003beac-ecfb-4cdc-9184-b08c422848f0\") " pod="openstack/barbican-keystone-listener-6ffbb57cbb-mczq9"
Dec 03 07:00:46 crc kubenswrapper[4475]: I1203 07:00:46.597393 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xtsq9\" (UniqueName: \"kubernetes.io/projected/5003beac-ecfb-4cdc-9184-b08c422848f0-kube-api-access-xtsq9\") pod \"barbican-keystone-listener-6ffbb57cbb-mczq9\" (UID: \"5003beac-ecfb-4cdc-9184-b08c422848f0\") " pod="openstack/barbican-keystone-listener-6ffbb57cbb-mczq9"
Dec 03 07:00:46 crc kubenswrapper[4475]: I1203 07:00:46.597414 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bf7fe140-bbb3-4c39-b19a-388108167ed8-config-data\") pod \"barbican-worker-7c975977bc-7bx2h\" (UID: \"bf7fe140-bbb3-4c39-b19a-388108167ed8\") " pod="openstack/barbican-worker-7c975977bc-7bx2h"
Dec 03 07:00:46 crc kubenswrapper[4475]: I1203 07:00:46.597428 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/6001f40d-4ed3-4baa-a858-80cfbe5bcf7d-ovsdbserver-nb\") pod \"dnsmasq-dns-65d6db7cdf-klwck\" (UID: \"6001f40d-4ed3-4baa-a858-80cfbe5bcf7d\") " pod="openstack/dnsmasq-dns-65d6db7cdf-klwck"
Dec 03 07:00:46 crc kubenswrapper[4475]: I1203 07:00:46.597586 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5003beac-ecfb-4cdc-9184-b08c422848f0-combined-ca-bundle\") pod \"barbican-keystone-listener-6ffbb57cbb-mczq9\" (UID: \"5003beac-ecfb-4cdc-9184-b08c422848f0\") " pod="openstack/barbican-keystone-listener-6ffbb57cbb-mczq9"
Dec 03 07:00:46 crc kubenswrapper[4475]: I1203 07:00:46.597612 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/6001f40d-4ed3-4baa-a858-80cfbe5bcf7d-ovsdbserver-sb\") pod \"dnsmasq-dns-65d6db7cdf-klwck\" (UID: \"6001f40d-4ed3-4baa-a858-80cfbe5bcf7d\") " pod="openstack/dnsmasq-dns-65d6db7cdf-klwck"
Dec 03 07:00:46 crc kubenswrapper[4475]: I1203 07:00:46.597643 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/bf7fe140-bbb3-4c39-b19a-388108167ed8-logs\") pod \"barbican-worker-7c975977bc-7bx2h\" (UID: \"bf7fe140-bbb3-4c39-b19a-388108167ed8\") " pod="openstack/barbican-worker-7c975977bc-7bx2h"
Dec 03 07:00:46 crc kubenswrapper[4475]: I1203 07:00:46.597946 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/bf7fe140-bbb3-4c39-b19a-388108167ed8-logs\") pod \"barbican-worker-7c975977bc-7bx2h\" (UID: \"bf7fe140-bbb3-4c39-b19a-388108167ed8\") " pod="openstack/barbican-worker-7c975977bc-7bx2h"
Dec 03 07:00:46 crc kubenswrapper[4475]: I1203 07:00:46.605097 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5003beac-ecfb-4cdc-9184-b08c422848f0-logs\") pod \"barbican-keystone-listener-6ffbb57cbb-mczq9\" (UID: \"5003beac-ecfb-4cdc-9184-b08c422848f0\") " pod="openstack/barbican-keystone-listener-6ffbb57cbb-mczq9"
Dec 03 07:00:46 crc kubenswrapper[4475]: I1203 07:00:46.618736 4475 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-scheduler-0"]
Dec 03 07:00:46 crc kubenswrapper[4475]: I1203 07:00:46.620082 4475 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0"
Dec 03 07:00:46 crc kubenswrapper[4475]: I1203 07:00:46.622052 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5003beac-ecfb-4cdc-9184-b08c422848f0-config-data\") pod \"barbican-keystone-listener-6ffbb57cbb-mczq9\" (UID: \"5003beac-ecfb-4cdc-9184-b08c422848f0\") " pod="openstack/barbican-keystone-listener-6ffbb57cbb-mczq9"
Dec 03 07:00:46 crc kubenswrapper[4475]: I1203 07:00:46.622574 4475 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scripts"
Dec 03 07:00:46 crc kubenswrapper[4475]: I1203 07:00:46.622749 4475 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-config-data"
Dec 03 07:00:46 crc kubenswrapper[4475]: I1203 07:00:46.622856 4475 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-cinder-dockercfg-ppndb"
Dec 03 07:00:46 crc kubenswrapper[4475]: I1203 07:00:46.622967 4475 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scheduler-config-data"
Dec 03 07:00:46 crc kubenswrapper[4475]: I1203 07:00:46.623068 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bf7fe140-bbb3-4c39-b19a-388108167ed8-config-data\") pod \"barbican-worker-7c975977bc-7bx2h\" (UID: \"bf7fe140-bbb3-4c39-b19a-388108167ed8\") " pod="openstack/barbican-worker-7c975977bc-7bx2h"
Dec 03 07:00:46 crc kubenswrapper[4475]: I1203 07:00:46.626786 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/bf7fe140-bbb3-4c39-b19a-388108167ed8-config-data-custom\") pod \"barbican-worker-7c975977bc-7bx2h\" (UID: \"bf7fe140-bbb3-4c39-b19a-388108167ed8\") " pod="openstack/barbican-worker-7c975977bc-7bx2h"
Dec 03 07:00:46 crc kubenswrapper[4475]: I1203 07:00:46.627321 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bf7fe140-bbb3-4c39-b19a-388108167ed8-combined-ca-bundle\") pod \"barbican-worker-7c975977bc-7bx2h\" (UID: \"bf7fe140-bbb3-4c39-b19a-388108167ed8\") " pod="openstack/barbican-worker-7c975977bc-7bx2h"
Dec 03 07:00:46 crc kubenswrapper[4475]: I1203 07:00:46.627764 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5003beac-ecfb-4cdc-9184-b08c422848f0-combined-ca-bundle\") pod \"barbican-keystone-listener-6ffbb57cbb-mczq9\" (UID: \"5003beac-ecfb-4cdc-9184-b08c422848f0\") " pod="openstack/barbican-keystone-listener-6ffbb57cbb-mczq9"
Dec 03 07:00:46 crc kubenswrapper[4475]: I1203 07:00:46.639140 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/5003beac-ecfb-4cdc-9184-b08c422848f0-config-data-custom\") pod \"barbican-keystone-listener-6ffbb57cbb-mczq9\" (UID: \"5003beac-ecfb-4cdc-9184-b08c422848f0\") " pod="openstack/barbican-keystone-listener-6ffbb57cbb-mczq9"
Dec 03 07:00:46 crc kubenswrapper[4475]: I1203 07:00:46.658898 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zs6qg\" (UniqueName: \"kubernetes.io/projected/bf7fe140-bbb3-4c39-b19a-388108167ed8-kube-api-access-zs6qg\") pod \"barbican-worker-7c975977bc-7bx2h\" (UID: \"bf7fe140-bbb3-4c39-b19a-388108167ed8\") " pod="openstack/barbican-worker-7c975977bc-7bx2h"
Dec 03 07:00:46 crc kubenswrapper[4475]: I1203 07:00:46.666039 4475 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-api-74bf7b4cc4-lpw7v"]
Dec 03 07:00:46 crc kubenswrapper[4475]: I1203 07:00:46.667388 4475 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-74bf7b4cc4-lpw7v"
Dec 03 07:00:46 crc kubenswrapper[4475]: I1203 07:00:46.670671 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xtsq9\" (UniqueName: \"kubernetes.io/projected/5003beac-ecfb-4cdc-9184-b08c422848f0-kube-api-access-xtsq9\") pod \"barbican-keystone-listener-6ffbb57cbb-mczq9\" (UID: \"5003beac-ecfb-4cdc-9184-b08c422848f0\") " pod="openstack/barbican-keystone-listener-6ffbb57cbb-mczq9"
Dec 03 07:00:46 crc kubenswrapper[4475]: I1203 07:00:46.681331 4475 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"]
Dec 03 07:00:46 crc kubenswrapper[4475]: I1203 07:00:46.684744 4475 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-api-config-data"
Dec 03 07:00:46 crc kubenswrapper[4475]: I1203 07:00:46.690176 4475 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-74bf7b4cc4-lpw7v"]
Dec 03 07:00:46 crc kubenswrapper[4475]: I1203 07:00:46.701396 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/11718fbb-3775-43f1-bd7f-d1ca4ad5800f-scripts\") pod \"cinder-scheduler-0\" (UID: \"11718fbb-3775-43f1-bd7f-d1ca4ad5800f\") " pod="openstack/cinder-scheduler-0"
Dec 03 07:00:46 crc kubenswrapper[4475]: I1203 07:00:46.701441 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/11718fbb-3775-43f1-bd7f-d1ca4ad5800f-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"11718fbb-3775-43f1-bd7f-d1ca4ad5800f\") " pod="openstack/cinder-scheduler-0"
Dec 03 07:00:46 crc kubenswrapper[4475]: I1203 07:00:46.701494 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName:
\"kubernetes.io/secret/11718fbb-3775-43f1-bd7f-d1ca4ad5800f-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"11718fbb-3775-43f1-bd7f-d1ca4ad5800f\") " pod="openstack/cinder-scheduler-0" Dec 03 07:00:46 crc kubenswrapper[4475]: I1203 07:00:46.701598 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rgbkn\" (UniqueName: \"kubernetes.io/projected/11718fbb-3775-43f1-bd7f-d1ca4ad5800f-kube-api-access-rgbkn\") pod \"cinder-scheduler-0\" (UID: \"11718fbb-3775-43f1-bd7f-d1ca4ad5800f\") " pod="openstack/cinder-scheduler-0" Dec 03 07:00:46 crc kubenswrapper[4475]: I1203 07:00:46.701674 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/6001f40d-4ed3-4baa-a858-80cfbe5bcf7d-dns-swift-storage-0\") pod \"dnsmasq-dns-65d6db7cdf-klwck\" (UID: \"6001f40d-4ed3-4baa-a858-80cfbe5bcf7d\") " pod="openstack/dnsmasq-dns-65d6db7cdf-klwck" Dec 03 07:00:46 crc kubenswrapper[4475]: I1203 07:00:46.701758 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/11718fbb-3775-43f1-bd7f-d1ca4ad5800f-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"11718fbb-3775-43f1-bd7f-d1ca4ad5800f\") " pod="openstack/cinder-scheduler-0" Dec 03 07:00:46 crc kubenswrapper[4475]: I1203 07:00:46.701786 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-znjn4\" (UniqueName: \"kubernetes.io/projected/6001f40d-4ed3-4baa-a858-80cfbe5bcf7d-kube-api-access-znjn4\") pod \"dnsmasq-dns-65d6db7cdf-klwck\" (UID: \"6001f40d-4ed3-4baa-a858-80cfbe5bcf7d\") " pod="openstack/dnsmasq-dns-65d6db7cdf-klwck" Dec 03 07:00:46 crc kubenswrapper[4475]: I1203 07:00:46.701845 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: 
\"kubernetes.io/configmap/6001f40d-4ed3-4baa-a858-80cfbe5bcf7d-ovsdbserver-nb\") pod \"dnsmasq-dns-65d6db7cdf-klwck\" (UID: \"6001f40d-4ed3-4baa-a858-80cfbe5bcf7d\") " pod="openstack/dnsmasq-dns-65d6db7cdf-klwck" Dec 03 07:00:46 crc kubenswrapper[4475]: I1203 07:00:46.701881 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/11718fbb-3775-43f1-bd7f-d1ca4ad5800f-config-data\") pod \"cinder-scheduler-0\" (UID: \"11718fbb-3775-43f1-bd7f-d1ca4ad5800f\") " pod="openstack/cinder-scheduler-0" Dec 03 07:00:46 crc kubenswrapper[4475]: I1203 07:00:46.701960 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/6001f40d-4ed3-4baa-a858-80cfbe5bcf7d-ovsdbserver-sb\") pod \"dnsmasq-dns-65d6db7cdf-klwck\" (UID: \"6001f40d-4ed3-4baa-a858-80cfbe5bcf7d\") " pod="openstack/dnsmasq-dns-65d6db7cdf-klwck" Dec 03 07:00:46 crc kubenswrapper[4475]: I1203 07:00:46.702031 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6001f40d-4ed3-4baa-a858-80cfbe5bcf7d-config\") pod \"dnsmasq-dns-65d6db7cdf-klwck\" (UID: \"6001f40d-4ed3-4baa-a858-80cfbe5bcf7d\") " pod="openstack/dnsmasq-dns-65d6db7cdf-klwck" Dec 03 07:00:46 crc kubenswrapper[4475]: I1203 07:00:46.702067 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/6001f40d-4ed3-4baa-a858-80cfbe5bcf7d-dns-svc\") pod \"dnsmasq-dns-65d6db7cdf-klwck\" (UID: \"6001f40d-4ed3-4baa-a858-80cfbe5bcf7d\") " pod="openstack/dnsmasq-dns-65d6db7cdf-klwck" Dec 03 07:00:46 crc kubenswrapper[4475]: I1203 07:00:46.702386 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/6001f40d-4ed3-4baa-a858-80cfbe5bcf7d-dns-swift-storage-0\") pod 
\"dnsmasq-dns-65d6db7cdf-klwck\" (UID: \"6001f40d-4ed3-4baa-a858-80cfbe5bcf7d\") " pod="openstack/dnsmasq-dns-65d6db7cdf-klwck" Dec 03 07:00:46 crc kubenswrapper[4475]: I1203 07:00:46.702754 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/6001f40d-4ed3-4baa-a858-80cfbe5bcf7d-dns-svc\") pod \"dnsmasq-dns-65d6db7cdf-klwck\" (UID: \"6001f40d-4ed3-4baa-a858-80cfbe5bcf7d\") " pod="openstack/dnsmasq-dns-65d6db7cdf-klwck" Dec 03 07:00:46 crc kubenswrapper[4475]: I1203 07:00:46.702948 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/6001f40d-4ed3-4baa-a858-80cfbe5bcf7d-ovsdbserver-nb\") pod \"dnsmasq-dns-65d6db7cdf-klwck\" (UID: \"6001f40d-4ed3-4baa-a858-80cfbe5bcf7d\") " pod="openstack/dnsmasq-dns-65d6db7cdf-klwck" Dec 03 07:00:46 crc kubenswrapper[4475]: I1203 07:00:46.703268 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/6001f40d-4ed3-4baa-a858-80cfbe5bcf7d-ovsdbserver-sb\") pod \"dnsmasq-dns-65d6db7cdf-klwck\" (UID: \"6001f40d-4ed3-4baa-a858-80cfbe5bcf7d\") " pod="openstack/dnsmasq-dns-65d6db7cdf-klwck" Dec 03 07:00:46 crc kubenswrapper[4475]: I1203 07:00:46.703627 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6001f40d-4ed3-4baa-a858-80cfbe5bcf7d-config\") pod \"dnsmasq-dns-65d6db7cdf-klwck\" (UID: \"6001f40d-4ed3-4baa-a858-80cfbe5bcf7d\") " pod="openstack/dnsmasq-dns-65d6db7cdf-klwck" Dec 03 07:00:46 crc kubenswrapper[4475]: I1203 07:00:46.736179 4475 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-keystone-listener-6ffbb57cbb-mczq9" Dec 03 07:00:46 crc kubenswrapper[4475]: I1203 07:00:46.739905 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-znjn4\" (UniqueName: \"kubernetes.io/projected/6001f40d-4ed3-4baa-a858-80cfbe5bcf7d-kube-api-access-znjn4\") pod \"dnsmasq-dns-65d6db7cdf-klwck\" (UID: \"6001f40d-4ed3-4baa-a858-80cfbe5bcf7d\") " pod="openstack/dnsmasq-dns-65d6db7cdf-klwck" Dec 03 07:00:46 crc kubenswrapper[4475]: I1203 07:00:46.742903 4475 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-worker-7c975977bc-7bx2h" Dec 03 07:00:46 crc kubenswrapper[4475]: I1203 07:00:46.799250 4475 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-65d6db7cdf-klwck"] Dec 03 07:00:46 crc kubenswrapper[4475]: I1203 07:00:46.807966 4475 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-65d6db7cdf-klwck" Dec 03 07:00:46 crc kubenswrapper[4475]: I1203 07:00:46.816584 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/11718fbb-3775-43f1-bd7f-d1ca4ad5800f-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"11718fbb-3775-43f1-bd7f-d1ca4ad5800f\") " pod="openstack/cinder-scheduler-0" Dec 03 07:00:46 crc kubenswrapper[4475]: I1203 07:00:46.816651 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mvwv6\" (UniqueName: \"kubernetes.io/projected/c455da32-2255-4caa-8734-33f188430025-kube-api-access-mvwv6\") pod \"barbican-api-74bf7b4cc4-lpw7v\" (UID: \"c455da32-2255-4caa-8734-33f188430025\") " pod="openstack/barbican-api-74bf7b4cc4-lpw7v" Dec 03 07:00:46 crc kubenswrapper[4475]: I1203 07:00:46.816684 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/11718fbb-3775-43f1-bd7f-d1ca4ad5800f-config-data\") pod \"cinder-scheduler-0\" (UID: \"11718fbb-3775-43f1-bd7f-d1ca4ad5800f\") " pod="openstack/cinder-scheduler-0" Dec 03 07:00:46 crc kubenswrapper[4475]: I1203 07:00:46.816728 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/c455da32-2255-4caa-8734-33f188430025-config-data-custom\") pod \"barbican-api-74bf7b4cc4-lpw7v\" (UID: \"c455da32-2255-4caa-8734-33f188430025\") " pod="openstack/barbican-api-74bf7b4cc4-lpw7v" Dec 03 07:00:46 crc kubenswrapper[4475]: I1203 07:00:46.816765 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c455da32-2255-4caa-8734-33f188430025-config-data\") pod \"barbican-api-74bf7b4cc4-lpw7v\" (UID: \"c455da32-2255-4caa-8734-33f188430025\") " pod="openstack/barbican-api-74bf7b4cc4-lpw7v" Dec 03 07:00:46 crc kubenswrapper[4475]: I1203 07:00:46.816806 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/11718fbb-3775-43f1-bd7f-d1ca4ad5800f-scripts\") pod \"cinder-scheduler-0\" (UID: \"11718fbb-3775-43f1-bd7f-d1ca4ad5800f\") " pod="openstack/cinder-scheduler-0" Dec 03 07:00:46 crc kubenswrapper[4475]: I1203 07:00:46.816825 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c455da32-2255-4caa-8734-33f188430025-combined-ca-bundle\") pod \"barbican-api-74bf7b4cc4-lpw7v\" (UID: \"c455da32-2255-4caa-8734-33f188430025\") " pod="openstack/barbican-api-74bf7b4cc4-lpw7v" Dec 03 07:00:46 crc kubenswrapper[4475]: I1203 07:00:46.816845 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: 
\"kubernetes.io/host-path/11718fbb-3775-43f1-bd7f-d1ca4ad5800f-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"11718fbb-3775-43f1-bd7f-d1ca4ad5800f\") " pod="openstack/cinder-scheduler-0" Dec 03 07:00:46 crc kubenswrapper[4475]: I1203 07:00:46.816869 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/11718fbb-3775-43f1-bd7f-d1ca4ad5800f-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"11718fbb-3775-43f1-bd7f-d1ca4ad5800f\") " pod="openstack/cinder-scheduler-0" Dec 03 07:00:46 crc kubenswrapper[4475]: I1203 07:00:46.816900 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c455da32-2255-4caa-8734-33f188430025-logs\") pod \"barbican-api-74bf7b4cc4-lpw7v\" (UID: \"c455da32-2255-4caa-8734-33f188430025\") " pod="openstack/barbican-api-74bf7b4cc4-lpw7v" Dec 03 07:00:46 crc kubenswrapper[4475]: I1203 07:00:46.816918 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rgbkn\" (UniqueName: \"kubernetes.io/projected/11718fbb-3775-43f1-bd7f-d1ca4ad5800f-kube-api-access-rgbkn\") pod \"cinder-scheduler-0\" (UID: \"11718fbb-3775-43f1-bd7f-d1ca4ad5800f\") " pod="openstack/cinder-scheduler-0" Dec 03 07:00:46 crc kubenswrapper[4475]: I1203 07:00:46.817167 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/11718fbb-3775-43f1-bd7f-d1ca4ad5800f-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"11718fbb-3775-43f1-bd7f-d1ca4ad5800f\") " pod="openstack/cinder-scheduler-0" Dec 03 07:00:46 crc kubenswrapper[4475]: I1203 07:00:46.821666 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/11718fbb-3775-43f1-bd7f-d1ca4ad5800f-config-data\") pod \"cinder-scheduler-0\" (UID: 
\"11718fbb-3775-43f1-bd7f-d1ca4ad5800f\") " pod="openstack/cinder-scheduler-0" Dec 03 07:00:46 crc kubenswrapper[4475]: I1203 07:00:46.823917 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/11718fbb-3775-43f1-bd7f-d1ca4ad5800f-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"11718fbb-3775-43f1-bd7f-d1ca4ad5800f\") " pod="openstack/cinder-scheduler-0" Dec 03 07:00:46 crc kubenswrapper[4475]: I1203 07:00:46.824468 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/11718fbb-3775-43f1-bd7f-d1ca4ad5800f-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"11718fbb-3775-43f1-bd7f-d1ca4ad5800f\") " pod="openstack/cinder-scheduler-0" Dec 03 07:00:46 crc kubenswrapper[4475]: I1203 07:00:46.826177 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/11718fbb-3775-43f1-bd7f-d1ca4ad5800f-scripts\") pod \"cinder-scheduler-0\" (UID: \"11718fbb-3775-43f1-bd7f-d1ca4ad5800f\") " pod="openstack/cinder-scheduler-0" Dec 03 07:00:46 crc kubenswrapper[4475]: I1203 07:00:46.829095 4475 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Dec 03 07:00:46 crc kubenswrapper[4475]: I1203 07:00:46.859963 4475 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-5967d8988f-ghdmf"] Dec 03 07:00:46 crc kubenswrapper[4475]: E1203 07:00:46.860319 4475 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="12f47969-3169-43bc-8d07-cbd3952d81cf" containerName="sg-core" Dec 03 07:00:46 crc kubenswrapper[4475]: I1203 07:00:46.860335 4475 state_mem.go:107] "Deleted CPUSet assignment" podUID="12f47969-3169-43bc-8d07-cbd3952d81cf" containerName="sg-core" Dec 03 07:00:46 crc kubenswrapper[4475]: E1203 07:00:46.860348 4475 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="12f47969-3169-43bc-8d07-cbd3952d81cf" containerName="proxy-httpd" Dec 03 07:00:46 crc kubenswrapper[4475]: I1203 07:00:46.860354 4475 state_mem.go:107] "Deleted CPUSet assignment" podUID="12f47969-3169-43bc-8d07-cbd3952d81cf" containerName="proxy-httpd" Dec 03 07:00:46 crc kubenswrapper[4475]: I1203 07:00:46.860525 4475 memory_manager.go:354] "RemoveStaleState removing state" podUID="12f47969-3169-43bc-8d07-cbd3952d81cf" containerName="proxy-httpd" Dec 03 07:00:46 crc kubenswrapper[4475]: I1203 07:00:46.860548 4475 memory_manager.go:354] "RemoveStaleState removing state" podUID="12f47969-3169-43bc-8d07-cbd3952d81cf" containerName="sg-core" Dec 03 07:00:46 crc kubenswrapper[4475]: I1203 07:00:46.861395 4475 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5967d8988f-ghdmf" Dec 03 07:00:46 crc kubenswrapper[4475]: I1203 07:00:46.872645 4475 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5967d8988f-ghdmf"] Dec 03 07:00:46 crc kubenswrapper[4475]: I1203 07:00:46.891514 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rgbkn\" (UniqueName: \"kubernetes.io/projected/11718fbb-3775-43f1-bd7f-d1ca4ad5800f-kube-api-access-rgbkn\") pod \"cinder-scheduler-0\" (UID: \"11718fbb-3775-43f1-bd7f-d1ca4ad5800f\") " pod="openstack/cinder-scheduler-0" Dec 03 07:00:46 crc kubenswrapper[4475]: I1203 07:00:46.922173 4475 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/12f47969-3169-43bc-8d07-cbd3952d81cf-scripts\") pod \"12f47969-3169-43bc-8d07-cbd3952d81cf\" (UID: \"12f47969-3169-43bc-8d07-cbd3952d81cf\") " Dec 03 07:00:46 crc kubenswrapper[4475]: I1203 07:00:46.922279 4475 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/12f47969-3169-43bc-8d07-cbd3952d81cf-combined-ca-bundle\") pod \"12f47969-3169-43bc-8d07-cbd3952d81cf\" (UID: \"12f47969-3169-43bc-8d07-cbd3952d81cf\") " Dec 03 07:00:46 crc kubenswrapper[4475]: I1203 07:00:46.922308 4475 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kn5xv\" (UniqueName: \"kubernetes.io/projected/12f47969-3169-43bc-8d07-cbd3952d81cf-kube-api-access-kn5xv\") pod \"12f47969-3169-43bc-8d07-cbd3952d81cf\" (UID: \"12f47969-3169-43bc-8d07-cbd3952d81cf\") " Dec 03 07:00:46 crc kubenswrapper[4475]: I1203 07:00:46.922330 4475 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/12f47969-3169-43bc-8d07-cbd3952d81cf-log-httpd\") pod \"12f47969-3169-43bc-8d07-cbd3952d81cf\" (UID: 
\"12f47969-3169-43bc-8d07-cbd3952d81cf\") " Dec 03 07:00:46 crc kubenswrapper[4475]: I1203 07:00:46.922357 4475 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/12f47969-3169-43bc-8d07-cbd3952d81cf-sg-core-conf-yaml\") pod \"12f47969-3169-43bc-8d07-cbd3952d81cf\" (UID: \"12f47969-3169-43bc-8d07-cbd3952d81cf\") " Dec 03 07:00:46 crc kubenswrapper[4475]: I1203 07:00:46.922425 4475 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/12f47969-3169-43bc-8d07-cbd3952d81cf-run-httpd\") pod \"12f47969-3169-43bc-8d07-cbd3952d81cf\" (UID: \"12f47969-3169-43bc-8d07-cbd3952d81cf\") " Dec 03 07:00:46 crc kubenswrapper[4475]: I1203 07:00:46.922508 4475 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/12f47969-3169-43bc-8d07-cbd3952d81cf-config-data\") pod \"12f47969-3169-43bc-8d07-cbd3952d81cf\" (UID: \"12f47969-3169-43bc-8d07-cbd3952d81cf\") " Dec 03 07:00:46 crc kubenswrapper[4475]: I1203 07:00:46.923190 4475 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/12f47969-3169-43bc-8d07-cbd3952d81cf-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "12f47969-3169-43bc-8d07-cbd3952d81cf" (UID: "12f47969-3169-43bc-8d07-cbd3952d81cf"). InnerVolumeSpecName "log-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 07:00:46 crc kubenswrapper[4475]: I1203 07:00:46.925336 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c455da32-2255-4caa-8734-33f188430025-combined-ca-bundle\") pod \"barbican-api-74bf7b4cc4-lpw7v\" (UID: \"c455da32-2255-4caa-8734-33f188430025\") " pod="openstack/barbican-api-74bf7b4cc4-lpw7v" Dec 03 07:00:46 crc kubenswrapper[4475]: I1203 07:00:46.925387 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/00f99628-c2ec-48c9-b266-0f14cbf05570-ovsdbserver-sb\") pod \"dnsmasq-dns-5967d8988f-ghdmf\" (UID: \"00f99628-c2ec-48c9-b266-0f14cbf05570\") " pod="openstack/dnsmasq-dns-5967d8988f-ghdmf" Dec 03 07:00:46 crc kubenswrapper[4475]: I1203 07:00:46.925406 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/00f99628-c2ec-48c9-b266-0f14cbf05570-config\") pod \"dnsmasq-dns-5967d8988f-ghdmf\" (UID: \"00f99628-c2ec-48c9-b266-0f14cbf05570\") " pod="openstack/dnsmasq-dns-5967d8988f-ghdmf" Dec 03 07:00:46 crc kubenswrapper[4475]: I1203 07:00:46.925481 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c455da32-2255-4caa-8734-33f188430025-logs\") pod \"barbican-api-74bf7b4cc4-lpw7v\" (UID: \"c455da32-2255-4caa-8734-33f188430025\") " pod="openstack/barbican-api-74bf7b4cc4-lpw7v" Dec 03 07:00:46 crc kubenswrapper[4475]: I1203 07:00:46.925548 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/00f99628-c2ec-48c9-b266-0f14cbf05570-ovsdbserver-nb\") pod \"dnsmasq-dns-5967d8988f-ghdmf\" (UID: \"00f99628-c2ec-48c9-b266-0f14cbf05570\") " 
pod="openstack/dnsmasq-dns-5967d8988f-ghdmf" Dec 03 07:00:46 crc kubenswrapper[4475]: I1203 07:00:46.925605 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mvwv6\" (UniqueName: \"kubernetes.io/projected/c455da32-2255-4caa-8734-33f188430025-kube-api-access-mvwv6\") pod \"barbican-api-74bf7b4cc4-lpw7v\" (UID: \"c455da32-2255-4caa-8734-33f188430025\") " pod="openstack/barbican-api-74bf7b4cc4-lpw7v" Dec 03 07:00:46 crc kubenswrapper[4475]: I1203 07:00:46.925648 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t5dzd\" (UniqueName: \"kubernetes.io/projected/00f99628-c2ec-48c9-b266-0f14cbf05570-kube-api-access-t5dzd\") pod \"dnsmasq-dns-5967d8988f-ghdmf\" (UID: \"00f99628-c2ec-48c9-b266-0f14cbf05570\") " pod="openstack/dnsmasq-dns-5967d8988f-ghdmf" Dec 03 07:00:46 crc kubenswrapper[4475]: I1203 07:00:46.925684 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/00f99628-c2ec-48c9-b266-0f14cbf05570-dns-swift-storage-0\") pod \"dnsmasq-dns-5967d8988f-ghdmf\" (UID: \"00f99628-c2ec-48c9-b266-0f14cbf05570\") " pod="openstack/dnsmasq-dns-5967d8988f-ghdmf" Dec 03 07:00:46 crc kubenswrapper[4475]: I1203 07:00:46.925711 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/c455da32-2255-4caa-8734-33f188430025-config-data-custom\") pod \"barbican-api-74bf7b4cc4-lpw7v\" (UID: \"c455da32-2255-4caa-8734-33f188430025\") " pod="openstack/barbican-api-74bf7b4cc4-lpw7v" Dec 03 07:00:46 crc kubenswrapper[4475]: I1203 07:00:46.925735 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/00f99628-c2ec-48c9-b266-0f14cbf05570-dns-svc\") pod \"dnsmasq-dns-5967d8988f-ghdmf\" (UID: 
\"00f99628-c2ec-48c9-b266-0f14cbf05570\") " pod="openstack/dnsmasq-dns-5967d8988f-ghdmf" Dec 03 07:00:46 crc kubenswrapper[4475]: I1203 07:00:46.925775 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c455da32-2255-4caa-8734-33f188430025-config-data\") pod \"barbican-api-74bf7b4cc4-lpw7v\" (UID: \"c455da32-2255-4caa-8734-33f188430025\") " pod="openstack/barbican-api-74bf7b4cc4-lpw7v" Dec 03 07:00:46 crc kubenswrapper[4475]: I1203 07:00:46.925853 4475 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/12f47969-3169-43bc-8d07-cbd3952d81cf-log-httpd\") on node \"crc\" DevicePath \"\"" Dec 03 07:00:46 crc kubenswrapper[4475]: I1203 07:00:46.928043 4475 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/12f47969-3169-43bc-8d07-cbd3952d81cf-scripts" (OuterVolumeSpecName: "scripts") pod "12f47969-3169-43bc-8d07-cbd3952d81cf" (UID: "12f47969-3169-43bc-8d07-cbd3952d81cf"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 07:00:46 crc kubenswrapper[4475]: I1203 07:00:46.928308 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c455da32-2255-4caa-8734-33f188430025-logs\") pod \"barbican-api-74bf7b4cc4-lpw7v\" (UID: \"c455da32-2255-4caa-8734-33f188430025\") " pod="openstack/barbican-api-74bf7b4cc4-lpw7v" Dec 03 07:00:46 crc kubenswrapper[4475]: I1203 07:00:46.932125 4475 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/12f47969-3169-43bc-8d07-cbd3952d81cf-kube-api-access-kn5xv" (OuterVolumeSpecName: "kube-api-access-kn5xv") pod "12f47969-3169-43bc-8d07-cbd3952d81cf" (UID: "12f47969-3169-43bc-8d07-cbd3952d81cf"). InnerVolumeSpecName "kube-api-access-kn5xv". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 07:00:46 crc kubenswrapper[4475]: I1203 07:00:46.933273 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c455da32-2255-4caa-8734-33f188430025-config-data\") pod \"barbican-api-74bf7b4cc4-lpw7v\" (UID: \"c455da32-2255-4caa-8734-33f188430025\") " pod="openstack/barbican-api-74bf7b4cc4-lpw7v" Dec 03 07:00:46 crc kubenswrapper[4475]: I1203 07:00:46.937849 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c455da32-2255-4caa-8734-33f188430025-combined-ca-bundle\") pod \"barbican-api-74bf7b4cc4-lpw7v\" (UID: \"c455da32-2255-4caa-8734-33f188430025\") " pod="openstack/barbican-api-74bf7b4cc4-lpw7v" Dec 03 07:00:46 crc kubenswrapper[4475]: I1203 07:00:46.938129 4475 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/12f47969-3169-43bc-8d07-cbd3952d81cf-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "12f47969-3169-43bc-8d07-cbd3952d81cf" (UID: "12f47969-3169-43bc-8d07-cbd3952d81cf"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 07:00:46 crc kubenswrapper[4475]: I1203 07:00:46.949057 4475 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-scheduler-0" Dec 03 07:00:46 crc kubenswrapper[4475]: I1203 07:00:46.949745 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/c455da32-2255-4caa-8734-33f188430025-config-data-custom\") pod \"barbican-api-74bf7b4cc4-lpw7v\" (UID: \"c455da32-2255-4caa-8734-33f188430025\") " pod="openstack/barbican-api-74bf7b4cc4-lpw7v" Dec 03 07:00:46 crc kubenswrapper[4475]: I1203 07:00:46.981657 4475 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/12f47969-3169-43bc-8d07-cbd3952d81cf-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "12f47969-3169-43bc-8d07-cbd3952d81cf" (UID: "12f47969-3169-43bc-8d07-cbd3952d81cf"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 07:00:46 crc kubenswrapper[4475]: I1203 07:00:46.982137 4475 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-bk4n9" podUID="cd709367-66b2-4586-b3ba-424d4c1532ee" containerName="registry-server" probeResult="failure" output=< Dec 03 07:00:46 crc kubenswrapper[4475]: timeout: failed to connect service ":50051" within 1s Dec 03 07:00:46 crc kubenswrapper[4475]: > Dec 03 07:00:46 crc kubenswrapper[4475]: I1203 07:00:46.988773 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mvwv6\" (UniqueName: \"kubernetes.io/projected/c455da32-2255-4caa-8734-33f188430025-kube-api-access-mvwv6\") pod \"barbican-api-74bf7b4cc4-lpw7v\" (UID: \"c455da32-2255-4caa-8734-33f188430025\") " pod="openstack/barbican-api-74bf7b4cc4-lpw7v" Dec 03 07:00:47 crc kubenswrapper[4475]: I1203 07:00:47.001658 4475 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-api-74bf7b4cc4-lpw7v" Dec 03 07:00:47 crc kubenswrapper[4475]: I1203 07:00:47.002710 4475 generic.go:334] "Generic (PLEG): container finished" podID="12f47969-3169-43bc-8d07-cbd3952d81cf" containerID="17f255f8052085fee0b8356471d6fb5227d356feabddde30f1a623a8125e1039" exitCode=0 Dec 03 07:00:47 crc kubenswrapper[4475]: I1203 07:00:47.002732 4475 generic.go:334] "Generic (PLEG): container finished" podID="12f47969-3169-43bc-8d07-cbd3952d81cf" containerID="f0a7feb56b6dec7c49849c0c810bfe251e8c688d3f3e7551f990b104406979ef" exitCode=2 Dec 03 07:00:47 crc kubenswrapper[4475]: I1203 07:00:47.002767 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"12f47969-3169-43bc-8d07-cbd3952d81cf","Type":"ContainerDied","Data":"17f255f8052085fee0b8356471d6fb5227d356feabddde30f1a623a8125e1039"} Dec 03 07:00:47 crc kubenswrapper[4475]: I1203 07:00:47.002790 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"12f47969-3169-43bc-8d07-cbd3952d81cf","Type":"ContainerDied","Data":"f0a7feb56b6dec7c49849c0c810bfe251e8c688d3f3e7551f990b104406979ef"} Dec 03 07:00:47 crc kubenswrapper[4475]: I1203 07:00:47.002799 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"12f47969-3169-43bc-8d07-cbd3952d81cf","Type":"ContainerDied","Data":"3bf00b5dcf3a8d6ce59e1bf97626dae6e3be34fadab64da4a55617de512b9a49"} Dec 03 07:00:47 crc kubenswrapper[4475]: I1203 07:00:47.002811 4475 scope.go:117] "RemoveContainer" containerID="17f255f8052085fee0b8356471d6fb5227d356feabddde30f1a623a8125e1039" Dec 03 07:00:47 crc kubenswrapper[4475]: I1203 07:00:47.002907 4475 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Dec 03 07:00:47 crc kubenswrapper[4475]: I1203 07:00:47.031856 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/00f99628-c2ec-48c9-b266-0f14cbf05570-ovsdbserver-sb\") pod \"dnsmasq-dns-5967d8988f-ghdmf\" (UID: \"00f99628-c2ec-48c9-b266-0f14cbf05570\") " pod="openstack/dnsmasq-dns-5967d8988f-ghdmf" Dec 03 07:00:47 crc kubenswrapper[4475]: I1203 07:00:47.031888 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/00f99628-c2ec-48c9-b266-0f14cbf05570-config\") pod \"dnsmasq-dns-5967d8988f-ghdmf\" (UID: \"00f99628-c2ec-48c9-b266-0f14cbf05570\") " pod="openstack/dnsmasq-dns-5967d8988f-ghdmf" Dec 03 07:00:47 crc kubenswrapper[4475]: I1203 07:00:47.031944 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/00f99628-c2ec-48c9-b266-0f14cbf05570-ovsdbserver-nb\") pod \"dnsmasq-dns-5967d8988f-ghdmf\" (UID: \"00f99628-c2ec-48c9-b266-0f14cbf05570\") " pod="openstack/dnsmasq-dns-5967d8988f-ghdmf" Dec 03 07:00:47 crc kubenswrapper[4475]: I1203 07:00:47.032006 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-t5dzd\" (UniqueName: \"kubernetes.io/projected/00f99628-c2ec-48c9-b266-0f14cbf05570-kube-api-access-t5dzd\") pod \"dnsmasq-dns-5967d8988f-ghdmf\" (UID: \"00f99628-c2ec-48c9-b266-0f14cbf05570\") " pod="openstack/dnsmasq-dns-5967d8988f-ghdmf" Dec 03 07:00:47 crc kubenswrapper[4475]: I1203 07:00:47.032031 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/00f99628-c2ec-48c9-b266-0f14cbf05570-dns-swift-storage-0\") pod \"dnsmasq-dns-5967d8988f-ghdmf\" (UID: \"00f99628-c2ec-48c9-b266-0f14cbf05570\") " 
pod="openstack/dnsmasq-dns-5967d8988f-ghdmf" Dec 03 07:00:47 crc kubenswrapper[4475]: I1203 07:00:47.032050 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/00f99628-c2ec-48c9-b266-0f14cbf05570-dns-svc\") pod \"dnsmasq-dns-5967d8988f-ghdmf\" (UID: \"00f99628-c2ec-48c9-b266-0f14cbf05570\") " pod="openstack/dnsmasq-dns-5967d8988f-ghdmf" Dec 03 07:00:47 crc kubenswrapper[4475]: I1203 07:00:47.032112 4475 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/12f47969-3169-43bc-8d07-cbd3952d81cf-run-httpd\") on node \"crc\" DevicePath \"\"" Dec 03 07:00:47 crc kubenswrapper[4475]: I1203 07:00:47.032122 4475 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/12f47969-3169-43bc-8d07-cbd3952d81cf-scripts\") on node \"crc\" DevicePath \"\"" Dec 03 07:00:47 crc kubenswrapper[4475]: I1203 07:00:47.032131 4475 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kn5xv\" (UniqueName: \"kubernetes.io/projected/12f47969-3169-43bc-8d07-cbd3952d81cf-kube-api-access-kn5xv\") on node \"crc\" DevicePath \"\"" Dec 03 07:00:47 crc kubenswrapper[4475]: I1203 07:00:47.032139 4475 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/12f47969-3169-43bc-8d07-cbd3952d81cf-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Dec 03 07:00:47 crc kubenswrapper[4475]: I1203 07:00:47.032888 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/00f99628-c2ec-48c9-b266-0f14cbf05570-dns-svc\") pod \"dnsmasq-dns-5967d8988f-ghdmf\" (UID: \"00f99628-c2ec-48c9-b266-0f14cbf05570\") " pod="openstack/dnsmasq-dns-5967d8988f-ghdmf" Dec 03 07:00:47 crc kubenswrapper[4475]: I1203 07:00:47.032952 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" 
(UniqueName: \"kubernetes.io/configmap/00f99628-c2ec-48c9-b266-0f14cbf05570-config\") pod \"dnsmasq-dns-5967d8988f-ghdmf\" (UID: \"00f99628-c2ec-48c9-b266-0f14cbf05570\") " pod="openstack/dnsmasq-dns-5967d8988f-ghdmf" Dec 03 07:00:47 crc kubenswrapper[4475]: I1203 07:00:47.033734 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/00f99628-c2ec-48c9-b266-0f14cbf05570-ovsdbserver-nb\") pod \"dnsmasq-dns-5967d8988f-ghdmf\" (UID: \"00f99628-c2ec-48c9-b266-0f14cbf05570\") " pod="openstack/dnsmasq-dns-5967d8988f-ghdmf" Dec 03 07:00:47 crc kubenswrapper[4475]: I1203 07:00:47.033812 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/00f99628-c2ec-48c9-b266-0f14cbf05570-ovsdbserver-sb\") pod \"dnsmasq-dns-5967d8988f-ghdmf\" (UID: \"00f99628-c2ec-48c9-b266-0f14cbf05570\") " pod="openstack/dnsmasq-dns-5967d8988f-ghdmf" Dec 03 07:00:47 crc kubenswrapper[4475]: I1203 07:00:47.034436 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/00f99628-c2ec-48c9-b266-0f14cbf05570-dns-swift-storage-0\") pod \"dnsmasq-dns-5967d8988f-ghdmf\" (UID: \"00f99628-c2ec-48c9-b266-0f14cbf05570\") " pod="openstack/dnsmasq-dns-5967d8988f-ghdmf" Dec 03 07:00:47 crc kubenswrapper[4475]: I1203 07:00:47.050071 4475 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/12f47969-3169-43bc-8d07-cbd3952d81cf-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "12f47969-3169-43bc-8d07-cbd3952d81cf" (UID: "12f47969-3169-43bc-8d07-cbd3952d81cf"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 07:00:47 crc kubenswrapper[4475]: I1203 07:00:47.073560 4475 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-api-0"] Dec 03 07:00:47 crc kubenswrapper[4475]: I1203 07:00:47.085240 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-t5dzd\" (UniqueName: \"kubernetes.io/projected/00f99628-c2ec-48c9-b266-0f14cbf05570-kube-api-access-t5dzd\") pod \"dnsmasq-dns-5967d8988f-ghdmf\" (UID: \"00f99628-c2ec-48c9-b266-0f14cbf05570\") " pod="openstack/dnsmasq-dns-5967d8988f-ghdmf" Dec 03 07:00:47 crc kubenswrapper[4475]: I1203 07:00:47.085967 4475 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0" Dec 03 07:00:47 crc kubenswrapper[4475]: I1203 07:00:47.094033 4475 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-api-config-data" Dec 03 07:00:47 crc kubenswrapper[4475]: I1203 07:00:47.099437 4475 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"] Dec 03 07:00:47 crc kubenswrapper[4475]: I1203 07:00:47.129755 4475 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/12f47969-3169-43bc-8d07-cbd3952d81cf-config-data" (OuterVolumeSpecName: "config-data") pod "12f47969-3169-43bc-8d07-cbd3952d81cf" (UID: "12f47969-3169-43bc-8d07-cbd3952d81cf"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 07:00:47 crc kubenswrapper[4475]: I1203 07:00:47.133543 4475 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/12f47969-3169-43bc-8d07-cbd3952d81cf-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 03 07:00:47 crc kubenswrapper[4475]: I1203 07:00:47.133573 4475 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/12f47969-3169-43bc-8d07-cbd3952d81cf-config-data\") on node \"crc\" DevicePath \"\"" Dec 03 07:00:47 crc kubenswrapper[4475]: I1203 07:00:47.194759 4475 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5967d8988f-ghdmf" Dec 03 07:00:47 crc kubenswrapper[4475]: I1203 07:00:47.241127 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ffaaa182-c947-4cbd-b96d-378fae973360-scripts\") pod \"cinder-api-0\" (UID: \"ffaaa182-c947-4cbd-b96d-378fae973360\") " pod="openstack/cinder-api-0" Dec 03 07:00:47 crc kubenswrapper[4475]: I1203 07:00:47.241429 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ffaaa182-c947-4cbd-b96d-378fae973360-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"ffaaa182-c947-4cbd-b96d-378fae973360\") " pod="openstack/cinder-api-0" Dec 03 07:00:47 crc kubenswrapper[4475]: I1203 07:00:47.241530 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/ffaaa182-c947-4cbd-b96d-378fae973360-config-data-custom\") pod \"cinder-api-0\" (UID: \"ffaaa182-c947-4cbd-b96d-378fae973360\") " pod="openstack/cinder-api-0" Dec 03 07:00:47 crc kubenswrapper[4475]: I1203 07:00:47.241704 4475 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sqqkk\" (UniqueName: \"kubernetes.io/projected/ffaaa182-c947-4cbd-b96d-378fae973360-kube-api-access-sqqkk\") pod \"cinder-api-0\" (UID: \"ffaaa182-c947-4cbd-b96d-378fae973360\") " pod="openstack/cinder-api-0" Dec 03 07:00:47 crc kubenswrapper[4475]: I1203 07:00:47.241739 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/ffaaa182-c947-4cbd-b96d-378fae973360-etc-machine-id\") pod \"cinder-api-0\" (UID: \"ffaaa182-c947-4cbd-b96d-378fae973360\") " pod="openstack/cinder-api-0" Dec 03 07:00:47 crc kubenswrapper[4475]: I1203 07:00:47.241823 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ffaaa182-c947-4cbd-b96d-378fae973360-config-data\") pod \"cinder-api-0\" (UID: \"ffaaa182-c947-4cbd-b96d-378fae973360\") " pod="openstack/cinder-api-0" Dec 03 07:00:47 crc kubenswrapper[4475]: I1203 07:00:47.241870 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ffaaa182-c947-4cbd-b96d-378fae973360-logs\") pod \"cinder-api-0\" (UID: \"ffaaa182-c947-4cbd-b96d-378fae973360\") " pod="openstack/cinder-api-0" Dec 03 07:00:47 crc kubenswrapper[4475]: I1203 07:00:47.283096 4475 scope.go:117] "RemoveContainer" containerID="f0a7feb56b6dec7c49849c0c810bfe251e8c688d3f3e7551f990b104406979ef" Dec 03 07:00:47 crc kubenswrapper[4475]: I1203 07:00:47.339336 4475 scope.go:117] "RemoveContainer" containerID="17f255f8052085fee0b8356471d6fb5227d356feabddde30f1a623a8125e1039" Dec 03 07:00:47 crc kubenswrapper[4475]: E1203 07:00:47.340131 4475 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"17f255f8052085fee0b8356471d6fb5227d356feabddde30f1a623a8125e1039\": container with ID starting with 17f255f8052085fee0b8356471d6fb5227d356feabddde30f1a623a8125e1039 not found: ID does not exist" containerID="17f255f8052085fee0b8356471d6fb5227d356feabddde30f1a623a8125e1039" Dec 03 07:00:47 crc kubenswrapper[4475]: I1203 07:00:47.340191 4475 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"17f255f8052085fee0b8356471d6fb5227d356feabddde30f1a623a8125e1039"} err="failed to get container status \"17f255f8052085fee0b8356471d6fb5227d356feabddde30f1a623a8125e1039\": rpc error: code = NotFound desc = could not find container \"17f255f8052085fee0b8356471d6fb5227d356feabddde30f1a623a8125e1039\": container with ID starting with 17f255f8052085fee0b8356471d6fb5227d356feabddde30f1a623a8125e1039 not found: ID does not exist" Dec 03 07:00:47 crc kubenswrapper[4475]: I1203 07:00:47.340215 4475 scope.go:117] "RemoveContainer" containerID="f0a7feb56b6dec7c49849c0c810bfe251e8c688d3f3e7551f990b104406979ef" Dec 03 07:00:47 crc kubenswrapper[4475]: I1203 07:00:47.344378 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ffaaa182-c947-4cbd-b96d-378fae973360-scripts\") pod \"cinder-api-0\" (UID: \"ffaaa182-c947-4cbd-b96d-378fae973360\") " pod="openstack/cinder-api-0" Dec 03 07:00:47 crc kubenswrapper[4475]: I1203 07:00:47.344432 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ffaaa182-c947-4cbd-b96d-378fae973360-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"ffaaa182-c947-4cbd-b96d-378fae973360\") " pod="openstack/cinder-api-0" Dec 03 07:00:47 crc kubenswrapper[4475]: I1203 07:00:47.344485 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: 
\"kubernetes.io/secret/ffaaa182-c947-4cbd-b96d-378fae973360-config-data-custom\") pod \"cinder-api-0\" (UID: \"ffaaa182-c947-4cbd-b96d-378fae973360\") " pod="openstack/cinder-api-0" Dec 03 07:00:47 crc kubenswrapper[4475]: I1203 07:00:47.344559 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sqqkk\" (UniqueName: \"kubernetes.io/projected/ffaaa182-c947-4cbd-b96d-378fae973360-kube-api-access-sqqkk\") pod \"cinder-api-0\" (UID: \"ffaaa182-c947-4cbd-b96d-378fae973360\") " pod="openstack/cinder-api-0" Dec 03 07:00:47 crc kubenswrapper[4475]: I1203 07:00:47.344614 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/ffaaa182-c947-4cbd-b96d-378fae973360-etc-machine-id\") pod \"cinder-api-0\" (UID: \"ffaaa182-c947-4cbd-b96d-378fae973360\") " pod="openstack/cinder-api-0" Dec 03 07:00:47 crc kubenswrapper[4475]: I1203 07:00:47.344721 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ffaaa182-c947-4cbd-b96d-378fae973360-config-data\") pod \"cinder-api-0\" (UID: \"ffaaa182-c947-4cbd-b96d-378fae973360\") " pod="openstack/cinder-api-0" Dec 03 07:00:47 crc kubenswrapper[4475]: I1203 07:00:47.346902 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/ffaaa182-c947-4cbd-b96d-378fae973360-etc-machine-id\") pod \"cinder-api-0\" (UID: \"ffaaa182-c947-4cbd-b96d-378fae973360\") " pod="openstack/cinder-api-0" Dec 03 07:00:47 crc kubenswrapper[4475]: I1203 07:00:47.350490 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ffaaa182-c947-4cbd-b96d-378fae973360-logs\") pod \"cinder-api-0\" (UID: \"ffaaa182-c947-4cbd-b96d-378fae973360\") " pod="openstack/cinder-api-0" Dec 03 07:00:47 crc kubenswrapper[4475]: I1203 
07:00:47.351052 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ffaaa182-c947-4cbd-b96d-378fae973360-logs\") pod \"cinder-api-0\" (UID: \"ffaaa182-c947-4cbd-b96d-378fae973360\") " pod="openstack/cinder-api-0" Dec 03 07:00:47 crc kubenswrapper[4475]: I1203 07:00:47.353009 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ffaaa182-c947-4cbd-b96d-378fae973360-scripts\") pod \"cinder-api-0\" (UID: \"ffaaa182-c947-4cbd-b96d-378fae973360\") " pod="openstack/cinder-api-0" Dec 03 07:00:47 crc kubenswrapper[4475]: I1203 07:00:47.353510 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/ffaaa182-c947-4cbd-b96d-378fae973360-config-data-custom\") pod \"cinder-api-0\" (UID: \"ffaaa182-c947-4cbd-b96d-378fae973360\") " pod="openstack/cinder-api-0" Dec 03 07:00:47 crc kubenswrapper[4475]: E1203 07:00:47.359438 4475 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f0a7feb56b6dec7c49849c0c810bfe251e8c688d3f3e7551f990b104406979ef\": container with ID starting with f0a7feb56b6dec7c49849c0c810bfe251e8c688d3f3e7551f990b104406979ef not found: ID does not exist" containerID="f0a7feb56b6dec7c49849c0c810bfe251e8c688d3f3e7551f990b104406979ef" Dec 03 07:00:47 crc kubenswrapper[4475]: I1203 07:00:47.359564 4475 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f0a7feb56b6dec7c49849c0c810bfe251e8c688d3f3e7551f990b104406979ef"} err="failed to get container status \"f0a7feb56b6dec7c49849c0c810bfe251e8c688d3f3e7551f990b104406979ef\": rpc error: code = NotFound desc = could not find container \"f0a7feb56b6dec7c49849c0c810bfe251e8c688d3f3e7551f990b104406979ef\": container with ID starting with f0a7feb56b6dec7c49849c0c810bfe251e8c688d3f3e7551f990b104406979ef not 
found: ID does not exist" Dec 03 07:00:47 crc kubenswrapper[4475]: I1203 07:00:47.359651 4475 scope.go:117] "RemoveContainer" containerID="17f255f8052085fee0b8356471d6fb5227d356feabddde30f1a623a8125e1039" Dec 03 07:00:47 crc kubenswrapper[4475]: I1203 07:00:47.359820 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ffaaa182-c947-4cbd-b96d-378fae973360-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"ffaaa182-c947-4cbd-b96d-378fae973360\") " pod="openstack/cinder-api-0" Dec 03 07:00:47 crc kubenswrapper[4475]: I1203 07:00:47.360105 4475 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"17f255f8052085fee0b8356471d6fb5227d356feabddde30f1a623a8125e1039"} err="failed to get container status \"17f255f8052085fee0b8356471d6fb5227d356feabddde30f1a623a8125e1039\": rpc error: code = NotFound desc = could not find container \"17f255f8052085fee0b8356471d6fb5227d356feabddde30f1a623a8125e1039\": container with ID starting with 17f255f8052085fee0b8356471d6fb5227d356feabddde30f1a623a8125e1039 not found: ID does not exist" Dec 03 07:00:47 crc kubenswrapper[4475]: I1203 07:00:47.360189 4475 scope.go:117] "RemoveContainer" containerID="f0a7feb56b6dec7c49849c0c810bfe251e8c688d3f3e7551f990b104406979ef" Dec 03 07:00:47 crc kubenswrapper[4475]: I1203 07:00:47.360442 4475 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f0a7feb56b6dec7c49849c0c810bfe251e8c688d3f3e7551f990b104406979ef"} err="failed to get container status \"f0a7feb56b6dec7c49849c0c810bfe251e8c688d3f3e7551f990b104406979ef\": rpc error: code = NotFound desc = could not find container \"f0a7feb56b6dec7c49849c0c810bfe251e8c688d3f3e7551f990b104406979ef\": container with ID starting with f0a7feb56b6dec7c49849c0c810bfe251e8c688d3f3e7551f990b104406979ef not found: ID does not exist" Dec 03 07:00:47 crc kubenswrapper[4475]: I1203 07:00:47.364394 4475 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ffaaa182-c947-4cbd-b96d-378fae973360-config-data\") pod \"cinder-api-0\" (UID: \"ffaaa182-c947-4cbd-b96d-378fae973360\") " pod="openstack/cinder-api-0" Dec 03 07:00:47 crc kubenswrapper[4475]: I1203 07:00:47.387501 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sqqkk\" (UniqueName: \"kubernetes.io/projected/ffaaa182-c947-4cbd-b96d-378fae973360-kube-api-access-sqqkk\") pod \"cinder-api-0\" (UID: \"ffaaa182-c947-4cbd-b96d-378fae973360\") " pod="openstack/cinder-api-0" Dec 03 07:00:47 crc kubenswrapper[4475]: I1203 07:00:47.425511 4475 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Dec 03 07:00:47 crc kubenswrapper[4475]: I1203 07:00:47.436608 4475 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Dec 03 07:00:47 crc kubenswrapper[4475]: I1203 07:00:47.546186 4475 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-api-0" Dec 03 07:00:47 crc kubenswrapper[4475]: I1203 07:00:47.551201 4475 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="12f47969-3169-43bc-8d07-cbd3952d81cf" path="/var/lib/kubelet/pods/12f47969-3169-43bc-8d07-cbd3952d81cf/volumes" Dec 03 07:00:47 crc kubenswrapper[4475]: I1203 07:00:47.557223 4475 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="905141fd-3de4-45c4-bffb-45934f8ea6d3" path="/var/lib/kubelet/pods/905141fd-3de4-45c4-bffb-45934f8ea6d3/volumes" Dec 03 07:00:47 crc kubenswrapper[4475]: I1203 07:00:47.558162 4475 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d620fdbb-7e93-46f8-95e1-18c9f9aab8f0" path="/var/lib/kubelet/pods/d620fdbb-7e93-46f8-95e1-18c9f9aab8f0/volumes" Dec 03 07:00:47 crc kubenswrapper[4475]: I1203 07:00:47.569060 4475 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Dec 03 07:00:47 crc kubenswrapper[4475]: I1203 07:00:47.572466 4475 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Dec 03 07:00:47 crc kubenswrapper[4475]: I1203 07:00:47.572640 4475 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Dec 03 07:00:47 crc kubenswrapper[4475]: I1203 07:00:47.577270 4475 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Dec 03 07:00:47 crc kubenswrapper[4475]: I1203 07:00:47.577865 4475 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Dec 03 07:00:47 crc kubenswrapper[4475]: I1203 07:00:47.586495 4475 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-65d6db7cdf-klwck"] Dec 03 07:00:47 crc kubenswrapper[4475]: I1203 07:00:47.598481 4475 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-worker-7c975977bc-7bx2h"] Dec 03 07:00:47 crc kubenswrapper[4475]: I1203 07:00:47.682079 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/13ed03b8-8758-4c25-b37f-2793697026d2-log-httpd\") pod \"ceilometer-0\" (UID: \"13ed03b8-8758-4c25-b37f-2793697026d2\") " pod="openstack/ceilometer-0" Dec 03 07:00:47 crc kubenswrapper[4475]: I1203 07:00:47.682116 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/13ed03b8-8758-4c25-b37f-2793697026d2-config-data\") pod \"ceilometer-0\" (UID: \"13ed03b8-8758-4c25-b37f-2793697026d2\") " pod="openstack/ceilometer-0" Dec 03 07:00:47 crc kubenswrapper[4475]: I1203 07:00:47.682141 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7qq58\" (UniqueName: \"kubernetes.io/projected/13ed03b8-8758-4c25-b37f-2793697026d2-kube-api-access-7qq58\") pod \"ceilometer-0\" (UID: \"13ed03b8-8758-4c25-b37f-2793697026d2\") " pod="openstack/ceilometer-0" Dec 03 07:00:47 crc kubenswrapper[4475]: I1203 07:00:47.682158 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/13ed03b8-8758-4c25-b37f-2793697026d2-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"13ed03b8-8758-4c25-b37f-2793697026d2\") " pod="openstack/ceilometer-0" Dec 03 07:00:47 crc kubenswrapper[4475]: I1203 07:00:47.682188 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/13ed03b8-8758-4c25-b37f-2793697026d2-scripts\") pod \"ceilometer-0\" (UID: \"13ed03b8-8758-4c25-b37f-2793697026d2\") " pod="openstack/ceilometer-0" Dec 03 07:00:47 crc kubenswrapper[4475]: I1203 07:00:47.682203 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/13ed03b8-8758-4c25-b37f-2793697026d2-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"13ed03b8-8758-4c25-b37f-2793697026d2\") " pod="openstack/ceilometer-0" Dec 03 07:00:47 crc kubenswrapper[4475]: I1203 07:00:47.682250 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/13ed03b8-8758-4c25-b37f-2793697026d2-run-httpd\") pod \"ceilometer-0\" (UID: \"13ed03b8-8758-4c25-b37f-2793697026d2\") " pod="openstack/ceilometer-0" Dec 03 07:00:47 crc kubenswrapper[4475]: I1203 07:00:47.783636 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/13ed03b8-8758-4c25-b37f-2793697026d2-log-httpd\") pod \"ceilometer-0\" (UID: \"13ed03b8-8758-4c25-b37f-2793697026d2\") " pod="openstack/ceilometer-0" Dec 03 07:00:47 crc kubenswrapper[4475]: I1203 07:00:47.783676 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/13ed03b8-8758-4c25-b37f-2793697026d2-config-data\") pod \"ceilometer-0\" (UID: \"13ed03b8-8758-4c25-b37f-2793697026d2\") " 
pod="openstack/ceilometer-0" Dec 03 07:00:47 crc kubenswrapper[4475]: I1203 07:00:47.783708 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7qq58\" (UniqueName: \"kubernetes.io/projected/13ed03b8-8758-4c25-b37f-2793697026d2-kube-api-access-7qq58\") pod \"ceilometer-0\" (UID: \"13ed03b8-8758-4c25-b37f-2793697026d2\") " pod="openstack/ceilometer-0" Dec 03 07:00:47 crc kubenswrapper[4475]: I1203 07:00:47.783729 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/13ed03b8-8758-4c25-b37f-2793697026d2-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"13ed03b8-8758-4c25-b37f-2793697026d2\") " pod="openstack/ceilometer-0" Dec 03 07:00:47 crc kubenswrapper[4475]: I1203 07:00:47.783789 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/13ed03b8-8758-4c25-b37f-2793697026d2-scripts\") pod \"ceilometer-0\" (UID: \"13ed03b8-8758-4c25-b37f-2793697026d2\") " pod="openstack/ceilometer-0" Dec 03 07:00:47 crc kubenswrapper[4475]: I1203 07:00:47.783809 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/13ed03b8-8758-4c25-b37f-2793697026d2-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"13ed03b8-8758-4c25-b37f-2793697026d2\") " pod="openstack/ceilometer-0" Dec 03 07:00:47 crc kubenswrapper[4475]: I1203 07:00:47.783901 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/13ed03b8-8758-4c25-b37f-2793697026d2-run-httpd\") pod \"ceilometer-0\" (UID: \"13ed03b8-8758-4c25-b37f-2793697026d2\") " pod="openstack/ceilometer-0" Dec 03 07:00:47 crc kubenswrapper[4475]: I1203 07:00:47.784010 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: 
\"kubernetes.io/empty-dir/13ed03b8-8758-4c25-b37f-2793697026d2-log-httpd\") pod \"ceilometer-0\" (UID: \"13ed03b8-8758-4c25-b37f-2793697026d2\") " pod="openstack/ceilometer-0" Dec 03 07:00:47 crc kubenswrapper[4475]: I1203 07:00:47.784240 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/13ed03b8-8758-4c25-b37f-2793697026d2-run-httpd\") pod \"ceilometer-0\" (UID: \"13ed03b8-8758-4c25-b37f-2793697026d2\") " pod="openstack/ceilometer-0" Dec 03 07:00:47 crc kubenswrapper[4475]: I1203 07:00:47.796637 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/13ed03b8-8758-4c25-b37f-2793697026d2-config-data\") pod \"ceilometer-0\" (UID: \"13ed03b8-8758-4c25-b37f-2793697026d2\") " pod="openstack/ceilometer-0" Dec 03 07:00:47 crc kubenswrapper[4475]: I1203 07:00:47.797013 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/13ed03b8-8758-4c25-b37f-2793697026d2-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"13ed03b8-8758-4c25-b37f-2793697026d2\") " pod="openstack/ceilometer-0" Dec 03 07:00:47 crc kubenswrapper[4475]: I1203 07:00:47.797590 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/13ed03b8-8758-4c25-b37f-2793697026d2-scripts\") pod \"ceilometer-0\" (UID: \"13ed03b8-8758-4c25-b37f-2793697026d2\") " pod="openstack/ceilometer-0" Dec 03 07:00:47 crc kubenswrapper[4475]: I1203 07:00:47.797739 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/13ed03b8-8758-4c25-b37f-2793697026d2-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"13ed03b8-8758-4c25-b37f-2793697026d2\") " pod="openstack/ceilometer-0" Dec 03 07:00:47 crc kubenswrapper[4475]: I1203 07:00:47.804899 4475 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-7qq58\" (UniqueName: \"kubernetes.io/projected/13ed03b8-8758-4c25-b37f-2793697026d2-kube-api-access-7qq58\") pod \"ceilometer-0\" (UID: \"13ed03b8-8758-4c25-b37f-2793697026d2\") " pod="openstack/ceilometer-0" Dec 03 07:00:47 crc kubenswrapper[4475]: I1203 07:00:47.815701 4475 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-keystone-listener-6ffbb57cbb-mczq9"] Dec 03 07:00:47 crc kubenswrapper[4475]: I1203 07:00:47.901219 4475 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Dec 03 07:00:47 crc kubenswrapper[4475]: I1203 07:00:47.930434 4475 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-74bf7b4cc4-lpw7v"] Dec 03 07:00:47 crc kubenswrapper[4475]: W1203 07:00:47.940870 4475 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc455da32_2255_4caa_8734_33f188430025.slice/crio-2143e919960089e8af4dbf03aab11a09ee9e49002439708ae9c9c6f106a452f7 WatchSource:0}: Error finding container 2143e919960089e8af4dbf03aab11a09ee9e49002439708ae9c9c6f106a452f7: Status 404 returned error can't find the container with id 2143e919960089e8af4dbf03aab11a09ee9e49002439708ae9c9c6f106a452f7 Dec 03 07:00:48 crc kubenswrapper[4475]: I1203 07:00:48.053962 4475 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"] Dec 03 07:00:48 crc kubenswrapper[4475]: I1203 07:00:48.065479 4475 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5967d8988f-ghdmf"] Dec 03 07:00:48 crc kubenswrapper[4475]: I1203 07:00:48.076845 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-74bf7b4cc4-lpw7v" event={"ID":"c455da32-2255-4caa-8734-33f188430025","Type":"ContainerStarted","Data":"2143e919960089e8af4dbf03aab11a09ee9e49002439708ae9c9c6f106a452f7"} Dec 03 07:00:48 crc kubenswrapper[4475]: 
I1203 07:00:48.078094 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-65d6db7cdf-klwck" event={"ID":"6001f40d-4ed3-4baa-a858-80cfbe5bcf7d","Type":"ContainerStarted","Data":"ca56a6f0e168e7fdd7a30ecdd79a38ee03e9370ef169c15e85ccb4fd781e1a03"} Dec 03 07:00:48 crc kubenswrapper[4475]: W1203 07:00:48.090350 4475 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod11718fbb_3775_43f1_bd7f_d1ca4ad5800f.slice/crio-8b9223166148769c287778289f2bc7f4f701bff862557152cf85cefee0a07700 WatchSource:0}: Error finding container 8b9223166148769c287778289f2bc7f4f701bff862557152cf85cefee0a07700: Status 404 returned error can't find the container with id 8b9223166148769c287778289f2bc7f4f701bff862557152cf85cefee0a07700 Dec 03 07:00:48 crc kubenswrapper[4475]: I1203 07:00:48.107028 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-6ffbb57cbb-mczq9" event={"ID":"5003beac-ecfb-4cdc-9184-b08c422848f0","Type":"ContainerStarted","Data":"b3b04040c688ba318d5c756dde3bb35764d2d0a737caaeaa7dcc3a3af14e5878"} Dec 03 07:00:48 crc kubenswrapper[4475]: I1203 07:00:48.138622 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-7c975977bc-7bx2h" event={"ID":"bf7fe140-bbb3-4c39-b19a-388108167ed8","Type":"ContainerStarted","Data":"36b40fea7025560ae296f4e47ea847c76f85ed87aad387274d2c155ab5faf0a2"} Dec 03 07:00:48 crc kubenswrapper[4475]: I1203 07:00:48.280924 4475 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"] Dec 03 07:00:48 crc kubenswrapper[4475]: I1203 07:00:48.558933 4475 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Dec 03 07:00:48 crc kubenswrapper[4475]: I1203 07:00:48.973025 4475 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-api-0"] Dec 03 07:00:49 crc kubenswrapper[4475]: I1203 07:00:49.175624 4475 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"11718fbb-3775-43f1-bd7f-d1ca4ad5800f","Type":"ContainerStarted","Data":"8b9223166148769c287778289f2bc7f4f701bff862557152cf85cefee0a07700"} Dec 03 07:00:49 crc kubenswrapper[4475]: I1203 07:00:49.185619 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-74bf7b4cc4-lpw7v" event={"ID":"c455da32-2255-4caa-8734-33f188430025","Type":"ContainerStarted","Data":"2f9e3b1825a5d1e4ce1abb4a22ecca2925405b28ccc555a7a84ba992149762c3"} Dec 03 07:00:49 crc kubenswrapper[4475]: I1203 07:00:49.185659 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-74bf7b4cc4-lpw7v" event={"ID":"c455da32-2255-4caa-8734-33f188430025","Type":"ContainerStarted","Data":"0364d7b9118aa91002ebda52e2150fab9170d02391fd185f2ba25f286a490497"} Dec 03 07:00:49 crc kubenswrapper[4475]: I1203 07:00:49.186537 4475 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-74bf7b4cc4-lpw7v" Dec 03 07:00:49 crc kubenswrapper[4475]: I1203 07:00:49.186573 4475 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-74bf7b4cc4-lpw7v" Dec 03 07:00:49 crc kubenswrapper[4475]: I1203 07:00:49.193419 4475 generic.go:334] "Generic (PLEG): container finished" podID="00f99628-c2ec-48c9-b266-0f14cbf05570" containerID="6ee0f40e8dd95142d94825e51f954cf267236b294a7d43db3f3810d5c60bdc83" exitCode=0 Dec 03 07:00:49 crc kubenswrapper[4475]: I1203 07:00:49.193512 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5967d8988f-ghdmf" event={"ID":"00f99628-c2ec-48c9-b266-0f14cbf05570","Type":"ContainerDied","Data":"6ee0f40e8dd95142d94825e51f954cf267236b294a7d43db3f3810d5c60bdc83"} Dec 03 07:00:49 crc kubenswrapper[4475]: I1203 07:00:49.193538 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5967d8988f-ghdmf" 
event={"ID":"00f99628-c2ec-48c9-b266-0f14cbf05570","Type":"ContainerStarted","Data":"86923467babacbdac85b8413f4adc00e5627cf676ead74fad9c3c535839a95ac"} Dec 03 07:00:49 crc kubenswrapper[4475]: I1203 07:00:49.213033 4475 generic.go:334] "Generic (PLEG): container finished" podID="6001f40d-4ed3-4baa-a858-80cfbe5bcf7d" containerID="411268c9c3a1628da7eacc85146977ac0a22fc8f0c1c085445957c4921277a6a" exitCode=0 Dec 03 07:00:49 crc kubenswrapper[4475]: I1203 07:00:49.213199 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-65d6db7cdf-klwck" event={"ID":"6001f40d-4ed3-4baa-a858-80cfbe5bcf7d","Type":"ContainerDied","Data":"411268c9c3a1628da7eacc85146977ac0a22fc8f0c1c085445957c4921277a6a"} Dec 03 07:00:49 crc kubenswrapper[4475]: I1203 07:00:49.218911 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"ffaaa182-c947-4cbd-b96d-378fae973360","Type":"ContainerStarted","Data":"73990065aae99ce6c01af42a31dca6c52ae93fe6af31c92b25d47333f6c98899"} Dec 03 07:00:49 crc kubenswrapper[4475]: I1203 07:00:49.223175 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"13ed03b8-8758-4c25-b37f-2793697026d2","Type":"ContainerStarted","Data":"fd4b56831fabb1f0ef5e8f8323b47411850d5b389e6244c31c322b47e9eb172c"} Dec 03 07:00:49 crc kubenswrapper[4475]: I1203 07:00:49.248576 4475 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-api-74bf7b4cc4-lpw7v" podStartSLOduration=3.248530957 podStartE2EDuration="3.248530957s" podCreationTimestamp="2025-12-03 07:00:46 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 07:00:49.230329539 +0000 UTC m=+934.035227873" watchObservedRunningTime="2025-12-03 07:00:49.248530957 +0000 UTC m=+934.053429291" Dec 03 07:00:50 crc kubenswrapper[4475]: I1203 07:00:50.193545 4475 util.go:48] "No ready sandbox for pod can be 
found. Need to start a new one" pod="openstack/dnsmasq-dns-65d6db7cdf-klwck" Dec 03 07:00:50 crc kubenswrapper[4475]: I1203 07:00:50.237396 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-65d6db7cdf-klwck" event={"ID":"6001f40d-4ed3-4baa-a858-80cfbe5bcf7d","Type":"ContainerDied","Data":"ca56a6f0e168e7fdd7a30ecdd79a38ee03e9370ef169c15e85ccb4fd781e1a03"} Dec 03 07:00:50 crc kubenswrapper[4475]: I1203 07:00:50.237445 4475 scope.go:117] "RemoveContainer" containerID="411268c9c3a1628da7eacc85146977ac0a22fc8f0c1c085445957c4921277a6a" Dec 03 07:00:50 crc kubenswrapper[4475]: I1203 07:00:50.237585 4475 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-65d6db7cdf-klwck" Dec 03 07:00:50 crc kubenswrapper[4475]: I1203 07:00:50.244466 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"ffaaa182-c947-4cbd-b96d-378fae973360","Type":"ContainerStarted","Data":"15564d5c21e647eb083975141782f65b426ec7252db05a340c200804edf55f53"} Dec 03 07:00:50 crc kubenswrapper[4475]: I1203 07:00:50.305134 4475 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-znjn4\" (UniqueName: \"kubernetes.io/projected/6001f40d-4ed3-4baa-a858-80cfbe5bcf7d-kube-api-access-znjn4\") pod \"6001f40d-4ed3-4baa-a858-80cfbe5bcf7d\" (UID: \"6001f40d-4ed3-4baa-a858-80cfbe5bcf7d\") " Dec 03 07:00:50 crc kubenswrapper[4475]: I1203 07:00:50.305625 4475 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/6001f40d-4ed3-4baa-a858-80cfbe5bcf7d-ovsdbserver-sb\") pod \"6001f40d-4ed3-4baa-a858-80cfbe5bcf7d\" (UID: \"6001f40d-4ed3-4baa-a858-80cfbe5bcf7d\") " Dec 03 07:00:50 crc kubenswrapper[4475]: I1203 07:00:50.305759 4475 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/6001f40d-4ed3-4baa-a858-80cfbe5bcf7d-config\") pod \"6001f40d-4ed3-4baa-a858-80cfbe5bcf7d\" (UID: \"6001f40d-4ed3-4baa-a858-80cfbe5bcf7d\") " Dec 03 07:00:50 crc kubenswrapper[4475]: I1203 07:00:50.305797 4475 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/6001f40d-4ed3-4baa-a858-80cfbe5bcf7d-dns-swift-storage-0\") pod \"6001f40d-4ed3-4baa-a858-80cfbe5bcf7d\" (UID: \"6001f40d-4ed3-4baa-a858-80cfbe5bcf7d\") " Dec 03 07:00:50 crc kubenswrapper[4475]: I1203 07:00:50.305875 4475 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/6001f40d-4ed3-4baa-a858-80cfbe5bcf7d-ovsdbserver-nb\") pod \"6001f40d-4ed3-4baa-a858-80cfbe5bcf7d\" (UID: \"6001f40d-4ed3-4baa-a858-80cfbe5bcf7d\") " Dec 03 07:00:50 crc kubenswrapper[4475]: I1203 07:00:50.305921 4475 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/6001f40d-4ed3-4baa-a858-80cfbe5bcf7d-dns-svc\") pod \"6001f40d-4ed3-4baa-a858-80cfbe5bcf7d\" (UID: \"6001f40d-4ed3-4baa-a858-80cfbe5bcf7d\") " Dec 03 07:00:50 crc kubenswrapper[4475]: I1203 07:00:50.320806 4475 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6001f40d-4ed3-4baa-a858-80cfbe5bcf7d-kube-api-access-znjn4" (OuterVolumeSpecName: "kube-api-access-znjn4") pod "6001f40d-4ed3-4baa-a858-80cfbe5bcf7d" (UID: "6001f40d-4ed3-4baa-a858-80cfbe5bcf7d"). InnerVolumeSpecName "kube-api-access-znjn4". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 07:00:50 crc kubenswrapper[4475]: I1203 07:00:50.337240 4475 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6001f40d-4ed3-4baa-a858-80cfbe5bcf7d-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "6001f40d-4ed3-4baa-a858-80cfbe5bcf7d" (UID: "6001f40d-4ed3-4baa-a858-80cfbe5bcf7d"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 07:00:50 crc kubenswrapper[4475]: I1203 07:00:50.346463 4475 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6001f40d-4ed3-4baa-a858-80cfbe5bcf7d-config" (OuterVolumeSpecName: "config") pod "6001f40d-4ed3-4baa-a858-80cfbe5bcf7d" (UID: "6001f40d-4ed3-4baa-a858-80cfbe5bcf7d"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 07:00:50 crc kubenswrapper[4475]: I1203 07:00:50.359227 4475 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6001f40d-4ed3-4baa-a858-80cfbe5bcf7d-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "6001f40d-4ed3-4baa-a858-80cfbe5bcf7d" (UID: "6001f40d-4ed3-4baa-a858-80cfbe5bcf7d"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 07:00:50 crc kubenswrapper[4475]: I1203 07:00:50.359692 4475 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6001f40d-4ed3-4baa-a858-80cfbe5bcf7d-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "6001f40d-4ed3-4baa-a858-80cfbe5bcf7d" (UID: "6001f40d-4ed3-4baa-a858-80cfbe5bcf7d"). InnerVolumeSpecName "dns-swift-storage-0". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 07:00:50 crc kubenswrapper[4475]: I1203 07:00:50.362698 4475 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6001f40d-4ed3-4baa-a858-80cfbe5bcf7d-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "6001f40d-4ed3-4baa-a858-80cfbe5bcf7d" (UID: "6001f40d-4ed3-4baa-a858-80cfbe5bcf7d"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 07:00:50 crc kubenswrapper[4475]: I1203 07:00:50.420829 4475 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/6001f40d-4ed3-4baa-a858-80cfbe5bcf7d-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Dec 03 07:00:50 crc kubenswrapper[4475]: I1203 07:00:50.420866 4475 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6001f40d-4ed3-4baa-a858-80cfbe5bcf7d-config\") on node \"crc\" DevicePath \"\"" Dec 03 07:00:50 crc kubenswrapper[4475]: I1203 07:00:50.420880 4475 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/6001f40d-4ed3-4baa-a858-80cfbe5bcf7d-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Dec 03 07:00:50 crc kubenswrapper[4475]: I1203 07:00:50.420888 4475 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/6001f40d-4ed3-4baa-a858-80cfbe5bcf7d-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Dec 03 07:00:50 crc kubenswrapper[4475]: I1203 07:00:50.420896 4475 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/6001f40d-4ed3-4baa-a858-80cfbe5bcf7d-dns-svc\") on node \"crc\" DevicePath \"\"" Dec 03 07:00:50 crc kubenswrapper[4475]: I1203 07:00:50.420904 4475 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-znjn4\" (UniqueName: 
\"kubernetes.io/projected/6001f40d-4ed3-4baa-a858-80cfbe5bcf7d-kube-api-access-znjn4\") on node \"crc\" DevicePath \"\"" Dec 03 07:00:50 crc kubenswrapper[4475]: I1203 07:00:50.599052 4475 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-65d6db7cdf-klwck"] Dec 03 07:00:50 crc kubenswrapper[4475]: I1203 07:00:50.620116 4475 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-65d6db7cdf-klwck"] Dec 03 07:00:51 crc kubenswrapper[4475]: I1203 07:00:51.253166 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5967d8988f-ghdmf" event={"ID":"00f99628-c2ec-48c9-b266-0f14cbf05570","Type":"ContainerStarted","Data":"861b74ee61dba1c1ca73b38d2d84a136f245dfb1fc4a5895e8a19271c4a9939a"} Dec 03 07:00:51 crc kubenswrapper[4475]: I1203 07:00:51.254194 4475 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-5967d8988f-ghdmf" Dec 03 07:00:51 crc kubenswrapper[4475]: I1203 07:00:51.256316 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"ffaaa182-c947-4cbd-b96d-378fae973360","Type":"ContainerStarted","Data":"a342851e48b4f81a5ac7652402e4a05cdd8410849baf25d75ddc6e72b7854358"} Dec 03 07:00:51 crc kubenswrapper[4475]: I1203 07:00:51.256566 4475 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-api-0" podUID="ffaaa182-c947-4cbd-b96d-378fae973360" containerName="cinder-api-log" containerID="cri-o://15564d5c21e647eb083975141782f65b426ec7252db05a340c200804edf55f53" gracePeriod=30 Dec 03 07:00:51 crc kubenswrapper[4475]: I1203 07:00:51.256655 4475 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-api-0" podUID="ffaaa182-c947-4cbd-b96d-378fae973360" containerName="cinder-api" containerID="cri-o://a342851e48b4f81a5ac7652402e4a05cdd8410849baf25d75ddc6e72b7854358" gracePeriod=30 Dec 03 07:00:51 crc kubenswrapper[4475]: I1203 07:00:51.259199 4475 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"13ed03b8-8758-4c25-b37f-2793697026d2","Type":"ContainerStarted","Data":"314eebb9f24bd7ca6ede27d5040984fbf03c937af31d92e1f4dfa30c219712b4"} Dec 03 07:00:51 crc kubenswrapper[4475]: I1203 07:00:51.260404 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"11718fbb-3775-43f1-bd7f-d1ca4ad5800f","Type":"ContainerStarted","Data":"e1061c69f6fd72bac2694e4e9a5817d79122eb37efffb33a91f8faf43106f785"} Dec 03 07:00:51 crc kubenswrapper[4475]: I1203 07:00:51.271547 4475 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-5967d8988f-ghdmf" podStartSLOduration=5.271537028 podStartE2EDuration="5.271537028s" podCreationTimestamp="2025-12-03 07:00:46 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 07:00:51.269246779 +0000 UTC m=+936.074145114" watchObservedRunningTime="2025-12-03 07:00:51.271537028 +0000 UTC m=+936.076435362" Dec 03 07:00:51 crc kubenswrapper[4475]: I1203 07:00:51.288957 4475 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-api-0" podStartSLOduration=5.288946085 podStartE2EDuration="5.288946085s" podCreationTimestamp="2025-12-03 07:00:46 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 07:00:51.283956809 +0000 UTC m=+936.088855143" watchObservedRunningTime="2025-12-03 07:00:51.288946085 +0000 UTC m=+936.093844419" Dec 03 07:00:51 crc kubenswrapper[4475]: I1203 07:00:51.499544 4475 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6001f40d-4ed3-4baa-a858-80cfbe5bcf7d" path="/var/lib/kubelet/pods/6001f40d-4ed3-4baa-a858-80cfbe5bcf7d/volumes" Dec 03 07:00:52 crc kubenswrapper[4475]: I1203 07:00:52.279698 4475 generic.go:334] "Generic 
(PLEG): container finished" podID="ffaaa182-c947-4cbd-b96d-378fae973360" containerID="15564d5c21e647eb083975141782f65b426ec7252db05a340c200804edf55f53" exitCode=143 Dec 03 07:00:52 crc kubenswrapper[4475]: I1203 07:00:52.279794 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"ffaaa182-c947-4cbd-b96d-378fae973360","Type":"ContainerDied","Data":"15564d5c21e647eb083975141782f65b426ec7252db05a340c200804edf55f53"} Dec 03 07:00:52 crc kubenswrapper[4475]: I1203 07:00:52.547737 4475 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/cinder-api-0" Dec 03 07:00:53 crc kubenswrapper[4475]: I1203 07:00:53.255271 4475 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-api-c97b994bb-77qkf"] Dec 03 07:00:53 crc kubenswrapper[4475]: E1203 07:00:53.257222 4475 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6001f40d-4ed3-4baa-a858-80cfbe5bcf7d" containerName="init" Dec 03 07:00:53 crc kubenswrapper[4475]: I1203 07:00:53.257251 4475 state_mem.go:107] "Deleted CPUSet assignment" podUID="6001f40d-4ed3-4baa-a858-80cfbe5bcf7d" containerName="init" Dec 03 07:00:53 crc kubenswrapper[4475]: I1203 07:00:53.257477 4475 memory_manager.go:354] "RemoveStaleState removing state" podUID="6001f40d-4ed3-4baa-a858-80cfbe5bcf7d" containerName="init" Dec 03 07:00:53 crc kubenswrapper[4475]: I1203 07:00:53.258581 4475 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-api-c97b994bb-77qkf" Dec 03 07:00:53 crc kubenswrapper[4475]: I1203 07:00:53.260575 4475 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-barbican-public-svc" Dec 03 07:00:53 crc kubenswrapper[4475]: I1203 07:00:53.261749 4475 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-barbican-internal-svc" Dec 03 07:00:53 crc kubenswrapper[4475]: I1203 07:00:53.282317 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/131d36c8-2ff3-422f-a2f2-25ae01406238-config-data\") pod \"barbican-api-c97b994bb-77qkf\" (UID: \"131d36c8-2ff3-422f-a2f2-25ae01406238\") " pod="openstack/barbican-api-c97b994bb-77qkf" Dec 03 07:00:53 crc kubenswrapper[4475]: I1203 07:00:53.282378 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/131d36c8-2ff3-422f-a2f2-25ae01406238-combined-ca-bundle\") pod \"barbican-api-c97b994bb-77qkf\" (UID: \"131d36c8-2ff3-422f-a2f2-25ae01406238\") " pod="openstack/barbican-api-c97b994bb-77qkf" Dec 03 07:00:53 crc kubenswrapper[4475]: I1203 07:00:53.282411 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/131d36c8-2ff3-422f-a2f2-25ae01406238-logs\") pod \"barbican-api-c97b994bb-77qkf\" (UID: \"131d36c8-2ff3-422f-a2f2-25ae01406238\") " pod="openstack/barbican-api-c97b994bb-77qkf" Dec 03 07:00:53 crc kubenswrapper[4475]: I1203 07:00:53.282441 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/131d36c8-2ff3-422f-a2f2-25ae01406238-config-data-custom\") pod \"barbican-api-c97b994bb-77qkf\" (UID: \"131d36c8-2ff3-422f-a2f2-25ae01406238\") " 
pod="openstack/barbican-api-c97b994bb-77qkf" Dec 03 07:00:53 crc kubenswrapper[4475]: I1203 07:00:53.282520 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/131d36c8-2ff3-422f-a2f2-25ae01406238-internal-tls-certs\") pod \"barbican-api-c97b994bb-77qkf\" (UID: \"131d36c8-2ff3-422f-a2f2-25ae01406238\") " pod="openstack/barbican-api-c97b994bb-77qkf" Dec 03 07:00:53 crc kubenswrapper[4475]: I1203 07:00:53.282540 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/131d36c8-2ff3-422f-a2f2-25ae01406238-public-tls-certs\") pod \"barbican-api-c97b994bb-77qkf\" (UID: \"131d36c8-2ff3-422f-a2f2-25ae01406238\") " pod="openstack/barbican-api-c97b994bb-77qkf" Dec 03 07:00:53 crc kubenswrapper[4475]: I1203 07:00:53.282563 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8fksx\" (UniqueName: \"kubernetes.io/projected/131d36c8-2ff3-422f-a2f2-25ae01406238-kube-api-access-8fksx\") pod \"barbican-api-c97b994bb-77qkf\" (UID: \"131d36c8-2ff3-422f-a2f2-25ae01406238\") " pod="openstack/barbican-api-c97b994bb-77qkf" Dec 03 07:00:53 crc kubenswrapper[4475]: I1203 07:00:53.292667 4475 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-c97b994bb-77qkf"] Dec 03 07:00:53 crc kubenswrapper[4475]: I1203 07:00:53.314345 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-6ffbb57cbb-mczq9" event={"ID":"5003beac-ecfb-4cdc-9184-b08c422848f0","Type":"ContainerStarted","Data":"240ff20fa19e48da7e31e9f18e25f26fb3d5362284d7e4362661ce45bf7e6e14"} Dec 03 07:00:53 crc kubenswrapper[4475]: I1203 07:00:53.314386 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-6ffbb57cbb-mczq9" 
event={"ID":"5003beac-ecfb-4cdc-9184-b08c422848f0","Type":"ContainerStarted","Data":"a5c1c3cc568cf3ed278c4e913f027e6e964a47f287ee7b2640afdcc151ad9289"} Dec 03 07:00:53 crc kubenswrapper[4475]: I1203 07:00:53.316223 4475 generic.go:334] "Generic (PLEG): container finished" podID="e786a238-51fe-464f-bcc8-54d35b24e9cf" containerID="5f7071580d3dd8c9404a7e72e85f7852838a7bb54f8158e86de328bc128c65db" exitCode=137 Dec 03 07:00:53 crc kubenswrapper[4475]: I1203 07:00:53.316273 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-5db4d85d89-t96xs" event={"ID":"e786a238-51fe-464f-bcc8-54d35b24e9cf","Type":"ContainerDied","Data":"5f7071580d3dd8c9404a7e72e85f7852838a7bb54f8158e86de328bc128c65db"} Dec 03 07:00:53 crc kubenswrapper[4475]: I1203 07:00:53.320859 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"13ed03b8-8758-4c25-b37f-2793697026d2","Type":"ContainerStarted","Data":"48b539995f961f39f0f3784b46e123e5caf6c9bfe6e2f1621226b6e95eb4a268"} Dec 03 07:00:53 crc kubenswrapper[4475]: I1203 07:00:53.320898 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"13ed03b8-8758-4c25-b37f-2793697026d2","Type":"ContainerStarted","Data":"74a1bf735d06e69f49a7a5836c0ac147e44657c1057d939890bcb84df13ddf46"} Dec 03 07:00:53 crc kubenswrapper[4475]: I1203 07:00:53.325541 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"11718fbb-3775-43f1-bd7f-d1ca4ad5800f","Type":"ContainerStarted","Data":"cc2bc9e356ba4004e9c450a64787a440b294b4548484666184e2df356fa73f9e"} Dec 03 07:00:53 crc kubenswrapper[4475]: I1203 07:00:53.328784 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-7c975977bc-7bx2h" event={"ID":"bf7fe140-bbb3-4c39-b19a-388108167ed8","Type":"ContainerStarted","Data":"91591d54bd9f3640bf6ab7e6b2a54f424a1d80ed38d6aa52e02cb66e285a362d"} Dec 03 07:00:53 crc kubenswrapper[4475]: I1203 
07:00:53.328815 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-7c975977bc-7bx2h" event={"ID":"bf7fe140-bbb3-4c39-b19a-388108167ed8","Type":"ContainerStarted","Data":"f745e2f004e4c65de7c836ebe4208aa1adc185b6d8424d04ea2f9f05e38c532a"} Dec 03 07:00:53 crc kubenswrapper[4475]: I1203 07:00:53.341286 4475 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-keystone-listener-6ffbb57cbb-mczq9" podStartSLOduration=3.101762839 podStartE2EDuration="7.341275544s" podCreationTimestamp="2025-12-03 07:00:46 +0000 UTC" firstStartedPulling="2025-12-03 07:00:47.828274413 +0000 UTC m=+932.633172747" lastFinishedPulling="2025-12-03 07:00:52.067787118 +0000 UTC m=+936.872685452" observedRunningTime="2025-12-03 07:00:53.331712586 +0000 UTC m=+938.136610920" watchObservedRunningTime="2025-12-03 07:00:53.341275544 +0000 UTC m=+938.146173869" Dec 03 07:00:53 crc kubenswrapper[4475]: I1203 07:00:53.359080 4475 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-scheduler-0" podStartSLOduration=5.678958485 podStartE2EDuration="7.359061731s" podCreationTimestamp="2025-12-03 07:00:46 +0000 UTC" firstStartedPulling="2025-12-03 07:00:48.107792439 +0000 UTC m=+932.912690773" lastFinishedPulling="2025-12-03 07:00:49.787895685 +0000 UTC m=+934.592794019" observedRunningTime="2025-12-03 07:00:53.354065833 +0000 UTC m=+938.158964168" watchObservedRunningTime="2025-12-03 07:00:53.359061731 +0000 UTC m=+938.163960065" Dec 03 07:00:53 crc kubenswrapper[4475]: I1203 07:00:53.390299 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/131d36c8-2ff3-422f-a2f2-25ae01406238-config-data\") pod \"barbican-api-c97b994bb-77qkf\" (UID: \"131d36c8-2ff3-422f-a2f2-25ae01406238\") " pod="openstack/barbican-api-c97b994bb-77qkf" Dec 03 07:00:53 crc kubenswrapper[4475]: I1203 07:00:53.390376 4475 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/131d36c8-2ff3-422f-a2f2-25ae01406238-combined-ca-bundle\") pod \"barbican-api-c97b994bb-77qkf\" (UID: \"131d36c8-2ff3-422f-a2f2-25ae01406238\") " pod="openstack/barbican-api-c97b994bb-77qkf" Dec 03 07:00:53 crc kubenswrapper[4475]: I1203 07:00:53.390441 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/131d36c8-2ff3-422f-a2f2-25ae01406238-logs\") pod \"barbican-api-c97b994bb-77qkf\" (UID: \"131d36c8-2ff3-422f-a2f2-25ae01406238\") " pod="openstack/barbican-api-c97b994bb-77qkf" Dec 03 07:00:53 crc kubenswrapper[4475]: I1203 07:00:53.390511 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/131d36c8-2ff3-422f-a2f2-25ae01406238-config-data-custom\") pod \"barbican-api-c97b994bb-77qkf\" (UID: \"131d36c8-2ff3-422f-a2f2-25ae01406238\") " pod="openstack/barbican-api-c97b994bb-77qkf" Dec 03 07:00:53 crc kubenswrapper[4475]: I1203 07:00:53.390626 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/131d36c8-2ff3-422f-a2f2-25ae01406238-internal-tls-certs\") pod \"barbican-api-c97b994bb-77qkf\" (UID: \"131d36c8-2ff3-422f-a2f2-25ae01406238\") " pod="openstack/barbican-api-c97b994bb-77qkf" Dec 03 07:00:53 crc kubenswrapper[4475]: I1203 07:00:53.390674 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/131d36c8-2ff3-422f-a2f2-25ae01406238-public-tls-certs\") pod \"barbican-api-c97b994bb-77qkf\" (UID: \"131d36c8-2ff3-422f-a2f2-25ae01406238\") " pod="openstack/barbican-api-c97b994bb-77qkf" Dec 03 07:00:53 crc kubenswrapper[4475]: I1203 07:00:53.390703 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access-8fksx\" (UniqueName: \"kubernetes.io/projected/131d36c8-2ff3-422f-a2f2-25ae01406238-kube-api-access-8fksx\") pod \"barbican-api-c97b994bb-77qkf\" (UID: \"131d36c8-2ff3-422f-a2f2-25ae01406238\") " pod="openstack/barbican-api-c97b994bb-77qkf" Dec 03 07:00:53 crc kubenswrapper[4475]: I1203 07:00:53.400566 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/131d36c8-2ff3-422f-a2f2-25ae01406238-public-tls-certs\") pod \"barbican-api-c97b994bb-77qkf\" (UID: \"131d36c8-2ff3-422f-a2f2-25ae01406238\") " pod="openstack/barbican-api-c97b994bb-77qkf" Dec 03 07:00:53 crc kubenswrapper[4475]: I1203 07:00:53.401268 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/131d36c8-2ff3-422f-a2f2-25ae01406238-internal-tls-certs\") pod \"barbican-api-c97b994bb-77qkf\" (UID: \"131d36c8-2ff3-422f-a2f2-25ae01406238\") " pod="openstack/barbican-api-c97b994bb-77qkf" Dec 03 07:00:53 crc kubenswrapper[4475]: I1203 07:00:53.401569 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/131d36c8-2ff3-422f-a2f2-25ae01406238-logs\") pod \"barbican-api-c97b994bb-77qkf\" (UID: \"131d36c8-2ff3-422f-a2f2-25ae01406238\") " pod="openstack/barbican-api-c97b994bb-77qkf" Dec 03 07:00:53 crc kubenswrapper[4475]: I1203 07:00:53.403119 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/131d36c8-2ff3-422f-a2f2-25ae01406238-config-data-custom\") pod \"barbican-api-c97b994bb-77qkf\" (UID: \"131d36c8-2ff3-422f-a2f2-25ae01406238\") " pod="openstack/barbican-api-c97b994bb-77qkf" Dec 03 07:00:53 crc kubenswrapper[4475]: I1203 07:00:53.405049 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/131d36c8-2ff3-422f-a2f2-25ae01406238-config-data\") pod \"barbican-api-c97b994bb-77qkf\" (UID: \"131d36c8-2ff3-422f-a2f2-25ae01406238\") " pod="openstack/barbican-api-c97b994bb-77qkf" Dec 03 07:00:53 crc kubenswrapper[4475]: I1203 07:00:53.409271 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/131d36c8-2ff3-422f-a2f2-25ae01406238-combined-ca-bundle\") pod \"barbican-api-c97b994bb-77qkf\" (UID: \"131d36c8-2ff3-422f-a2f2-25ae01406238\") " pod="openstack/barbican-api-c97b994bb-77qkf" Dec 03 07:00:53 crc kubenswrapper[4475]: I1203 07:00:53.433966 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8fksx\" (UniqueName: \"kubernetes.io/projected/131d36c8-2ff3-422f-a2f2-25ae01406238-kube-api-access-8fksx\") pod \"barbican-api-c97b994bb-77qkf\" (UID: \"131d36c8-2ff3-422f-a2f2-25ae01406238\") " pod="openstack/barbican-api-c97b994bb-77qkf" Dec 03 07:00:53 crc kubenswrapper[4475]: I1203 07:00:53.626820 4475 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-c97b994bb-77qkf" Dec 03 07:00:53 crc kubenswrapper[4475]: I1203 07:00:53.733886 4475 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-5db4d85d89-t96xs"
Dec 03 07:00:53 crc kubenswrapper[4475]: I1203 07:00:53.766937 4475 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-worker-7c975977bc-7bx2h" podStartSLOduration=3.45253131 podStartE2EDuration="7.7669143s" podCreationTimestamp="2025-12-03 07:00:46 +0000 UTC" firstStartedPulling="2025-12-03 07:00:47.746289877 +0000 UTC m=+932.551188211" lastFinishedPulling="2025-12-03 07:00:52.060672866 +0000 UTC m=+936.865571201" observedRunningTime="2025-12-03 07:00:53.386878766 +0000 UTC m=+938.191777100" watchObservedRunningTime="2025-12-03 07:00:53.7669143 +0000 UTC m=+938.571812634"
Dec 03 07:00:53 crc kubenswrapper[4475]: I1203 07:00:53.804969 4475 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e786a238-51fe-464f-bcc8-54d35b24e9cf-logs\") pod \"e786a238-51fe-464f-bcc8-54d35b24e9cf\" (UID: \"e786a238-51fe-464f-bcc8-54d35b24e9cf\") "
Dec 03 07:00:53 crc kubenswrapper[4475]: I1203 07:00:53.805052 4475 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-g9lnc\" (UniqueName: \"kubernetes.io/projected/e786a238-51fe-464f-bcc8-54d35b24e9cf-kube-api-access-g9lnc\") pod \"e786a238-51fe-464f-bcc8-54d35b24e9cf\" (UID: \"e786a238-51fe-464f-bcc8-54d35b24e9cf\") "
Dec 03 07:00:53 crc kubenswrapper[4475]: I1203 07:00:53.805076 4475 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/e786a238-51fe-464f-bcc8-54d35b24e9cf-horizon-secret-key\") pod \"e786a238-51fe-464f-bcc8-54d35b24e9cf\" (UID: \"e786a238-51fe-464f-bcc8-54d35b24e9cf\") "
Dec 03 07:00:53 crc kubenswrapper[4475]: I1203 07:00:53.805142 4475 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/e786a238-51fe-464f-bcc8-54d35b24e9cf-scripts\") pod \"e786a238-51fe-464f-bcc8-54d35b24e9cf\" (UID: \"e786a238-51fe-464f-bcc8-54d35b24e9cf\") "
Dec 03 07:00:53 crc kubenswrapper[4475]: I1203 07:00:53.805285 4475 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/e786a238-51fe-464f-bcc8-54d35b24e9cf-config-data\") pod \"e786a238-51fe-464f-bcc8-54d35b24e9cf\" (UID: \"e786a238-51fe-464f-bcc8-54d35b24e9cf\") "
Dec 03 07:00:53 crc kubenswrapper[4475]: I1203 07:00:53.805402 4475 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e786a238-51fe-464f-bcc8-54d35b24e9cf-logs" (OuterVolumeSpecName: "logs") pod "e786a238-51fe-464f-bcc8-54d35b24e9cf" (UID: "e786a238-51fe-464f-bcc8-54d35b24e9cf"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Dec 03 07:00:53 crc kubenswrapper[4475]: I1203 07:00:53.805690 4475 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e786a238-51fe-464f-bcc8-54d35b24e9cf-logs\") on node \"crc\" DevicePath \"\""
Dec 03 07:00:53 crc kubenswrapper[4475]: I1203 07:00:53.816297 4475 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e786a238-51fe-464f-bcc8-54d35b24e9cf-horizon-secret-key" (OuterVolumeSpecName: "horizon-secret-key") pod "e786a238-51fe-464f-bcc8-54d35b24e9cf" (UID: "e786a238-51fe-464f-bcc8-54d35b24e9cf"). InnerVolumeSpecName "horizon-secret-key". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 03 07:00:53 crc kubenswrapper[4475]: I1203 07:00:53.820931 4475 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e786a238-51fe-464f-bcc8-54d35b24e9cf-kube-api-access-g9lnc" (OuterVolumeSpecName: "kube-api-access-g9lnc") pod "e786a238-51fe-464f-bcc8-54d35b24e9cf" (UID: "e786a238-51fe-464f-bcc8-54d35b24e9cf"). InnerVolumeSpecName "kube-api-access-g9lnc". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 03 07:00:53 crc kubenswrapper[4475]: I1203 07:00:53.836200 4475 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e786a238-51fe-464f-bcc8-54d35b24e9cf-scripts" (OuterVolumeSpecName: "scripts") pod "e786a238-51fe-464f-bcc8-54d35b24e9cf" (UID: "e786a238-51fe-464f-bcc8-54d35b24e9cf"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Dec 03 07:00:53 crc kubenswrapper[4475]: I1203 07:00:53.855399 4475 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e786a238-51fe-464f-bcc8-54d35b24e9cf-config-data" (OuterVolumeSpecName: "config-data") pod "e786a238-51fe-464f-bcc8-54d35b24e9cf" (UID: "e786a238-51fe-464f-bcc8-54d35b24e9cf"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Dec 03 07:00:53 crc kubenswrapper[4475]: I1203 07:00:53.907904 4475 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/e786a238-51fe-464f-bcc8-54d35b24e9cf-scripts\") on node \"crc\" DevicePath \"\""
Dec 03 07:00:53 crc kubenswrapper[4475]: I1203 07:00:53.907942 4475 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/e786a238-51fe-464f-bcc8-54d35b24e9cf-config-data\") on node \"crc\" DevicePath \"\""
Dec 03 07:00:53 crc kubenswrapper[4475]: I1203 07:00:53.907953 4475 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-g9lnc\" (UniqueName: \"kubernetes.io/projected/e786a238-51fe-464f-bcc8-54d35b24e9cf-kube-api-access-g9lnc\") on node \"crc\" DevicePath \"\""
Dec 03 07:00:53 crc kubenswrapper[4475]: I1203 07:00:53.907963 4475 reconciler_common.go:293] "Volume detached for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/e786a238-51fe-464f-bcc8-54d35b24e9cf-horizon-secret-key\") on node \"crc\" DevicePath \"\""
Dec 03 07:00:54 crc kubenswrapper[4475]: W1203 07:00:54.156264 4475 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod131d36c8_2ff3_422f_a2f2_25ae01406238.slice/crio-246f3f774a00bac36dda48f929878b3c730cb14bd678c4d9dc4385b4fe97ac9e WatchSource:0}: Error finding container 246f3f774a00bac36dda48f929878b3c730cb14bd678c4d9dc4385b4fe97ac9e: Status 404 returned error can't find the container with id 246f3f774a00bac36dda48f929878b3c730cb14bd678c4d9dc4385b4fe97ac9e
Dec 03 07:00:54 crc kubenswrapper[4475]: I1203 07:00:54.170885 4475 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-c97b994bb-77qkf"]
Dec 03 07:00:54 crc kubenswrapper[4475]: I1203 07:00:54.336776 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-c97b994bb-77qkf" event={"ID":"131d36c8-2ff3-422f-a2f2-25ae01406238","Type":"ContainerStarted","Data":"246f3f774a00bac36dda48f929878b3c730cb14bd678c4d9dc4385b4fe97ac9e"}
Dec 03 07:00:54 crc kubenswrapper[4475]: I1203 07:00:54.340302 4475 generic.go:334] "Generic (PLEG): container finished" podID="e786a238-51fe-464f-bcc8-54d35b24e9cf" containerID="ba8127af1ef75e1512a821af33fc5ef2098d06e9a5ce6b02882490f4c4a3eabb" exitCode=137
Dec 03 07:00:54 crc kubenswrapper[4475]: I1203 07:00:54.340343 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-5db4d85d89-t96xs" event={"ID":"e786a238-51fe-464f-bcc8-54d35b24e9cf","Type":"ContainerDied","Data":"ba8127af1ef75e1512a821af33fc5ef2098d06e9a5ce6b02882490f4c4a3eabb"}
Dec 03 07:00:54 crc kubenswrapper[4475]: I1203 07:00:54.340385 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-5db4d85d89-t96xs" event={"ID":"e786a238-51fe-464f-bcc8-54d35b24e9cf","Type":"ContainerDied","Data":"600ab82ac7c0904a342765ac63ab9049e1a51766557f63ff93fc473c645ab592"}
Dec 03 07:00:54 crc kubenswrapper[4475]: I1203 07:00:54.340403 4475 scope.go:117] "RemoveContainer" containerID="ba8127af1ef75e1512a821af33fc5ef2098d06e9a5ce6b02882490f4c4a3eabb"
Dec 03 07:00:54 crc kubenswrapper[4475]: I1203 07:00:54.340429 4475 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-5db4d85d89-t96xs"
Dec 03 07:00:54 crc kubenswrapper[4475]: I1203 07:00:54.383520 4475 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-5db4d85d89-t96xs"]
Dec 03 07:00:54 crc kubenswrapper[4475]: I1203 07:00:54.393306 4475 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/horizon-5db4d85d89-t96xs"]
Dec 03 07:00:54 crc kubenswrapper[4475]: I1203 07:00:54.522908 4475 scope.go:117] "RemoveContainer" containerID="5f7071580d3dd8c9404a7e72e85f7852838a7bb54f8158e86de328bc128c65db"
Dec 03 07:00:54 crc kubenswrapper[4475]: I1203 07:00:54.547781 4475 scope.go:117] "RemoveContainer" containerID="ba8127af1ef75e1512a821af33fc5ef2098d06e9a5ce6b02882490f4c4a3eabb"
Dec 03 07:00:54 crc kubenswrapper[4475]: E1203 07:00:54.556270 4475 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ba8127af1ef75e1512a821af33fc5ef2098d06e9a5ce6b02882490f4c4a3eabb\": container with ID starting with ba8127af1ef75e1512a821af33fc5ef2098d06e9a5ce6b02882490f4c4a3eabb not found: ID does not exist" containerID="ba8127af1ef75e1512a821af33fc5ef2098d06e9a5ce6b02882490f4c4a3eabb"
Dec 03 07:00:54 crc kubenswrapper[4475]: I1203 07:00:54.556408 4475 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ba8127af1ef75e1512a821af33fc5ef2098d06e9a5ce6b02882490f4c4a3eabb"} err="failed to get container status \"ba8127af1ef75e1512a821af33fc5ef2098d06e9a5ce6b02882490f4c4a3eabb\": rpc error: code = NotFound desc = could not find container \"ba8127af1ef75e1512a821af33fc5ef2098d06e9a5ce6b02882490f4c4a3eabb\": container with ID starting with ba8127af1ef75e1512a821af33fc5ef2098d06e9a5ce6b02882490f4c4a3eabb not found: ID does not exist"
Dec 03 07:00:54 crc kubenswrapper[4475]: I1203 07:00:54.556518 4475 scope.go:117] "RemoveContainer" containerID="5f7071580d3dd8c9404a7e72e85f7852838a7bb54f8158e86de328bc128c65db"
Dec 03 07:00:54 crc kubenswrapper[4475]: E1203 07:00:54.569357 4475 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5f7071580d3dd8c9404a7e72e85f7852838a7bb54f8158e86de328bc128c65db\": container with ID starting with 5f7071580d3dd8c9404a7e72e85f7852838a7bb54f8158e86de328bc128c65db not found: ID does not exist" containerID="5f7071580d3dd8c9404a7e72e85f7852838a7bb54f8158e86de328bc128c65db"
Dec 03 07:00:54 crc kubenswrapper[4475]: I1203 07:00:54.569603 4475 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5f7071580d3dd8c9404a7e72e85f7852838a7bb54f8158e86de328bc128c65db"} err="failed to get container status \"5f7071580d3dd8c9404a7e72e85f7852838a7bb54f8158e86de328bc128c65db\": rpc error: code = NotFound desc = could not find container \"5f7071580d3dd8c9404a7e72e85f7852838a7bb54f8158e86de328bc128c65db\": container with ID starting with 5f7071580d3dd8c9404a7e72e85f7852838a7bb54f8158e86de328bc128c65db not found: ID does not exist"
Dec 03 07:00:55 crc kubenswrapper[4475]: I1203 07:00:55.362378 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-c97b994bb-77qkf" event={"ID":"131d36c8-2ff3-422f-a2f2-25ae01406238","Type":"ContainerStarted","Data":"17d0c3cf5e27c5040400503e8d219406eeb106201174f0f65e2cb8b9d2397220"}
Dec 03 07:00:55 crc kubenswrapper[4475]: I1203 07:00:55.363345 4475 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-c97b994bb-77qkf"
Dec 03 07:00:55 crc kubenswrapper[4475]: I1203 07:00:55.363426 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-c97b994bb-77qkf" event={"ID":"131d36c8-2ff3-422f-a2f2-25ae01406238","Type":"ContainerStarted","Data":"dd9fa21dfd1eff4b1987731ac3bacb8dcc1d110e2b8c77bbb08cd0b4570a493b"}
Dec 03 07:00:55 crc kubenswrapper[4475]: I1203 07:00:55.363518 4475 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-c97b994bb-77qkf"
Dec 03 07:00:55 crc kubenswrapper[4475]: I1203 07:00:55.376402 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"13ed03b8-8758-4c25-b37f-2793697026d2","Type":"ContainerStarted","Data":"14c25df4b313807582231974fd3e2f0724e4e3d96f0797158c2d85787b388154"}
Dec 03 07:00:55 crc kubenswrapper[4475]: I1203 07:00:55.376621 4475 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0"
Dec 03 07:00:55 crc kubenswrapper[4475]: I1203 07:00:55.406816 4475 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-api-c97b994bb-77qkf" podStartSLOduration=2.406803281 podStartE2EDuration="2.406803281s" podCreationTimestamp="2025-12-03 07:00:53 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 07:00:55.400360482 +0000 UTC m=+940.205258835" watchObservedRunningTime="2025-12-03 07:00:55.406803281 +0000 UTC m=+940.211701615"
Dec 03 07:00:55 crc kubenswrapper[4475]: I1203 07:00:55.425032 4475 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.458132878 podStartE2EDuration="8.425021621s" podCreationTimestamp="2025-12-03 07:00:47 +0000 UTC" firstStartedPulling="2025-12-03 07:00:48.589474243 +0000 UTC m=+933.394372577" lastFinishedPulling="2025-12-03 07:00:54.556362986 +0000 UTC m=+939.361261320" observedRunningTime="2025-12-03 07:00:55.418537674 +0000 UTC m=+940.223436008" watchObservedRunningTime="2025-12-03 07:00:55.425021621 +0000 UTC m=+940.229919954"
Dec 03 07:00:55 crc kubenswrapper[4475]: I1203 07:00:55.502501 4475 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e786a238-51fe-464f-bcc8-54d35b24e9cf" path="/var/lib/kubelet/pods/e786a238-51fe-464f-bcc8-54d35b24e9cf/volumes"
Dec 03 07:00:55 crc kubenswrapper[4475]: I1203 07:00:55.991623 4475 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-bk4n9"
Dec 03 07:00:56 crc kubenswrapper[4475]: I1203 07:00:56.036341 4475 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-bk4n9"
Dec 03 07:00:56 crc kubenswrapper[4475]: I1203 07:00:56.103382 4475 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-bk4n9"]
Dec 03 07:00:56 crc kubenswrapper[4475]: I1203 07:00:56.950293 4475 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/cinder-scheduler-0"
Dec 03 07:00:57 crc kubenswrapper[4475]: I1203 07:00:57.196806 4475 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-5967d8988f-ghdmf"
Dec 03 07:00:57 crc kubenswrapper[4475]: I1203 07:00:57.209508 4475 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/cinder-scheduler-0"
Dec 03 07:00:57 crc kubenswrapper[4475]: I1203 07:00:57.285696 4475 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6c7bf79b95-knv75"]
Dec 03 07:00:57 crc kubenswrapper[4475]: I1203 07:00:57.285960 4475 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-6c7bf79b95-knv75" podUID="f1be585f-cb41-49eb-83b7-9c25c757c739" containerName="dnsmasq-dns" containerID="cri-o://1278a78e4bf596a21b64a805c5b53ce6119440830954781498d74ad20293e6ec" gracePeriod=10
Dec 03 07:00:57 crc kubenswrapper[4475]: I1203 07:00:57.393342 4475 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-bk4n9" podUID="cd709367-66b2-4586-b3ba-424d4c1532ee" containerName="registry-server" containerID="cri-o://ce8ded9ff003a96e9ac2b30b683a2f4293fc745bb0ca965930fe87243b626b07" gracePeriod=2
Dec 03 07:00:57 crc kubenswrapper[4475]: I1203 07:00:57.479127 4475 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-scheduler-0"]
Dec 03 07:00:57 crc kubenswrapper[4475]: I1203 07:00:57.959308 4475 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6c7bf79b95-knv75"
Dec 03 07:00:58 crc kubenswrapper[4475]: I1203 07:00:58.050248 4475 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/f1be585f-cb41-49eb-83b7-9c25c757c739-dns-svc\") pod \"f1be585f-cb41-49eb-83b7-9c25c757c739\" (UID: \"f1be585f-cb41-49eb-83b7-9c25c757c739\") "
Dec 03 07:00:58 crc kubenswrapper[4475]: I1203 07:00:58.050313 4475 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f1be585f-cb41-49eb-83b7-9c25c757c739-config\") pod \"f1be585f-cb41-49eb-83b7-9c25c757c739\" (UID: \"f1be585f-cb41-49eb-83b7-9c25c757c739\") "
Dec 03 07:00:58 crc kubenswrapper[4475]: I1203 07:00:58.050336 4475 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/f1be585f-cb41-49eb-83b7-9c25c757c739-ovsdbserver-nb\") pod \"f1be585f-cb41-49eb-83b7-9c25c757c739\" (UID: \"f1be585f-cb41-49eb-83b7-9c25c757c739\") "
Dec 03 07:00:58 crc kubenswrapper[4475]: I1203 07:00:58.050357 4475 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nq7m2\" (UniqueName: \"kubernetes.io/projected/f1be585f-cb41-49eb-83b7-9c25c757c739-kube-api-access-nq7m2\") pod \"f1be585f-cb41-49eb-83b7-9c25c757c739\" (UID: \"f1be585f-cb41-49eb-83b7-9c25c757c739\") "
Dec 03 07:00:58 crc kubenswrapper[4475]: I1203 07:00:58.050402 4475 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/f1be585f-cb41-49eb-83b7-9c25c757c739-dns-swift-storage-0\") pod \"f1be585f-cb41-49eb-83b7-9c25c757c739\" (UID: \"f1be585f-cb41-49eb-83b7-9c25c757c739\") "
Dec 03 07:00:58 crc kubenswrapper[4475]: I1203 07:00:58.050440 4475 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/f1be585f-cb41-49eb-83b7-9c25c757c739-ovsdbserver-sb\") pod \"f1be585f-cb41-49eb-83b7-9c25c757c739\" (UID: \"f1be585f-cb41-49eb-83b7-9c25c757c739\") "
Dec 03 07:00:58 crc kubenswrapper[4475]: I1203 07:00:58.082612 4475 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f1be585f-cb41-49eb-83b7-9c25c757c739-kube-api-access-nq7m2" (OuterVolumeSpecName: "kube-api-access-nq7m2") pod "f1be585f-cb41-49eb-83b7-9c25c757c739" (UID: "f1be585f-cb41-49eb-83b7-9c25c757c739"). InnerVolumeSpecName "kube-api-access-nq7m2". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 03 07:00:58 crc kubenswrapper[4475]: I1203 07:00:58.128947 4475 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f1be585f-cb41-49eb-83b7-9c25c757c739-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "f1be585f-cb41-49eb-83b7-9c25c757c739" (UID: "f1be585f-cb41-49eb-83b7-9c25c757c739"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Dec 03 07:00:58 crc kubenswrapper[4475]: I1203 07:00:58.146036 4475 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f1be585f-cb41-49eb-83b7-9c25c757c739-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "f1be585f-cb41-49eb-83b7-9c25c757c739" (UID: "f1be585f-cb41-49eb-83b7-9c25c757c739"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Dec 03 07:00:58 crc kubenswrapper[4475]: I1203 07:00:58.154426 4475 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/f1be585f-cb41-49eb-83b7-9c25c757c739-ovsdbserver-sb\") on node \"crc\" DevicePath \"\""
Dec 03 07:00:58 crc kubenswrapper[4475]: I1203 07:00:58.154463 4475 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/f1be585f-cb41-49eb-83b7-9c25c757c739-ovsdbserver-nb\") on node \"crc\" DevicePath \"\""
Dec 03 07:00:58 crc kubenswrapper[4475]: I1203 07:00:58.154472 4475 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nq7m2\" (UniqueName: \"kubernetes.io/projected/f1be585f-cb41-49eb-83b7-9c25c757c739-kube-api-access-nq7m2\") on node \"crc\" DevicePath \"\""
Dec 03 07:00:58 crc kubenswrapper[4475]: I1203 07:00:58.156611 4475 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f1be585f-cb41-49eb-83b7-9c25c757c739-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "f1be585f-cb41-49eb-83b7-9c25c757c739" (UID: "f1be585f-cb41-49eb-83b7-9c25c757c739"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Dec 03 07:00:58 crc kubenswrapper[4475]: I1203 07:00:58.189559 4475 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f1be585f-cb41-49eb-83b7-9c25c757c739-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "f1be585f-cb41-49eb-83b7-9c25c757c739" (UID: "f1be585f-cb41-49eb-83b7-9c25c757c739"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Dec 03 07:00:58 crc kubenswrapper[4475]: I1203 07:00:58.189895 4475 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f1be585f-cb41-49eb-83b7-9c25c757c739-config" (OuterVolumeSpecName: "config") pod "f1be585f-cb41-49eb-83b7-9c25c757c739" (UID: "f1be585f-cb41-49eb-83b7-9c25c757c739"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Dec 03 07:00:58 crc kubenswrapper[4475]: I1203 07:00:58.256162 4475 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/f1be585f-cb41-49eb-83b7-9c25c757c739-dns-svc\") on node \"crc\" DevicePath \"\""
Dec 03 07:00:58 crc kubenswrapper[4475]: I1203 07:00:58.256193 4475 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f1be585f-cb41-49eb-83b7-9c25c757c739-config\") on node \"crc\" DevicePath \"\""
Dec 03 07:00:58 crc kubenswrapper[4475]: I1203 07:00:58.256203 4475 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/f1be585f-cb41-49eb-83b7-9c25c757c739-dns-swift-storage-0\") on node \"crc\" DevicePath \"\""
Dec 03 07:00:58 crc kubenswrapper[4475]: I1203 07:00:58.404666 4475 generic.go:334] "Generic (PLEG): container finished" podID="cd709367-66b2-4586-b3ba-424d4c1532ee" containerID="ce8ded9ff003a96e9ac2b30b683a2f4293fc745bb0ca965930fe87243b626b07" exitCode=0
Dec 03 07:00:58 crc kubenswrapper[4475]: I1203 07:00:58.404733 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-bk4n9" event={"ID":"cd709367-66b2-4586-b3ba-424d4c1532ee","Type":"ContainerDied","Data":"ce8ded9ff003a96e9ac2b30b683a2f4293fc745bb0ca965930fe87243b626b07"}
Dec 03 07:00:58 crc kubenswrapper[4475]: I1203 07:00:58.406328 4475 generic.go:334] "Generic (PLEG): container finished" podID="f1be585f-cb41-49eb-83b7-9c25c757c739" containerID="1278a78e4bf596a21b64a805c5b53ce6119440830954781498d74ad20293e6ec" exitCode=0
Dec 03 07:00:58 crc kubenswrapper[4475]: I1203 07:00:58.406503 4475 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-scheduler-0" podUID="11718fbb-3775-43f1-bd7f-d1ca4ad5800f" containerName="cinder-scheduler" containerID="cri-o://e1061c69f6fd72bac2694e4e9a5817d79122eb37efffb33a91f8faf43106f785" gracePeriod=30
Dec 03 07:00:58 crc kubenswrapper[4475]: I1203 07:00:58.406739 4475 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6c7bf79b95-knv75"
Dec 03 07:00:58 crc kubenswrapper[4475]: I1203 07:00:58.428837 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6c7bf79b95-knv75" event={"ID":"f1be585f-cb41-49eb-83b7-9c25c757c739","Type":"ContainerDied","Data":"1278a78e4bf596a21b64a805c5b53ce6119440830954781498d74ad20293e6ec"}
Dec 03 07:00:58 crc kubenswrapper[4475]: I1203 07:00:58.428885 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6c7bf79b95-knv75" event={"ID":"f1be585f-cb41-49eb-83b7-9c25c757c739","Type":"ContainerDied","Data":"ff9956979f7edeb7e5276a5a737ccca2b4a4b60a38a8ef787caee182c2db46d2"}
Dec 03 07:00:58 crc kubenswrapper[4475]: I1203 07:00:58.428905 4475 scope.go:117] "RemoveContainer" containerID="1278a78e4bf596a21b64a805c5b53ce6119440830954781498d74ad20293e6ec"
Dec 03 07:00:58 crc kubenswrapper[4475]: I1203 07:00:58.429251 4475 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-scheduler-0" podUID="11718fbb-3775-43f1-bd7f-d1ca4ad5800f" containerName="probe" containerID="cri-o://cc2bc9e356ba4004e9c450a64787a440b294b4548484666184e2df356fa73f9e" gracePeriod=30
Dec 03 07:00:58 crc kubenswrapper[4475]: I1203 07:00:58.481531 4475 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6c7bf79b95-knv75"]
Dec 03 07:00:58 crc kubenswrapper[4475]: I1203 07:00:58.503596 4475 scope.go:117] "RemoveContainer" containerID="69018c9dad70106cc7bbbad69455a45b3a59a6fa572c73a1e224fda5f46821cc"
Dec 03 07:00:58 crc kubenswrapper[4475]: I1203 07:00:58.507111 4475 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-6c7bf79b95-knv75"]
Dec 03 07:00:58 crc kubenswrapper[4475]: I1203 07:00:58.598159 4475 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/horizon-695fd7c4bb-h85zh"
Dec 03 07:00:58 crc kubenswrapper[4475]: I1203 07:00:58.605161 4475 scope.go:117] "RemoveContainer" containerID="1278a78e4bf596a21b64a805c5b53ce6119440830954781498d74ad20293e6ec"
Dec 03 07:00:58 crc kubenswrapper[4475]: E1203 07:00:58.606418 4475 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1278a78e4bf596a21b64a805c5b53ce6119440830954781498d74ad20293e6ec\": container with ID starting with 1278a78e4bf596a21b64a805c5b53ce6119440830954781498d74ad20293e6ec not found: ID does not exist" containerID="1278a78e4bf596a21b64a805c5b53ce6119440830954781498d74ad20293e6ec"
Dec 03 07:00:58 crc kubenswrapper[4475]: I1203 07:00:58.606475 4475 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1278a78e4bf596a21b64a805c5b53ce6119440830954781498d74ad20293e6ec"} err="failed to get container status \"1278a78e4bf596a21b64a805c5b53ce6119440830954781498d74ad20293e6ec\": rpc error: code = NotFound desc = could not find container \"1278a78e4bf596a21b64a805c5b53ce6119440830954781498d74ad20293e6ec\": container with ID starting with 1278a78e4bf596a21b64a805c5b53ce6119440830954781498d74ad20293e6ec not found: ID does not exist"
Dec 03 07:00:58 crc kubenswrapper[4475]: I1203 07:00:58.606497 4475 scope.go:117] "RemoveContainer" containerID="69018c9dad70106cc7bbbad69455a45b3a59a6fa572c73a1e224fda5f46821cc"
Dec 03 07:00:58 crc kubenswrapper[4475]: E1203 07:00:58.606888 4475 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"69018c9dad70106cc7bbbad69455a45b3a59a6fa572c73a1e224fda5f46821cc\": container with ID starting with 69018c9dad70106cc7bbbad69455a45b3a59a6fa572c73a1e224fda5f46821cc not found: ID does not exist" containerID="69018c9dad70106cc7bbbad69455a45b3a59a6fa572c73a1e224fda5f46821cc"
Dec 03 07:00:58 crc kubenswrapper[4475]: I1203 07:00:58.606921 4475 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"69018c9dad70106cc7bbbad69455a45b3a59a6fa572c73a1e224fda5f46821cc"} err="failed to get container status \"69018c9dad70106cc7bbbad69455a45b3a59a6fa572c73a1e224fda5f46821cc\": rpc error: code = NotFound desc = could not find container \"69018c9dad70106cc7bbbad69455a45b3a59a6fa572c73a1e224fda5f46821cc\": container with ID starting with 69018c9dad70106cc7bbbad69455a45b3a59a6fa572c73a1e224fda5f46821cc not found: ID does not exist"
Dec 03 07:00:58 crc kubenswrapper[4475]: I1203 07:00:58.622138 4475 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-bk4n9"
Dec 03 07:00:58 crc kubenswrapper[4475]: I1203 07:00:58.661927 4475 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nr2x4\" (UniqueName: \"kubernetes.io/projected/cd709367-66b2-4586-b3ba-424d4c1532ee-kube-api-access-nr2x4\") pod \"cd709367-66b2-4586-b3ba-424d4c1532ee\" (UID: \"cd709367-66b2-4586-b3ba-424d4c1532ee\") "
Dec 03 07:00:58 crc kubenswrapper[4475]: I1203 07:00:58.662015 4475 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/cd709367-66b2-4586-b3ba-424d4c1532ee-utilities\") pod \"cd709367-66b2-4586-b3ba-424d4c1532ee\" (UID: \"cd709367-66b2-4586-b3ba-424d4c1532ee\") "
Dec 03 07:00:58 crc kubenswrapper[4475]: I1203 07:00:58.662147 4475 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/cd709367-66b2-4586-b3ba-424d4c1532ee-catalog-content\") pod \"cd709367-66b2-4586-b3ba-424d4c1532ee\" (UID: \"cd709367-66b2-4586-b3ba-424d4c1532ee\") "
Dec 03 07:00:58 crc kubenswrapper[4475]: I1203 07:00:58.663438 4475 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/cd709367-66b2-4586-b3ba-424d4c1532ee-utilities" (OuterVolumeSpecName: "utilities") pod "cd709367-66b2-4586-b3ba-424d4c1532ee" (UID: "cd709367-66b2-4586-b3ba-424d4c1532ee"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Dec 03 07:00:58 crc kubenswrapper[4475]: I1203 07:00:58.699905 4475 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cd709367-66b2-4586-b3ba-424d4c1532ee-kube-api-access-nr2x4" (OuterVolumeSpecName: "kube-api-access-nr2x4") pod "cd709367-66b2-4586-b3ba-424d4c1532ee" (UID: "cd709367-66b2-4586-b3ba-424d4c1532ee"). InnerVolumeSpecName "kube-api-access-nr2x4". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 03 07:00:58 crc kubenswrapper[4475]: I1203 07:00:58.764463 4475 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nr2x4\" (UniqueName: \"kubernetes.io/projected/cd709367-66b2-4586-b3ba-424d4c1532ee-kube-api-access-nr2x4\") on node \"crc\" DevicePath \"\""
Dec 03 07:00:58 crc kubenswrapper[4475]: I1203 07:00:58.764566 4475 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/cd709367-66b2-4586-b3ba-424d4c1532ee-utilities\") on node \"crc\" DevicePath \"\""
Dec 03 07:00:58 crc kubenswrapper[4475]: I1203 07:00:58.773460 4475 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/cd709367-66b2-4586-b3ba-424d4c1532ee-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "cd709367-66b2-4586-b3ba-424d4c1532ee" (UID: "cd709367-66b2-4586-b3ba-424d4c1532ee"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Dec 03 07:00:58 crc kubenswrapper[4475]: I1203 07:00:58.852818 4475 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/horizon-7fc4d79b88-s8hhg"
Dec 03 07:00:58 crc kubenswrapper[4475]: I1203 07:00:58.865866 4475 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/cd709367-66b2-4586-b3ba-424d4c1532ee-catalog-content\") on node \"crc\" DevicePath \"\""
Dec 03 07:00:59 crc kubenswrapper[4475]: I1203 07:00:59.418842 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-bk4n9" event={"ID":"cd709367-66b2-4586-b3ba-424d4c1532ee","Type":"ContainerDied","Data":"7156319f63f2ea4fbb7acd862764bb6b088a47ff06d9e5d9c4af50d1d1a7939c"}
Dec 03 07:00:59 crc kubenswrapper[4475]: I1203 07:00:59.418862 4475 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-bk4n9"
Dec 03 07:00:59 crc kubenswrapper[4475]: I1203 07:00:59.419132 4475 scope.go:117] "RemoveContainer" containerID="ce8ded9ff003a96e9ac2b30b683a2f4293fc745bb0ca965930fe87243b626b07"
Dec 03 07:00:59 crc kubenswrapper[4475]: I1203 07:00:59.426023 4475 generic.go:334] "Generic (PLEG): container finished" podID="11718fbb-3775-43f1-bd7f-d1ca4ad5800f" containerID="cc2bc9e356ba4004e9c450a64787a440b294b4548484666184e2df356fa73f9e" exitCode=0
Dec 03 07:00:59 crc kubenswrapper[4475]: I1203 07:00:59.426061 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"11718fbb-3775-43f1-bd7f-d1ca4ad5800f","Type":"ContainerDied","Data":"cc2bc9e356ba4004e9c450a64787a440b294b4548484666184e2df356fa73f9e"}
Dec 03 07:00:59 crc kubenswrapper[4475]: I1203 07:00:59.472496 4475 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-bk4n9"]
Dec 03 07:00:59 crc kubenswrapper[4475]: I1203 07:00:59.486997 4475 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-bk4n9"]
Dec 03 07:00:59 crc kubenswrapper[4475]: I1203 07:00:59.507957 4475 scope.go:117] "RemoveContainer" containerID="fd967aeb76da537671c4a993553a49caeb764bac88e302799f05bec146346a03"
Dec 03 07:00:59 crc kubenswrapper[4475]: I1203 07:00:59.513657 4475 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cd709367-66b2-4586-b3ba-424d4c1532ee" path="/var/lib/kubelet/pods/cd709367-66b2-4586-b3ba-424d4c1532ee/volumes"
Dec 03 07:00:59 crc kubenswrapper[4475]: I1203 07:00:59.514521 4475 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f1be585f-cb41-49eb-83b7-9c25c757c739" path="/var/lib/kubelet/pods/f1be585f-cb41-49eb-83b7-9c25c757c739/volumes"
Dec 03 07:00:59 crc kubenswrapper[4475]: I1203 07:00:59.532952 4475 scope.go:117] "RemoveContainer" containerID="ab67e3437a2127861169da71e13cbf727667933c83585ef921517ca855c8e2c9"
Dec 03 07:01:00 crc kubenswrapper[4475]: I1203 07:01:00.011814 4475 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/barbican-api-74bf7b4cc4-lpw7v"
Dec 03 07:01:00 crc kubenswrapper[4475]: I1203 07:01:00.129508 4475 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-cron-29412421-4bc7f"]
Dec 03 07:01:00 crc kubenswrapper[4475]: E1203 07:01:00.129811 4475 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cd709367-66b2-4586-b3ba-424d4c1532ee" containerName="extract-utilities"
Dec 03 07:01:00 crc kubenswrapper[4475]: I1203 07:01:00.129827 4475 state_mem.go:107] "Deleted CPUSet assignment" podUID="cd709367-66b2-4586-b3ba-424d4c1532ee" containerName="extract-utilities"
Dec 03 07:01:00 crc kubenswrapper[4475]: E1203 07:01:00.129852 4475 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f1be585f-cb41-49eb-83b7-9c25c757c739" containerName="dnsmasq-dns"
Dec 03 07:01:00 crc kubenswrapper[4475]: I1203 07:01:00.129857 4475 state_mem.go:107] "Deleted CPUSet assignment" podUID="f1be585f-cb41-49eb-83b7-9c25c757c739" containerName="dnsmasq-dns"
Dec 03 07:01:00 crc kubenswrapper[4475]: E1203 07:01:00.129866 4475 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cd709367-66b2-4586-b3ba-424d4c1532ee" containerName="extract-content"
Dec 03 07:01:00 crc kubenswrapper[4475]: I1203 07:01:00.129871 4475 state_mem.go:107] "Deleted CPUSet assignment" podUID="cd709367-66b2-4586-b3ba-424d4c1532ee" containerName="extract-content"
Dec 03 07:01:00 crc kubenswrapper[4475]: E1203 07:01:00.129883 4475 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f1be585f-cb41-49eb-83b7-9c25c757c739" containerName="init"
Dec 03 07:01:00 crc kubenswrapper[4475]: I1203 07:01:00.129888 4475 state_mem.go:107] "Deleted CPUSet assignment" podUID="f1be585f-cb41-49eb-83b7-9c25c757c739" containerName="init"
Dec 03 07:01:00 crc kubenswrapper[4475]: E1203 07:01:00.129901 4475 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e786a238-51fe-464f-bcc8-54d35b24e9cf" containerName="horizon-log"
Dec 03 07:01:00 crc kubenswrapper[4475]: I1203 07:01:00.129906 4475 state_mem.go:107] "Deleted CPUSet assignment" podUID="e786a238-51fe-464f-bcc8-54d35b24e9cf" containerName="horizon-log"
Dec 03 07:01:00 crc kubenswrapper[4475]: E1203 07:01:00.129913 4475 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e786a238-51fe-464f-bcc8-54d35b24e9cf" containerName="horizon"
Dec 03 07:01:00 crc kubenswrapper[4475]: I1203 07:01:00.129918 4475 state_mem.go:107] "Deleted CPUSet assignment" podUID="e786a238-51fe-464f-bcc8-54d35b24e9cf" containerName="horizon"
Dec 03 07:01:00 crc kubenswrapper[4475]: E1203 07:01:00.129928 4475 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cd709367-66b2-4586-b3ba-424d4c1532ee" containerName="registry-server"
Dec 03 07:01:00 crc kubenswrapper[4475]: I1203 07:01:00.129934 4475 state_mem.go:107] "Deleted CPUSet assignment" podUID="cd709367-66b2-4586-b3ba-424d4c1532ee" containerName="registry-server"
Dec 03 07:01:00 crc kubenswrapper[4475]: I1203 07:01:00.130088 4475 memory_manager.go:354] "RemoveStaleState removing state" podUID="e786a238-51fe-464f-bcc8-54d35b24e9cf" containerName="horizon"
Dec 03 07:01:00 crc kubenswrapper[4475]: I1203 07:01:00.130105 4475 memory_manager.go:354] "RemoveStaleState removing state" podUID="f1be585f-cb41-49eb-83b7-9c25c757c739" containerName="dnsmasq-dns"
Dec 03 07:01:00 crc kubenswrapper[4475]: I1203 07:01:00.130113 4475 memory_manager.go:354] "RemoveStaleState removing state" podUID="cd709367-66b2-4586-b3ba-424d4c1532ee" containerName="registry-server"
Dec 03 07:01:00 crc kubenswrapper[4475]: I1203 07:01:00.130120 4475 memory_manager.go:354] "RemoveStaleState removing state" podUID="e786a238-51fe-464f-bcc8-54d35b24e9cf" containerName="horizon-log"
Dec 03 07:01:00 crc kubenswrapper[4475]: I1203 07:01:00.130645 4475 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-cron-29412421-4bc7f"
Dec 03 07:01:00 crc kubenswrapper[4475]: I1203 07:01:00.137918 4475 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-cron-29412421-4bc7f"]
Dec 03 07:01:00 crc kubenswrapper[4475]: I1203 07:01:00.196356 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3f477cd8-8d33-4a1f-873c-ffde6c87149e-config-data\") pod \"keystone-cron-29412421-4bc7f\" (UID: \"3f477cd8-8d33-4a1f-873c-ffde6c87149e\") " pod="openstack/keystone-cron-29412421-4bc7f"
Dec 03 07:01:00 crc kubenswrapper[4475]: I1203 07:01:00.196444 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3f477cd8-8d33-4a1f-873c-ffde6c87149e-combined-ca-bundle\") pod \"keystone-cron-29412421-4bc7f\" (UID: \"3f477cd8-8d33-4a1f-873c-ffde6c87149e\") " pod="openstack/keystone-cron-29412421-4bc7f"
Dec 03 07:01:00 crc kubenswrapper[4475]: I1203 07:01:00.196589 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/3f477cd8-8d33-4a1f-873c-ffde6c87149e-fernet-keys\") pod \"keystone-cron-29412421-4bc7f\" (UID: \"3f477cd8-8d33-4a1f-873c-ffde6c87149e\") " pod="openstack/keystone-cron-29412421-4bc7f"
Dec 03 07:01:00 crc kubenswrapper[4475]: I1203 07:01:00.196755 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n5v6v\" (UniqueName: \"kubernetes.io/projected/3f477cd8-8d33-4a1f-873c-ffde6c87149e-kube-api-access-n5v6v\") pod \"keystone-cron-29412421-4bc7f\" (UID: \"3f477cd8-8d33-4a1f-873c-ffde6c87149e\") " pod="openstack/keystone-cron-29412421-4bc7f"
Dec 03 07:01:00 crc kubenswrapper[4475]: I1203
07:01:00.298184 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3f477cd8-8d33-4a1f-873c-ffde6c87149e-combined-ca-bundle\") pod \"keystone-cron-29412421-4bc7f\" (UID: \"3f477cd8-8d33-4a1f-873c-ffde6c87149e\") " pod="openstack/keystone-cron-29412421-4bc7f" Dec 03 07:01:00 crc kubenswrapper[4475]: I1203 07:01:00.298251 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/3f477cd8-8d33-4a1f-873c-ffde6c87149e-fernet-keys\") pod \"keystone-cron-29412421-4bc7f\" (UID: \"3f477cd8-8d33-4a1f-873c-ffde6c87149e\") " pod="openstack/keystone-cron-29412421-4bc7f" Dec 03 07:01:00 crc kubenswrapper[4475]: I1203 07:01:00.298308 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-n5v6v\" (UniqueName: \"kubernetes.io/projected/3f477cd8-8d33-4a1f-873c-ffde6c87149e-kube-api-access-n5v6v\") pod \"keystone-cron-29412421-4bc7f\" (UID: \"3f477cd8-8d33-4a1f-873c-ffde6c87149e\") " pod="openstack/keystone-cron-29412421-4bc7f" Dec 03 07:01:00 crc kubenswrapper[4475]: I1203 07:01:00.298366 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3f477cd8-8d33-4a1f-873c-ffde6c87149e-config-data\") pod \"keystone-cron-29412421-4bc7f\" (UID: \"3f477cd8-8d33-4a1f-873c-ffde6c87149e\") " pod="openstack/keystone-cron-29412421-4bc7f" Dec 03 07:01:00 crc kubenswrapper[4475]: I1203 07:01:00.303151 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3f477cd8-8d33-4a1f-873c-ffde6c87149e-combined-ca-bundle\") pod \"keystone-cron-29412421-4bc7f\" (UID: \"3f477cd8-8d33-4a1f-873c-ffde6c87149e\") " pod="openstack/keystone-cron-29412421-4bc7f" Dec 03 07:01:00 crc kubenswrapper[4475]: I1203 07:01:00.306251 4475 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3f477cd8-8d33-4a1f-873c-ffde6c87149e-config-data\") pod \"keystone-cron-29412421-4bc7f\" (UID: \"3f477cd8-8d33-4a1f-873c-ffde6c87149e\") " pod="openstack/keystone-cron-29412421-4bc7f" Dec 03 07:01:00 crc kubenswrapper[4475]: I1203 07:01:00.307025 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/3f477cd8-8d33-4a1f-873c-ffde6c87149e-fernet-keys\") pod \"keystone-cron-29412421-4bc7f\" (UID: \"3f477cd8-8d33-4a1f-873c-ffde6c87149e\") " pod="openstack/keystone-cron-29412421-4bc7f" Dec 03 07:01:00 crc kubenswrapper[4475]: I1203 07:01:00.320015 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-n5v6v\" (UniqueName: \"kubernetes.io/projected/3f477cd8-8d33-4a1f-873c-ffde6c87149e-kube-api-access-n5v6v\") pod \"keystone-cron-29412421-4bc7f\" (UID: \"3f477cd8-8d33-4a1f-873c-ffde6c87149e\") " pod="openstack/keystone-cron-29412421-4bc7f" Dec 03 07:01:00 crc kubenswrapper[4475]: I1203 07:01:00.445918 4475 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-cron-29412421-4bc7f" Dec 03 07:01:00 crc kubenswrapper[4475]: I1203 07:01:00.812282 4475 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/barbican-api-74bf7b4cc4-lpw7v" Dec 03 07:01:00 crc kubenswrapper[4475]: I1203 07:01:00.994022 4475 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-cron-29412421-4bc7f"] Dec 03 07:01:01 crc kubenswrapper[4475]: I1203 07:01:01.061294 4475 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-scheduler-0" Dec 03 07:01:01 crc kubenswrapper[4475]: I1203 07:01:01.119251 4475 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/11718fbb-3775-43f1-bd7f-d1ca4ad5800f-config-data-custom\") pod \"11718fbb-3775-43f1-bd7f-d1ca4ad5800f\" (UID: \"11718fbb-3775-43f1-bd7f-d1ca4ad5800f\") " Dec 03 07:01:01 crc kubenswrapper[4475]: I1203 07:01:01.119468 4475 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/11718fbb-3775-43f1-bd7f-d1ca4ad5800f-combined-ca-bundle\") pod \"11718fbb-3775-43f1-bd7f-d1ca4ad5800f\" (UID: \"11718fbb-3775-43f1-bd7f-d1ca4ad5800f\") " Dec 03 07:01:01 crc kubenswrapper[4475]: I1203 07:01:01.119526 4475 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/11718fbb-3775-43f1-bd7f-d1ca4ad5800f-etc-machine-id\") pod \"11718fbb-3775-43f1-bd7f-d1ca4ad5800f\" (UID: \"11718fbb-3775-43f1-bd7f-d1ca4ad5800f\") " Dec 03 07:01:01 crc kubenswrapper[4475]: I1203 07:01:01.119574 4475 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/11718fbb-3775-43f1-bd7f-d1ca4ad5800f-scripts\") pod \"11718fbb-3775-43f1-bd7f-d1ca4ad5800f\" (UID: \"11718fbb-3775-43f1-bd7f-d1ca4ad5800f\") " Dec 03 07:01:01 crc kubenswrapper[4475]: I1203 07:01:01.119609 4475 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rgbkn\" (UniqueName: \"kubernetes.io/projected/11718fbb-3775-43f1-bd7f-d1ca4ad5800f-kube-api-access-rgbkn\") pod \"11718fbb-3775-43f1-bd7f-d1ca4ad5800f\" (UID: \"11718fbb-3775-43f1-bd7f-d1ca4ad5800f\") " Dec 03 07:01:01 crc kubenswrapper[4475]: I1203 07:01:01.119705 4475 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" 
(UniqueName: \"kubernetes.io/secret/11718fbb-3775-43f1-bd7f-d1ca4ad5800f-config-data\") pod \"11718fbb-3775-43f1-bd7f-d1ca4ad5800f\" (UID: \"11718fbb-3775-43f1-bd7f-d1ca4ad5800f\") " Dec 03 07:01:01 crc kubenswrapper[4475]: I1203 07:01:01.120918 4475 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/11718fbb-3775-43f1-bd7f-d1ca4ad5800f-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "11718fbb-3775-43f1-bd7f-d1ca4ad5800f" (UID: "11718fbb-3775-43f1-bd7f-d1ca4ad5800f"). InnerVolumeSpecName "etc-machine-id". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 03 07:01:01 crc kubenswrapper[4475]: I1203 07:01:01.150771 4475 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/11718fbb-3775-43f1-bd7f-d1ca4ad5800f-scripts" (OuterVolumeSpecName: "scripts") pod "11718fbb-3775-43f1-bd7f-d1ca4ad5800f" (UID: "11718fbb-3775-43f1-bd7f-d1ca4ad5800f"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 07:01:01 crc kubenswrapper[4475]: I1203 07:01:01.150996 4475 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/11718fbb-3775-43f1-bd7f-d1ca4ad5800f-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "11718fbb-3775-43f1-bd7f-d1ca4ad5800f" (UID: "11718fbb-3775-43f1-bd7f-d1ca4ad5800f"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 07:01:01 crc kubenswrapper[4475]: I1203 07:01:01.158925 4475 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/11718fbb-3775-43f1-bd7f-d1ca4ad5800f-kube-api-access-rgbkn" (OuterVolumeSpecName: "kube-api-access-rgbkn") pod "11718fbb-3775-43f1-bd7f-d1ca4ad5800f" (UID: "11718fbb-3775-43f1-bd7f-d1ca4ad5800f"). InnerVolumeSpecName "kube-api-access-rgbkn". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 07:01:01 crc kubenswrapper[4475]: I1203 07:01:01.187271 4475 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/11718fbb-3775-43f1-bd7f-d1ca4ad5800f-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "11718fbb-3775-43f1-bd7f-d1ca4ad5800f" (UID: "11718fbb-3775-43f1-bd7f-d1ca4ad5800f"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 07:01:01 crc kubenswrapper[4475]: I1203 07:01:01.221164 4475 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/11718fbb-3775-43f1-bd7f-d1ca4ad5800f-config-data-custom\") on node \"crc\" DevicePath \"\"" Dec 03 07:01:01 crc kubenswrapper[4475]: I1203 07:01:01.221189 4475 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/11718fbb-3775-43f1-bd7f-d1ca4ad5800f-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 03 07:01:01 crc kubenswrapper[4475]: I1203 07:01:01.221201 4475 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/11718fbb-3775-43f1-bd7f-d1ca4ad5800f-etc-machine-id\") on node \"crc\" DevicePath \"\"" Dec 03 07:01:01 crc kubenswrapper[4475]: I1203 07:01:01.221209 4475 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/11718fbb-3775-43f1-bd7f-d1ca4ad5800f-scripts\") on node \"crc\" DevicePath \"\"" Dec 03 07:01:01 crc kubenswrapper[4475]: I1203 07:01:01.221218 4475 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rgbkn\" (UniqueName: \"kubernetes.io/projected/11718fbb-3775-43f1-bd7f-d1ca4ad5800f-kube-api-access-rgbkn\") on node \"crc\" DevicePath \"\"" Dec 03 07:01:01 crc kubenswrapper[4475]: I1203 07:01:01.246567 4475 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/secret/11718fbb-3775-43f1-bd7f-d1ca4ad5800f-config-data" (OuterVolumeSpecName: "config-data") pod "11718fbb-3775-43f1-bd7f-d1ca4ad5800f" (UID: "11718fbb-3775-43f1-bd7f-d1ca4ad5800f"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 07:01:01 crc kubenswrapper[4475]: I1203 07:01:01.297858 4475 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/horizon-7fc4d79b88-s8hhg" Dec 03 07:01:01 crc kubenswrapper[4475]: I1203 07:01:01.322900 4475 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/11718fbb-3775-43f1-bd7f-d1ca4ad5800f-config-data\") on node \"crc\" DevicePath \"\"" Dec 03 07:01:01 crc kubenswrapper[4475]: I1203 07:01:01.414271 4475 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/horizon-695fd7c4bb-h85zh" Dec 03 07:01:01 crc kubenswrapper[4475]: I1203 07:01:01.444589 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29412421-4bc7f" event={"ID":"3f477cd8-8d33-4a1f-873c-ffde6c87149e","Type":"ContainerStarted","Data":"b3a970bae80e8e641feab30506b9e3514c12050a3d6c5e368d0ca85e3c6f5b59"} Dec 03 07:01:01 crc kubenswrapper[4475]: I1203 07:01:01.445051 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29412421-4bc7f" event={"ID":"3f477cd8-8d33-4a1f-873c-ffde6c87149e","Type":"ContainerStarted","Data":"a255340f9cb413a299d24f243165a58025c75cb5f2ea5f1f3515e732358d8648"} Dec 03 07:01:01 crc kubenswrapper[4475]: I1203 07:01:01.446500 4475 generic.go:334] "Generic (PLEG): container finished" podID="11718fbb-3775-43f1-bd7f-d1ca4ad5800f" containerID="e1061c69f6fd72bac2694e4e9a5817d79122eb37efffb33a91f8faf43106f785" exitCode=0 Dec 03 07:01:01 crc kubenswrapper[4475]: I1203 07:01:01.446529 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" 
event={"ID":"11718fbb-3775-43f1-bd7f-d1ca4ad5800f","Type":"ContainerDied","Data":"e1061c69f6fd72bac2694e4e9a5817d79122eb37efffb33a91f8faf43106f785"} Dec 03 07:01:01 crc kubenswrapper[4475]: I1203 07:01:01.446545 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"11718fbb-3775-43f1-bd7f-d1ca4ad5800f","Type":"ContainerDied","Data":"8b9223166148769c287778289f2bc7f4f701bff862557152cf85cefee0a07700"} Dec 03 07:01:01 crc kubenswrapper[4475]: I1203 07:01:01.446560 4475 scope.go:117] "RemoveContainer" containerID="cc2bc9e356ba4004e9c450a64787a440b294b4548484666184e2df356fa73f9e" Dec 03 07:01:01 crc kubenswrapper[4475]: I1203 07:01:01.446640 4475 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0" Dec 03 07:01:01 crc kubenswrapper[4475]: I1203 07:01:01.467409 4475 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-7fc4d79b88-s8hhg"] Dec 03 07:01:01 crc kubenswrapper[4475]: I1203 07:01:01.467588 4475 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/horizon-7fc4d79b88-s8hhg" podUID="2401beb9-38b8-4581-b9a2-8bb16e15e6c1" containerName="horizon-log" containerID="cri-o://73f4e1b7c5c20603207f1f81c578b3f50cf378e84d17d25eb6f5c11f11d4ea02" gracePeriod=30 Dec 03 07:01:01 crc kubenswrapper[4475]: I1203 07:01:01.467698 4475 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/horizon-7fc4d79b88-s8hhg" podUID="2401beb9-38b8-4581-b9a2-8bb16e15e6c1" containerName="horizon" containerID="cri-o://7c91bf42ad95717e038e9fab87109b6eaa62ab56bd4ac92990e6946b07c3fc2f" gracePeriod=30 Dec 03 07:01:01 crc kubenswrapper[4475]: I1203 07:01:01.483442 4475 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-cron-29412421-4bc7f" podStartSLOduration=1.4834272020000001 podStartE2EDuration="1.483427202s" podCreationTimestamp="2025-12-03 07:01:00 +0000 UTC" 
firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 07:01:01.477357774 +0000 UTC m=+946.282256109" watchObservedRunningTime="2025-12-03 07:01:01.483427202 +0000 UTC m=+946.288325536" Dec 03 07:01:01 crc kubenswrapper[4475]: I1203 07:01:01.499568 4475 scope.go:117] "RemoveContainer" containerID="e1061c69f6fd72bac2694e4e9a5817d79122eb37efffb33a91f8faf43106f785" Dec 03 07:01:01 crc kubenswrapper[4475]: I1203 07:01:01.532852 4475 scope.go:117] "RemoveContainer" containerID="cc2bc9e356ba4004e9c450a64787a440b294b4548484666184e2df356fa73f9e" Dec 03 07:01:01 crc kubenswrapper[4475]: E1203 07:01:01.547294 4475 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"cc2bc9e356ba4004e9c450a64787a440b294b4548484666184e2df356fa73f9e\": container with ID starting with cc2bc9e356ba4004e9c450a64787a440b294b4548484666184e2df356fa73f9e not found: ID does not exist" containerID="cc2bc9e356ba4004e9c450a64787a440b294b4548484666184e2df356fa73f9e" Dec 03 07:01:01 crc kubenswrapper[4475]: I1203 07:01:01.547620 4475 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cc2bc9e356ba4004e9c450a64787a440b294b4548484666184e2df356fa73f9e"} err="failed to get container status \"cc2bc9e356ba4004e9c450a64787a440b294b4548484666184e2df356fa73f9e\": rpc error: code = NotFound desc = could not find container \"cc2bc9e356ba4004e9c450a64787a440b294b4548484666184e2df356fa73f9e\": container with ID starting with cc2bc9e356ba4004e9c450a64787a440b294b4548484666184e2df356fa73f9e not found: ID does not exist" Dec 03 07:01:01 crc kubenswrapper[4475]: I1203 07:01:01.547717 4475 scope.go:117] "RemoveContainer" containerID="e1061c69f6fd72bac2694e4e9a5817d79122eb37efffb33a91f8faf43106f785" Dec 03 07:01:01 crc kubenswrapper[4475]: E1203 07:01:01.548697 4475 log.go:32] "ContainerStatus from runtime service failed" err="rpc 
error: code = NotFound desc = could not find container \"e1061c69f6fd72bac2694e4e9a5817d79122eb37efffb33a91f8faf43106f785\": container with ID starting with e1061c69f6fd72bac2694e4e9a5817d79122eb37efffb33a91f8faf43106f785 not found: ID does not exist" containerID="e1061c69f6fd72bac2694e4e9a5817d79122eb37efffb33a91f8faf43106f785" Dec 03 07:01:01 crc kubenswrapper[4475]: I1203 07:01:01.548808 4475 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e1061c69f6fd72bac2694e4e9a5817d79122eb37efffb33a91f8faf43106f785"} err="failed to get container status \"e1061c69f6fd72bac2694e4e9a5817d79122eb37efffb33a91f8faf43106f785\": rpc error: code = NotFound desc = could not find container \"e1061c69f6fd72bac2694e4e9a5817d79122eb37efffb33a91f8faf43106f785\": container with ID starting with e1061c69f6fd72bac2694e4e9a5817d79122eb37efffb33a91f8faf43106f785 not found: ID does not exist" Dec 03 07:01:01 crc kubenswrapper[4475]: I1203 07:01:01.549063 4475 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-scheduler-0"] Dec 03 07:01:01 crc kubenswrapper[4475]: I1203 07:01:01.549096 4475 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-scheduler-0"] Dec 03 07:01:01 crc kubenswrapper[4475]: I1203 07:01:01.553047 4475 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-scheduler-0"] Dec 03 07:01:01 crc kubenswrapper[4475]: E1203 07:01:01.553378 4475 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="11718fbb-3775-43f1-bd7f-d1ca4ad5800f" containerName="cinder-scheduler" Dec 03 07:01:01 crc kubenswrapper[4475]: I1203 07:01:01.553391 4475 state_mem.go:107] "Deleted CPUSet assignment" podUID="11718fbb-3775-43f1-bd7f-d1ca4ad5800f" containerName="cinder-scheduler" Dec 03 07:01:01 crc kubenswrapper[4475]: E1203 07:01:01.553418 4475 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="11718fbb-3775-43f1-bd7f-d1ca4ad5800f" containerName="probe" Dec 03 07:01:01 crc 
kubenswrapper[4475]: I1203 07:01:01.553425 4475 state_mem.go:107] "Deleted CPUSet assignment" podUID="11718fbb-3775-43f1-bd7f-d1ca4ad5800f" containerName="probe" Dec 03 07:01:01 crc kubenswrapper[4475]: I1203 07:01:01.553660 4475 memory_manager.go:354] "RemoveStaleState removing state" podUID="11718fbb-3775-43f1-bd7f-d1ca4ad5800f" containerName="probe" Dec 03 07:01:01 crc kubenswrapper[4475]: I1203 07:01:01.553677 4475 memory_manager.go:354] "RemoveStaleState removing state" podUID="11718fbb-3775-43f1-bd7f-d1ca4ad5800f" containerName="cinder-scheduler" Dec 03 07:01:01 crc kubenswrapper[4475]: I1203 07:01:01.554469 4475 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0" Dec 03 07:01:01 crc kubenswrapper[4475]: I1203 07:01:01.557275 4475 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scheduler-config-data" Dec 03 07:01:01 crc kubenswrapper[4475]: I1203 07:01:01.580207 4475 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"] Dec 03 07:01:01 crc kubenswrapper[4475]: I1203 07:01:01.627628 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c0228bec-bc1d-4676-bd15-4cf4bdfa604a-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"c0228bec-bc1d-4676-bd15-4cf4bdfa604a\") " pod="openstack/cinder-scheduler-0" Dec 03 07:01:01 crc kubenswrapper[4475]: I1203 07:01:01.627673 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/c0228bec-bc1d-4676-bd15-4cf4bdfa604a-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"c0228bec-bc1d-4676-bd15-4cf4bdfa604a\") " pod="openstack/cinder-scheduler-0" Dec 03 07:01:01 crc kubenswrapper[4475]: I1203 07:01:01.627717 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started 
for volume \"kube-api-access-pn5pk\" (UniqueName: \"kubernetes.io/projected/c0228bec-bc1d-4676-bd15-4cf4bdfa604a-kube-api-access-pn5pk\") pod \"cinder-scheduler-0\" (UID: \"c0228bec-bc1d-4676-bd15-4cf4bdfa604a\") " pod="openstack/cinder-scheduler-0" Dec 03 07:01:01 crc kubenswrapper[4475]: I1203 07:01:01.627740 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c0228bec-bc1d-4676-bd15-4cf4bdfa604a-scripts\") pod \"cinder-scheduler-0\" (UID: \"c0228bec-bc1d-4676-bd15-4cf4bdfa604a\") " pod="openstack/cinder-scheduler-0" Dec 03 07:01:01 crc kubenswrapper[4475]: I1203 07:01:01.627759 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c0228bec-bc1d-4676-bd15-4cf4bdfa604a-config-data\") pod \"cinder-scheduler-0\" (UID: \"c0228bec-bc1d-4676-bd15-4cf4bdfa604a\") " pod="openstack/cinder-scheduler-0" Dec 03 07:01:01 crc kubenswrapper[4475]: I1203 07:01:01.627800 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/c0228bec-bc1d-4676-bd15-4cf4bdfa604a-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"c0228bec-bc1d-4676-bd15-4cf4bdfa604a\") " pod="openstack/cinder-scheduler-0" Dec 03 07:01:01 crc kubenswrapper[4475]: I1203 07:01:01.729609 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c0228bec-bc1d-4676-bd15-4cf4bdfa604a-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"c0228bec-bc1d-4676-bd15-4cf4bdfa604a\") " pod="openstack/cinder-scheduler-0" Dec 03 07:01:01 crc kubenswrapper[4475]: I1203 07:01:01.729656 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: 
\"kubernetes.io/secret/c0228bec-bc1d-4676-bd15-4cf4bdfa604a-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"c0228bec-bc1d-4676-bd15-4cf4bdfa604a\") " pod="openstack/cinder-scheduler-0" Dec 03 07:01:01 crc kubenswrapper[4475]: I1203 07:01:01.729696 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pn5pk\" (UniqueName: \"kubernetes.io/projected/c0228bec-bc1d-4676-bd15-4cf4bdfa604a-kube-api-access-pn5pk\") pod \"cinder-scheduler-0\" (UID: \"c0228bec-bc1d-4676-bd15-4cf4bdfa604a\") " pod="openstack/cinder-scheduler-0" Dec 03 07:01:01 crc kubenswrapper[4475]: I1203 07:01:01.729719 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c0228bec-bc1d-4676-bd15-4cf4bdfa604a-scripts\") pod \"cinder-scheduler-0\" (UID: \"c0228bec-bc1d-4676-bd15-4cf4bdfa604a\") " pod="openstack/cinder-scheduler-0" Dec 03 07:01:01 crc kubenswrapper[4475]: I1203 07:01:01.729745 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c0228bec-bc1d-4676-bd15-4cf4bdfa604a-config-data\") pod \"cinder-scheduler-0\" (UID: \"c0228bec-bc1d-4676-bd15-4cf4bdfa604a\") " pod="openstack/cinder-scheduler-0" Dec 03 07:01:01 crc kubenswrapper[4475]: I1203 07:01:01.729810 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/c0228bec-bc1d-4676-bd15-4cf4bdfa604a-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"c0228bec-bc1d-4676-bd15-4cf4bdfa604a\") " pod="openstack/cinder-scheduler-0" Dec 03 07:01:01 crc kubenswrapper[4475]: I1203 07:01:01.729962 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/c0228bec-bc1d-4676-bd15-4cf4bdfa604a-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"c0228bec-bc1d-4676-bd15-4cf4bdfa604a\") " 
pod="openstack/cinder-scheduler-0" Dec 03 07:01:01 crc kubenswrapper[4475]: I1203 07:01:01.736227 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c0228bec-bc1d-4676-bd15-4cf4bdfa604a-scripts\") pod \"cinder-scheduler-0\" (UID: \"c0228bec-bc1d-4676-bd15-4cf4bdfa604a\") " pod="openstack/cinder-scheduler-0" Dec 03 07:01:01 crc kubenswrapper[4475]: I1203 07:01:01.736298 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/c0228bec-bc1d-4676-bd15-4cf4bdfa604a-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"c0228bec-bc1d-4676-bd15-4cf4bdfa604a\") " pod="openstack/cinder-scheduler-0" Dec 03 07:01:01 crc kubenswrapper[4475]: I1203 07:01:01.736889 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c0228bec-bc1d-4676-bd15-4cf4bdfa604a-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"c0228bec-bc1d-4676-bd15-4cf4bdfa604a\") " pod="openstack/cinder-scheduler-0" Dec 03 07:01:01 crc kubenswrapper[4475]: I1203 07:01:01.739200 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c0228bec-bc1d-4676-bd15-4cf4bdfa604a-config-data\") pod \"cinder-scheduler-0\" (UID: \"c0228bec-bc1d-4676-bd15-4cf4bdfa604a\") " pod="openstack/cinder-scheduler-0" Dec 03 07:01:01 crc kubenswrapper[4475]: I1203 07:01:01.744169 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pn5pk\" (UniqueName: \"kubernetes.io/projected/c0228bec-bc1d-4676-bd15-4cf4bdfa604a-kube-api-access-pn5pk\") pod \"cinder-scheduler-0\" (UID: \"c0228bec-bc1d-4676-bd15-4cf4bdfa604a\") " pod="openstack/cinder-scheduler-0" Dec 03 07:01:01 crc kubenswrapper[4475]: I1203 07:01:01.775877 4475 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openstack/cinder-api-0" Dec 03 07:01:01 crc kubenswrapper[4475]: I1203 07:01:01.876713 4475 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0" Dec 03 07:01:02 crc kubenswrapper[4475]: I1203 07:01:02.441427 4475 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"] Dec 03 07:01:02 crc kubenswrapper[4475]: W1203 07:01:02.442782 4475 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc0228bec_bc1d_4676_bd15_4cf4bdfa604a.slice/crio-ba245a34a428e74db9d47826a058eb47b404ba3ce3d50b7d1def04a8cb1ac996 WatchSource:0}: Error finding container ba245a34a428e74db9d47826a058eb47b404ba3ce3d50b7d1def04a8cb1ac996: Status 404 returned error can't find the container with id ba245a34a428e74db9d47826a058eb47b404ba3ce3d50b7d1def04a8cb1ac996 Dec 03 07:01:02 crc kubenswrapper[4475]: I1203 07:01:02.455637 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"c0228bec-bc1d-4676-bd15-4cf4bdfa604a","Type":"ContainerStarted","Data":"ba245a34a428e74db9d47826a058eb47b404ba3ce3d50b7d1def04a8cb1ac996"} Dec 03 07:01:03 crc kubenswrapper[4475]: I1203 07:01:03.474702 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"c0228bec-bc1d-4676-bd15-4cf4bdfa604a","Type":"ContainerStarted","Data":"960dc0ba468eb971c7b433eef02fdbca29b7a02c71c41e6eae3ba647cdd9335d"} Dec 03 07:01:03 crc kubenswrapper[4475]: I1203 07:01:03.502771 4475 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="11718fbb-3775-43f1-bd7f-d1ca4ad5800f" path="/var/lib/kubelet/pods/11718fbb-3775-43f1-bd7f-d1ca4ad5800f/volumes" Dec 03 07:01:04 crc kubenswrapper[4475]: I1203 07:01:04.483518 4475 generic.go:334] "Generic (PLEG): container finished" podID="3f477cd8-8d33-4a1f-873c-ffde6c87149e" 
containerID="b3a970bae80e8e641feab30506b9e3514c12050a3d6c5e368d0ca85e3c6f5b59" exitCode=0 Dec 03 07:01:04 crc kubenswrapper[4475]: I1203 07:01:04.483600 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29412421-4bc7f" event={"ID":"3f477cd8-8d33-4a1f-873c-ffde6c87149e","Type":"ContainerDied","Data":"b3a970bae80e8e641feab30506b9e3514c12050a3d6c5e368d0ca85e3c6f5b59"} Dec 03 07:01:04 crc kubenswrapper[4475]: I1203 07:01:04.485555 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"c0228bec-bc1d-4676-bd15-4cf4bdfa604a","Type":"ContainerStarted","Data":"f7e6df7a6f4fe065041c2fd6bc360b8df9b9147ddd52afe26c364735f73a9062"} Dec 03 07:01:04 crc kubenswrapper[4475]: I1203 07:01:04.518927 4475 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-scheduler-0" podStartSLOduration=3.518910846 podStartE2EDuration="3.518910846s" podCreationTimestamp="2025-12-03 07:01:01 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 07:01:04.51427176 +0000 UTC m=+949.319170094" watchObservedRunningTime="2025-12-03 07:01:04.518910846 +0000 UTC m=+949.323809180" Dec 03 07:01:04 crc kubenswrapper[4475]: I1203 07:01:04.825207 4475 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/horizon-7fc4d79b88-s8hhg" podUID="2401beb9-38b8-4581-b9a2-8bb16e15e6c1" containerName="horizon" probeResult="failure" output="Get \"https://10.217.0.146:8443/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.0.146:8443: connect: connection refused" Dec 03 07:01:05 crc kubenswrapper[4475]: I1203 07:01:05.494898 4475 generic.go:334] "Generic (PLEG): container finished" podID="2401beb9-38b8-4581-b9a2-8bb16e15e6c1" containerID="7c91bf42ad95717e038e9fab87109b6eaa62ab56bd4ac92990e6946b07c3fc2f" exitCode=0 Dec 03 07:01:05 crc kubenswrapper[4475]: I1203 07:01:05.499923 4475 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openstack/horizon-7fc4d79b88-s8hhg" event={"ID":"2401beb9-38b8-4581-b9a2-8bb16e15e6c1","Type":"ContainerDied","Data":"7c91bf42ad95717e038e9fab87109b6eaa62ab56bd4ac92990e6946b07c3fc2f"} Dec 03 07:01:05 crc kubenswrapper[4475]: I1203 07:01:05.955708 4475 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-cron-29412421-4bc7f" Dec 03 07:01:06 crc kubenswrapper[4475]: I1203 07:01:06.004373 4475 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-n5v6v\" (UniqueName: \"kubernetes.io/projected/3f477cd8-8d33-4a1f-873c-ffde6c87149e-kube-api-access-n5v6v\") pod \"3f477cd8-8d33-4a1f-873c-ffde6c87149e\" (UID: \"3f477cd8-8d33-4a1f-873c-ffde6c87149e\") " Dec 03 07:01:06 crc kubenswrapper[4475]: I1203 07:01:06.004688 4475 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3f477cd8-8d33-4a1f-873c-ffde6c87149e-config-data\") pod \"3f477cd8-8d33-4a1f-873c-ffde6c87149e\" (UID: \"3f477cd8-8d33-4a1f-873c-ffde6c87149e\") " Dec 03 07:01:06 crc kubenswrapper[4475]: I1203 07:01:06.004892 4475 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/3f477cd8-8d33-4a1f-873c-ffde6c87149e-fernet-keys\") pod \"3f477cd8-8d33-4a1f-873c-ffde6c87149e\" (UID: \"3f477cd8-8d33-4a1f-873c-ffde6c87149e\") " Dec 03 07:01:06 crc kubenswrapper[4475]: I1203 07:01:06.004993 4475 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3f477cd8-8d33-4a1f-873c-ffde6c87149e-combined-ca-bundle\") pod \"3f477cd8-8d33-4a1f-873c-ffde6c87149e\" (UID: \"3f477cd8-8d33-4a1f-873c-ffde6c87149e\") " Dec 03 07:01:06 crc kubenswrapper[4475]: I1203 07:01:06.009849 4475 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/projected/3f477cd8-8d33-4a1f-873c-ffde6c87149e-kube-api-access-n5v6v" (OuterVolumeSpecName: "kube-api-access-n5v6v") pod "3f477cd8-8d33-4a1f-873c-ffde6c87149e" (UID: "3f477cd8-8d33-4a1f-873c-ffde6c87149e"). InnerVolumeSpecName "kube-api-access-n5v6v". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 07:01:06 crc kubenswrapper[4475]: I1203 07:01:06.037639 4475 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3f477cd8-8d33-4a1f-873c-ffde6c87149e-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "3f477cd8-8d33-4a1f-873c-ffde6c87149e" (UID: "3f477cd8-8d33-4a1f-873c-ffde6c87149e"). InnerVolumeSpecName "fernet-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 07:01:06 crc kubenswrapper[4475]: I1203 07:01:06.083709 4475 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/barbican-api-c97b994bb-77qkf" Dec 03 07:01:06 crc kubenswrapper[4475]: I1203 07:01:06.101585 4475 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3f477cd8-8d33-4a1f-873c-ffde6c87149e-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "3f477cd8-8d33-4a1f-873c-ffde6c87149e" (UID: "3f477cd8-8d33-4a1f-873c-ffde6c87149e"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 07:01:06 crc kubenswrapper[4475]: I1203 07:01:06.101708 4475 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3f477cd8-8d33-4a1f-873c-ffde6c87149e-config-data" (OuterVolumeSpecName: "config-data") pod "3f477cd8-8d33-4a1f-873c-ffde6c87149e" (UID: "3f477cd8-8d33-4a1f-873c-ffde6c87149e"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 07:01:06 crc kubenswrapper[4475]: I1203 07:01:06.122643 4475 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/3f477cd8-8d33-4a1f-873c-ffde6c87149e-fernet-keys\") on node \"crc\" DevicePath \"\"" Dec 03 07:01:06 crc kubenswrapper[4475]: I1203 07:01:06.122671 4475 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3f477cd8-8d33-4a1f-873c-ffde6c87149e-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 03 07:01:06 crc kubenswrapper[4475]: I1203 07:01:06.122684 4475 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-n5v6v\" (UniqueName: \"kubernetes.io/projected/3f477cd8-8d33-4a1f-873c-ffde6c87149e-kube-api-access-n5v6v\") on node \"crc\" DevicePath \"\"" Dec 03 07:01:06 crc kubenswrapper[4475]: I1203 07:01:06.122699 4475 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3f477cd8-8d33-4a1f-873c-ffde6c87149e-config-data\") on node \"crc\" DevicePath \"\"" Dec 03 07:01:06 crc kubenswrapper[4475]: I1203 07:01:06.199165 4475 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/keystone-97b4f9d66-l5knv" Dec 03 07:01:06 crc kubenswrapper[4475]: I1203 07:01:06.205391 4475 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/barbican-api-c97b994bb-77qkf" Dec 03 07:01:06 crc kubenswrapper[4475]: I1203 07:01:06.280055 4475 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-api-74bf7b4cc4-lpw7v"] Dec 03 07:01:06 crc kubenswrapper[4475]: I1203 07:01:06.280505 4475 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-api-74bf7b4cc4-lpw7v" podUID="c455da32-2255-4caa-8734-33f188430025" containerName="barbican-api-log" 
containerID="cri-o://0364d7b9118aa91002ebda52e2150fab9170d02391fd185f2ba25f286a490497" gracePeriod=30 Dec 03 07:01:06 crc kubenswrapper[4475]: I1203 07:01:06.280881 4475 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-api-74bf7b4cc4-lpw7v" podUID="c455da32-2255-4caa-8734-33f188430025" containerName="barbican-api" containerID="cri-o://2f9e3b1825a5d1e4ce1abb4a22ecca2925405b28ccc555a7a84ba992149762c3" gracePeriod=30 Dec 03 07:01:06 crc kubenswrapper[4475]: I1203 07:01:06.442562 4475 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/placement-5fbfc8dd66-9z9vj" Dec 03 07:01:06 crc kubenswrapper[4475]: I1203 07:01:06.448898 4475 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/placement-5fbfc8dd66-9z9vj" Dec 03 07:01:06 crc kubenswrapper[4475]: I1203 07:01:06.516820 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29412421-4bc7f" event={"ID":"3f477cd8-8d33-4a1f-873c-ffde6c87149e","Type":"ContainerDied","Data":"a255340f9cb413a299d24f243165a58025c75cb5f2ea5f1f3515e732358d8648"} Dec 03 07:01:06 crc kubenswrapper[4475]: I1203 07:01:06.516856 4475 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a255340f9cb413a299d24f243165a58025c75cb5f2ea5f1f3515e732358d8648" Dec 03 07:01:06 crc kubenswrapper[4475]: I1203 07:01:06.516934 4475 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-cron-29412421-4bc7f" Dec 03 07:01:06 crc kubenswrapper[4475]: I1203 07:01:06.527890 4475 generic.go:334] "Generic (PLEG): container finished" podID="c455da32-2255-4caa-8734-33f188430025" containerID="0364d7b9118aa91002ebda52e2150fab9170d02391fd185f2ba25f286a490497" exitCode=143 Dec 03 07:01:06 crc kubenswrapper[4475]: I1203 07:01:06.527956 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-74bf7b4cc4-lpw7v" event={"ID":"c455da32-2255-4caa-8734-33f188430025","Type":"ContainerDied","Data":"0364d7b9118aa91002ebda52e2150fab9170d02391fd185f2ba25f286a490497"} Dec 03 07:01:06 crc kubenswrapper[4475]: I1203 07:01:06.876872 4475 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/cinder-scheduler-0" Dec 03 07:01:09 crc kubenswrapper[4475]: I1203 07:01:09.478260 4475 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/barbican-api-74bf7b4cc4-lpw7v" podUID="c455da32-2255-4caa-8734-33f188430025" containerName="barbican-api-log" probeResult="failure" output="Get \"http://10.217.0.163:9311/healthcheck\": read tcp 10.217.0.2:54884->10.217.0.163:9311: read: connection reset by peer" Dec 03 07:01:09 crc kubenswrapper[4475]: I1203 07:01:09.478291 4475 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/barbican-api-74bf7b4cc4-lpw7v" podUID="c455da32-2255-4caa-8734-33f188430025" containerName="barbican-api" probeResult="failure" output="Get \"http://10.217.0.163:9311/healthcheck\": read tcp 10.217.0.2:54882->10.217.0.163:9311: read: connection reset by peer" Dec 03 07:01:09 crc kubenswrapper[4475]: I1203 07:01:09.549664 4475 generic.go:334] "Generic (PLEG): container finished" podID="c455da32-2255-4caa-8734-33f188430025" containerID="2f9e3b1825a5d1e4ce1abb4a22ecca2925405b28ccc555a7a84ba992149762c3" exitCode=0 Dec 03 07:01:09 crc kubenswrapper[4475]: I1203 07:01:09.549699 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/barbican-api-74bf7b4cc4-lpw7v" event={"ID":"c455da32-2255-4caa-8734-33f188430025","Type":"ContainerDied","Data":"2f9e3b1825a5d1e4ce1abb4a22ecca2925405b28ccc555a7a84ba992149762c3"} Dec 03 07:01:09 crc kubenswrapper[4475]: I1203 07:01:09.651513 4475 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstackclient"] Dec 03 07:01:09 crc kubenswrapper[4475]: E1203 07:01:09.652004 4475 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3f477cd8-8d33-4a1f-873c-ffde6c87149e" containerName="keystone-cron" Dec 03 07:01:09 crc kubenswrapper[4475]: I1203 07:01:09.652029 4475 state_mem.go:107] "Deleted CPUSet assignment" podUID="3f477cd8-8d33-4a1f-873c-ffde6c87149e" containerName="keystone-cron" Dec 03 07:01:09 crc kubenswrapper[4475]: I1203 07:01:09.652217 4475 memory_manager.go:354] "RemoveStaleState removing state" podUID="3f477cd8-8d33-4a1f-873c-ffde6c87149e" containerName="keystone-cron" Dec 03 07:01:09 crc kubenswrapper[4475]: I1203 07:01:09.652756 4475 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstackclient" Dec 03 07:01:09 crc kubenswrapper[4475]: I1203 07:01:09.661027 4475 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstackclient-openstackclient-dockercfg-52lrr" Dec 03 07:01:09 crc kubenswrapper[4475]: I1203 07:01:09.661206 4475 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-config-secret" Dec 03 07:01:09 crc kubenswrapper[4475]: I1203 07:01:09.661426 4475 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-config" Dec 03 07:01:09 crc kubenswrapper[4475]: I1203 07:01:09.670108 4475 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstackclient"] Dec 03 07:01:09 crc kubenswrapper[4475]: I1203 07:01:09.691750 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/ad1310ef-4548-4b61-8cdb-2a946873fb9c-openstack-config-secret\") pod \"openstackclient\" (UID: \"ad1310ef-4548-4b61-8cdb-2a946873fb9c\") " pod="openstack/openstackclient" Dec 03 07:01:09 crc kubenswrapper[4475]: I1203 07:01:09.691794 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/ad1310ef-4548-4b61-8cdb-2a946873fb9c-openstack-config\") pod \"openstackclient\" (UID: \"ad1310ef-4548-4b61-8cdb-2a946873fb9c\") " pod="openstack/openstackclient" Dec 03 07:01:09 crc kubenswrapper[4475]: I1203 07:01:09.691819 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ad1310ef-4548-4b61-8cdb-2a946873fb9c-combined-ca-bundle\") pod \"openstackclient\" (UID: \"ad1310ef-4548-4b61-8cdb-2a946873fb9c\") " pod="openstack/openstackclient" Dec 03 07:01:09 crc kubenswrapper[4475]: I1203 07:01:09.691897 4475 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l45pv\" (UniqueName: \"kubernetes.io/projected/ad1310ef-4548-4b61-8cdb-2a946873fb9c-kube-api-access-l45pv\") pod \"openstackclient\" (UID: \"ad1310ef-4548-4b61-8cdb-2a946873fb9c\") " pod="openstack/openstackclient" Dec 03 07:01:09 crc kubenswrapper[4475]: I1203 07:01:09.793290 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-l45pv\" (UniqueName: \"kubernetes.io/projected/ad1310ef-4548-4b61-8cdb-2a946873fb9c-kube-api-access-l45pv\") pod \"openstackclient\" (UID: \"ad1310ef-4548-4b61-8cdb-2a946873fb9c\") " pod="openstack/openstackclient" Dec 03 07:01:09 crc kubenswrapper[4475]: I1203 07:01:09.793374 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/ad1310ef-4548-4b61-8cdb-2a946873fb9c-openstack-config-secret\") pod \"openstackclient\" (UID: \"ad1310ef-4548-4b61-8cdb-2a946873fb9c\") " pod="openstack/openstackclient" Dec 03 07:01:09 crc kubenswrapper[4475]: I1203 07:01:09.793398 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/ad1310ef-4548-4b61-8cdb-2a946873fb9c-openstack-config\") pod \"openstackclient\" (UID: \"ad1310ef-4548-4b61-8cdb-2a946873fb9c\") " pod="openstack/openstackclient" Dec 03 07:01:09 crc kubenswrapper[4475]: I1203 07:01:09.793418 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ad1310ef-4548-4b61-8cdb-2a946873fb9c-combined-ca-bundle\") pod \"openstackclient\" (UID: \"ad1310ef-4548-4b61-8cdb-2a946873fb9c\") " pod="openstack/openstackclient" Dec 03 07:01:09 crc kubenswrapper[4475]: I1203 07:01:09.794262 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config\" (UniqueName: 
\"kubernetes.io/configmap/ad1310ef-4548-4b61-8cdb-2a946873fb9c-openstack-config\") pod \"openstackclient\" (UID: \"ad1310ef-4548-4b61-8cdb-2a946873fb9c\") " pod="openstack/openstackclient" Dec 03 07:01:09 crc kubenswrapper[4475]: I1203 07:01:09.799230 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ad1310ef-4548-4b61-8cdb-2a946873fb9c-combined-ca-bundle\") pod \"openstackclient\" (UID: \"ad1310ef-4548-4b61-8cdb-2a946873fb9c\") " pod="openstack/openstackclient" Dec 03 07:01:09 crc kubenswrapper[4475]: I1203 07:01:09.807536 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/ad1310ef-4548-4b61-8cdb-2a946873fb9c-openstack-config-secret\") pod \"openstackclient\" (UID: \"ad1310ef-4548-4b61-8cdb-2a946873fb9c\") " pod="openstack/openstackclient" Dec 03 07:01:09 crc kubenswrapper[4475]: I1203 07:01:09.815947 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-l45pv\" (UniqueName: \"kubernetes.io/projected/ad1310ef-4548-4b61-8cdb-2a946873fb9c-kube-api-access-l45pv\") pod \"openstackclient\" (UID: \"ad1310ef-4548-4b61-8cdb-2a946873fb9c\") " pod="openstack/openstackclient" Dec 03 07:01:09 crc kubenswrapper[4475]: I1203 07:01:09.824768 4475 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/openstackclient"] Dec 03 07:01:09 crc kubenswrapper[4475]: I1203 07:01:09.825055 4475 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstackclient" Dec 03 07:01:09 crc kubenswrapper[4475]: I1203 07:01:09.840162 4475 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/openstackclient"] Dec 03 07:01:09 crc kubenswrapper[4475]: I1203 07:01:09.869534 4475 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstackclient"] Dec 03 07:01:09 crc kubenswrapper[4475]: I1203 07:01:09.870595 4475 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstackclient" Dec 03 07:01:09 crc kubenswrapper[4475]: I1203 07:01:09.888486 4475 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstackclient"] Dec 03 07:01:09 crc kubenswrapper[4475]: I1203 07:01:09.899925 4475 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-74bf7b4cc4-lpw7v" Dec 03 07:01:09 crc kubenswrapper[4475]: I1203 07:01:09.900389 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/bc0a891f-83ae-4592-ac88-d4bc4359b4d9-openstack-config-secret\") pod \"openstackclient\" (UID: \"bc0a891f-83ae-4592-ac88-d4bc4359b4d9\") " pod="openstack/openstackclient" Dec 03 07:01:09 crc kubenswrapper[4475]: I1203 07:01:09.900464 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bc0a891f-83ae-4592-ac88-d4bc4359b4d9-combined-ca-bundle\") pod \"openstackclient\" (UID: \"bc0a891f-83ae-4592-ac88-d4bc4359b4d9\") " pod="openstack/openstackclient" Dec 03 07:01:09 crc kubenswrapper[4475]: I1203 07:01:09.900521 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/bc0a891f-83ae-4592-ac88-d4bc4359b4d9-openstack-config\") pod \"openstackclient\" (UID: 
\"bc0a891f-83ae-4592-ac88-d4bc4359b4d9\") " pod="openstack/openstackclient" Dec 03 07:01:09 crc kubenswrapper[4475]: I1203 07:01:09.900557 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cl8gw\" (UniqueName: \"kubernetes.io/projected/bc0a891f-83ae-4592-ac88-d4bc4359b4d9-kube-api-access-cl8gw\") pod \"openstackclient\" (UID: \"bc0a891f-83ae-4592-ac88-d4bc4359b4d9\") " pod="openstack/openstackclient" Dec 03 07:01:09 crc kubenswrapper[4475]: E1203 07:01:09.985167 4475 log.go:32] "RunPodSandbox from runtime service failed" err=< Dec 03 07:01:09 crc kubenswrapper[4475]: rpc error: code = Unknown desc = failed to create pod network sandbox k8s_openstackclient_openstack_ad1310ef-4548-4b61-8cdb-2a946873fb9c_0(a574a6b636cc563965d8189001f404341fd1affa1ffd50dfce408eb7c482e3dd): error adding pod openstack_openstackclient to CNI network "multus-cni-network": plugin type="multus-shim" name="multus-cni-network" failed (add): CmdAdd (shim): CNI request failed with status 400: 'ContainerID:"a574a6b636cc563965d8189001f404341fd1affa1ffd50dfce408eb7c482e3dd" Netns:"/var/run/netns/5cc4a51e-801b-4ee8-89aa-ed40ef473c6c" IfName:"eth0" Args:"IgnoreUnknown=1;K8S_POD_NAMESPACE=openstack;K8S_POD_NAME=openstackclient;K8S_POD_INFRA_CONTAINER_ID=a574a6b636cc563965d8189001f404341fd1affa1ffd50dfce408eb7c482e3dd;K8S_POD_UID=ad1310ef-4548-4b61-8cdb-2a946873fb9c" Path:"" ERRORED: error configuring pod [openstack/openstackclient] networking: Multus: [openstack/openstackclient/ad1310ef-4548-4b61-8cdb-2a946873fb9c]: expected pod UID "ad1310ef-4548-4b61-8cdb-2a946873fb9c" but got "bc0a891f-83ae-4592-ac88-d4bc4359b4d9" from Kube API Dec 03 07:01:09 crc kubenswrapper[4475]: ': StdinData: 
{"binDir":"/var/lib/cni/bin","clusterNetwork":"/host/run/multus/cni/net.d/10-ovn-kubernetes.conf","cniVersion":"0.3.1","daemonSocketDir":"/run/multus/socket","globalNamespaces":"default,openshift-multus,openshift-sriov-network-operator,openshift-cnv","logLevel":"verbose","logToStderr":true,"name":"multus-cni-network","namespaceIsolation":true,"type":"multus-shim"} Dec 03 07:01:09 crc kubenswrapper[4475]: > Dec 03 07:01:09 crc kubenswrapper[4475]: E1203 07:01:09.985244 4475 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err=< Dec 03 07:01:09 crc kubenswrapper[4475]: rpc error: code = Unknown desc = failed to create pod network sandbox k8s_openstackclient_openstack_ad1310ef-4548-4b61-8cdb-2a946873fb9c_0(a574a6b636cc563965d8189001f404341fd1affa1ffd50dfce408eb7c482e3dd): error adding pod openstack_openstackclient to CNI network "multus-cni-network": plugin type="multus-shim" name="multus-cni-network" failed (add): CmdAdd (shim): CNI request failed with status 400: 'ContainerID:"a574a6b636cc563965d8189001f404341fd1affa1ffd50dfce408eb7c482e3dd" Netns:"/var/run/netns/5cc4a51e-801b-4ee8-89aa-ed40ef473c6c" IfName:"eth0" Args:"IgnoreUnknown=1;K8S_POD_NAMESPACE=openstack;K8S_POD_NAME=openstackclient;K8S_POD_INFRA_CONTAINER_ID=a574a6b636cc563965d8189001f404341fd1affa1ffd50dfce408eb7c482e3dd;K8S_POD_UID=ad1310ef-4548-4b61-8cdb-2a946873fb9c" Path:"" ERRORED: error configuring pod [openstack/openstackclient] networking: Multus: [openstack/openstackclient/ad1310ef-4548-4b61-8cdb-2a946873fb9c]: expected pod UID "ad1310ef-4548-4b61-8cdb-2a946873fb9c" but got "bc0a891f-83ae-4592-ac88-d4bc4359b4d9" from Kube API Dec 03 07:01:09 crc kubenswrapper[4475]: ': StdinData: 
{"binDir":"/var/lib/cni/bin","clusterNetwork":"/host/run/multus/cni/net.d/10-ovn-kubernetes.conf","cniVersion":"0.3.1","daemonSocketDir":"/run/multus/socket","globalNamespaces":"default,openshift-multus,openshift-sriov-network-operator,openshift-cnv","logLevel":"verbose","logToStderr":true,"name":"multus-cni-network","namespaceIsolation":true,"type":"multus-shim"} Dec 03 07:01:09 crc kubenswrapper[4475]: > pod="openstack/openstackclient" Dec 03 07:01:10 crc kubenswrapper[4475]: I1203 07:01:10.003375 4475 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c455da32-2255-4caa-8734-33f188430025-config-data\") pod \"c455da32-2255-4caa-8734-33f188430025\" (UID: \"c455da32-2255-4caa-8734-33f188430025\") " Dec 03 07:01:10 crc kubenswrapper[4475]: I1203 07:01:10.003500 4475 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c455da32-2255-4caa-8734-33f188430025-combined-ca-bundle\") pod \"c455da32-2255-4caa-8734-33f188430025\" (UID: \"c455da32-2255-4caa-8734-33f188430025\") " Dec 03 07:01:10 crc kubenswrapper[4475]: I1203 07:01:10.003538 4475 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mvwv6\" (UniqueName: \"kubernetes.io/projected/c455da32-2255-4caa-8734-33f188430025-kube-api-access-mvwv6\") pod \"c455da32-2255-4caa-8734-33f188430025\" (UID: \"c455da32-2255-4caa-8734-33f188430025\") " Dec 03 07:01:10 crc kubenswrapper[4475]: I1203 07:01:10.003616 4475 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/c455da32-2255-4caa-8734-33f188430025-config-data-custom\") pod \"c455da32-2255-4caa-8734-33f188430025\" (UID: \"c455da32-2255-4caa-8734-33f188430025\") " Dec 03 07:01:10 crc kubenswrapper[4475]: I1203 07:01:10.003696 4475 reconciler_common.go:159] "operationExecutor.UnmountVolume 
started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c455da32-2255-4caa-8734-33f188430025-logs\") pod \"c455da32-2255-4caa-8734-33f188430025\" (UID: \"c455da32-2255-4caa-8734-33f188430025\") " Dec 03 07:01:10 crc kubenswrapper[4475]: I1203 07:01:10.004024 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/bc0a891f-83ae-4592-ac88-d4bc4359b4d9-openstack-config-secret\") pod \"openstackclient\" (UID: \"bc0a891f-83ae-4592-ac88-d4bc4359b4d9\") " pod="openstack/openstackclient" Dec 03 07:01:10 crc kubenswrapper[4475]: I1203 07:01:10.004054 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bc0a891f-83ae-4592-ac88-d4bc4359b4d9-combined-ca-bundle\") pod \"openstackclient\" (UID: \"bc0a891f-83ae-4592-ac88-d4bc4359b4d9\") " pod="openstack/openstackclient" Dec 03 07:01:10 crc kubenswrapper[4475]: I1203 07:01:10.004096 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/bc0a891f-83ae-4592-ac88-d4bc4359b4d9-openstack-config\") pod \"openstackclient\" (UID: \"bc0a891f-83ae-4592-ac88-d4bc4359b4d9\") " pod="openstack/openstackclient" Dec 03 07:01:10 crc kubenswrapper[4475]: I1203 07:01:10.004121 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cl8gw\" (UniqueName: \"kubernetes.io/projected/bc0a891f-83ae-4592-ac88-d4bc4359b4d9-kube-api-access-cl8gw\") pod \"openstackclient\" (UID: \"bc0a891f-83ae-4592-ac88-d4bc4359b4d9\") " pod="openstack/openstackclient" Dec 03 07:01:10 crc kubenswrapper[4475]: I1203 07:01:10.006193 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/bc0a891f-83ae-4592-ac88-d4bc4359b4d9-openstack-config\") pod \"openstackclient\" (UID: 
\"bc0a891f-83ae-4592-ac88-d4bc4359b4d9\") " pod="openstack/openstackclient" Dec 03 07:01:10 crc kubenswrapper[4475]: I1203 07:01:10.006836 4475 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c455da32-2255-4caa-8734-33f188430025-logs" (OuterVolumeSpecName: "logs") pod "c455da32-2255-4caa-8734-33f188430025" (UID: "c455da32-2255-4caa-8734-33f188430025"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 07:01:10 crc kubenswrapper[4475]: I1203 07:01:10.009245 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/bc0a891f-83ae-4592-ac88-d4bc4359b4d9-openstack-config-secret\") pod \"openstackclient\" (UID: \"bc0a891f-83ae-4592-ac88-d4bc4359b4d9\") " pod="openstack/openstackclient" Dec 03 07:01:10 crc kubenswrapper[4475]: I1203 07:01:10.010989 4475 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c455da32-2255-4caa-8734-33f188430025-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "c455da32-2255-4caa-8734-33f188430025" (UID: "c455da32-2255-4caa-8734-33f188430025"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 07:01:10 crc kubenswrapper[4475]: I1203 07:01:10.011129 4475 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c455da32-2255-4caa-8734-33f188430025-kube-api-access-mvwv6" (OuterVolumeSpecName: "kube-api-access-mvwv6") pod "c455da32-2255-4caa-8734-33f188430025" (UID: "c455da32-2255-4caa-8734-33f188430025"). InnerVolumeSpecName "kube-api-access-mvwv6". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 07:01:10 crc kubenswrapper[4475]: I1203 07:01:10.012809 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bc0a891f-83ae-4592-ac88-d4bc4359b4d9-combined-ca-bundle\") pod \"openstackclient\" (UID: \"bc0a891f-83ae-4592-ac88-d4bc4359b4d9\") " pod="openstack/openstackclient" Dec 03 07:01:10 crc kubenswrapper[4475]: I1203 07:01:10.023980 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cl8gw\" (UniqueName: \"kubernetes.io/projected/bc0a891f-83ae-4592-ac88-d4bc4359b4d9-kube-api-access-cl8gw\") pod \"openstackclient\" (UID: \"bc0a891f-83ae-4592-ac88-d4bc4359b4d9\") " pod="openstack/openstackclient" Dec 03 07:01:10 crc kubenswrapper[4475]: I1203 07:01:10.035234 4475 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c455da32-2255-4caa-8734-33f188430025-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "c455da32-2255-4caa-8734-33f188430025" (UID: "c455da32-2255-4caa-8734-33f188430025"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 07:01:10 crc kubenswrapper[4475]: I1203 07:01:10.059148 4475 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c455da32-2255-4caa-8734-33f188430025-config-data" (OuterVolumeSpecName: "config-data") pod "c455da32-2255-4caa-8734-33f188430025" (UID: "c455da32-2255-4caa-8734-33f188430025"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 07:01:10 crc kubenswrapper[4475]: I1203 07:01:10.106115 4475 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c455da32-2255-4caa-8734-33f188430025-config-data\") on node \"crc\" DevicePath \"\"" Dec 03 07:01:10 crc kubenswrapper[4475]: I1203 07:01:10.106145 4475 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c455da32-2255-4caa-8734-33f188430025-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 03 07:01:10 crc kubenswrapper[4475]: I1203 07:01:10.106158 4475 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mvwv6\" (UniqueName: \"kubernetes.io/projected/c455da32-2255-4caa-8734-33f188430025-kube-api-access-mvwv6\") on node \"crc\" DevicePath \"\"" Dec 03 07:01:10 crc kubenswrapper[4475]: I1203 07:01:10.106166 4475 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/c455da32-2255-4caa-8734-33f188430025-config-data-custom\") on node \"crc\" DevicePath \"\"" Dec 03 07:01:10 crc kubenswrapper[4475]: I1203 07:01:10.106174 4475 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c455da32-2255-4caa-8734-33f188430025-logs\") on node \"crc\" DevicePath \"\"" Dec 03 07:01:10 crc kubenswrapper[4475]: I1203 07:01:10.219691 4475 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstackclient" Dec 03 07:01:10 crc kubenswrapper[4475]: I1203 07:01:10.540561 4475 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/heat-engine-74fdff459b-tj7xb"] Dec 03 07:01:10 crc kubenswrapper[4475]: E1203 07:01:10.540877 4475 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c455da32-2255-4caa-8734-33f188430025" containerName="barbican-api" Dec 03 07:01:10 crc kubenswrapper[4475]: I1203 07:01:10.540890 4475 state_mem.go:107] "Deleted CPUSet assignment" podUID="c455da32-2255-4caa-8734-33f188430025" containerName="barbican-api" Dec 03 07:01:10 crc kubenswrapper[4475]: E1203 07:01:10.540901 4475 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c455da32-2255-4caa-8734-33f188430025" containerName="barbican-api-log" Dec 03 07:01:10 crc kubenswrapper[4475]: I1203 07:01:10.540907 4475 state_mem.go:107] "Deleted CPUSet assignment" podUID="c455da32-2255-4caa-8734-33f188430025" containerName="barbican-api-log" Dec 03 07:01:10 crc kubenswrapper[4475]: I1203 07:01:10.541074 4475 memory_manager.go:354] "RemoveStaleState removing state" podUID="c455da32-2255-4caa-8734-33f188430025" containerName="barbican-api" Dec 03 07:01:10 crc kubenswrapper[4475]: I1203 07:01:10.541092 4475 memory_manager.go:354] "RemoveStaleState removing state" podUID="c455da32-2255-4caa-8734-33f188430025" containerName="barbican-api-log" Dec 03 07:01:10 crc kubenswrapper[4475]: I1203 07:01:10.541607 4475 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/heat-engine-74fdff459b-tj7xb" Dec 03 07:01:10 crc kubenswrapper[4475]: I1203 07:01:10.543781 4475 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"heat-heat-dockercfg-ld5mr" Dec 03 07:01:10 crc kubenswrapper[4475]: I1203 07:01:10.543991 4475 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"heat-engine-config-data" Dec 03 07:01:10 crc kubenswrapper[4475]: I1203 07:01:10.544210 4475 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"heat-config-data" Dec 03 07:01:10 crc kubenswrapper[4475]: I1203 07:01:10.559610 4475 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-engine-74fdff459b-tj7xb"] Dec 03 07:01:10 crc kubenswrapper[4475]: I1203 07:01:10.586798 4475 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/heat-cfnapi-58d4897ff7-tjzk8"] Dec 03 07:01:10 crc kubenswrapper[4475]: I1203 07:01:10.588004 4475 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/heat-cfnapi-58d4897ff7-tjzk8" Dec 03 07:01:10 crc kubenswrapper[4475]: I1203 07:01:10.593655 4475 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"heat-cfnapi-config-data" Dec 03 07:01:10 crc kubenswrapper[4475]: I1203 07:01:10.595969 4475 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstackclient" Dec 03 07:01:10 crc kubenswrapper[4475]: I1203 07:01:10.596093 4475 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-api-74bf7b4cc4-lpw7v" Dec 03 07:01:10 crc kubenswrapper[4475]: I1203 07:01:10.597198 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-74bf7b4cc4-lpw7v" event={"ID":"c455da32-2255-4caa-8734-33f188430025","Type":"ContainerDied","Data":"2143e919960089e8af4dbf03aab11a09ee9e49002439708ae9c9c6f106a452f7"} Dec 03 07:01:10 crc kubenswrapper[4475]: I1203 07:01:10.597733 4475 scope.go:117] "RemoveContainer" containerID="2f9e3b1825a5d1e4ce1abb4a22ecca2925405b28ccc555a7a84ba992149762c3" Dec 03 07:01:10 crc kubenswrapper[4475]: I1203 07:01:10.616329 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/47298c15-f6d2-463f-a97b-1b4d63999b81-config-data\") pod \"heat-engine-74fdff459b-tj7xb\" (UID: \"47298c15-f6d2-463f-a97b-1b4d63999b81\") " pod="openstack/heat-engine-74fdff459b-tj7xb" Dec 03 07:01:10 crc kubenswrapper[4475]: I1203 07:01:10.616387 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/47298c15-f6d2-463f-a97b-1b4d63999b81-config-data-custom\") pod \"heat-engine-74fdff459b-tj7xb\" (UID: \"47298c15-f6d2-463f-a97b-1b4d63999b81\") " pod="openstack/heat-engine-74fdff459b-tj7xb" Dec 03 07:01:10 crc kubenswrapper[4475]: I1203 07:01:10.616437 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/6b5cf0ec-4d62-4de9-ad66-f758113443e7-config-data-custom\") pod \"heat-cfnapi-58d4897ff7-tjzk8\" (UID: \"6b5cf0ec-4d62-4de9-ad66-f758113443e7\") " pod="openstack/heat-cfnapi-58d4897ff7-tjzk8" Dec 03 07:01:10 crc kubenswrapper[4475]: I1203 07:01:10.616479 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/6b5cf0ec-4d62-4de9-ad66-f758113443e7-config-data\") pod \"heat-cfnapi-58d4897ff7-tjzk8\" (UID: \"6b5cf0ec-4d62-4de9-ad66-f758113443e7\") " pod="openstack/heat-cfnapi-58d4897ff7-tjzk8" Dec 03 07:01:10 crc kubenswrapper[4475]: I1203 07:01:10.616580 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j5wdr\" (UniqueName: \"kubernetes.io/projected/6b5cf0ec-4d62-4de9-ad66-f758113443e7-kube-api-access-j5wdr\") pod \"heat-cfnapi-58d4897ff7-tjzk8\" (UID: \"6b5cf0ec-4d62-4de9-ad66-f758113443e7\") " pod="openstack/heat-cfnapi-58d4897ff7-tjzk8" Dec 03 07:01:10 crc kubenswrapper[4475]: I1203 07:01:10.616600 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9s97h\" (UniqueName: \"kubernetes.io/projected/47298c15-f6d2-463f-a97b-1b4d63999b81-kube-api-access-9s97h\") pod \"heat-engine-74fdff459b-tj7xb\" (UID: \"47298c15-f6d2-463f-a97b-1b4d63999b81\") " pod="openstack/heat-engine-74fdff459b-tj7xb" Dec 03 07:01:10 crc kubenswrapper[4475]: I1203 07:01:10.616651 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6b5cf0ec-4d62-4de9-ad66-f758113443e7-combined-ca-bundle\") pod \"heat-cfnapi-58d4897ff7-tjzk8\" (UID: \"6b5cf0ec-4d62-4de9-ad66-f758113443e7\") " pod="openstack/heat-cfnapi-58d4897ff7-tjzk8" Dec 03 07:01:10 crc kubenswrapper[4475]: I1203 07:01:10.616771 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/47298c15-f6d2-463f-a97b-1b4d63999b81-combined-ca-bundle\") pod \"heat-engine-74fdff459b-tj7xb\" (UID: \"47298c15-f6d2-463f-a97b-1b4d63999b81\") " pod="openstack/heat-engine-74fdff459b-tj7xb" Dec 03 07:01:10 crc kubenswrapper[4475]: I1203 07:01:10.624222 4475 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstackclient" Dec 03 07:01:10 crc kubenswrapper[4475]: I1203 07:01:10.637949 4475 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-cfnapi-58d4897ff7-tjzk8"] Dec 03 07:01:10 crc kubenswrapper[4475]: I1203 07:01:10.659279 4475 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstackclient"] Dec 03 07:01:10 crc kubenswrapper[4475]: I1203 07:01:10.662540 4475 scope.go:117] "RemoveContainer" containerID="0364d7b9118aa91002ebda52e2150fab9170d02391fd185f2ba25f286a490497" Dec 03 07:01:10 crc kubenswrapper[4475]: I1203 07:01:10.665223 4475 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openstack/openstackclient" oldPodUID="ad1310ef-4548-4b61-8cdb-2a946873fb9c" podUID="bc0a891f-83ae-4592-ac88-d4bc4359b4d9" Dec 03 07:01:10 crc kubenswrapper[4475]: I1203 07:01:10.706516 4475 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-664766cd5c-v6774"] Dec 03 07:01:10 crc kubenswrapper[4475]: I1203 07:01:10.717578 4475 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-l45pv\" (UniqueName: \"kubernetes.io/projected/ad1310ef-4548-4b61-8cdb-2a946873fb9c-kube-api-access-l45pv\") pod \"ad1310ef-4548-4b61-8cdb-2a946873fb9c\" (UID: \"ad1310ef-4548-4b61-8cdb-2a946873fb9c\") " Dec 03 07:01:10 crc kubenswrapper[4475]: I1203 07:01:10.717660 4475 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/ad1310ef-4548-4b61-8cdb-2a946873fb9c-openstack-config\") pod \"ad1310ef-4548-4b61-8cdb-2a946873fb9c\" (UID: \"ad1310ef-4548-4b61-8cdb-2a946873fb9c\") " Dec 03 07:01:10 crc kubenswrapper[4475]: I1203 07:01:10.717735 4475 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/ad1310ef-4548-4b61-8cdb-2a946873fb9c-openstack-config-secret\") pod 
\"ad1310ef-4548-4b61-8cdb-2a946873fb9c\" (UID: \"ad1310ef-4548-4b61-8cdb-2a946873fb9c\") " Dec 03 07:01:10 crc kubenswrapper[4475]: I1203 07:01:10.717822 4475 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ad1310ef-4548-4b61-8cdb-2a946873fb9c-combined-ca-bundle\") pod \"ad1310ef-4548-4b61-8cdb-2a946873fb9c\" (UID: \"ad1310ef-4548-4b61-8cdb-2a946873fb9c\") " Dec 03 07:01:10 crc kubenswrapper[4475]: I1203 07:01:10.718088 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/47298c15-f6d2-463f-a97b-1b4d63999b81-combined-ca-bundle\") pod \"heat-engine-74fdff459b-tj7xb\" (UID: \"47298c15-f6d2-463f-a97b-1b4d63999b81\") " pod="openstack/heat-engine-74fdff459b-tj7xb" Dec 03 07:01:10 crc kubenswrapper[4475]: I1203 07:01:10.718130 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/47298c15-f6d2-463f-a97b-1b4d63999b81-config-data\") pod \"heat-engine-74fdff459b-tj7xb\" (UID: \"47298c15-f6d2-463f-a97b-1b4d63999b81\") " pod="openstack/heat-engine-74fdff459b-tj7xb" Dec 03 07:01:10 crc kubenswrapper[4475]: I1203 07:01:10.718149 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/47298c15-f6d2-463f-a97b-1b4d63999b81-config-data-custom\") pod \"heat-engine-74fdff459b-tj7xb\" (UID: \"47298c15-f6d2-463f-a97b-1b4d63999b81\") " pod="openstack/heat-engine-74fdff459b-tj7xb" Dec 03 07:01:10 crc kubenswrapper[4475]: I1203 07:01:10.718195 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/6b5cf0ec-4d62-4de9-ad66-f758113443e7-config-data-custom\") pod \"heat-cfnapi-58d4897ff7-tjzk8\" (UID: \"6b5cf0ec-4d62-4de9-ad66-f758113443e7\") " 
pod="openstack/heat-cfnapi-58d4897ff7-tjzk8" Dec 03 07:01:10 crc kubenswrapper[4475]: I1203 07:01:10.718218 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6b5cf0ec-4d62-4de9-ad66-f758113443e7-config-data\") pod \"heat-cfnapi-58d4897ff7-tjzk8\" (UID: \"6b5cf0ec-4d62-4de9-ad66-f758113443e7\") " pod="openstack/heat-cfnapi-58d4897ff7-tjzk8" Dec 03 07:01:10 crc kubenswrapper[4475]: I1203 07:01:10.718290 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-j5wdr\" (UniqueName: \"kubernetes.io/projected/6b5cf0ec-4d62-4de9-ad66-f758113443e7-kube-api-access-j5wdr\") pod \"heat-cfnapi-58d4897ff7-tjzk8\" (UID: \"6b5cf0ec-4d62-4de9-ad66-f758113443e7\") " pod="openstack/heat-cfnapi-58d4897ff7-tjzk8" Dec 03 07:01:10 crc kubenswrapper[4475]: I1203 07:01:10.718310 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9s97h\" (UniqueName: \"kubernetes.io/projected/47298c15-f6d2-463f-a97b-1b4d63999b81-kube-api-access-9s97h\") pod \"heat-engine-74fdff459b-tj7xb\" (UID: \"47298c15-f6d2-463f-a97b-1b4d63999b81\") " pod="openstack/heat-engine-74fdff459b-tj7xb" Dec 03 07:01:10 crc kubenswrapper[4475]: I1203 07:01:10.718349 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6b5cf0ec-4d62-4de9-ad66-f758113443e7-combined-ca-bundle\") pod \"heat-cfnapi-58d4897ff7-tjzk8\" (UID: \"6b5cf0ec-4d62-4de9-ad66-f758113443e7\") " pod="openstack/heat-cfnapi-58d4897ff7-tjzk8" Dec 03 07:01:10 crc kubenswrapper[4475]: I1203 07:01:10.724773 4475 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-664766cd5c-v6774" Dec 03 07:01:10 crc kubenswrapper[4475]: I1203 07:01:10.739964 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/47298c15-f6d2-463f-a97b-1b4d63999b81-config-data\") pod \"heat-engine-74fdff459b-tj7xb\" (UID: \"47298c15-f6d2-463f-a97b-1b4d63999b81\") " pod="openstack/heat-engine-74fdff459b-tj7xb" Dec 03 07:01:10 crc kubenswrapper[4475]: I1203 07:01:10.740617 4475 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ad1310ef-4548-4b61-8cdb-2a946873fb9c-openstack-config" (OuterVolumeSpecName: "openstack-config") pod "ad1310ef-4548-4b61-8cdb-2a946873fb9c" (UID: "ad1310ef-4548-4b61-8cdb-2a946873fb9c"). InnerVolumeSpecName "openstack-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 07:01:10 crc kubenswrapper[4475]: I1203 07:01:10.745510 4475 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-api-74bf7b4cc4-lpw7v"] Dec 03 07:01:10 crc kubenswrapper[4475]: I1203 07:01:10.752414 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6b5cf0ec-4d62-4de9-ad66-f758113443e7-config-data\") pod \"heat-cfnapi-58d4897ff7-tjzk8\" (UID: \"6b5cf0ec-4d62-4de9-ad66-f758113443e7\") " pod="openstack/heat-cfnapi-58d4897ff7-tjzk8" Dec 03 07:01:10 crc kubenswrapper[4475]: I1203 07:01:10.755267 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/6b5cf0ec-4d62-4de9-ad66-f758113443e7-config-data-custom\") pod \"heat-cfnapi-58d4897ff7-tjzk8\" (UID: \"6b5cf0ec-4d62-4de9-ad66-f758113443e7\") " pod="openstack/heat-cfnapi-58d4897ff7-tjzk8" Dec 03 07:01:10 crc kubenswrapper[4475]: I1203 07:01:10.760558 4475 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/secret/ad1310ef-4548-4b61-8cdb-2a946873fb9c-openstack-config-secret" (OuterVolumeSpecName: "openstack-config-secret") pod "ad1310ef-4548-4b61-8cdb-2a946873fb9c" (UID: "ad1310ef-4548-4b61-8cdb-2a946873fb9c"). InnerVolumeSpecName "openstack-config-secret". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 07:01:10 crc kubenswrapper[4475]: I1203 07:01:10.760655 4475 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ad1310ef-4548-4b61-8cdb-2a946873fb9c-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "ad1310ef-4548-4b61-8cdb-2a946873fb9c" (UID: "ad1310ef-4548-4b61-8cdb-2a946873fb9c"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 07:01:10 crc kubenswrapper[4475]: I1203 07:01:10.766050 4475 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ad1310ef-4548-4b61-8cdb-2a946873fb9c-kube-api-access-l45pv" (OuterVolumeSpecName: "kube-api-access-l45pv") pod "ad1310ef-4548-4b61-8cdb-2a946873fb9c" (UID: "ad1310ef-4548-4b61-8cdb-2a946873fb9c"). InnerVolumeSpecName "kube-api-access-l45pv". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 07:01:10 crc kubenswrapper[4475]: I1203 07:01:10.766608 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6b5cf0ec-4d62-4de9-ad66-f758113443e7-combined-ca-bundle\") pod \"heat-cfnapi-58d4897ff7-tjzk8\" (UID: \"6b5cf0ec-4d62-4de9-ad66-f758113443e7\") " pod="openstack/heat-cfnapi-58d4897ff7-tjzk8" Dec 03 07:01:10 crc kubenswrapper[4475]: I1203 07:01:10.766805 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/47298c15-f6d2-463f-a97b-1b4d63999b81-config-data-custom\") pod \"heat-engine-74fdff459b-tj7xb\" (UID: \"47298c15-f6d2-463f-a97b-1b4d63999b81\") " pod="openstack/heat-engine-74fdff459b-tj7xb" Dec 03 07:01:10 crc kubenswrapper[4475]: I1203 07:01:10.767187 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/47298c15-f6d2-463f-a97b-1b4d63999b81-combined-ca-bundle\") pod \"heat-engine-74fdff459b-tj7xb\" (UID: \"47298c15-f6d2-463f-a97b-1b4d63999b81\") " pod="openstack/heat-engine-74fdff459b-tj7xb" Dec 03 07:01:10 crc kubenswrapper[4475]: I1203 07:01:10.774488 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9s97h\" (UniqueName: \"kubernetes.io/projected/47298c15-f6d2-463f-a97b-1b4d63999b81-kube-api-access-9s97h\") pod \"heat-engine-74fdff459b-tj7xb\" (UID: \"47298c15-f6d2-463f-a97b-1b4d63999b81\") " pod="openstack/heat-engine-74fdff459b-tj7xb" Dec 03 07:01:10 crc kubenswrapper[4475]: I1203 07:01:10.790703 4475 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-api-74bf7b4cc4-lpw7v"] Dec 03 07:01:10 crc kubenswrapper[4475]: I1203 07:01:10.805734 4475 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-664766cd5c-v6774"] Dec 03 07:01:10 crc kubenswrapper[4475]: I1203 
07:01:10.811685 4475 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/heat-api-6c87685856-7spht"] Dec 03 07:01:10 crc kubenswrapper[4475]: I1203 07:01:10.812945 4475 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/heat-api-6c87685856-7spht" Dec 03 07:01:10 crc kubenswrapper[4475]: I1203 07:01:10.823641 4475 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-api-6c87685856-7spht"] Dec 03 07:01:10 crc kubenswrapper[4475]: I1203 07:01:10.824586 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/f5472efc-49e7-4b70-9084-da3a969617ab-ovsdbserver-nb\") pod \"dnsmasq-dns-664766cd5c-v6774\" (UID: \"f5472efc-49e7-4b70-9084-da3a969617ab\") " pod="openstack/dnsmasq-dns-664766cd5c-v6774" Dec 03 07:01:10 crc kubenswrapper[4475]: I1203 07:01:10.824696 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hkjhh\" (UniqueName: \"kubernetes.io/projected/f5472efc-49e7-4b70-9084-da3a969617ab-kube-api-access-hkjhh\") pod \"dnsmasq-dns-664766cd5c-v6774\" (UID: \"f5472efc-49e7-4b70-9084-da3a969617ab\") " pod="openstack/dnsmasq-dns-664766cd5c-v6774" Dec 03 07:01:10 crc kubenswrapper[4475]: I1203 07:01:10.825132 4475 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"heat-api-config-data" Dec 03 07:01:10 crc kubenswrapper[4475]: I1203 07:01:10.825684 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/f5472efc-49e7-4b70-9084-da3a969617ab-dns-swift-storage-0\") pod \"dnsmasq-dns-664766cd5c-v6774\" (UID: \"f5472efc-49e7-4b70-9084-da3a969617ab\") " pod="openstack/dnsmasq-dns-664766cd5c-v6774" Dec 03 07:01:10 crc kubenswrapper[4475]: I1203 07:01:10.825724 4475 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/f5472efc-49e7-4b70-9084-da3a969617ab-dns-svc\") pod \"dnsmasq-dns-664766cd5c-v6774\" (UID: \"f5472efc-49e7-4b70-9084-da3a969617ab\") " pod="openstack/dnsmasq-dns-664766cd5c-v6774" Dec 03 07:01:10 crc kubenswrapper[4475]: I1203 07:01:10.825766 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/f5472efc-49e7-4b70-9084-da3a969617ab-ovsdbserver-sb\") pod \"dnsmasq-dns-664766cd5c-v6774\" (UID: \"f5472efc-49e7-4b70-9084-da3a969617ab\") " pod="openstack/dnsmasq-dns-664766cd5c-v6774" Dec 03 07:01:10 crc kubenswrapper[4475]: I1203 07:01:10.825845 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f5472efc-49e7-4b70-9084-da3a969617ab-config\") pod \"dnsmasq-dns-664766cd5c-v6774\" (UID: \"f5472efc-49e7-4b70-9084-da3a969617ab\") " pod="openstack/dnsmasq-dns-664766cd5c-v6774" Dec 03 07:01:10 crc kubenswrapper[4475]: I1203 07:01:10.825905 4475 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-l45pv\" (UniqueName: \"kubernetes.io/projected/ad1310ef-4548-4b61-8cdb-2a946873fb9c-kube-api-access-l45pv\") on node \"crc\" DevicePath \"\"" Dec 03 07:01:10 crc kubenswrapper[4475]: I1203 07:01:10.825919 4475 reconciler_common.go:293] "Volume detached for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/ad1310ef-4548-4b61-8cdb-2a946873fb9c-openstack-config\") on node \"crc\" DevicePath \"\"" Dec 03 07:01:10 crc kubenswrapper[4475]: I1203 07:01:10.825928 4475 reconciler_common.go:293] "Volume detached for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/ad1310ef-4548-4b61-8cdb-2a946873fb9c-openstack-config-secret\") on node \"crc\" DevicePath \"\"" Dec 03 07:01:10 crc kubenswrapper[4475]: I1203 07:01:10.825937 
4475 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ad1310ef-4548-4b61-8cdb-2a946873fb9c-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 03 07:01:10 crc kubenswrapper[4475]: I1203 07:01:10.840377 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-j5wdr\" (UniqueName: \"kubernetes.io/projected/6b5cf0ec-4d62-4de9-ad66-f758113443e7-kube-api-access-j5wdr\") pod \"heat-cfnapi-58d4897ff7-tjzk8\" (UID: \"6b5cf0ec-4d62-4de9-ad66-f758113443e7\") " pod="openstack/heat-cfnapi-58d4897ff7-tjzk8" Dec 03 07:01:10 crc kubenswrapper[4475]: I1203 07:01:10.873529 4475 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/heat-engine-74fdff459b-tj7xb" Dec 03 07:01:10 crc kubenswrapper[4475]: I1203 07:01:10.922527 4475 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/heat-cfnapi-58d4897ff7-tjzk8" Dec 03 07:01:10 crc kubenswrapper[4475]: I1203 07:01:10.927640 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c61afd1c-48a7-47f3-b27d-6c6b56e4ce86-config-data\") pod \"heat-api-6c87685856-7spht\" (UID: \"c61afd1c-48a7-47f3-b27d-6c6b56e4ce86\") " pod="openstack/heat-api-6c87685856-7spht" Dec 03 07:01:10 crc kubenswrapper[4475]: I1203 07:01:10.927717 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/f5472efc-49e7-4b70-9084-da3a969617ab-dns-swift-storage-0\") pod \"dnsmasq-dns-664766cd5c-v6774\" (UID: \"f5472efc-49e7-4b70-9084-da3a969617ab\") " pod="openstack/dnsmasq-dns-664766cd5c-v6774" Dec 03 07:01:10 crc kubenswrapper[4475]: I1203 07:01:10.927743 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: 
\"kubernetes.io/configmap/f5472efc-49e7-4b70-9084-da3a969617ab-dns-svc\") pod \"dnsmasq-dns-664766cd5c-v6774\" (UID: \"f5472efc-49e7-4b70-9084-da3a969617ab\") " pod="openstack/dnsmasq-dns-664766cd5c-v6774" Dec 03 07:01:10 crc kubenswrapper[4475]: I1203 07:01:10.927768 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/f5472efc-49e7-4b70-9084-da3a969617ab-ovsdbserver-sb\") pod \"dnsmasq-dns-664766cd5c-v6774\" (UID: \"f5472efc-49e7-4b70-9084-da3a969617ab\") " pod="openstack/dnsmasq-dns-664766cd5c-v6774" Dec 03 07:01:10 crc kubenswrapper[4475]: I1203 07:01:10.927809 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f5472efc-49e7-4b70-9084-da3a969617ab-config\") pod \"dnsmasq-dns-664766cd5c-v6774\" (UID: \"f5472efc-49e7-4b70-9084-da3a969617ab\") " pod="openstack/dnsmasq-dns-664766cd5c-v6774" Dec 03 07:01:10 crc kubenswrapper[4475]: I1203 07:01:10.927832 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/f5472efc-49e7-4b70-9084-da3a969617ab-ovsdbserver-nb\") pod \"dnsmasq-dns-664766cd5c-v6774\" (UID: \"f5472efc-49e7-4b70-9084-da3a969617ab\") " pod="openstack/dnsmasq-dns-664766cd5c-v6774" Dec 03 07:01:10 crc kubenswrapper[4475]: I1203 07:01:10.927860 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c61afd1c-48a7-47f3-b27d-6c6b56e4ce86-combined-ca-bundle\") pod \"heat-api-6c87685856-7spht\" (UID: \"c61afd1c-48a7-47f3-b27d-6c6b56e4ce86\") " pod="openstack/heat-api-6c87685856-7spht" Dec 03 07:01:10 crc kubenswrapper[4475]: I1203 07:01:10.927884 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x4qjk\" (UniqueName: 
\"kubernetes.io/projected/c61afd1c-48a7-47f3-b27d-6c6b56e4ce86-kube-api-access-x4qjk\") pod \"heat-api-6c87685856-7spht\" (UID: \"c61afd1c-48a7-47f3-b27d-6c6b56e4ce86\") " pod="openstack/heat-api-6c87685856-7spht" Dec 03 07:01:10 crc kubenswrapper[4475]: I1203 07:01:10.927912 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hkjhh\" (UniqueName: \"kubernetes.io/projected/f5472efc-49e7-4b70-9084-da3a969617ab-kube-api-access-hkjhh\") pod \"dnsmasq-dns-664766cd5c-v6774\" (UID: \"f5472efc-49e7-4b70-9084-da3a969617ab\") " pod="openstack/dnsmasq-dns-664766cd5c-v6774" Dec 03 07:01:10 crc kubenswrapper[4475]: I1203 07:01:10.927939 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/c61afd1c-48a7-47f3-b27d-6c6b56e4ce86-config-data-custom\") pod \"heat-api-6c87685856-7spht\" (UID: \"c61afd1c-48a7-47f3-b27d-6c6b56e4ce86\") " pod="openstack/heat-api-6c87685856-7spht" Dec 03 07:01:10 crc kubenswrapper[4475]: I1203 07:01:10.928848 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/f5472efc-49e7-4b70-9084-da3a969617ab-ovsdbserver-nb\") pod \"dnsmasq-dns-664766cd5c-v6774\" (UID: \"f5472efc-49e7-4b70-9084-da3a969617ab\") " pod="openstack/dnsmasq-dns-664766cd5c-v6774" Dec 03 07:01:10 crc kubenswrapper[4475]: I1203 07:01:10.929043 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f5472efc-49e7-4b70-9084-da3a969617ab-config\") pod \"dnsmasq-dns-664766cd5c-v6774\" (UID: \"f5472efc-49e7-4b70-9084-da3a969617ab\") " pod="openstack/dnsmasq-dns-664766cd5c-v6774" Dec 03 07:01:10 crc kubenswrapper[4475]: I1203 07:01:10.929650 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: 
\"kubernetes.io/configmap/f5472efc-49e7-4b70-9084-da3a969617ab-dns-swift-storage-0\") pod \"dnsmasq-dns-664766cd5c-v6774\" (UID: \"f5472efc-49e7-4b70-9084-da3a969617ab\") " pod="openstack/dnsmasq-dns-664766cd5c-v6774" Dec 03 07:01:10 crc kubenswrapper[4475]: I1203 07:01:10.930861 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/f5472efc-49e7-4b70-9084-da3a969617ab-dns-svc\") pod \"dnsmasq-dns-664766cd5c-v6774\" (UID: \"f5472efc-49e7-4b70-9084-da3a969617ab\") " pod="openstack/dnsmasq-dns-664766cd5c-v6774" Dec 03 07:01:10 crc kubenswrapper[4475]: I1203 07:01:10.931140 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/f5472efc-49e7-4b70-9084-da3a969617ab-ovsdbserver-sb\") pod \"dnsmasq-dns-664766cd5c-v6774\" (UID: \"f5472efc-49e7-4b70-9084-da3a969617ab\") " pod="openstack/dnsmasq-dns-664766cd5c-v6774" Dec 03 07:01:10 crc kubenswrapper[4475]: I1203 07:01:10.967399 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hkjhh\" (UniqueName: \"kubernetes.io/projected/f5472efc-49e7-4b70-9084-da3a969617ab-kube-api-access-hkjhh\") pod \"dnsmasq-dns-664766cd5c-v6774\" (UID: \"f5472efc-49e7-4b70-9084-da3a969617ab\") " pod="openstack/dnsmasq-dns-664766cd5c-v6774" Dec 03 07:01:11 crc kubenswrapper[4475]: I1203 07:01:11.029400 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c61afd1c-48a7-47f3-b27d-6c6b56e4ce86-combined-ca-bundle\") pod \"heat-api-6c87685856-7spht\" (UID: \"c61afd1c-48a7-47f3-b27d-6c6b56e4ce86\") " pod="openstack/heat-api-6c87685856-7spht" Dec 03 07:01:11 crc kubenswrapper[4475]: I1203 07:01:11.029832 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-x4qjk\" (UniqueName: 
\"kubernetes.io/projected/c61afd1c-48a7-47f3-b27d-6c6b56e4ce86-kube-api-access-x4qjk\") pod \"heat-api-6c87685856-7spht\" (UID: \"c61afd1c-48a7-47f3-b27d-6c6b56e4ce86\") " pod="openstack/heat-api-6c87685856-7spht" Dec 03 07:01:11 crc kubenswrapper[4475]: I1203 07:01:11.029886 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/c61afd1c-48a7-47f3-b27d-6c6b56e4ce86-config-data-custom\") pod \"heat-api-6c87685856-7spht\" (UID: \"c61afd1c-48a7-47f3-b27d-6c6b56e4ce86\") " pod="openstack/heat-api-6c87685856-7spht" Dec 03 07:01:11 crc kubenswrapper[4475]: I1203 07:01:11.029930 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c61afd1c-48a7-47f3-b27d-6c6b56e4ce86-config-data\") pod \"heat-api-6c87685856-7spht\" (UID: \"c61afd1c-48a7-47f3-b27d-6c6b56e4ce86\") " pod="openstack/heat-api-6c87685856-7spht" Dec 03 07:01:11 crc kubenswrapper[4475]: I1203 07:01:11.038230 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c61afd1c-48a7-47f3-b27d-6c6b56e4ce86-combined-ca-bundle\") pod \"heat-api-6c87685856-7spht\" (UID: \"c61afd1c-48a7-47f3-b27d-6c6b56e4ce86\") " pod="openstack/heat-api-6c87685856-7spht" Dec 03 07:01:11 crc kubenswrapper[4475]: I1203 07:01:11.045141 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/c61afd1c-48a7-47f3-b27d-6c6b56e4ce86-config-data-custom\") pod \"heat-api-6c87685856-7spht\" (UID: \"c61afd1c-48a7-47f3-b27d-6c6b56e4ce86\") " pod="openstack/heat-api-6c87685856-7spht" Dec 03 07:01:11 crc kubenswrapper[4475]: I1203 07:01:11.045996 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c61afd1c-48a7-47f3-b27d-6c6b56e4ce86-config-data\") pod 
\"heat-api-6c87685856-7spht\" (UID: \"c61afd1c-48a7-47f3-b27d-6c6b56e4ce86\") " pod="openstack/heat-api-6c87685856-7spht" Dec 03 07:01:11 crc kubenswrapper[4475]: I1203 07:01:11.051261 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-x4qjk\" (UniqueName: \"kubernetes.io/projected/c61afd1c-48a7-47f3-b27d-6c6b56e4ce86-kube-api-access-x4qjk\") pod \"heat-api-6c87685856-7spht\" (UID: \"c61afd1c-48a7-47f3-b27d-6c6b56e4ce86\") " pod="openstack/heat-api-6c87685856-7spht" Dec 03 07:01:11 crc kubenswrapper[4475]: I1203 07:01:11.177073 4475 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-664766cd5c-v6774" Dec 03 07:01:11 crc kubenswrapper[4475]: I1203 07:01:11.193928 4475 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/heat-api-6c87685856-7spht" Dec 03 07:01:11 crc kubenswrapper[4475]: I1203 07:01:11.441877 4475 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-engine-74fdff459b-tj7xb"] Dec 03 07:01:11 crc kubenswrapper[4475]: W1203 07:01:11.443085 4475 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod47298c15_f6d2_463f_a97b_1b4d63999b81.slice/crio-c1ff6f2a7eab492e3539bc718bc309dd4305fcda6810a673ddf4d273bbc60c89 WatchSource:0}: Error finding container c1ff6f2a7eab492e3539bc718bc309dd4305fcda6810a673ddf4d273bbc60c89: Status 404 returned error can't find the container with id c1ff6f2a7eab492e3539bc718bc309dd4305fcda6810a673ddf4d273bbc60c89 Dec 03 07:01:11 crc kubenswrapper[4475]: I1203 07:01:11.524427 4475 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ad1310ef-4548-4b61-8cdb-2a946873fb9c" path="/var/lib/kubelet/pods/ad1310ef-4548-4b61-8cdb-2a946873fb9c/volumes" Dec 03 07:01:11 crc kubenswrapper[4475]: I1203 07:01:11.524983 4475 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" 
podUID="c455da32-2255-4caa-8734-33f188430025" path="/var/lib/kubelet/pods/c455da32-2255-4caa-8734-33f188430025/volumes" Dec 03 07:01:11 crc kubenswrapper[4475]: I1203 07:01:11.569962 4475 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-cfnapi-58d4897ff7-tjzk8"] Dec 03 07:01:11 crc kubenswrapper[4475]: I1203 07:01:11.641783 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-engine-74fdff459b-tj7xb" event={"ID":"47298c15-f6d2-463f-a97b-1b4d63999b81","Type":"ContainerStarted","Data":"c1ff6f2a7eab492e3539bc718bc309dd4305fcda6810a673ddf4d273bbc60c89"} Dec 03 07:01:11 crc kubenswrapper[4475]: I1203 07:01:11.676260 4475 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstackclient" Dec 03 07:01:11 crc kubenswrapper[4475]: I1203 07:01:11.676506 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstackclient" event={"ID":"bc0a891f-83ae-4592-ac88-d4bc4359b4d9","Type":"ContainerStarted","Data":"7da5e356923a81689b5c43f701823cf75055bf64a6b2fc1a3eb4fd026cf61373"} Dec 03 07:01:11 crc kubenswrapper[4475]: I1203 07:01:11.701764 4475 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openstack/openstackclient" oldPodUID="ad1310ef-4548-4b61-8cdb-2a946873fb9c" podUID="bc0a891f-83ae-4592-ac88-d4bc4359b4d9" Dec 03 07:01:11 crc kubenswrapper[4475]: I1203 07:01:11.979833 4475 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-664766cd5c-v6774"] Dec 03 07:01:12 crc kubenswrapper[4475]: I1203 07:01:12.051908 4475 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-api-6c87685856-7spht"] Dec 03 07:01:12 crc kubenswrapper[4475]: I1203 07:01:12.185750 4475 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/cinder-scheduler-0" Dec 03 07:01:12 crc kubenswrapper[4475]: I1203 07:01:12.686873 4475 generic.go:334] "Generic (PLEG): container finished" 
podID="f5472efc-49e7-4b70-9084-da3a969617ab" containerID="7e56b591603e9abe007ee9aab5276272c4c13159114ac185674366f8e64a902a" exitCode=0 Dec 03 07:01:12 crc kubenswrapper[4475]: I1203 07:01:12.686931 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-664766cd5c-v6774" event={"ID":"f5472efc-49e7-4b70-9084-da3a969617ab","Type":"ContainerDied","Data":"7e56b591603e9abe007ee9aab5276272c4c13159114ac185674366f8e64a902a"} Dec 03 07:01:12 crc kubenswrapper[4475]: I1203 07:01:12.687145 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-664766cd5c-v6774" event={"ID":"f5472efc-49e7-4b70-9084-da3a969617ab","Type":"ContainerStarted","Data":"dbeb829628985db70eb888fc8bf22ac76850d1122607e8e409c074ead2853120"} Dec 03 07:01:12 crc kubenswrapper[4475]: I1203 07:01:12.723090 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-engine-74fdff459b-tj7xb" event={"ID":"47298c15-f6d2-463f-a97b-1b4d63999b81","Type":"ContainerStarted","Data":"0453b3dcee79a0359c57a479abaf3ac45f6a7778198ee6091769d53243ac05d2"} Dec 03 07:01:12 crc kubenswrapper[4475]: I1203 07:01:12.724174 4475 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/heat-engine-74fdff459b-tj7xb" Dec 03 07:01:12 crc kubenswrapper[4475]: I1203 07:01:12.733568 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-cfnapi-58d4897ff7-tjzk8" event={"ID":"6b5cf0ec-4d62-4de9-ad66-f758113443e7","Type":"ContainerStarted","Data":"f9481c5cb082cf4ab9451fb99c086ae8d7ffd1a17c5b5c87d7bdcc55946eb684"} Dec 03 07:01:12 crc kubenswrapper[4475]: I1203 07:01:12.734876 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-api-6c87685856-7spht" event={"ID":"c61afd1c-48a7-47f3-b27d-6c6b56e4ce86","Type":"ContainerStarted","Data":"d0f46804a2408743f46d79fd37d217a81c23770403618e4ef4c9950af5e33b6a"} Dec 03 07:01:12 crc kubenswrapper[4475]: I1203 07:01:12.748558 4475 pod_startup_latency_tracker.go:104] "Observed pod 
startup duration" pod="openstack/heat-engine-74fdff459b-tj7xb" podStartSLOduration=2.748549299 podStartE2EDuration="2.748549299s" podCreationTimestamp="2025-12-03 07:01:10 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 07:01:12.7425904 +0000 UTC m=+957.547488735" watchObservedRunningTime="2025-12-03 07:01:12.748549299 +0000 UTC m=+957.553447633" Dec 03 07:01:13 crc kubenswrapper[4475]: I1203 07:01:13.744676 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-664766cd5c-v6774" event={"ID":"f5472efc-49e7-4b70-9084-da3a969617ab","Type":"ContainerStarted","Data":"698e47c272c2b90493405777f693fd83df0b75195f388a0cc0e50059bbf82f9e"} Dec 03 07:01:13 crc kubenswrapper[4475]: I1203 07:01:13.762531 4475 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-664766cd5c-v6774" podStartSLOduration=3.762517376 podStartE2EDuration="3.762517376s" podCreationTimestamp="2025-12-03 07:01:10 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 07:01:13.757051896 +0000 UTC m=+958.561950230" watchObservedRunningTime="2025-12-03 07:01:13.762517376 +0000 UTC m=+958.567415710" Dec 03 07:01:14 crc kubenswrapper[4475]: I1203 07:01:14.758782 4475 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-664766cd5c-v6774" Dec 03 07:01:14 crc kubenswrapper[4475]: I1203 07:01:14.824920 4475 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/horizon-7fc4d79b88-s8hhg" podUID="2401beb9-38b8-4581-b9a2-8bb16e15e6c1" containerName="horizon" probeResult="failure" output="Get \"https://10.217.0.146:8443/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.0.146:8443: connect: connection refused" Dec 03 07:01:15 crc kubenswrapper[4475]: I1203 07:01:15.774154 4475 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="openstack/heat-cfnapi-58d4897ff7-tjzk8" event={"ID":"6b5cf0ec-4d62-4de9-ad66-f758113443e7","Type":"ContainerStarted","Data":"ac7e14f9da3989dbfdc8572701d4cd54c6c606817f1b3bcd3ae56c896a5a8d03"} Dec 03 07:01:15 crc kubenswrapper[4475]: I1203 07:01:15.774324 4475 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/heat-cfnapi-58d4897ff7-tjzk8" Dec 03 07:01:15 crc kubenswrapper[4475]: I1203 07:01:15.777486 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-api-6c87685856-7spht" event={"ID":"c61afd1c-48a7-47f3-b27d-6c6b56e4ce86","Type":"ContainerStarted","Data":"cc503027a80c1b9e5c723fde9fd83b6e889ebccf00d47eea3a2911427649fddb"} Dec 03 07:01:15 crc kubenswrapper[4475]: I1203 07:01:15.777567 4475 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/heat-api-6c87685856-7spht" Dec 03 07:01:15 crc kubenswrapper[4475]: I1203 07:01:15.794609 4475 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/heat-cfnapi-58d4897ff7-tjzk8" podStartSLOduration=2.387806657 podStartE2EDuration="5.79459425s" podCreationTimestamp="2025-12-03 07:01:10 +0000 UTC" firstStartedPulling="2025-12-03 07:01:11.633468625 +0000 UTC m=+956.438366959" lastFinishedPulling="2025-12-03 07:01:15.040256218 +0000 UTC m=+959.845154552" observedRunningTime="2025-12-03 07:01:15.78562559 +0000 UTC m=+960.590523924" watchObservedRunningTime="2025-12-03 07:01:15.79459425 +0000 UTC m=+960.599492584" Dec 03 07:01:15 crc kubenswrapper[4475]: I1203 07:01:15.811916 4475 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/heat-api-6c87685856-7spht" podStartSLOduration=2.833623858 podStartE2EDuration="5.811903509s" podCreationTimestamp="2025-12-03 07:01:10 +0000 UTC" firstStartedPulling="2025-12-03 07:01:12.064649484 +0000 UTC m=+956.869547819" lastFinishedPulling="2025-12-03 07:01:15.042929136 +0000 UTC m=+959.847827470" observedRunningTime="2025-12-03 
07:01:15.809695146 +0000 UTC m=+960.614593470" watchObservedRunningTime="2025-12-03 07:01:15.811903509 +0000 UTC m=+960.616801844" Dec 03 07:01:16 crc kubenswrapper[4475]: I1203 07:01:16.696323 4475 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/heat-engine-65549d6dc8-kp5n5"] Dec 03 07:01:16 crc kubenswrapper[4475]: I1203 07:01:16.697632 4475 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/heat-engine-65549d6dc8-kp5n5" Dec 03 07:01:16 crc kubenswrapper[4475]: I1203 07:01:16.714667 4475 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/heat-api-74bbb49977-5dq74"] Dec 03 07:01:16 crc kubenswrapper[4475]: I1203 07:01:16.715592 4475 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/heat-api-74bbb49977-5dq74" Dec 03 07:01:16 crc kubenswrapper[4475]: I1203 07:01:16.720568 4475 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/heat-cfnapi-7d7bff4588-wzmxn"] Dec 03 07:01:16 crc kubenswrapper[4475]: I1203 07:01:16.722241 4475 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/heat-cfnapi-7d7bff4588-wzmxn" Dec 03 07:01:16 crc kubenswrapper[4475]: I1203 07:01:16.741204 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zqs7s\" (UniqueName: \"kubernetes.io/projected/09f28e9b-cb6e-41f5-b0fe-e32cd1d7fb43-kube-api-access-zqs7s\") pod \"heat-engine-65549d6dc8-kp5n5\" (UID: \"09f28e9b-cb6e-41f5-b0fe-e32cd1d7fb43\") " pod="openstack/heat-engine-65549d6dc8-kp5n5" Dec 03 07:01:16 crc kubenswrapper[4475]: I1203 07:01:16.741265 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cdv2b\" (UniqueName: \"kubernetes.io/projected/b58fd1e8-c5b6-4a83-b9b0-15c6eb22495d-kube-api-access-cdv2b\") pod \"heat-api-74bbb49977-5dq74\" (UID: \"b58fd1e8-c5b6-4a83-b9b0-15c6eb22495d\") " pod="openstack/heat-api-74bbb49977-5dq74" Dec 03 07:01:16 crc kubenswrapper[4475]: I1203 07:01:16.741289 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b58fd1e8-c5b6-4a83-b9b0-15c6eb22495d-config-data\") pod \"heat-api-74bbb49977-5dq74\" (UID: \"b58fd1e8-c5b6-4a83-b9b0-15c6eb22495d\") " pod="openstack/heat-api-74bbb49977-5dq74" Dec 03 07:01:16 crc kubenswrapper[4475]: I1203 07:01:16.741317 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/09f28e9b-cb6e-41f5-b0fe-e32cd1d7fb43-config-data-custom\") pod \"heat-engine-65549d6dc8-kp5n5\" (UID: \"09f28e9b-cb6e-41f5-b0fe-e32cd1d7fb43\") " pod="openstack/heat-engine-65549d6dc8-kp5n5" Dec 03 07:01:16 crc kubenswrapper[4475]: I1203 07:01:16.741344 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/b58fd1e8-c5b6-4a83-b9b0-15c6eb22495d-config-data-custom\") pod 
\"heat-api-74bbb49977-5dq74\" (UID: \"b58fd1e8-c5b6-4a83-b9b0-15c6eb22495d\") " pod="openstack/heat-api-74bbb49977-5dq74" Dec 03 07:01:16 crc kubenswrapper[4475]: I1203 07:01:16.741374 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b58fd1e8-c5b6-4a83-b9b0-15c6eb22495d-combined-ca-bundle\") pod \"heat-api-74bbb49977-5dq74\" (UID: \"b58fd1e8-c5b6-4a83-b9b0-15c6eb22495d\") " pod="openstack/heat-api-74bbb49977-5dq74" Dec 03 07:01:16 crc kubenswrapper[4475]: I1203 07:01:16.741390 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/09f28e9b-cb6e-41f5-b0fe-e32cd1d7fb43-combined-ca-bundle\") pod \"heat-engine-65549d6dc8-kp5n5\" (UID: \"09f28e9b-cb6e-41f5-b0fe-e32cd1d7fb43\") " pod="openstack/heat-engine-65549d6dc8-kp5n5" Dec 03 07:01:16 crc kubenswrapper[4475]: I1203 07:01:16.741415 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/09f28e9b-cb6e-41f5-b0fe-e32cd1d7fb43-config-data\") pod \"heat-engine-65549d6dc8-kp5n5\" (UID: \"09f28e9b-cb6e-41f5-b0fe-e32cd1d7fb43\") " pod="openstack/heat-engine-65549d6dc8-kp5n5" Dec 03 07:01:16 crc kubenswrapper[4475]: I1203 07:01:16.752383 4475 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-api-74bbb49977-5dq74"] Dec 03 07:01:16 crc kubenswrapper[4475]: I1203 07:01:16.759752 4475 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-cfnapi-7d7bff4588-wzmxn"] Dec 03 07:01:16 crc kubenswrapper[4475]: I1203 07:01:16.809162 4475 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-engine-65549d6dc8-kp5n5"] Dec 03 07:01:16 crc kubenswrapper[4475]: I1203 07:01:16.843506 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"config-data\" (UniqueName: \"kubernetes.io/secret/88fc5e9a-b3f3-421c-9b93-939e11a9f81f-config-data\") pod \"heat-cfnapi-7d7bff4588-wzmxn\" (UID: \"88fc5e9a-b3f3-421c-9b93-939e11a9f81f\") " pod="openstack/heat-cfnapi-7d7bff4588-wzmxn" Dec 03 07:01:16 crc kubenswrapper[4475]: I1203 07:01:16.843560 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/88fc5e9a-b3f3-421c-9b93-939e11a9f81f-combined-ca-bundle\") pod \"heat-cfnapi-7d7bff4588-wzmxn\" (UID: \"88fc5e9a-b3f3-421c-9b93-939e11a9f81f\") " pod="openstack/heat-cfnapi-7d7bff4588-wzmxn" Dec 03 07:01:16 crc kubenswrapper[4475]: I1203 07:01:16.843595 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cdv2b\" (UniqueName: \"kubernetes.io/projected/b58fd1e8-c5b6-4a83-b9b0-15c6eb22495d-kube-api-access-cdv2b\") pod \"heat-api-74bbb49977-5dq74\" (UID: \"b58fd1e8-c5b6-4a83-b9b0-15c6eb22495d\") " pod="openstack/heat-api-74bbb49977-5dq74" Dec 03 07:01:16 crc kubenswrapper[4475]: I1203 07:01:16.843617 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dhg57\" (UniqueName: \"kubernetes.io/projected/88fc5e9a-b3f3-421c-9b93-939e11a9f81f-kube-api-access-dhg57\") pod \"heat-cfnapi-7d7bff4588-wzmxn\" (UID: \"88fc5e9a-b3f3-421c-9b93-939e11a9f81f\") " pod="openstack/heat-cfnapi-7d7bff4588-wzmxn" Dec 03 07:01:16 crc kubenswrapper[4475]: I1203 07:01:16.843636 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b58fd1e8-c5b6-4a83-b9b0-15c6eb22495d-config-data\") pod \"heat-api-74bbb49977-5dq74\" (UID: \"b58fd1e8-c5b6-4a83-b9b0-15c6eb22495d\") " pod="openstack/heat-api-74bbb49977-5dq74" Dec 03 07:01:16 crc kubenswrapper[4475]: I1203 07:01:16.843669 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"config-data-custom\" (UniqueName: \"kubernetes.io/secret/09f28e9b-cb6e-41f5-b0fe-e32cd1d7fb43-config-data-custom\") pod \"heat-engine-65549d6dc8-kp5n5\" (UID: \"09f28e9b-cb6e-41f5-b0fe-e32cd1d7fb43\") " pod="openstack/heat-engine-65549d6dc8-kp5n5" Dec 03 07:01:16 crc kubenswrapper[4475]: I1203 07:01:16.843687 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/88fc5e9a-b3f3-421c-9b93-939e11a9f81f-config-data-custom\") pod \"heat-cfnapi-7d7bff4588-wzmxn\" (UID: \"88fc5e9a-b3f3-421c-9b93-939e11a9f81f\") " pod="openstack/heat-cfnapi-7d7bff4588-wzmxn" Dec 03 07:01:16 crc kubenswrapper[4475]: I1203 07:01:16.843738 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/b58fd1e8-c5b6-4a83-b9b0-15c6eb22495d-config-data-custom\") pod \"heat-api-74bbb49977-5dq74\" (UID: \"b58fd1e8-c5b6-4a83-b9b0-15c6eb22495d\") " pod="openstack/heat-api-74bbb49977-5dq74" Dec 03 07:01:16 crc kubenswrapper[4475]: I1203 07:01:16.843880 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b58fd1e8-c5b6-4a83-b9b0-15c6eb22495d-combined-ca-bundle\") pod \"heat-api-74bbb49977-5dq74\" (UID: \"b58fd1e8-c5b6-4a83-b9b0-15c6eb22495d\") " pod="openstack/heat-api-74bbb49977-5dq74" Dec 03 07:01:16 crc kubenswrapper[4475]: I1203 07:01:16.843901 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/09f28e9b-cb6e-41f5-b0fe-e32cd1d7fb43-combined-ca-bundle\") pod \"heat-engine-65549d6dc8-kp5n5\" (UID: \"09f28e9b-cb6e-41f5-b0fe-e32cd1d7fb43\") " pod="openstack/heat-engine-65549d6dc8-kp5n5" Dec 03 07:01:16 crc kubenswrapper[4475]: I1203 07:01:16.843948 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" 
(UniqueName: \"kubernetes.io/secret/09f28e9b-cb6e-41f5-b0fe-e32cd1d7fb43-config-data\") pod \"heat-engine-65549d6dc8-kp5n5\" (UID: \"09f28e9b-cb6e-41f5-b0fe-e32cd1d7fb43\") " pod="openstack/heat-engine-65549d6dc8-kp5n5" Dec 03 07:01:16 crc kubenswrapper[4475]: I1203 07:01:16.844005 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zqs7s\" (UniqueName: \"kubernetes.io/projected/09f28e9b-cb6e-41f5-b0fe-e32cd1d7fb43-kube-api-access-zqs7s\") pod \"heat-engine-65549d6dc8-kp5n5\" (UID: \"09f28e9b-cb6e-41f5-b0fe-e32cd1d7fb43\") " pod="openstack/heat-engine-65549d6dc8-kp5n5" Dec 03 07:01:16 crc kubenswrapper[4475]: I1203 07:01:16.850051 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/09f28e9b-cb6e-41f5-b0fe-e32cd1d7fb43-config-data-custom\") pod \"heat-engine-65549d6dc8-kp5n5\" (UID: \"09f28e9b-cb6e-41f5-b0fe-e32cd1d7fb43\") " pod="openstack/heat-engine-65549d6dc8-kp5n5" Dec 03 07:01:16 crc kubenswrapper[4475]: I1203 07:01:16.851030 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b58fd1e8-c5b6-4a83-b9b0-15c6eb22495d-config-data\") pod \"heat-api-74bbb49977-5dq74\" (UID: \"b58fd1e8-c5b6-4a83-b9b0-15c6eb22495d\") " pod="openstack/heat-api-74bbb49977-5dq74" Dec 03 07:01:16 crc kubenswrapper[4475]: I1203 07:01:16.853728 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/09f28e9b-cb6e-41f5-b0fe-e32cd1d7fb43-config-data\") pod \"heat-engine-65549d6dc8-kp5n5\" (UID: \"09f28e9b-cb6e-41f5-b0fe-e32cd1d7fb43\") " pod="openstack/heat-engine-65549d6dc8-kp5n5" Dec 03 07:01:16 crc kubenswrapper[4475]: I1203 07:01:16.854745 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: 
\"kubernetes.io/secret/b58fd1e8-c5b6-4a83-b9b0-15c6eb22495d-config-data-custom\") pod \"heat-api-74bbb49977-5dq74\" (UID: \"b58fd1e8-c5b6-4a83-b9b0-15c6eb22495d\") " pod="openstack/heat-api-74bbb49977-5dq74" Dec 03 07:01:16 crc kubenswrapper[4475]: I1203 07:01:16.858274 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/09f28e9b-cb6e-41f5-b0fe-e32cd1d7fb43-combined-ca-bundle\") pod \"heat-engine-65549d6dc8-kp5n5\" (UID: \"09f28e9b-cb6e-41f5-b0fe-e32cd1d7fb43\") " pod="openstack/heat-engine-65549d6dc8-kp5n5" Dec 03 07:01:16 crc kubenswrapper[4475]: I1203 07:01:16.860609 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zqs7s\" (UniqueName: \"kubernetes.io/projected/09f28e9b-cb6e-41f5-b0fe-e32cd1d7fb43-kube-api-access-zqs7s\") pod \"heat-engine-65549d6dc8-kp5n5\" (UID: \"09f28e9b-cb6e-41f5-b0fe-e32cd1d7fb43\") " pod="openstack/heat-engine-65549d6dc8-kp5n5" Dec 03 07:01:16 crc kubenswrapper[4475]: I1203 07:01:16.862779 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b58fd1e8-c5b6-4a83-b9b0-15c6eb22495d-combined-ca-bundle\") pod \"heat-api-74bbb49977-5dq74\" (UID: \"b58fd1e8-c5b6-4a83-b9b0-15c6eb22495d\") " pod="openstack/heat-api-74bbb49977-5dq74" Dec 03 07:01:16 crc kubenswrapper[4475]: I1203 07:01:16.864832 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cdv2b\" (UniqueName: \"kubernetes.io/projected/b58fd1e8-c5b6-4a83-b9b0-15c6eb22495d-kube-api-access-cdv2b\") pod \"heat-api-74bbb49977-5dq74\" (UID: \"b58fd1e8-c5b6-4a83-b9b0-15c6eb22495d\") " pod="openstack/heat-api-74bbb49977-5dq74" Dec 03 07:01:16 crc kubenswrapper[4475]: I1203 07:01:16.944853 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/88fc5e9a-b3f3-421c-9b93-939e11a9f81f-config-data\") pod \"heat-cfnapi-7d7bff4588-wzmxn\" (UID: \"88fc5e9a-b3f3-421c-9b93-939e11a9f81f\") " pod="openstack/heat-cfnapi-7d7bff4588-wzmxn" Dec 03 07:01:16 crc kubenswrapper[4475]: I1203 07:01:16.944905 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/88fc5e9a-b3f3-421c-9b93-939e11a9f81f-combined-ca-bundle\") pod \"heat-cfnapi-7d7bff4588-wzmxn\" (UID: \"88fc5e9a-b3f3-421c-9b93-939e11a9f81f\") " pod="openstack/heat-cfnapi-7d7bff4588-wzmxn" Dec 03 07:01:16 crc kubenswrapper[4475]: I1203 07:01:16.944937 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dhg57\" (UniqueName: \"kubernetes.io/projected/88fc5e9a-b3f3-421c-9b93-939e11a9f81f-kube-api-access-dhg57\") pod \"heat-cfnapi-7d7bff4588-wzmxn\" (UID: \"88fc5e9a-b3f3-421c-9b93-939e11a9f81f\") " pod="openstack/heat-cfnapi-7d7bff4588-wzmxn" Dec 03 07:01:16 crc kubenswrapper[4475]: I1203 07:01:16.944972 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/88fc5e9a-b3f3-421c-9b93-939e11a9f81f-config-data-custom\") pod \"heat-cfnapi-7d7bff4588-wzmxn\" (UID: \"88fc5e9a-b3f3-421c-9b93-939e11a9f81f\") " pod="openstack/heat-cfnapi-7d7bff4588-wzmxn" Dec 03 07:01:16 crc kubenswrapper[4475]: I1203 07:01:16.962126 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/88fc5e9a-b3f3-421c-9b93-939e11a9f81f-config-data-custom\") pod \"heat-cfnapi-7d7bff4588-wzmxn\" (UID: \"88fc5e9a-b3f3-421c-9b93-939e11a9f81f\") " pod="openstack/heat-cfnapi-7d7bff4588-wzmxn" Dec 03 07:01:16 crc kubenswrapper[4475]: I1203 07:01:16.962164 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/88fc5e9a-b3f3-421c-9b93-939e11a9f81f-config-data\") pod \"heat-cfnapi-7d7bff4588-wzmxn\" (UID: \"88fc5e9a-b3f3-421c-9b93-939e11a9f81f\") " pod="openstack/heat-cfnapi-7d7bff4588-wzmxn" Dec 03 07:01:16 crc kubenswrapper[4475]: I1203 07:01:16.962430 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dhg57\" (UniqueName: \"kubernetes.io/projected/88fc5e9a-b3f3-421c-9b93-939e11a9f81f-kube-api-access-dhg57\") pod \"heat-cfnapi-7d7bff4588-wzmxn\" (UID: \"88fc5e9a-b3f3-421c-9b93-939e11a9f81f\") " pod="openstack/heat-cfnapi-7d7bff4588-wzmxn" Dec 03 07:01:16 crc kubenswrapper[4475]: I1203 07:01:16.965350 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/88fc5e9a-b3f3-421c-9b93-939e11a9f81f-combined-ca-bundle\") pod \"heat-cfnapi-7d7bff4588-wzmxn\" (UID: \"88fc5e9a-b3f3-421c-9b93-939e11a9f81f\") " pod="openstack/heat-cfnapi-7d7bff4588-wzmxn" Dec 03 07:01:17 crc kubenswrapper[4475]: I1203 07:01:17.013482 4475 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/heat-engine-65549d6dc8-kp5n5" Dec 03 07:01:17 crc kubenswrapper[4475]: I1203 07:01:17.032759 4475 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/heat-api-74bbb49977-5dq74" Dec 03 07:01:17 crc kubenswrapper[4475]: I1203 07:01:17.040469 4475 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/heat-cfnapi-7d7bff4588-wzmxn" Dec 03 07:01:17 crc kubenswrapper[4475]: I1203 07:01:17.783558 4475 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-engine-65549d6dc8-kp5n5"] Dec 03 07:01:17 crc kubenswrapper[4475]: I1203 07:01:17.836581 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-engine-65549d6dc8-kp5n5" event={"ID":"09f28e9b-cb6e-41f5-b0fe-e32cd1d7fb43","Type":"ContainerStarted","Data":"61b45a0d87d5f7613a2bc99c3f682e4982ca06a000aca00ca6606b3a9f0f6111"} Dec 03 07:01:17 crc kubenswrapper[4475]: I1203 07:01:17.914527 4475 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-api-74bbb49977-5dq74"] Dec 03 07:01:17 crc kubenswrapper[4475]: I1203 07:01:17.921513 4475 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-cfnapi-7d7bff4588-wzmxn"] Dec 03 07:01:17 crc kubenswrapper[4475]: I1203 07:01:17.931963 4475 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ceilometer-0" Dec 03 07:01:17 crc kubenswrapper[4475]: W1203 07:01:17.943318 4475 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod88fc5e9a_b3f3_421c_9b93_939e11a9f81f.slice/crio-3410391fa4ebdc50afa0c5a122db210fe0a99a2fd7d39766e2a76ac7b1b781f8 WatchSource:0}: Error finding container 3410391fa4ebdc50afa0c5a122db210fe0a99a2fd7d39766e2a76ac7b1b781f8: Status 404 returned error can't find the container with id 3410391fa4ebdc50afa0c5a122db210fe0a99a2fd7d39766e2a76ac7b1b781f8 Dec 03 07:01:18 crc kubenswrapper[4475]: I1203 07:01:18.862963 4475 generic.go:334] "Generic (PLEG): container finished" podID="b58fd1e8-c5b6-4a83-b9b0-15c6eb22495d" containerID="c529f313f3e60c39a120a6bae259542f17bdfd9cfea7343f6afff36f2d135317" exitCode=1 Dec 03 07:01:18 crc kubenswrapper[4475]: I1203 07:01:18.863214 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-api-74bbb49977-5dq74" 
event={"ID":"b58fd1e8-c5b6-4a83-b9b0-15c6eb22495d","Type":"ContainerDied","Data":"c529f313f3e60c39a120a6bae259542f17bdfd9cfea7343f6afff36f2d135317"} Dec 03 07:01:18 crc kubenswrapper[4475]: I1203 07:01:18.863241 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-api-74bbb49977-5dq74" event={"ID":"b58fd1e8-c5b6-4a83-b9b0-15c6eb22495d","Type":"ContainerStarted","Data":"c76c2c59dd30460e463bfa5c684554f29958402a2793e148c47355f7f48c801d"} Dec 03 07:01:18 crc kubenswrapper[4475]: I1203 07:01:18.863670 4475 scope.go:117] "RemoveContainer" containerID="c529f313f3e60c39a120a6bae259542f17bdfd9cfea7343f6afff36f2d135317" Dec 03 07:01:18 crc kubenswrapper[4475]: I1203 07:01:18.865160 4475 generic.go:334] "Generic (PLEG): container finished" podID="88fc5e9a-b3f3-421c-9b93-939e11a9f81f" containerID="98f32a0f5e7db47ee82aa63b9fc5094dc95aee4de0b651d226733b7b866440da" exitCode=1 Dec 03 07:01:18 crc kubenswrapper[4475]: I1203 07:01:18.865209 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-cfnapi-7d7bff4588-wzmxn" event={"ID":"88fc5e9a-b3f3-421c-9b93-939e11a9f81f","Type":"ContainerDied","Data":"98f32a0f5e7db47ee82aa63b9fc5094dc95aee4de0b651d226733b7b866440da"} Dec 03 07:01:18 crc kubenswrapper[4475]: I1203 07:01:18.865223 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-cfnapi-7d7bff4588-wzmxn" event={"ID":"88fc5e9a-b3f3-421c-9b93-939e11a9f81f","Type":"ContainerStarted","Data":"3410391fa4ebdc50afa0c5a122db210fe0a99a2fd7d39766e2a76ac7b1b781f8"} Dec 03 07:01:18 crc kubenswrapper[4475]: I1203 07:01:18.865811 4475 scope.go:117] "RemoveContainer" containerID="98f32a0f5e7db47ee82aa63b9fc5094dc95aee4de0b651d226733b7b866440da" Dec 03 07:01:18 crc kubenswrapper[4475]: I1203 07:01:18.867362 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-engine-65549d6dc8-kp5n5" 
event={"ID":"09f28e9b-cb6e-41f5-b0fe-e32cd1d7fb43","Type":"ContainerStarted","Data":"3543926b01f36fb4a3abb8cca2d47dfb8c8cb3ddfbc35f07114a244e2b50a840"} Dec 03 07:01:18 crc kubenswrapper[4475]: I1203 07:01:18.867598 4475 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/heat-engine-65549d6dc8-kp5n5" Dec 03 07:01:18 crc kubenswrapper[4475]: I1203 07:01:18.869070 4475 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Dec 03 07:01:18 crc kubenswrapper[4475]: I1203 07:01:18.869338 4475 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="13ed03b8-8758-4c25-b37f-2793697026d2" containerName="ceilometer-central-agent" containerID="cri-o://314eebb9f24bd7ca6ede27d5040984fbf03c937af31d92e1f4dfa30c219712b4" gracePeriod=30 Dec 03 07:01:18 crc kubenswrapper[4475]: I1203 07:01:18.869472 4475 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="13ed03b8-8758-4c25-b37f-2793697026d2" containerName="sg-core" containerID="cri-o://48b539995f961f39f0f3784b46e123e5caf6c9bfe6e2f1621226b6e95eb4a268" gracePeriod=30 Dec 03 07:01:18 crc kubenswrapper[4475]: I1203 07:01:18.869483 4475 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="13ed03b8-8758-4c25-b37f-2793697026d2" containerName="proxy-httpd" containerID="cri-o://14c25df4b313807582231974fd3e2f0724e4e3d96f0797158c2d85787b388154" gracePeriod=30 Dec 03 07:01:18 crc kubenswrapper[4475]: I1203 07:01:18.869492 4475 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="13ed03b8-8758-4c25-b37f-2793697026d2" containerName="ceilometer-notification-agent" containerID="cri-o://74a1bf735d06e69f49a7a5836c0ac147e44657c1057d939890bcb84df13ddf46" gracePeriod=30 Dec 03 07:01:18 crc kubenswrapper[4475]: I1203 07:01:18.927632 4475 pod_startup_latency_tracker.go:104] "Observed pod 
startup duration" pod="openstack/heat-engine-65549d6dc8-kp5n5" podStartSLOduration=2.927603976 podStartE2EDuration="2.927603976s" podCreationTimestamp="2025-12-03 07:01:16 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 07:01:18.919246205 +0000 UTC m=+963.724144539" watchObservedRunningTime="2025-12-03 07:01:18.927603976 +0000 UTC m=+963.732502310" Dec 03 07:01:19 crc kubenswrapper[4475]: I1203 07:01:19.047791 4475 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/heat-cfnapi-58d4897ff7-tjzk8"] Dec 03 07:01:19 crc kubenswrapper[4475]: I1203 07:01:19.048156 4475 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/heat-cfnapi-58d4897ff7-tjzk8" podUID="6b5cf0ec-4d62-4de9-ad66-f758113443e7" containerName="heat-cfnapi" containerID="cri-o://ac7e14f9da3989dbfdc8572701d4cd54c6c606817f1b3bcd3ae56c896a5a8d03" gracePeriod=60 Dec 03 07:01:19 crc kubenswrapper[4475]: I1203 07:01:19.057684 4475 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/heat-api-6c87685856-7spht"] Dec 03 07:01:19 crc kubenswrapper[4475]: I1203 07:01:19.057851 4475 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/heat-api-6c87685856-7spht" podUID="c61afd1c-48a7-47f3-b27d-6c6b56e4ce86" containerName="heat-api" containerID="cri-o://cc503027a80c1b9e5c723fde9fd83b6e889ebccf00d47eea3a2911427649fddb" gracePeriod=60 Dec 03 07:01:19 crc kubenswrapper[4475]: I1203 07:01:19.161736 4475 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/heat-cfnapi-8465d48f48-kttgq"] Dec 03 07:01:19 crc kubenswrapper[4475]: I1203 07:01:19.163707 4475 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/heat-cfnapi-8465d48f48-kttgq" Dec 03 07:01:19 crc kubenswrapper[4475]: I1203 07:01:19.168652 4475 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-heat-cfnapi-internal-svc" Dec 03 07:01:19 crc kubenswrapper[4475]: I1203 07:01:19.170678 4475 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-heat-cfnapi-public-svc" Dec 03 07:01:19 crc kubenswrapper[4475]: I1203 07:01:19.203883 4475 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/heat-api-5869676b4b-mfgq2"] Dec 03 07:01:19 crc kubenswrapper[4475]: I1203 07:01:19.205267 4475 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/heat-api-5869676b4b-mfgq2" Dec 03 07:01:19 crc kubenswrapper[4475]: I1203 07:01:19.210277 4475 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-heat-api-public-svc" Dec 03 07:01:19 crc kubenswrapper[4475]: I1203 07:01:19.210968 4475 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-heat-api-internal-svc" Dec 03 07:01:19 crc kubenswrapper[4475]: I1203 07:01:19.215047 4475 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-cfnapi-8465d48f48-kttgq"] Dec 03 07:01:19 crc kubenswrapper[4475]: I1203 07:01:19.223481 4475 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-api-5869676b4b-mfgq2"] Dec 03 07:01:19 crc kubenswrapper[4475]: I1203 07:01:19.283660 4475 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/swift-proxy-88869455c-74p7r"] Dec 03 07:01:19 crc kubenswrapper[4475]: I1203 07:01:19.285294 4475 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-proxy-88869455c-74p7r" Dec 03 07:01:19 crc kubenswrapper[4475]: I1203 07:01:19.291787 4475 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"swift-proxy-config-data" Dec 03 07:01:19 crc kubenswrapper[4475]: I1203 07:01:19.292069 4475 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-swift-internal-svc" Dec 03 07:01:19 crc kubenswrapper[4475]: I1203 07:01:19.292275 4475 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-swift-public-svc" Dec 03 07:01:19 crc kubenswrapper[4475]: I1203 07:01:19.297599 4475 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-proxy-88869455c-74p7r"] Dec 03 07:01:19 crc kubenswrapper[4475]: I1203 07:01:19.311856 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/21c23182-6d85-4951-b96a-072f0d04bae6-config-data-custom\") pod \"heat-api-5869676b4b-mfgq2\" (UID: \"21c23182-6d85-4951-b96a-072f0d04bae6\") " pod="openstack/heat-api-5869676b4b-mfgq2" Dec 03 07:01:19 crc kubenswrapper[4475]: I1203 07:01:19.311904 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/89ecee9c-2a17-4d2b-bd66-78a1542d0f57-combined-ca-bundle\") pod \"heat-cfnapi-8465d48f48-kttgq\" (UID: \"89ecee9c-2a17-4d2b-bd66-78a1542d0f57\") " pod="openstack/heat-cfnapi-8465d48f48-kttgq" Dec 03 07:01:19 crc kubenswrapper[4475]: I1203 07:01:19.312040 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/21c23182-6d85-4951-b96a-072f0d04bae6-combined-ca-bundle\") pod \"heat-api-5869676b4b-mfgq2\" (UID: \"21c23182-6d85-4951-b96a-072f0d04bae6\") " pod="openstack/heat-api-5869676b4b-mfgq2" Dec 03 07:01:19 crc 
kubenswrapper[4475]: I1203 07:01:19.312061 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w6dvn\" (UniqueName: \"kubernetes.io/projected/21c23182-6d85-4951-b96a-072f0d04bae6-kube-api-access-w6dvn\") pod \"heat-api-5869676b4b-mfgq2\" (UID: \"21c23182-6d85-4951-b96a-072f0d04bae6\") " pod="openstack/heat-api-5869676b4b-mfgq2" Dec 03 07:01:19 crc kubenswrapper[4475]: I1203 07:01:19.312079 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/89ecee9c-2a17-4d2b-bd66-78a1542d0f57-config-data-custom\") pod \"heat-cfnapi-8465d48f48-kttgq\" (UID: \"89ecee9c-2a17-4d2b-bd66-78a1542d0f57\") " pod="openstack/heat-cfnapi-8465d48f48-kttgq" Dec 03 07:01:19 crc kubenswrapper[4475]: I1203 07:01:19.312133 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/21c23182-6d85-4951-b96a-072f0d04bae6-config-data\") pod \"heat-api-5869676b4b-mfgq2\" (UID: \"21c23182-6d85-4951-b96a-072f0d04bae6\") " pod="openstack/heat-api-5869676b4b-mfgq2" Dec 03 07:01:19 crc kubenswrapper[4475]: I1203 07:01:19.312149 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/89ecee9c-2a17-4d2b-bd66-78a1542d0f57-public-tls-certs\") pod \"heat-cfnapi-8465d48f48-kttgq\" (UID: \"89ecee9c-2a17-4d2b-bd66-78a1542d0f57\") " pod="openstack/heat-cfnapi-8465d48f48-kttgq" Dec 03 07:01:19 crc kubenswrapper[4475]: I1203 07:01:19.312170 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/21c23182-6d85-4951-b96a-072f0d04bae6-internal-tls-certs\") pod \"heat-api-5869676b4b-mfgq2\" (UID: \"21c23182-6d85-4951-b96a-072f0d04bae6\") " 
pod="openstack/heat-api-5869676b4b-mfgq2" Dec 03 07:01:19 crc kubenswrapper[4475]: I1203 07:01:19.312195 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/89ecee9c-2a17-4d2b-bd66-78a1542d0f57-config-data\") pod \"heat-cfnapi-8465d48f48-kttgq\" (UID: \"89ecee9c-2a17-4d2b-bd66-78a1542d0f57\") " pod="openstack/heat-cfnapi-8465d48f48-kttgq" Dec 03 07:01:19 crc kubenswrapper[4475]: I1203 07:01:19.312215 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/21c23182-6d85-4951-b96a-072f0d04bae6-public-tls-certs\") pod \"heat-api-5869676b4b-mfgq2\" (UID: \"21c23182-6d85-4951-b96a-072f0d04bae6\") " pod="openstack/heat-api-5869676b4b-mfgq2" Dec 03 07:01:19 crc kubenswrapper[4475]: I1203 07:01:19.312248 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/89ecee9c-2a17-4d2b-bd66-78a1542d0f57-internal-tls-certs\") pod \"heat-cfnapi-8465d48f48-kttgq\" (UID: \"89ecee9c-2a17-4d2b-bd66-78a1542d0f57\") " pod="openstack/heat-cfnapi-8465d48f48-kttgq" Dec 03 07:01:19 crc kubenswrapper[4475]: I1203 07:01:19.312285 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zjdzt\" (UniqueName: \"kubernetes.io/projected/89ecee9c-2a17-4d2b-bd66-78a1542d0f57-kube-api-access-zjdzt\") pod \"heat-cfnapi-8465d48f48-kttgq\" (UID: \"89ecee9c-2a17-4d2b-bd66-78a1542d0f57\") " pod="openstack/heat-cfnapi-8465d48f48-kttgq" Dec 03 07:01:19 crc kubenswrapper[4475]: I1203 07:01:19.414038 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/16bbbd37-150f-4b54-8fc1-eb7708ecca88-internal-tls-certs\") pod \"swift-proxy-88869455c-74p7r\" 
(UID: \"16bbbd37-150f-4b54-8fc1-eb7708ecca88\") " pod="openstack/swift-proxy-88869455c-74p7r" Dec 03 07:01:19 crc kubenswrapper[4475]: I1203 07:01:19.414092 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/21c23182-6d85-4951-b96a-072f0d04bae6-config-data-custom\") pod \"heat-api-5869676b4b-mfgq2\" (UID: \"21c23182-6d85-4951-b96a-072f0d04bae6\") " pod="openstack/heat-api-5869676b4b-mfgq2" Dec 03 07:01:19 crc kubenswrapper[4475]: I1203 07:01:19.414121 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/89ecee9c-2a17-4d2b-bd66-78a1542d0f57-combined-ca-bundle\") pod \"heat-cfnapi-8465d48f48-kttgq\" (UID: \"89ecee9c-2a17-4d2b-bd66-78a1542d0f57\") " pod="openstack/heat-cfnapi-8465d48f48-kttgq" Dec 03 07:01:19 crc kubenswrapper[4475]: I1203 07:01:19.414145 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8dc7v\" (UniqueName: \"kubernetes.io/projected/16bbbd37-150f-4b54-8fc1-eb7708ecca88-kube-api-access-8dc7v\") pod \"swift-proxy-88869455c-74p7r\" (UID: \"16bbbd37-150f-4b54-8fc1-eb7708ecca88\") " pod="openstack/swift-proxy-88869455c-74p7r" Dec 03 07:01:19 crc kubenswrapper[4475]: I1203 07:01:19.414174 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/16bbbd37-150f-4b54-8fc1-eb7708ecca88-public-tls-certs\") pod \"swift-proxy-88869455c-74p7r\" (UID: \"16bbbd37-150f-4b54-8fc1-eb7708ecca88\") " pod="openstack/swift-proxy-88869455c-74p7r" Dec 03 07:01:19 crc kubenswrapper[4475]: I1203 07:01:19.414203 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/16bbbd37-150f-4b54-8fc1-eb7708ecca88-combined-ca-bundle\") pod 
\"swift-proxy-88869455c-74p7r\" (UID: \"16bbbd37-150f-4b54-8fc1-eb7708ecca88\") " pod="openstack/swift-proxy-88869455c-74p7r" Dec 03 07:01:19 crc kubenswrapper[4475]: I1203 07:01:19.414222 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/16bbbd37-150f-4b54-8fc1-eb7708ecca88-etc-swift\") pod \"swift-proxy-88869455c-74p7r\" (UID: \"16bbbd37-150f-4b54-8fc1-eb7708ecca88\") " pod="openstack/swift-proxy-88869455c-74p7r" Dec 03 07:01:19 crc kubenswrapper[4475]: I1203 07:01:19.414251 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/21c23182-6d85-4951-b96a-072f0d04bae6-combined-ca-bundle\") pod \"heat-api-5869676b4b-mfgq2\" (UID: \"21c23182-6d85-4951-b96a-072f0d04bae6\") " pod="openstack/heat-api-5869676b4b-mfgq2" Dec 03 07:01:19 crc kubenswrapper[4475]: I1203 07:01:19.414267 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-w6dvn\" (UniqueName: \"kubernetes.io/projected/21c23182-6d85-4951-b96a-072f0d04bae6-kube-api-access-w6dvn\") pod \"heat-api-5869676b4b-mfgq2\" (UID: \"21c23182-6d85-4951-b96a-072f0d04bae6\") " pod="openstack/heat-api-5869676b4b-mfgq2" Dec 03 07:01:19 crc kubenswrapper[4475]: I1203 07:01:19.414280 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/89ecee9c-2a17-4d2b-bd66-78a1542d0f57-config-data-custom\") pod \"heat-cfnapi-8465d48f48-kttgq\" (UID: \"89ecee9c-2a17-4d2b-bd66-78a1542d0f57\") " pod="openstack/heat-cfnapi-8465d48f48-kttgq" Dec 03 07:01:19 crc kubenswrapper[4475]: I1203 07:01:19.414313 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/16bbbd37-150f-4b54-8fc1-eb7708ecca88-config-data\") pod 
\"swift-proxy-88869455c-74p7r\" (UID: \"16bbbd37-150f-4b54-8fc1-eb7708ecca88\") " pod="openstack/swift-proxy-88869455c-74p7r" Dec 03 07:01:19 crc kubenswrapper[4475]: I1203 07:01:19.414327 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/16bbbd37-150f-4b54-8fc1-eb7708ecca88-run-httpd\") pod \"swift-proxy-88869455c-74p7r\" (UID: \"16bbbd37-150f-4b54-8fc1-eb7708ecca88\") " pod="openstack/swift-proxy-88869455c-74p7r" Dec 03 07:01:19 crc kubenswrapper[4475]: I1203 07:01:19.414343 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/21c23182-6d85-4951-b96a-072f0d04bae6-config-data\") pod \"heat-api-5869676b4b-mfgq2\" (UID: \"21c23182-6d85-4951-b96a-072f0d04bae6\") " pod="openstack/heat-api-5869676b4b-mfgq2" Dec 03 07:01:19 crc kubenswrapper[4475]: I1203 07:01:19.414357 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/89ecee9c-2a17-4d2b-bd66-78a1542d0f57-public-tls-certs\") pod \"heat-cfnapi-8465d48f48-kttgq\" (UID: \"89ecee9c-2a17-4d2b-bd66-78a1542d0f57\") " pod="openstack/heat-cfnapi-8465d48f48-kttgq" Dec 03 07:01:19 crc kubenswrapper[4475]: I1203 07:01:19.414374 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/21c23182-6d85-4951-b96a-072f0d04bae6-internal-tls-certs\") pod \"heat-api-5869676b4b-mfgq2\" (UID: \"21c23182-6d85-4951-b96a-072f0d04bae6\") " pod="openstack/heat-api-5869676b4b-mfgq2" Dec 03 07:01:19 crc kubenswrapper[4475]: I1203 07:01:19.414393 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/89ecee9c-2a17-4d2b-bd66-78a1542d0f57-config-data\") pod \"heat-cfnapi-8465d48f48-kttgq\" (UID: 
\"89ecee9c-2a17-4d2b-bd66-78a1542d0f57\") " pod="openstack/heat-cfnapi-8465d48f48-kttgq" Dec 03 07:01:19 crc kubenswrapper[4475]: I1203 07:01:19.414409 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/21c23182-6d85-4951-b96a-072f0d04bae6-public-tls-certs\") pod \"heat-api-5869676b4b-mfgq2\" (UID: \"21c23182-6d85-4951-b96a-072f0d04bae6\") " pod="openstack/heat-api-5869676b4b-mfgq2" Dec 03 07:01:19 crc kubenswrapper[4475]: I1203 07:01:19.414428 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/16bbbd37-150f-4b54-8fc1-eb7708ecca88-log-httpd\") pod \"swift-proxy-88869455c-74p7r\" (UID: \"16bbbd37-150f-4b54-8fc1-eb7708ecca88\") " pod="openstack/swift-proxy-88869455c-74p7r" Dec 03 07:01:19 crc kubenswrapper[4475]: I1203 07:01:19.414503 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/89ecee9c-2a17-4d2b-bd66-78a1542d0f57-internal-tls-certs\") pod \"heat-cfnapi-8465d48f48-kttgq\" (UID: \"89ecee9c-2a17-4d2b-bd66-78a1542d0f57\") " pod="openstack/heat-cfnapi-8465d48f48-kttgq" Dec 03 07:01:19 crc kubenswrapper[4475]: I1203 07:01:19.414593 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zjdzt\" (UniqueName: \"kubernetes.io/projected/89ecee9c-2a17-4d2b-bd66-78a1542d0f57-kube-api-access-zjdzt\") pod \"heat-cfnapi-8465d48f48-kttgq\" (UID: \"89ecee9c-2a17-4d2b-bd66-78a1542d0f57\") " pod="openstack/heat-cfnapi-8465d48f48-kttgq" Dec 03 07:01:19 crc kubenswrapper[4475]: I1203 07:01:19.422352 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/89ecee9c-2a17-4d2b-bd66-78a1542d0f57-public-tls-certs\") pod \"heat-cfnapi-8465d48f48-kttgq\" (UID: 
\"89ecee9c-2a17-4d2b-bd66-78a1542d0f57\") " pod="openstack/heat-cfnapi-8465d48f48-kttgq" Dec 03 07:01:19 crc kubenswrapper[4475]: I1203 07:01:19.422672 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/21c23182-6d85-4951-b96a-072f0d04bae6-config-data-custom\") pod \"heat-api-5869676b4b-mfgq2\" (UID: \"21c23182-6d85-4951-b96a-072f0d04bae6\") " pod="openstack/heat-api-5869676b4b-mfgq2" Dec 03 07:01:19 crc kubenswrapper[4475]: I1203 07:01:19.424013 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/89ecee9c-2a17-4d2b-bd66-78a1542d0f57-internal-tls-certs\") pod \"heat-cfnapi-8465d48f48-kttgq\" (UID: \"89ecee9c-2a17-4d2b-bd66-78a1542d0f57\") " pod="openstack/heat-cfnapi-8465d48f48-kttgq" Dec 03 07:01:19 crc kubenswrapper[4475]: I1203 07:01:19.424915 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/21c23182-6d85-4951-b96a-072f0d04bae6-config-data\") pod \"heat-api-5869676b4b-mfgq2\" (UID: \"21c23182-6d85-4951-b96a-072f0d04bae6\") " pod="openstack/heat-api-5869676b4b-mfgq2" Dec 03 07:01:19 crc kubenswrapper[4475]: I1203 07:01:19.427865 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/89ecee9c-2a17-4d2b-bd66-78a1542d0f57-config-data-custom\") pod \"heat-cfnapi-8465d48f48-kttgq\" (UID: \"89ecee9c-2a17-4d2b-bd66-78a1542d0f57\") " pod="openstack/heat-cfnapi-8465d48f48-kttgq" Dec 03 07:01:19 crc kubenswrapper[4475]: I1203 07:01:19.428727 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/21c23182-6d85-4951-b96a-072f0d04bae6-public-tls-certs\") pod \"heat-api-5869676b4b-mfgq2\" (UID: \"21c23182-6d85-4951-b96a-072f0d04bae6\") " pod="openstack/heat-api-5869676b4b-mfgq2" Dec 03 07:01:19 
crc kubenswrapper[4475]: I1203 07:01:19.428867 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/21c23182-6d85-4951-b96a-072f0d04bae6-combined-ca-bundle\") pod \"heat-api-5869676b4b-mfgq2\" (UID: \"21c23182-6d85-4951-b96a-072f0d04bae6\") " pod="openstack/heat-api-5869676b4b-mfgq2" Dec 03 07:01:19 crc kubenswrapper[4475]: I1203 07:01:19.432328 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/89ecee9c-2a17-4d2b-bd66-78a1542d0f57-config-data\") pod \"heat-cfnapi-8465d48f48-kttgq\" (UID: \"89ecee9c-2a17-4d2b-bd66-78a1542d0f57\") " pod="openstack/heat-cfnapi-8465d48f48-kttgq" Dec 03 07:01:19 crc kubenswrapper[4475]: I1203 07:01:19.433731 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/89ecee9c-2a17-4d2b-bd66-78a1542d0f57-combined-ca-bundle\") pod \"heat-cfnapi-8465d48f48-kttgq\" (UID: \"89ecee9c-2a17-4d2b-bd66-78a1542d0f57\") " pod="openstack/heat-cfnapi-8465d48f48-kttgq" Dec 03 07:01:19 crc kubenswrapper[4475]: I1203 07:01:19.443935 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/21c23182-6d85-4951-b96a-072f0d04bae6-internal-tls-certs\") pod \"heat-api-5869676b4b-mfgq2\" (UID: \"21c23182-6d85-4951-b96a-072f0d04bae6\") " pod="openstack/heat-api-5869676b4b-mfgq2" Dec 03 07:01:19 crc kubenswrapper[4475]: I1203 07:01:19.451097 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-w6dvn\" (UniqueName: \"kubernetes.io/projected/21c23182-6d85-4951-b96a-072f0d04bae6-kube-api-access-w6dvn\") pod \"heat-api-5869676b4b-mfgq2\" (UID: \"21c23182-6d85-4951-b96a-072f0d04bae6\") " pod="openstack/heat-api-5869676b4b-mfgq2" Dec 03 07:01:19 crc kubenswrapper[4475]: I1203 07:01:19.473064 4475 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-zjdzt\" (UniqueName: \"kubernetes.io/projected/89ecee9c-2a17-4d2b-bd66-78a1542d0f57-kube-api-access-zjdzt\") pod \"heat-cfnapi-8465d48f48-kttgq\" (UID: \"89ecee9c-2a17-4d2b-bd66-78a1542d0f57\") " pod="openstack/heat-cfnapi-8465d48f48-kttgq" Dec 03 07:01:19 crc kubenswrapper[4475]: I1203 07:01:19.494548 4475 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/heat-cfnapi-8465d48f48-kttgq" Dec 03 07:01:19 crc kubenswrapper[4475]: I1203 07:01:19.519792 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/16bbbd37-150f-4b54-8fc1-eb7708ecca88-public-tls-certs\") pod \"swift-proxy-88869455c-74p7r\" (UID: \"16bbbd37-150f-4b54-8fc1-eb7708ecca88\") " pod="openstack/swift-proxy-88869455c-74p7r" Dec 03 07:01:19 crc kubenswrapper[4475]: I1203 07:01:19.519846 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/16bbbd37-150f-4b54-8fc1-eb7708ecca88-combined-ca-bundle\") pod \"swift-proxy-88869455c-74p7r\" (UID: \"16bbbd37-150f-4b54-8fc1-eb7708ecca88\") " pod="openstack/swift-proxy-88869455c-74p7r" Dec 03 07:01:19 crc kubenswrapper[4475]: I1203 07:01:19.519875 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/16bbbd37-150f-4b54-8fc1-eb7708ecca88-etc-swift\") pod \"swift-proxy-88869455c-74p7r\" (UID: \"16bbbd37-150f-4b54-8fc1-eb7708ecca88\") " pod="openstack/swift-proxy-88869455c-74p7r" Dec 03 07:01:19 crc kubenswrapper[4475]: I1203 07:01:19.519937 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/16bbbd37-150f-4b54-8fc1-eb7708ecca88-config-data\") pod \"swift-proxy-88869455c-74p7r\" (UID: \"16bbbd37-150f-4b54-8fc1-eb7708ecca88\") " 
pod="openstack/swift-proxy-88869455c-74p7r" Dec 03 07:01:19 crc kubenswrapper[4475]: I1203 07:01:19.519952 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/16bbbd37-150f-4b54-8fc1-eb7708ecca88-run-httpd\") pod \"swift-proxy-88869455c-74p7r\" (UID: \"16bbbd37-150f-4b54-8fc1-eb7708ecca88\") " pod="openstack/swift-proxy-88869455c-74p7r" Dec 03 07:01:19 crc kubenswrapper[4475]: I1203 07:01:19.519993 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/16bbbd37-150f-4b54-8fc1-eb7708ecca88-log-httpd\") pod \"swift-proxy-88869455c-74p7r\" (UID: \"16bbbd37-150f-4b54-8fc1-eb7708ecca88\") " pod="openstack/swift-proxy-88869455c-74p7r" Dec 03 07:01:19 crc kubenswrapper[4475]: I1203 07:01:19.520061 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/16bbbd37-150f-4b54-8fc1-eb7708ecca88-internal-tls-certs\") pod \"swift-proxy-88869455c-74p7r\" (UID: \"16bbbd37-150f-4b54-8fc1-eb7708ecca88\") " pod="openstack/swift-proxy-88869455c-74p7r" Dec 03 07:01:19 crc kubenswrapper[4475]: I1203 07:01:19.520446 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/16bbbd37-150f-4b54-8fc1-eb7708ecca88-run-httpd\") pod \"swift-proxy-88869455c-74p7r\" (UID: \"16bbbd37-150f-4b54-8fc1-eb7708ecca88\") " pod="openstack/swift-proxy-88869455c-74p7r" Dec 03 07:01:19 crc kubenswrapper[4475]: I1203 07:01:19.520672 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/16bbbd37-150f-4b54-8fc1-eb7708ecca88-log-httpd\") pod \"swift-proxy-88869455c-74p7r\" (UID: \"16bbbd37-150f-4b54-8fc1-eb7708ecca88\") " pod="openstack/swift-proxy-88869455c-74p7r" Dec 03 07:01:19 crc kubenswrapper[4475]: I1203 07:01:19.524168 4475 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/16bbbd37-150f-4b54-8fc1-eb7708ecca88-etc-swift\") pod \"swift-proxy-88869455c-74p7r\" (UID: \"16bbbd37-150f-4b54-8fc1-eb7708ecca88\") " pod="openstack/swift-proxy-88869455c-74p7r" Dec 03 07:01:19 crc kubenswrapper[4475]: I1203 07:01:19.525038 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8dc7v\" (UniqueName: \"kubernetes.io/projected/16bbbd37-150f-4b54-8fc1-eb7708ecca88-kube-api-access-8dc7v\") pod \"swift-proxy-88869455c-74p7r\" (UID: \"16bbbd37-150f-4b54-8fc1-eb7708ecca88\") " pod="openstack/swift-proxy-88869455c-74p7r" Dec 03 07:01:19 crc kubenswrapper[4475]: I1203 07:01:19.526173 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/16bbbd37-150f-4b54-8fc1-eb7708ecca88-public-tls-certs\") pod \"swift-proxy-88869455c-74p7r\" (UID: \"16bbbd37-150f-4b54-8fc1-eb7708ecca88\") " pod="openstack/swift-proxy-88869455c-74p7r" Dec 03 07:01:19 crc kubenswrapper[4475]: I1203 07:01:19.529444 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/16bbbd37-150f-4b54-8fc1-eb7708ecca88-combined-ca-bundle\") pod \"swift-proxy-88869455c-74p7r\" (UID: \"16bbbd37-150f-4b54-8fc1-eb7708ecca88\") " pod="openstack/swift-proxy-88869455c-74p7r" Dec 03 07:01:19 crc kubenswrapper[4475]: I1203 07:01:19.534469 4475 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/heat-api-5869676b4b-mfgq2" Dec 03 07:01:19 crc kubenswrapper[4475]: I1203 07:01:19.535203 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/16bbbd37-150f-4b54-8fc1-eb7708ecca88-internal-tls-certs\") pod \"swift-proxy-88869455c-74p7r\" (UID: \"16bbbd37-150f-4b54-8fc1-eb7708ecca88\") " pod="openstack/swift-proxy-88869455c-74p7r" Dec 03 07:01:19 crc kubenswrapper[4475]: I1203 07:01:19.542072 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/16bbbd37-150f-4b54-8fc1-eb7708ecca88-config-data\") pod \"swift-proxy-88869455c-74p7r\" (UID: \"16bbbd37-150f-4b54-8fc1-eb7708ecca88\") " pod="openstack/swift-proxy-88869455c-74p7r" Dec 03 07:01:19 crc kubenswrapper[4475]: I1203 07:01:19.542836 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8dc7v\" (UniqueName: \"kubernetes.io/projected/16bbbd37-150f-4b54-8fc1-eb7708ecca88-kube-api-access-8dc7v\") pod \"swift-proxy-88869455c-74p7r\" (UID: \"16bbbd37-150f-4b54-8fc1-eb7708ecca88\") " pod="openstack/swift-proxy-88869455c-74p7r" Dec 03 07:01:19 crc kubenswrapper[4475]: I1203 07:01:19.603870 4475 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-proxy-88869455c-74p7r" Dec 03 07:01:19 crc kubenswrapper[4475]: I1203 07:01:19.877201 4475 generic.go:334] "Generic (PLEG): container finished" podID="c61afd1c-48a7-47f3-b27d-6c6b56e4ce86" containerID="cc503027a80c1b9e5c723fde9fd83b6e889ebccf00d47eea3a2911427649fddb" exitCode=0 Dec 03 07:01:19 crc kubenswrapper[4475]: I1203 07:01:19.877301 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-api-6c87685856-7spht" event={"ID":"c61afd1c-48a7-47f3-b27d-6c6b56e4ce86","Type":"ContainerDied","Data":"cc503027a80c1b9e5c723fde9fd83b6e889ebccf00d47eea3a2911427649fddb"} Dec 03 07:01:19 crc kubenswrapper[4475]: I1203 07:01:19.880944 4475 generic.go:334] "Generic (PLEG): container finished" podID="13ed03b8-8758-4c25-b37f-2793697026d2" containerID="14c25df4b313807582231974fd3e2f0724e4e3d96f0797158c2d85787b388154" exitCode=0 Dec 03 07:01:19 crc kubenswrapper[4475]: I1203 07:01:19.880968 4475 generic.go:334] "Generic (PLEG): container finished" podID="13ed03b8-8758-4c25-b37f-2793697026d2" containerID="48b539995f961f39f0f3784b46e123e5caf6c9bfe6e2f1621226b6e95eb4a268" exitCode=2 Dec 03 07:01:19 crc kubenswrapper[4475]: I1203 07:01:19.880976 4475 generic.go:334] "Generic (PLEG): container finished" podID="13ed03b8-8758-4c25-b37f-2793697026d2" containerID="314eebb9f24bd7ca6ede27d5040984fbf03c937af31d92e1f4dfa30c219712b4" exitCode=0 Dec 03 07:01:19 crc kubenswrapper[4475]: I1203 07:01:19.881027 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"13ed03b8-8758-4c25-b37f-2793697026d2","Type":"ContainerDied","Data":"14c25df4b313807582231974fd3e2f0724e4e3d96f0797158c2d85787b388154"} Dec 03 07:01:19 crc kubenswrapper[4475]: I1203 07:01:19.881051 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"13ed03b8-8758-4c25-b37f-2793697026d2","Type":"ContainerDied","Data":"48b539995f961f39f0f3784b46e123e5caf6c9bfe6e2f1621226b6e95eb4a268"} Dec 
03 07:01:19 crc kubenswrapper[4475]: I1203 07:01:19.881061 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"13ed03b8-8758-4c25-b37f-2793697026d2","Type":"ContainerDied","Data":"314eebb9f24bd7ca6ede27d5040984fbf03c937af31d92e1f4dfa30c219712b4"} Dec 03 07:01:19 crc kubenswrapper[4475]: I1203 07:01:19.882448 4475 generic.go:334] "Generic (PLEG): container finished" podID="6b5cf0ec-4d62-4de9-ad66-f758113443e7" containerID="ac7e14f9da3989dbfdc8572701d4cd54c6c606817f1b3bcd3ae56c896a5a8d03" exitCode=0 Dec 03 07:01:19 crc kubenswrapper[4475]: I1203 07:01:19.882590 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-cfnapi-58d4897ff7-tjzk8" event={"ID":"6b5cf0ec-4d62-4de9-ad66-f758113443e7","Type":"ContainerDied","Data":"ac7e14f9da3989dbfdc8572701d4cd54c6c606817f1b3bcd3ae56c896a5a8d03"} Dec 03 07:01:20 crc kubenswrapper[4475]: I1203 07:01:20.925845 4475 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/heat-cfnapi-58d4897ff7-tjzk8" podUID="6b5cf0ec-4d62-4de9-ad66-f758113443e7" containerName="heat-cfnapi" probeResult="failure" output="Get \"http://10.217.0.173:8000/healthcheck\": dial tcp 10.217.0.173:8000: connect: connection refused" Dec 03 07:01:21 crc kubenswrapper[4475]: I1203 07:01:21.179588 4475 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-664766cd5c-v6774" Dec 03 07:01:21 crc kubenswrapper[4475]: I1203 07:01:21.195272 4475 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/heat-api-6c87685856-7spht" podUID="c61afd1c-48a7-47f3-b27d-6c6b56e4ce86" containerName="heat-api" probeResult="failure" output="Get \"http://10.217.0.175:8004/healthcheck\": dial tcp 10.217.0.175:8004: connect: connection refused" Dec 03 07:01:21 crc kubenswrapper[4475]: I1203 07:01:21.244302 4475 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5967d8988f-ghdmf"] Dec 03 07:01:21 crc kubenswrapper[4475]: I1203 
07:01:21.244525 4475 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-5967d8988f-ghdmf" podUID="00f99628-c2ec-48c9-b266-0f14cbf05570" containerName="dnsmasq-dns" containerID="cri-o://861b74ee61dba1c1ca73b38d2d84a136f245dfb1fc4a5895e8a19271c4a9939a" gracePeriod=10 Dec 03 07:01:21 crc kubenswrapper[4475]: I1203 07:01:21.819791 4475 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-db-create-l6t9x"] Dec 03 07:01:21 crc kubenswrapper[4475]: I1203 07:01:21.821076 4475 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-db-create-l6t9x" Dec 03 07:01:21 crc kubenswrapper[4475]: I1203 07:01:21.828230 4475 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-db-create-l6t9x"] Dec 03 07:01:21 crc kubenswrapper[4475]: I1203 07:01:21.866169 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d5b7dc01-eee2-4b3a-b834-3092016603d3-operator-scripts\") pod \"nova-api-db-create-l6t9x\" (UID: \"d5b7dc01-eee2-4b3a-b834-3092016603d3\") " pod="openstack/nova-api-db-create-l6t9x" Dec 03 07:01:21 crc kubenswrapper[4475]: I1203 07:01:21.866396 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j4g88\" (UniqueName: \"kubernetes.io/projected/d5b7dc01-eee2-4b3a-b834-3092016603d3-kube-api-access-j4g88\") pod \"nova-api-db-create-l6t9x\" (UID: \"d5b7dc01-eee2-4b3a-b834-3092016603d3\") " pod="openstack/nova-api-db-create-l6t9x" Dec 03 07:01:21 crc kubenswrapper[4475]: I1203 07:01:21.927730 4475 generic.go:334] "Generic (PLEG): container finished" podID="00f99628-c2ec-48c9-b266-0f14cbf05570" containerID="861b74ee61dba1c1ca73b38d2d84a136f245dfb1fc4a5895e8a19271c4a9939a" exitCode=0 Dec 03 07:01:21 crc kubenswrapper[4475]: I1203 07:01:21.927849 4475 kubelet.go:2453] "SyncLoop (PLEG): event for 
pod" pod="openstack/dnsmasq-dns-5967d8988f-ghdmf" event={"ID":"00f99628-c2ec-48c9-b266-0f14cbf05570","Type":"ContainerDied","Data":"861b74ee61dba1c1ca73b38d2d84a136f245dfb1fc4a5895e8a19271c4a9939a"} Dec 03 07:01:21 crc kubenswrapper[4475]: I1203 07:01:21.927881 4475 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-7af4-account-create-update-mpdn4"] Dec 03 07:01:21 crc kubenswrapper[4475]: I1203 07:01:21.932610 4475 generic.go:334] "Generic (PLEG): container finished" podID="ffaaa182-c947-4cbd-b96d-378fae973360" containerID="a342851e48b4f81a5ac7652402e4a05cdd8410849baf25d75ddc6e72b7854358" exitCode=137 Dec 03 07:01:21 crc kubenswrapper[4475]: I1203 07:01:21.953799 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"ffaaa182-c947-4cbd-b96d-378fae973360","Type":"ContainerDied","Data":"a342851e48b4f81a5ac7652402e4a05cdd8410849baf25d75ddc6e72b7854358"} Dec 03 07:01:21 crc kubenswrapper[4475]: I1203 07:01:21.953909 4475 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-7af4-account-create-update-mpdn4" Dec 03 07:01:21 crc kubenswrapper[4475]: I1203 07:01:21.965623 4475 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-db-secret" Dec 03 07:01:21 crc kubenswrapper[4475]: I1203 07:01:21.968850 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d5b7dc01-eee2-4b3a-b834-3092016603d3-operator-scripts\") pod \"nova-api-db-create-l6t9x\" (UID: \"d5b7dc01-eee2-4b3a-b834-3092016603d3\") " pod="openstack/nova-api-db-create-l6t9x" Dec 03 07:01:21 crc kubenswrapper[4475]: I1203 07:01:21.969006 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c7crv\" (UniqueName: \"kubernetes.io/projected/37cea503-9fad-48b6-9ec0-b8957e5420f1-kube-api-access-c7crv\") pod \"nova-api-7af4-account-create-update-mpdn4\" (UID: \"37cea503-9fad-48b6-9ec0-b8957e5420f1\") " pod="openstack/nova-api-7af4-account-create-update-mpdn4" Dec 03 07:01:21 crc kubenswrapper[4475]: I1203 07:01:21.969175 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/37cea503-9fad-48b6-9ec0-b8957e5420f1-operator-scripts\") pod \"nova-api-7af4-account-create-update-mpdn4\" (UID: \"37cea503-9fad-48b6-9ec0-b8957e5420f1\") " pod="openstack/nova-api-7af4-account-create-update-mpdn4" Dec 03 07:01:21 crc kubenswrapper[4475]: I1203 07:01:21.969344 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-j4g88\" (UniqueName: \"kubernetes.io/projected/d5b7dc01-eee2-4b3a-b834-3092016603d3-kube-api-access-j4g88\") pod \"nova-api-db-create-l6t9x\" (UID: \"d5b7dc01-eee2-4b3a-b834-3092016603d3\") " pod="openstack/nova-api-db-create-l6t9x" Dec 03 07:01:21 crc kubenswrapper[4475]: I1203 07:01:21.983633 4475 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d5b7dc01-eee2-4b3a-b834-3092016603d3-operator-scripts\") pod \"nova-api-db-create-l6t9x\" (UID: \"d5b7dc01-eee2-4b3a-b834-3092016603d3\") " pod="openstack/nova-api-db-create-l6t9x" Dec 03 07:01:21 crc kubenswrapper[4475]: I1203 07:01:21.984582 4475 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-db-create-t8lwx"] Dec 03 07:01:21 crc kubenswrapper[4475]: I1203 07:01:21.985783 4475 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-db-create-t8lwx" Dec 03 07:01:22 crc kubenswrapper[4475]: I1203 07:01:22.019103 4475 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-7af4-account-create-update-mpdn4"] Dec 03 07:01:22 crc kubenswrapper[4475]: I1203 07:01:22.019741 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-j4g88\" (UniqueName: \"kubernetes.io/projected/d5b7dc01-eee2-4b3a-b834-3092016603d3-kube-api-access-j4g88\") pod \"nova-api-db-create-l6t9x\" (UID: \"d5b7dc01-eee2-4b3a-b834-3092016603d3\") " pod="openstack/nova-api-db-create-l6t9x" Dec 03 07:01:22 crc kubenswrapper[4475]: I1203 07:01:22.027933 4475 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-db-create-t8lwx"] Dec 03 07:01:22 crc kubenswrapper[4475]: I1203 07:01:22.032831 4475 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openstack/heat-api-74bbb49977-5dq74" Dec 03 07:01:22 crc kubenswrapper[4475]: I1203 07:01:22.032867 4475 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/heat-api-74bbb49977-5dq74" Dec 03 07:01:22 crc kubenswrapper[4475]: I1203 07:01:22.041545 4475 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-db-create-tbtn5"] Dec 03 07:01:22 crc kubenswrapper[4475]: I1203 07:01:22.042723 4475 kubelet.go:2542] "SyncLoop (probe)" 
probe="readiness" status="" pod="openstack/heat-cfnapi-7d7bff4588-wzmxn" Dec 03 07:01:22 crc kubenswrapper[4475]: I1203 07:01:22.042744 4475 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openstack/heat-cfnapi-7d7bff4588-wzmxn" Dec 03 07:01:22 crc kubenswrapper[4475]: I1203 07:01:22.042814 4475 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-db-create-tbtn5" Dec 03 07:01:22 crc kubenswrapper[4475]: I1203 07:01:22.047172 4475 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-db-create-tbtn5"] Dec 03 07:01:22 crc kubenswrapper[4475]: I1203 07:01:22.073580 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ece011ea-6da3-49ca-8dd8-b014f2796157-operator-scripts\") pod \"nova-cell1-db-create-tbtn5\" (UID: \"ece011ea-6da3-49ca-8dd8-b014f2796157\") " pod="openstack/nova-cell1-db-create-tbtn5" Dec 03 07:01:22 crc kubenswrapper[4475]: I1203 07:01:22.073689 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8qgh4\" (UniqueName: \"kubernetes.io/projected/fae26675-9566-4902-af26-0247c3f6164b-kube-api-access-8qgh4\") pod \"nova-cell0-db-create-t8lwx\" (UID: \"fae26675-9566-4902-af26-0247c3f6164b\") " pod="openstack/nova-cell0-db-create-t8lwx" Dec 03 07:01:22 crc kubenswrapper[4475]: I1203 07:01:22.073974 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-c7crv\" (UniqueName: \"kubernetes.io/projected/37cea503-9fad-48b6-9ec0-b8957e5420f1-kube-api-access-c7crv\") pod \"nova-api-7af4-account-create-update-mpdn4\" (UID: \"37cea503-9fad-48b6-9ec0-b8957e5420f1\") " pod="openstack/nova-api-7af4-account-create-update-mpdn4" Dec 03 07:01:22 crc kubenswrapper[4475]: I1203 07:01:22.074006 4475 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/fae26675-9566-4902-af26-0247c3f6164b-operator-scripts\") pod \"nova-cell0-db-create-t8lwx\" (UID: \"fae26675-9566-4902-af26-0247c3f6164b\") " pod="openstack/nova-cell0-db-create-t8lwx" Dec 03 07:01:22 crc kubenswrapper[4475]: I1203 07:01:22.074037 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gr5pw\" (UniqueName: \"kubernetes.io/projected/ece011ea-6da3-49ca-8dd8-b014f2796157-kube-api-access-gr5pw\") pod \"nova-cell1-db-create-tbtn5\" (UID: \"ece011ea-6da3-49ca-8dd8-b014f2796157\") " pod="openstack/nova-cell1-db-create-tbtn5" Dec 03 07:01:22 crc kubenswrapper[4475]: I1203 07:01:22.074064 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/37cea503-9fad-48b6-9ec0-b8957e5420f1-operator-scripts\") pod \"nova-api-7af4-account-create-update-mpdn4\" (UID: \"37cea503-9fad-48b6-9ec0-b8957e5420f1\") " pod="openstack/nova-api-7af4-account-create-update-mpdn4" Dec 03 07:01:22 crc kubenswrapper[4475]: I1203 07:01:22.075972 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/37cea503-9fad-48b6-9ec0-b8957e5420f1-operator-scripts\") pod \"nova-api-7af4-account-create-update-mpdn4\" (UID: \"37cea503-9fad-48b6-9ec0-b8957e5420f1\") " pod="openstack/nova-api-7af4-account-create-update-mpdn4" Dec 03 07:01:22 crc kubenswrapper[4475]: I1203 07:01:22.095407 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-c7crv\" (UniqueName: \"kubernetes.io/projected/37cea503-9fad-48b6-9ec0-b8957e5420f1-kube-api-access-c7crv\") pod \"nova-api-7af4-account-create-update-mpdn4\" (UID: \"37cea503-9fad-48b6-9ec0-b8957e5420f1\") " pod="openstack/nova-api-7af4-account-create-update-mpdn4" Dec 03 07:01:22 
crc kubenswrapper[4475]: I1203 07:01:22.141149 4475 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-6ad2-account-create-update-phxm5"] Dec 03 07:01:22 crc kubenswrapper[4475]: I1203 07:01:22.142484 4475 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-6ad2-account-create-update-phxm5" Dec 03 07:01:22 crc kubenswrapper[4475]: I1203 07:01:22.143467 4475 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-db-create-l6t9x" Dec 03 07:01:22 crc kubenswrapper[4475]: I1203 07:01:22.151506 4475 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-6ad2-account-create-update-phxm5"] Dec 03 07:01:22 crc kubenswrapper[4475]: I1203 07:01:22.163937 4475 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-db-secret" Dec 03 07:01:22 crc kubenswrapper[4475]: I1203 07:01:22.176516 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/4ad132bf-e35e-48e6-b406-0aaca969a684-operator-scripts\") pod \"nova-cell0-6ad2-account-create-update-phxm5\" (UID: \"4ad132bf-e35e-48e6-b406-0aaca969a684\") " pod="openstack/nova-cell0-6ad2-account-create-update-phxm5" Dec 03 07:01:22 crc kubenswrapper[4475]: I1203 07:01:22.176612 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/fae26675-9566-4902-af26-0247c3f6164b-operator-scripts\") pod \"nova-cell0-db-create-t8lwx\" (UID: \"fae26675-9566-4902-af26-0247c3f6164b\") " pod="openstack/nova-cell0-db-create-t8lwx" Dec 03 07:01:22 crc kubenswrapper[4475]: I1203 07:01:22.176668 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gr5pw\" (UniqueName: \"kubernetes.io/projected/ece011ea-6da3-49ca-8dd8-b014f2796157-kube-api-access-gr5pw\") pod 
\"nova-cell1-db-create-tbtn5\" (UID: \"ece011ea-6da3-49ca-8dd8-b014f2796157\") " pod="openstack/nova-cell1-db-create-tbtn5" Dec 03 07:01:22 crc kubenswrapper[4475]: I1203 07:01:22.176740 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ece011ea-6da3-49ca-8dd8-b014f2796157-operator-scripts\") pod \"nova-cell1-db-create-tbtn5\" (UID: \"ece011ea-6da3-49ca-8dd8-b014f2796157\") " pod="openstack/nova-cell1-db-create-tbtn5" Dec 03 07:01:22 crc kubenswrapper[4475]: I1203 07:01:22.176775 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fdzkt\" (UniqueName: \"kubernetes.io/projected/4ad132bf-e35e-48e6-b406-0aaca969a684-kube-api-access-fdzkt\") pod \"nova-cell0-6ad2-account-create-update-phxm5\" (UID: \"4ad132bf-e35e-48e6-b406-0aaca969a684\") " pod="openstack/nova-cell0-6ad2-account-create-update-phxm5" Dec 03 07:01:22 crc kubenswrapper[4475]: I1203 07:01:22.176857 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8qgh4\" (UniqueName: \"kubernetes.io/projected/fae26675-9566-4902-af26-0247c3f6164b-kube-api-access-8qgh4\") pod \"nova-cell0-db-create-t8lwx\" (UID: \"fae26675-9566-4902-af26-0247c3f6164b\") " pod="openstack/nova-cell0-db-create-t8lwx" Dec 03 07:01:22 crc kubenswrapper[4475]: I1203 07:01:22.178524 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/fae26675-9566-4902-af26-0247c3f6164b-operator-scripts\") pod \"nova-cell0-db-create-t8lwx\" (UID: \"fae26675-9566-4902-af26-0247c3f6164b\") " pod="openstack/nova-cell0-db-create-t8lwx" Dec 03 07:01:22 crc kubenswrapper[4475]: I1203 07:01:22.178680 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ece011ea-6da3-49ca-8dd8-b014f2796157-operator-scripts\") 
pod \"nova-cell1-db-create-tbtn5\" (UID: \"ece011ea-6da3-49ca-8dd8-b014f2796157\") " pod="openstack/nova-cell1-db-create-tbtn5" Dec 03 07:01:22 crc kubenswrapper[4475]: I1203 07:01:22.196894 4475 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-5967d8988f-ghdmf" podUID="00f99628-c2ec-48c9-b266-0f14cbf05570" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.164:5353: connect: connection refused" Dec 03 07:01:22 crc kubenswrapper[4475]: I1203 07:01:22.200849 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gr5pw\" (UniqueName: \"kubernetes.io/projected/ece011ea-6da3-49ca-8dd8-b014f2796157-kube-api-access-gr5pw\") pod \"nova-cell1-db-create-tbtn5\" (UID: \"ece011ea-6da3-49ca-8dd8-b014f2796157\") " pod="openstack/nova-cell1-db-create-tbtn5" Dec 03 07:01:22 crc kubenswrapper[4475]: I1203 07:01:22.209330 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8qgh4\" (UniqueName: \"kubernetes.io/projected/fae26675-9566-4902-af26-0247c3f6164b-kube-api-access-8qgh4\") pod \"nova-cell0-db-create-t8lwx\" (UID: \"fae26675-9566-4902-af26-0247c3f6164b\") " pod="openstack/nova-cell0-db-create-t8lwx" Dec 03 07:01:22 crc kubenswrapper[4475]: I1203 07:01:22.279402 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fdzkt\" (UniqueName: \"kubernetes.io/projected/4ad132bf-e35e-48e6-b406-0aaca969a684-kube-api-access-fdzkt\") pod \"nova-cell0-6ad2-account-create-update-phxm5\" (UID: \"4ad132bf-e35e-48e6-b406-0aaca969a684\") " pod="openstack/nova-cell0-6ad2-account-create-update-phxm5" Dec 03 07:01:22 crc kubenswrapper[4475]: I1203 07:01:22.279737 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/4ad132bf-e35e-48e6-b406-0aaca969a684-operator-scripts\") pod \"nova-cell0-6ad2-account-create-update-phxm5\" (UID: 
\"4ad132bf-e35e-48e6-b406-0aaca969a684\") " pod="openstack/nova-cell0-6ad2-account-create-update-phxm5" Dec 03 07:01:22 crc kubenswrapper[4475]: I1203 07:01:22.280360 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/4ad132bf-e35e-48e6-b406-0aaca969a684-operator-scripts\") pod \"nova-cell0-6ad2-account-create-update-phxm5\" (UID: \"4ad132bf-e35e-48e6-b406-0aaca969a684\") " pod="openstack/nova-cell0-6ad2-account-create-update-phxm5" Dec 03 07:01:22 crc kubenswrapper[4475]: I1203 07:01:22.311129 4475 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-4b91-account-create-update-rv6lt"] Dec 03 07:01:22 crc kubenswrapper[4475]: I1203 07:01:22.313746 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fdzkt\" (UniqueName: \"kubernetes.io/projected/4ad132bf-e35e-48e6-b406-0aaca969a684-kube-api-access-fdzkt\") pod \"nova-cell0-6ad2-account-create-update-phxm5\" (UID: \"4ad132bf-e35e-48e6-b406-0aaca969a684\") " pod="openstack/nova-cell0-6ad2-account-create-update-phxm5" Dec 03 07:01:22 crc kubenswrapper[4475]: I1203 07:01:22.319631 4475 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-7af4-account-create-update-mpdn4" Dec 03 07:01:22 crc kubenswrapper[4475]: I1203 07:01:22.320955 4475 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-4b91-account-create-update-rv6lt" Dec 03 07:01:22 crc kubenswrapper[4475]: I1203 07:01:22.322547 4475 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-db-secret" Dec 03 07:01:22 crc kubenswrapper[4475]: I1203 07:01:22.329600 4475 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-4b91-account-create-update-rv6lt"] Dec 03 07:01:22 crc kubenswrapper[4475]: I1203 07:01:22.381126 4475 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-db-create-t8lwx" Dec 03 07:01:22 crc kubenswrapper[4475]: I1203 07:01:22.383282 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/161d281f-1845-400c-bbda-691d6681cc69-operator-scripts\") pod \"nova-cell1-4b91-account-create-update-rv6lt\" (UID: \"161d281f-1845-400c-bbda-691d6681cc69\") " pod="openstack/nova-cell1-4b91-account-create-update-rv6lt" Dec 03 07:01:22 crc kubenswrapper[4475]: I1203 07:01:22.383565 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8bt9b\" (UniqueName: \"kubernetes.io/projected/161d281f-1845-400c-bbda-691d6681cc69-kube-api-access-8bt9b\") pod \"nova-cell1-4b91-account-create-update-rv6lt\" (UID: \"161d281f-1845-400c-bbda-691d6681cc69\") " pod="openstack/nova-cell1-4b91-account-create-update-rv6lt" Dec 03 07:01:22 crc kubenswrapper[4475]: I1203 07:01:22.390358 4475 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-db-create-tbtn5" Dec 03 07:01:22 crc kubenswrapper[4475]: I1203 07:01:22.479229 4475 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-6ad2-account-create-update-phxm5" Dec 03 07:01:22 crc kubenswrapper[4475]: I1203 07:01:22.484750 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/161d281f-1845-400c-bbda-691d6681cc69-operator-scripts\") pod \"nova-cell1-4b91-account-create-update-rv6lt\" (UID: \"161d281f-1845-400c-bbda-691d6681cc69\") " pod="openstack/nova-cell1-4b91-account-create-update-rv6lt" Dec 03 07:01:22 crc kubenswrapper[4475]: I1203 07:01:22.484866 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8bt9b\" (UniqueName: \"kubernetes.io/projected/161d281f-1845-400c-bbda-691d6681cc69-kube-api-access-8bt9b\") pod \"nova-cell1-4b91-account-create-update-rv6lt\" (UID: \"161d281f-1845-400c-bbda-691d6681cc69\") " pod="openstack/nova-cell1-4b91-account-create-update-rv6lt" Dec 03 07:01:22 crc kubenswrapper[4475]: I1203 07:01:22.486726 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/161d281f-1845-400c-bbda-691d6681cc69-operator-scripts\") pod \"nova-cell1-4b91-account-create-update-rv6lt\" (UID: \"161d281f-1845-400c-bbda-691d6681cc69\") " pod="openstack/nova-cell1-4b91-account-create-update-rv6lt" Dec 03 07:01:22 crc kubenswrapper[4475]: I1203 07:01:22.500434 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8bt9b\" (UniqueName: \"kubernetes.io/projected/161d281f-1845-400c-bbda-691d6681cc69-kube-api-access-8bt9b\") pod \"nova-cell1-4b91-account-create-update-rv6lt\" (UID: \"161d281f-1845-400c-bbda-691d6681cc69\") " pod="openstack/nova-cell1-4b91-account-create-update-rv6lt" Dec 03 07:01:22 crc kubenswrapper[4475]: I1203 07:01:22.547180 4475 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/cinder-api-0" podUID="ffaaa182-c947-4cbd-b96d-378fae973360" 
containerName="cinder-api" probeResult="failure" output="Get \"http://10.217.0.165:8776/healthcheck\": dial tcp 10.217.0.165:8776: connect: connection refused" Dec 03 07:01:22 crc kubenswrapper[4475]: I1203 07:01:22.661160 4475 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-4b91-account-create-update-rv6lt" Dec 03 07:01:22 crc kubenswrapper[4475]: I1203 07:01:22.943502 4475 generic.go:334] "Generic (PLEG): container finished" podID="13ed03b8-8758-4c25-b37f-2793697026d2" containerID="74a1bf735d06e69f49a7a5836c0ac147e44657c1057d939890bcb84df13ddf46" exitCode=0 Dec 03 07:01:22 crc kubenswrapper[4475]: I1203 07:01:22.943576 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"13ed03b8-8758-4c25-b37f-2793697026d2","Type":"ContainerDied","Data":"74a1bf735d06e69f49a7a5836c0ac147e44657c1057d939890bcb84df13ddf46"} Dec 03 07:01:24 crc kubenswrapper[4475]: I1203 07:01:24.825013 4475 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/horizon-7fc4d79b88-s8hhg" podUID="2401beb9-38b8-4581-b9a2-8bb16e15e6c1" containerName="horizon" probeResult="failure" output="Get \"https://10.217.0.146:8443/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.0.146:8443: connect: connection refused" Dec 03 07:01:24 crc kubenswrapper[4475]: I1203 07:01:24.825328 4475 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/horizon-7fc4d79b88-s8hhg" Dec 03 07:01:25 crc kubenswrapper[4475]: I1203 07:01:25.691145 4475 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5967d8988f-ghdmf" Dec 03 07:01:25 crc kubenswrapper[4475]: I1203 07:01:25.753506 4475 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-t5dzd\" (UniqueName: \"kubernetes.io/projected/00f99628-c2ec-48c9-b266-0f14cbf05570-kube-api-access-t5dzd\") pod \"00f99628-c2ec-48c9-b266-0f14cbf05570\" (UID: \"00f99628-c2ec-48c9-b266-0f14cbf05570\") " Dec 03 07:01:25 crc kubenswrapper[4475]: I1203 07:01:25.753806 4475 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/00f99628-c2ec-48c9-b266-0f14cbf05570-ovsdbserver-sb\") pod \"00f99628-c2ec-48c9-b266-0f14cbf05570\" (UID: \"00f99628-c2ec-48c9-b266-0f14cbf05570\") " Dec 03 07:01:25 crc kubenswrapper[4475]: I1203 07:01:25.753861 4475 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/00f99628-c2ec-48c9-b266-0f14cbf05570-dns-swift-storage-0\") pod \"00f99628-c2ec-48c9-b266-0f14cbf05570\" (UID: \"00f99628-c2ec-48c9-b266-0f14cbf05570\") " Dec 03 07:01:25 crc kubenswrapper[4475]: I1203 07:01:25.753896 4475 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/00f99628-c2ec-48c9-b266-0f14cbf05570-dns-svc\") pod \"00f99628-c2ec-48c9-b266-0f14cbf05570\" (UID: \"00f99628-c2ec-48c9-b266-0f14cbf05570\") " Dec 03 07:01:25 crc kubenswrapper[4475]: I1203 07:01:25.753952 4475 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/00f99628-c2ec-48c9-b266-0f14cbf05570-ovsdbserver-nb\") pod \"00f99628-c2ec-48c9-b266-0f14cbf05570\" (UID: \"00f99628-c2ec-48c9-b266-0f14cbf05570\") " Dec 03 07:01:25 crc kubenswrapper[4475]: I1203 07:01:25.754006 4475 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"config\" (UniqueName: \"kubernetes.io/configmap/00f99628-c2ec-48c9-b266-0f14cbf05570-config\") pod \"00f99628-c2ec-48c9-b266-0f14cbf05570\" (UID: \"00f99628-c2ec-48c9-b266-0f14cbf05570\") " Dec 03 07:01:25 crc kubenswrapper[4475]: I1203 07:01:25.763201 4475 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/00f99628-c2ec-48c9-b266-0f14cbf05570-kube-api-access-t5dzd" (OuterVolumeSpecName: "kube-api-access-t5dzd") pod "00f99628-c2ec-48c9-b266-0f14cbf05570" (UID: "00f99628-c2ec-48c9-b266-0f14cbf05570"). InnerVolumeSpecName "kube-api-access-t5dzd". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 07:01:25 crc kubenswrapper[4475]: I1203 07:01:25.856604 4475 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-t5dzd\" (UniqueName: \"kubernetes.io/projected/00f99628-c2ec-48c9-b266-0f14cbf05570-kube-api-access-t5dzd\") on node \"crc\" DevicePath \"\"" Dec 03 07:01:25 crc kubenswrapper[4475]: I1203 07:01:25.955242 4475 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/00f99628-c2ec-48c9-b266-0f14cbf05570-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "00f99628-c2ec-48c9-b266-0f14cbf05570" (UID: "00f99628-c2ec-48c9-b266-0f14cbf05570"). InnerVolumeSpecName "dns-swift-storage-0". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 07:01:25 crc kubenswrapper[4475]: I1203 07:01:25.959859 4475 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/00f99628-c2ec-48c9-b266-0f14cbf05570-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Dec 03 07:01:25 crc kubenswrapper[4475]: I1203 07:01:25.971982 4475 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/00f99628-c2ec-48c9-b266-0f14cbf05570-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "00f99628-c2ec-48c9-b266-0f14cbf05570" (UID: "00f99628-c2ec-48c9-b266-0f14cbf05570"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 07:01:25 crc kubenswrapper[4475]: I1203 07:01:25.978239 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5967d8988f-ghdmf" event={"ID":"00f99628-c2ec-48c9-b266-0f14cbf05570","Type":"ContainerDied","Data":"86923467babacbdac85b8413f4adc00e5627cf676ead74fad9c3c535839a95ac"} Dec 03 07:01:25 crc kubenswrapper[4475]: I1203 07:01:25.978366 4475 scope.go:117] "RemoveContainer" containerID="861b74ee61dba1c1ca73b38d2d84a136f245dfb1fc4a5895e8a19271c4a9939a" Dec 03 07:01:25 crc kubenswrapper[4475]: I1203 07:01:25.978586 4475 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5967d8988f-ghdmf" Dec 03 07:01:26 crc kubenswrapper[4475]: I1203 07:01:26.004564 4475 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/00f99628-c2ec-48c9-b266-0f14cbf05570-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "00f99628-c2ec-48c9-b266-0f14cbf05570" (UID: "00f99628-c2ec-48c9-b266-0f14cbf05570"). InnerVolumeSpecName "ovsdbserver-nb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 07:01:26 crc kubenswrapper[4475]: I1203 07:01:26.020106 4475 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/00f99628-c2ec-48c9-b266-0f14cbf05570-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "00f99628-c2ec-48c9-b266-0f14cbf05570" (UID: "00f99628-c2ec-48c9-b266-0f14cbf05570"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 07:01:26 crc kubenswrapper[4475]: I1203 07:01:26.069872 4475 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0" Dec 03 07:01:26 crc kubenswrapper[4475]: I1203 07:01:26.083531 4475 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/heat-cfnapi-58d4897ff7-tjzk8" Dec 03 07:01:26 crc kubenswrapper[4475]: I1203 07:01:26.101905 4475 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/00f99628-c2ec-48c9-b266-0f14cbf05570-config" (OuterVolumeSpecName: "config") pod "00f99628-c2ec-48c9-b266-0f14cbf05570" (UID: "00f99628-c2ec-48c9-b266-0f14cbf05570"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 07:01:26 crc kubenswrapper[4475]: I1203 07:01:26.102842 4475 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/00f99628-c2ec-48c9-b266-0f14cbf05570-config\") on node \"crc\" DevicePath \"\"" Dec 03 07:01:26 crc kubenswrapper[4475]: I1203 07:01:26.107301 4475 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/00f99628-c2ec-48c9-b266-0f14cbf05570-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Dec 03 07:01:26 crc kubenswrapper[4475]: I1203 07:01:26.107434 4475 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/00f99628-c2ec-48c9-b266-0f14cbf05570-dns-svc\") on node \"crc\" DevicePath \"\"" Dec 03 07:01:26 crc kubenswrapper[4475]: I1203 07:01:26.107746 4475 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/00f99628-c2ec-48c9-b266-0f14cbf05570-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Dec 03 07:01:26 crc kubenswrapper[4475]: I1203 07:01:26.193177 4475 scope.go:117] "RemoveContainer" containerID="6ee0f40e8dd95142d94825e51f954cf267236b294a7d43db3f3810d5c60bdc83" Dec 03 07:01:26 crc kubenswrapper[4475]: I1203 07:01:26.216131 4475 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ffaaa182-c947-4cbd-b96d-378fae973360-combined-ca-bundle\") pod \"ffaaa182-c947-4cbd-b96d-378fae973360\" (UID: \"ffaaa182-c947-4cbd-b96d-378fae973360\") " Dec 03 07:01:26 crc kubenswrapper[4475]: I1203 07:01:26.216246 4475 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ffaaa182-c947-4cbd-b96d-378fae973360-config-data\") pod \"ffaaa182-c947-4cbd-b96d-378fae973360\" (UID: \"ffaaa182-c947-4cbd-b96d-378fae973360\") " Dec 03 07:01:26 
crc kubenswrapper[4475]: I1203 07:01:26.216271 4475 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ffaaa182-c947-4cbd-b96d-378fae973360-logs\") pod \"ffaaa182-c947-4cbd-b96d-378fae973360\" (UID: \"ffaaa182-c947-4cbd-b96d-378fae973360\") " Dec 03 07:01:26 crc kubenswrapper[4475]: I1203 07:01:26.216329 4475 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/ffaaa182-c947-4cbd-b96d-378fae973360-config-data-custom\") pod \"ffaaa182-c947-4cbd-b96d-378fae973360\" (UID: \"ffaaa182-c947-4cbd-b96d-378fae973360\") " Dec 03 07:01:26 crc kubenswrapper[4475]: I1203 07:01:26.216355 4475 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/6b5cf0ec-4d62-4de9-ad66-f758113443e7-config-data-custom\") pod \"6b5cf0ec-4d62-4de9-ad66-f758113443e7\" (UID: \"6b5cf0ec-4d62-4de9-ad66-f758113443e7\") " Dec 03 07:01:26 crc kubenswrapper[4475]: I1203 07:01:26.216380 4475 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/ffaaa182-c947-4cbd-b96d-378fae973360-etc-machine-id\") pod \"ffaaa182-c947-4cbd-b96d-378fae973360\" (UID: \"ffaaa182-c947-4cbd-b96d-378fae973360\") " Dec 03 07:01:26 crc kubenswrapper[4475]: I1203 07:01:26.216396 4475 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-j5wdr\" (UniqueName: \"kubernetes.io/projected/6b5cf0ec-4d62-4de9-ad66-f758113443e7-kube-api-access-j5wdr\") pod \"6b5cf0ec-4d62-4de9-ad66-f758113443e7\" (UID: \"6b5cf0ec-4d62-4de9-ad66-f758113443e7\") " Dec 03 07:01:26 crc kubenswrapper[4475]: I1203 07:01:26.216431 4475 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sqqkk\" (UniqueName: 
\"kubernetes.io/projected/ffaaa182-c947-4cbd-b96d-378fae973360-kube-api-access-sqqkk\") pod \"ffaaa182-c947-4cbd-b96d-378fae973360\" (UID: \"ffaaa182-c947-4cbd-b96d-378fae973360\") " Dec 03 07:01:26 crc kubenswrapper[4475]: I1203 07:01:26.216476 4475 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ffaaa182-c947-4cbd-b96d-378fae973360-scripts\") pod \"ffaaa182-c947-4cbd-b96d-378fae973360\" (UID: \"ffaaa182-c947-4cbd-b96d-378fae973360\") " Dec 03 07:01:26 crc kubenswrapper[4475]: I1203 07:01:26.216495 4475 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6b5cf0ec-4d62-4de9-ad66-f758113443e7-combined-ca-bundle\") pod \"6b5cf0ec-4d62-4de9-ad66-f758113443e7\" (UID: \"6b5cf0ec-4d62-4de9-ad66-f758113443e7\") " Dec 03 07:01:26 crc kubenswrapper[4475]: I1203 07:01:26.216511 4475 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6b5cf0ec-4d62-4de9-ad66-f758113443e7-config-data\") pod \"6b5cf0ec-4d62-4de9-ad66-f758113443e7\" (UID: \"6b5cf0ec-4d62-4de9-ad66-f758113443e7\") " Dec 03 07:01:26 crc kubenswrapper[4475]: I1203 07:01:26.216769 4475 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/ffaaa182-c947-4cbd-b96d-378fae973360-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "ffaaa182-c947-4cbd-b96d-378fae973360" (UID: "ffaaa182-c947-4cbd-b96d-378fae973360"). InnerVolumeSpecName "etc-machine-id". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 03 07:01:26 crc kubenswrapper[4475]: I1203 07:01:26.219849 4475 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ffaaa182-c947-4cbd-b96d-378fae973360-logs" (OuterVolumeSpecName: "logs") pod "ffaaa182-c947-4cbd-b96d-378fae973360" (UID: "ffaaa182-c947-4cbd-b96d-378fae973360"). 
InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 07:01:26 crc kubenswrapper[4475]: I1203 07:01:26.228224 4475 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ffaaa182-c947-4cbd-b96d-378fae973360-logs\") on node \"crc\" DevicePath \"\"" Dec 03 07:01:26 crc kubenswrapper[4475]: I1203 07:01:26.228245 4475 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/ffaaa182-c947-4cbd-b96d-378fae973360-etc-machine-id\") on node \"crc\" DevicePath \"\"" Dec 03 07:01:26 crc kubenswrapper[4475]: I1203 07:01:26.242515 4475 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6b5cf0ec-4d62-4de9-ad66-f758113443e7-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "6b5cf0ec-4d62-4de9-ad66-f758113443e7" (UID: "6b5cf0ec-4d62-4de9-ad66-f758113443e7"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 07:01:26 crc kubenswrapper[4475]: I1203 07:01:26.243183 4475 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ffaaa182-c947-4cbd-b96d-378fae973360-scripts" (OuterVolumeSpecName: "scripts") pod "ffaaa182-c947-4cbd-b96d-378fae973360" (UID: "ffaaa182-c947-4cbd-b96d-378fae973360"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 07:01:26 crc kubenswrapper[4475]: I1203 07:01:26.243316 4475 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6b5cf0ec-4d62-4de9-ad66-f758113443e7-kube-api-access-j5wdr" (OuterVolumeSpecName: "kube-api-access-j5wdr") pod "6b5cf0ec-4d62-4de9-ad66-f758113443e7" (UID: "6b5cf0ec-4d62-4de9-ad66-f758113443e7"). InnerVolumeSpecName "kube-api-access-j5wdr". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 07:01:26 crc kubenswrapper[4475]: I1203 07:01:26.261219 4475 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ffaaa182-c947-4cbd-b96d-378fae973360-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "ffaaa182-c947-4cbd-b96d-378fae973360" (UID: "ffaaa182-c947-4cbd-b96d-378fae973360"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 07:01:26 crc kubenswrapper[4475]: I1203 07:01:26.282895 4475 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Dec 03 07:01:26 crc kubenswrapper[4475]: I1203 07:01:26.283414 4475 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/heat-api-6c87685856-7spht" Dec 03 07:01:26 crc kubenswrapper[4475]: I1203 07:01:26.285086 4475 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ffaaa182-c947-4cbd-b96d-378fae973360-kube-api-access-sqqkk" (OuterVolumeSpecName: "kube-api-access-sqqkk") pod "ffaaa182-c947-4cbd-b96d-378fae973360" (UID: "ffaaa182-c947-4cbd-b96d-378fae973360"). InnerVolumeSpecName "kube-api-access-sqqkk". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 07:01:26 crc kubenswrapper[4475]: I1203 07:01:26.333875 4475 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/ffaaa182-c947-4cbd-b96d-378fae973360-config-data-custom\") on node \"crc\" DevicePath \"\"" Dec 03 07:01:26 crc kubenswrapper[4475]: I1203 07:01:26.333901 4475 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/6b5cf0ec-4d62-4de9-ad66-f758113443e7-config-data-custom\") on node \"crc\" DevicePath \"\"" Dec 03 07:01:26 crc kubenswrapper[4475]: I1203 07:01:26.333911 4475 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-j5wdr\" (UniqueName: \"kubernetes.io/projected/6b5cf0ec-4d62-4de9-ad66-f758113443e7-kube-api-access-j5wdr\") on node \"crc\" DevicePath \"\"" Dec 03 07:01:26 crc kubenswrapper[4475]: I1203 07:01:26.333921 4475 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sqqkk\" (UniqueName: \"kubernetes.io/projected/ffaaa182-c947-4cbd-b96d-378fae973360-kube-api-access-sqqkk\") on node \"crc\" DevicePath \"\"" Dec 03 07:01:26 crc kubenswrapper[4475]: I1203 07:01:26.333930 4475 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ffaaa182-c947-4cbd-b96d-378fae973360-scripts\") on node \"crc\" DevicePath \"\"" Dec 03 07:01:26 crc kubenswrapper[4475]: I1203 07:01:26.386909 4475 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ffaaa182-c947-4cbd-b96d-378fae973360-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "ffaaa182-c947-4cbd-b96d-378fae973360" (UID: "ffaaa182-c947-4cbd-b96d-378fae973360"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 07:01:26 crc kubenswrapper[4475]: I1203 07:01:26.437008 4475 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7qq58\" (UniqueName: \"kubernetes.io/projected/13ed03b8-8758-4c25-b37f-2793697026d2-kube-api-access-7qq58\") pod \"13ed03b8-8758-4c25-b37f-2793697026d2\" (UID: \"13ed03b8-8758-4c25-b37f-2793697026d2\") " Dec 03 07:01:26 crc kubenswrapper[4475]: I1203 07:01:26.437136 4475 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/13ed03b8-8758-4c25-b37f-2793697026d2-log-httpd\") pod \"13ed03b8-8758-4c25-b37f-2793697026d2\" (UID: \"13ed03b8-8758-4c25-b37f-2793697026d2\") " Dec 03 07:01:26 crc kubenswrapper[4475]: I1203 07:01:26.437165 4475 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c61afd1c-48a7-47f3-b27d-6c6b56e4ce86-combined-ca-bundle\") pod \"c61afd1c-48a7-47f3-b27d-6c6b56e4ce86\" (UID: \"c61afd1c-48a7-47f3-b27d-6c6b56e4ce86\") " Dec 03 07:01:26 crc kubenswrapper[4475]: I1203 07:01:26.437198 4475 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/13ed03b8-8758-4c25-b37f-2793697026d2-sg-core-conf-yaml\") pod \"13ed03b8-8758-4c25-b37f-2793697026d2\" (UID: \"13ed03b8-8758-4c25-b37f-2793697026d2\") " Dec 03 07:01:26 crc kubenswrapper[4475]: I1203 07:01:26.437216 4475 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/13ed03b8-8758-4c25-b37f-2793697026d2-run-httpd\") pod \"13ed03b8-8758-4c25-b37f-2793697026d2\" (UID: \"13ed03b8-8758-4c25-b37f-2793697026d2\") " Dec 03 07:01:26 crc kubenswrapper[4475]: I1203 07:01:26.437301 4475 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/13ed03b8-8758-4c25-b37f-2793697026d2-config-data\") pod \"13ed03b8-8758-4c25-b37f-2793697026d2\" (UID: \"13ed03b8-8758-4c25-b37f-2793697026d2\") " Dec 03 07:01:26 crc kubenswrapper[4475]: I1203 07:01:26.437322 4475 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x4qjk\" (UniqueName: \"kubernetes.io/projected/c61afd1c-48a7-47f3-b27d-6c6b56e4ce86-kube-api-access-x4qjk\") pod \"c61afd1c-48a7-47f3-b27d-6c6b56e4ce86\" (UID: \"c61afd1c-48a7-47f3-b27d-6c6b56e4ce86\") " Dec 03 07:01:26 crc kubenswrapper[4475]: I1203 07:01:26.437339 4475 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/c61afd1c-48a7-47f3-b27d-6c6b56e4ce86-config-data-custom\") pod \"c61afd1c-48a7-47f3-b27d-6c6b56e4ce86\" (UID: \"c61afd1c-48a7-47f3-b27d-6c6b56e4ce86\") " Dec 03 07:01:26 crc kubenswrapper[4475]: I1203 07:01:26.437359 4475 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/13ed03b8-8758-4c25-b37f-2793697026d2-combined-ca-bundle\") pod \"13ed03b8-8758-4c25-b37f-2793697026d2\" (UID: \"13ed03b8-8758-4c25-b37f-2793697026d2\") " Dec 03 07:01:26 crc kubenswrapper[4475]: I1203 07:01:26.437387 4475 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/13ed03b8-8758-4c25-b37f-2793697026d2-scripts\") pod \"13ed03b8-8758-4c25-b37f-2793697026d2\" (UID: \"13ed03b8-8758-4c25-b37f-2793697026d2\") " Dec 03 07:01:26 crc kubenswrapper[4475]: I1203 07:01:26.437441 4475 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c61afd1c-48a7-47f3-b27d-6c6b56e4ce86-config-data\") pod \"c61afd1c-48a7-47f3-b27d-6c6b56e4ce86\" (UID: \"c61afd1c-48a7-47f3-b27d-6c6b56e4ce86\") " Dec 03 07:01:26 crc kubenswrapper[4475]: I1203 
07:01:26.437832 4475 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ffaaa182-c947-4cbd-b96d-378fae973360-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 03 07:01:26 crc kubenswrapper[4475]: I1203 07:01:26.438612 4475 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/13ed03b8-8758-4c25-b37f-2793697026d2-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "13ed03b8-8758-4c25-b37f-2793697026d2" (UID: "13ed03b8-8758-4c25-b37f-2793697026d2"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 07:01:26 crc kubenswrapper[4475]: I1203 07:01:26.442257 4475 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/13ed03b8-8758-4c25-b37f-2793697026d2-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "13ed03b8-8758-4c25-b37f-2793697026d2" (UID: "13ed03b8-8758-4c25-b37f-2793697026d2"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 07:01:26 crc kubenswrapper[4475]: I1203 07:01:26.451988 4475 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/13ed03b8-8758-4c25-b37f-2793697026d2-scripts" (OuterVolumeSpecName: "scripts") pod "13ed03b8-8758-4c25-b37f-2793697026d2" (UID: "13ed03b8-8758-4c25-b37f-2793697026d2"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 07:01:26 crc kubenswrapper[4475]: I1203 07:01:26.499627 4475 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c61afd1c-48a7-47f3-b27d-6c6b56e4ce86-kube-api-access-x4qjk" (OuterVolumeSpecName: "kube-api-access-x4qjk") pod "c61afd1c-48a7-47f3-b27d-6c6b56e4ce86" (UID: "c61afd1c-48a7-47f3-b27d-6c6b56e4ce86"). InnerVolumeSpecName "kube-api-access-x4qjk". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 07:01:26 crc kubenswrapper[4475]: I1203 07:01:26.509709 4475 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/13ed03b8-8758-4c25-b37f-2793697026d2-kube-api-access-7qq58" (OuterVolumeSpecName: "kube-api-access-7qq58") pod "13ed03b8-8758-4c25-b37f-2793697026d2" (UID: "13ed03b8-8758-4c25-b37f-2793697026d2"). InnerVolumeSpecName "kube-api-access-7qq58". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 07:01:26 crc kubenswrapper[4475]: I1203 07:01:26.531753 4475 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6b5cf0ec-4d62-4de9-ad66-f758113443e7-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "6b5cf0ec-4d62-4de9-ad66-f758113443e7" (UID: "6b5cf0ec-4d62-4de9-ad66-f758113443e7"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 07:01:26 crc kubenswrapper[4475]: I1203 07:01:26.539758 4475 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/13ed03b8-8758-4c25-b37f-2793697026d2-scripts\") on node \"crc\" DevicePath \"\"" Dec 03 07:01:26 crc kubenswrapper[4475]: I1203 07:01:26.539796 4475 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6b5cf0ec-4d62-4de9-ad66-f758113443e7-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 03 07:01:26 crc kubenswrapper[4475]: I1203 07:01:26.539808 4475 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7qq58\" (UniqueName: \"kubernetes.io/projected/13ed03b8-8758-4c25-b37f-2793697026d2-kube-api-access-7qq58\") on node \"crc\" DevicePath \"\"" Dec 03 07:01:26 crc kubenswrapper[4475]: I1203 07:01:26.539817 4475 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/13ed03b8-8758-4c25-b37f-2793697026d2-log-httpd\") 
on node \"crc\" DevicePath \"\"" Dec 03 07:01:26 crc kubenswrapper[4475]: I1203 07:01:26.539824 4475 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/13ed03b8-8758-4c25-b37f-2793697026d2-run-httpd\") on node \"crc\" DevicePath \"\"" Dec 03 07:01:26 crc kubenswrapper[4475]: I1203 07:01:26.539833 4475 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x4qjk\" (UniqueName: \"kubernetes.io/projected/c61afd1c-48a7-47f3-b27d-6c6b56e4ce86-kube-api-access-x4qjk\") on node \"crc\" DevicePath \"\"" Dec 03 07:01:26 crc kubenswrapper[4475]: I1203 07:01:26.603636 4475 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c61afd1c-48a7-47f3-b27d-6c6b56e4ce86-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "c61afd1c-48a7-47f3-b27d-6c6b56e4ce86" (UID: "c61afd1c-48a7-47f3-b27d-6c6b56e4ce86"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 07:01:26 crc kubenswrapper[4475]: I1203 07:01:26.646086 4475 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/c61afd1c-48a7-47f3-b27d-6c6b56e4ce86-config-data-custom\") on node \"crc\" DevicePath \"\"" Dec 03 07:01:26 crc kubenswrapper[4475]: I1203 07:01:26.709853 4475 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-proxy-88869455c-74p7r"] Dec 03 07:01:26 crc kubenswrapper[4475]: I1203 07:01:26.780304 4475 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-7af4-account-create-update-mpdn4"] Dec 03 07:01:26 crc kubenswrapper[4475]: I1203 07:01:26.806823 4475 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-db-create-tbtn5"] Dec 03 07:01:26 crc kubenswrapper[4475]: I1203 07:01:26.953827 4475 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/secret/6b5cf0ec-4d62-4de9-ad66-f758113443e7-config-data" (OuterVolumeSpecName: "config-data") pod "6b5cf0ec-4d62-4de9-ad66-f758113443e7" (UID: "6b5cf0ec-4d62-4de9-ad66-f758113443e7"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 07:01:27 crc kubenswrapper[4475]: I1203 07:01:27.005782 4475 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6b5cf0ec-4d62-4de9-ad66-f758113443e7-config-data\") on node \"crc\" DevicePath \"\"" Dec 03 07:01:27 crc kubenswrapper[4475]: I1203 07:01:27.021290 4475 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-db-create-l6t9x"] Dec 03 07:01:27 crc kubenswrapper[4475]: I1203 07:01:27.032120 4475 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c61afd1c-48a7-47f3-b27d-6c6b56e4ce86-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "c61afd1c-48a7-47f3-b27d-6c6b56e4ce86" (UID: "c61afd1c-48a7-47f3-b27d-6c6b56e4ce86"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 07:01:27 crc kubenswrapper[4475]: I1203 07:01:27.034521 4475 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-api-5869676b4b-mfgq2"] Dec 03 07:01:27 crc kubenswrapper[4475]: I1203 07:01:27.041321 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"13ed03b8-8758-4c25-b37f-2793697026d2","Type":"ContainerDied","Data":"fd4b56831fabb1f0ef5e8f8323b47411850d5b389e6244c31c322b47e9eb172c"} Dec 03 07:01:27 crc kubenswrapper[4475]: I1203 07:01:27.041394 4475 scope.go:117] "RemoveContainer" containerID="14c25df4b313807582231974fd3e2f0724e4e3d96f0797158c2d85787b388154" Dec 03 07:01:27 crc kubenswrapper[4475]: I1203 07:01:27.041657 4475 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Dec 03 07:01:27 crc kubenswrapper[4475]: I1203 07:01:27.049685 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-db-create-tbtn5" event={"ID":"ece011ea-6da3-49ca-8dd8-b014f2796157","Type":"ContainerStarted","Data":"487d13d0046aa7141ec10d78c2a992ae1014469d415d91effd362b4c9026717b"} Dec 03 07:01:27 crc kubenswrapper[4475]: I1203 07:01:27.052385 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"ffaaa182-c947-4cbd-b96d-378fae973360","Type":"ContainerDied","Data":"73990065aae99ce6c01af42a31dca6c52ae93fe6af31c92b25d47333f6c98899"} Dec 03 07:01:27 crc kubenswrapper[4475]: I1203 07:01:27.052496 4475 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0" Dec 03 07:01:27 crc kubenswrapper[4475]: I1203 07:01:27.063225 4475 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-cfnapi-8465d48f48-kttgq"] Dec 03 07:01:27 crc kubenswrapper[4475]: I1203 07:01:27.065287 4475 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/13ed03b8-8758-4c25-b37f-2793697026d2-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "13ed03b8-8758-4c25-b37f-2793697026d2" (UID: "13ed03b8-8758-4c25-b37f-2793697026d2"). InnerVolumeSpecName "sg-core-conf-yaml". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 07:01:27 crc kubenswrapper[4475]: I1203 07:01:27.065686 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstackclient" event={"ID":"bc0a891f-83ae-4592-ac88-d4bc4359b4d9","Type":"ContainerStarted","Data":"5a7d46c81e81e9862097c5557ba90f4f970cd5fcaec55fe32ee584983e1d42d3"} Dec 03 07:01:27 crc kubenswrapper[4475]: I1203 07:01:27.071466 4475 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-db-create-t8lwx"] Dec 03 07:01:27 crc kubenswrapper[4475]: I1203 07:01:27.092986 4475 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/13ed03b8-8758-4c25-b37f-2793697026d2-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "13ed03b8-8758-4c25-b37f-2793697026d2" (UID: "13ed03b8-8758-4c25-b37f-2793697026d2"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 07:01:27 crc kubenswrapper[4475]: I1203 07:01:27.093385 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-cfnapi-7d7bff4588-wzmxn" event={"ID":"88fc5e9a-b3f3-421c-9b93-939e11a9f81f","Type":"ContainerStarted","Data":"838d039cf647c8f22883f9643219fa6c5e9de457dc722ec2ba0e17526c6ff11f"} Dec 03 07:01:27 crc kubenswrapper[4475]: I1203 07:01:27.094661 4475 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/heat-cfnapi-7d7bff4588-wzmxn" Dec 03 07:01:27 crc kubenswrapper[4475]: I1203 07:01:27.103775 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-cfnapi-58d4897ff7-tjzk8" event={"ID":"6b5cf0ec-4d62-4de9-ad66-f758113443e7","Type":"ContainerDied","Data":"f9481c5cb082cf4ab9451fb99c086ae8d7ffd1a17c5b5c87d7bdcc55946eb684"} Dec 03 07:01:27 crc kubenswrapper[4475]: I1203 07:01:27.103885 4475 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/heat-cfnapi-58d4897ff7-tjzk8" Dec 03 07:01:27 crc kubenswrapper[4475]: I1203 07:01:27.112490 4475 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/13ed03b8-8758-4c25-b37f-2793697026d2-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 03 07:01:27 crc kubenswrapper[4475]: I1203 07:01:27.112511 4475 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c61afd1c-48a7-47f3-b27d-6c6b56e4ce86-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 03 07:01:27 crc kubenswrapper[4475]: I1203 07:01:27.112520 4475 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/13ed03b8-8758-4c25-b37f-2793697026d2-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Dec 03 07:01:27 crc kubenswrapper[4475]: I1203 07:01:27.121521 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-api-6c87685856-7spht" event={"ID":"c61afd1c-48a7-47f3-b27d-6c6b56e4ce86","Type":"ContainerDied","Data":"d0f46804a2408743f46d79fd37d217a81c23770403618e4ef4c9950af5e33b6a"} Dec 03 07:01:27 crc kubenswrapper[4475]: I1203 07:01:27.121632 4475 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/heat-api-6c87685856-7spht" Dec 03 07:01:27 crc kubenswrapper[4475]: I1203 07:01:27.122312 4475 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/openstackclient" podStartSLOduration=3.197784676 podStartE2EDuration="18.122292747s" podCreationTimestamp="2025-12-03 07:01:09 +0000 UTC" firstStartedPulling="2025-12-03 07:01:10.713157399 +0000 UTC m=+955.518055733" lastFinishedPulling="2025-12-03 07:01:25.63766547 +0000 UTC m=+970.442563804" observedRunningTime="2025-12-03 07:01:27.094968739 +0000 UTC m=+971.899867073" watchObservedRunningTime="2025-12-03 07:01:27.122292747 +0000 UTC m=+971.927191080" Dec 03 07:01:27 crc kubenswrapper[4475]: I1203 07:01:27.124408 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-proxy-88869455c-74p7r" event={"ID":"16bbbd37-150f-4b54-8fc1-eb7708ecca88","Type":"ContainerStarted","Data":"b96168d7c7de95dcc07cab46b70c1146b836864b2237722cf2b7601a152d7b31"} Dec 03 07:01:27 crc kubenswrapper[4475]: I1203 07:01:27.127351 4475 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/heat-cfnapi-7d7bff4588-wzmxn" podStartSLOduration=11.127343247 podStartE2EDuration="11.127343247s" podCreationTimestamp="2025-12-03 07:01:16 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 07:01:27.119048053 +0000 UTC m=+971.923946388" watchObservedRunningTime="2025-12-03 07:01:27.127343247 +0000 UTC m=+971.932241581" Dec 03 07:01:27 crc kubenswrapper[4475]: I1203 07:01:27.130805 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-api-74bbb49977-5dq74" event={"ID":"b58fd1e8-c5b6-4a83-b9b0-15c6eb22495d","Type":"ContainerStarted","Data":"b829e8bae3ecfb33d0039b80148cb037437b695ce94d68cbd40f21e1a3e56779"} Dec 03 07:01:27 crc kubenswrapper[4475]: I1203 07:01:27.131671 4475 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" 
status="" pod="openstack/heat-api-74bbb49977-5dq74" Dec 03 07:01:27 crc kubenswrapper[4475]: I1203 07:01:27.168836 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-7af4-account-create-update-mpdn4" event={"ID":"37cea503-9fad-48b6-9ec0-b8957e5420f1","Type":"ContainerStarted","Data":"97e05fed1b7456228f73b3917c4ab42512d58417ddad2ba9f235f6e6a95293bc"} Dec 03 07:01:27 crc kubenswrapper[4475]: I1203 07:01:27.205789 4475 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ffaaa182-c947-4cbd-b96d-378fae973360-config-data" (OuterVolumeSpecName: "config-data") pod "ffaaa182-c947-4cbd-b96d-378fae973360" (UID: "ffaaa182-c947-4cbd-b96d-378fae973360"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 07:01:27 crc kubenswrapper[4475]: I1203 07:01:27.234285 4475 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ffaaa182-c947-4cbd-b96d-378fae973360-config-data\") on node \"crc\" DevicePath \"\"" Dec 03 07:01:27 crc kubenswrapper[4475]: I1203 07:01:27.251247 4475 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c61afd1c-48a7-47f3-b27d-6c6b56e4ce86-config-data" (OuterVolumeSpecName: "config-data") pod "c61afd1c-48a7-47f3-b27d-6c6b56e4ce86" (UID: "c61afd1c-48a7-47f3-b27d-6c6b56e4ce86"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 07:01:27 crc kubenswrapper[4475]: I1203 07:01:27.265530 4475 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/heat-api-74bbb49977-5dq74" podStartSLOduration=11.265513381 podStartE2EDuration="11.265513381s" podCreationTimestamp="2025-12-03 07:01:16 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 07:01:27.167647862 +0000 UTC m=+971.972546206" watchObservedRunningTime="2025-12-03 07:01:27.265513381 +0000 UTC m=+972.070411715" Dec 03 07:01:27 crc kubenswrapper[4475]: I1203 07:01:27.272496 4475 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-4b91-account-create-update-rv6lt"] Dec 03 07:01:27 crc kubenswrapper[4475]: I1203 07:01:27.279141 4475 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-6ad2-account-create-update-phxm5"] Dec 03 07:01:27 crc kubenswrapper[4475]: I1203 07:01:27.336301 4475 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c61afd1c-48a7-47f3-b27d-6c6b56e4ce86-config-data\") on node \"crc\" DevicePath \"\"" Dec 03 07:01:27 crc kubenswrapper[4475]: I1203 07:01:27.347305 4475 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/13ed03b8-8758-4c25-b37f-2793697026d2-config-data" (OuterVolumeSpecName: "config-data") pod "13ed03b8-8758-4c25-b37f-2793697026d2" (UID: "13ed03b8-8758-4c25-b37f-2793697026d2"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 07:01:27 crc kubenswrapper[4475]: I1203 07:01:27.437573 4475 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/13ed03b8-8758-4c25-b37f-2793697026d2-config-data\") on node \"crc\" DevicePath \"\"" Dec 03 07:01:27 crc kubenswrapper[4475]: I1203 07:01:27.485952 4475 scope.go:117] "RemoveContainer" containerID="48b539995f961f39f0f3784b46e123e5caf6c9bfe6e2f1621226b6e95eb4a268" Dec 03 07:01:27 crc kubenswrapper[4475]: I1203 07:01:27.567265 4475 scope.go:117] "RemoveContainer" containerID="74a1bf735d06e69f49a7a5836c0ac147e44657c1057d939890bcb84df13ddf46" Dec 03 07:01:27 crc kubenswrapper[4475]: E1203 07:01:27.630102 4475 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podffaaa182_c947_4cbd_b96d_378fae973360.slice/crio-73990065aae99ce6c01af42a31dca6c52ae93fe6af31c92b25d47333f6c98899\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc61afd1c_48a7_47f3_b27d_6c6b56e4ce86.slice/crio-d0f46804a2408743f46d79fd37d217a81c23770403618e4ef4c9950af5e33b6a\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod00f99628_c2ec_48c9_b266_0f14cbf05570.slice/crio-86923467babacbdac85b8413f4adc00e5627cf676ead74fad9c3c535839a95ac\": RecentStats: unable to find data in memory cache]" Dec 03 07:01:27 crc kubenswrapper[4475]: I1203 07:01:27.703499 4475 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-api-0"] Dec 03 07:01:27 crc kubenswrapper[4475]: I1203 07:01:27.714756 4475 scope.go:117] "RemoveContainer" containerID="314eebb9f24bd7ca6ede27d5040984fbf03c937af31d92e1f4dfa30c219712b4" Dec 03 07:01:27 crc kubenswrapper[4475]: I1203 07:01:27.745634 4475 kubelet.go:2431] "SyncLoop REMOVE" source="api" 
pods=["openstack/cinder-api-0"] Dec 03 07:01:27 crc kubenswrapper[4475]: I1203 07:01:27.772755 4475 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-api-0"] Dec 03 07:01:27 crc kubenswrapper[4475]: E1203 07:01:27.773691 4475 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c61afd1c-48a7-47f3-b27d-6c6b56e4ce86" containerName="heat-api" Dec 03 07:01:27 crc kubenswrapper[4475]: I1203 07:01:27.773705 4475 state_mem.go:107] "Deleted CPUSet assignment" podUID="c61afd1c-48a7-47f3-b27d-6c6b56e4ce86" containerName="heat-api" Dec 03 07:01:27 crc kubenswrapper[4475]: E1203 07:01:27.773718 4475 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="13ed03b8-8758-4c25-b37f-2793697026d2" containerName="ceilometer-central-agent" Dec 03 07:01:27 crc kubenswrapper[4475]: I1203 07:01:27.773726 4475 state_mem.go:107] "Deleted CPUSet assignment" podUID="13ed03b8-8758-4c25-b37f-2793697026d2" containerName="ceilometer-central-agent" Dec 03 07:01:27 crc kubenswrapper[4475]: E1203 07:01:27.773735 4475 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="13ed03b8-8758-4c25-b37f-2793697026d2" containerName="proxy-httpd" Dec 03 07:01:27 crc kubenswrapper[4475]: I1203 07:01:27.773740 4475 state_mem.go:107] "Deleted CPUSet assignment" podUID="13ed03b8-8758-4c25-b37f-2793697026d2" containerName="proxy-httpd" Dec 03 07:01:27 crc kubenswrapper[4475]: E1203 07:01:27.773751 4475 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="00f99628-c2ec-48c9-b266-0f14cbf05570" containerName="dnsmasq-dns" Dec 03 07:01:27 crc kubenswrapper[4475]: I1203 07:01:27.773756 4475 state_mem.go:107] "Deleted CPUSet assignment" podUID="00f99628-c2ec-48c9-b266-0f14cbf05570" containerName="dnsmasq-dns" Dec 03 07:01:27 crc kubenswrapper[4475]: E1203 07:01:27.773767 4475 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="00f99628-c2ec-48c9-b266-0f14cbf05570" containerName="init" Dec 03 07:01:27 crc kubenswrapper[4475]: I1203 
07:01:27.773772 4475 state_mem.go:107] "Deleted CPUSet assignment" podUID="00f99628-c2ec-48c9-b266-0f14cbf05570" containerName="init" Dec 03 07:01:27 crc kubenswrapper[4475]: E1203 07:01:27.773786 4475 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ffaaa182-c947-4cbd-b96d-378fae973360" containerName="cinder-api-log" Dec 03 07:01:27 crc kubenswrapper[4475]: I1203 07:01:27.773791 4475 state_mem.go:107] "Deleted CPUSet assignment" podUID="ffaaa182-c947-4cbd-b96d-378fae973360" containerName="cinder-api-log" Dec 03 07:01:27 crc kubenswrapper[4475]: E1203 07:01:27.773808 4475 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="13ed03b8-8758-4c25-b37f-2793697026d2" containerName="sg-core" Dec 03 07:01:27 crc kubenswrapper[4475]: I1203 07:01:27.773813 4475 state_mem.go:107] "Deleted CPUSet assignment" podUID="13ed03b8-8758-4c25-b37f-2793697026d2" containerName="sg-core" Dec 03 07:01:27 crc kubenswrapper[4475]: E1203 07:01:27.773824 4475 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="13ed03b8-8758-4c25-b37f-2793697026d2" containerName="ceilometer-notification-agent" Dec 03 07:01:27 crc kubenswrapper[4475]: I1203 07:01:27.773830 4475 state_mem.go:107] "Deleted CPUSet assignment" podUID="13ed03b8-8758-4c25-b37f-2793697026d2" containerName="ceilometer-notification-agent" Dec 03 07:01:27 crc kubenswrapper[4475]: E1203 07:01:27.773841 4475 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6b5cf0ec-4d62-4de9-ad66-f758113443e7" containerName="heat-cfnapi" Dec 03 07:01:27 crc kubenswrapper[4475]: I1203 07:01:27.773846 4475 state_mem.go:107] "Deleted CPUSet assignment" podUID="6b5cf0ec-4d62-4de9-ad66-f758113443e7" containerName="heat-cfnapi" Dec 03 07:01:27 crc kubenswrapper[4475]: E1203 07:01:27.773855 4475 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ffaaa182-c947-4cbd-b96d-378fae973360" containerName="cinder-api" Dec 03 07:01:27 crc kubenswrapper[4475]: I1203 07:01:27.773861 4475 
state_mem.go:107] "Deleted CPUSet assignment" podUID="ffaaa182-c947-4cbd-b96d-378fae973360" containerName="cinder-api" Dec 03 07:01:27 crc kubenswrapper[4475]: I1203 07:01:27.774018 4475 memory_manager.go:354] "RemoveStaleState removing state" podUID="13ed03b8-8758-4c25-b37f-2793697026d2" containerName="sg-core" Dec 03 07:01:27 crc kubenswrapper[4475]: I1203 07:01:27.774037 4475 memory_manager.go:354] "RemoveStaleState removing state" podUID="13ed03b8-8758-4c25-b37f-2793697026d2" containerName="ceilometer-central-agent" Dec 03 07:01:27 crc kubenswrapper[4475]: I1203 07:01:27.774050 4475 memory_manager.go:354] "RemoveStaleState removing state" podUID="ffaaa182-c947-4cbd-b96d-378fae973360" containerName="cinder-api" Dec 03 07:01:27 crc kubenswrapper[4475]: I1203 07:01:27.774061 4475 memory_manager.go:354] "RemoveStaleState removing state" podUID="00f99628-c2ec-48c9-b266-0f14cbf05570" containerName="dnsmasq-dns" Dec 03 07:01:27 crc kubenswrapper[4475]: I1203 07:01:27.774068 4475 memory_manager.go:354] "RemoveStaleState removing state" podUID="ffaaa182-c947-4cbd-b96d-378fae973360" containerName="cinder-api-log" Dec 03 07:01:27 crc kubenswrapper[4475]: I1203 07:01:27.774075 4475 memory_manager.go:354] "RemoveStaleState removing state" podUID="13ed03b8-8758-4c25-b37f-2793697026d2" containerName="ceilometer-notification-agent" Dec 03 07:01:27 crc kubenswrapper[4475]: I1203 07:01:27.774087 4475 memory_manager.go:354] "RemoveStaleState removing state" podUID="c61afd1c-48a7-47f3-b27d-6c6b56e4ce86" containerName="heat-api" Dec 03 07:01:27 crc kubenswrapper[4475]: I1203 07:01:27.774096 4475 memory_manager.go:354] "RemoveStaleState removing state" podUID="13ed03b8-8758-4c25-b37f-2793697026d2" containerName="proxy-httpd" Dec 03 07:01:27 crc kubenswrapper[4475]: I1203 07:01:27.774102 4475 memory_manager.go:354] "RemoveStaleState removing state" podUID="6b5cf0ec-4d62-4de9-ad66-f758113443e7" containerName="heat-cfnapi" Dec 03 07:01:27 crc kubenswrapper[4475]: I1203 07:01:27.777168 
4475 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0" Dec 03 07:01:27 crc kubenswrapper[4475]: I1203 07:01:27.785344 4475 scope.go:117] "RemoveContainer" containerID="a342851e48b4f81a5ac7652402e4a05cdd8410849baf25d75ddc6e72b7854358" Dec 03 07:01:27 crc kubenswrapper[4475]: I1203 07:01:27.785717 4475 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-cinder-internal-svc" Dec 03 07:01:27 crc kubenswrapper[4475]: I1203 07:01:27.797946 4475 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-cinder-public-svc" Dec 03 07:01:27 crc kubenswrapper[4475]: I1203 07:01:27.798156 4475 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-api-config-data" Dec 03 07:01:27 crc kubenswrapper[4475]: I1203 07:01:27.850850 4475 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/heat-cfnapi-58d4897ff7-tjzk8"] Dec 03 07:01:27 crc kubenswrapper[4475]: I1203 07:01:27.859468 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/d8163420-efd0-4080-9529-f67b6bcb689f-public-tls-certs\") pod \"cinder-api-0\" (UID: \"d8163420-efd0-4080-9529-f67b6bcb689f\") " pod="openstack/cinder-api-0" Dec 03 07:01:27 crc kubenswrapper[4475]: I1203 07:01:27.859534 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n85p2\" (UniqueName: \"kubernetes.io/projected/d8163420-efd0-4080-9529-f67b6bcb689f-kube-api-access-n85p2\") pod \"cinder-api-0\" (UID: \"d8163420-efd0-4080-9529-f67b6bcb689f\") " pod="openstack/cinder-api-0" Dec 03 07:01:27 crc kubenswrapper[4475]: I1203 07:01:27.859584 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d8163420-efd0-4080-9529-f67b6bcb689f-scripts\") pod \"cinder-api-0\" 
(UID: \"d8163420-efd0-4080-9529-f67b6bcb689f\") " pod="openstack/cinder-api-0" Dec 03 07:01:27 crc kubenswrapper[4475]: I1203 07:01:27.859624 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d8163420-efd0-4080-9529-f67b6bcb689f-config-data\") pod \"cinder-api-0\" (UID: \"d8163420-efd0-4080-9529-f67b6bcb689f\") " pod="openstack/cinder-api-0" Dec 03 07:01:27 crc kubenswrapper[4475]: I1203 07:01:27.859695 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d8163420-efd0-4080-9529-f67b6bcb689f-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"d8163420-efd0-4080-9529-f67b6bcb689f\") " pod="openstack/cinder-api-0" Dec 03 07:01:27 crc kubenswrapper[4475]: I1203 07:01:27.859741 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/d8163420-efd0-4080-9529-f67b6bcb689f-config-data-custom\") pod \"cinder-api-0\" (UID: \"d8163420-efd0-4080-9529-f67b6bcb689f\") " pod="openstack/cinder-api-0" Dec 03 07:01:27 crc kubenswrapper[4475]: I1203 07:01:27.859816 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/d8163420-efd0-4080-9529-f67b6bcb689f-etc-machine-id\") pod \"cinder-api-0\" (UID: \"d8163420-efd0-4080-9529-f67b6bcb689f\") " pod="openstack/cinder-api-0" Dec 03 07:01:27 crc kubenswrapper[4475]: I1203 07:01:27.859841 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d8163420-efd0-4080-9529-f67b6bcb689f-logs\") pod \"cinder-api-0\" (UID: \"d8163420-efd0-4080-9529-f67b6bcb689f\") " pod="openstack/cinder-api-0" Dec 03 07:01:27 crc kubenswrapper[4475]: I1203 
07:01:27.859873 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/d8163420-efd0-4080-9529-f67b6bcb689f-internal-tls-certs\") pod \"cinder-api-0\" (UID: \"d8163420-efd0-4080-9529-f67b6bcb689f\") " pod="openstack/cinder-api-0" Dec 03 07:01:27 crc kubenswrapper[4475]: I1203 07:01:27.870493 4475 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/heat-cfnapi-58d4897ff7-tjzk8"] Dec 03 07:01:27 crc kubenswrapper[4475]: I1203 07:01:27.882598 4475 scope.go:117] "RemoveContainer" containerID="15564d5c21e647eb083975141782f65b426ec7252db05a340c200804edf55f53" Dec 03 07:01:27 crc kubenswrapper[4475]: I1203 07:01:27.891213 4475 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"] Dec 03 07:01:27 crc kubenswrapper[4475]: I1203 07:01:27.897353 4475 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/heat-api-6c87685856-7spht"] Dec 03 07:01:27 crc kubenswrapper[4475]: I1203 07:01:27.902377 4475 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/heat-api-6c87685856-7spht"] Dec 03 07:01:27 crc kubenswrapper[4475]: I1203 07:01:27.903622 4475 scope.go:117] "RemoveContainer" containerID="ac7e14f9da3989dbfdc8572701d4cd54c6c606817f1b3bcd3ae56c896a5a8d03" Dec 03 07:01:27 crc kubenswrapper[4475]: I1203 07:01:27.908614 4475 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5967d8988f-ghdmf"] Dec 03 07:01:27 crc kubenswrapper[4475]: I1203 07:01:27.914363 4475 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-5967d8988f-ghdmf"] Dec 03 07:01:27 crc kubenswrapper[4475]: I1203 07:01:27.918965 4475 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Dec 03 07:01:27 crc kubenswrapper[4475]: I1203 07:01:27.923586 4475 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Dec 03 07:01:27 crc kubenswrapper[4475]: I1203 
07:01:27.923735 4475 scope.go:117] "RemoveContainer" containerID="cc503027a80c1b9e5c723fde9fd83b6e889ebccf00d47eea3a2911427649fddb" Dec 03 07:01:27 crc kubenswrapper[4475]: I1203 07:01:27.935862 4475 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Dec 03 07:01:27 crc kubenswrapper[4475]: I1203 07:01:27.938235 4475 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Dec 03 07:01:27 crc kubenswrapper[4475]: I1203 07:01:27.940262 4475 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Dec 03 07:01:27 crc kubenswrapper[4475]: I1203 07:01:27.943262 4475 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Dec 03 07:01:27 crc kubenswrapper[4475]: I1203 07:01:27.946650 4475 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Dec 03 07:01:27 crc kubenswrapper[4475]: I1203 07:01:27.961149 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/13335c23-2f7e-4387-9a60-1ba04b5fd219-scripts\") pod \"ceilometer-0\" (UID: \"13335c23-2f7e-4387-9a60-1ba04b5fd219\") " pod="openstack/ceilometer-0" Dec 03 07:01:27 crc kubenswrapper[4475]: I1203 07:01:27.961196 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/d8163420-efd0-4080-9529-f67b6bcb689f-public-tls-certs\") pod \"cinder-api-0\" (UID: \"d8163420-efd0-4080-9529-f67b6bcb689f\") " pod="openstack/cinder-api-0" Dec 03 07:01:27 crc kubenswrapper[4475]: I1203 07:01:27.961231 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-n85p2\" (UniqueName: \"kubernetes.io/projected/d8163420-efd0-4080-9529-f67b6bcb689f-kube-api-access-n85p2\") pod \"cinder-api-0\" (UID: \"d8163420-efd0-4080-9529-f67b6bcb689f\") " 
pod="openstack/cinder-api-0" Dec 03 07:01:27 crc kubenswrapper[4475]: I1203 07:01:27.961257 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/13335c23-2f7e-4387-9a60-1ba04b5fd219-run-httpd\") pod \"ceilometer-0\" (UID: \"13335c23-2f7e-4387-9a60-1ba04b5fd219\") " pod="openstack/ceilometer-0" Dec 03 07:01:27 crc kubenswrapper[4475]: I1203 07:01:27.962160 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d8163420-efd0-4080-9529-f67b6bcb689f-scripts\") pod \"cinder-api-0\" (UID: \"d8163420-efd0-4080-9529-f67b6bcb689f\") " pod="openstack/cinder-api-0" Dec 03 07:01:27 crc kubenswrapper[4475]: I1203 07:01:27.962195 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d8163420-efd0-4080-9529-f67b6bcb689f-config-data\") pod \"cinder-api-0\" (UID: \"d8163420-efd0-4080-9529-f67b6bcb689f\") " pod="openstack/cinder-api-0" Dec 03 07:01:27 crc kubenswrapper[4475]: I1203 07:01:27.963035 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/13335c23-2f7e-4387-9a60-1ba04b5fd219-log-httpd\") pod \"ceilometer-0\" (UID: \"13335c23-2f7e-4387-9a60-1ba04b5fd219\") " pod="openstack/ceilometer-0" Dec 03 07:01:27 crc kubenswrapper[4475]: I1203 07:01:27.963152 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d8163420-efd0-4080-9529-f67b6bcb689f-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"d8163420-efd0-4080-9529-f67b6bcb689f\") " pod="openstack/cinder-api-0" Dec 03 07:01:27 crc kubenswrapper[4475]: I1203 07:01:27.963184 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: 
\"kubernetes.io/secret/d8163420-efd0-4080-9529-f67b6bcb689f-config-data-custom\") pod \"cinder-api-0\" (UID: \"d8163420-efd0-4080-9529-f67b6bcb689f\") " pod="openstack/cinder-api-0" Dec 03 07:01:27 crc kubenswrapper[4475]: I1203 07:01:27.963213 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/13335c23-2f7e-4387-9a60-1ba04b5fd219-config-data\") pod \"ceilometer-0\" (UID: \"13335c23-2f7e-4387-9a60-1ba04b5fd219\") " pod="openstack/ceilometer-0" Dec 03 07:01:27 crc kubenswrapper[4475]: I1203 07:01:27.963232 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/13335c23-2f7e-4387-9a60-1ba04b5fd219-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"13335c23-2f7e-4387-9a60-1ba04b5fd219\") " pod="openstack/ceilometer-0" Dec 03 07:01:27 crc kubenswrapper[4475]: I1203 07:01:27.963345 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8zdcb\" (UniqueName: \"kubernetes.io/projected/13335c23-2f7e-4387-9a60-1ba04b5fd219-kube-api-access-8zdcb\") pod \"ceilometer-0\" (UID: \"13335c23-2f7e-4387-9a60-1ba04b5fd219\") " pod="openstack/ceilometer-0" Dec 03 07:01:27 crc kubenswrapper[4475]: I1203 07:01:27.963367 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/13335c23-2f7e-4387-9a60-1ba04b5fd219-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"13335c23-2f7e-4387-9a60-1ba04b5fd219\") " pod="openstack/ceilometer-0" Dec 03 07:01:27 crc kubenswrapper[4475]: I1203 07:01:27.963384 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/d8163420-efd0-4080-9529-f67b6bcb689f-etc-machine-id\") pod \"cinder-api-0\" (UID: 
\"d8163420-efd0-4080-9529-f67b6bcb689f\") " pod="openstack/cinder-api-0" Dec 03 07:01:27 crc kubenswrapper[4475]: I1203 07:01:27.963399 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d8163420-efd0-4080-9529-f67b6bcb689f-logs\") pod \"cinder-api-0\" (UID: \"d8163420-efd0-4080-9529-f67b6bcb689f\") " pod="openstack/cinder-api-0" Dec 03 07:01:27 crc kubenswrapper[4475]: I1203 07:01:27.963712 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/d8163420-efd0-4080-9529-f67b6bcb689f-internal-tls-certs\") pod \"cinder-api-0\" (UID: \"d8163420-efd0-4080-9529-f67b6bcb689f\") " pod="openstack/cinder-api-0" Dec 03 07:01:27 crc kubenswrapper[4475]: I1203 07:01:27.972463 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/d8163420-efd0-4080-9529-f67b6bcb689f-public-tls-certs\") pod \"cinder-api-0\" (UID: \"d8163420-efd0-4080-9529-f67b6bcb689f\") " pod="openstack/cinder-api-0" Dec 03 07:01:27 crc kubenswrapper[4475]: I1203 07:01:27.979751 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d8163420-efd0-4080-9529-f67b6bcb689f-config-data\") pod \"cinder-api-0\" (UID: \"d8163420-efd0-4080-9529-f67b6bcb689f\") " pod="openstack/cinder-api-0" Dec 03 07:01:27 crc kubenswrapper[4475]: I1203 07:01:27.980269 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/d8163420-efd0-4080-9529-f67b6bcb689f-etc-machine-id\") pod \"cinder-api-0\" (UID: \"d8163420-efd0-4080-9529-f67b6bcb689f\") " pod="openstack/cinder-api-0" Dec 03 07:01:27 crc kubenswrapper[4475]: I1203 07:01:27.980518 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: 
\"kubernetes.io/empty-dir/d8163420-efd0-4080-9529-f67b6bcb689f-logs\") pod \"cinder-api-0\" (UID: \"d8163420-efd0-4080-9529-f67b6bcb689f\") " pod="openstack/cinder-api-0" Dec 03 07:01:27 crc kubenswrapper[4475]: I1203 07:01:27.981719 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d8163420-efd0-4080-9529-f67b6bcb689f-scripts\") pod \"cinder-api-0\" (UID: \"d8163420-efd0-4080-9529-f67b6bcb689f\") " pod="openstack/cinder-api-0" Dec 03 07:01:27 crc kubenswrapper[4475]: I1203 07:01:27.983242 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/d8163420-efd0-4080-9529-f67b6bcb689f-internal-tls-certs\") pod \"cinder-api-0\" (UID: \"d8163420-efd0-4080-9529-f67b6bcb689f\") " pod="openstack/cinder-api-0" Dec 03 07:01:27 crc kubenswrapper[4475]: I1203 07:01:27.987339 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d8163420-efd0-4080-9529-f67b6bcb689f-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"d8163420-efd0-4080-9529-f67b6bcb689f\") " pod="openstack/cinder-api-0" Dec 03 07:01:27 crc kubenswrapper[4475]: I1203 07:01:27.988116 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/d8163420-efd0-4080-9529-f67b6bcb689f-config-data-custom\") pod \"cinder-api-0\" (UID: \"d8163420-efd0-4080-9529-f67b6bcb689f\") " pod="openstack/cinder-api-0" Dec 03 07:01:27 crc kubenswrapper[4475]: I1203 07:01:27.988656 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-n85p2\" (UniqueName: \"kubernetes.io/projected/d8163420-efd0-4080-9529-f67b6bcb689f-kube-api-access-n85p2\") pod \"cinder-api-0\" (UID: \"d8163420-efd0-4080-9529-f67b6bcb689f\") " pod="openstack/cinder-api-0" Dec 03 07:01:28 crc kubenswrapper[4475]: I1203 07:01:28.064870 4475 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/13335c23-2f7e-4387-9a60-1ba04b5fd219-scripts\") pod \"ceilometer-0\" (UID: \"13335c23-2f7e-4387-9a60-1ba04b5fd219\") " pod="openstack/ceilometer-0" Dec 03 07:01:28 crc kubenswrapper[4475]: I1203 07:01:28.064931 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/13335c23-2f7e-4387-9a60-1ba04b5fd219-run-httpd\") pod \"ceilometer-0\" (UID: \"13335c23-2f7e-4387-9a60-1ba04b5fd219\") " pod="openstack/ceilometer-0" Dec 03 07:01:28 crc kubenswrapper[4475]: I1203 07:01:28.064973 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/13335c23-2f7e-4387-9a60-1ba04b5fd219-log-httpd\") pod \"ceilometer-0\" (UID: \"13335c23-2f7e-4387-9a60-1ba04b5fd219\") " pod="openstack/ceilometer-0" Dec 03 07:01:28 crc kubenswrapper[4475]: I1203 07:01:28.065012 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/13335c23-2f7e-4387-9a60-1ba04b5fd219-config-data\") pod \"ceilometer-0\" (UID: \"13335c23-2f7e-4387-9a60-1ba04b5fd219\") " pod="openstack/ceilometer-0" Dec 03 07:01:28 crc kubenswrapper[4475]: I1203 07:01:28.065045 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/13335c23-2f7e-4387-9a60-1ba04b5fd219-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"13335c23-2f7e-4387-9a60-1ba04b5fd219\") " pod="openstack/ceilometer-0" Dec 03 07:01:28 crc kubenswrapper[4475]: I1203 07:01:28.065069 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8zdcb\" (UniqueName: \"kubernetes.io/projected/13335c23-2f7e-4387-9a60-1ba04b5fd219-kube-api-access-8zdcb\") pod \"ceilometer-0\" (UID: 
\"13335c23-2f7e-4387-9a60-1ba04b5fd219\") " pod="openstack/ceilometer-0" Dec 03 07:01:28 crc kubenswrapper[4475]: I1203 07:01:28.065086 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/13335c23-2f7e-4387-9a60-1ba04b5fd219-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"13335c23-2f7e-4387-9a60-1ba04b5fd219\") " pod="openstack/ceilometer-0" Dec 03 07:01:28 crc kubenswrapper[4475]: I1203 07:01:28.065722 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/13335c23-2f7e-4387-9a60-1ba04b5fd219-log-httpd\") pod \"ceilometer-0\" (UID: \"13335c23-2f7e-4387-9a60-1ba04b5fd219\") " pod="openstack/ceilometer-0" Dec 03 07:01:28 crc kubenswrapper[4475]: I1203 07:01:28.066767 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/13335c23-2f7e-4387-9a60-1ba04b5fd219-run-httpd\") pod \"ceilometer-0\" (UID: \"13335c23-2f7e-4387-9a60-1ba04b5fd219\") " pod="openstack/ceilometer-0" Dec 03 07:01:28 crc kubenswrapper[4475]: I1203 07:01:28.069522 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/13335c23-2f7e-4387-9a60-1ba04b5fd219-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"13335c23-2f7e-4387-9a60-1ba04b5fd219\") " pod="openstack/ceilometer-0" Dec 03 07:01:28 crc kubenswrapper[4475]: I1203 07:01:28.070995 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/13335c23-2f7e-4387-9a60-1ba04b5fd219-scripts\") pod \"ceilometer-0\" (UID: \"13335c23-2f7e-4387-9a60-1ba04b5fd219\") " pod="openstack/ceilometer-0" Dec 03 07:01:28 crc kubenswrapper[4475]: I1203 07:01:28.072152 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/13335c23-2f7e-4387-9a60-1ba04b5fd219-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"13335c23-2f7e-4387-9a60-1ba04b5fd219\") " pod="openstack/ceilometer-0" Dec 03 07:01:28 crc kubenswrapper[4475]: I1203 07:01:28.073142 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/13335c23-2f7e-4387-9a60-1ba04b5fd219-config-data\") pod \"ceilometer-0\" (UID: \"13335c23-2f7e-4387-9a60-1ba04b5fd219\") " pod="openstack/ceilometer-0" Dec 03 07:01:28 crc kubenswrapper[4475]: I1203 07:01:28.081568 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8zdcb\" (UniqueName: \"kubernetes.io/projected/13335c23-2f7e-4387-9a60-1ba04b5fd219-kube-api-access-8zdcb\") pod \"ceilometer-0\" (UID: \"13335c23-2f7e-4387-9a60-1ba04b5fd219\") " pod="openstack/ceilometer-0" Dec 03 07:01:28 crc kubenswrapper[4475]: I1203 07:01:28.133943 4475 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0" Dec 03 07:01:28 crc kubenswrapper[4475]: I1203 07:01:28.187797 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-6ad2-account-create-update-phxm5" event={"ID":"4ad132bf-e35e-48e6-b406-0aaca969a684","Type":"ContainerStarted","Data":"0c07a33c01541f1d2756e0fd7c843af2ed8229aa4620309e658291066f619218"} Dec 03 07:01:28 crc kubenswrapper[4475]: I1203 07:01:28.187834 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-6ad2-account-create-update-phxm5" event={"ID":"4ad132bf-e35e-48e6-b406-0aaca969a684","Type":"ContainerStarted","Data":"7f4b4c6c3e14d9a927c52851ba5d7f4d55eb209908063413fe149e8543155e51"} Dec 03 07:01:28 crc kubenswrapper[4475]: I1203 07:01:28.192391 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-db-create-tbtn5" 
event={"ID":"ece011ea-6da3-49ca-8dd8-b014f2796157","Type":"ContainerStarted","Data":"ec83ea70093335e49687050a5b3b17b7bd1bc87e08b470bf3d3167448aaa275a"} Dec 03 07:01:28 crc kubenswrapper[4475]: I1203 07:01:28.218881 4475 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-6ad2-account-create-update-phxm5" podStartSLOduration=6.218865938 podStartE2EDuration="6.218865938s" podCreationTimestamp="2025-12-03 07:01:22 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 07:01:28.213999334 +0000 UTC m=+973.018897668" watchObservedRunningTime="2025-12-03 07:01:28.218865938 +0000 UTC m=+973.023764273" Dec 03 07:01:28 crc kubenswrapper[4475]: I1203 07:01:28.224044 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-db-create-t8lwx" event={"ID":"fae26675-9566-4902-af26-0247c3f6164b","Type":"ContainerStarted","Data":"b5be679bc8c01203210fe0862b509d3b6f47528e3debb780752340d07fd241dc"} Dec 03 07:01:28 crc kubenswrapper[4475]: I1203 07:01:28.224088 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-db-create-t8lwx" event={"ID":"fae26675-9566-4902-af26-0247c3f6164b","Type":"ContainerStarted","Data":"85370ea0754b0be973aeffcc9728e3a53440e37e3fc0f09cded5175925662729"} Dec 03 07:01:28 crc kubenswrapper[4475]: I1203 07:01:28.229940 4475 generic.go:334] "Generic (PLEG): container finished" podID="88fc5e9a-b3f3-421c-9b93-939e11a9f81f" containerID="838d039cf647c8f22883f9643219fa6c5e9de457dc722ec2ba0e17526c6ff11f" exitCode=1 Dec 03 07:01:28 crc kubenswrapper[4475]: I1203 07:01:28.230009 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-cfnapi-7d7bff4588-wzmxn" event={"ID":"88fc5e9a-b3f3-421c-9b93-939e11a9f81f","Type":"ContainerDied","Data":"838d039cf647c8f22883f9643219fa6c5e9de457dc722ec2ba0e17526c6ff11f"} Dec 03 07:01:28 crc kubenswrapper[4475]: I1203 07:01:28.230046 4475 
scope.go:117] "RemoveContainer" containerID="98f32a0f5e7db47ee82aa63b9fc5094dc95aee4de0b651d226733b7b866440da" Dec 03 07:01:28 crc kubenswrapper[4475]: I1203 07:01:28.230378 4475 scope.go:117] "RemoveContainer" containerID="838d039cf647c8f22883f9643219fa6c5e9de457dc722ec2ba0e17526c6ff11f" Dec 03 07:01:28 crc kubenswrapper[4475]: E1203 07:01:28.230625 4475 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"heat-cfnapi\" with CrashLoopBackOff: \"back-off 10s restarting failed container=heat-cfnapi pod=heat-cfnapi-7d7bff4588-wzmxn_openstack(88fc5e9a-b3f3-421c-9b93-939e11a9f81f)\"" pod="openstack/heat-cfnapi-7d7bff4588-wzmxn" podUID="88fc5e9a-b3f3-421c-9b93-939e11a9f81f" Dec 03 07:01:28 crc kubenswrapper[4475]: I1203 07:01:28.237794 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-db-create-l6t9x" event={"ID":"d5b7dc01-eee2-4b3a-b834-3092016603d3","Type":"ContainerStarted","Data":"1c58d9d93461ebc04d6cbedbd057f82787a224606cc13bf3d2487e0d96fb94b9"} Dec 03 07:01:28 crc kubenswrapper[4475]: I1203 07:01:28.237886 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-db-create-l6t9x" event={"ID":"d5b7dc01-eee2-4b3a-b834-3092016603d3","Type":"ContainerStarted","Data":"fcf179ab62a188e053c62c6bb1491e14b8b097e403b18a7c25740c633cf01bed"} Dec 03 07:01:28 crc kubenswrapper[4475]: I1203 07:01:28.238297 4475 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-db-create-tbtn5" podStartSLOduration=7.238288072 podStartE2EDuration="7.238288072s" podCreationTimestamp="2025-12-03 07:01:21 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 07:01:28.225119932 +0000 UTC m=+973.030018265" watchObservedRunningTime="2025-12-03 07:01:28.238288072 +0000 UTC m=+973.043186406" Dec 03 07:01:28 crc kubenswrapper[4475]: I1203 07:01:28.247110 4475 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-db-create-t8lwx" podStartSLOduration=7.247099687 podStartE2EDuration="7.247099687s" podCreationTimestamp="2025-12-03 07:01:21 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 07:01:28.245787188 +0000 UTC m=+973.050685522" watchObservedRunningTime="2025-12-03 07:01:28.247099687 +0000 UTC m=+973.051998021" Dec 03 07:01:28 crc kubenswrapper[4475]: I1203 07:01:28.267342 4475 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-db-create-l6t9x" podStartSLOduration=7.267331643 podStartE2EDuration="7.267331643s" podCreationTimestamp="2025-12-03 07:01:21 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 07:01:28.263417502 +0000 UTC m=+973.068315836" watchObservedRunningTime="2025-12-03 07:01:28.267331643 +0000 UTC m=+973.072229968" Dec 03 07:01:28 crc kubenswrapper[4475]: I1203 07:01:28.268590 4475 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Dec 03 07:01:28 crc kubenswrapper[4475]: I1203 07:01:28.274311 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-7af4-account-create-update-mpdn4" event={"ID":"37cea503-9fad-48b6-9ec0-b8957e5420f1","Type":"ContainerStarted","Data":"36e8e62b17059dfec7d5f9200f2dd8bf1e79c80ecfbeaacd0191064276c95c27"} Dec 03 07:01:28 crc kubenswrapper[4475]: I1203 07:01:28.276723 4475 generic.go:334] "Generic (PLEG): container finished" podID="b58fd1e8-c5b6-4a83-b9b0-15c6eb22495d" containerID="b829e8bae3ecfb33d0039b80148cb037437b695ce94d68cbd40f21e1a3e56779" exitCode=1 Dec 03 07:01:28 crc kubenswrapper[4475]: I1203 07:01:28.276848 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-api-74bbb49977-5dq74" event={"ID":"b58fd1e8-c5b6-4a83-b9b0-15c6eb22495d","Type":"ContainerDied","Data":"b829e8bae3ecfb33d0039b80148cb037437b695ce94d68cbd40f21e1a3e56779"} Dec 03 07:01:28 crc kubenswrapper[4475]: I1203 07:01:28.276937 4475 scope.go:117] "RemoveContainer" containerID="c529f313f3e60c39a120a6bae259542f17bdfd9cfea7343f6afff36f2d135317" Dec 03 07:01:28 crc kubenswrapper[4475]: I1203 07:01:28.277329 4475 scope.go:117] "RemoveContainer" containerID="b829e8bae3ecfb33d0039b80148cb037437b695ce94d68cbd40f21e1a3e56779" Dec 03 07:01:28 crc kubenswrapper[4475]: E1203 07:01:28.277585 4475 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"heat-api\" with CrashLoopBackOff: \"back-off 10s restarting failed container=heat-api pod=heat-api-74bbb49977-5dq74_openstack(b58fd1e8-c5b6-4a83-b9b0-15c6eb22495d)\"" pod="openstack/heat-api-74bbb49977-5dq74" podUID="b58fd1e8-c5b6-4a83-b9b0-15c6eb22495d" Dec 03 07:01:28 crc kubenswrapper[4475]: I1203 07:01:28.283703 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-api-5869676b4b-mfgq2" 
event={"ID":"21c23182-6d85-4951-b96a-072f0d04bae6","Type":"ContainerStarted","Data":"c59fc9762d715a4c21fdeccd0efda204774f13f252110a37c7f37b134e3c63e3"} Dec 03 07:01:28 crc kubenswrapper[4475]: I1203 07:01:28.283747 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-api-5869676b4b-mfgq2" event={"ID":"21c23182-6d85-4951-b96a-072f0d04bae6","Type":"ContainerStarted","Data":"1c76fd0775847c46cd673a87bbc39d2ce0bb4cf2f1676bc9b61269c9279ca69f"} Dec 03 07:01:28 crc kubenswrapper[4475]: I1203 07:01:28.284577 4475 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/heat-api-5869676b4b-mfgq2" Dec 03 07:01:28 crc kubenswrapper[4475]: I1203 07:01:28.285444 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-4b91-account-create-update-rv6lt" event={"ID":"161d281f-1845-400c-bbda-691d6681cc69","Type":"ContainerStarted","Data":"c990880b947c6666048aaa8c294078b714269dfaf584ac073181ad2c2924030d"} Dec 03 07:01:28 crc kubenswrapper[4475]: I1203 07:01:28.285566 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-4b91-account-create-update-rv6lt" event={"ID":"161d281f-1845-400c-bbda-691d6681cc69","Type":"ContainerStarted","Data":"ad5ecc0de0676056789de81fd53d0087f886caee878d9ba431b1e7787e690f84"} Dec 03 07:01:28 crc kubenswrapper[4475]: I1203 07:01:28.287729 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-proxy-88869455c-74p7r" event={"ID":"16bbbd37-150f-4b54-8fc1-eb7708ecca88","Type":"ContainerStarted","Data":"644956e0aee4ff95c7c721ed241226b16d2c07a7377e6c441290c08d65bb9a4e"} Dec 03 07:01:28 crc kubenswrapper[4475]: I1203 07:01:28.291530 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-cfnapi-8465d48f48-kttgq" event={"ID":"89ecee9c-2a17-4d2b-bd66-78a1542d0f57","Type":"ContainerStarted","Data":"66cff671822751adb6f3da80fb9edda031b97d9aee427f6ff1cd2886dfac10c0"} Dec 03 07:01:28 crc kubenswrapper[4475]: I1203 07:01:28.291620 4475 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-cfnapi-8465d48f48-kttgq" event={"ID":"89ecee9c-2a17-4d2b-bd66-78a1542d0f57","Type":"ContainerStarted","Data":"0d7dd2facdb725eb05ba42b32bff2f6ba3ae7c4f8bfeb0b41bc1f934a3dbd04c"} Dec 03 07:01:28 crc kubenswrapper[4475]: I1203 07:01:28.291681 4475 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/heat-cfnapi-8465d48f48-kttgq" Dec 03 07:01:28 crc kubenswrapper[4475]: I1203 07:01:28.310865 4475 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-7af4-account-create-update-mpdn4" podStartSLOduration=7.310848081 podStartE2EDuration="7.310848081s" podCreationTimestamp="2025-12-03 07:01:21 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 07:01:28.302938122 +0000 UTC m=+973.107836456" watchObservedRunningTime="2025-12-03 07:01:28.310848081 +0000 UTC m=+973.115746415" Dec 03 07:01:28 crc kubenswrapper[4475]: I1203 07:01:28.411608 4475 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/heat-api-5869676b4b-mfgq2" podStartSLOduration=9.411591883 podStartE2EDuration="9.411591883s" podCreationTimestamp="2025-12-03 07:01:19 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 07:01:28.366191473 +0000 UTC m=+973.171089807" watchObservedRunningTime="2025-12-03 07:01:28.411591883 +0000 UTC m=+973.216490218" Dec 03 07:01:28 crc kubenswrapper[4475]: I1203 07:01:28.461665 4475 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-4b91-account-create-update-rv6lt" podStartSLOduration=6.461646959 podStartE2EDuration="6.461646959s" podCreationTimestamp="2025-12-03 07:01:22 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" 
observedRunningTime="2025-12-03 07:01:28.390655432 +0000 UTC m=+973.195553765" watchObservedRunningTime="2025-12-03 07:01:28.461646959 +0000 UTC m=+973.266545283" Dec 03 07:01:28 crc kubenswrapper[4475]: I1203 07:01:28.487156 4475 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/heat-cfnapi-8465d48f48-kttgq" podStartSLOduration=9.487140605 podStartE2EDuration="9.487140605s" podCreationTimestamp="2025-12-03 07:01:19 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 07:01:28.409975103 +0000 UTC m=+973.214873437" watchObservedRunningTime="2025-12-03 07:01:28.487140605 +0000 UTC m=+973.292038938" Dec 03 07:01:28 crc kubenswrapper[4475]: I1203 07:01:28.735958 4475 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"] Dec 03 07:01:28 crc kubenswrapper[4475]: W1203 07:01:28.752600 4475 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd8163420_efd0_4080_9529_f67b6bcb689f.slice/crio-f6ee1b1584485c383ef0b38b7c331c78e988045113d4ef067f6b2229d23393ee WatchSource:0}: Error finding container f6ee1b1584485c383ef0b38b7c331c78e988045113d4ef067f6b2229d23393ee: Status 404 returned error can't find the container with id f6ee1b1584485c383ef0b38b7c331c78e988045113d4ef067f6b2229d23393ee Dec 03 07:01:28 crc kubenswrapper[4475]: I1203 07:01:28.897689 4475 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Dec 03 07:01:28 crc kubenswrapper[4475]: W1203 07:01:28.903557 4475 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod13335c23_2f7e_4387_9a60_1ba04b5fd219.slice/crio-9a32645ce2929d9428bfdf88a86c2d3660d71b8ba69725a2a4f0b023defe05a4 WatchSource:0}: Error finding container 9a32645ce2929d9428bfdf88a86c2d3660d71b8ba69725a2a4f0b023defe05a4: Status 404 
returned error can't find the container with id 9a32645ce2929d9428bfdf88a86c2d3660d71b8ba69725a2a4f0b023defe05a4 Dec 03 07:01:29 crc kubenswrapper[4475]: I1203 07:01:29.312507 4475 generic.go:334] "Generic (PLEG): container finished" podID="161d281f-1845-400c-bbda-691d6681cc69" containerID="c990880b947c6666048aaa8c294078b714269dfaf584ac073181ad2c2924030d" exitCode=0 Dec 03 07:01:29 crc kubenswrapper[4475]: I1203 07:01:29.312784 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-4b91-account-create-update-rv6lt" event={"ID":"161d281f-1845-400c-bbda-691d6681cc69","Type":"ContainerDied","Data":"c990880b947c6666048aaa8c294078b714269dfaf584ac073181ad2c2924030d"} Dec 03 07:01:29 crc kubenswrapper[4475]: I1203 07:01:29.316080 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"d8163420-efd0-4080-9529-f67b6bcb689f","Type":"ContainerStarted","Data":"7d52080850ae7adc322fd81f659e51b35fa0c129657afd9e7177b7c24dbcae91"} Dec 03 07:01:29 crc kubenswrapper[4475]: I1203 07:01:29.316109 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"d8163420-efd0-4080-9529-f67b6bcb689f","Type":"ContainerStarted","Data":"f6ee1b1584485c383ef0b38b7c331c78e988045113d4ef067f6b2229d23393ee"} Dec 03 07:01:29 crc kubenswrapper[4475]: I1203 07:01:29.318083 4475 generic.go:334] "Generic (PLEG): container finished" podID="fae26675-9566-4902-af26-0247c3f6164b" containerID="b5be679bc8c01203210fe0862b509d3b6f47528e3debb780752340d07fd241dc" exitCode=0 Dec 03 07:01:29 crc kubenswrapper[4475]: I1203 07:01:29.318126 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-db-create-t8lwx" event={"ID":"fae26675-9566-4902-af26-0247c3f6164b","Type":"ContainerDied","Data":"b5be679bc8c01203210fe0862b509d3b6f47528e3debb780752340d07fd241dc"} Dec 03 07:01:29 crc kubenswrapper[4475]: I1203 07:01:29.319904 4475 scope.go:117] "RemoveContainer" 
containerID="b829e8bae3ecfb33d0039b80148cb037437b695ce94d68cbd40f21e1a3e56779" Dec 03 07:01:29 crc kubenswrapper[4475]: E1203 07:01:29.320146 4475 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"heat-api\" with CrashLoopBackOff: \"back-off 10s restarting failed container=heat-api pod=heat-api-74bbb49977-5dq74_openstack(b58fd1e8-c5b6-4a83-b9b0-15c6eb22495d)\"" pod="openstack/heat-api-74bbb49977-5dq74" podUID="b58fd1e8-c5b6-4a83-b9b0-15c6eb22495d" Dec 03 07:01:29 crc kubenswrapper[4475]: I1203 07:01:29.321877 4475 scope.go:117] "RemoveContainer" containerID="838d039cf647c8f22883f9643219fa6c5e9de457dc722ec2ba0e17526c6ff11f" Dec 03 07:01:29 crc kubenswrapper[4475]: E1203 07:01:29.322049 4475 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"heat-cfnapi\" with CrashLoopBackOff: \"back-off 10s restarting failed container=heat-cfnapi pod=heat-cfnapi-7d7bff4588-wzmxn_openstack(88fc5e9a-b3f3-421c-9b93-939e11a9f81f)\"" pod="openstack/heat-cfnapi-7d7bff4588-wzmxn" podUID="88fc5e9a-b3f3-421c-9b93-939e11a9f81f" Dec 03 07:01:29 crc kubenswrapper[4475]: I1203 07:01:29.323146 4475 generic.go:334] "Generic (PLEG): container finished" podID="d5b7dc01-eee2-4b3a-b834-3092016603d3" containerID="1c58d9d93461ebc04d6cbedbd057f82787a224606cc13bf3d2487e0d96fb94b9" exitCode=0 Dec 03 07:01:29 crc kubenswrapper[4475]: I1203 07:01:29.323189 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-db-create-l6t9x" event={"ID":"d5b7dc01-eee2-4b3a-b834-3092016603d3","Type":"ContainerDied","Data":"1c58d9d93461ebc04d6cbedbd057f82787a224606cc13bf3d2487e0d96fb94b9"} Dec 03 07:01:29 crc kubenswrapper[4475]: I1203 07:01:29.326401 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-proxy-88869455c-74p7r" event={"ID":"16bbbd37-150f-4b54-8fc1-eb7708ecca88","Type":"ContainerStarted","Data":"d798a5bfd9b0c067fac62f5cbe5994ef4dba9d8d3ada6aee88cbbb7cac539827"} Dec 03 07:01:29 crc 
kubenswrapper[4475]: I1203 07:01:29.327295 4475 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/swift-proxy-88869455c-74p7r" Dec 03 07:01:29 crc kubenswrapper[4475]: I1203 07:01:29.327327 4475 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/swift-proxy-88869455c-74p7r" Dec 03 07:01:29 crc kubenswrapper[4475]: I1203 07:01:29.332314 4475 generic.go:334] "Generic (PLEG): container finished" podID="37cea503-9fad-48b6-9ec0-b8957e5420f1" containerID="36e8e62b17059dfec7d5f9200f2dd8bf1e79c80ecfbeaacd0191064276c95c27" exitCode=0 Dec 03 07:01:29 crc kubenswrapper[4475]: I1203 07:01:29.332362 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-7af4-account-create-update-mpdn4" event={"ID":"37cea503-9fad-48b6-9ec0-b8957e5420f1","Type":"ContainerDied","Data":"36e8e62b17059dfec7d5f9200f2dd8bf1e79c80ecfbeaacd0191064276c95c27"} Dec 03 07:01:29 crc kubenswrapper[4475]: I1203 07:01:29.342334 4475 generic.go:334] "Generic (PLEG): container finished" podID="4ad132bf-e35e-48e6-b406-0aaca969a684" containerID="0c07a33c01541f1d2756e0fd7c843af2ed8229aa4620309e658291066f619218" exitCode=0 Dec 03 07:01:29 crc kubenswrapper[4475]: I1203 07:01:29.342404 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-6ad2-account-create-update-phxm5" event={"ID":"4ad132bf-e35e-48e6-b406-0aaca969a684","Type":"ContainerDied","Data":"0c07a33c01541f1d2756e0fd7c843af2ed8229aa4620309e658291066f619218"} Dec 03 07:01:29 crc kubenswrapper[4475]: I1203 07:01:29.347048 4475 generic.go:334] "Generic (PLEG): container finished" podID="ece011ea-6da3-49ca-8dd8-b014f2796157" containerID="ec83ea70093335e49687050a5b3b17b7bd1bc87e08b470bf3d3167448aaa275a" exitCode=0 Dec 03 07:01:29 crc kubenswrapper[4475]: I1203 07:01:29.347091 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-db-create-tbtn5" 
event={"ID":"ece011ea-6da3-49ca-8dd8-b014f2796157","Type":"ContainerDied","Data":"ec83ea70093335e49687050a5b3b17b7bd1bc87e08b470bf3d3167448aaa275a"} Dec 03 07:01:29 crc kubenswrapper[4475]: I1203 07:01:29.354195 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"13335c23-2f7e-4387-9a60-1ba04b5fd219","Type":"ContainerStarted","Data":"9a32645ce2929d9428bfdf88a86c2d3660d71b8ba69725a2a4f0b023defe05a4"} Dec 03 07:01:29 crc kubenswrapper[4475]: I1203 07:01:29.466067 4475 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/swift-proxy-88869455c-74p7r" podStartSLOduration=10.466049644 podStartE2EDuration="10.466049644s" podCreationTimestamp="2025-12-03 07:01:19 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 07:01:29.419413148 +0000 UTC m=+974.224311482" watchObservedRunningTime="2025-12-03 07:01:29.466049644 +0000 UTC m=+974.270947978" Dec 03 07:01:29 crc kubenswrapper[4475]: I1203 07:01:29.502815 4475 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="00f99628-c2ec-48c9-b266-0f14cbf05570" path="/var/lib/kubelet/pods/00f99628-c2ec-48c9-b266-0f14cbf05570/volumes" Dec 03 07:01:29 crc kubenswrapper[4475]: I1203 07:01:29.503472 4475 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="13ed03b8-8758-4c25-b37f-2793697026d2" path="/var/lib/kubelet/pods/13ed03b8-8758-4c25-b37f-2793697026d2/volumes" Dec 03 07:01:29 crc kubenswrapper[4475]: I1203 07:01:29.505442 4475 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6b5cf0ec-4d62-4de9-ad66-f758113443e7" path="/var/lib/kubelet/pods/6b5cf0ec-4d62-4de9-ad66-f758113443e7/volumes" Dec 03 07:01:29 crc kubenswrapper[4475]: I1203 07:01:29.506199 4475 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c61afd1c-48a7-47f3-b27d-6c6b56e4ce86" 
path="/var/lib/kubelet/pods/c61afd1c-48a7-47f3-b27d-6c6b56e4ce86/volumes" Dec 03 07:01:29 crc kubenswrapper[4475]: I1203 07:01:29.507172 4475 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ffaaa182-c947-4cbd-b96d-378fae973360" path="/var/lib/kubelet/pods/ffaaa182-c947-4cbd-b96d-378fae973360/volumes" Dec 03 07:01:30 crc kubenswrapper[4475]: I1203 07:01:30.361300 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"13335c23-2f7e-4387-9a60-1ba04b5fd219","Type":"ContainerStarted","Data":"0c5c7e67a2661b6d65ea6d4a1427e12fb982eafc0a00337d7b391a6845076211"} Dec 03 07:01:30 crc kubenswrapper[4475]: I1203 07:01:30.364612 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"d8163420-efd0-4080-9529-f67b6bcb689f","Type":"ContainerStarted","Data":"d511aea38f750e60a8239f277e54a0ae2e8d4b49b9fbde01db54c1343f37a35f"} Dec 03 07:01:30 crc kubenswrapper[4475]: I1203 07:01:30.365367 4475 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/cinder-api-0" Dec 03 07:01:30 crc kubenswrapper[4475]: I1203 07:01:30.401361 4475 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-api-0" podStartSLOduration=3.401344353 podStartE2EDuration="3.401344353s" podCreationTimestamp="2025-12-03 07:01:27 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 07:01:30.400807183 +0000 UTC m=+975.205705517" watchObservedRunningTime="2025-12-03 07:01:30.401344353 +0000 UTC m=+975.206242687" Dec 03 07:01:30 crc kubenswrapper[4475]: I1203 07:01:30.905933 4475 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/heat-engine-74fdff459b-tj7xb" Dec 03 07:01:31 crc kubenswrapper[4475]: I1203 07:01:31.724775 4475 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-7af4-account-create-update-mpdn4" Dec 03 07:01:31 crc kubenswrapper[4475]: I1203 07:01:31.731802 4475 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-db-create-l6t9x" Dec 03 07:01:31 crc kubenswrapper[4475]: I1203 07:01:31.750331 4475 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-4b91-account-create-update-rv6lt" Dec 03 07:01:31 crc kubenswrapper[4475]: I1203 07:01:31.776023 4475 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-db-create-t8lwx" Dec 03 07:01:31 crc kubenswrapper[4475]: I1203 07:01:31.780250 4475 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-db-create-tbtn5" Dec 03 07:01:31 crc kubenswrapper[4475]: I1203 07:01:31.806055 4475 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-6ad2-account-create-update-phxm5" Dec 03 07:01:31 crc kubenswrapper[4475]: I1203 07:01:31.860136 4475 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8bt9b\" (UniqueName: \"kubernetes.io/projected/161d281f-1845-400c-bbda-691d6681cc69-kube-api-access-8bt9b\") pod \"161d281f-1845-400c-bbda-691d6681cc69\" (UID: \"161d281f-1845-400c-bbda-691d6681cc69\") " Dec 03 07:01:31 crc kubenswrapper[4475]: I1203 07:01:31.860191 4475 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-j4g88\" (UniqueName: \"kubernetes.io/projected/d5b7dc01-eee2-4b3a-b834-3092016603d3-kube-api-access-j4g88\") pod \"d5b7dc01-eee2-4b3a-b834-3092016603d3\" (UID: \"d5b7dc01-eee2-4b3a-b834-3092016603d3\") " Dec 03 07:01:31 crc kubenswrapper[4475]: I1203 07:01:31.862966 4475 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/projected/161d281f-1845-400c-bbda-691d6681cc69-kube-api-access-8bt9b" (OuterVolumeSpecName: "kube-api-access-8bt9b") pod "161d281f-1845-400c-bbda-691d6681cc69" (UID: "161d281f-1845-400c-bbda-691d6681cc69"). InnerVolumeSpecName "kube-api-access-8bt9b". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 07:01:31 crc kubenswrapper[4475]: I1203 07:01:31.865060 4475 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/161d281f-1845-400c-bbda-691d6681cc69-operator-scripts\") pod \"161d281f-1845-400c-bbda-691d6681cc69\" (UID: \"161d281f-1845-400c-bbda-691d6681cc69\") " Dec 03 07:01:31 crc kubenswrapper[4475]: I1203 07:01:31.865188 4475 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/37cea503-9fad-48b6-9ec0-b8957e5420f1-operator-scripts\") pod \"37cea503-9fad-48b6-9ec0-b8957e5420f1\" (UID: \"37cea503-9fad-48b6-9ec0-b8957e5420f1\") " Dec 03 07:01:31 crc kubenswrapper[4475]: I1203 07:01:31.865309 4475 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d5b7dc01-eee2-4b3a-b834-3092016603d3-operator-scripts\") pod \"d5b7dc01-eee2-4b3a-b834-3092016603d3\" (UID: \"d5b7dc01-eee2-4b3a-b834-3092016603d3\") " Dec 03 07:01:31 crc kubenswrapper[4475]: I1203 07:01:31.865352 4475 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-c7crv\" (UniqueName: \"kubernetes.io/projected/37cea503-9fad-48b6-9ec0-b8957e5420f1-kube-api-access-c7crv\") pod \"37cea503-9fad-48b6-9ec0-b8957e5420f1\" (UID: \"37cea503-9fad-48b6-9ec0-b8957e5420f1\") " Dec 03 07:01:31 crc kubenswrapper[4475]: I1203 07:01:31.866289 4475 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8bt9b\" (UniqueName: 
\"kubernetes.io/projected/161d281f-1845-400c-bbda-691d6681cc69-kube-api-access-8bt9b\") on node \"crc\" DevicePath \"\"" Dec 03 07:01:31 crc kubenswrapper[4475]: I1203 07:01:31.866681 4475 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/161d281f-1845-400c-bbda-691d6681cc69-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "161d281f-1845-400c-bbda-691d6681cc69" (UID: "161d281f-1845-400c-bbda-691d6681cc69"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 07:01:31 crc kubenswrapper[4475]: I1203 07:01:31.867193 4475 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d5b7dc01-eee2-4b3a-b834-3092016603d3-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "d5b7dc01-eee2-4b3a-b834-3092016603d3" (UID: "d5b7dc01-eee2-4b3a-b834-3092016603d3"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 07:01:31 crc kubenswrapper[4475]: I1203 07:01:31.867614 4475 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/37cea503-9fad-48b6-9ec0-b8957e5420f1-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "37cea503-9fad-48b6-9ec0-b8957e5420f1" (UID: "37cea503-9fad-48b6-9ec0-b8957e5420f1"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 07:01:31 crc kubenswrapper[4475]: I1203 07:01:31.871631 4475 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d5b7dc01-eee2-4b3a-b834-3092016603d3-kube-api-access-j4g88" (OuterVolumeSpecName: "kube-api-access-j4g88") pod "d5b7dc01-eee2-4b3a-b834-3092016603d3" (UID: "d5b7dc01-eee2-4b3a-b834-3092016603d3"). InnerVolumeSpecName "kube-api-access-j4g88". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 07:01:31 crc kubenswrapper[4475]: I1203 07:01:31.874710 4475 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/37cea503-9fad-48b6-9ec0-b8957e5420f1-kube-api-access-c7crv" (OuterVolumeSpecName: "kube-api-access-c7crv") pod "37cea503-9fad-48b6-9ec0-b8957e5420f1" (UID: "37cea503-9fad-48b6-9ec0-b8957e5420f1"). InnerVolumeSpecName "kube-api-access-c7crv". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 07:01:31 crc kubenswrapper[4475]: I1203 07:01:31.967478 4475 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gr5pw\" (UniqueName: \"kubernetes.io/projected/ece011ea-6da3-49ca-8dd8-b014f2796157-kube-api-access-gr5pw\") pod \"ece011ea-6da3-49ca-8dd8-b014f2796157\" (UID: \"ece011ea-6da3-49ca-8dd8-b014f2796157\") " Dec 03 07:01:31 crc kubenswrapper[4475]: I1203 07:01:31.967677 4475 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8qgh4\" (UniqueName: \"kubernetes.io/projected/fae26675-9566-4902-af26-0247c3f6164b-kube-api-access-8qgh4\") pod \"fae26675-9566-4902-af26-0247c3f6164b\" (UID: \"fae26675-9566-4902-af26-0247c3f6164b\") " Dec 03 07:01:31 crc kubenswrapper[4475]: I1203 07:01:31.967743 4475 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/4ad132bf-e35e-48e6-b406-0aaca969a684-operator-scripts\") pod \"4ad132bf-e35e-48e6-b406-0aaca969a684\" (UID: \"4ad132bf-e35e-48e6-b406-0aaca969a684\") " Dec 03 07:01:31 crc kubenswrapper[4475]: I1203 07:01:31.967774 4475 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ece011ea-6da3-49ca-8dd8-b014f2796157-operator-scripts\") pod \"ece011ea-6da3-49ca-8dd8-b014f2796157\" (UID: \"ece011ea-6da3-49ca-8dd8-b014f2796157\") " Dec 03 07:01:31 crc 
kubenswrapper[4475]: I1203 07:01:31.967822 4475 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fdzkt\" (UniqueName: \"kubernetes.io/projected/4ad132bf-e35e-48e6-b406-0aaca969a684-kube-api-access-fdzkt\") pod \"4ad132bf-e35e-48e6-b406-0aaca969a684\" (UID: \"4ad132bf-e35e-48e6-b406-0aaca969a684\") " Dec 03 07:01:31 crc kubenswrapper[4475]: I1203 07:01:31.967903 4475 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/fae26675-9566-4902-af26-0247c3f6164b-operator-scripts\") pod \"fae26675-9566-4902-af26-0247c3f6164b\" (UID: \"fae26675-9566-4902-af26-0247c3f6164b\") " Dec 03 07:01:31 crc kubenswrapper[4475]: I1203 07:01:31.968298 4475 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4ad132bf-e35e-48e6-b406-0aaca969a684-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "4ad132bf-e35e-48e6-b406-0aaca969a684" (UID: "4ad132bf-e35e-48e6-b406-0aaca969a684"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 07:01:31 crc kubenswrapper[4475]: I1203 07:01:31.968318 4475 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/161d281f-1845-400c-bbda-691d6681cc69-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 03 07:01:31 crc kubenswrapper[4475]: I1203 07:01:31.968331 4475 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/37cea503-9fad-48b6-9ec0-b8957e5420f1-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 03 07:01:31 crc kubenswrapper[4475]: I1203 07:01:31.968339 4475 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d5b7dc01-eee2-4b3a-b834-3092016603d3-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 03 07:01:31 crc kubenswrapper[4475]: I1203 07:01:31.968347 4475 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-c7crv\" (UniqueName: \"kubernetes.io/projected/37cea503-9fad-48b6-9ec0-b8957e5420f1-kube-api-access-c7crv\") on node \"crc\" DevicePath \"\"" Dec 03 07:01:31 crc kubenswrapper[4475]: I1203 07:01:31.968358 4475 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-j4g88\" (UniqueName: \"kubernetes.io/projected/d5b7dc01-eee2-4b3a-b834-3092016603d3-kube-api-access-j4g88\") on node \"crc\" DevicePath \"\"" Dec 03 07:01:31 crc kubenswrapper[4475]: I1203 07:01:31.968822 4475 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ece011ea-6da3-49ca-8dd8-b014f2796157-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "ece011ea-6da3-49ca-8dd8-b014f2796157" (UID: "ece011ea-6da3-49ca-8dd8-b014f2796157"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 07:01:31 crc kubenswrapper[4475]: I1203 07:01:31.968974 4475 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fae26675-9566-4902-af26-0247c3f6164b-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "fae26675-9566-4902-af26-0247c3f6164b" (UID: "fae26675-9566-4902-af26-0247c3f6164b"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 07:01:31 crc kubenswrapper[4475]: I1203 07:01:31.970898 4475 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ece011ea-6da3-49ca-8dd8-b014f2796157-kube-api-access-gr5pw" (OuterVolumeSpecName: "kube-api-access-gr5pw") pod "ece011ea-6da3-49ca-8dd8-b014f2796157" (UID: "ece011ea-6da3-49ca-8dd8-b014f2796157"). InnerVolumeSpecName "kube-api-access-gr5pw". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 07:01:31 crc kubenswrapper[4475]: I1203 07:01:31.972858 4475 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4ad132bf-e35e-48e6-b406-0aaca969a684-kube-api-access-fdzkt" (OuterVolumeSpecName: "kube-api-access-fdzkt") pod "4ad132bf-e35e-48e6-b406-0aaca969a684" (UID: "4ad132bf-e35e-48e6-b406-0aaca969a684"). InnerVolumeSpecName "kube-api-access-fdzkt". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 07:01:31 crc kubenswrapper[4475]: I1203 07:01:31.974063 4475 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fae26675-9566-4902-af26-0247c3f6164b-kube-api-access-8qgh4" (OuterVolumeSpecName: "kube-api-access-8qgh4") pod "fae26675-9566-4902-af26-0247c3f6164b" (UID: "fae26675-9566-4902-af26-0247c3f6164b"). InnerVolumeSpecName "kube-api-access-8qgh4". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 07:01:32 crc kubenswrapper[4475]: I1203 07:01:32.034536 4475 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openstack/heat-api-74bbb49977-5dq74" Dec 03 07:01:32 crc kubenswrapper[4475]: I1203 07:01:32.035281 4475 scope.go:117] "RemoveContainer" containerID="b829e8bae3ecfb33d0039b80148cb037437b695ce94d68cbd40f21e1a3e56779" Dec 03 07:01:32 crc kubenswrapper[4475]: E1203 07:01:32.035605 4475 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"heat-api\" with CrashLoopBackOff: \"back-off 10s restarting failed container=heat-api pod=heat-api-74bbb49977-5dq74_openstack(b58fd1e8-c5b6-4a83-b9b0-15c6eb22495d)\"" pod="openstack/heat-api-74bbb49977-5dq74" podUID="b58fd1e8-c5b6-4a83-b9b0-15c6eb22495d" Dec 03 07:01:32 crc kubenswrapper[4475]: I1203 07:01:32.042859 4475 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openstack/heat-cfnapi-7d7bff4588-wzmxn" Dec 03 07:01:32 crc kubenswrapper[4475]: I1203 07:01:32.043498 4475 scope.go:117] "RemoveContainer" containerID="838d039cf647c8f22883f9643219fa6c5e9de457dc722ec2ba0e17526c6ff11f" Dec 03 07:01:32 crc kubenswrapper[4475]: E1203 07:01:32.043706 4475 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"heat-cfnapi\" with CrashLoopBackOff: \"back-off 10s restarting failed container=heat-cfnapi pod=heat-cfnapi-7d7bff4588-wzmxn_openstack(88fc5e9a-b3f3-421c-9b93-939e11a9f81f)\"" pod="openstack/heat-cfnapi-7d7bff4588-wzmxn" podUID="88fc5e9a-b3f3-421c-9b93-939e11a9f81f" Dec 03 07:01:32 crc kubenswrapper[4475]: I1203 07:01:32.069824 4475 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gr5pw\" (UniqueName: \"kubernetes.io/projected/ece011ea-6da3-49ca-8dd8-b014f2796157-kube-api-access-gr5pw\") on node \"crc\" DevicePath \"\"" Dec 03 07:01:32 crc kubenswrapper[4475]: I1203 07:01:32.069853 4475 reconciler_common.go:293] 
"Volume detached for volume \"kube-api-access-8qgh4\" (UniqueName: \"kubernetes.io/projected/fae26675-9566-4902-af26-0247c3f6164b-kube-api-access-8qgh4\") on node \"crc\" DevicePath \"\"" Dec 03 07:01:32 crc kubenswrapper[4475]: I1203 07:01:32.069863 4475 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/4ad132bf-e35e-48e6-b406-0aaca969a684-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 03 07:01:32 crc kubenswrapper[4475]: I1203 07:01:32.069873 4475 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ece011ea-6da3-49ca-8dd8-b014f2796157-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 03 07:01:32 crc kubenswrapper[4475]: I1203 07:01:32.069883 4475 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fdzkt\" (UniqueName: \"kubernetes.io/projected/4ad132bf-e35e-48e6-b406-0aaca969a684-kube-api-access-fdzkt\") on node \"crc\" DevicePath \"\"" Dec 03 07:01:32 crc kubenswrapper[4475]: I1203 07:01:32.069891 4475 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/fae26675-9566-4902-af26-0247c3f6164b-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 03 07:01:32 crc kubenswrapper[4475]: I1203 07:01:32.099798 4475 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-7fc4d79b88-s8hhg" Dec 03 07:01:32 crc kubenswrapper[4475]: I1203 07:01:32.272607 4475 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/2401beb9-38b8-4581-b9a2-8bb16e15e6c1-scripts\") pod \"2401beb9-38b8-4581-b9a2-8bb16e15e6c1\" (UID: \"2401beb9-38b8-4581-b9a2-8bb16e15e6c1\") " Dec 03 07:01:32 crc kubenswrapper[4475]: I1203 07:01:32.272685 4475 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2401beb9-38b8-4581-b9a2-8bb16e15e6c1-logs\") pod \"2401beb9-38b8-4581-b9a2-8bb16e15e6c1\" (UID: \"2401beb9-38b8-4581-b9a2-8bb16e15e6c1\") " Dec 03 07:01:32 crc kubenswrapper[4475]: I1203 07:01:32.272751 4475 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2401beb9-38b8-4581-b9a2-8bb16e15e6c1-combined-ca-bundle\") pod \"2401beb9-38b8-4581-b9a2-8bb16e15e6c1\" (UID: \"2401beb9-38b8-4581-b9a2-8bb16e15e6c1\") " Dec 03 07:01:32 crc kubenswrapper[4475]: I1203 07:01:32.272808 4475 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/2401beb9-38b8-4581-b9a2-8bb16e15e6c1-horizon-tls-certs\") pod \"2401beb9-38b8-4581-b9a2-8bb16e15e6c1\" (UID: \"2401beb9-38b8-4581-b9a2-8bb16e15e6c1\") " Dec 03 07:01:32 crc kubenswrapper[4475]: I1203 07:01:32.272866 4475 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/2401beb9-38b8-4581-b9a2-8bb16e15e6c1-config-data\") pod \"2401beb9-38b8-4581-b9a2-8bb16e15e6c1\" (UID: \"2401beb9-38b8-4581-b9a2-8bb16e15e6c1\") " Dec 03 07:01:32 crc kubenswrapper[4475]: I1203 07:01:32.272883 4475 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"horizon-secret-key\" (UniqueName: 
\"kubernetes.io/secret/2401beb9-38b8-4581-b9a2-8bb16e15e6c1-horizon-secret-key\") pod \"2401beb9-38b8-4581-b9a2-8bb16e15e6c1\" (UID: \"2401beb9-38b8-4581-b9a2-8bb16e15e6c1\") " Dec 03 07:01:32 crc kubenswrapper[4475]: I1203 07:01:32.272917 4475 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-248ht\" (UniqueName: \"kubernetes.io/projected/2401beb9-38b8-4581-b9a2-8bb16e15e6c1-kube-api-access-248ht\") pod \"2401beb9-38b8-4581-b9a2-8bb16e15e6c1\" (UID: \"2401beb9-38b8-4581-b9a2-8bb16e15e6c1\") " Dec 03 07:01:32 crc kubenswrapper[4475]: I1203 07:01:32.273956 4475 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2401beb9-38b8-4581-b9a2-8bb16e15e6c1-logs" (OuterVolumeSpecName: "logs") pod "2401beb9-38b8-4581-b9a2-8bb16e15e6c1" (UID: "2401beb9-38b8-4581-b9a2-8bb16e15e6c1"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 07:01:32 crc kubenswrapper[4475]: I1203 07:01:32.277327 4475 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2401beb9-38b8-4581-b9a2-8bb16e15e6c1-kube-api-access-248ht" (OuterVolumeSpecName: "kube-api-access-248ht") pod "2401beb9-38b8-4581-b9a2-8bb16e15e6c1" (UID: "2401beb9-38b8-4581-b9a2-8bb16e15e6c1"). InnerVolumeSpecName "kube-api-access-248ht". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 07:01:32 crc kubenswrapper[4475]: I1203 07:01:32.278599 4475 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2401beb9-38b8-4581-b9a2-8bb16e15e6c1-horizon-secret-key" (OuterVolumeSpecName: "horizon-secret-key") pod "2401beb9-38b8-4581-b9a2-8bb16e15e6c1" (UID: "2401beb9-38b8-4581-b9a2-8bb16e15e6c1"). InnerVolumeSpecName "horizon-secret-key". 
PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 03 07:01:32 crc kubenswrapper[4475]: I1203 07:01:32.292410 4475 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2401beb9-38b8-4581-b9a2-8bb16e15e6c1-config-data" (OuterVolumeSpecName: "config-data") pod "2401beb9-38b8-4581-b9a2-8bb16e15e6c1" (UID: "2401beb9-38b8-4581-b9a2-8bb16e15e6c1"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Dec 03 07:01:32 crc kubenswrapper[4475]: I1203 07:01:32.298600 4475 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2401beb9-38b8-4581-b9a2-8bb16e15e6c1-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "2401beb9-38b8-4581-b9a2-8bb16e15e6c1" (UID: "2401beb9-38b8-4581-b9a2-8bb16e15e6c1"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 03 07:01:32 crc kubenswrapper[4475]: I1203 07:01:32.303778 4475 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2401beb9-38b8-4581-b9a2-8bb16e15e6c1-scripts" (OuterVolumeSpecName: "scripts") pod "2401beb9-38b8-4581-b9a2-8bb16e15e6c1" (UID: "2401beb9-38b8-4581-b9a2-8bb16e15e6c1"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Dec 03 07:01:32 crc kubenswrapper[4475]: I1203 07:01:32.314624 4475 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2401beb9-38b8-4581-b9a2-8bb16e15e6c1-horizon-tls-certs" (OuterVolumeSpecName: "horizon-tls-certs") pod "2401beb9-38b8-4581-b9a2-8bb16e15e6c1" (UID: "2401beb9-38b8-4581-b9a2-8bb16e15e6c1"). InnerVolumeSpecName "horizon-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 03 07:01:32 crc kubenswrapper[4475]: I1203 07:01:32.375234 4475 reconciler_common.go:293] "Volume detached for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/2401beb9-38b8-4581-b9a2-8bb16e15e6c1-horizon-tls-certs\") on node \"crc\" DevicePath \"\""
Dec 03 07:01:32 crc kubenswrapper[4475]: I1203 07:01:32.375265 4475 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/2401beb9-38b8-4581-b9a2-8bb16e15e6c1-config-data\") on node \"crc\" DevicePath \"\""
Dec 03 07:01:32 crc kubenswrapper[4475]: I1203 07:01:32.375274 4475 reconciler_common.go:293] "Volume detached for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/2401beb9-38b8-4581-b9a2-8bb16e15e6c1-horizon-secret-key\") on node \"crc\" DevicePath \"\""
Dec 03 07:01:32 crc kubenswrapper[4475]: I1203 07:01:32.375298 4475 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-248ht\" (UniqueName: \"kubernetes.io/projected/2401beb9-38b8-4581-b9a2-8bb16e15e6c1-kube-api-access-248ht\") on node \"crc\" DevicePath \"\""
Dec 03 07:01:32 crc kubenswrapper[4475]: I1203 07:01:32.375309 4475 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/2401beb9-38b8-4581-b9a2-8bb16e15e6c1-scripts\") on node \"crc\" DevicePath \"\""
Dec 03 07:01:32 crc kubenswrapper[4475]: I1203 07:01:32.375317 4475 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2401beb9-38b8-4581-b9a2-8bb16e15e6c1-logs\") on node \"crc\" DevicePath \"\""
Dec 03 07:01:32 crc kubenswrapper[4475]: I1203 07:01:32.375324 4475 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2401beb9-38b8-4581-b9a2-8bb16e15e6c1-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Dec 03 07:01:32 crc kubenswrapper[4475]: I1203 07:01:32.379900 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-6ad2-account-create-update-phxm5" event={"ID":"4ad132bf-e35e-48e6-b406-0aaca969a684","Type":"ContainerDied","Data":"7f4b4c6c3e14d9a927c52851ba5d7f4d55eb209908063413fe149e8543155e51"}
Dec 03 07:01:32 crc kubenswrapper[4475]: I1203 07:01:32.379995 4475 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="7f4b4c6c3e14d9a927c52851ba5d7f4d55eb209908063413fe149e8543155e51"
Dec 03 07:01:32 crc kubenswrapper[4475]: I1203 07:01:32.380105 4475 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-6ad2-account-create-update-phxm5"
Dec 03 07:01:32 crc kubenswrapper[4475]: I1203 07:01:32.388090 4475 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-db-create-t8lwx"
Dec 03 07:01:32 crc kubenswrapper[4475]: I1203 07:01:32.388092 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-db-create-t8lwx" event={"ID":"fae26675-9566-4902-af26-0247c3f6164b","Type":"ContainerDied","Data":"85370ea0754b0be973aeffcc9728e3a53440e37e3fc0f09cded5175925662729"}
Dec 03 07:01:32 crc kubenswrapper[4475]: I1203 07:01:32.388130 4475 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="85370ea0754b0be973aeffcc9728e3a53440e37e3fc0f09cded5175925662729"
Dec 03 07:01:32 crc kubenswrapper[4475]: I1203 07:01:32.389504 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-db-create-tbtn5" event={"ID":"ece011ea-6da3-49ca-8dd8-b014f2796157","Type":"ContainerDied","Data":"487d13d0046aa7141ec10d78c2a992ae1014469d415d91effd362b4c9026717b"}
Dec 03 07:01:32 crc kubenswrapper[4475]: I1203 07:01:32.389541 4475 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="487d13d0046aa7141ec10d78c2a992ae1014469d415d91effd362b4c9026717b"
Dec 03 07:01:32 crc kubenswrapper[4475]: I1203 07:01:32.389580 4475 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-db-create-tbtn5"
Dec 03 07:01:32 crc kubenswrapper[4475]: I1203 07:01:32.391819 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"13335c23-2f7e-4387-9a60-1ba04b5fd219","Type":"ContainerStarted","Data":"d9e258fd10c1f6ee451cf5ca4e36bdb98b131bfc7e12ebf3513c0aca0a8f7a02"}
Dec 03 07:01:32 crc kubenswrapper[4475]: I1203 07:01:32.393618 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-db-create-l6t9x" event={"ID":"d5b7dc01-eee2-4b3a-b834-3092016603d3","Type":"ContainerDied","Data":"fcf179ab62a188e053c62c6bb1491e14b8b097e403b18a7c25740c633cf01bed"}
Dec 03 07:01:32 crc kubenswrapper[4475]: I1203 07:01:32.393707 4475 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="fcf179ab62a188e053c62c6bb1491e14b8b097e403b18a7c25740c633cf01bed"
Dec 03 07:01:32 crc kubenswrapper[4475]: I1203 07:01:32.393629 4475 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-db-create-l6t9x"
Dec 03 07:01:32 crc kubenswrapper[4475]: I1203 07:01:32.396518 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-4b91-account-create-update-rv6lt" event={"ID":"161d281f-1845-400c-bbda-691d6681cc69","Type":"ContainerDied","Data":"ad5ecc0de0676056789de81fd53d0087f886caee878d9ba431b1e7787e690f84"}
Dec 03 07:01:32 crc kubenswrapper[4475]: I1203 07:01:32.396568 4475 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ad5ecc0de0676056789de81fd53d0087f886caee878d9ba431b1e7787e690f84"
Dec 03 07:01:32 crc kubenswrapper[4475]: I1203 07:01:32.396604 4475 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-4b91-account-create-update-rv6lt"
Dec 03 07:01:32 crc kubenswrapper[4475]: I1203 07:01:32.399919 4475 generic.go:334] "Generic (PLEG): container finished" podID="2401beb9-38b8-4581-b9a2-8bb16e15e6c1" containerID="73f4e1b7c5c20603207f1f81c578b3f50cf378e84d17d25eb6f5c11f11d4ea02" exitCode=137
Dec 03 07:01:32 crc kubenswrapper[4475]: I1203 07:01:32.399992 4475 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-7fc4d79b88-s8hhg"
Dec 03 07:01:32 crc kubenswrapper[4475]: I1203 07:01:32.399989 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-7fc4d79b88-s8hhg" event={"ID":"2401beb9-38b8-4581-b9a2-8bb16e15e6c1","Type":"ContainerDied","Data":"73f4e1b7c5c20603207f1f81c578b3f50cf378e84d17d25eb6f5c11f11d4ea02"}
Dec 03 07:01:32 crc kubenswrapper[4475]: I1203 07:01:32.400325 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-7fc4d79b88-s8hhg" event={"ID":"2401beb9-38b8-4581-b9a2-8bb16e15e6c1","Type":"ContainerDied","Data":"9e9d970ff8e874ce94e3c25e67e9d19e801b3a9b04bd0023ba3c2afbbc072fa6"}
Dec 03 07:01:32 crc kubenswrapper[4475]: I1203 07:01:32.400398 4475 scope.go:117] "RemoveContainer" containerID="7c91bf42ad95717e038e9fab87109b6eaa62ab56bd4ac92990e6946b07c3fc2f"
Dec 03 07:01:32 crc kubenswrapper[4475]: I1203 07:01:32.402284 4475 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-7af4-account-create-update-mpdn4"
Dec 03 07:01:32 crc kubenswrapper[4475]: I1203 07:01:32.403694 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-7af4-account-create-update-mpdn4" event={"ID":"37cea503-9fad-48b6-9ec0-b8957e5420f1","Type":"ContainerDied","Data":"97e05fed1b7456228f73b3917c4ab42512d58417ddad2ba9f235f6e6a95293bc"}
Dec 03 07:01:32 crc kubenswrapper[4475]: I1203 07:01:32.403905 4475 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="97e05fed1b7456228f73b3917c4ab42512d58417ddad2ba9f235f6e6a95293bc"
Dec 03 07:01:32 crc kubenswrapper[4475]: I1203 07:01:32.545395 4475 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-7fc4d79b88-s8hhg"]
Dec 03 07:01:32 crc kubenswrapper[4475]: I1203 07:01:32.552014 4475 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/horizon-7fc4d79b88-s8hhg"]
Dec 03 07:01:32 crc kubenswrapper[4475]: I1203 07:01:32.580645 4475 scope.go:117] "RemoveContainer" containerID="73f4e1b7c5c20603207f1f81c578b3f50cf378e84d17d25eb6f5c11f11d4ea02"
Dec 03 07:01:32 crc kubenswrapper[4475]: I1203 07:01:32.668062 4475 scope.go:117] "RemoveContainer" containerID="7c91bf42ad95717e038e9fab87109b6eaa62ab56bd4ac92990e6946b07c3fc2f"
Dec 03 07:01:32 crc kubenswrapper[4475]: E1203 07:01:32.668676 4475 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7c91bf42ad95717e038e9fab87109b6eaa62ab56bd4ac92990e6946b07c3fc2f\": container with ID starting with 7c91bf42ad95717e038e9fab87109b6eaa62ab56bd4ac92990e6946b07c3fc2f not found: ID does not exist" containerID="7c91bf42ad95717e038e9fab87109b6eaa62ab56bd4ac92990e6946b07c3fc2f"
Dec 03 07:01:32 crc kubenswrapper[4475]: I1203 07:01:32.668703 4475 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7c91bf42ad95717e038e9fab87109b6eaa62ab56bd4ac92990e6946b07c3fc2f"} err="failed to get container status \"7c91bf42ad95717e038e9fab87109b6eaa62ab56bd4ac92990e6946b07c3fc2f\": rpc error: code = NotFound desc = could not find container \"7c91bf42ad95717e038e9fab87109b6eaa62ab56bd4ac92990e6946b07c3fc2f\": container with ID starting with 7c91bf42ad95717e038e9fab87109b6eaa62ab56bd4ac92990e6946b07c3fc2f not found: ID does not exist"
Dec 03 07:01:32 crc kubenswrapper[4475]: I1203 07:01:32.668723 4475 scope.go:117] "RemoveContainer" containerID="73f4e1b7c5c20603207f1f81c578b3f50cf378e84d17d25eb6f5c11f11d4ea02"
Dec 03 07:01:32 crc kubenswrapper[4475]: E1203 07:01:32.669024 4475 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"73f4e1b7c5c20603207f1f81c578b3f50cf378e84d17d25eb6f5c11f11d4ea02\": container with ID starting with 73f4e1b7c5c20603207f1f81c578b3f50cf378e84d17d25eb6f5c11f11d4ea02 not found: ID does not exist" containerID="73f4e1b7c5c20603207f1f81c578b3f50cf378e84d17d25eb6f5c11f11d4ea02"
Dec 03 07:01:32 crc kubenswrapper[4475]: I1203 07:01:32.669061 4475 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"73f4e1b7c5c20603207f1f81c578b3f50cf378e84d17d25eb6f5c11f11d4ea02"} err="failed to get container status \"73f4e1b7c5c20603207f1f81c578b3f50cf378e84d17d25eb6f5c11f11d4ea02\": rpc error: code = NotFound desc = could not find container \"73f4e1b7c5c20603207f1f81c578b3f50cf378e84d17d25eb6f5c11f11d4ea02\": container with ID starting with 73f4e1b7c5c20603207f1f81c578b3f50cf378e84d17d25eb6f5c11f11d4ea02 not found: ID does not exist"
Dec 03 07:01:33 crc kubenswrapper[4475]: I1203 07:01:33.412085 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"13335c23-2f7e-4387-9a60-1ba04b5fd219","Type":"ContainerStarted","Data":"a76f2d2883d48181850cea17bf8459547abe1c864d9384e682905f3aa5146ea6"}
Dec 03 07:01:33 crc kubenswrapper[4475]: I1203 07:01:33.499952 4475 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2401beb9-38b8-4581-b9a2-8bb16e15e6c1" path="/var/lib/kubelet/pods/2401beb9-38b8-4581-b9a2-8bb16e15e6c1/volumes"
Dec 03 07:01:33 crc kubenswrapper[4475]: I1203 07:01:33.500727 4475 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"]
Dec 03 07:01:34 crc kubenswrapper[4475]: I1203 07:01:34.422718 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"13335c23-2f7e-4387-9a60-1ba04b5fd219","Type":"ContainerStarted","Data":"6cd554168b07d9190a7a1102cfcc2ac9ffdaecd4fbbb25b98475d746ebf7252b"}
Dec 03 07:01:34 crc kubenswrapper[4475]: I1203 07:01:34.422891 4475 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="13335c23-2f7e-4387-9a60-1ba04b5fd219" containerName="proxy-httpd" containerID="cri-o://6cd554168b07d9190a7a1102cfcc2ac9ffdaecd4fbbb25b98475d746ebf7252b" gracePeriod=30
Dec 03 07:01:34 crc kubenswrapper[4475]: I1203 07:01:34.422908 4475 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0"
Dec 03 07:01:34 crc kubenswrapper[4475]: I1203 07:01:34.422987 4475 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="13335c23-2f7e-4387-9a60-1ba04b5fd219" containerName="ceilometer-central-agent" containerID="cri-o://0c5c7e67a2661b6d65ea6d4a1427e12fb982eafc0a00337d7b391a6845076211" gracePeriod=30
Dec 03 07:01:34 crc kubenswrapper[4475]: I1203 07:01:34.423004 4475 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="13335c23-2f7e-4387-9a60-1ba04b5fd219" containerName="sg-core" containerID="cri-o://a76f2d2883d48181850cea17bf8459547abe1c864d9384e682905f3aa5146ea6" gracePeriod=30
Dec 03 07:01:34 crc kubenswrapper[4475]: I1203 07:01:34.423048 4475 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="13335c23-2f7e-4387-9a60-1ba04b5fd219" containerName="ceilometer-notification-agent" containerID="cri-o://d9e258fd10c1f6ee451cf5ca4e36bdb98b131bfc7e12ebf3513c0aca0a8f7a02" gracePeriod=30
Dec 03 07:01:34 crc kubenswrapper[4475]: I1203 07:01:34.622441 4475 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/swift-proxy-88869455c-74p7r"
Dec 03 07:01:34 crc kubenswrapper[4475]: I1203 07:01:34.625927 4475 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/swift-proxy-88869455c-74p7r"
Dec 03 07:01:34 crc kubenswrapper[4475]: I1203 07:01:34.653469 4475 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.666643258 podStartE2EDuration="7.653438475s" podCreationTimestamp="2025-12-03 07:01:27 +0000 UTC" firstStartedPulling="2025-12-03 07:01:28.905051272 +0000 UTC m=+973.709949605" lastFinishedPulling="2025-12-03 07:01:33.891846488 +0000 UTC m=+978.696744822" observedRunningTime="2025-12-03 07:01:34.449406902 +0000 UTC m=+979.254305237" watchObservedRunningTime="2025-12-03 07:01:34.653438475 +0000 UTC m=+979.458336808"
Dec 03 07:01:35 crc kubenswrapper[4475]: I1203 07:01:35.432982 4475 generic.go:334] "Generic (PLEG): container finished" podID="13335c23-2f7e-4387-9a60-1ba04b5fd219" containerID="6cd554168b07d9190a7a1102cfcc2ac9ffdaecd4fbbb25b98475d746ebf7252b" exitCode=0
Dec 03 07:01:35 crc kubenswrapper[4475]: I1203 07:01:35.433186 4475 generic.go:334] "Generic (PLEG): container finished" podID="13335c23-2f7e-4387-9a60-1ba04b5fd219" containerID="a76f2d2883d48181850cea17bf8459547abe1c864d9384e682905f3aa5146ea6" exitCode=2
Dec 03 07:01:35 crc kubenswrapper[4475]: I1203 07:01:35.433194 4475 generic.go:334] "Generic (PLEG): container finished" podID="13335c23-2f7e-4387-9a60-1ba04b5fd219" containerID="d9e258fd10c1f6ee451cf5ca4e36bdb98b131bfc7e12ebf3513c0aca0a8f7a02" exitCode=0
Dec 03 07:01:35 crc kubenswrapper[4475]: I1203 07:01:35.433761 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"13335c23-2f7e-4387-9a60-1ba04b5fd219","Type":"ContainerDied","Data":"6cd554168b07d9190a7a1102cfcc2ac9ffdaecd4fbbb25b98475d746ebf7252b"}
Dec 03 07:01:35 crc kubenswrapper[4475]: I1203 07:01:35.433786 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"13335c23-2f7e-4387-9a60-1ba04b5fd219","Type":"ContainerDied","Data":"a76f2d2883d48181850cea17bf8459547abe1c864d9384e682905f3aa5146ea6"}
Dec 03 07:01:35 crc kubenswrapper[4475]: I1203 07:01:35.433796 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"13335c23-2f7e-4387-9a60-1ba04b5fd219","Type":"ContainerDied","Data":"d9e258fd10c1f6ee451cf5ca4e36bdb98b131bfc7e12ebf3513c0aca0a8f7a02"}
Dec 03 07:01:35 crc kubenswrapper[4475]: I1203 07:01:35.923507 4475 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/heat-cfnapi-58d4897ff7-tjzk8" podUID="6b5cf0ec-4d62-4de9-ad66-f758113443e7" containerName="heat-cfnapi" probeResult="failure" output="Get \"http://10.217.0.173:8000/healthcheck\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)"
Dec 03 07:01:36 crc kubenswrapper[4475]: I1203 07:01:36.050861 4475 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/heat-api-5869676b4b-mfgq2"
Dec 03 07:01:36 crc kubenswrapper[4475]: I1203 07:01:36.088082 4475 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/heat-api-74bbb49977-5dq74"]
Dec 03 07:01:36 crc kubenswrapper[4475]: I1203 07:01:36.196118 4475 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/heat-api-6c87685856-7spht" podUID="c61afd1c-48a7-47f3-b27d-6c6b56e4ce86" containerName="heat-api" probeResult="failure" output="Get \"http://10.217.0.175:8004/healthcheck\": dial tcp 10.217.0.175:8004: i/o timeout (Client.Timeout exceeded while awaiting headers)"
Dec 03 07:01:36 crc kubenswrapper[4475]: I1203 07:01:36.401847 4475 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/heat-cfnapi-8465d48f48-kttgq"
Dec 03 07:01:36 crc kubenswrapper[4475]: I1203 07:01:36.453623 4475 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/heat-cfnapi-7d7bff4588-wzmxn"]
Dec 03 07:01:36 crc kubenswrapper[4475]: I1203 07:01:36.520065 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-api-74bbb49977-5dq74" event={"ID":"b58fd1e8-c5b6-4a83-b9b0-15c6eb22495d","Type":"ContainerDied","Data":"c76c2c59dd30460e463bfa5c684554f29958402a2793e148c47355f7f48c801d"}
Dec 03 07:01:36 crc kubenswrapper[4475]: I1203 07:01:36.520299 4475 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c76c2c59dd30460e463bfa5c684554f29958402a2793e148c47355f7f48c801d"
Dec 03 07:01:36 crc kubenswrapper[4475]: I1203 07:01:36.520410 4475 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/heat-api-74bbb49977-5dq74"
Dec 03 07:01:36 crc kubenswrapper[4475]: I1203 07:01:36.692666 4475 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/b58fd1e8-c5b6-4a83-b9b0-15c6eb22495d-config-data-custom\") pod \"b58fd1e8-c5b6-4a83-b9b0-15c6eb22495d\" (UID: \"b58fd1e8-c5b6-4a83-b9b0-15c6eb22495d\") "
Dec 03 07:01:36 crc kubenswrapper[4475]: I1203 07:01:36.693051 4475 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b58fd1e8-c5b6-4a83-b9b0-15c6eb22495d-combined-ca-bundle\") pod \"b58fd1e8-c5b6-4a83-b9b0-15c6eb22495d\" (UID: \"b58fd1e8-c5b6-4a83-b9b0-15c6eb22495d\") "
Dec 03 07:01:36 crc kubenswrapper[4475]: I1203 07:01:36.693104 4475 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b58fd1e8-c5b6-4a83-b9b0-15c6eb22495d-config-data\") pod \"b58fd1e8-c5b6-4a83-b9b0-15c6eb22495d\" (UID: \"b58fd1e8-c5b6-4a83-b9b0-15c6eb22495d\") "
Dec 03 07:01:36 crc kubenswrapper[4475]: I1203 07:01:36.693205 4475 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cdv2b\" (UniqueName: \"kubernetes.io/projected/b58fd1e8-c5b6-4a83-b9b0-15c6eb22495d-kube-api-access-cdv2b\") pod \"b58fd1e8-c5b6-4a83-b9b0-15c6eb22495d\" (UID: \"b58fd1e8-c5b6-4a83-b9b0-15c6eb22495d\") "
Dec 03 07:01:36 crc kubenswrapper[4475]: I1203 07:01:36.702701 4475 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b58fd1e8-c5b6-4a83-b9b0-15c6eb22495d-kube-api-access-cdv2b" (OuterVolumeSpecName: "kube-api-access-cdv2b") pod "b58fd1e8-c5b6-4a83-b9b0-15c6eb22495d" (UID: "b58fd1e8-c5b6-4a83-b9b0-15c6eb22495d"). InnerVolumeSpecName "kube-api-access-cdv2b". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 03 07:01:36 crc kubenswrapper[4475]: I1203 07:01:36.707542 4475 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b58fd1e8-c5b6-4a83-b9b0-15c6eb22495d-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "b58fd1e8-c5b6-4a83-b9b0-15c6eb22495d" (UID: "b58fd1e8-c5b6-4a83-b9b0-15c6eb22495d"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 03 07:01:36 crc kubenswrapper[4475]: I1203 07:01:36.810175 4475 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/b58fd1e8-c5b6-4a83-b9b0-15c6eb22495d-config-data-custom\") on node \"crc\" DevicePath \"\""
Dec 03 07:01:36 crc kubenswrapper[4475]: I1203 07:01:36.810206 4475 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cdv2b\" (UniqueName: \"kubernetes.io/projected/b58fd1e8-c5b6-4a83-b9b0-15c6eb22495d-kube-api-access-cdv2b\") on node \"crc\" DevicePath \"\""
Dec 03 07:01:36 crc kubenswrapper[4475]: I1203 07:01:36.813760 4475 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b58fd1e8-c5b6-4a83-b9b0-15c6eb22495d-config-data" (OuterVolumeSpecName: "config-data") pod "b58fd1e8-c5b6-4a83-b9b0-15c6eb22495d" (UID: "b58fd1e8-c5b6-4a83-b9b0-15c6eb22495d"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 03 07:01:36 crc kubenswrapper[4475]: I1203 07:01:36.819073 4475 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b58fd1e8-c5b6-4a83-b9b0-15c6eb22495d-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "b58fd1e8-c5b6-4a83-b9b0-15c6eb22495d" (UID: "b58fd1e8-c5b6-4a83-b9b0-15c6eb22495d"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 03 07:01:36 crc kubenswrapper[4475]: I1203 07:01:36.833854 4475 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/heat-cfnapi-7d7bff4588-wzmxn"
Dec 03 07:01:36 crc kubenswrapper[4475]: I1203 07:01:36.912076 4475 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/88fc5e9a-b3f3-421c-9b93-939e11a9f81f-config-data-custom\") pod \"88fc5e9a-b3f3-421c-9b93-939e11a9f81f\" (UID: \"88fc5e9a-b3f3-421c-9b93-939e11a9f81f\") "
Dec 03 07:01:36 crc kubenswrapper[4475]: I1203 07:01:36.912124 4475 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/88fc5e9a-b3f3-421c-9b93-939e11a9f81f-combined-ca-bundle\") pod \"88fc5e9a-b3f3-421c-9b93-939e11a9f81f\" (UID: \"88fc5e9a-b3f3-421c-9b93-939e11a9f81f\") "
Dec 03 07:01:36 crc kubenswrapper[4475]: I1203 07:01:36.912280 4475 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dhg57\" (UniqueName: \"kubernetes.io/projected/88fc5e9a-b3f3-421c-9b93-939e11a9f81f-kube-api-access-dhg57\") pod \"88fc5e9a-b3f3-421c-9b93-939e11a9f81f\" (UID: \"88fc5e9a-b3f3-421c-9b93-939e11a9f81f\") "
Dec 03 07:01:36 crc kubenswrapper[4475]: I1203 07:01:36.912342 4475 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/88fc5e9a-b3f3-421c-9b93-939e11a9f81f-config-data\") pod \"88fc5e9a-b3f3-421c-9b93-939e11a9f81f\" (UID: \"88fc5e9a-b3f3-421c-9b93-939e11a9f81f\") "
Dec 03 07:01:36 crc kubenswrapper[4475]: I1203 07:01:36.912824 4475 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b58fd1e8-c5b6-4a83-b9b0-15c6eb22495d-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Dec 03 07:01:36 crc kubenswrapper[4475]: I1203 07:01:36.912841 4475 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b58fd1e8-c5b6-4a83-b9b0-15c6eb22495d-config-data\") on node \"crc\" DevicePath \"\""
Dec 03 07:01:36 crc kubenswrapper[4475]: I1203 07:01:36.915226 4475 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/88fc5e9a-b3f3-421c-9b93-939e11a9f81f-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "88fc5e9a-b3f3-421c-9b93-939e11a9f81f" (UID: "88fc5e9a-b3f3-421c-9b93-939e11a9f81f"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 03 07:01:36 crc kubenswrapper[4475]: I1203 07:01:36.916885 4475 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/88fc5e9a-b3f3-421c-9b93-939e11a9f81f-kube-api-access-dhg57" (OuterVolumeSpecName: "kube-api-access-dhg57") pod "88fc5e9a-b3f3-421c-9b93-939e11a9f81f" (UID: "88fc5e9a-b3f3-421c-9b93-939e11a9f81f"). InnerVolumeSpecName "kube-api-access-dhg57". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 03 07:01:36 crc kubenswrapper[4475]: I1203 07:01:36.935058 4475 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/88fc5e9a-b3f3-421c-9b93-939e11a9f81f-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "88fc5e9a-b3f3-421c-9b93-939e11a9f81f" (UID: "88fc5e9a-b3f3-421c-9b93-939e11a9f81f"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 03 07:01:36 crc kubenswrapper[4475]: I1203 07:01:36.958398 4475 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/88fc5e9a-b3f3-421c-9b93-939e11a9f81f-config-data" (OuterVolumeSpecName: "config-data") pod "88fc5e9a-b3f3-421c-9b93-939e11a9f81f" (UID: "88fc5e9a-b3f3-421c-9b93-939e11a9f81f"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 03 07:01:37 crc kubenswrapper[4475]: I1203 07:01:37.014595 4475 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/88fc5e9a-b3f3-421c-9b93-939e11a9f81f-config-data-custom\") on node \"crc\" DevicePath \"\""
Dec 03 07:01:37 crc kubenswrapper[4475]: I1203 07:01:37.014627 4475 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/88fc5e9a-b3f3-421c-9b93-939e11a9f81f-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Dec 03 07:01:37 crc kubenswrapper[4475]: I1203 07:01:37.014646 4475 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dhg57\" (UniqueName: \"kubernetes.io/projected/88fc5e9a-b3f3-421c-9b93-939e11a9f81f-kube-api-access-dhg57\") on node \"crc\" DevicePath \"\""
Dec 03 07:01:37 crc kubenswrapper[4475]: I1203 07:01:37.014658 4475 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/88fc5e9a-b3f3-421c-9b93-939e11a9f81f-config-data\") on node \"crc\" DevicePath \"\""
Dec 03 07:01:37 crc kubenswrapper[4475]: I1203 07:01:37.057850 4475 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/heat-engine-65549d6dc8-kp5n5"
Dec 03 07:01:37 crc kubenswrapper[4475]: I1203 07:01:37.100373 4475 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/heat-engine-74fdff459b-tj7xb"]
Dec 03 07:01:37 crc kubenswrapper[4475]: I1203 07:01:37.100610 4475 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/heat-engine-74fdff459b-tj7xb" podUID="47298c15-f6d2-463f-a97b-1b4d63999b81" containerName="heat-engine" containerID="cri-o://0453b3dcee79a0359c57a479abaf3ac45f6a7778198ee6091769d53243ac05d2" gracePeriod=60
Dec 03 07:01:37 crc kubenswrapper[4475]: I1203 07:01:37.472964 4475 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-conductor-db-sync-m4crf"]
Dec 03 07:01:37 crc kubenswrapper[4475]: E1203 07:01:37.473320 4475 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4ad132bf-e35e-48e6-b406-0aaca969a684" containerName="mariadb-account-create-update"
Dec 03 07:01:37 crc kubenswrapper[4475]: I1203 07:01:37.473333 4475 state_mem.go:107] "Deleted CPUSet assignment" podUID="4ad132bf-e35e-48e6-b406-0aaca969a684" containerName="mariadb-account-create-update"
Dec 03 07:01:37 crc kubenswrapper[4475]: E1203 07:01:37.473354 4475 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fae26675-9566-4902-af26-0247c3f6164b" containerName="mariadb-database-create"
Dec 03 07:01:37 crc kubenswrapper[4475]: I1203 07:01:37.473360 4475 state_mem.go:107] "Deleted CPUSet assignment" podUID="fae26675-9566-4902-af26-0247c3f6164b" containerName="mariadb-database-create"
Dec 03 07:01:37 crc kubenswrapper[4475]: E1203 07:01:37.473370 4475 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="37cea503-9fad-48b6-9ec0-b8957e5420f1" containerName="mariadb-account-create-update"
Dec 03 07:01:37 crc kubenswrapper[4475]: I1203 07:01:37.473375 4475 state_mem.go:107] "Deleted CPUSet assignment" podUID="37cea503-9fad-48b6-9ec0-b8957e5420f1" containerName="mariadb-account-create-update"
Dec 03 07:01:37 crc kubenswrapper[4475]: E1203 07:01:37.473383 4475 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2401beb9-38b8-4581-b9a2-8bb16e15e6c1" containerName="horizon"
Dec 03 07:01:37 crc kubenswrapper[4475]: I1203 07:01:37.473389 4475 state_mem.go:107] "Deleted CPUSet assignment" podUID="2401beb9-38b8-4581-b9a2-8bb16e15e6c1" containerName="horizon"
Dec 03 07:01:37 crc kubenswrapper[4475]: E1203 07:01:37.473399 4475 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2401beb9-38b8-4581-b9a2-8bb16e15e6c1" containerName="horizon-log"
Dec 03 07:01:37 crc kubenswrapper[4475]: I1203 07:01:37.473405 4475 state_mem.go:107] "Deleted CPUSet assignment" podUID="2401beb9-38b8-4581-b9a2-8bb16e15e6c1" containerName="horizon-log"
Dec 03 07:01:37 crc kubenswrapper[4475]: E1203 07:01:37.473412 4475 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b58fd1e8-c5b6-4a83-b9b0-15c6eb22495d" containerName="heat-api"
Dec 03 07:01:37 crc kubenswrapper[4475]: I1203 07:01:37.473418 4475 state_mem.go:107] "Deleted CPUSet assignment" podUID="b58fd1e8-c5b6-4a83-b9b0-15c6eb22495d" containerName="heat-api"
Dec 03 07:01:37 crc kubenswrapper[4475]: E1203 07:01:37.473425 4475 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b58fd1e8-c5b6-4a83-b9b0-15c6eb22495d" containerName="heat-api"
Dec 03 07:01:37 crc kubenswrapper[4475]: I1203 07:01:37.473430 4475 state_mem.go:107] "Deleted CPUSet assignment" podUID="b58fd1e8-c5b6-4a83-b9b0-15c6eb22495d" containerName="heat-api"
Dec 03 07:01:37 crc kubenswrapper[4475]: E1203 07:01:37.473437 4475 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="161d281f-1845-400c-bbda-691d6681cc69" containerName="mariadb-account-create-update"
Dec 03 07:01:37 crc kubenswrapper[4475]: I1203 07:01:37.473442 4475 state_mem.go:107] "Deleted CPUSet assignment" podUID="161d281f-1845-400c-bbda-691d6681cc69" containerName="mariadb-account-create-update"
Dec 03 07:01:37 crc kubenswrapper[4475]: E1203 07:01:37.473465 4475 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="88fc5e9a-b3f3-421c-9b93-939e11a9f81f" containerName="heat-cfnapi"
Dec 03 07:01:37 crc kubenswrapper[4475]: I1203 07:01:37.473471 4475 state_mem.go:107] "Deleted CPUSet assignment" podUID="88fc5e9a-b3f3-421c-9b93-939e11a9f81f" containerName="heat-cfnapi"
Dec 03 07:01:37 crc kubenswrapper[4475]: E1203 07:01:37.473482 4475 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d5b7dc01-eee2-4b3a-b834-3092016603d3" containerName="mariadb-database-create"
Dec 03 07:01:37 crc kubenswrapper[4475]: I1203 07:01:37.473488 4475 state_mem.go:107] "Deleted CPUSet assignment" podUID="d5b7dc01-eee2-4b3a-b834-3092016603d3" containerName="mariadb-database-create"
Dec 03 07:01:37 crc kubenswrapper[4475]: E1203 07:01:37.473495 4475 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="88fc5e9a-b3f3-421c-9b93-939e11a9f81f" containerName="heat-cfnapi"
Dec 03 07:01:37 crc kubenswrapper[4475]: I1203 07:01:37.473501 4475 state_mem.go:107] "Deleted CPUSet assignment" podUID="88fc5e9a-b3f3-421c-9b93-939e11a9f81f" containerName="heat-cfnapi"
Dec 03 07:01:37 crc kubenswrapper[4475]: E1203 07:01:37.473511 4475 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ece011ea-6da3-49ca-8dd8-b014f2796157" containerName="mariadb-database-create"
Dec 03 07:01:37 crc kubenswrapper[4475]: I1203 07:01:37.473516 4475 state_mem.go:107] "Deleted CPUSet assignment" podUID="ece011ea-6da3-49ca-8dd8-b014f2796157" containerName="mariadb-database-create"
Dec 03 07:01:37 crc kubenswrapper[4475]: I1203 07:01:37.473690 4475 memory_manager.go:354] "RemoveStaleState removing state" podUID="37cea503-9fad-48b6-9ec0-b8957e5420f1" containerName="mariadb-account-create-update"
Dec 03 07:01:37 crc kubenswrapper[4475]: I1203 07:01:37.473705 4475 memory_manager.go:354] "RemoveStaleState removing state" podUID="88fc5e9a-b3f3-421c-9b93-939e11a9f81f" containerName="heat-cfnapi"
Dec 03 07:01:37 crc kubenswrapper[4475]: I1203 07:01:37.473713 4475 memory_manager.go:354] "RemoveStaleState removing state" podUID="d5b7dc01-eee2-4b3a-b834-3092016603d3" containerName="mariadb-database-create"
Dec 03 07:01:37 crc kubenswrapper[4475]: I1203 07:01:37.473724 4475 memory_manager.go:354] "RemoveStaleState removing state" podUID="fae26675-9566-4902-af26-0247c3f6164b" containerName="mariadb-database-create"
Dec 03 07:01:37 crc kubenswrapper[4475]: I1203 07:01:37.473732 4475 memory_manager.go:354] "RemoveStaleState removing state" podUID="161d281f-1845-400c-bbda-691d6681cc69" containerName="mariadb-account-create-update"
Dec 03 07:01:37 crc kubenswrapper[4475]: I1203 07:01:37.473742 4475 memory_manager.go:354] "RemoveStaleState removing state" podUID="2401beb9-38b8-4581-b9a2-8bb16e15e6c1" containerName="horizon"
Dec 03 07:01:37 crc kubenswrapper[4475]: I1203 07:01:37.473752 4475 memory_manager.go:354] "RemoveStaleState removing state" podUID="b58fd1e8-c5b6-4a83-b9b0-15c6eb22495d" containerName="heat-api"
Dec 03 07:01:37 crc kubenswrapper[4475]: I1203 07:01:37.473759 4475 memory_manager.go:354] "RemoveStaleState removing state" podUID="2401beb9-38b8-4581-b9a2-8bb16e15e6c1" containerName="horizon-log"
Dec 03 07:01:37 crc kubenswrapper[4475]: I1203 07:01:37.473767 4475 memory_manager.go:354] "RemoveStaleState removing state" podUID="4ad132bf-e35e-48e6-b406-0aaca969a684" containerName="mariadb-account-create-update"
Dec 03 07:01:37 crc kubenswrapper[4475]: I1203 07:01:37.473776 4475 memory_manager.go:354] "RemoveStaleState removing state" podUID="ece011ea-6da3-49ca-8dd8-b014f2796157" containerName="mariadb-database-create"
Dec 03 07:01:37 crc kubenswrapper[4475]: I1203 07:01:37.473785 4475 memory_manager.go:354] "RemoveStaleState removing state" podUID="b58fd1e8-c5b6-4a83-b9b0-15c6eb22495d" containerName="heat-api"
Dec 03 07:01:37 crc kubenswrapper[4475]: I1203 07:01:37.474337 4475 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-m4crf"
Dec 03 07:01:37 crc kubenswrapper[4475]: I1203 07:01:37.478340 4475 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-conductor-scripts"
Dec 03 07:01:37 crc kubenswrapper[4475]: I1203 07:01:37.478680 4475 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-nova-dockercfg-gpl6m"
Dec 03 07:01:37 crc kubenswrapper[4475]: I1203 07:01:37.478738 4475 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-conductor-config-data"
Dec 03 07:01:37 crc kubenswrapper[4475]: I1203 07:01:37.508895 4475 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-m4crf"]
Dec 03 07:01:37 crc kubenswrapper[4475]: I1203 07:01:37.530739 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4afae9fe-ad3f-48d5-a095-9474568f956c-combined-ca-bundle\") pod \"nova-cell0-conductor-db-sync-m4crf\" (UID: \"4afae9fe-ad3f-48d5-a095-9474568f956c\") " pod="openstack/nova-cell0-conductor-db-sync-m4crf"
Dec 03 07:01:37 crc kubenswrapper[4475]: I1203 07:01:37.530807 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4afae9fe-ad3f-48d5-a095-9474568f956c-config-data\") pod \"nova-cell0-conductor-db-sync-m4crf\" (UID: \"4afae9fe-ad3f-48d5-a095-9474568f956c\") " pod="openstack/nova-cell0-conductor-db-sync-m4crf"
Dec 03 07:01:37 crc kubenswrapper[4475]: I1203 07:01:37.530863 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jn4lz\" (UniqueName: \"kubernetes.io/projected/4afae9fe-ad3f-48d5-a095-9474568f956c-kube-api-access-jn4lz\") pod \"nova-cell0-conductor-db-sync-m4crf\" (UID: \"4afae9fe-ad3f-48d5-a095-9474568f956c\") " pod="openstack/nova-cell0-conductor-db-sync-m4crf"
Dec 03 07:01:37 crc kubenswrapper[4475]: I1203 07:01:37.530936 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4afae9fe-ad3f-48d5-a095-9474568f956c-scripts\") pod \"nova-cell0-conductor-db-sync-m4crf\" (UID: \"4afae9fe-ad3f-48d5-a095-9474568f956c\") " pod="openstack/nova-cell0-conductor-db-sync-m4crf"
Dec 03 07:01:37 crc kubenswrapper[4475]: I1203 07:01:37.539082 4475 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/heat-api-74bbb49977-5dq74"
Dec 03 07:01:37 crc kubenswrapper[4475]: I1203 07:01:37.539119 4475 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/heat-cfnapi-7d7bff4588-wzmxn"
Dec 03 07:01:37 crc kubenswrapper[4475]: I1203 07:01:37.539170 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-cfnapi-7d7bff4588-wzmxn" event={"ID":"88fc5e9a-b3f3-421c-9b93-939e11a9f81f","Type":"ContainerDied","Data":"3410391fa4ebdc50afa0c5a122db210fe0a99a2fd7d39766e2a76ac7b1b781f8"}
Dec 03 07:01:37 crc kubenswrapper[4475]: I1203 07:01:37.539208 4475 scope.go:117] "RemoveContainer" containerID="838d039cf647c8f22883f9643219fa6c5e9de457dc722ec2ba0e17526c6ff11f"
Dec 03 07:01:37 crc kubenswrapper[4475]: I1203 07:01:37.570172 4475 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/heat-api-74bbb49977-5dq74"]
Dec 03 07:01:37 crc kubenswrapper[4475]: I1203 07:01:37.581200 4475 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/heat-api-74bbb49977-5dq74"]
Dec 03 07:01:37 crc kubenswrapper[4475]: I1203 07:01:37.604052 4475 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/heat-cfnapi-7d7bff4588-wzmxn"]
Dec 03 07:01:37 crc kubenswrapper[4475]: I1203 07:01:37.610526 4475 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/heat-cfnapi-7d7bff4588-wzmxn"]
Dec 03 07:01:37 crc 
kubenswrapper[4475]: I1203 07:01:37.633704 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4afae9fe-ad3f-48d5-a095-9474568f956c-scripts\") pod \"nova-cell0-conductor-db-sync-m4crf\" (UID: \"4afae9fe-ad3f-48d5-a095-9474568f956c\") " pod="openstack/nova-cell0-conductor-db-sync-m4crf" Dec 03 07:01:37 crc kubenswrapper[4475]: I1203 07:01:37.633794 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4afae9fe-ad3f-48d5-a095-9474568f956c-combined-ca-bundle\") pod \"nova-cell0-conductor-db-sync-m4crf\" (UID: \"4afae9fe-ad3f-48d5-a095-9474568f956c\") " pod="openstack/nova-cell0-conductor-db-sync-m4crf" Dec 03 07:01:37 crc kubenswrapper[4475]: I1203 07:01:37.633863 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4afae9fe-ad3f-48d5-a095-9474568f956c-config-data\") pod \"nova-cell0-conductor-db-sync-m4crf\" (UID: \"4afae9fe-ad3f-48d5-a095-9474568f956c\") " pod="openstack/nova-cell0-conductor-db-sync-m4crf" Dec 03 07:01:37 crc kubenswrapper[4475]: I1203 07:01:37.633937 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jn4lz\" (UniqueName: \"kubernetes.io/projected/4afae9fe-ad3f-48d5-a095-9474568f956c-kube-api-access-jn4lz\") pod \"nova-cell0-conductor-db-sync-m4crf\" (UID: \"4afae9fe-ad3f-48d5-a095-9474568f956c\") " pod="openstack/nova-cell0-conductor-db-sync-m4crf" Dec 03 07:01:37 crc kubenswrapper[4475]: I1203 07:01:37.642536 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4afae9fe-ad3f-48d5-a095-9474568f956c-config-data\") pod \"nova-cell0-conductor-db-sync-m4crf\" (UID: \"4afae9fe-ad3f-48d5-a095-9474568f956c\") " pod="openstack/nova-cell0-conductor-db-sync-m4crf" Dec 03 07:01:37 crc 
kubenswrapper[4475]: I1203 07:01:37.657926 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4afae9fe-ad3f-48d5-a095-9474568f956c-combined-ca-bundle\") pod \"nova-cell0-conductor-db-sync-m4crf\" (UID: \"4afae9fe-ad3f-48d5-a095-9474568f956c\") " pod="openstack/nova-cell0-conductor-db-sync-m4crf" Dec 03 07:01:37 crc kubenswrapper[4475]: I1203 07:01:37.658715 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4afae9fe-ad3f-48d5-a095-9474568f956c-scripts\") pod \"nova-cell0-conductor-db-sync-m4crf\" (UID: \"4afae9fe-ad3f-48d5-a095-9474568f956c\") " pod="openstack/nova-cell0-conductor-db-sync-m4crf" Dec 03 07:01:37 crc kubenswrapper[4475]: I1203 07:01:37.660918 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jn4lz\" (UniqueName: \"kubernetes.io/projected/4afae9fe-ad3f-48d5-a095-9474568f956c-kube-api-access-jn4lz\") pod \"nova-cell0-conductor-db-sync-m4crf\" (UID: \"4afae9fe-ad3f-48d5-a095-9474568f956c\") " pod="openstack/nova-cell0-conductor-db-sync-m4crf" Dec 03 07:01:37 crc kubenswrapper[4475]: I1203 07:01:37.791723 4475 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-m4crf" Dec 03 07:01:38 crc kubenswrapper[4475]: I1203 07:01:38.298883 4475 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-m4crf"] Dec 03 07:01:38 crc kubenswrapper[4475]: W1203 07:01:38.304492 4475 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod4afae9fe_ad3f_48d5_a095_9474568f956c.slice/crio-4e3869b449c0f2a03cab91aaff4db98b404a0acd4977292b76bd78429d5a97e3 WatchSource:0}: Error finding container 4e3869b449c0f2a03cab91aaff4db98b404a0acd4977292b76bd78429d5a97e3: Status 404 returned error can't find the container with id 4e3869b449c0f2a03cab91aaff4db98b404a0acd4977292b76bd78429d5a97e3 Dec 03 07:01:38 crc kubenswrapper[4475]: I1203 07:01:38.547386 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-m4crf" event={"ID":"4afae9fe-ad3f-48d5-a095-9474568f956c","Type":"ContainerStarted","Data":"4e3869b449c0f2a03cab91aaff4db98b404a0acd4977292b76bd78429d5a97e3"} Dec 03 07:01:39 crc kubenswrapper[4475]: I1203 07:01:39.504631 4475 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="88fc5e9a-b3f3-421c-9b93-939e11a9f81f" path="/var/lib/kubelet/pods/88fc5e9a-b3f3-421c-9b93-939e11a9f81f/volumes" Dec 03 07:01:39 crc kubenswrapper[4475]: I1203 07:01:39.505899 4475 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b58fd1e8-c5b6-4a83-b9b0-15c6eb22495d" path="/var/lib/kubelet/pods/b58fd1e8-c5b6-4a83-b9b0-15c6eb22495d/volumes" Dec 03 07:01:40 crc kubenswrapper[4475]: I1203 07:01:40.487867 4475 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/cinder-api-0" Dec 03 07:01:40 crc kubenswrapper[4475]: E1203 07:01:40.892542 4475 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is 
stopping, stdout: , stderr: , exit code -1" containerID="0453b3dcee79a0359c57a479abaf3ac45f6a7778198ee6091769d53243ac05d2" cmd=["/usr/bin/pgrep","-r","DRST","heat-engine"] Dec 03 07:01:40 crc kubenswrapper[4475]: E1203 07:01:40.903994 4475 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="0453b3dcee79a0359c57a479abaf3ac45f6a7778198ee6091769d53243ac05d2" cmd=["/usr/bin/pgrep","-r","DRST","heat-engine"] Dec 03 07:01:40 crc kubenswrapper[4475]: E1203 07:01:40.910543 4475 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="0453b3dcee79a0359c57a479abaf3ac45f6a7778198ee6091769d53243ac05d2" cmd=["/usr/bin/pgrep","-r","DRST","heat-engine"] Dec 03 07:01:40 crc kubenswrapper[4475]: E1203 07:01:40.910615 4475 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/heat-engine-74fdff459b-tj7xb" podUID="47298c15-f6d2-463f-a97b-1b4d63999b81" containerName="heat-engine" Dec 03 07:01:41 crc kubenswrapper[4475]: I1203 07:01:41.590000 4475 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Dec 03 07:01:41 crc kubenswrapper[4475]: I1203 07:01:41.624049 4475 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/13335c23-2f7e-4387-9a60-1ba04b5fd219-config-data\") pod \"13335c23-2f7e-4387-9a60-1ba04b5fd219\" (UID: \"13335c23-2f7e-4387-9a60-1ba04b5fd219\") " Dec 03 07:01:41 crc kubenswrapper[4475]: I1203 07:01:41.624158 4475 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/13335c23-2f7e-4387-9a60-1ba04b5fd219-run-httpd\") pod \"13335c23-2f7e-4387-9a60-1ba04b5fd219\" (UID: \"13335c23-2f7e-4387-9a60-1ba04b5fd219\") " Dec 03 07:01:41 crc kubenswrapper[4475]: I1203 07:01:41.624187 4475 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/13335c23-2f7e-4387-9a60-1ba04b5fd219-scripts\") pod \"13335c23-2f7e-4387-9a60-1ba04b5fd219\" (UID: \"13335c23-2f7e-4387-9a60-1ba04b5fd219\") " Dec 03 07:01:41 crc kubenswrapper[4475]: I1203 07:01:41.624411 4475 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8zdcb\" (UniqueName: \"kubernetes.io/projected/13335c23-2f7e-4387-9a60-1ba04b5fd219-kube-api-access-8zdcb\") pod \"13335c23-2f7e-4387-9a60-1ba04b5fd219\" (UID: \"13335c23-2f7e-4387-9a60-1ba04b5fd219\") " Dec 03 07:01:41 crc kubenswrapper[4475]: I1203 07:01:41.624502 4475 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/13335c23-2f7e-4387-9a60-1ba04b5fd219-log-httpd\") pod \"13335c23-2f7e-4387-9a60-1ba04b5fd219\" (UID: \"13335c23-2f7e-4387-9a60-1ba04b5fd219\") " Dec 03 07:01:41 crc kubenswrapper[4475]: I1203 07:01:41.624535 4475 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: 
\"kubernetes.io/secret/13335c23-2f7e-4387-9a60-1ba04b5fd219-sg-core-conf-yaml\") pod \"13335c23-2f7e-4387-9a60-1ba04b5fd219\" (UID: \"13335c23-2f7e-4387-9a60-1ba04b5fd219\") " Dec 03 07:01:41 crc kubenswrapper[4475]: I1203 07:01:41.624553 4475 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/13335c23-2f7e-4387-9a60-1ba04b5fd219-combined-ca-bundle\") pod \"13335c23-2f7e-4387-9a60-1ba04b5fd219\" (UID: \"13335c23-2f7e-4387-9a60-1ba04b5fd219\") " Dec 03 07:01:41 crc kubenswrapper[4475]: I1203 07:01:41.629526 4475 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/13335c23-2f7e-4387-9a60-1ba04b5fd219-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "13335c23-2f7e-4387-9a60-1ba04b5fd219" (UID: "13335c23-2f7e-4387-9a60-1ba04b5fd219"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 07:01:41 crc kubenswrapper[4475]: I1203 07:01:41.629830 4475 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/13335c23-2f7e-4387-9a60-1ba04b5fd219-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "13335c23-2f7e-4387-9a60-1ba04b5fd219" (UID: "13335c23-2f7e-4387-9a60-1ba04b5fd219"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 07:01:41 crc kubenswrapper[4475]: I1203 07:01:41.646416 4475 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/13335c23-2f7e-4387-9a60-1ba04b5fd219-kube-api-access-8zdcb" (OuterVolumeSpecName: "kube-api-access-8zdcb") pod "13335c23-2f7e-4387-9a60-1ba04b5fd219" (UID: "13335c23-2f7e-4387-9a60-1ba04b5fd219"). InnerVolumeSpecName "kube-api-access-8zdcb". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 07:01:41 crc kubenswrapper[4475]: I1203 07:01:41.646469 4475 generic.go:334] "Generic (PLEG): container finished" podID="13335c23-2f7e-4387-9a60-1ba04b5fd219" containerID="0c5c7e67a2661b6d65ea6d4a1427e12fb982eafc0a00337d7b391a6845076211" exitCode=0 Dec 03 07:01:41 crc kubenswrapper[4475]: I1203 07:01:41.646506 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"13335c23-2f7e-4387-9a60-1ba04b5fd219","Type":"ContainerDied","Data":"0c5c7e67a2661b6d65ea6d4a1427e12fb982eafc0a00337d7b391a6845076211"} Dec 03 07:01:41 crc kubenswrapper[4475]: I1203 07:01:41.646535 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"13335c23-2f7e-4387-9a60-1ba04b5fd219","Type":"ContainerDied","Data":"9a32645ce2929d9428bfdf88a86c2d3660d71b8ba69725a2a4f0b023defe05a4"} Dec 03 07:01:41 crc kubenswrapper[4475]: I1203 07:01:41.646561 4475 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Dec 03 07:01:41 crc kubenswrapper[4475]: I1203 07:01:41.646571 4475 scope.go:117] "RemoveContainer" containerID="6cd554168b07d9190a7a1102cfcc2ac9ffdaecd4fbbb25b98475d746ebf7252b" Dec 03 07:01:41 crc kubenswrapper[4475]: I1203 07:01:41.661577 4475 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/13335c23-2f7e-4387-9a60-1ba04b5fd219-scripts" (OuterVolumeSpecName: "scripts") pod "13335c23-2f7e-4387-9a60-1ba04b5fd219" (UID: "13335c23-2f7e-4387-9a60-1ba04b5fd219"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 07:01:41 crc kubenswrapper[4475]: I1203 07:01:41.689548 4475 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/13335c23-2f7e-4387-9a60-1ba04b5fd219-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "13335c23-2f7e-4387-9a60-1ba04b5fd219" (UID: "13335c23-2f7e-4387-9a60-1ba04b5fd219"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 07:01:41 crc kubenswrapper[4475]: I1203 07:01:41.726658 4475 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/13335c23-2f7e-4387-9a60-1ba04b5fd219-run-httpd\") on node \"crc\" DevicePath \"\"" Dec 03 07:01:41 crc kubenswrapper[4475]: I1203 07:01:41.726678 4475 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/13335c23-2f7e-4387-9a60-1ba04b5fd219-scripts\") on node \"crc\" DevicePath \"\"" Dec 03 07:01:41 crc kubenswrapper[4475]: I1203 07:01:41.726687 4475 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8zdcb\" (UniqueName: \"kubernetes.io/projected/13335c23-2f7e-4387-9a60-1ba04b5fd219-kube-api-access-8zdcb\") on node \"crc\" DevicePath \"\"" Dec 03 07:01:41 crc kubenswrapper[4475]: I1203 07:01:41.726698 4475 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/13335c23-2f7e-4387-9a60-1ba04b5fd219-log-httpd\") on node \"crc\" DevicePath \"\"" Dec 03 07:01:41 crc kubenswrapper[4475]: I1203 07:01:41.726710 4475 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/13335c23-2f7e-4387-9a60-1ba04b5fd219-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Dec 03 07:01:41 crc kubenswrapper[4475]: I1203 07:01:41.761069 4475 scope.go:117] "RemoveContainer" containerID="a76f2d2883d48181850cea17bf8459547abe1c864d9384e682905f3aa5146ea6" 
Dec 03 07:01:41 crc kubenswrapper[4475]: I1203 07:01:41.764030 4475 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/13335c23-2f7e-4387-9a60-1ba04b5fd219-config-data" (OuterVolumeSpecName: "config-data") pod "13335c23-2f7e-4387-9a60-1ba04b5fd219" (UID: "13335c23-2f7e-4387-9a60-1ba04b5fd219"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 07:01:41 crc kubenswrapper[4475]: I1203 07:01:41.764071 4475 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/13335c23-2f7e-4387-9a60-1ba04b5fd219-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "13335c23-2f7e-4387-9a60-1ba04b5fd219" (UID: "13335c23-2f7e-4387-9a60-1ba04b5fd219"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 07:01:41 crc kubenswrapper[4475]: I1203 07:01:41.781824 4475 scope.go:117] "RemoveContainer" containerID="d9e258fd10c1f6ee451cf5ca4e36bdb98b131bfc7e12ebf3513c0aca0a8f7a02" Dec 03 07:01:41 crc kubenswrapper[4475]: I1203 07:01:41.813799 4475 scope.go:117] "RemoveContainer" containerID="0c5c7e67a2661b6d65ea6d4a1427e12fb982eafc0a00337d7b391a6845076211" Dec 03 07:01:41 crc kubenswrapper[4475]: I1203 07:01:41.835881 4475 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/13335c23-2f7e-4387-9a60-1ba04b5fd219-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 03 07:01:41 crc kubenswrapper[4475]: I1203 07:01:41.835932 4475 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/13335c23-2f7e-4387-9a60-1ba04b5fd219-config-data\") on node \"crc\" DevicePath \"\"" Dec 03 07:01:41 crc kubenswrapper[4475]: I1203 07:01:41.854598 4475 scope.go:117] "RemoveContainer" containerID="6cd554168b07d9190a7a1102cfcc2ac9ffdaecd4fbbb25b98475d746ebf7252b" Dec 03 07:01:41 crc kubenswrapper[4475]: E1203 
07:01:41.854983 4475 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6cd554168b07d9190a7a1102cfcc2ac9ffdaecd4fbbb25b98475d746ebf7252b\": container with ID starting with 6cd554168b07d9190a7a1102cfcc2ac9ffdaecd4fbbb25b98475d746ebf7252b not found: ID does not exist" containerID="6cd554168b07d9190a7a1102cfcc2ac9ffdaecd4fbbb25b98475d746ebf7252b" Dec 03 07:01:41 crc kubenswrapper[4475]: I1203 07:01:41.855029 4475 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6cd554168b07d9190a7a1102cfcc2ac9ffdaecd4fbbb25b98475d746ebf7252b"} err="failed to get container status \"6cd554168b07d9190a7a1102cfcc2ac9ffdaecd4fbbb25b98475d746ebf7252b\": rpc error: code = NotFound desc = could not find container \"6cd554168b07d9190a7a1102cfcc2ac9ffdaecd4fbbb25b98475d746ebf7252b\": container with ID starting with 6cd554168b07d9190a7a1102cfcc2ac9ffdaecd4fbbb25b98475d746ebf7252b not found: ID does not exist" Dec 03 07:01:41 crc kubenswrapper[4475]: I1203 07:01:41.855084 4475 scope.go:117] "RemoveContainer" containerID="a76f2d2883d48181850cea17bf8459547abe1c864d9384e682905f3aa5146ea6" Dec 03 07:01:41 crc kubenswrapper[4475]: E1203 07:01:41.856741 4475 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a76f2d2883d48181850cea17bf8459547abe1c864d9384e682905f3aa5146ea6\": container with ID starting with a76f2d2883d48181850cea17bf8459547abe1c864d9384e682905f3aa5146ea6 not found: ID does not exist" containerID="a76f2d2883d48181850cea17bf8459547abe1c864d9384e682905f3aa5146ea6" Dec 03 07:01:41 crc kubenswrapper[4475]: I1203 07:01:41.856788 4475 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a76f2d2883d48181850cea17bf8459547abe1c864d9384e682905f3aa5146ea6"} err="failed to get container status \"a76f2d2883d48181850cea17bf8459547abe1c864d9384e682905f3aa5146ea6\": rpc 
error: code = NotFound desc = could not find container \"a76f2d2883d48181850cea17bf8459547abe1c864d9384e682905f3aa5146ea6\": container with ID starting with a76f2d2883d48181850cea17bf8459547abe1c864d9384e682905f3aa5146ea6 not found: ID does not exist" Dec 03 07:01:41 crc kubenswrapper[4475]: I1203 07:01:41.856824 4475 scope.go:117] "RemoveContainer" containerID="d9e258fd10c1f6ee451cf5ca4e36bdb98b131bfc7e12ebf3513c0aca0a8f7a02" Dec 03 07:01:41 crc kubenswrapper[4475]: E1203 07:01:41.857635 4475 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d9e258fd10c1f6ee451cf5ca4e36bdb98b131bfc7e12ebf3513c0aca0a8f7a02\": container with ID starting with d9e258fd10c1f6ee451cf5ca4e36bdb98b131bfc7e12ebf3513c0aca0a8f7a02 not found: ID does not exist" containerID="d9e258fd10c1f6ee451cf5ca4e36bdb98b131bfc7e12ebf3513c0aca0a8f7a02" Dec 03 07:01:41 crc kubenswrapper[4475]: I1203 07:01:41.857675 4475 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d9e258fd10c1f6ee451cf5ca4e36bdb98b131bfc7e12ebf3513c0aca0a8f7a02"} err="failed to get container status \"d9e258fd10c1f6ee451cf5ca4e36bdb98b131bfc7e12ebf3513c0aca0a8f7a02\": rpc error: code = NotFound desc = could not find container \"d9e258fd10c1f6ee451cf5ca4e36bdb98b131bfc7e12ebf3513c0aca0a8f7a02\": container with ID starting with d9e258fd10c1f6ee451cf5ca4e36bdb98b131bfc7e12ebf3513c0aca0a8f7a02 not found: ID does not exist" Dec 03 07:01:41 crc kubenswrapper[4475]: I1203 07:01:41.857705 4475 scope.go:117] "RemoveContainer" containerID="0c5c7e67a2661b6d65ea6d4a1427e12fb982eafc0a00337d7b391a6845076211" Dec 03 07:01:41 crc kubenswrapper[4475]: E1203 07:01:41.858535 4475 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0c5c7e67a2661b6d65ea6d4a1427e12fb982eafc0a00337d7b391a6845076211\": container with ID starting with 
0c5c7e67a2661b6d65ea6d4a1427e12fb982eafc0a00337d7b391a6845076211 not found: ID does not exist" containerID="0c5c7e67a2661b6d65ea6d4a1427e12fb982eafc0a00337d7b391a6845076211" Dec 03 07:01:41 crc kubenswrapper[4475]: I1203 07:01:41.858565 4475 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0c5c7e67a2661b6d65ea6d4a1427e12fb982eafc0a00337d7b391a6845076211"} err="failed to get container status \"0c5c7e67a2661b6d65ea6d4a1427e12fb982eafc0a00337d7b391a6845076211\": rpc error: code = NotFound desc = could not find container \"0c5c7e67a2661b6d65ea6d4a1427e12fb982eafc0a00337d7b391a6845076211\": container with ID starting with 0c5c7e67a2661b6d65ea6d4a1427e12fb982eafc0a00337d7b391a6845076211 not found: ID does not exist" Dec 03 07:01:41 crc kubenswrapper[4475]: I1203 07:01:41.979124 4475 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Dec 03 07:01:41 crc kubenswrapper[4475]: I1203 07:01:41.987294 4475 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Dec 03 07:01:42 crc kubenswrapper[4475]: I1203 07:01:42.003981 4475 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Dec 03 07:01:42 crc kubenswrapper[4475]: E1203 07:01:42.004317 4475 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="13335c23-2f7e-4387-9a60-1ba04b5fd219" containerName="ceilometer-central-agent" Dec 03 07:01:42 crc kubenswrapper[4475]: I1203 07:01:42.004335 4475 state_mem.go:107] "Deleted CPUSet assignment" podUID="13335c23-2f7e-4387-9a60-1ba04b5fd219" containerName="ceilometer-central-agent" Dec 03 07:01:42 crc kubenswrapper[4475]: E1203 07:01:42.004346 4475 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="13335c23-2f7e-4387-9a60-1ba04b5fd219" containerName="sg-core" Dec 03 07:01:42 crc kubenswrapper[4475]: I1203 07:01:42.004354 4475 state_mem.go:107] "Deleted CPUSet assignment" podUID="13335c23-2f7e-4387-9a60-1ba04b5fd219" 
containerName="sg-core" Dec 03 07:01:42 crc kubenswrapper[4475]: E1203 07:01:42.004377 4475 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="13335c23-2f7e-4387-9a60-1ba04b5fd219" containerName="proxy-httpd" Dec 03 07:01:42 crc kubenswrapper[4475]: I1203 07:01:42.004382 4475 state_mem.go:107] "Deleted CPUSet assignment" podUID="13335c23-2f7e-4387-9a60-1ba04b5fd219" containerName="proxy-httpd" Dec 03 07:01:42 crc kubenswrapper[4475]: E1203 07:01:42.004391 4475 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="13335c23-2f7e-4387-9a60-1ba04b5fd219" containerName="ceilometer-notification-agent" Dec 03 07:01:42 crc kubenswrapper[4475]: I1203 07:01:42.004396 4475 state_mem.go:107] "Deleted CPUSet assignment" podUID="13335c23-2f7e-4387-9a60-1ba04b5fd219" containerName="ceilometer-notification-agent" Dec 03 07:01:42 crc kubenswrapper[4475]: I1203 07:01:42.004576 4475 memory_manager.go:354] "RemoveStaleState removing state" podUID="13335c23-2f7e-4387-9a60-1ba04b5fd219" containerName="ceilometer-notification-agent" Dec 03 07:01:42 crc kubenswrapper[4475]: I1203 07:01:42.004592 4475 memory_manager.go:354] "RemoveStaleState removing state" podUID="13335c23-2f7e-4387-9a60-1ba04b5fd219" containerName="sg-core" Dec 03 07:01:42 crc kubenswrapper[4475]: I1203 07:01:42.004606 4475 memory_manager.go:354] "RemoveStaleState removing state" podUID="13335c23-2f7e-4387-9a60-1ba04b5fd219" containerName="proxy-httpd" Dec 03 07:01:42 crc kubenswrapper[4475]: I1203 07:01:42.004615 4475 memory_manager.go:354] "RemoveStaleState removing state" podUID="13335c23-2f7e-4387-9a60-1ba04b5fd219" containerName="ceilometer-central-agent" Dec 03 07:01:42 crc kubenswrapper[4475]: I1203 07:01:42.004623 4475 memory_manager.go:354] "RemoveStaleState removing state" podUID="88fc5e9a-b3f3-421c-9b93-939e11a9f81f" containerName="heat-cfnapi" Dec 03 07:01:42 crc kubenswrapper[4475]: I1203 07:01:42.005979 4475 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Dec 03 07:01:42 crc kubenswrapper[4475]: I1203 07:01:42.008407 4475 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Dec 03 07:01:42 crc kubenswrapper[4475]: I1203 07:01:42.012954 4475 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Dec 03 07:01:42 crc kubenswrapper[4475]: I1203 07:01:42.039437 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b39bd21a-0e74-4e03-8c62-214faa26b85a-config-data\") pod \"ceilometer-0\" (UID: \"b39bd21a-0e74-4e03-8c62-214faa26b85a\") " pod="openstack/ceilometer-0" Dec 03 07:01:42 crc kubenswrapper[4475]: I1203 07:01:42.039495 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b39bd21a-0e74-4e03-8c62-214faa26b85a-scripts\") pod \"ceilometer-0\" (UID: \"b39bd21a-0e74-4e03-8c62-214faa26b85a\") " pod="openstack/ceilometer-0" Dec 03 07:01:42 crc kubenswrapper[4475]: I1203 07:01:42.039560 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ww6hs\" (UniqueName: \"kubernetes.io/projected/b39bd21a-0e74-4e03-8c62-214faa26b85a-kube-api-access-ww6hs\") pod \"ceilometer-0\" (UID: \"b39bd21a-0e74-4e03-8c62-214faa26b85a\") " pod="openstack/ceilometer-0" Dec 03 07:01:42 crc kubenswrapper[4475]: I1203 07:01:42.039599 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b39bd21a-0e74-4e03-8c62-214faa26b85a-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"b39bd21a-0e74-4e03-8c62-214faa26b85a\") " pod="openstack/ceilometer-0" Dec 03 07:01:42 crc kubenswrapper[4475]: I1203 07:01:42.039614 4475 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/b39bd21a-0e74-4e03-8c62-214faa26b85a-log-httpd\") pod \"ceilometer-0\" (UID: \"b39bd21a-0e74-4e03-8c62-214faa26b85a\") " pod="openstack/ceilometer-0" Dec 03 07:01:42 crc kubenswrapper[4475]: I1203 07:01:42.039647 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/b39bd21a-0e74-4e03-8c62-214faa26b85a-run-httpd\") pod \"ceilometer-0\" (UID: \"b39bd21a-0e74-4e03-8c62-214faa26b85a\") " pod="openstack/ceilometer-0" Dec 03 07:01:42 crc kubenswrapper[4475]: I1203 07:01:42.039677 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/b39bd21a-0e74-4e03-8c62-214faa26b85a-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"b39bd21a-0e74-4e03-8c62-214faa26b85a\") " pod="openstack/ceilometer-0" Dec 03 07:01:42 crc kubenswrapper[4475]: I1203 07:01:42.044170 4475 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Dec 03 07:01:42 crc kubenswrapper[4475]: I1203 07:01:42.141930 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ww6hs\" (UniqueName: \"kubernetes.io/projected/b39bd21a-0e74-4e03-8c62-214faa26b85a-kube-api-access-ww6hs\") pod \"ceilometer-0\" (UID: \"b39bd21a-0e74-4e03-8c62-214faa26b85a\") " pod="openstack/ceilometer-0" Dec 03 07:01:42 crc kubenswrapper[4475]: I1203 07:01:42.142025 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b39bd21a-0e74-4e03-8c62-214faa26b85a-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"b39bd21a-0e74-4e03-8c62-214faa26b85a\") " pod="openstack/ceilometer-0" Dec 03 07:01:42 crc kubenswrapper[4475]: I1203 07:01:42.142061 4475 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/b39bd21a-0e74-4e03-8c62-214faa26b85a-log-httpd\") pod \"ceilometer-0\" (UID: \"b39bd21a-0e74-4e03-8c62-214faa26b85a\") " pod="openstack/ceilometer-0" Dec 03 07:01:42 crc kubenswrapper[4475]: I1203 07:01:42.142090 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/b39bd21a-0e74-4e03-8c62-214faa26b85a-run-httpd\") pod \"ceilometer-0\" (UID: \"b39bd21a-0e74-4e03-8c62-214faa26b85a\") " pod="openstack/ceilometer-0" Dec 03 07:01:42 crc kubenswrapper[4475]: I1203 07:01:42.142135 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/b39bd21a-0e74-4e03-8c62-214faa26b85a-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"b39bd21a-0e74-4e03-8c62-214faa26b85a\") " pod="openstack/ceilometer-0" Dec 03 07:01:42 crc kubenswrapper[4475]: I1203 07:01:42.142159 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b39bd21a-0e74-4e03-8c62-214faa26b85a-config-data\") pod \"ceilometer-0\" (UID: \"b39bd21a-0e74-4e03-8c62-214faa26b85a\") " pod="openstack/ceilometer-0" Dec 03 07:01:42 crc kubenswrapper[4475]: I1203 07:01:42.142185 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b39bd21a-0e74-4e03-8c62-214faa26b85a-scripts\") pod \"ceilometer-0\" (UID: \"b39bd21a-0e74-4e03-8c62-214faa26b85a\") " pod="openstack/ceilometer-0" Dec 03 07:01:42 crc kubenswrapper[4475]: I1203 07:01:42.143086 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/b39bd21a-0e74-4e03-8c62-214faa26b85a-run-httpd\") pod \"ceilometer-0\" (UID: \"b39bd21a-0e74-4e03-8c62-214faa26b85a\") " pod="openstack/ceilometer-0" Dec 03 
07:01:42 crc kubenswrapper[4475]: I1203 07:01:42.143405 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/b39bd21a-0e74-4e03-8c62-214faa26b85a-log-httpd\") pod \"ceilometer-0\" (UID: \"b39bd21a-0e74-4e03-8c62-214faa26b85a\") " pod="openstack/ceilometer-0" Dec 03 07:01:42 crc kubenswrapper[4475]: I1203 07:01:42.148159 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/b39bd21a-0e74-4e03-8c62-214faa26b85a-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"b39bd21a-0e74-4e03-8c62-214faa26b85a\") " pod="openstack/ceilometer-0" Dec 03 07:01:42 crc kubenswrapper[4475]: I1203 07:01:42.148910 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b39bd21a-0e74-4e03-8c62-214faa26b85a-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"b39bd21a-0e74-4e03-8c62-214faa26b85a\") " pod="openstack/ceilometer-0" Dec 03 07:01:42 crc kubenswrapper[4475]: I1203 07:01:42.149296 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b39bd21a-0e74-4e03-8c62-214faa26b85a-config-data\") pod \"ceilometer-0\" (UID: \"b39bd21a-0e74-4e03-8c62-214faa26b85a\") " pod="openstack/ceilometer-0" Dec 03 07:01:42 crc kubenswrapper[4475]: I1203 07:01:42.149792 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b39bd21a-0e74-4e03-8c62-214faa26b85a-scripts\") pod \"ceilometer-0\" (UID: \"b39bd21a-0e74-4e03-8c62-214faa26b85a\") " pod="openstack/ceilometer-0" Dec 03 07:01:42 crc kubenswrapper[4475]: I1203 07:01:42.160305 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ww6hs\" (UniqueName: \"kubernetes.io/projected/b39bd21a-0e74-4e03-8c62-214faa26b85a-kube-api-access-ww6hs\") pod \"ceilometer-0\" (UID: 
\"b39bd21a-0e74-4e03-8c62-214faa26b85a\") " pod="openstack/ceilometer-0" Dec 03 07:01:42 crc kubenswrapper[4475]: I1203 07:01:42.322767 4475 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Dec 03 07:01:42 crc kubenswrapper[4475]: I1203 07:01:42.854335 4475 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Dec 03 07:01:43 crc kubenswrapper[4475]: I1203 07:01:43.504613 4475 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="13335c23-2f7e-4387-9a60-1ba04b5fd219" path="/var/lib/kubelet/pods/13335c23-2f7e-4387-9a60-1ba04b5fd219/volumes" Dec 03 07:01:43 crc kubenswrapper[4475]: I1203 07:01:43.696729 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"b39bd21a-0e74-4e03-8c62-214faa26b85a","Type":"ContainerStarted","Data":"a130092ccf0a9401af5f6c02e9864b1932baf61f37eb6621eb7b7c49ced2388b"} Dec 03 07:01:44 crc kubenswrapper[4475]: I1203 07:01:44.725583 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"b39bd21a-0e74-4e03-8c62-214faa26b85a","Type":"ContainerStarted","Data":"42010152fab9e9c6884e68c39db7b5d5ecb07756d750566720bf95ee2fdca5a6"} Dec 03 07:01:45 crc kubenswrapper[4475]: I1203 07:01:45.745540 4475 generic.go:334] "Generic (PLEG): container finished" podID="47298c15-f6d2-463f-a97b-1b4d63999b81" containerID="0453b3dcee79a0359c57a479abaf3ac45f6a7778198ee6091769d53243ac05d2" exitCode=0 Dec 03 07:01:45 crc kubenswrapper[4475]: I1203 07:01:45.745624 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-engine-74fdff459b-tj7xb" event={"ID":"47298c15-f6d2-463f-a97b-1b4d63999b81","Type":"ContainerDied","Data":"0453b3dcee79a0359c57a479abaf3ac45f6a7778198ee6091769d53243ac05d2"} Dec 03 07:01:49 crc kubenswrapper[4475]: I1203 07:01:49.515535 4475 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/heat-engine-74fdff459b-tj7xb" Dec 03 07:01:49 crc kubenswrapper[4475]: I1203 07:01:49.620476 4475 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/47298c15-f6d2-463f-a97b-1b4d63999b81-combined-ca-bundle\") pod \"47298c15-f6d2-463f-a97b-1b4d63999b81\" (UID: \"47298c15-f6d2-463f-a97b-1b4d63999b81\") " Dec 03 07:01:49 crc kubenswrapper[4475]: I1203 07:01:49.620553 4475 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/47298c15-f6d2-463f-a97b-1b4d63999b81-config-data-custom\") pod \"47298c15-f6d2-463f-a97b-1b4d63999b81\" (UID: \"47298c15-f6d2-463f-a97b-1b4d63999b81\") " Dec 03 07:01:49 crc kubenswrapper[4475]: I1203 07:01:49.620705 4475 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9s97h\" (UniqueName: \"kubernetes.io/projected/47298c15-f6d2-463f-a97b-1b4d63999b81-kube-api-access-9s97h\") pod \"47298c15-f6d2-463f-a97b-1b4d63999b81\" (UID: \"47298c15-f6d2-463f-a97b-1b4d63999b81\") " Dec 03 07:01:49 crc kubenswrapper[4475]: I1203 07:01:49.620745 4475 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/47298c15-f6d2-463f-a97b-1b4d63999b81-config-data\") pod \"47298c15-f6d2-463f-a97b-1b4d63999b81\" (UID: \"47298c15-f6d2-463f-a97b-1b4d63999b81\") " Dec 03 07:01:49 crc kubenswrapper[4475]: I1203 07:01:49.663065 4475 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/47298c15-f6d2-463f-a97b-1b4d63999b81-kube-api-access-9s97h" (OuterVolumeSpecName: "kube-api-access-9s97h") pod "47298c15-f6d2-463f-a97b-1b4d63999b81" (UID: "47298c15-f6d2-463f-a97b-1b4d63999b81"). InnerVolumeSpecName "kube-api-access-9s97h". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 07:01:49 crc kubenswrapper[4475]: I1203 07:01:49.667666 4475 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/47298c15-f6d2-463f-a97b-1b4d63999b81-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "47298c15-f6d2-463f-a97b-1b4d63999b81" (UID: "47298c15-f6d2-463f-a97b-1b4d63999b81"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 07:01:49 crc kubenswrapper[4475]: I1203 07:01:49.722826 4475 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/47298c15-f6d2-463f-a97b-1b4d63999b81-config-data-custom\") on node \"crc\" DevicePath \"\"" Dec 03 07:01:49 crc kubenswrapper[4475]: I1203 07:01:49.722860 4475 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9s97h\" (UniqueName: \"kubernetes.io/projected/47298c15-f6d2-463f-a97b-1b4d63999b81-kube-api-access-9s97h\") on node \"crc\" DevicePath \"\"" Dec 03 07:01:49 crc kubenswrapper[4475]: I1203 07:01:49.731801 4475 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/47298c15-f6d2-463f-a97b-1b4d63999b81-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "47298c15-f6d2-463f-a97b-1b4d63999b81" (UID: "47298c15-f6d2-463f-a97b-1b4d63999b81"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 07:01:49 crc kubenswrapper[4475]: I1203 07:01:49.774860 4475 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/47298c15-f6d2-463f-a97b-1b4d63999b81-config-data" (OuterVolumeSpecName: "config-data") pod "47298c15-f6d2-463f-a97b-1b4d63999b81" (UID: "47298c15-f6d2-463f-a97b-1b4d63999b81"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 07:01:49 crc kubenswrapper[4475]: I1203 07:01:49.803968 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"b39bd21a-0e74-4e03-8c62-214faa26b85a","Type":"ContainerStarted","Data":"fcba40236a1c80b9d7d6a9713b117259658e4b2d26321445ee4458edad30dfe6"} Dec 03 07:01:49 crc kubenswrapper[4475]: I1203 07:01:49.806495 4475 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/heat-engine-74fdff459b-tj7xb" Dec 03 07:01:49 crc kubenswrapper[4475]: I1203 07:01:49.806654 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-engine-74fdff459b-tj7xb" event={"ID":"47298c15-f6d2-463f-a97b-1b4d63999b81","Type":"ContainerDied","Data":"c1ff6f2a7eab492e3539bc718bc309dd4305fcda6810a673ddf4d273bbc60c89"} Dec 03 07:01:49 crc kubenswrapper[4475]: I1203 07:01:49.806803 4475 scope.go:117] "RemoveContainer" containerID="0453b3dcee79a0359c57a479abaf3ac45f6a7778198ee6091769d53243ac05d2" Dec 03 07:01:49 crc kubenswrapper[4475]: I1203 07:01:49.811962 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-m4crf" event={"ID":"4afae9fe-ad3f-48d5-a095-9474568f956c","Type":"ContainerStarted","Data":"507404adf0993a3824029651fe87cc72be1488576a6bd17fdafdb492ac344d85"} Dec 03 07:01:49 crc kubenswrapper[4475]: I1203 07:01:49.824680 4475 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/47298c15-f6d2-463f-a97b-1b4d63999b81-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 03 07:01:49 crc kubenswrapper[4475]: I1203 07:01:49.824708 4475 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/47298c15-f6d2-463f-a97b-1b4d63999b81-config-data\") on node \"crc\" DevicePath \"\"" Dec 03 07:01:49 crc kubenswrapper[4475]: I1203 07:01:49.837010 4475 pod_startup_latency_tracker.go:104] "Observed 
pod startup duration" pod="openstack/nova-cell0-conductor-db-sync-m4crf" podStartSLOduration=1.828160687 podStartE2EDuration="12.836993834s" podCreationTimestamp="2025-12-03 07:01:37 +0000 UTC" firstStartedPulling="2025-12-03 07:01:38.307842626 +0000 UTC m=+983.112740960" lastFinishedPulling="2025-12-03 07:01:49.316675774 +0000 UTC m=+994.121574107" observedRunningTime="2025-12-03 07:01:49.833750753 +0000 UTC m=+994.638649088" watchObservedRunningTime="2025-12-03 07:01:49.836993834 +0000 UTC m=+994.641892158" Dec 03 07:01:49 crc kubenswrapper[4475]: I1203 07:01:49.857735 4475 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/heat-engine-74fdff459b-tj7xb"] Dec 03 07:01:49 crc kubenswrapper[4475]: I1203 07:01:49.866635 4475 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/heat-engine-74fdff459b-tj7xb"] Dec 03 07:01:50 crc kubenswrapper[4475]: I1203 07:01:50.826000 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"b39bd21a-0e74-4e03-8c62-214faa26b85a","Type":"ContainerStarted","Data":"b6c83ba02753be28d56090aae412f0c25ea05e79ba76d21682aa7e36cb5738bc"} Dec 03 07:01:51 crc kubenswrapper[4475]: I1203 07:01:51.501874 4475 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="47298c15-f6d2-463f-a97b-1b4d63999b81" path="/var/lib/kubelet/pods/47298c15-f6d2-463f-a97b-1b4d63999b81/volumes" Dec 03 07:01:51 crc kubenswrapper[4475]: I1203 07:01:51.617143 4475 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Dec 03 07:01:52 crc kubenswrapper[4475]: I1203 07:01:52.844696 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"b39bd21a-0e74-4e03-8c62-214faa26b85a","Type":"ContainerStarted","Data":"e20ac65b874526174470c7e18529e773290afeaf9240cf59dbfeb495ac0ba205"} Dec 03 07:01:52 crc kubenswrapper[4475]: I1203 07:01:52.844859 4475 kuberuntime_container.go:808] "Killing container with a grace period" 
pod="openstack/ceilometer-0" podUID="b39bd21a-0e74-4e03-8c62-214faa26b85a" containerName="ceilometer-central-agent" containerID="cri-o://42010152fab9e9c6884e68c39db7b5d5ecb07756d750566720bf95ee2fdca5a6" gracePeriod=30 Dec 03 07:01:52 crc kubenswrapper[4475]: I1203 07:01:52.844930 4475 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="b39bd21a-0e74-4e03-8c62-214faa26b85a" containerName="sg-core" containerID="cri-o://b6c83ba02753be28d56090aae412f0c25ea05e79ba76d21682aa7e36cb5738bc" gracePeriod=30 Dec 03 07:01:52 crc kubenswrapper[4475]: I1203 07:01:52.844941 4475 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="b39bd21a-0e74-4e03-8c62-214faa26b85a" containerName="ceilometer-notification-agent" containerID="cri-o://fcba40236a1c80b9d7d6a9713b117259658e4b2d26321445ee4458edad30dfe6" gracePeriod=30 Dec 03 07:01:52 crc kubenswrapper[4475]: I1203 07:01:52.845155 4475 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Dec 03 07:01:52 crc kubenswrapper[4475]: I1203 07:01:52.845262 4475 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="b39bd21a-0e74-4e03-8c62-214faa26b85a" containerName="proxy-httpd" containerID="cri-o://e20ac65b874526174470c7e18529e773290afeaf9240cf59dbfeb495ac0ba205" gracePeriod=30 Dec 03 07:01:52 crc kubenswrapper[4475]: I1203 07:01:52.871165 4475 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=3.008347311 podStartE2EDuration="11.871149902s" podCreationTimestamp="2025-12-03 07:01:41 +0000 UTC" firstStartedPulling="2025-12-03 07:01:42.876466744 +0000 UTC m=+987.681365078" lastFinishedPulling="2025-12-03 07:01:51.739269335 +0000 UTC m=+996.544167669" observedRunningTime="2025-12-03 07:01:52.870789484 +0000 UTC m=+997.675687818" watchObservedRunningTime="2025-12-03 
07:01:52.871149902 +0000 UTC m=+997.676048236" Dec 03 07:01:52 crc kubenswrapper[4475]: I1203 07:01:52.950344 4475 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Dec 03 07:01:52 crc kubenswrapper[4475]: I1203 07:01:52.950662 4475 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="5d024ccd-9b9c-4656-a6c9-88c6d524960c" containerName="glance-log" containerID="cri-o://0bbd05fc0dd7e7111a65c6f7a285b3b2875035ee605e1fa046bf17584914bc7c" gracePeriod=30 Dec 03 07:01:52 crc kubenswrapper[4475]: I1203 07:01:52.950841 4475 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="5d024ccd-9b9c-4656-a6c9-88c6d524960c" containerName="glance-httpd" containerID="cri-o://0bb900b63b700af9cf8b01415f76d8b98b1d3870a1c25b553f5514be65ff0184" gracePeriod=30 Dec 03 07:01:53 crc kubenswrapper[4475]: I1203 07:01:53.636622 4475 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Dec 03 07:01:53 crc kubenswrapper[4475]: I1203 07:01:53.810503 4475 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/b39bd21a-0e74-4e03-8c62-214faa26b85a-sg-core-conf-yaml\") pod \"b39bd21a-0e74-4e03-8c62-214faa26b85a\" (UID: \"b39bd21a-0e74-4e03-8c62-214faa26b85a\") " Dec 03 07:01:53 crc kubenswrapper[4475]: I1203 07:01:53.810614 4475 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b39bd21a-0e74-4e03-8c62-214faa26b85a-combined-ca-bundle\") pod \"b39bd21a-0e74-4e03-8c62-214faa26b85a\" (UID: \"b39bd21a-0e74-4e03-8c62-214faa26b85a\") " Dec 03 07:01:53 crc kubenswrapper[4475]: I1203 07:01:53.810677 4475 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/b39bd21a-0e74-4e03-8c62-214faa26b85a-log-httpd\") pod \"b39bd21a-0e74-4e03-8c62-214faa26b85a\" (UID: \"b39bd21a-0e74-4e03-8c62-214faa26b85a\") " Dec 03 07:01:53 crc kubenswrapper[4475]: I1203 07:01:53.810735 4475 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/b39bd21a-0e74-4e03-8c62-214faa26b85a-run-httpd\") pod \"b39bd21a-0e74-4e03-8c62-214faa26b85a\" (UID: \"b39bd21a-0e74-4e03-8c62-214faa26b85a\") " Dec 03 07:01:53 crc kubenswrapper[4475]: I1203 07:01:53.810761 4475 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b39bd21a-0e74-4e03-8c62-214faa26b85a-config-data\") pod \"b39bd21a-0e74-4e03-8c62-214faa26b85a\" (UID: \"b39bd21a-0e74-4e03-8c62-214faa26b85a\") " Dec 03 07:01:53 crc kubenswrapper[4475]: I1203 07:01:53.810821 4475 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ww6hs\" (UniqueName: 
\"kubernetes.io/projected/b39bd21a-0e74-4e03-8c62-214faa26b85a-kube-api-access-ww6hs\") pod \"b39bd21a-0e74-4e03-8c62-214faa26b85a\" (UID: \"b39bd21a-0e74-4e03-8c62-214faa26b85a\") " Dec 03 07:01:53 crc kubenswrapper[4475]: I1203 07:01:53.810873 4475 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b39bd21a-0e74-4e03-8c62-214faa26b85a-scripts\") pod \"b39bd21a-0e74-4e03-8c62-214faa26b85a\" (UID: \"b39bd21a-0e74-4e03-8c62-214faa26b85a\") " Dec 03 07:01:53 crc kubenswrapper[4475]: I1203 07:01:53.811303 4475 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b39bd21a-0e74-4e03-8c62-214faa26b85a-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "b39bd21a-0e74-4e03-8c62-214faa26b85a" (UID: "b39bd21a-0e74-4e03-8c62-214faa26b85a"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 07:01:53 crc kubenswrapper[4475]: I1203 07:01:53.811430 4475 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b39bd21a-0e74-4e03-8c62-214faa26b85a-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "b39bd21a-0e74-4e03-8c62-214faa26b85a" (UID: "b39bd21a-0e74-4e03-8c62-214faa26b85a"). InnerVolumeSpecName "log-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 07:01:53 crc kubenswrapper[4475]: I1203 07:01:53.813568 4475 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/b39bd21a-0e74-4e03-8c62-214faa26b85a-log-httpd\") on node \"crc\" DevicePath \"\"" Dec 03 07:01:53 crc kubenswrapper[4475]: I1203 07:01:53.814957 4475 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/b39bd21a-0e74-4e03-8c62-214faa26b85a-run-httpd\") on node \"crc\" DevicePath \"\"" Dec 03 07:01:53 crc kubenswrapper[4475]: I1203 07:01:53.820854 4475 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b39bd21a-0e74-4e03-8c62-214faa26b85a-scripts" (OuterVolumeSpecName: "scripts") pod "b39bd21a-0e74-4e03-8c62-214faa26b85a" (UID: "b39bd21a-0e74-4e03-8c62-214faa26b85a"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 07:01:53 crc kubenswrapper[4475]: I1203 07:01:53.824810 4475 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b39bd21a-0e74-4e03-8c62-214faa26b85a-kube-api-access-ww6hs" (OuterVolumeSpecName: "kube-api-access-ww6hs") pod "b39bd21a-0e74-4e03-8c62-214faa26b85a" (UID: "b39bd21a-0e74-4e03-8c62-214faa26b85a"). InnerVolumeSpecName "kube-api-access-ww6hs". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 07:01:53 crc kubenswrapper[4475]: I1203 07:01:53.840723 4475 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b39bd21a-0e74-4e03-8c62-214faa26b85a-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "b39bd21a-0e74-4e03-8c62-214faa26b85a" (UID: "b39bd21a-0e74-4e03-8c62-214faa26b85a"). InnerVolumeSpecName "sg-core-conf-yaml". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 07:01:53 crc kubenswrapper[4475]: I1203 07:01:53.857063 4475 generic.go:334] "Generic (PLEG): container finished" podID="5d024ccd-9b9c-4656-a6c9-88c6d524960c" containerID="0bbd05fc0dd7e7111a65c6f7a285b3b2875035ee605e1fa046bf17584914bc7c" exitCode=143 Dec 03 07:01:53 crc kubenswrapper[4475]: I1203 07:01:53.857134 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"5d024ccd-9b9c-4656-a6c9-88c6d524960c","Type":"ContainerDied","Data":"0bbd05fc0dd7e7111a65c6f7a285b3b2875035ee605e1fa046bf17584914bc7c"} Dec 03 07:01:53 crc kubenswrapper[4475]: I1203 07:01:53.863067 4475 generic.go:334] "Generic (PLEG): container finished" podID="b39bd21a-0e74-4e03-8c62-214faa26b85a" containerID="e20ac65b874526174470c7e18529e773290afeaf9240cf59dbfeb495ac0ba205" exitCode=0 Dec 03 07:01:53 crc kubenswrapper[4475]: I1203 07:01:53.863096 4475 generic.go:334] "Generic (PLEG): container finished" podID="b39bd21a-0e74-4e03-8c62-214faa26b85a" containerID="b6c83ba02753be28d56090aae412f0c25ea05e79ba76d21682aa7e36cb5738bc" exitCode=2 Dec 03 07:01:53 crc kubenswrapper[4475]: I1203 07:01:53.863103 4475 generic.go:334] "Generic (PLEG): container finished" podID="b39bd21a-0e74-4e03-8c62-214faa26b85a" containerID="fcba40236a1c80b9d7d6a9713b117259658e4b2d26321445ee4458edad30dfe6" exitCode=0 Dec 03 07:01:53 crc kubenswrapper[4475]: I1203 07:01:53.863113 4475 generic.go:334] "Generic (PLEG): container finished" podID="b39bd21a-0e74-4e03-8c62-214faa26b85a" containerID="42010152fab9e9c6884e68c39db7b5d5ecb07756d750566720bf95ee2fdca5a6" exitCode=0 Dec 03 07:01:53 crc kubenswrapper[4475]: I1203 07:01:53.863135 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"b39bd21a-0e74-4e03-8c62-214faa26b85a","Type":"ContainerDied","Data":"e20ac65b874526174470c7e18529e773290afeaf9240cf59dbfeb495ac0ba205"} Dec 03 07:01:53 crc kubenswrapper[4475]: I1203 
07:01:53.863164 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"b39bd21a-0e74-4e03-8c62-214faa26b85a","Type":"ContainerDied","Data":"b6c83ba02753be28d56090aae412f0c25ea05e79ba76d21682aa7e36cb5738bc"} Dec 03 07:01:53 crc kubenswrapper[4475]: I1203 07:01:53.863176 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"b39bd21a-0e74-4e03-8c62-214faa26b85a","Type":"ContainerDied","Data":"fcba40236a1c80b9d7d6a9713b117259658e4b2d26321445ee4458edad30dfe6"} Dec 03 07:01:53 crc kubenswrapper[4475]: I1203 07:01:53.863185 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"b39bd21a-0e74-4e03-8c62-214faa26b85a","Type":"ContainerDied","Data":"42010152fab9e9c6884e68c39db7b5d5ecb07756d750566720bf95ee2fdca5a6"} Dec 03 07:01:53 crc kubenswrapper[4475]: I1203 07:01:53.863194 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"b39bd21a-0e74-4e03-8c62-214faa26b85a","Type":"ContainerDied","Data":"a130092ccf0a9401af5f6c02e9864b1932baf61f37eb6621eb7b7c49ced2388b"} Dec 03 07:01:53 crc kubenswrapper[4475]: I1203 07:01:53.863211 4475 scope.go:117] "RemoveContainer" containerID="e20ac65b874526174470c7e18529e773290afeaf9240cf59dbfeb495ac0ba205" Dec 03 07:01:53 crc kubenswrapper[4475]: I1203 07:01:53.863384 4475 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Dec 03 07:01:53 crc kubenswrapper[4475]: I1203 07:01:53.877932 4475 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b39bd21a-0e74-4e03-8c62-214faa26b85a-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "b39bd21a-0e74-4e03-8c62-214faa26b85a" (UID: "b39bd21a-0e74-4e03-8c62-214faa26b85a"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 07:01:53 crc kubenswrapper[4475]: I1203 07:01:53.878335 4475 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Dec 03 07:01:53 crc kubenswrapper[4475]: I1203 07:01:53.878842 4475 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="2604ecb1-3054-4b63-9a8d-3880ee519c58" containerName="glance-log" containerID="cri-o://4afaec11a8cc3ea92bfd0c41412a193645e87900cd38f199ef4aaef2eb86fa3c" gracePeriod=30 Dec 03 07:01:53 crc kubenswrapper[4475]: I1203 07:01:53.883254 4475 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="2604ecb1-3054-4b63-9a8d-3880ee519c58" containerName="glance-httpd" containerID="cri-o://9c1311c38911be15a3438d7290ad0bb61f6702e1c24505d9b9e9b114004963cd" gracePeriod=30 Dec 03 07:01:53 crc kubenswrapper[4475]: I1203 07:01:53.900824 4475 scope.go:117] "RemoveContainer" containerID="b6c83ba02753be28d56090aae412f0c25ea05e79ba76d21682aa7e36cb5738bc" Dec 03 07:01:53 crc kubenswrapper[4475]: I1203 07:01:53.919935 4475 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b39bd21a-0e74-4e03-8c62-214faa26b85a-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 03 07:01:53 crc kubenswrapper[4475]: I1203 07:01:53.920323 4475 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ww6hs\" (UniqueName: \"kubernetes.io/projected/b39bd21a-0e74-4e03-8c62-214faa26b85a-kube-api-access-ww6hs\") on node \"crc\" DevicePath \"\"" Dec 03 07:01:53 crc kubenswrapper[4475]: I1203 07:01:53.920339 4475 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b39bd21a-0e74-4e03-8c62-214faa26b85a-scripts\") on node \"crc\" DevicePath \"\"" Dec 03 07:01:53 crc kubenswrapper[4475]: I1203 07:01:53.920349 4475 
reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/b39bd21a-0e74-4e03-8c62-214faa26b85a-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Dec 03 07:01:53 crc kubenswrapper[4475]: I1203 07:01:53.938577 4475 scope.go:117] "RemoveContainer" containerID="fcba40236a1c80b9d7d6a9713b117259658e4b2d26321445ee4458edad30dfe6" Dec 03 07:01:53 crc kubenswrapper[4475]: I1203 07:01:53.941719 4475 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b39bd21a-0e74-4e03-8c62-214faa26b85a-config-data" (OuterVolumeSpecName: "config-data") pod "b39bd21a-0e74-4e03-8c62-214faa26b85a" (UID: "b39bd21a-0e74-4e03-8c62-214faa26b85a"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 07:01:53 crc kubenswrapper[4475]: I1203 07:01:53.955958 4475 scope.go:117] "RemoveContainer" containerID="42010152fab9e9c6884e68c39db7b5d5ecb07756d750566720bf95ee2fdca5a6" Dec 03 07:01:54 crc kubenswrapper[4475]: I1203 07:01:54.015731 4475 scope.go:117] "RemoveContainer" containerID="e20ac65b874526174470c7e18529e773290afeaf9240cf59dbfeb495ac0ba205" Dec 03 07:01:54 crc kubenswrapper[4475]: E1203 07:01:54.016749 4475 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e20ac65b874526174470c7e18529e773290afeaf9240cf59dbfeb495ac0ba205\": container with ID starting with e20ac65b874526174470c7e18529e773290afeaf9240cf59dbfeb495ac0ba205 not found: ID does not exist" containerID="e20ac65b874526174470c7e18529e773290afeaf9240cf59dbfeb495ac0ba205" Dec 03 07:01:54 crc kubenswrapper[4475]: I1203 07:01:54.016788 4475 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e20ac65b874526174470c7e18529e773290afeaf9240cf59dbfeb495ac0ba205"} err="failed to get container status \"e20ac65b874526174470c7e18529e773290afeaf9240cf59dbfeb495ac0ba205\": rpc error: code = NotFound 
desc = could not find container \"e20ac65b874526174470c7e18529e773290afeaf9240cf59dbfeb495ac0ba205\": container with ID starting with e20ac65b874526174470c7e18529e773290afeaf9240cf59dbfeb495ac0ba205 not found: ID does not exist" Dec 03 07:01:54 crc kubenswrapper[4475]: I1203 07:01:54.016813 4475 scope.go:117] "RemoveContainer" containerID="b6c83ba02753be28d56090aae412f0c25ea05e79ba76d21682aa7e36cb5738bc" Dec 03 07:01:54 crc kubenswrapper[4475]: E1203 07:01:54.017093 4475 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b6c83ba02753be28d56090aae412f0c25ea05e79ba76d21682aa7e36cb5738bc\": container with ID starting with b6c83ba02753be28d56090aae412f0c25ea05e79ba76d21682aa7e36cb5738bc not found: ID does not exist" containerID="b6c83ba02753be28d56090aae412f0c25ea05e79ba76d21682aa7e36cb5738bc" Dec 03 07:01:54 crc kubenswrapper[4475]: I1203 07:01:54.017124 4475 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b6c83ba02753be28d56090aae412f0c25ea05e79ba76d21682aa7e36cb5738bc"} err="failed to get container status \"b6c83ba02753be28d56090aae412f0c25ea05e79ba76d21682aa7e36cb5738bc\": rpc error: code = NotFound desc = could not find container \"b6c83ba02753be28d56090aae412f0c25ea05e79ba76d21682aa7e36cb5738bc\": container with ID starting with b6c83ba02753be28d56090aae412f0c25ea05e79ba76d21682aa7e36cb5738bc not found: ID does not exist" Dec 03 07:01:54 crc kubenswrapper[4475]: I1203 07:01:54.017149 4475 scope.go:117] "RemoveContainer" containerID="fcba40236a1c80b9d7d6a9713b117259658e4b2d26321445ee4458edad30dfe6" Dec 03 07:01:54 crc kubenswrapper[4475]: E1203 07:01:54.017354 4475 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"fcba40236a1c80b9d7d6a9713b117259658e4b2d26321445ee4458edad30dfe6\": container with ID starting with 
fcba40236a1c80b9d7d6a9713b117259658e4b2d26321445ee4458edad30dfe6 not found: ID does not exist" containerID="fcba40236a1c80b9d7d6a9713b117259658e4b2d26321445ee4458edad30dfe6" Dec 03 07:01:54 crc kubenswrapper[4475]: I1203 07:01:54.017371 4475 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fcba40236a1c80b9d7d6a9713b117259658e4b2d26321445ee4458edad30dfe6"} err="failed to get container status \"fcba40236a1c80b9d7d6a9713b117259658e4b2d26321445ee4458edad30dfe6\": rpc error: code = NotFound desc = could not find container \"fcba40236a1c80b9d7d6a9713b117259658e4b2d26321445ee4458edad30dfe6\": container with ID starting with fcba40236a1c80b9d7d6a9713b117259658e4b2d26321445ee4458edad30dfe6 not found: ID does not exist" Dec 03 07:01:54 crc kubenswrapper[4475]: I1203 07:01:54.017386 4475 scope.go:117] "RemoveContainer" containerID="42010152fab9e9c6884e68c39db7b5d5ecb07756d750566720bf95ee2fdca5a6" Dec 03 07:01:54 crc kubenswrapper[4475]: E1203 07:01:54.017733 4475 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"42010152fab9e9c6884e68c39db7b5d5ecb07756d750566720bf95ee2fdca5a6\": container with ID starting with 42010152fab9e9c6884e68c39db7b5d5ecb07756d750566720bf95ee2fdca5a6 not found: ID does not exist" containerID="42010152fab9e9c6884e68c39db7b5d5ecb07756d750566720bf95ee2fdca5a6" Dec 03 07:01:54 crc kubenswrapper[4475]: I1203 07:01:54.017748 4475 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"42010152fab9e9c6884e68c39db7b5d5ecb07756d750566720bf95ee2fdca5a6"} err="failed to get container status \"42010152fab9e9c6884e68c39db7b5d5ecb07756d750566720bf95ee2fdca5a6\": rpc error: code = NotFound desc = could not find container \"42010152fab9e9c6884e68c39db7b5d5ecb07756d750566720bf95ee2fdca5a6\": container with ID starting with 42010152fab9e9c6884e68c39db7b5d5ecb07756d750566720bf95ee2fdca5a6 not found: ID does not 
exist" Dec 03 07:01:54 crc kubenswrapper[4475]: I1203 07:01:54.017759 4475 scope.go:117] "RemoveContainer" containerID="e20ac65b874526174470c7e18529e773290afeaf9240cf59dbfeb495ac0ba205" Dec 03 07:01:54 crc kubenswrapper[4475]: I1203 07:01:54.019711 4475 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e20ac65b874526174470c7e18529e773290afeaf9240cf59dbfeb495ac0ba205"} err="failed to get container status \"e20ac65b874526174470c7e18529e773290afeaf9240cf59dbfeb495ac0ba205\": rpc error: code = NotFound desc = could not find container \"e20ac65b874526174470c7e18529e773290afeaf9240cf59dbfeb495ac0ba205\": container with ID starting with e20ac65b874526174470c7e18529e773290afeaf9240cf59dbfeb495ac0ba205 not found: ID does not exist" Dec 03 07:01:54 crc kubenswrapper[4475]: I1203 07:01:54.019730 4475 scope.go:117] "RemoveContainer" containerID="b6c83ba02753be28d56090aae412f0c25ea05e79ba76d21682aa7e36cb5738bc" Dec 03 07:01:54 crc kubenswrapper[4475]: I1203 07:01:54.019929 4475 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b6c83ba02753be28d56090aae412f0c25ea05e79ba76d21682aa7e36cb5738bc"} err="failed to get container status \"b6c83ba02753be28d56090aae412f0c25ea05e79ba76d21682aa7e36cb5738bc\": rpc error: code = NotFound desc = could not find container \"b6c83ba02753be28d56090aae412f0c25ea05e79ba76d21682aa7e36cb5738bc\": container with ID starting with b6c83ba02753be28d56090aae412f0c25ea05e79ba76d21682aa7e36cb5738bc not found: ID does not exist" Dec 03 07:01:54 crc kubenswrapper[4475]: I1203 07:01:54.019941 4475 scope.go:117] "RemoveContainer" containerID="fcba40236a1c80b9d7d6a9713b117259658e4b2d26321445ee4458edad30dfe6" Dec 03 07:01:54 crc kubenswrapper[4475]: I1203 07:01:54.021074 4475 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fcba40236a1c80b9d7d6a9713b117259658e4b2d26321445ee4458edad30dfe6"} err="failed to get container status 
\"fcba40236a1c80b9d7d6a9713b117259658e4b2d26321445ee4458edad30dfe6\": rpc error: code = NotFound desc = could not find container \"fcba40236a1c80b9d7d6a9713b117259658e4b2d26321445ee4458edad30dfe6\": container with ID starting with fcba40236a1c80b9d7d6a9713b117259658e4b2d26321445ee4458edad30dfe6 not found: ID does not exist" Dec 03 07:01:54 crc kubenswrapper[4475]: I1203 07:01:54.021090 4475 scope.go:117] "RemoveContainer" containerID="42010152fab9e9c6884e68c39db7b5d5ecb07756d750566720bf95ee2fdca5a6" Dec 03 07:01:54 crc kubenswrapper[4475]: I1203 07:01:54.021367 4475 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"42010152fab9e9c6884e68c39db7b5d5ecb07756d750566720bf95ee2fdca5a6"} err="failed to get container status \"42010152fab9e9c6884e68c39db7b5d5ecb07756d750566720bf95ee2fdca5a6\": rpc error: code = NotFound desc = could not find container \"42010152fab9e9c6884e68c39db7b5d5ecb07756d750566720bf95ee2fdca5a6\": container with ID starting with 42010152fab9e9c6884e68c39db7b5d5ecb07756d750566720bf95ee2fdca5a6 not found: ID does not exist" Dec 03 07:01:54 crc kubenswrapper[4475]: I1203 07:01:54.021399 4475 scope.go:117] "RemoveContainer" containerID="e20ac65b874526174470c7e18529e773290afeaf9240cf59dbfeb495ac0ba205" Dec 03 07:01:54 crc kubenswrapper[4475]: I1203 07:01:54.021705 4475 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e20ac65b874526174470c7e18529e773290afeaf9240cf59dbfeb495ac0ba205"} err="failed to get container status \"e20ac65b874526174470c7e18529e773290afeaf9240cf59dbfeb495ac0ba205\": rpc error: code = NotFound desc = could not find container \"e20ac65b874526174470c7e18529e773290afeaf9240cf59dbfeb495ac0ba205\": container with ID starting with e20ac65b874526174470c7e18529e773290afeaf9240cf59dbfeb495ac0ba205 not found: ID does not exist" Dec 03 07:01:54 crc kubenswrapper[4475]: I1203 07:01:54.021720 4475 scope.go:117] "RemoveContainer" 
containerID="b6c83ba02753be28d56090aae412f0c25ea05e79ba76d21682aa7e36cb5738bc" Dec 03 07:01:54 crc kubenswrapper[4475]: I1203 07:01:54.021911 4475 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b6c83ba02753be28d56090aae412f0c25ea05e79ba76d21682aa7e36cb5738bc"} err="failed to get container status \"b6c83ba02753be28d56090aae412f0c25ea05e79ba76d21682aa7e36cb5738bc\": rpc error: code = NotFound desc = could not find container \"b6c83ba02753be28d56090aae412f0c25ea05e79ba76d21682aa7e36cb5738bc\": container with ID starting with b6c83ba02753be28d56090aae412f0c25ea05e79ba76d21682aa7e36cb5738bc not found: ID does not exist" Dec 03 07:01:54 crc kubenswrapper[4475]: I1203 07:01:54.021926 4475 scope.go:117] "RemoveContainer" containerID="fcba40236a1c80b9d7d6a9713b117259658e4b2d26321445ee4458edad30dfe6" Dec 03 07:01:54 crc kubenswrapper[4475]: I1203 07:01:54.022146 4475 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b39bd21a-0e74-4e03-8c62-214faa26b85a-config-data\") on node \"crc\" DevicePath \"\"" Dec 03 07:01:54 crc kubenswrapper[4475]: I1203 07:01:54.022169 4475 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fcba40236a1c80b9d7d6a9713b117259658e4b2d26321445ee4458edad30dfe6"} err="failed to get container status \"fcba40236a1c80b9d7d6a9713b117259658e4b2d26321445ee4458edad30dfe6\": rpc error: code = NotFound desc = could not find container \"fcba40236a1c80b9d7d6a9713b117259658e4b2d26321445ee4458edad30dfe6\": container with ID starting with fcba40236a1c80b9d7d6a9713b117259658e4b2d26321445ee4458edad30dfe6 not found: ID does not exist" Dec 03 07:01:54 crc kubenswrapper[4475]: I1203 07:01:54.022187 4475 scope.go:117] "RemoveContainer" containerID="42010152fab9e9c6884e68c39db7b5d5ecb07756d750566720bf95ee2fdca5a6" Dec 03 07:01:54 crc kubenswrapper[4475]: I1203 07:01:54.034635 4475 pod_container_deletor.go:53] "DeleteContainer 
returned error" containerID={"Type":"cri-o","ID":"42010152fab9e9c6884e68c39db7b5d5ecb07756d750566720bf95ee2fdca5a6"} err="failed to get container status \"42010152fab9e9c6884e68c39db7b5d5ecb07756d750566720bf95ee2fdca5a6\": rpc error: code = NotFound desc = could not find container \"42010152fab9e9c6884e68c39db7b5d5ecb07756d750566720bf95ee2fdca5a6\": container with ID starting with 42010152fab9e9c6884e68c39db7b5d5ecb07756d750566720bf95ee2fdca5a6 not found: ID does not exist" Dec 03 07:01:54 crc kubenswrapper[4475]: I1203 07:01:54.034663 4475 scope.go:117] "RemoveContainer" containerID="e20ac65b874526174470c7e18529e773290afeaf9240cf59dbfeb495ac0ba205" Dec 03 07:01:54 crc kubenswrapper[4475]: I1203 07:01:54.037243 4475 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e20ac65b874526174470c7e18529e773290afeaf9240cf59dbfeb495ac0ba205"} err="failed to get container status \"e20ac65b874526174470c7e18529e773290afeaf9240cf59dbfeb495ac0ba205\": rpc error: code = NotFound desc = could not find container \"e20ac65b874526174470c7e18529e773290afeaf9240cf59dbfeb495ac0ba205\": container with ID starting with e20ac65b874526174470c7e18529e773290afeaf9240cf59dbfeb495ac0ba205 not found: ID does not exist" Dec 03 07:01:54 crc kubenswrapper[4475]: I1203 07:01:54.037278 4475 scope.go:117] "RemoveContainer" containerID="b6c83ba02753be28d56090aae412f0c25ea05e79ba76d21682aa7e36cb5738bc" Dec 03 07:01:54 crc kubenswrapper[4475]: I1203 07:01:54.037802 4475 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b6c83ba02753be28d56090aae412f0c25ea05e79ba76d21682aa7e36cb5738bc"} err="failed to get container status \"b6c83ba02753be28d56090aae412f0c25ea05e79ba76d21682aa7e36cb5738bc\": rpc error: code = NotFound desc = could not find container \"b6c83ba02753be28d56090aae412f0c25ea05e79ba76d21682aa7e36cb5738bc\": container with ID starting with b6c83ba02753be28d56090aae412f0c25ea05e79ba76d21682aa7e36cb5738bc not 
found: ID does not exist" Dec 03 07:01:54 crc kubenswrapper[4475]: I1203 07:01:54.037827 4475 scope.go:117] "RemoveContainer" containerID="fcba40236a1c80b9d7d6a9713b117259658e4b2d26321445ee4458edad30dfe6" Dec 03 07:01:54 crc kubenswrapper[4475]: I1203 07:01:54.039895 4475 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fcba40236a1c80b9d7d6a9713b117259658e4b2d26321445ee4458edad30dfe6"} err="failed to get container status \"fcba40236a1c80b9d7d6a9713b117259658e4b2d26321445ee4458edad30dfe6\": rpc error: code = NotFound desc = could not find container \"fcba40236a1c80b9d7d6a9713b117259658e4b2d26321445ee4458edad30dfe6\": container with ID starting with fcba40236a1c80b9d7d6a9713b117259658e4b2d26321445ee4458edad30dfe6 not found: ID does not exist" Dec 03 07:01:54 crc kubenswrapper[4475]: I1203 07:01:54.039919 4475 scope.go:117] "RemoveContainer" containerID="42010152fab9e9c6884e68c39db7b5d5ecb07756d750566720bf95ee2fdca5a6" Dec 03 07:01:54 crc kubenswrapper[4475]: I1203 07:01:54.040214 4475 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"42010152fab9e9c6884e68c39db7b5d5ecb07756d750566720bf95ee2fdca5a6"} err="failed to get container status \"42010152fab9e9c6884e68c39db7b5d5ecb07756d750566720bf95ee2fdca5a6\": rpc error: code = NotFound desc = could not find container \"42010152fab9e9c6884e68c39db7b5d5ecb07756d750566720bf95ee2fdca5a6\": container with ID starting with 42010152fab9e9c6884e68c39db7b5d5ecb07756d750566720bf95ee2fdca5a6 not found: ID does not exist" Dec 03 07:01:54 crc kubenswrapper[4475]: I1203 07:01:54.194703 4475 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Dec 03 07:01:54 crc kubenswrapper[4475]: I1203 07:01:54.202025 4475 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Dec 03 07:01:54 crc kubenswrapper[4475]: I1203 07:01:54.225819 4475 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openstack/ceilometer-0"] Dec 03 07:01:54 crc kubenswrapper[4475]: E1203 07:01:54.226170 4475 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b39bd21a-0e74-4e03-8c62-214faa26b85a" containerName="sg-core" Dec 03 07:01:54 crc kubenswrapper[4475]: I1203 07:01:54.226188 4475 state_mem.go:107] "Deleted CPUSet assignment" podUID="b39bd21a-0e74-4e03-8c62-214faa26b85a" containerName="sg-core" Dec 03 07:01:54 crc kubenswrapper[4475]: E1203 07:01:54.226197 4475 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="47298c15-f6d2-463f-a97b-1b4d63999b81" containerName="heat-engine" Dec 03 07:01:54 crc kubenswrapper[4475]: I1203 07:01:54.226203 4475 state_mem.go:107] "Deleted CPUSet assignment" podUID="47298c15-f6d2-463f-a97b-1b4d63999b81" containerName="heat-engine" Dec 03 07:01:54 crc kubenswrapper[4475]: E1203 07:01:54.226226 4475 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b39bd21a-0e74-4e03-8c62-214faa26b85a" containerName="ceilometer-notification-agent" Dec 03 07:01:54 crc kubenswrapper[4475]: I1203 07:01:54.226232 4475 state_mem.go:107] "Deleted CPUSet assignment" podUID="b39bd21a-0e74-4e03-8c62-214faa26b85a" containerName="ceilometer-notification-agent" Dec 03 07:01:54 crc kubenswrapper[4475]: E1203 07:01:54.226241 4475 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b39bd21a-0e74-4e03-8c62-214faa26b85a" containerName="proxy-httpd" Dec 03 07:01:54 crc kubenswrapper[4475]: I1203 07:01:54.226246 4475 state_mem.go:107] "Deleted CPUSet assignment" podUID="b39bd21a-0e74-4e03-8c62-214faa26b85a" containerName="proxy-httpd" Dec 03 07:01:54 crc kubenswrapper[4475]: E1203 07:01:54.226262 4475 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b39bd21a-0e74-4e03-8c62-214faa26b85a" containerName="ceilometer-central-agent" Dec 03 07:01:54 crc kubenswrapper[4475]: I1203 07:01:54.226267 4475 state_mem.go:107] "Deleted CPUSet assignment" podUID="b39bd21a-0e74-4e03-8c62-214faa26b85a" 
containerName="ceilometer-central-agent" Dec 03 07:01:54 crc kubenswrapper[4475]: I1203 07:01:54.226467 4475 memory_manager.go:354] "RemoveStaleState removing state" podUID="b39bd21a-0e74-4e03-8c62-214faa26b85a" containerName="ceilometer-central-agent" Dec 03 07:01:54 crc kubenswrapper[4475]: I1203 07:01:54.226476 4475 memory_manager.go:354] "RemoveStaleState removing state" podUID="b39bd21a-0e74-4e03-8c62-214faa26b85a" containerName="sg-core" Dec 03 07:01:54 crc kubenswrapper[4475]: I1203 07:01:54.226490 4475 memory_manager.go:354] "RemoveStaleState removing state" podUID="b39bd21a-0e74-4e03-8c62-214faa26b85a" containerName="proxy-httpd" Dec 03 07:01:54 crc kubenswrapper[4475]: I1203 07:01:54.226503 4475 memory_manager.go:354] "RemoveStaleState removing state" podUID="47298c15-f6d2-463f-a97b-1b4d63999b81" containerName="heat-engine" Dec 03 07:01:54 crc kubenswrapper[4475]: I1203 07:01:54.226513 4475 memory_manager.go:354] "RemoveStaleState removing state" podUID="b39bd21a-0e74-4e03-8c62-214faa26b85a" containerName="ceilometer-notification-agent" Dec 03 07:01:54 crc kubenswrapper[4475]: I1203 07:01:54.228022 4475 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Dec 03 07:01:54 crc kubenswrapper[4475]: I1203 07:01:54.231844 4475 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Dec 03 07:01:54 crc kubenswrapper[4475]: I1203 07:01:54.232091 4475 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Dec 03 07:01:54 crc kubenswrapper[4475]: I1203 07:01:54.240750 4475 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Dec 03 07:01:54 crc kubenswrapper[4475]: I1203 07:01:54.329073 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/1741312d-351a-4ccd-94ef-54f59b3cf6ef-run-httpd\") pod \"ceilometer-0\" (UID: \"1741312d-351a-4ccd-94ef-54f59b3cf6ef\") " pod="openstack/ceilometer-0" Dec 03 07:01:54 crc kubenswrapper[4475]: I1203 07:01:54.329185 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1741312d-351a-4ccd-94ef-54f59b3cf6ef-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"1741312d-351a-4ccd-94ef-54f59b3cf6ef\") " pod="openstack/ceilometer-0" Dec 03 07:01:54 crc kubenswrapper[4475]: I1203 07:01:54.329263 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1741312d-351a-4ccd-94ef-54f59b3cf6ef-config-data\") pod \"ceilometer-0\" (UID: \"1741312d-351a-4ccd-94ef-54f59b3cf6ef\") " pod="openstack/ceilometer-0" Dec 03 07:01:54 crc kubenswrapper[4475]: I1203 07:01:54.329424 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/1741312d-351a-4ccd-94ef-54f59b3cf6ef-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"1741312d-351a-4ccd-94ef-54f59b3cf6ef\") " 
pod="openstack/ceilometer-0" Dec 03 07:01:54 crc kubenswrapper[4475]: I1203 07:01:54.329620 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/1741312d-351a-4ccd-94ef-54f59b3cf6ef-log-httpd\") pod \"ceilometer-0\" (UID: \"1741312d-351a-4ccd-94ef-54f59b3cf6ef\") " pod="openstack/ceilometer-0" Dec 03 07:01:54 crc kubenswrapper[4475]: I1203 07:01:54.329711 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-97gpz\" (UniqueName: \"kubernetes.io/projected/1741312d-351a-4ccd-94ef-54f59b3cf6ef-kube-api-access-97gpz\") pod \"ceilometer-0\" (UID: \"1741312d-351a-4ccd-94ef-54f59b3cf6ef\") " pod="openstack/ceilometer-0" Dec 03 07:01:54 crc kubenswrapper[4475]: I1203 07:01:54.329831 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1741312d-351a-4ccd-94ef-54f59b3cf6ef-scripts\") pod \"ceilometer-0\" (UID: \"1741312d-351a-4ccd-94ef-54f59b3cf6ef\") " pod="openstack/ceilometer-0" Dec 03 07:01:54 crc kubenswrapper[4475]: I1203 07:01:54.431389 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/1741312d-351a-4ccd-94ef-54f59b3cf6ef-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"1741312d-351a-4ccd-94ef-54f59b3cf6ef\") " pod="openstack/ceilometer-0" Dec 03 07:01:54 crc kubenswrapper[4475]: I1203 07:01:54.431471 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/1741312d-351a-4ccd-94ef-54f59b3cf6ef-log-httpd\") pod \"ceilometer-0\" (UID: \"1741312d-351a-4ccd-94ef-54f59b3cf6ef\") " pod="openstack/ceilometer-0" Dec 03 07:01:54 crc kubenswrapper[4475]: I1203 07:01:54.431501 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access-97gpz\" (UniqueName: \"kubernetes.io/projected/1741312d-351a-4ccd-94ef-54f59b3cf6ef-kube-api-access-97gpz\") pod \"ceilometer-0\" (UID: \"1741312d-351a-4ccd-94ef-54f59b3cf6ef\") " pod="openstack/ceilometer-0" Dec 03 07:01:54 crc kubenswrapper[4475]: I1203 07:01:54.431538 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1741312d-351a-4ccd-94ef-54f59b3cf6ef-scripts\") pod \"ceilometer-0\" (UID: \"1741312d-351a-4ccd-94ef-54f59b3cf6ef\") " pod="openstack/ceilometer-0" Dec 03 07:01:54 crc kubenswrapper[4475]: I1203 07:01:54.431553 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/1741312d-351a-4ccd-94ef-54f59b3cf6ef-run-httpd\") pod \"ceilometer-0\" (UID: \"1741312d-351a-4ccd-94ef-54f59b3cf6ef\") " pod="openstack/ceilometer-0" Dec 03 07:01:54 crc kubenswrapper[4475]: I1203 07:01:54.431600 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1741312d-351a-4ccd-94ef-54f59b3cf6ef-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"1741312d-351a-4ccd-94ef-54f59b3cf6ef\") " pod="openstack/ceilometer-0" Dec 03 07:01:54 crc kubenswrapper[4475]: I1203 07:01:54.431637 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1741312d-351a-4ccd-94ef-54f59b3cf6ef-config-data\") pod \"ceilometer-0\" (UID: \"1741312d-351a-4ccd-94ef-54f59b3cf6ef\") " pod="openstack/ceilometer-0" Dec 03 07:01:54 crc kubenswrapper[4475]: I1203 07:01:54.432161 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/1741312d-351a-4ccd-94ef-54f59b3cf6ef-log-httpd\") pod \"ceilometer-0\" (UID: \"1741312d-351a-4ccd-94ef-54f59b3cf6ef\") " pod="openstack/ceilometer-0" Dec 03 07:01:54 crc 
kubenswrapper[4475]: I1203 07:01:54.432270 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/1741312d-351a-4ccd-94ef-54f59b3cf6ef-run-httpd\") pod \"ceilometer-0\" (UID: \"1741312d-351a-4ccd-94ef-54f59b3cf6ef\") " pod="openstack/ceilometer-0" Dec 03 07:01:54 crc kubenswrapper[4475]: I1203 07:01:54.435407 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1741312d-351a-4ccd-94ef-54f59b3cf6ef-scripts\") pod \"ceilometer-0\" (UID: \"1741312d-351a-4ccd-94ef-54f59b3cf6ef\") " pod="openstack/ceilometer-0" Dec 03 07:01:54 crc kubenswrapper[4475]: I1203 07:01:54.437930 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1741312d-351a-4ccd-94ef-54f59b3cf6ef-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"1741312d-351a-4ccd-94ef-54f59b3cf6ef\") " pod="openstack/ceilometer-0" Dec 03 07:01:54 crc kubenswrapper[4475]: I1203 07:01:54.448375 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1741312d-351a-4ccd-94ef-54f59b3cf6ef-config-data\") pod \"ceilometer-0\" (UID: \"1741312d-351a-4ccd-94ef-54f59b3cf6ef\") " pod="openstack/ceilometer-0" Dec 03 07:01:54 crc kubenswrapper[4475]: I1203 07:01:54.448823 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-97gpz\" (UniqueName: \"kubernetes.io/projected/1741312d-351a-4ccd-94ef-54f59b3cf6ef-kube-api-access-97gpz\") pod \"ceilometer-0\" (UID: \"1741312d-351a-4ccd-94ef-54f59b3cf6ef\") " pod="openstack/ceilometer-0" Dec 03 07:01:54 crc kubenswrapper[4475]: I1203 07:01:54.452025 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/1741312d-351a-4ccd-94ef-54f59b3cf6ef-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: 
\"1741312d-351a-4ccd-94ef-54f59b3cf6ef\") " pod="openstack/ceilometer-0" Dec 03 07:01:54 crc kubenswrapper[4475]: I1203 07:01:54.545876 4475 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Dec 03 07:01:54 crc kubenswrapper[4475]: I1203 07:01:54.871718 4475 generic.go:334] "Generic (PLEG): container finished" podID="2604ecb1-3054-4b63-9a8d-3880ee519c58" containerID="4afaec11a8cc3ea92bfd0c41412a193645e87900cd38f199ef4aaef2eb86fa3c" exitCode=143 Dec 03 07:01:54 crc kubenswrapper[4475]: I1203 07:01:54.871804 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"2604ecb1-3054-4b63-9a8d-3880ee519c58","Type":"ContainerDied","Data":"4afaec11a8cc3ea92bfd0c41412a193645e87900cd38f199ef4aaef2eb86fa3c"} Dec 03 07:01:54 crc kubenswrapper[4475]: I1203 07:01:54.997991 4475 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Dec 03 07:01:55 crc kubenswrapper[4475]: I1203 07:01:55.291526 4475 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Dec 03 07:01:55 crc kubenswrapper[4475]: I1203 07:01:55.504845 4475 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b39bd21a-0e74-4e03-8c62-214faa26b85a" path="/var/lib/kubelet/pods/b39bd21a-0e74-4e03-8c62-214faa26b85a/volumes" Dec 03 07:01:55 crc kubenswrapper[4475]: I1203 07:01:55.885423 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"1741312d-351a-4ccd-94ef-54f59b3cf6ef","Type":"ContainerStarted","Data":"8714fea690c3fd485422856e6da4f4067369bcb9bff0b561ff003ebb9fa6b7cd"} Dec 03 07:01:55 crc kubenswrapper[4475]: I1203 07:01:55.885755 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"1741312d-351a-4ccd-94ef-54f59b3cf6ef","Type":"ContainerStarted","Data":"e0d9a053e4a3a8b7e04fcc752f2fb4d1824140fe6d383ecdbb4ab5858a63cf26"} Dec 03 07:01:56 crc kubenswrapper[4475]: 
I1203 07:01:56.719908 4475 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Dec 03 07:01:56 crc kubenswrapper[4475]: I1203 07:01:56.801015 4475 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5d024ccd-9b9c-4656-a6c9-88c6d524960c-config-data\") pod \"5d024ccd-9b9c-4656-a6c9-88c6d524960c\" (UID: \"5d024ccd-9b9c-4656-a6c9-88c6d524960c\") " Dec 03 07:01:56 crc kubenswrapper[4475]: I1203 07:01:56.801093 4475 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/5d024ccd-9b9c-4656-a6c9-88c6d524960c-httpd-run\") pod \"5d024ccd-9b9c-4656-a6c9-88c6d524960c\" (UID: \"5d024ccd-9b9c-4656-a6c9-88c6d524960c\") " Dec 03 07:01:56 crc kubenswrapper[4475]: I1203 07:01:56.801173 4475 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/5d024ccd-9b9c-4656-a6c9-88c6d524960c-public-tls-certs\") pod \"5d024ccd-9b9c-4656-a6c9-88c6d524960c\" (UID: \"5d024ccd-9b9c-4656-a6c9-88c6d524960c\") " Dec 03 07:01:56 crc kubenswrapper[4475]: I1203 07:01:56.801338 4475 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5d024ccd-9b9c-4656-a6c9-88c6d524960c-combined-ca-bundle\") pod \"5d024ccd-9b9c-4656-a6c9-88c6d524960c\" (UID: \"5d024ccd-9b9c-4656-a6c9-88c6d524960c\") " Dec 03 07:01:56 crc kubenswrapper[4475]: I1203 07:01:56.801384 4475 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5d024ccd-9b9c-4656-a6c9-88c6d524960c-logs\") pod \"5d024ccd-9b9c-4656-a6c9-88c6d524960c\" (UID: \"5d024ccd-9b9c-4656-a6c9-88c6d524960c\") " Dec 03 07:01:56 crc kubenswrapper[4475]: I1203 07:01:56.801429 4475 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5d024ccd-9b9c-4656-a6c9-88c6d524960c-scripts\") pod \"5d024ccd-9b9c-4656-a6c9-88c6d524960c\" (UID: \"5d024ccd-9b9c-4656-a6c9-88c6d524960c\") " Dec 03 07:01:56 crc kubenswrapper[4475]: I1203 07:01:56.801445 4475 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"5d024ccd-9b9c-4656-a6c9-88c6d524960c\" (UID: \"5d024ccd-9b9c-4656-a6c9-88c6d524960c\") " Dec 03 07:01:56 crc kubenswrapper[4475]: I1203 07:01:56.801500 4475 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nt4vd\" (UniqueName: \"kubernetes.io/projected/5d024ccd-9b9c-4656-a6c9-88c6d524960c-kube-api-access-nt4vd\") pod \"5d024ccd-9b9c-4656-a6c9-88c6d524960c\" (UID: \"5d024ccd-9b9c-4656-a6c9-88c6d524960c\") " Dec 03 07:01:56 crc kubenswrapper[4475]: I1203 07:01:56.807617 4475 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5d024ccd-9b9c-4656-a6c9-88c6d524960c-kube-api-access-nt4vd" (OuterVolumeSpecName: "kube-api-access-nt4vd") pod "5d024ccd-9b9c-4656-a6c9-88c6d524960c" (UID: "5d024ccd-9b9c-4656-a6c9-88c6d524960c"). InnerVolumeSpecName "kube-api-access-nt4vd". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 07:01:56 crc kubenswrapper[4475]: I1203 07:01:56.807912 4475 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5d024ccd-9b9c-4656-a6c9-88c6d524960c-logs" (OuterVolumeSpecName: "logs") pod "5d024ccd-9b9c-4656-a6c9-88c6d524960c" (UID: "5d024ccd-9b9c-4656-a6c9-88c6d524960c"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 07:01:56 crc kubenswrapper[4475]: I1203 07:01:56.810696 4475 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5d024ccd-9b9c-4656-a6c9-88c6d524960c-scripts" (OuterVolumeSpecName: "scripts") pod "5d024ccd-9b9c-4656-a6c9-88c6d524960c" (UID: "5d024ccd-9b9c-4656-a6c9-88c6d524960c"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 07:01:56 crc kubenswrapper[4475]: I1203 07:01:56.812117 4475 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5d024ccd-9b9c-4656-a6c9-88c6d524960c-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "5d024ccd-9b9c-4656-a6c9-88c6d524960c" (UID: "5d024ccd-9b9c-4656-a6c9-88c6d524960c"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 07:01:56 crc kubenswrapper[4475]: I1203 07:01:56.813376 4475 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage05-crc" (OuterVolumeSpecName: "glance") pod "5d024ccd-9b9c-4656-a6c9-88c6d524960c" (UID: "5d024ccd-9b9c-4656-a6c9-88c6d524960c"). InnerVolumeSpecName "local-storage05-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Dec 03 07:01:56 crc kubenswrapper[4475]: I1203 07:01:56.829311 4475 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5d024ccd-9b9c-4656-a6c9-88c6d524960c-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "5d024ccd-9b9c-4656-a6c9-88c6d524960c" (UID: "5d024ccd-9b9c-4656-a6c9-88c6d524960c"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 07:01:56 crc kubenswrapper[4475]: I1203 07:01:56.875167 4475 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5d024ccd-9b9c-4656-a6c9-88c6d524960c-config-data" (OuterVolumeSpecName: "config-data") pod "5d024ccd-9b9c-4656-a6c9-88c6d524960c" (UID: "5d024ccd-9b9c-4656-a6c9-88c6d524960c"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 07:01:56 crc kubenswrapper[4475]: I1203 07:01:56.886575 4475 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5d024ccd-9b9c-4656-a6c9-88c6d524960c-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "5d024ccd-9b9c-4656-a6c9-88c6d524960c" (UID: "5d024ccd-9b9c-4656-a6c9-88c6d524960c"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 07:01:56 crc kubenswrapper[4475]: I1203 07:01:56.901874 4475 generic.go:334] "Generic (PLEG): container finished" podID="5d024ccd-9b9c-4656-a6c9-88c6d524960c" containerID="0bb900b63b700af9cf8b01415f76d8b98b1d3870a1c25b553f5514be65ff0184" exitCode=0 Dec 03 07:01:56 crc kubenswrapper[4475]: I1203 07:01:56.901930 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"5d024ccd-9b9c-4656-a6c9-88c6d524960c","Type":"ContainerDied","Data":"0bb900b63b700af9cf8b01415f76d8b98b1d3870a1c25b553f5514be65ff0184"} Dec 03 07:01:56 crc kubenswrapper[4475]: I1203 07:01:56.901956 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"5d024ccd-9b9c-4656-a6c9-88c6d524960c","Type":"ContainerDied","Data":"e9acde908258dc1084654ed92bb44f89acd97a97f817a787c21b6bab6e21d48e"} Dec 03 07:01:56 crc kubenswrapper[4475]: I1203 07:01:56.901971 4475 scope.go:117] "RemoveContainer" containerID="0bb900b63b700af9cf8b01415f76d8b98b1d3870a1c25b553f5514be65ff0184" Dec 03 07:01:56 
crc kubenswrapper[4475]: I1203 07:01:56.902091 4475 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0"
Dec 03 07:01:56 crc kubenswrapper[4475]: I1203 07:01:56.906169 4475 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5d024ccd-9b9c-4656-a6c9-88c6d524960c-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Dec 03 07:01:56 crc kubenswrapper[4475]: I1203 07:01:56.906194 4475 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5d024ccd-9b9c-4656-a6c9-88c6d524960c-logs\") on node \"crc\" DevicePath \"\""
Dec 03 07:01:56 crc kubenswrapper[4475]: I1203 07:01:56.906204 4475 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5d024ccd-9b9c-4656-a6c9-88c6d524960c-scripts\") on node \"crc\" DevicePath \"\""
Dec 03 07:01:56 crc kubenswrapper[4475]: I1203 07:01:56.906226 4475 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") on node \"crc\" "
Dec 03 07:01:56 crc kubenswrapper[4475]: I1203 07:01:56.906259 4475 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nt4vd\" (UniqueName: \"kubernetes.io/projected/5d024ccd-9b9c-4656-a6c9-88c6d524960c-kube-api-access-nt4vd\") on node \"crc\" DevicePath \"\""
Dec 03 07:01:56 crc kubenswrapper[4475]: I1203 07:01:56.906270 4475 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5d024ccd-9b9c-4656-a6c9-88c6d524960c-config-data\") on node \"crc\" DevicePath \"\""
Dec 03 07:01:56 crc kubenswrapper[4475]: I1203 07:01:56.906277 4475 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/5d024ccd-9b9c-4656-a6c9-88c6d524960c-httpd-run\") on node \"crc\" DevicePath \"\""
Dec 03 07:01:56 crc kubenswrapper[4475]: I1203 07:01:56.906285 4475 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/5d024ccd-9b9c-4656-a6c9-88c6d524960c-public-tls-certs\") on node \"crc\" DevicePath \"\""
Dec 03 07:01:56 crc kubenswrapper[4475]: I1203 07:01:56.907907 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"1741312d-351a-4ccd-94ef-54f59b3cf6ef","Type":"ContainerStarted","Data":"fcfd72daaa97f09114b59b182905505ba61487e827dece3a8af63f5cdb5fd924"}
Dec 03 07:01:56 crc kubenswrapper[4475]: I1203 07:01:56.931589 4475 scope.go:117] "RemoveContainer" containerID="0bbd05fc0dd7e7111a65c6f7a285b3b2875035ee605e1fa046bf17584914bc7c"
Dec 03 07:01:56 crc kubenswrapper[4475]: I1203 07:01:56.952554 4475 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"]
Dec 03 07:01:56 crc kubenswrapper[4475]: I1203 07:01:56.956890 4475 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage05-crc" (UniqueName: "kubernetes.io/local-volume/local-storage05-crc") on node "crc"
Dec 03 07:01:56 crc kubenswrapper[4475]: I1203 07:01:56.962096 4475 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-external-api-0"]
Dec 03 07:01:56 crc kubenswrapper[4475]: I1203 07:01:56.978687 4475 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-external-api-0"]
Dec 03 07:01:56 crc kubenswrapper[4475]: E1203 07:01:56.979008 4475 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5d024ccd-9b9c-4656-a6c9-88c6d524960c" containerName="glance-httpd"
Dec 03 07:01:56 crc kubenswrapper[4475]: I1203 07:01:56.979025 4475 state_mem.go:107] "Deleted CPUSet assignment" podUID="5d024ccd-9b9c-4656-a6c9-88c6d524960c" containerName="glance-httpd"
Dec 03 07:01:56 crc kubenswrapper[4475]: E1203 07:01:56.979070 4475 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5d024ccd-9b9c-4656-a6c9-88c6d524960c" containerName="glance-log"
Dec 03 07:01:56 crc kubenswrapper[4475]: I1203 07:01:56.979076 4475 state_mem.go:107] "Deleted CPUSet assignment" podUID="5d024ccd-9b9c-4656-a6c9-88c6d524960c" containerName="glance-log"
Dec 03 07:01:56 crc kubenswrapper[4475]: I1203 07:01:56.979240 4475 memory_manager.go:354] "RemoveStaleState removing state" podUID="5d024ccd-9b9c-4656-a6c9-88c6d524960c" containerName="glance-log"
Dec 03 07:01:56 crc kubenswrapper[4475]: I1203 07:01:56.979264 4475 memory_manager.go:354] "RemoveStaleState removing state" podUID="5d024ccd-9b9c-4656-a6c9-88c6d524960c" containerName="glance-httpd"
Dec 03 07:01:56 crc kubenswrapper[4475]: I1203 07:01:56.985137 4475 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0"
Dec 03 07:01:56 crc kubenswrapper[4475]: I1203 07:01:56.985990 4475 scope.go:117] "RemoveContainer" containerID="0bb900b63b700af9cf8b01415f76d8b98b1d3870a1c25b553f5514be65ff0184"
Dec 03 07:01:56 crc kubenswrapper[4475]: E1203 07:01:56.987272 4475 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0bb900b63b700af9cf8b01415f76d8b98b1d3870a1c25b553f5514be65ff0184\": container with ID starting with 0bb900b63b700af9cf8b01415f76d8b98b1d3870a1c25b553f5514be65ff0184 not found: ID does not exist" containerID="0bb900b63b700af9cf8b01415f76d8b98b1d3870a1c25b553f5514be65ff0184"
Dec 03 07:01:56 crc kubenswrapper[4475]: I1203 07:01:56.987331 4475 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0bb900b63b700af9cf8b01415f76d8b98b1d3870a1c25b553f5514be65ff0184"} err="failed to get container status \"0bb900b63b700af9cf8b01415f76d8b98b1d3870a1c25b553f5514be65ff0184\": rpc error: code = NotFound desc = could not find container \"0bb900b63b700af9cf8b01415f76d8b98b1d3870a1c25b553f5514be65ff0184\": container with ID starting with 0bb900b63b700af9cf8b01415f76d8b98b1d3870a1c25b553f5514be65ff0184 not found: ID does not exist"
Dec 03 07:01:56 crc kubenswrapper[4475]: I1203 07:01:56.987357 4475 scope.go:117] "RemoveContainer" containerID="0bbd05fc0dd7e7111a65c6f7a285b3b2875035ee605e1fa046bf17584914bc7c"
Dec 03 07:01:56 crc kubenswrapper[4475]: E1203 07:01:56.989216 4475 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0bbd05fc0dd7e7111a65c6f7a285b3b2875035ee605e1fa046bf17584914bc7c\": container with ID starting with 0bbd05fc0dd7e7111a65c6f7a285b3b2875035ee605e1fa046bf17584914bc7c not found: ID does not exist" containerID="0bbd05fc0dd7e7111a65c6f7a285b3b2875035ee605e1fa046bf17584914bc7c"
Dec 03 07:01:56 crc kubenswrapper[4475]: I1203 07:01:56.989248 4475 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0bbd05fc0dd7e7111a65c6f7a285b3b2875035ee605e1fa046bf17584914bc7c"} err="failed to get container status \"0bbd05fc0dd7e7111a65c6f7a285b3b2875035ee605e1fa046bf17584914bc7c\": rpc error: code = NotFound desc = could not find container \"0bbd05fc0dd7e7111a65c6f7a285b3b2875035ee605e1fa046bf17584914bc7c\": container with ID starting with 0bbd05fc0dd7e7111a65c6f7a285b3b2875035ee605e1fa046bf17584914bc7c not found: ID does not exist"
Dec 03 07:01:56 crc kubenswrapper[4475]: I1203 07:01:56.989489 4475 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-external-config-data"
Dec 03 07:01:56 crc kubenswrapper[4475]: I1203 07:01:56.989633 4475 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-public-svc"
Dec 03 07:01:57 crc kubenswrapper[4475]: I1203 07:01:57.008714 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"glance-default-external-api-0\" (UID: \"15c86d99-b017-4383-a20d-183a68ea3feb\") " pod="openstack/glance-default-external-api-0"
Dec 03 07:01:57 crc kubenswrapper[4475]: I1203 07:01:57.008785 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/15c86d99-b017-4383-a20d-183a68ea3feb-scripts\") pod \"glance-default-external-api-0\" (UID: \"15c86d99-b017-4383-a20d-183a68ea3feb\") " pod="openstack/glance-default-external-api-0"
Dec 03 07:01:57 crc kubenswrapper[4475]: I1203 07:01:57.008877 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/15c86d99-b017-4383-a20d-183a68ea3feb-config-data\") pod \"glance-default-external-api-0\" (UID: \"15c86d99-b017-4383-a20d-183a68ea3feb\") " pod="openstack/glance-default-external-api-0"
Dec 03 07:01:57 crc kubenswrapper[4475]: I1203 07:01:57.008953 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zkffl\" (UniqueName: \"kubernetes.io/projected/15c86d99-b017-4383-a20d-183a68ea3feb-kube-api-access-zkffl\") pod \"glance-default-external-api-0\" (UID: \"15c86d99-b017-4383-a20d-183a68ea3feb\") " pod="openstack/glance-default-external-api-0"
Dec 03 07:01:57 crc kubenswrapper[4475]: I1203 07:01:57.009005 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/15c86d99-b017-4383-a20d-183a68ea3feb-logs\") pod \"glance-default-external-api-0\" (UID: \"15c86d99-b017-4383-a20d-183a68ea3feb\") " pod="openstack/glance-default-external-api-0"
Dec 03 07:01:57 crc kubenswrapper[4475]: I1203 07:01:57.009045 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/15c86d99-b017-4383-a20d-183a68ea3feb-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"15c86d99-b017-4383-a20d-183a68ea3feb\") " pod="openstack/glance-default-external-api-0"
Dec 03 07:01:57 crc kubenswrapper[4475]: I1203 07:01:57.009111 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/15c86d99-b017-4383-a20d-183a68ea3feb-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"15c86d99-b017-4383-a20d-183a68ea3feb\") " pod="openstack/glance-default-external-api-0"
Dec 03 07:01:57 crc kubenswrapper[4475]: I1203 07:01:57.009159 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/15c86d99-b017-4383-a20d-183a68ea3feb-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"15c86d99-b017-4383-a20d-183a68ea3feb\") " pod="openstack/glance-default-external-api-0"
Dec 03 07:01:57 crc kubenswrapper[4475]: I1203 07:01:57.009512 4475 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"glance-default-external-api-0\" (UID: \"15c86d99-b017-4383-a20d-183a68ea3feb\") device mount path \"/mnt/openstack/pv05\"" pod="openstack/glance-default-external-api-0"
Dec 03 07:01:57 crc kubenswrapper[4475]: I1203 07:01:57.073075 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"glance-default-external-api-0\" (UID: \"15c86d99-b017-4383-a20d-183a68ea3feb\") " pod="openstack/glance-default-external-api-0"
Dec 03 07:01:57 crc kubenswrapper[4475]: I1203 07:01:57.078317 4475 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"]
Dec 03 07:01:57 crc kubenswrapper[4475]: I1203 07:01:57.110329 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/15c86d99-b017-4383-a20d-183a68ea3feb-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"15c86d99-b017-4383-a20d-183a68ea3feb\") " pod="openstack/glance-default-external-api-0"
Dec 03 07:01:57 crc kubenswrapper[4475]: I1203 07:01:57.110394 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/15c86d99-b017-4383-a20d-183a68ea3feb-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"15c86d99-b017-4383-a20d-183a68ea3feb\") " pod="openstack/glance-default-external-api-0"
Dec 03 07:01:57 crc kubenswrapper[4475]: I1203 07:01:57.110442 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/15c86d99-b017-4383-a20d-183a68ea3feb-scripts\") pod \"glance-default-external-api-0\" (UID: \"15c86d99-b017-4383-a20d-183a68ea3feb\") " pod="openstack/glance-default-external-api-0"
Dec 03 07:01:57 crc kubenswrapper[4475]: I1203 07:01:57.110529 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/15c86d99-b017-4383-a20d-183a68ea3feb-config-data\") pod \"glance-default-external-api-0\" (UID: \"15c86d99-b017-4383-a20d-183a68ea3feb\") " pod="openstack/glance-default-external-api-0"
Dec 03 07:01:57 crc kubenswrapper[4475]: I1203 07:01:57.110590 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zkffl\" (UniqueName: \"kubernetes.io/projected/15c86d99-b017-4383-a20d-183a68ea3feb-kube-api-access-zkffl\") pod \"glance-default-external-api-0\" (UID: \"15c86d99-b017-4383-a20d-183a68ea3feb\") " pod="openstack/glance-default-external-api-0"
Dec 03 07:01:57 crc kubenswrapper[4475]: I1203 07:01:57.110626 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/15c86d99-b017-4383-a20d-183a68ea3feb-logs\") pod \"glance-default-external-api-0\" (UID: \"15c86d99-b017-4383-a20d-183a68ea3feb\") " pod="openstack/glance-default-external-api-0"
Dec 03 07:01:57 crc kubenswrapper[4475]: I1203 07:01:57.110671 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/15c86d99-b017-4383-a20d-183a68ea3feb-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"15c86d99-b017-4383-a20d-183a68ea3feb\") " pod="openstack/glance-default-external-api-0"
Dec 03 07:01:57 crc kubenswrapper[4475]: I1203 07:01:57.111394 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/15c86d99-b017-4383-a20d-183a68ea3feb-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"15c86d99-b017-4383-a20d-183a68ea3feb\") " pod="openstack/glance-default-external-api-0"
Dec 03 07:01:57 crc kubenswrapper[4475]: I1203 07:01:57.114239 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/15c86d99-b017-4383-a20d-183a68ea3feb-logs\") pod \"glance-default-external-api-0\" (UID: \"15c86d99-b017-4383-a20d-183a68ea3feb\") " pod="openstack/glance-default-external-api-0"
Dec 03 07:01:57 crc kubenswrapper[4475]: I1203 07:01:57.120495 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/15c86d99-b017-4383-a20d-183a68ea3feb-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"15c86d99-b017-4383-a20d-183a68ea3feb\") " pod="openstack/glance-default-external-api-0"
Dec 03 07:01:57 crc kubenswrapper[4475]: I1203 07:01:57.121898 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/15c86d99-b017-4383-a20d-183a68ea3feb-scripts\") pod \"glance-default-external-api-0\" (UID: \"15c86d99-b017-4383-a20d-183a68ea3feb\") " pod="openstack/glance-default-external-api-0"
Dec 03 07:01:57 crc kubenswrapper[4475]: I1203 07:01:57.121998 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/15c86d99-b017-4383-a20d-183a68ea3feb-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"15c86d99-b017-4383-a20d-183a68ea3feb\") " pod="openstack/glance-default-external-api-0"
Dec 03 07:01:57 crc kubenswrapper[4475]: I1203 07:01:57.122863 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/15c86d99-b017-4383-a20d-183a68ea3feb-config-data\") pod \"glance-default-external-api-0\" (UID: \"15c86d99-b017-4383-a20d-183a68ea3feb\") " pod="openstack/glance-default-external-api-0"
Dec 03 07:01:57 crc kubenswrapper[4475]: I1203 07:01:57.126810 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zkffl\" (UniqueName: \"kubernetes.io/projected/15c86d99-b017-4383-a20d-183a68ea3feb-kube-api-access-zkffl\") pod \"glance-default-external-api-0\" (UID: \"15c86d99-b017-4383-a20d-183a68ea3feb\") " pod="openstack/glance-default-external-api-0"
Dec 03 07:01:57 crc kubenswrapper[4475]: I1203 07:01:57.149467 4475 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/glance-default-internal-api-0" podUID="2604ecb1-3054-4b63-9a8d-3880ee519c58" containerName="glance-log" probeResult="failure" output="Get \"https://10.217.0.149:9292/healthcheck\": read tcp 10.217.0.2:54400->10.217.0.149:9292: read: connection reset by peer"
Dec 03 07:01:57 crc kubenswrapper[4475]: I1203 07:01:57.149772 4475 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/glance-default-internal-api-0" podUID="2604ecb1-3054-4b63-9a8d-3880ee519c58" containerName="glance-httpd" probeResult="failure" output="Get \"https://10.217.0.149:9292/healthcheck\": read tcp 10.217.0.2:54390->10.217.0.149:9292: read: connection reset by peer"
Dec 03 07:01:57 crc kubenswrapper[4475]: I1203 07:01:57.300487 4475 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0"
Dec 03 07:01:57 crc kubenswrapper[4475]: I1203 07:01:57.559174 4475 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5d024ccd-9b9c-4656-a6c9-88c6d524960c" path="/var/lib/kubelet/pods/5d024ccd-9b9c-4656-a6c9-88c6d524960c/volumes"
Dec 03 07:01:57 crc kubenswrapper[4475]: I1203 07:01:57.732204 4475 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0"
Dec 03 07:01:57 crc kubenswrapper[4475]: I1203 07:01:57.919560 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"1741312d-351a-4ccd-94ef-54f59b3cf6ef","Type":"ContainerStarted","Data":"8c5573d2a2f0f3b0a5dba3ae1c3fdff779ea64e34d75af95028297dfe5c64385"}
Dec 03 07:01:57 crc kubenswrapper[4475]: I1203 07:01:57.935957 4475 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2604ecb1-3054-4b63-9a8d-3880ee519c58-combined-ca-bundle\") pod \"2604ecb1-3054-4b63-9a8d-3880ee519c58\" (UID: \"2604ecb1-3054-4b63-9a8d-3880ee519c58\") "
Dec 03 07:01:57 crc kubenswrapper[4475]: I1203 07:01:57.935986 4475 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2604ecb1-3054-4b63-9a8d-3880ee519c58-scripts\") pod \"2604ecb1-3054-4b63-9a8d-3880ee519c58\" (UID: \"2604ecb1-3054-4b63-9a8d-3880ee519c58\") "
Dec 03 07:01:57 crc kubenswrapper[4475]: I1203 07:01:57.936016 4475 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2604ecb1-3054-4b63-9a8d-3880ee519c58-config-data\") pod \"2604ecb1-3054-4b63-9a8d-3880ee519c58\" (UID: \"2604ecb1-3054-4b63-9a8d-3880ee519c58\") "
Dec 03 07:01:57 crc kubenswrapper[4475]: I1203 07:01:57.936079 4475 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2604ecb1-3054-4b63-9a8d-3880ee519c58-logs\") pod \"2604ecb1-3054-4b63-9a8d-3880ee519c58\" (UID: \"2604ecb1-3054-4b63-9a8d-3880ee519c58\") "
Dec 03 07:01:57 crc kubenswrapper[4475]: I1203 07:01:57.936098 4475 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"2604ecb1-3054-4b63-9a8d-3880ee519c58\" (UID: \"2604ecb1-3054-4b63-9a8d-3880ee519c58\") "
Dec 03 07:01:57 crc kubenswrapper[4475]: I1203 07:01:57.936166 4475 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/2604ecb1-3054-4b63-9a8d-3880ee519c58-internal-tls-certs\") pod \"2604ecb1-3054-4b63-9a8d-3880ee519c58\" (UID: \"2604ecb1-3054-4b63-9a8d-3880ee519c58\") "
Dec 03 07:01:57 crc kubenswrapper[4475]: I1203 07:01:57.936249 4475 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nxnd5\" (UniqueName: \"kubernetes.io/projected/2604ecb1-3054-4b63-9a8d-3880ee519c58-kube-api-access-nxnd5\") pod \"2604ecb1-3054-4b63-9a8d-3880ee519c58\" (UID: \"2604ecb1-3054-4b63-9a8d-3880ee519c58\") "
Dec 03 07:01:57 crc kubenswrapper[4475]: I1203 07:01:57.936293 4475 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/2604ecb1-3054-4b63-9a8d-3880ee519c58-httpd-run\") pod \"2604ecb1-3054-4b63-9a8d-3880ee519c58\" (UID: \"2604ecb1-3054-4b63-9a8d-3880ee519c58\") "
Dec 03 07:01:57 crc kubenswrapper[4475]: I1203 07:01:57.936896 4475 generic.go:334] "Generic (PLEG): container finished" podID="2604ecb1-3054-4b63-9a8d-3880ee519c58" containerID="9c1311c38911be15a3438d7290ad0bb61f6702e1c24505d9b9e9b114004963cd" exitCode=0
Dec 03 07:01:57 crc kubenswrapper[4475]: I1203 07:01:57.936933 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"2604ecb1-3054-4b63-9a8d-3880ee519c58","Type":"ContainerDied","Data":"9c1311c38911be15a3438d7290ad0bb61f6702e1c24505d9b9e9b114004963cd"}
Dec 03 07:01:57 crc kubenswrapper[4475]: I1203 07:01:57.936988 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"2604ecb1-3054-4b63-9a8d-3880ee519c58","Type":"ContainerDied","Data":"fdf05e9ea21bd26a7cc37d8980e2c33353d700d2d33bfde896506aa85d2d16c3"}
Dec 03 07:01:57 crc kubenswrapper[4475]: I1203 07:01:57.937006 4475 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2604ecb1-3054-4b63-9a8d-3880ee519c58-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "2604ecb1-3054-4b63-9a8d-3880ee519c58" (UID: "2604ecb1-3054-4b63-9a8d-3880ee519c58"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Dec 03 07:01:57 crc kubenswrapper[4475]: I1203 07:01:57.937025 4475 scope.go:117] "RemoveContainer" containerID="9c1311c38911be15a3438d7290ad0bb61f6702e1c24505d9b9e9b114004963cd"
Dec 03 07:01:57 crc kubenswrapper[4475]: I1203 07:01:57.937151 4475 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0"
Dec 03 07:01:57 crc kubenswrapper[4475]: I1203 07:01:57.937906 4475 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2604ecb1-3054-4b63-9a8d-3880ee519c58-logs" (OuterVolumeSpecName: "logs") pod "2604ecb1-3054-4b63-9a8d-3880ee519c58" (UID: "2604ecb1-3054-4b63-9a8d-3880ee519c58"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Dec 03 07:01:57 crc kubenswrapper[4475]: I1203 07:01:57.949165 4475 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage09-crc" (OuterVolumeSpecName: "glance") pod "2604ecb1-3054-4b63-9a8d-3880ee519c58" (UID: "2604ecb1-3054-4b63-9a8d-3880ee519c58"). InnerVolumeSpecName "local-storage09-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue ""
Dec 03 07:01:57 crc kubenswrapper[4475]: I1203 07:01:57.954046 4475 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2604ecb1-3054-4b63-9a8d-3880ee519c58-scripts" (OuterVolumeSpecName: "scripts") pod "2604ecb1-3054-4b63-9a8d-3880ee519c58" (UID: "2604ecb1-3054-4b63-9a8d-3880ee519c58"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 03 07:01:57 crc kubenswrapper[4475]: I1203 07:01:57.954401 4475 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2604ecb1-3054-4b63-9a8d-3880ee519c58-kube-api-access-nxnd5" (OuterVolumeSpecName: "kube-api-access-nxnd5") pod "2604ecb1-3054-4b63-9a8d-3880ee519c58" (UID: "2604ecb1-3054-4b63-9a8d-3880ee519c58"). InnerVolumeSpecName "kube-api-access-nxnd5". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 03 07:01:57 crc kubenswrapper[4475]: I1203 07:01:57.979603 4475 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2604ecb1-3054-4b63-9a8d-3880ee519c58-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "2604ecb1-3054-4b63-9a8d-3880ee519c58" (UID: "2604ecb1-3054-4b63-9a8d-3880ee519c58"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 03 07:01:57 crc kubenswrapper[4475]: I1203 07:01:57.981559 4475 scope.go:117] "RemoveContainer" containerID="4afaec11a8cc3ea92bfd0c41412a193645e87900cd38f199ef4aaef2eb86fa3c"
Dec 03 07:01:58 crc kubenswrapper[4475]: I1203 07:01:58.010392 4475 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2604ecb1-3054-4b63-9a8d-3880ee519c58-config-data" (OuterVolumeSpecName: "config-data") pod "2604ecb1-3054-4b63-9a8d-3880ee519c58" (UID: "2604ecb1-3054-4b63-9a8d-3880ee519c58"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 03 07:01:58 crc kubenswrapper[4475]: I1203 07:01:58.012744 4475 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"]
Dec 03 07:01:58 crc kubenswrapper[4475]: I1203 07:01:58.013138 4475 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2604ecb1-3054-4b63-9a8d-3880ee519c58-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "2604ecb1-3054-4b63-9a8d-3880ee519c58" (UID: "2604ecb1-3054-4b63-9a8d-3880ee519c58"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 03 07:01:58 crc kubenswrapper[4475]: I1203 07:01:58.019721 4475 scope.go:117] "RemoveContainer" containerID="9c1311c38911be15a3438d7290ad0bb61f6702e1c24505d9b9e9b114004963cd"
Dec 03 07:01:58 crc kubenswrapper[4475]: E1203 07:01:58.020262 4475 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9c1311c38911be15a3438d7290ad0bb61f6702e1c24505d9b9e9b114004963cd\": container with ID starting with 9c1311c38911be15a3438d7290ad0bb61f6702e1c24505d9b9e9b114004963cd not found: ID does not exist" containerID="9c1311c38911be15a3438d7290ad0bb61f6702e1c24505d9b9e9b114004963cd"
Dec 03 07:01:58 crc kubenswrapper[4475]: I1203 07:01:58.020327 4475 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9c1311c38911be15a3438d7290ad0bb61f6702e1c24505d9b9e9b114004963cd"} err="failed to get container status \"9c1311c38911be15a3438d7290ad0bb61f6702e1c24505d9b9e9b114004963cd\": rpc error: code = NotFound desc = could not find container \"9c1311c38911be15a3438d7290ad0bb61f6702e1c24505d9b9e9b114004963cd\": container with ID starting with 9c1311c38911be15a3438d7290ad0bb61f6702e1c24505d9b9e9b114004963cd not found: ID does not exist"
Dec 03 07:01:58 crc kubenswrapper[4475]: I1203 07:01:58.020350 4475 scope.go:117] "RemoveContainer" containerID="4afaec11a8cc3ea92bfd0c41412a193645e87900cd38f199ef4aaef2eb86fa3c"
Dec 03 07:01:58 crc kubenswrapper[4475]: E1203 07:01:58.020679 4475 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4afaec11a8cc3ea92bfd0c41412a193645e87900cd38f199ef4aaef2eb86fa3c\": container with ID starting with 4afaec11a8cc3ea92bfd0c41412a193645e87900cd38f199ef4aaef2eb86fa3c not found: ID does not exist" containerID="4afaec11a8cc3ea92bfd0c41412a193645e87900cd38f199ef4aaef2eb86fa3c"
Dec 03 07:01:58 crc kubenswrapper[4475]: I1203 07:01:58.020707 4475 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4afaec11a8cc3ea92bfd0c41412a193645e87900cd38f199ef4aaef2eb86fa3c"} err="failed to get container status \"4afaec11a8cc3ea92bfd0c41412a193645e87900cd38f199ef4aaef2eb86fa3c\": rpc error: code = NotFound desc = could not find container \"4afaec11a8cc3ea92bfd0c41412a193645e87900cd38f199ef4aaef2eb86fa3c\": container with ID starting with 4afaec11a8cc3ea92bfd0c41412a193645e87900cd38f199ef4aaef2eb86fa3c not found: ID does not exist"
Dec 03 07:01:58 crc kubenswrapper[4475]: W1203 07:01:58.026010 4475 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod15c86d99_b017_4383_a20d_183a68ea3feb.slice/crio-fa07c81eb02ec9e20bd59ddda7d7f2f1741fd0b4bb8919aac22b62a33f839789 WatchSource:0}: Error finding container fa07c81eb02ec9e20bd59ddda7d7f2f1741fd0b4bb8919aac22b62a33f839789: Status 404 returned error can't find the container with id fa07c81eb02ec9e20bd59ddda7d7f2f1741fd0b4bb8919aac22b62a33f839789
Dec 03 07:01:58 crc kubenswrapper[4475]: I1203 07:01:58.038032 4475 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") on node \"crc\" "
Dec 03 07:01:58 crc kubenswrapper[4475]: I1203 07:01:58.038926 4475 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/2604ecb1-3054-4b63-9a8d-3880ee519c58-internal-tls-certs\") on node \"crc\" DevicePath \"\""
Dec 03 07:01:58 crc kubenswrapper[4475]: I1203 07:01:58.038953 4475 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nxnd5\" (UniqueName: \"kubernetes.io/projected/2604ecb1-3054-4b63-9a8d-3880ee519c58-kube-api-access-nxnd5\") on node \"crc\" DevicePath \"\""
Dec 03 07:01:58 crc kubenswrapper[4475]: I1203 07:01:58.038966 4475 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/2604ecb1-3054-4b63-9a8d-3880ee519c58-httpd-run\") on node \"crc\" DevicePath \"\""
Dec 03 07:01:58 crc kubenswrapper[4475]: I1203 07:01:58.038974 4475 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2604ecb1-3054-4b63-9a8d-3880ee519c58-scripts\") on node \"crc\" DevicePath \"\""
Dec 03 07:01:58 crc kubenswrapper[4475]: I1203 07:01:58.038984 4475 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2604ecb1-3054-4b63-9a8d-3880ee519c58-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Dec 03 07:01:58 crc kubenswrapper[4475]: I1203 07:01:58.038992 4475 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2604ecb1-3054-4b63-9a8d-3880ee519c58-config-data\") on node \"crc\" DevicePath \"\""
Dec 03 07:01:58 crc kubenswrapper[4475]: I1203 07:01:58.039000 4475 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2604ecb1-3054-4b63-9a8d-3880ee519c58-logs\") on node \"crc\" DevicePath \"\""
Dec 03 07:01:58 crc kubenswrapper[4475]: I1203 07:01:58.062809 4475 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage09-crc" (UniqueName: "kubernetes.io/local-volume/local-storage09-crc") on node "crc"
Dec 03 07:01:58 crc kubenswrapper[4475]: I1203 07:01:58.140636 4475 reconciler_common.go:293] "Volume detached for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") on node \"crc\" DevicePath \"\""
Dec 03 07:01:58 crc kubenswrapper[4475]: I1203 07:01:58.277379 4475 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"]
Dec 03 07:01:58 crc kubenswrapper[4475]: I1203 07:01:58.303878 4475 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-internal-api-0"]
Dec 03 07:01:58 crc kubenswrapper[4475]: I1203 07:01:58.303928 4475 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-internal-api-0"]
Dec 03 07:01:58 crc kubenswrapper[4475]: E1203 07:01:58.304227 4475 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2604ecb1-3054-4b63-9a8d-3880ee519c58" containerName="glance-httpd"
Dec 03 07:01:58 crc kubenswrapper[4475]: I1203 07:01:58.304244 4475 state_mem.go:107] "Deleted CPUSet assignment" podUID="2604ecb1-3054-4b63-9a8d-3880ee519c58" containerName="glance-httpd"
Dec 03 07:01:58 crc kubenswrapper[4475]: E1203 07:01:58.304270 4475 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2604ecb1-3054-4b63-9a8d-3880ee519c58" containerName="glance-log"
Dec 03 07:01:58 crc kubenswrapper[4475]: I1203 07:01:58.304276 4475 state_mem.go:107] "Deleted CPUSet assignment" podUID="2604ecb1-3054-4b63-9a8d-3880ee519c58" containerName="glance-log"
Dec 03 07:01:58 crc kubenswrapper[4475]: I1203 07:01:58.304420 4475 memory_manager.go:354] "RemoveStaleState removing state" podUID="2604ecb1-3054-4b63-9a8d-3880ee519c58" containerName="glance-log"
Dec 03 07:01:58 crc kubenswrapper[4475]: I1203 07:01:58.304464 4475 memory_manager.go:354] "RemoveStaleState removing state" podUID="2604ecb1-3054-4b63-9a8d-3880ee519c58" containerName="glance-httpd"
Dec 03 07:01:58 crc kubenswrapper[4475]: I1203 07:01:58.305261 4475 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0"
Dec 03 07:01:58 crc kubenswrapper[4475]: I1203 07:01:58.308182 4475 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-internal-svc"
Dec 03 07:01:58 crc kubenswrapper[4475]: I1203 07:01:58.308356 4475 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-internal-config-data"
Dec 03 07:01:58 crc kubenswrapper[4475]: I1203 07:01:58.323878 4475 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"]
Dec 03 07:01:58 crc kubenswrapper[4475]: I1203 07:01:58.449834 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/01d6f138-2bc3-453d-b892-4feb51001eab-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"01d6f138-2bc3-453d-b892-4feb51001eab\") " pod="openstack/glance-default-internal-api-0"
Dec 03 07:01:58 crc kubenswrapper[4475]: I1203 07:01:58.449914 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/01d6f138-2bc3-453d-b892-4feb51001eab-logs\") pod \"glance-default-internal-api-0\" (UID: \"01d6f138-2bc3-453d-b892-4feb51001eab\") " pod="openstack/glance-default-internal-api-0"
Dec 03 07:01:58 crc kubenswrapper[4475]: I1203 07:01:58.449979 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/01d6f138-2bc3-453d-b892-4feb51001eab-scripts\") pod \"glance-default-internal-api-0\" (UID: \"01d6f138-2bc3-453d-b892-4feb51001eab\") " pod="openstack/glance-default-internal-api-0"
Dec 03 07:01:58 crc kubenswrapper[4475]: I1203 07:01:58.450020 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/01d6f138-2bc3-453d-b892-4feb51001eab-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"01d6f138-2bc3-453d-b892-4feb51001eab\") " pod="openstack/glance-default-internal-api-0"
Dec 03 07:01:58 crc kubenswrapper[4475]: I1203 07:01:58.450137 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/01d6f138-2bc3-453d-b892-4feb51001eab-config-data\") pod \"glance-default-internal-api-0\" (UID: \"01d6f138-2bc3-453d-b892-4feb51001eab\") " pod="openstack/glance-default-internal-api-0"
Dec 03 07:01:58 crc kubenswrapper[4475]: I1203 07:01:58.450246 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/01d6f138-2bc3-453d-b892-4feb51001eab-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"01d6f138-2bc3-453d-b892-4feb51001eab\") " pod="openstack/glance-default-internal-api-0"
Dec 03 07:01:58 crc kubenswrapper[4475]: I1203 07:01:58.450326 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fsjk8\" (UniqueName: \"kubernetes.io/projected/01d6f138-2bc3-453d-b892-4feb51001eab-kube-api-access-fsjk8\") pod \"glance-default-internal-api-0\" (UID: \"01d6f138-2bc3-453d-b892-4feb51001eab\") " pod="openstack/glance-default-internal-api-0"
Dec 03 07:01:58 crc kubenswrapper[4475]: I1203 07:01:58.450349 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"glance-default-internal-api-0\" (UID: \"01d6f138-2bc3-453d-b892-4feb51001eab\") " pod="openstack/glance-default-internal-api-0"
Dec 03 07:01:58 crc kubenswrapper[4475]: E1203 07:01:58.475256 4475 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial
failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod2604ecb1_3054_4b63_9a8d_3880ee519c58.slice\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod2604ecb1_3054_4b63_9a8d_3880ee519c58.slice/crio-fdf05e9ea21bd26a7cc37d8980e2c33353d700d2d33bfde896506aa85d2d16c3\": RecentStats: unable to find data in memory cache]" Dec 03 07:01:58 crc kubenswrapper[4475]: I1203 07:01:58.552398 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/01d6f138-2bc3-453d-b892-4feb51001eab-config-data\") pod \"glance-default-internal-api-0\" (UID: \"01d6f138-2bc3-453d-b892-4feb51001eab\") " pod="openstack/glance-default-internal-api-0" Dec 03 07:01:58 crc kubenswrapper[4475]: I1203 07:01:58.552498 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/01d6f138-2bc3-453d-b892-4feb51001eab-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"01d6f138-2bc3-453d-b892-4feb51001eab\") " pod="openstack/glance-default-internal-api-0" Dec 03 07:01:58 crc kubenswrapper[4475]: I1203 07:01:58.552536 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fsjk8\" (UniqueName: \"kubernetes.io/projected/01d6f138-2bc3-453d-b892-4feb51001eab-kube-api-access-fsjk8\") pod \"glance-default-internal-api-0\" (UID: \"01d6f138-2bc3-453d-b892-4feb51001eab\") " pod="openstack/glance-default-internal-api-0" Dec 03 07:01:58 crc kubenswrapper[4475]: I1203 07:01:58.552557 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"glance-default-internal-api-0\" (UID: \"01d6f138-2bc3-453d-b892-4feb51001eab\") " pod="openstack/glance-default-internal-api-0" Dec 03 07:01:58 crc 
kubenswrapper[4475]: I1203 07:01:58.552600 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/01d6f138-2bc3-453d-b892-4feb51001eab-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"01d6f138-2bc3-453d-b892-4feb51001eab\") " pod="openstack/glance-default-internal-api-0" Dec 03 07:01:58 crc kubenswrapper[4475]: I1203 07:01:58.552626 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/01d6f138-2bc3-453d-b892-4feb51001eab-logs\") pod \"glance-default-internal-api-0\" (UID: \"01d6f138-2bc3-453d-b892-4feb51001eab\") " pod="openstack/glance-default-internal-api-0" Dec 03 07:01:58 crc kubenswrapper[4475]: I1203 07:01:58.552652 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/01d6f138-2bc3-453d-b892-4feb51001eab-scripts\") pod \"glance-default-internal-api-0\" (UID: \"01d6f138-2bc3-453d-b892-4feb51001eab\") " pod="openstack/glance-default-internal-api-0" Dec 03 07:01:58 crc kubenswrapper[4475]: I1203 07:01:58.552675 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/01d6f138-2bc3-453d-b892-4feb51001eab-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"01d6f138-2bc3-453d-b892-4feb51001eab\") " pod="openstack/glance-default-internal-api-0" Dec 03 07:01:58 crc kubenswrapper[4475]: I1203 07:01:58.553065 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/01d6f138-2bc3-453d-b892-4feb51001eab-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"01d6f138-2bc3-453d-b892-4feb51001eab\") " pod="openstack/glance-default-internal-api-0" Dec 03 07:01:58 crc kubenswrapper[4475]: I1203 07:01:58.553754 4475 operation_generator.go:580] "MountVolume.MountDevice 
succeeded for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"glance-default-internal-api-0\" (UID: \"01d6f138-2bc3-453d-b892-4feb51001eab\") device mount path \"/mnt/openstack/pv09\"" pod="openstack/glance-default-internal-api-0" Dec 03 07:01:58 crc kubenswrapper[4475]: I1203 07:01:58.557307 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/01d6f138-2bc3-453d-b892-4feb51001eab-logs\") pod \"glance-default-internal-api-0\" (UID: \"01d6f138-2bc3-453d-b892-4feb51001eab\") " pod="openstack/glance-default-internal-api-0" Dec 03 07:01:58 crc kubenswrapper[4475]: I1203 07:01:58.564374 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/01d6f138-2bc3-453d-b892-4feb51001eab-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"01d6f138-2bc3-453d-b892-4feb51001eab\") " pod="openstack/glance-default-internal-api-0" Dec 03 07:01:58 crc kubenswrapper[4475]: I1203 07:01:58.564383 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/01d6f138-2bc3-453d-b892-4feb51001eab-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"01d6f138-2bc3-453d-b892-4feb51001eab\") " pod="openstack/glance-default-internal-api-0" Dec 03 07:01:58 crc kubenswrapper[4475]: I1203 07:01:58.566991 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/01d6f138-2bc3-453d-b892-4feb51001eab-config-data\") pod \"glance-default-internal-api-0\" (UID: \"01d6f138-2bc3-453d-b892-4feb51001eab\") " pod="openstack/glance-default-internal-api-0" Dec 03 07:01:58 crc kubenswrapper[4475]: I1203 07:01:58.571712 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/01d6f138-2bc3-453d-b892-4feb51001eab-scripts\") pod \"glance-default-internal-api-0\" (UID: \"01d6f138-2bc3-453d-b892-4feb51001eab\") " pod="openstack/glance-default-internal-api-0" Dec 03 07:01:58 crc kubenswrapper[4475]: I1203 07:01:58.572248 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fsjk8\" (UniqueName: \"kubernetes.io/projected/01d6f138-2bc3-453d-b892-4feb51001eab-kube-api-access-fsjk8\") pod \"glance-default-internal-api-0\" (UID: \"01d6f138-2bc3-453d-b892-4feb51001eab\") " pod="openstack/glance-default-internal-api-0" Dec 03 07:01:58 crc kubenswrapper[4475]: I1203 07:01:58.580783 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"glance-default-internal-api-0\" (UID: \"01d6f138-2bc3-453d-b892-4feb51001eab\") " pod="openstack/glance-default-internal-api-0" Dec 03 07:01:58 crc kubenswrapper[4475]: I1203 07:01:58.620953 4475 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Dec 03 07:01:58 crc kubenswrapper[4475]: I1203 07:01:58.935410 4475 patch_prober.go:28] interesting pod/machine-config-daemon-tjbzg container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 03 07:01:58 crc kubenswrapper[4475]: I1203 07:01:58.935682 4475 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-tjbzg" podUID="91aee7be-4a52-4598-803f-2deebe0674de" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 03 07:01:58 crc kubenswrapper[4475]: I1203 07:01:58.962280 4475 generic.go:334] "Generic (PLEG): container finished" podID="4afae9fe-ad3f-48d5-a095-9474568f956c" containerID="507404adf0993a3824029651fe87cc72be1488576a6bd17fdafdb492ac344d85" exitCode=0 Dec 03 07:01:58 crc kubenswrapper[4475]: I1203 07:01:58.962347 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-m4crf" event={"ID":"4afae9fe-ad3f-48d5-a095-9474568f956c","Type":"ContainerDied","Data":"507404adf0993a3824029651fe87cc72be1488576a6bd17fdafdb492ac344d85"} Dec 03 07:01:58 crc kubenswrapper[4475]: I1203 07:01:58.972562 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"15c86d99-b017-4383-a20d-183a68ea3feb","Type":"ContainerStarted","Data":"fa07c81eb02ec9e20bd59ddda7d7f2f1741fd0b4bb8919aac22b62a33f839789"} Dec 03 07:01:59 crc kubenswrapper[4475]: I1203 07:01:59.209400 4475 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Dec 03 07:01:59 crc kubenswrapper[4475]: I1203 07:01:59.503885 4475 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes 
dir" podUID="2604ecb1-3054-4b63-9a8d-3880ee519c58" path="/var/lib/kubelet/pods/2604ecb1-3054-4b63-9a8d-3880ee519c58/volumes" Dec 03 07:02:00 crc kubenswrapper[4475]: I1203 07:02:00.008353 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"1741312d-351a-4ccd-94ef-54f59b3cf6ef","Type":"ContainerStarted","Data":"327c3630d09aee3f4d470185c07f75b9d0184809bbdb553976b8687927901157"} Dec 03 07:02:00 crc kubenswrapper[4475]: I1203 07:02:00.008566 4475 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="1741312d-351a-4ccd-94ef-54f59b3cf6ef" containerName="ceilometer-central-agent" containerID="cri-o://8714fea690c3fd485422856e6da4f4067369bcb9bff0b561ff003ebb9fa6b7cd" gracePeriod=30 Dec 03 07:02:00 crc kubenswrapper[4475]: I1203 07:02:00.008672 4475 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Dec 03 07:02:00 crc kubenswrapper[4475]: I1203 07:02:00.008992 4475 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="1741312d-351a-4ccd-94ef-54f59b3cf6ef" containerName="proxy-httpd" containerID="cri-o://327c3630d09aee3f4d470185c07f75b9d0184809bbdb553976b8687927901157" gracePeriod=30 Dec 03 07:02:00 crc kubenswrapper[4475]: I1203 07:02:00.009039 4475 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="1741312d-351a-4ccd-94ef-54f59b3cf6ef" containerName="sg-core" containerID="cri-o://8c5573d2a2f0f3b0a5dba3ae1c3fdff779ea64e34d75af95028297dfe5c64385" gracePeriod=30 Dec 03 07:02:00 crc kubenswrapper[4475]: I1203 07:02:00.009097 4475 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="1741312d-351a-4ccd-94ef-54f59b3cf6ef" containerName="ceilometer-notification-agent" containerID="cri-o://fcfd72daaa97f09114b59b182905505ba61487e827dece3a8af63f5cdb5fd924" gracePeriod=30 Dec 03 07:02:00 
crc kubenswrapper[4475]: I1203 07:02:00.014562 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"15c86d99-b017-4383-a20d-183a68ea3feb","Type":"ContainerStarted","Data":"7b0cce1a18fcd87f5c18245c920d6c2151b572bb881579df4415f9ca7c826652"} Dec 03 07:02:00 crc kubenswrapper[4475]: I1203 07:02:00.014602 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"15c86d99-b017-4383-a20d-183a68ea3feb","Type":"ContainerStarted","Data":"7bbf2365746462581ca4b582a9c5121de0e24636cc60f15cbec5b9fead2f5eb3"} Dec 03 07:02:00 crc kubenswrapper[4475]: I1203 07:02:00.030560 4475 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.099948619 podStartE2EDuration="6.030543698s" podCreationTimestamp="2025-12-03 07:01:54 +0000 UTC" firstStartedPulling="2025-12-03 07:01:55.005809999 +0000 UTC m=+999.810708332" lastFinishedPulling="2025-12-03 07:01:58.936405076 +0000 UTC m=+1003.741303411" observedRunningTime="2025-12-03 07:02:00.028958968 +0000 UTC m=+1004.833857302" watchObservedRunningTime="2025-12-03 07:02:00.030543698 +0000 UTC m=+1004.835442032" Dec 03 07:02:00 crc kubenswrapper[4475]: I1203 07:02:00.033975 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"01d6f138-2bc3-453d-b892-4feb51001eab","Type":"ContainerStarted","Data":"661a40f1270f03dc0d74c5486617a794435a704b44b9fddb9dd7fa066397a65a"} Dec 03 07:02:00 crc kubenswrapper[4475]: I1203 07:02:00.034004 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"01d6f138-2bc3-453d-b892-4feb51001eab","Type":"ContainerStarted","Data":"26fae83891397710bd50919e78520c1eb5698b2bf5e991d666aecf844145268e"} Dec 03 07:02:00 crc kubenswrapper[4475]: I1203 07:02:00.056796 4475 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openstack/glance-default-external-api-0" podStartSLOduration=4.056778467 podStartE2EDuration="4.056778467s" podCreationTimestamp="2025-12-03 07:01:56 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 07:02:00.053801928 +0000 UTC m=+1004.858700282" watchObservedRunningTime="2025-12-03 07:02:00.056778467 +0000 UTC m=+1004.861676802" Dec 03 07:02:00 crc kubenswrapper[4475]: I1203 07:02:00.481016 4475 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-m4crf" Dec 03 07:02:00 crc kubenswrapper[4475]: I1203 07:02:00.601141 4475 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4afae9fe-ad3f-48d5-a095-9474568f956c-scripts\") pod \"4afae9fe-ad3f-48d5-a095-9474568f956c\" (UID: \"4afae9fe-ad3f-48d5-a095-9474568f956c\") " Dec 03 07:02:00 crc kubenswrapper[4475]: I1203 07:02:00.601255 4475 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jn4lz\" (UniqueName: \"kubernetes.io/projected/4afae9fe-ad3f-48d5-a095-9474568f956c-kube-api-access-jn4lz\") pod \"4afae9fe-ad3f-48d5-a095-9474568f956c\" (UID: \"4afae9fe-ad3f-48d5-a095-9474568f956c\") " Dec 03 07:02:00 crc kubenswrapper[4475]: I1203 07:02:00.601385 4475 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4afae9fe-ad3f-48d5-a095-9474568f956c-config-data\") pod \"4afae9fe-ad3f-48d5-a095-9474568f956c\" (UID: \"4afae9fe-ad3f-48d5-a095-9474568f956c\") " Dec 03 07:02:00 crc kubenswrapper[4475]: I1203 07:02:00.601480 4475 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4afae9fe-ad3f-48d5-a095-9474568f956c-combined-ca-bundle\") pod \"4afae9fe-ad3f-48d5-a095-9474568f956c\" (UID: 
\"4afae9fe-ad3f-48d5-a095-9474568f956c\") " Dec 03 07:02:00 crc kubenswrapper[4475]: I1203 07:02:00.606672 4475 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4afae9fe-ad3f-48d5-a095-9474568f956c-kube-api-access-jn4lz" (OuterVolumeSpecName: "kube-api-access-jn4lz") pod "4afae9fe-ad3f-48d5-a095-9474568f956c" (UID: "4afae9fe-ad3f-48d5-a095-9474568f956c"). InnerVolumeSpecName "kube-api-access-jn4lz". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 07:02:00 crc kubenswrapper[4475]: I1203 07:02:00.608264 4475 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4afae9fe-ad3f-48d5-a095-9474568f956c-scripts" (OuterVolumeSpecName: "scripts") pod "4afae9fe-ad3f-48d5-a095-9474568f956c" (UID: "4afae9fe-ad3f-48d5-a095-9474568f956c"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 07:02:00 crc kubenswrapper[4475]: I1203 07:02:00.634776 4475 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4afae9fe-ad3f-48d5-a095-9474568f956c-config-data" (OuterVolumeSpecName: "config-data") pod "4afae9fe-ad3f-48d5-a095-9474568f956c" (UID: "4afae9fe-ad3f-48d5-a095-9474568f956c"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 07:02:00 crc kubenswrapper[4475]: I1203 07:02:00.635502 4475 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4afae9fe-ad3f-48d5-a095-9474568f956c-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "4afae9fe-ad3f-48d5-a095-9474568f956c" (UID: "4afae9fe-ad3f-48d5-a095-9474568f956c"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 07:02:00 crc kubenswrapper[4475]: I1203 07:02:00.703809 4475 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4afae9fe-ad3f-48d5-a095-9474568f956c-config-data\") on node \"crc\" DevicePath \"\"" Dec 03 07:02:00 crc kubenswrapper[4475]: I1203 07:02:00.703834 4475 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4afae9fe-ad3f-48d5-a095-9474568f956c-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 03 07:02:00 crc kubenswrapper[4475]: I1203 07:02:00.703846 4475 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4afae9fe-ad3f-48d5-a095-9474568f956c-scripts\") on node \"crc\" DevicePath \"\"" Dec 03 07:02:00 crc kubenswrapper[4475]: I1203 07:02:00.703855 4475 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jn4lz\" (UniqueName: \"kubernetes.io/projected/4afae9fe-ad3f-48d5-a095-9474568f956c-kube-api-access-jn4lz\") on node \"crc\" DevicePath \"\"" Dec 03 07:02:01 crc kubenswrapper[4475]: I1203 07:02:01.042701 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-m4crf" event={"ID":"4afae9fe-ad3f-48d5-a095-9474568f956c","Type":"ContainerDied","Data":"4e3869b449c0f2a03cab91aaff4db98b404a0acd4977292b76bd78429d5a97e3"} Dec 03 07:02:01 crc kubenswrapper[4475]: I1203 07:02:01.043009 4475 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="4e3869b449c0f2a03cab91aaff4db98b404a0acd4977292b76bd78429d5a97e3" Dec 03 07:02:01 crc kubenswrapper[4475]: I1203 07:02:01.043072 4475 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-m4crf" Dec 03 07:02:01 crc kubenswrapper[4475]: I1203 07:02:01.055344 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"01d6f138-2bc3-453d-b892-4feb51001eab","Type":"ContainerStarted","Data":"975c9dbff1977aeec95d88f2aeb782e9b0e426ec94676b11da04feb155c2e0ea"} Dec 03 07:02:01 crc kubenswrapper[4475]: I1203 07:02:01.058131 4475 generic.go:334] "Generic (PLEG): container finished" podID="1741312d-351a-4ccd-94ef-54f59b3cf6ef" containerID="327c3630d09aee3f4d470185c07f75b9d0184809bbdb553976b8687927901157" exitCode=0 Dec 03 07:02:01 crc kubenswrapper[4475]: I1203 07:02:01.058152 4475 generic.go:334] "Generic (PLEG): container finished" podID="1741312d-351a-4ccd-94ef-54f59b3cf6ef" containerID="8c5573d2a2f0f3b0a5dba3ae1c3fdff779ea64e34d75af95028297dfe5c64385" exitCode=2 Dec 03 07:02:01 crc kubenswrapper[4475]: I1203 07:02:01.058159 4475 generic.go:334] "Generic (PLEG): container finished" podID="1741312d-351a-4ccd-94ef-54f59b3cf6ef" containerID="fcfd72daaa97f09114b59b182905505ba61487e827dece3a8af63f5cdb5fd924" exitCode=0 Dec 03 07:02:01 crc kubenswrapper[4475]: I1203 07:02:01.058790 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"1741312d-351a-4ccd-94ef-54f59b3cf6ef","Type":"ContainerDied","Data":"327c3630d09aee3f4d470185c07f75b9d0184809bbdb553976b8687927901157"} Dec 03 07:02:01 crc kubenswrapper[4475]: I1203 07:02:01.058817 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"1741312d-351a-4ccd-94ef-54f59b3cf6ef","Type":"ContainerDied","Data":"8c5573d2a2f0f3b0a5dba3ae1c3fdff779ea64e34d75af95028297dfe5c64385"} Dec 03 07:02:01 crc kubenswrapper[4475]: I1203 07:02:01.058829 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" 
event={"ID":"1741312d-351a-4ccd-94ef-54f59b3cf6ef","Type":"ContainerDied","Data":"fcfd72daaa97f09114b59b182905505ba61487e827dece3a8af63f5cdb5fd924"} Dec 03 07:02:01 crc kubenswrapper[4475]: I1203 07:02:01.070097 4475 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-conductor-0"] Dec 03 07:02:01 crc kubenswrapper[4475]: E1203 07:02:01.070475 4475 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4afae9fe-ad3f-48d5-a095-9474568f956c" containerName="nova-cell0-conductor-db-sync" Dec 03 07:02:01 crc kubenswrapper[4475]: I1203 07:02:01.070492 4475 state_mem.go:107] "Deleted CPUSet assignment" podUID="4afae9fe-ad3f-48d5-a095-9474568f956c" containerName="nova-cell0-conductor-db-sync" Dec 03 07:02:01 crc kubenswrapper[4475]: I1203 07:02:01.070691 4475 memory_manager.go:354] "RemoveStaleState removing state" podUID="4afae9fe-ad3f-48d5-a095-9474568f956c" containerName="nova-cell0-conductor-db-sync" Dec 03 07:02:01 crc kubenswrapper[4475]: I1203 07:02:01.071213 4475 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-conductor-0" Dec 03 07:02:01 crc kubenswrapper[4475]: I1203 07:02:01.073846 4475 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-nova-dockercfg-gpl6m" Dec 03 07:02:01 crc kubenswrapper[4475]: I1203 07:02:01.073883 4475 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-conductor-config-data" Dec 03 07:02:01 crc kubenswrapper[4475]: I1203 07:02:01.077537 4475 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-0"] Dec 03 07:02:01 crc kubenswrapper[4475]: I1203 07:02:01.083163 4475 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-internal-api-0" podStartSLOduration=3.08314661 podStartE2EDuration="3.08314661s" podCreationTimestamp="2025-12-03 07:01:58 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 07:02:01.082908131 +0000 UTC m=+1005.887806466" watchObservedRunningTime="2025-12-03 07:02:01.08314661 +0000 UTC m=+1005.888044943" Dec 03 07:02:01 crc kubenswrapper[4475]: I1203 07:02:01.216793 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4483743e-d662-4fc6-8692-f3dffefd320b-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"4483743e-d662-4fc6-8692-f3dffefd320b\") " pod="openstack/nova-cell0-conductor-0" Dec 03 07:02:01 crc kubenswrapper[4475]: I1203 07:02:01.216861 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kljwq\" (UniqueName: \"kubernetes.io/projected/4483743e-d662-4fc6-8692-f3dffefd320b-kube-api-access-kljwq\") pod \"nova-cell0-conductor-0\" (UID: \"4483743e-d662-4fc6-8692-f3dffefd320b\") " pod="openstack/nova-cell0-conductor-0" Dec 03 07:02:01 crc kubenswrapper[4475]: I1203 
07:02:01.216933 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4483743e-d662-4fc6-8692-f3dffefd320b-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"4483743e-d662-4fc6-8692-f3dffefd320b\") " pod="openstack/nova-cell0-conductor-0" Dec 03 07:02:01 crc kubenswrapper[4475]: I1203 07:02:01.319878 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4483743e-d662-4fc6-8692-f3dffefd320b-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"4483743e-d662-4fc6-8692-f3dffefd320b\") " pod="openstack/nova-cell0-conductor-0" Dec 03 07:02:01 crc kubenswrapper[4475]: I1203 07:02:01.319964 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kljwq\" (UniqueName: \"kubernetes.io/projected/4483743e-d662-4fc6-8692-f3dffefd320b-kube-api-access-kljwq\") pod \"nova-cell0-conductor-0\" (UID: \"4483743e-d662-4fc6-8692-f3dffefd320b\") " pod="openstack/nova-cell0-conductor-0" Dec 03 07:02:01 crc kubenswrapper[4475]: I1203 07:02:01.320038 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4483743e-d662-4fc6-8692-f3dffefd320b-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"4483743e-d662-4fc6-8692-f3dffefd320b\") " pod="openstack/nova-cell0-conductor-0" Dec 03 07:02:01 crc kubenswrapper[4475]: I1203 07:02:01.326576 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4483743e-d662-4fc6-8692-f3dffefd320b-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"4483743e-d662-4fc6-8692-f3dffefd320b\") " pod="openstack/nova-cell0-conductor-0" Dec 03 07:02:01 crc kubenswrapper[4475]: I1203 07:02:01.340830 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"config-data\" (UniqueName: \"kubernetes.io/secret/4483743e-d662-4fc6-8692-f3dffefd320b-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"4483743e-d662-4fc6-8692-f3dffefd320b\") " pod="openstack/nova-cell0-conductor-0" Dec 03 07:02:01 crc kubenswrapper[4475]: I1203 07:02:01.343285 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kljwq\" (UniqueName: \"kubernetes.io/projected/4483743e-d662-4fc6-8692-f3dffefd320b-kube-api-access-kljwq\") pod \"nova-cell0-conductor-0\" (UID: \"4483743e-d662-4fc6-8692-f3dffefd320b\") " pod="openstack/nova-cell0-conductor-0" Dec 03 07:02:01 crc kubenswrapper[4475]: I1203 07:02:01.389934 4475 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-0" Dec 03 07:02:01 crc kubenswrapper[4475]: I1203 07:02:01.787204 4475 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-0"] Dec 03 07:02:02 crc kubenswrapper[4475]: I1203 07:02:02.073242 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" event={"ID":"4483743e-d662-4fc6-8692-f3dffefd320b","Type":"ContainerStarted","Data":"2bb059b737e01e2c65fd1c1a0b56eb7b00083960a47c3d5f14e86559311717de"} Dec 03 07:02:02 crc kubenswrapper[4475]: I1203 07:02:02.073508 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" event={"ID":"4483743e-d662-4fc6-8692-f3dffefd320b","Type":"ContainerStarted","Data":"d34fe90ed31c588e2c35e4b0b9d3ba999fefbc0330a07429473cc128b2ad6452"} Dec 03 07:02:02 crc kubenswrapper[4475]: I1203 07:02:02.073524 4475 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell0-conductor-0" Dec 03 07:02:02 crc kubenswrapper[4475]: I1203 07:02:02.091132 4475 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-conductor-0" podStartSLOduration=1.091116624 podStartE2EDuration="1.091116624s" 
podCreationTimestamp="2025-12-03 07:02:01 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 07:02:02.086233228 +0000 UTC m=+1006.891131563" watchObservedRunningTime="2025-12-03 07:02:02.091116624 +0000 UTC m=+1006.896014958" Dec 03 07:02:04 crc kubenswrapper[4475]: I1203 07:02:04.905989 4475 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Dec 03 07:02:04 crc kubenswrapper[4475]: I1203 07:02:04.993094 4475 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-97gpz\" (UniqueName: \"kubernetes.io/projected/1741312d-351a-4ccd-94ef-54f59b3cf6ef-kube-api-access-97gpz\") pod \"1741312d-351a-4ccd-94ef-54f59b3cf6ef\" (UID: \"1741312d-351a-4ccd-94ef-54f59b3cf6ef\") " Dec 03 07:02:04 crc kubenswrapper[4475]: I1203 07:02:04.993254 4475 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/1741312d-351a-4ccd-94ef-54f59b3cf6ef-log-httpd\") pod \"1741312d-351a-4ccd-94ef-54f59b3cf6ef\" (UID: \"1741312d-351a-4ccd-94ef-54f59b3cf6ef\") " Dec 03 07:02:04 crc kubenswrapper[4475]: I1203 07:02:04.993324 4475 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/1741312d-351a-4ccd-94ef-54f59b3cf6ef-run-httpd\") pod \"1741312d-351a-4ccd-94ef-54f59b3cf6ef\" (UID: \"1741312d-351a-4ccd-94ef-54f59b3cf6ef\") " Dec 03 07:02:04 crc kubenswrapper[4475]: I1203 07:02:04.993350 4475 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/1741312d-351a-4ccd-94ef-54f59b3cf6ef-sg-core-conf-yaml\") pod \"1741312d-351a-4ccd-94ef-54f59b3cf6ef\" (UID: \"1741312d-351a-4ccd-94ef-54f59b3cf6ef\") " Dec 03 07:02:04 crc kubenswrapper[4475]: I1203 07:02:04.993432 4475 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1741312d-351a-4ccd-94ef-54f59b3cf6ef-config-data\") pod \"1741312d-351a-4ccd-94ef-54f59b3cf6ef\" (UID: \"1741312d-351a-4ccd-94ef-54f59b3cf6ef\") " Dec 03 07:02:04 crc kubenswrapper[4475]: I1203 07:02:04.993492 4475 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1741312d-351a-4ccd-94ef-54f59b3cf6ef-combined-ca-bundle\") pod \"1741312d-351a-4ccd-94ef-54f59b3cf6ef\" (UID: \"1741312d-351a-4ccd-94ef-54f59b3cf6ef\") " Dec 03 07:02:04 crc kubenswrapper[4475]: I1203 07:02:04.993555 4475 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1741312d-351a-4ccd-94ef-54f59b3cf6ef-scripts\") pod \"1741312d-351a-4ccd-94ef-54f59b3cf6ef\" (UID: \"1741312d-351a-4ccd-94ef-54f59b3cf6ef\") " Dec 03 07:02:04 crc kubenswrapper[4475]: I1203 07:02:04.993871 4475 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1741312d-351a-4ccd-94ef-54f59b3cf6ef-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "1741312d-351a-4ccd-94ef-54f59b3cf6ef" (UID: "1741312d-351a-4ccd-94ef-54f59b3cf6ef"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 07:02:04 crc kubenswrapper[4475]: I1203 07:02:04.993982 4475 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1741312d-351a-4ccd-94ef-54f59b3cf6ef-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "1741312d-351a-4ccd-94ef-54f59b3cf6ef" (UID: "1741312d-351a-4ccd-94ef-54f59b3cf6ef"). InnerVolumeSpecName "log-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 07:02:04 crc kubenswrapper[4475]: I1203 07:02:04.994439 4475 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/1741312d-351a-4ccd-94ef-54f59b3cf6ef-log-httpd\") on node \"crc\" DevicePath \"\"" Dec 03 07:02:04 crc kubenswrapper[4475]: I1203 07:02:04.994478 4475 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/1741312d-351a-4ccd-94ef-54f59b3cf6ef-run-httpd\") on node \"crc\" DevicePath \"\"" Dec 03 07:02:05 crc kubenswrapper[4475]: I1203 07:02:05.003613 4475 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1741312d-351a-4ccd-94ef-54f59b3cf6ef-kube-api-access-97gpz" (OuterVolumeSpecName: "kube-api-access-97gpz") pod "1741312d-351a-4ccd-94ef-54f59b3cf6ef" (UID: "1741312d-351a-4ccd-94ef-54f59b3cf6ef"). InnerVolumeSpecName "kube-api-access-97gpz". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 07:02:05 crc kubenswrapper[4475]: I1203 07:02:05.017564 4475 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1741312d-351a-4ccd-94ef-54f59b3cf6ef-scripts" (OuterVolumeSpecName: "scripts") pod "1741312d-351a-4ccd-94ef-54f59b3cf6ef" (UID: "1741312d-351a-4ccd-94ef-54f59b3cf6ef"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 07:02:05 crc kubenswrapper[4475]: I1203 07:02:05.043658 4475 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1741312d-351a-4ccd-94ef-54f59b3cf6ef-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "1741312d-351a-4ccd-94ef-54f59b3cf6ef" (UID: "1741312d-351a-4ccd-94ef-54f59b3cf6ef"). InnerVolumeSpecName "sg-core-conf-yaml". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 07:02:05 crc kubenswrapper[4475]: I1203 07:02:05.098661 4475 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-97gpz\" (UniqueName: \"kubernetes.io/projected/1741312d-351a-4ccd-94ef-54f59b3cf6ef-kube-api-access-97gpz\") on node \"crc\" DevicePath \"\"" Dec 03 07:02:05 crc kubenswrapper[4475]: I1203 07:02:05.098707 4475 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/1741312d-351a-4ccd-94ef-54f59b3cf6ef-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Dec 03 07:02:05 crc kubenswrapper[4475]: I1203 07:02:05.098719 4475 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1741312d-351a-4ccd-94ef-54f59b3cf6ef-scripts\") on node \"crc\" DevicePath \"\"" Dec 03 07:02:05 crc kubenswrapper[4475]: I1203 07:02:05.111532 4475 generic.go:334] "Generic (PLEG): container finished" podID="1741312d-351a-4ccd-94ef-54f59b3cf6ef" containerID="8714fea690c3fd485422856e6da4f4067369bcb9bff0b561ff003ebb9fa6b7cd" exitCode=0 Dec 03 07:02:05 crc kubenswrapper[4475]: I1203 07:02:05.111576 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"1741312d-351a-4ccd-94ef-54f59b3cf6ef","Type":"ContainerDied","Data":"8714fea690c3fd485422856e6da4f4067369bcb9bff0b561ff003ebb9fa6b7cd"} Dec 03 07:02:05 crc kubenswrapper[4475]: I1203 07:02:05.111610 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"1741312d-351a-4ccd-94ef-54f59b3cf6ef","Type":"ContainerDied","Data":"e0d9a053e4a3a8b7e04fcc752f2fb4d1824140fe6d383ecdbb4ab5858a63cf26"} Dec 03 07:02:05 crc kubenswrapper[4475]: I1203 07:02:05.111628 4475 scope.go:117] "RemoveContainer" containerID="327c3630d09aee3f4d470185c07f75b9d0184809bbdb553976b8687927901157" Dec 03 07:02:05 crc kubenswrapper[4475]: I1203 07:02:05.111761 4475 util.go:48] "No ready sandbox for pod can be 
found. Need to start a new one" pod="openstack/ceilometer-0" Dec 03 07:02:05 crc kubenswrapper[4475]: I1203 07:02:05.133640 4475 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1741312d-351a-4ccd-94ef-54f59b3cf6ef-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "1741312d-351a-4ccd-94ef-54f59b3cf6ef" (UID: "1741312d-351a-4ccd-94ef-54f59b3cf6ef"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 07:02:05 crc kubenswrapper[4475]: I1203 07:02:05.136167 4475 scope.go:117] "RemoveContainer" containerID="8c5573d2a2f0f3b0a5dba3ae1c3fdff779ea64e34d75af95028297dfe5c64385" Dec 03 07:02:05 crc kubenswrapper[4475]: I1203 07:02:05.167152 4475 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1741312d-351a-4ccd-94ef-54f59b3cf6ef-config-data" (OuterVolumeSpecName: "config-data") pod "1741312d-351a-4ccd-94ef-54f59b3cf6ef" (UID: "1741312d-351a-4ccd-94ef-54f59b3cf6ef"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 07:02:05 crc kubenswrapper[4475]: I1203 07:02:05.167262 4475 scope.go:117] "RemoveContainer" containerID="fcfd72daaa97f09114b59b182905505ba61487e827dece3a8af63f5cdb5fd924" Dec 03 07:02:05 crc kubenswrapper[4475]: I1203 07:02:05.202142 4475 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1741312d-351a-4ccd-94ef-54f59b3cf6ef-config-data\") on node \"crc\" DevicePath \"\"" Dec 03 07:02:05 crc kubenswrapper[4475]: I1203 07:02:05.202165 4475 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1741312d-351a-4ccd-94ef-54f59b3cf6ef-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 03 07:02:05 crc kubenswrapper[4475]: I1203 07:02:05.213481 4475 scope.go:117] "RemoveContainer" containerID="8714fea690c3fd485422856e6da4f4067369bcb9bff0b561ff003ebb9fa6b7cd" Dec 03 07:02:05 crc kubenswrapper[4475]: I1203 07:02:05.269207 4475 scope.go:117] "RemoveContainer" containerID="327c3630d09aee3f4d470185c07f75b9d0184809bbdb553976b8687927901157" Dec 03 07:02:05 crc kubenswrapper[4475]: E1203 07:02:05.269794 4475 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"327c3630d09aee3f4d470185c07f75b9d0184809bbdb553976b8687927901157\": container with ID starting with 327c3630d09aee3f4d470185c07f75b9d0184809bbdb553976b8687927901157 not found: ID does not exist" containerID="327c3630d09aee3f4d470185c07f75b9d0184809bbdb553976b8687927901157" Dec 03 07:02:05 crc kubenswrapper[4475]: I1203 07:02:05.269842 4475 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"327c3630d09aee3f4d470185c07f75b9d0184809bbdb553976b8687927901157"} err="failed to get container status \"327c3630d09aee3f4d470185c07f75b9d0184809bbdb553976b8687927901157\": rpc error: code = NotFound desc = could not find container 
\"327c3630d09aee3f4d470185c07f75b9d0184809bbdb553976b8687927901157\": container with ID starting with 327c3630d09aee3f4d470185c07f75b9d0184809bbdb553976b8687927901157 not found: ID does not exist" Dec 03 07:02:05 crc kubenswrapper[4475]: I1203 07:02:05.269871 4475 scope.go:117] "RemoveContainer" containerID="8c5573d2a2f0f3b0a5dba3ae1c3fdff779ea64e34d75af95028297dfe5c64385" Dec 03 07:02:05 crc kubenswrapper[4475]: E1203 07:02:05.270331 4475 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8c5573d2a2f0f3b0a5dba3ae1c3fdff779ea64e34d75af95028297dfe5c64385\": container with ID starting with 8c5573d2a2f0f3b0a5dba3ae1c3fdff779ea64e34d75af95028297dfe5c64385 not found: ID does not exist" containerID="8c5573d2a2f0f3b0a5dba3ae1c3fdff779ea64e34d75af95028297dfe5c64385" Dec 03 07:02:05 crc kubenswrapper[4475]: I1203 07:02:05.270375 4475 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8c5573d2a2f0f3b0a5dba3ae1c3fdff779ea64e34d75af95028297dfe5c64385"} err="failed to get container status \"8c5573d2a2f0f3b0a5dba3ae1c3fdff779ea64e34d75af95028297dfe5c64385\": rpc error: code = NotFound desc = could not find container \"8c5573d2a2f0f3b0a5dba3ae1c3fdff779ea64e34d75af95028297dfe5c64385\": container with ID starting with 8c5573d2a2f0f3b0a5dba3ae1c3fdff779ea64e34d75af95028297dfe5c64385 not found: ID does not exist" Dec 03 07:02:05 crc kubenswrapper[4475]: I1203 07:02:05.270392 4475 scope.go:117] "RemoveContainer" containerID="fcfd72daaa97f09114b59b182905505ba61487e827dece3a8af63f5cdb5fd924" Dec 03 07:02:05 crc kubenswrapper[4475]: E1203 07:02:05.270788 4475 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"fcfd72daaa97f09114b59b182905505ba61487e827dece3a8af63f5cdb5fd924\": container with ID starting with fcfd72daaa97f09114b59b182905505ba61487e827dece3a8af63f5cdb5fd924 not found: ID does not exist" 
containerID="fcfd72daaa97f09114b59b182905505ba61487e827dece3a8af63f5cdb5fd924" Dec 03 07:02:05 crc kubenswrapper[4475]: I1203 07:02:05.270805 4475 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fcfd72daaa97f09114b59b182905505ba61487e827dece3a8af63f5cdb5fd924"} err="failed to get container status \"fcfd72daaa97f09114b59b182905505ba61487e827dece3a8af63f5cdb5fd924\": rpc error: code = NotFound desc = could not find container \"fcfd72daaa97f09114b59b182905505ba61487e827dece3a8af63f5cdb5fd924\": container with ID starting with fcfd72daaa97f09114b59b182905505ba61487e827dece3a8af63f5cdb5fd924 not found: ID does not exist" Dec 03 07:02:05 crc kubenswrapper[4475]: I1203 07:02:05.270818 4475 scope.go:117] "RemoveContainer" containerID="8714fea690c3fd485422856e6da4f4067369bcb9bff0b561ff003ebb9fa6b7cd" Dec 03 07:02:05 crc kubenswrapper[4475]: E1203 07:02:05.271535 4475 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8714fea690c3fd485422856e6da4f4067369bcb9bff0b561ff003ebb9fa6b7cd\": container with ID starting with 8714fea690c3fd485422856e6da4f4067369bcb9bff0b561ff003ebb9fa6b7cd not found: ID does not exist" containerID="8714fea690c3fd485422856e6da4f4067369bcb9bff0b561ff003ebb9fa6b7cd" Dec 03 07:02:05 crc kubenswrapper[4475]: I1203 07:02:05.271574 4475 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8714fea690c3fd485422856e6da4f4067369bcb9bff0b561ff003ebb9fa6b7cd"} err="failed to get container status \"8714fea690c3fd485422856e6da4f4067369bcb9bff0b561ff003ebb9fa6b7cd\": rpc error: code = NotFound desc = could not find container \"8714fea690c3fd485422856e6da4f4067369bcb9bff0b561ff003ebb9fa6b7cd\": container with ID starting with 8714fea690c3fd485422856e6da4f4067369bcb9bff0b561ff003ebb9fa6b7cd not found: ID does not exist" Dec 03 07:02:05 crc kubenswrapper[4475]: I1203 07:02:05.439325 4475 kubelet.go:2437] 
"SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Dec 03 07:02:05 crc kubenswrapper[4475]: I1203 07:02:05.487008 4475 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Dec 03 07:02:05 crc kubenswrapper[4475]: I1203 07:02:05.517842 4475 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1741312d-351a-4ccd-94ef-54f59b3cf6ef" path="/var/lib/kubelet/pods/1741312d-351a-4ccd-94ef-54f59b3cf6ef/volumes" Dec 03 07:02:05 crc kubenswrapper[4475]: I1203 07:02:05.518830 4475 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Dec 03 07:02:05 crc kubenswrapper[4475]: E1203 07:02:05.519208 4475 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1741312d-351a-4ccd-94ef-54f59b3cf6ef" containerName="proxy-httpd" Dec 03 07:02:05 crc kubenswrapper[4475]: I1203 07:02:05.519281 4475 state_mem.go:107] "Deleted CPUSet assignment" podUID="1741312d-351a-4ccd-94ef-54f59b3cf6ef" containerName="proxy-httpd" Dec 03 07:02:05 crc kubenswrapper[4475]: E1203 07:02:05.519354 4475 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1741312d-351a-4ccd-94ef-54f59b3cf6ef" containerName="ceilometer-notification-agent" Dec 03 07:02:05 crc kubenswrapper[4475]: I1203 07:02:05.519405 4475 state_mem.go:107] "Deleted CPUSet assignment" podUID="1741312d-351a-4ccd-94ef-54f59b3cf6ef" containerName="ceilometer-notification-agent" Dec 03 07:02:05 crc kubenswrapper[4475]: E1203 07:02:05.519520 4475 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1741312d-351a-4ccd-94ef-54f59b3cf6ef" containerName="ceilometer-central-agent" Dec 03 07:02:05 crc kubenswrapper[4475]: I1203 07:02:05.519574 4475 state_mem.go:107] "Deleted CPUSet assignment" podUID="1741312d-351a-4ccd-94ef-54f59b3cf6ef" containerName="ceilometer-central-agent" Dec 03 07:02:05 crc kubenswrapper[4475]: E1203 07:02:05.519644 4475 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="1741312d-351a-4ccd-94ef-54f59b3cf6ef" containerName="sg-core" Dec 03 07:02:05 crc kubenswrapper[4475]: I1203 07:02:05.519689 4475 state_mem.go:107] "Deleted CPUSet assignment" podUID="1741312d-351a-4ccd-94ef-54f59b3cf6ef" containerName="sg-core" Dec 03 07:02:05 crc kubenswrapper[4475]: I1203 07:02:05.519918 4475 memory_manager.go:354] "RemoveStaleState removing state" podUID="1741312d-351a-4ccd-94ef-54f59b3cf6ef" containerName="sg-core" Dec 03 07:02:05 crc kubenswrapper[4475]: I1203 07:02:05.519973 4475 memory_manager.go:354] "RemoveStaleState removing state" podUID="1741312d-351a-4ccd-94ef-54f59b3cf6ef" containerName="proxy-httpd" Dec 03 07:02:05 crc kubenswrapper[4475]: I1203 07:02:05.520035 4475 memory_manager.go:354] "RemoveStaleState removing state" podUID="1741312d-351a-4ccd-94ef-54f59b3cf6ef" containerName="ceilometer-notification-agent" Dec 03 07:02:05 crc kubenswrapper[4475]: I1203 07:02:05.520108 4475 memory_manager.go:354] "RemoveStaleState removing state" podUID="1741312d-351a-4ccd-94ef-54f59b3cf6ef" containerName="ceilometer-central-agent" Dec 03 07:02:05 crc kubenswrapper[4475]: I1203 07:02:05.521866 4475 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Dec 03 07:02:05 crc kubenswrapper[4475]: I1203 07:02:05.522003 4475 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Dec 03 07:02:05 crc kubenswrapper[4475]: I1203 07:02:05.527630 4475 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Dec 03 07:02:05 crc kubenswrapper[4475]: I1203 07:02:05.527866 4475 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Dec 03 07:02:05 crc kubenswrapper[4475]: I1203 07:02:05.607997 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/206cd7f6-428b-4bcf-974c-74a1242401d1-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"206cd7f6-428b-4bcf-974c-74a1242401d1\") " pod="openstack/ceilometer-0" Dec 03 07:02:05 crc kubenswrapper[4475]: I1203 07:02:05.608280 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/206cd7f6-428b-4bcf-974c-74a1242401d1-scripts\") pod \"ceilometer-0\" (UID: \"206cd7f6-428b-4bcf-974c-74a1242401d1\") " pod="openstack/ceilometer-0" Dec 03 07:02:05 crc kubenswrapper[4475]: I1203 07:02:05.608408 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f7tlr\" (UniqueName: \"kubernetes.io/projected/206cd7f6-428b-4bcf-974c-74a1242401d1-kube-api-access-f7tlr\") pod \"ceilometer-0\" (UID: \"206cd7f6-428b-4bcf-974c-74a1242401d1\") " pod="openstack/ceilometer-0" Dec 03 07:02:05 crc kubenswrapper[4475]: I1203 07:02:05.608494 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/206cd7f6-428b-4bcf-974c-74a1242401d1-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"206cd7f6-428b-4bcf-974c-74a1242401d1\") " pod="openstack/ceilometer-0" Dec 03 07:02:05 crc kubenswrapper[4475]: I1203 07:02:05.608534 4475 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/206cd7f6-428b-4bcf-974c-74a1242401d1-run-httpd\") pod \"ceilometer-0\" (UID: \"206cd7f6-428b-4bcf-974c-74a1242401d1\") " pod="openstack/ceilometer-0" Dec 03 07:02:05 crc kubenswrapper[4475]: I1203 07:02:05.608605 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/206cd7f6-428b-4bcf-974c-74a1242401d1-log-httpd\") pod \"ceilometer-0\" (UID: \"206cd7f6-428b-4bcf-974c-74a1242401d1\") " pod="openstack/ceilometer-0" Dec 03 07:02:05 crc kubenswrapper[4475]: I1203 07:02:05.609164 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/206cd7f6-428b-4bcf-974c-74a1242401d1-config-data\") pod \"ceilometer-0\" (UID: \"206cd7f6-428b-4bcf-974c-74a1242401d1\") " pod="openstack/ceilometer-0" Dec 03 07:02:05 crc kubenswrapper[4475]: I1203 07:02:05.710748 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/206cd7f6-428b-4bcf-974c-74a1242401d1-config-data\") pod \"ceilometer-0\" (UID: \"206cd7f6-428b-4bcf-974c-74a1242401d1\") " pod="openstack/ceilometer-0" Dec 03 07:02:05 crc kubenswrapper[4475]: I1203 07:02:05.710793 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/206cd7f6-428b-4bcf-974c-74a1242401d1-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"206cd7f6-428b-4bcf-974c-74a1242401d1\") " pod="openstack/ceilometer-0" Dec 03 07:02:05 crc kubenswrapper[4475]: I1203 07:02:05.710829 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/206cd7f6-428b-4bcf-974c-74a1242401d1-scripts\") pod \"ceilometer-0\" (UID: 
\"206cd7f6-428b-4bcf-974c-74a1242401d1\") " pod="openstack/ceilometer-0" Dec 03 07:02:05 crc kubenswrapper[4475]: I1203 07:02:05.710881 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-f7tlr\" (UniqueName: \"kubernetes.io/projected/206cd7f6-428b-4bcf-974c-74a1242401d1-kube-api-access-f7tlr\") pod \"ceilometer-0\" (UID: \"206cd7f6-428b-4bcf-974c-74a1242401d1\") " pod="openstack/ceilometer-0" Dec 03 07:02:05 crc kubenswrapper[4475]: I1203 07:02:05.710916 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/206cd7f6-428b-4bcf-974c-74a1242401d1-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"206cd7f6-428b-4bcf-974c-74a1242401d1\") " pod="openstack/ceilometer-0" Dec 03 07:02:05 crc kubenswrapper[4475]: I1203 07:02:05.710937 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/206cd7f6-428b-4bcf-974c-74a1242401d1-run-httpd\") pod \"ceilometer-0\" (UID: \"206cd7f6-428b-4bcf-974c-74a1242401d1\") " pod="openstack/ceilometer-0" Dec 03 07:02:05 crc kubenswrapper[4475]: I1203 07:02:05.710972 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/206cd7f6-428b-4bcf-974c-74a1242401d1-log-httpd\") pod \"ceilometer-0\" (UID: \"206cd7f6-428b-4bcf-974c-74a1242401d1\") " pod="openstack/ceilometer-0" Dec 03 07:02:05 crc kubenswrapper[4475]: I1203 07:02:05.711415 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/206cd7f6-428b-4bcf-974c-74a1242401d1-log-httpd\") pod \"ceilometer-0\" (UID: \"206cd7f6-428b-4bcf-974c-74a1242401d1\") " pod="openstack/ceilometer-0" Dec 03 07:02:05 crc kubenswrapper[4475]: I1203 07:02:05.711624 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: 
\"kubernetes.io/empty-dir/206cd7f6-428b-4bcf-974c-74a1242401d1-run-httpd\") pod \"ceilometer-0\" (UID: \"206cd7f6-428b-4bcf-974c-74a1242401d1\") " pod="openstack/ceilometer-0" Dec 03 07:02:05 crc kubenswrapper[4475]: I1203 07:02:05.716587 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/206cd7f6-428b-4bcf-974c-74a1242401d1-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"206cd7f6-428b-4bcf-974c-74a1242401d1\") " pod="openstack/ceilometer-0" Dec 03 07:02:05 crc kubenswrapper[4475]: I1203 07:02:05.718387 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/206cd7f6-428b-4bcf-974c-74a1242401d1-config-data\") pod \"ceilometer-0\" (UID: \"206cd7f6-428b-4bcf-974c-74a1242401d1\") " pod="openstack/ceilometer-0" Dec 03 07:02:05 crc kubenswrapper[4475]: I1203 07:02:05.718964 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/206cd7f6-428b-4bcf-974c-74a1242401d1-scripts\") pod \"ceilometer-0\" (UID: \"206cd7f6-428b-4bcf-974c-74a1242401d1\") " pod="openstack/ceilometer-0" Dec 03 07:02:05 crc kubenswrapper[4475]: I1203 07:02:05.721010 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/206cd7f6-428b-4bcf-974c-74a1242401d1-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"206cd7f6-428b-4bcf-974c-74a1242401d1\") " pod="openstack/ceilometer-0" Dec 03 07:02:05 crc kubenswrapper[4475]: I1203 07:02:05.730166 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-f7tlr\" (UniqueName: \"kubernetes.io/projected/206cd7f6-428b-4bcf-974c-74a1242401d1-kube-api-access-f7tlr\") pod \"ceilometer-0\" (UID: \"206cd7f6-428b-4bcf-974c-74a1242401d1\") " pod="openstack/ceilometer-0" Dec 03 07:02:05 crc kubenswrapper[4475]: I1203 07:02:05.835611 4475 util.go:30] 
"No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Dec 03 07:02:06 crc kubenswrapper[4475]: I1203 07:02:06.348030 4475 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Dec 03 07:02:06 crc kubenswrapper[4475]: W1203 07:02:06.352163 4475 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod206cd7f6_428b_4bcf_974c_74a1242401d1.slice/crio-e33bc16bcbac74b2e202fc4ca828cc5cc3ea8a2db3ade2b1e3cc5fd37068bd91 WatchSource:0}: Error finding container e33bc16bcbac74b2e202fc4ca828cc5cc3ea8a2db3ade2b1e3cc5fd37068bd91: Status 404 returned error can't find the container with id e33bc16bcbac74b2e202fc4ca828cc5cc3ea8a2db3ade2b1e3cc5fd37068bd91 Dec 03 07:02:06 crc kubenswrapper[4475]: I1203 07:02:06.413884 4475 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell0-conductor-0" Dec 03 07:02:06 crc kubenswrapper[4475]: I1203 07:02:06.880089 4475 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-cell-mapping-5gv82"] Dec 03 07:02:06 crc kubenswrapper[4475]: I1203 07:02:06.881692 4475 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-cell-mapping-5gv82" Dec 03 07:02:06 crc kubenswrapper[4475]: I1203 07:02:06.883466 4475 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-manage-scripts" Dec 03 07:02:06 crc kubenswrapper[4475]: I1203 07:02:06.883899 4475 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-manage-config-data" Dec 03 07:02:06 crc kubenswrapper[4475]: I1203 07:02:06.888112 4475 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-cell-mapping-5gv82"] Dec 03 07:02:06 crc kubenswrapper[4475]: I1203 07:02:06.942108 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/08a09e1c-191b-46b4-92d7-dd92fb839342-scripts\") pod \"nova-cell0-cell-mapping-5gv82\" (UID: \"08a09e1c-191b-46b4-92d7-dd92fb839342\") " pod="openstack/nova-cell0-cell-mapping-5gv82" Dec 03 07:02:06 crc kubenswrapper[4475]: I1203 07:02:06.942241 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/08a09e1c-191b-46b4-92d7-dd92fb839342-config-data\") pod \"nova-cell0-cell-mapping-5gv82\" (UID: \"08a09e1c-191b-46b4-92d7-dd92fb839342\") " pod="openstack/nova-cell0-cell-mapping-5gv82" Dec 03 07:02:06 crc kubenswrapper[4475]: I1203 07:02:06.942506 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/08a09e1c-191b-46b4-92d7-dd92fb839342-combined-ca-bundle\") pod \"nova-cell0-cell-mapping-5gv82\" (UID: \"08a09e1c-191b-46b4-92d7-dd92fb839342\") " pod="openstack/nova-cell0-cell-mapping-5gv82" Dec 03 07:02:06 crc kubenswrapper[4475]: I1203 07:02:06.942669 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x48xr\" (UniqueName: 
\"kubernetes.io/projected/08a09e1c-191b-46b4-92d7-dd92fb839342-kube-api-access-x48xr\") pod \"nova-cell0-cell-mapping-5gv82\" (UID: \"08a09e1c-191b-46b4-92d7-dd92fb839342\") " pod="openstack/nova-cell0-cell-mapping-5gv82" Dec 03 07:02:07 crc kubenswrapper[4475]: I1203 07:02:07.045948 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/08a09e1c-191b-46b4-92d7-dd92fb839342-combined-ca-bundle\") pod \"nova-cell0-cell-mapping-5gv82\" (UID: \"08a09e1c-191b-46b4-92d7-dd92fb839342\") " pod="openstack/nova-cell0-cell-mapping-5gv82" Dec 03 07:02:07 crc kubenswrapper[4475]: I1203 07:02:07.046043 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-x48xr\" (UniqueName: \"kubernetes.io/projected/08a09e1c-191b-46b4-92d7-dd92fb839342-kube-api-access-x48xr\") pod \"nova-cell0-cell-mapping-5gv82\" (UID: \"08a09e1c-191b-46b4-92d7-dd92fb839342\") " pod="openstack/nova-cell0-cell-mapping-5gv82" Dec 03 07:02:07 crc kubenswrapper[4475]: I1203 07:02:07.046274 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/08a09e1c-191b-46b4-92d7-dd92fb839342-scripts\") pod \"nova-cell0-cell-mapping-5gv82\" (UID: \"08a09e1c-191b-46b4-92d7-dd92fb839342\") " pod="openstack/nova-cell0-cell-mapping-5gv82" Dec 03 07:02:07 crc kubenswrapper[4475]: I1203 07:02:07.046351 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/08a09e1c-191b-46b4-92d7-dd92fb839342-config-data\") pod \"nova-cell0-cell-mapping-5gv82\" (UID: \"08a09e1c-191b-46b4-92d7-dd92fb839342\") " pod="openstack/nova-cell0-cell-mapping-5gv82" Dec 03 07:02:07 crc kubenswrapper[4475]: I1203 07:02:07.083190 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/08a09e1c-191b-46b4-92d7-dd92fb839342-config-data\") pod \"nova-cell0-cell-mapping-5gv82\" (UID: \"08a09e1c-191b-46b4-92d7-dd92fb839342\") " pod="openstack/nova-cell0-cell-mapping-5gv82" Dec 03 07:02:07 crc kubenswrapper[4475]: I1203 07:02:07.085339 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/08a09e1c-191b-46b4-92d7-dd92fb839342-combined-ca-bundle\") pod \"nova-cell0-cell-mapping-5gv82\" (UID: \"08a09e1c-191b-46b4-92d7-dd92fb839342\") " pod="openstack/nova-cell0-cell-mapping-5gv82" Dec 03 07:02:07 crc kubenswrapper[4475]: I1203 07:02:07.088615 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/08a09e1c-191b-46b4-92d7-dd92fb839342-scripts\") pod \"nova-cell0-cell-mapping-5gv82\" (UID: \"08a09e1c-191b-46b4-92d7-dd92fb839342\") " pod="openstack/nova-cell0-cell-mapping-5gv82" Dec 03 07:02:07 crc kubenswrapper[4475]: I1203 07:02:07.089098 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-x48xr\" (UniqueName: \"kubernetes.io/projected/08a09e1c-191b-46b4-92d7-dd92fb839342-kube-api-access-x48xr\") pod \"nova-cell0-cell-mapping-5gv82\" (UID: \"08a09e1c-191b-46b4-92d7-dd92fb839342\") " pod="openstack/nova-cell0-cell-mapping-5gv82" Dec 03 07:02:07 crc kubenswrapper[4475]: I1203 07:02:07.169215 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"206cd7f6-428b-4bcf-974c-74a1242401d1","Type":"ContainerStarted","Data":"e33bc16bcbac74b2e202fc4ca828cc5cc3ea8a2db3ade2b1e3cc5fd37068bd91"} Dec 03 07:02:07 crc kubenswrapper[4475]: I1203 07:02:07.173545 4475 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-scheduler-0"] Dec 03 07:02:07 crc kubenswrapper[4475]: I1203 07:02:07.175371 4475 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Dec 03 07:02:07 crc kubenswrapper[4475]: I1203 07:02:07.180239 4475 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-scheduler-config-data" Dec 03 07:02:07 crc kubenswrapper[4475]: I1203 07:02:07.208840 4475 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-cell-mapping-5gv82" Dec 03 07:02:07 crc kubenswrapper[4475]: I1203 07:02:07.237128 4475 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Dec 03 07:02:07 crc kubenswrapper[4475]: I1203 07:02:07.260835 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a9ae60aa-857f-48c8-80d0-a6f3d7490395-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"a9ae60aa-857f-48c8-80d0-a6f3d7490395\") " pod="openstack/nova-scheduler-0" Dec 03 07:02:07 crc kubenswrapper[4475]: I1203 07:02:07.260974 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4kmmz\" (UniqueName: \"kubernetes.io/projected/a9ae60aa-857f-48c8-80d0-a6f3d7490395-kube-api-access-4kmmz\") pod \"nova-scheduler-0\" (UID: \"a9ae60aa-857f-48c8-80d0-a6f3d7490395\") " pod="openstack/nova-scheduler-0" Dec 03 07:02:07 crc kubenswrapper[4475]: I1203 07:02:07.261036 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a9ae60aa-857f-48c8-80d0-a6f3d7490395-config-data\") pod \"nova-scheduler-0\" (UID: \"a9ae60aa-857f-48c8-80d0-a6f3d7490395\") " pod="openstack/nova-scheduler-0" Dec 03 07:02:07 crc kubenswrapper[4475]: I1203 07:02:07.261878 4475 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Dec 03 07:02:07 crc kubenswrapper[4475]: I1203 07:02:07.263440 4475 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Dec 03 07:02:07 crc kubenswrapper[4475]: I1203 07:02:07.278195 4475 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-novncproxy-config-data" Dec 03 07:02:07 crc kubenswrapper[4475]: I1203 07:02:07.302312 4475 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0" Dec 03 07:02:07 crc kubenswrapper[4475]: I1203 07:02:07.302372 4475 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0" Dec 03 07:02:07 crc kubenswrapper[4475]: I1203 07:02:07.318680 4475 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Dec 03 07:02:07 crc kubenswrapper[4475]: I1203 07:02:07.361421 4475 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"] Dec 03 07:02:07 crc kubenswrapper[4475]: I1203 07:02:07.363127 4475 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Dec 03 07:02:07 crc kubenswrapper[4475]: I1203 07:02:07.367018 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a9ae60aa-857f-48c8-80d0-a6f3d7490395-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"a9ae60aa-857f-48c8-80d0-a6f3d7490395\") " pod="openstack/nova-scheduler-0" Dec 03 07:02:07 crc kubenswrapper[4475]: I1203 07:02:07.367131 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4kmmz\" (UniqueName: \"kubernetes.io/projected/a9ae60aa-857f-48c8-80d0-a6f3d7490395-kube-api-access-4kmmz\") pod \"nova-scheduler-0\" (UID: \"a9ae60aa-857f-48c8-80d0-a6f3d7490395\") " pod="openstack/nova-scheduler-0" Dec 03 07:02:07 crc kubenswrapper[4475]: I1203 07:02:07.367182 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a9ae60aa-857f-48c8-80d0-a6f3d7490395-config-data\") pod \"nova-scheduler-0\" (UID: \"a9ae60aa-857f-48c8-80d0-a6f3d7490395\") " pod="openstack/nova-scheduler-0" Dec 03 07:02:07 crc kubenswrapper[4475]: I1203 07:02:07.367219 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/52e2f226-298b-4a85-98a5-abbb1a72320f-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"52e2f226-298b-4a85-98a5-abbb1a72320f\") " pod="openstack/nova-cell1-novncproxy-0" Dec 03 07:02:07 crc kubenswrapper[4475]: I1203 07:02:07.367311 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8wzbf\" (UniqueName: \"kubernetes.io/projected/52e2f226-298b-4a85-98a5-abbb1a72320f-kube-api-access-8wzbf\") pod \"nova-cell1-novncproxy-0\" (UID: \"52e2f226-298b-4a85-98a5-abbb1a72320f\") " pod="openstack/nova-cell1-novncproxy-0" Dec 03 07:02:07 crc 
kubenswrapper[4475]: I1203 07:02:07.367367 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/52e2f226-298b-4a85-98a5-abbb1a72320f-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"52e2f226-298b-4a85-98a5-abbb1a72320f\") " pod="openstack/nova-cell1-novncproxy-0" Dec 03 07:02:07 crc kubenswrapper[4475]: I1203 07:02:07.376543 4475 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data" Dec 03 07:02:07 crc kubenswrapper[4475]: I1203 07:02:07.380863 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a9ae60aa-857f-48c8-80d0-a6f3d7490395-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"a9ae60aa-857f-48c8-80d0-a6f3d7490395\") " pod="openstack/nova-scheduler-0" Dec 03 07:02:07 crc kubenswrapper[4475]: I1203 07:02:07.382845 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a9ae60aa-857f-48c8-80d0-a6f3d7490395-config-data\") pod \"nova-scheduler-0\" (UID: \"a9ae60aa-857f-48c8-80d0-a6f3d7490395\") " pod="openstack/nova-scheduler-0" Dec 03 07:02:07 crc kubenswrapper[4475]: I1203 07:02:07.398179 4475 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Dec 03 07:02:07 crc kubenswrapper[4475]: I1203 07:02:07.402740 4475 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0" Dec 03 07:02:07 crc kubenswrapper[4475]: I1203 07:02:07.416164 4475 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0" Dec 03 07:02:07 crc kubenswrapper[4475]: I1203 07:02:07.416993 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4kmmz\" (UniqueName: 
\"kubernetes.io/projected/a9ae60aa-857f-48c8-80d0-a6f3d7490395-kube-api-access-4kmmz\") pod \"nova-scheduler-0\" (UID: \"a9ae60aa-857f-48c8-80d0-a6f3d7490395\") " pod="openstack/nova-scheduler-0" Dec 03 07:02:07 crc kubenswrapper[4475]: I1203 07:02:07.427482 4475 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] Dec 03 07:02:07 crc kubenswrapper[4475]: I1203 07:02:07.447048 4475 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Dec 03 07:02:07 crc kubenswrapper[4475]: I1203 07:02:07.451503 4475 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Dec 03 07:02:07 crc kubenswrapper[4475]: I1203 07:02:07.474204 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8wzbf\" (UniqueName: \"kubernetes.io/projected/52e2f226-298b-4a85-98a5-abbb1a72320f-kube-api-access-8wzbf\") pod \"nova-cell1-novncproxy-0\" (UID: \"52e2f226-298b-4a85-98a5-abbb1a72320f\") " pod="openstack/nova-cell1-novncproxy-0" Dec 03 07:02:07 crc kubenswrapper[4475]: I1203 07:02:07.474325 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/52e2f226-298b-4a85-98a5-abbb1a72320f-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"52e2f226-298b-4a85-98a5-abbb1a72320f\") " pod="openstack/nova-cell1-novncproxy-0" Dec 03 07:02:07 crc kubenswrapper[4475]: I1203 07:02:07.474644 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/52e2f226-298b-4a85-98a5-abbb1a72320f-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"52e2f226-298b-4a85-98a5-abbb1a72320f\") " pod="openstack/nova-cell1-novncproxy-0" Dec 03 07:02:07 crc kubenswrapper[4475]: I1203 07:02:07.480577 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/52e2f226-298b-4a85-98a5-abbb1a72320f-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"52e2f226-298b-4a85-98a5-abbb1a72320f\") " pod="openstack/nova-cell1-novncproxy-0" Dec 03 07:02:07 crc kubenswrapper[4475]: I1203 07:02:07.490366 4475 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Dec 03 07:02:07 crc kubenswrapper[4475]: I1203 07:02:07.578646 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sp9gb\" (UniqueName: \"kubernetes.io/projected/e6e2659c-bdb8-4015-b749-9d1bfd70620f-kube-api-access-sp9gb\") pod \"nova-metadata-0\" (UID: \"e6e2659c-bdb8-4015-b749-9d1bfd70620f\") " pod="openstack/nova-metadata-0" Dec 03 07:02:07 crc kubenswrapper[4475]: I1203 07:02:07.579084 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1091a855-d763-4e97-aa03-66e64d9aae45-logs\") pod \"nova-api-0\" (UID: \"1091a855-d763-4e97-aa03-66e64d9aae45\") " pod="openstack/nova-api-0" Dec 03 07:02:07 crc kubenswrapper[4475]: I1203 07:02:07.579679 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e6e2659c-bdb8-4015-b749-9d1bfd70620f-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"e6e2659c-bdb8-4015-b749-9d1bfd70620f\") " pod="openstack/nova-metadata-0" Dec 03 07:02:07 crc kubenswrapper[4475]: I1203 07:02:07.579829 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tw578\" (UniqueName: \"kubernetes.io/projected/1091a855-d763-4e97-aa03-66e64d9aae45-kube-api-access-tw578\") pod \"nova-api-0\" (UID: \"1091a855-d763-4e97-aa03-66e64d9aae45\") " pod="openstack/nova-api-0" Dec 03 07:02:07 crc kubenswrapper[4475]: I1203 07:02:07.579944 4475 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1091a855-d763-4e97-aa03-66e64d9aae45-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"1091a855-d763-4e97-aa03-66e64d9aae45\") " pod="openstack/nova-api-0" Dec 03 07:02:07 crc kubenswrapper[4475]: I1203 07:02:07.580479 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e6e2659c-bdb8-4015-b749-9d1bfd70620f-logs\") pod \"nova-metadata-0\" (UID: \"e6e2659c-bdb8-4015-b749-9d1bfd70620f\") " pod="openstack/nova-metadata-0" Dec 03 07:02:07 crc kubenswrapper[4475]: I1203 07:02:07.580736 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e6e2659c-bdb8-4015-b749-9d1bfd70620f-config-data\") pod \"nova-metadata-0\" (UID: \"e6e2659c-bdb8-4015-b749-9d1bfd70620f\") " pod="openstack/nova-metadata-0" Dec 03 07:02:07 crc kubenswrapper[4475]: I1203 07:02:07.580877 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1091a855-d763-4e97-aa03-66e64d9aae45-config-data\") pod \"nova-api-0\" (UID: \"1091a855-d763-4e97-aa03-66e64d9aae45\") " pod="openstack/nova-api-0" Dec 03 07:02:07 crc kubenswrapper[4475]: I1203 07:02:07.583865 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8wzbf\" (UniqueName: \"kubernetes.io/projected/52e2f226-298b-4a85-98a5-abbb1a72320f-kube-api-access-8wzbf\") pod \"nova-cell1-novncproxy-0\" (UID: \"52e2f226-298b-4a85-98a5-abbb1a72320f\") " pod="openstack/nova-cell1-novncproxy-0" Dec 03 07:02:07 crc kubenswrapper[4475]: I1203 07:02:07.588664 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/52e2f226-298b-4a85-98a5-abbb1a72320f-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"52e2f226-298b-4a85-98a5-abbb1a72320f\") " pod="openstack/nova-cell1-novncproxy-0" Dec 03 07:02:07 crc kubenswrapper[4475]: I1203 07:02:07.601822 4475 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Dec 03 07:02:07 crc kubenswrapper[4475]: I1203 07:02:07.640843 4475 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Dec 03 07:02:07 crc kubenswrapper[4475]: I1203 07:02:07.688526 4475 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-6c696b9c6c-wpmmb"] Dec 03 07:02:07 crc kubenswrapper[4475]: I1203 07:02:07.690310 4475 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6c696b9c6c-wpmmb" Dec 03 07:02:07 crc kubenswrapper[4475]: I1203 07:02:07.690867 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e6e2659c-bdb8-4015-b749-9d1bfd70620f-logs\") pod \"nova-metadata-0\" (UID: \"e6e2659c-bdb8-4015-b749-9d1bfd70620f\") " pod="openstack/nova-metadata-0" Dec 03 07:02:07 crc kubenswrapper[4475]: I1203 07:02:07.691892 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e6e2659c-bdb8-4015-b749-9d1bfd70620f-config-data\") pod \"nova-metadata-0\" (UID: \"e6e2659c-bdb8-4015-b749-9d1bfd70620f\") " pod="openstack/nova-metadata-0" Dec 03 07:02:07 crc kubenswrapper[4475]: I1203 07:02:07.694418 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e6e2659c-bdb8-4015-b749-9d1bfd70620f-logs\") pod \"nova-metadata-0\" (UID: \"e6e2659c-bdb8-4015-b749-9d1bfd70620f\") " pod="openstack/nova-metadata-0" Dec 03 07:02:07 crc kubenswrapper[4475]: I1203 07:02:07.705517 4475 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1091a855-d763-4e97-aa03-66e64d9aae45-config-data\") pod \"nova-api-0\" (UID: \"1091a855-d763-4e97-aa03-66e64d9aae45\") " pod="openstack/nova-api-0" Dec 03 07:02:07 crc kubenswrapper[4475]: I1203 07:02:07.705917 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1091a855-d763-4e97-aa03-66e64d9aae45-config-data\") pod \"nova-api-0\" (UID: \"1091a855-d763-4e97-aa03-66e64d9aae45\") " pod="openstack/nova-api-0" Dec 03 07:02:07 crc kubenswrapper[4475]: I1203 07:02:07.706269 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sp9gb\" (UniqueName: \"kubernetes.io/projected/e6e2659c-bdb8-4015-b749-9d1bfd70620f-kube-api-access-sp9gb\") pod \"nova-metadata-0\" (UID: \"e6e2659c-bdb8-4015-b749-9d1bfd70620f\") " pod="openstack/nova-metadata-0" Dec 03 07:02:07 crc kubenswrapper[4475]: I1203 07:02:07.707218 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1091a855-d763-4e97-aa03-66e64d9aae45-logs\") pod \"nova-api-0\" (UID: \"1091a855-d763-4e97-aa03-66e64d9aae45\") " pod="openstack/nova-api-0" Dec 03 07:02:07 crc kubenswrapper[4475]: I1203 07:02:07.707892 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1091a855-d763-4e97-aa03-66e64d9aae45-logs\") pod \"nova-api-0\" (UID: \"1091a855-d763-4e97-aa03-66e64d9aae45\") " pod="openstack/nova-api-0" Dec 03 07:02:07 crc kubenswrapper[4475]: I1203 07:02:07.708606 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e6e2659c-bdb8-4015-b749-9d1bfd70620f-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"e6e2659c-bdb8-4015-b749-9d1bfd70620f\") " pod="openstack/nova-metadata-0" Dec 03 
07:02:07 crc kubenswrapper[4475]: I1203 07:02:07.710396 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tw578\" (UniqueName: \"kubernetes.io/projected/1091a855-d763-4e97-aa03-66e64d9aae45-kube-api-access-tw578\") pod \"nova-api-0\" (UID: \"1091a855-d763-4e97-aa03-66e64d9aae45\") " pod="openstack/nova-api-0" Dec 03 07:02:07 crc kubenswrapper[4475]: I1203 07:02:07.710619 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1091a855-d763-4e97-aa03-66e64d9aae45-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"1091a855-d763-4e97-aa03-66e64d9aae45\") " pod="openstack/nova-api-0" Dec 03 07:02:07 crc kubenswrapper[4475]: I1203 07:02:07.710769 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e6e2659c-bdb8-4015-b749-9d1bfd70620f-config-data\") pod \"nova-metadata-0\" (UID: \"e6e2659c-bdb8-4015-b749-9d1bfd70620f\") " pod="openstack/nova-metadata-0" Dec 03 07:02:07 crc kubenswrapper[4475]: I1203 07:02:07.720939 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1091a855-d763-4e97-aa03-66e64d9aae45-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"1091a855-d763-4e97-aa03-66e64d9aae45\") " pod="openstack/nova-api-0" Dec 03 07:02:07 crc kubenswrapper[4475]: I1203 07:02:07.723845 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e6e2659c-bdb8-4015-b749-9d1bfd70620f-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"e6e2659c-bdb8-4015-b749-9d1bfd70620f\") " pod="openstack/nova-metadata-0" Dec 03 07:02:07 crc kubenswrapper[4475]: I1203 07:02:07.726556 4475 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-6c696b9c6c-wpmmb"] Dec 03 07:02:07 crc kubenswrapper[4475]: I1203 
07:02:07.757890 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tw578\" (UniqueName: \"kubernetes.io/projected/1091a855-d763-4e97-aa03-66e64d9aae45-kube-api-access-tw578\") pod \"nova-api-0\" (UID: \"1091a855-d763-4e97-aa03-66e64d9aae45\") " pod="openstack/nova-api-0" Dec 03 07:02:07 crc kubenswrapper[4475]: I1203 07:02:07.766405 4475 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Dec 03 07:02:07 crc kubenswrapper[4475]: I1203 07:02:07.775036 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sp9gb\" (UniqueName: \"kubernetes.io/projected/e6e2659c-bdb8-4015-b749-9d1bfd70620f-kube-api-access-sp9gb\") pod \"nova-metadata-0\" (UID: \"e6e2659c-bdb8-4015-b749-9d1bfd70620f\") " pod="openstack/nova-metadata-0" Dec 03 07:02:07 crc kubenswrapper[4475]: I1203 07:02:07.813289 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/06823879-29ce-4d3b-bd43-61c0891eaa99-dns-svc\") pod \"dnsmasq-dns-6c696b9c6c-wpmmb\" (UID: \"06823879-29ce-4d3b-bd43-61c0891eaa99\") " pod="openstack/dnsmasq-dns-6c696b9c6c-wpmmb" Dec 03 07:02:07 crc kubenswrapper[4475]: I1203 07:02:07.813464 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/06823879-29ce-4d3b-bd43-61c0891eaa99-dns-swift-storage-0\") pod \"dnsmasq-dns-6c696b9c6c-wpmmb\" (UID: \"06823879-29ce-4d3b-bd43-61c0891eaa99\") " pod="openstack/dnsmasq-dns-6c696b9c6c-wpmmb" Dec 03 07:02:07 crc kubenswrapper[4475]: I1203 07:02:07.813550 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/06823879-29ce-4d3b-bd43-61c0891eaa99-ovsdbserver-sb\") pod \"dnsmasq-dns-6c696b9c6c-wpmmb\" (UID: 
\"06823879-29ce-4d3b-bd43-61c0891eaa99\") " pod="openstack/dnsmasq-dns-6c696b9c6c-wpmmb" Dec 03 07:02:07 crc kubenswrapper[4475]: I1203 07:02:07.813634 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-96r24\" (UniqueName: \"kubernetes.io/projected/06823879-29ce-4d3b-bd43-61c0891eaa99-kube-api-access-96r24\") pod \"dnsmasq-dns-6c696b9c6c-wpmmb\" (UID: \"06823879-29ce-4d3b-bd43-61c0891eaa99\") " pod="openstack/dnsmasq-dns-6c696b9c6c-wpmmb" Dec 03 07:02:07 crc kubenswrapper[4475]: I1203 07:02:07.813684 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/06823879-29ce-4d3b-bd43-61c0891eaa99-ovsdbserver-nb\") pod \"dnsmasq-dns-6c696b9c6c-wpmmb\" (UID: \"06823879-29ce-4d3b-bd43-61c0891eaa99\") " pod="openstack/dnsmasq-dns-6c696b9c6c-wpmmb" Dec 03 07:02:07 crc kubenswrapper[4475]: I1203 07:02:07.813720 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/06823879-29ce-4d3b-bd43-61c0891eaa99-config\") pod \"dnsmasq-dns-6c696b9c6c-wpmmb\" (UID: \"06823879-29ce-4d3b-bd43-61c0891eaa99\") " pod="openstack/dnsmasq-dns-6c696b9c6c-wpmmb" Dec 03 07:02:07 crc kubenswrapper[4475]: I1203 07:02:07.916537 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/06823879-29ce-4d3b-bd43-61c0891eaa99-dns-swift-storage-0\") pod \"dnsmasq-dns-6c696b9c6c-wpmmb\" (UID: \"06823879-29ce-4d3b-bd43-61c0891eaa99\") " pod="openstack/dnsmasq-dns-6c696b9c6c-wpmmb" Dec 03 07:02:07 crc kubenswrapper[4475]: I1203 07:02:07.916645 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/06823879-29ce-4d3b-bd43-61c0891eaa99-ovsdbserver-sb\") pod 
\"dnsmasq-dns-6c696b9c6c-wpmmb\" (UID: \"06823879-29ce-4d3b-bd43-61c0891eaa99\") " pod="openstack/dnsmasq-dns-6c696b9c6c-wpmmb" Dec 03 07:02:07 crc kubenswrapper[4475]: I1203 07:02:07.916740 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-96r24\" (UniqueName: \"kubernetes.io/projected/06823879-29ce-4d3b-bd43-61c0891eaa99-kube-api-access-96r24\") pod \"dnsmasq-dns-6c696b9c6c-wpmmb\" (UID: \"06823879-29ce-4d3b-bd43-61c0891eaa99\") " pod="openstack/dnsmasq-dns-6c696b9c6c-wpmmb" Dec 03 07:02:07 crc kubenswrapper[4475]: I1203 07:02:07.916798 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/06823879-29ce-4d3b-bd43-61c0891eaa99-ovsdbserver-nb\") pod \"dnsmasq-dns-6c696b9c6c-wpmmb\" (UID: \"06823879-29ce-4d3b-bd43-61c0891eaa99\") " pod="openstack/dnsmasq-dns-6c696b9c6c-wpmmb" Dec 03 07:02:07 crc kubenswrapper[4475]: I1203 07:02:07.916835 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/06823879-29ce-4d3b-bd43-61c0891eaa99-config\") pod \"dnsmasq-dns-6c696b9c6c-wpmmb\" (UID: \"06823879-29ce-4d3b-bd43-61c0891eaa99\") " pod="openstack/dnsmasq-dns-6c696b9c6c-wpmmb" Dec 03 07:02:07 crc kubenswrapper[4475]: I1203 07:02:07.916943 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/06823879-29ce-4d3b-bd43-61c0891eaa99-dns-svc\") pod \"dnsmasq-dns-6c696b9c6c-wpmmb\" (UID: \"06823879-29ce-4d3b-bd43-61c0891eaa99\") " pod="openstack/dnsmasq-dns-6c696b9c6c-wpmmb" Dec 03 07:02:07 crc kubenswrapper[4475]: I1203 07:02:07.918241 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/06823879-29ce-4d3b-bd43-61c0891eaa99-ovsdbserver-sb\") pod \"dnsmasq-dns-6c696b9c6c-wpmmb\" (UID: 
\"06823879-29ce-4d3b-bd43-61c0891eaa99\") " pod="openstack/dnsmasq-dns-6c696b9c6c-wpmmb" Dec 03 07:02:07 crc kubenswrapper[4475]: I1203 07:02:07.918291 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/06823879-29ce-4d3b-bd43-61c0891eaa99-dns-swift-storage-0\") pod \"dnsmasq-dns-6c696b9c6c-wpmmb\" (UID: \"06823879-29ce-4d3b-bd43-61c0891eaa99\") " pod="openstack/dnsmasq-dns-6c696b9c6c-wpmmb" Dec 03 07:02:07 crc kubenswrapper[4475]: I1203 07:02:07.918901 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/06823879-29ce-4d3b-bd43-61c0891eaa99-ovsdbserver-nb\") pod \"dnsmasq-dns-6c696b9c6c-wpmmb\" (UID: \"06823879-29ce-4d3b-bd43-61c0891eaa99\") " pod="openstack/dnsmasq-dns-6c696b9c6c-wpmmb" Dec 03 07:02:07 crc kubenswrapper[4475]: I1203 07:02:07.919333 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/06823879-29ce-4d3b-bd43-61c0891eaa99-config\") pod \"dnsmasq-dns-6c696b9c6c-wpmmb\" (UID: \"06823879-29ce-4d3b-bd43-61c0891eaa99\") " pod="openstack/dnsmasq-dns-6c696b9c6c-wpmmb" Dec 03 07:02:07 crc kubenswrapper[4475]: I1203 07:02:07.922388 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/06823879-29ce-4d3b-bd43-61c0891eaa99-dns-svc\") pod \"dnsmasq-dns-6c696b9c6c-wpmmb\" (UID: \"06823879-29ce-4d3b-bd43-61c0891eaa99\") " pod="openstack/dnsmasq-dns-6c696b9c6c-wpmmb" Dec 03 07:02:07 crc kubenswrapper[4475]: I1203 07:02:07.959021 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-96r24\" (UniqueName: \"kubernetes.io/projected/06823879-29ce-4d3b-bd43-61c0891eaa99-kube-api-access-96r24\") pod \"dnsmasq-dns-6c696b9c6c-wpmmb\" (UID: \"06823879-29ce-4d3b-bd43-61c0891eaa99\") " pod="openstack/dnsmasq-dns-6c696b9c6c-wpmmb" Dec 03 
07:02:08 crc kubenswrapper[4475]: I1203 07:02:08.003705 4475 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Dec 03 07:02:08 crc kubenswrapper[4475]: I1203 07:02:08.010509 4475 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-cell-mapping-5gv82"] Dec 03 07:02:08 crc kubenswrapper[4475]: W1203 07:02:08.014661 4475 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod08a09e1c_191b_46b4_92d7_dd92fb839342.slice/crio-6537775dc71d2cfe924826786bae0fa0522f0b9ab3ccb5ccd8dc4f04634eb5f7 WatchSource:0}: Error finding container 6537775dc71d2cfe924826786bae0fa0522f0b9ab3ccb5ccd8dc4f04634eb5f7: Status 404 returned error can't find the container with id 6537775dc71d2cfe924826786bae0fa0522f0b9ab3ccb5ccd8dc4f04634eb5f7 Dec 03 07:02:08 crc kubenswrapper[4475]: I1203 07:02:08.199798 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"206cd7f6-428b-4bcf-974c-74a1242401d1","Type":"ContainerStarted","Data":"c05b79d184f65249a34b88581d9bff9b8bd8091273e140fd3731cd82eaa389bc"} Dec 03 07:02:08 crc kubenswrapper[4475]: I1203 07:02:08.208319 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-5gv82" event={"ID":"08a09e1c-191b-46b4-92d7-dd92fb839342","Type":"ContainerStarted","Data":"6537775dc71d2cfe924826786bae0fa0522f0b9ab3ccb5ccd8dc4f04634eb5f7"} Dec 03 07:02:08 crc kubenswrapper[4475]: I1203 07:02:08.208682 4475 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0" Dec 03 07:02:08 crc kubenswrapper[4475]: I1203 07:02:08.209754 4475 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0" Dec 03 07:02:08 crc kubenswrapper[4475]: I1203 07:02:08.236170 4475 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-6c696b9c6c-wpmmb" Dec 03 07:02:08 crc kubenswrapper[4475]: I1203 07:02:08.461771 4475 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Dec 03 07:02:08 crc kubenswrapper[4475]: I1203 07:02:08.591877 4475 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Dec 03 07:02:08 crc kubenswrapper[4475]: W1203 07:02:08.612600 4475 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode6e2659c_bdb8_4015_b749_9d1bfd70620f.slice/crio-f38adfa30fd4ebb78cee694486062d19d33d13cf2c50659bb9a0ff66b6d44c52 WatchSource:0}: Error finding container f38adfa30fd4ebb78cee694486062d19d33d13cf2c50659bb9a0ff66b6d44c52: Status 404 returned error can't find the container with id f38adfa30fd4ebb78cee694486062d19d33d13cf2c50659bb9a0ff66b6d44c52 Dec 03 07:02:08 crc kubenswrapper[4475]: I1203 07:02:08.621606 4475 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0" Dec 03 07:02:08 crc kubenswrapper[4475]: I1203 07:02:08.622367 4475 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0" Dec 03 07:02:08 crc kubenswrapper[4475]: I1203 07:02:08.664517 4475 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Dec 03 07:02:08 crc kubenswrapper[4475]: I1203 07:02:08.684104 4475 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Dec 03 07:02:08 crc kubenswrapper[4475]: I1203 07:02:08.742890 4475 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0" Dec 03 07:02:08 crc kubenswrapper[4475]: I1203 07:02:08.745750 4475 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0" Dec 03 07:02:08 crc kubenswrapper[4475]: 
W1203 07:02:08.943364 4475 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod06823879_29ce_4d3b_bd43_61c0891eaa99.slice/crio-18d3683412f8546b6b2f323ae8f44a456c467a869e2333c80700c7568f56cade WatchSource:0}: Error finding container 18d3683412f8546b6b2f323ae8f44a456c467a869e2333c80700c7568f56cade: Status 404 returned error can't find the container with id 18d3683412f8546b6b2f323ae8f44a456c467a869e2333c80700c7568f56cade Dec 03 07:02:08 crc kubenswrapper[4475]: I1203 07:02:08.951565 4475 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-6c696b9c6c-wpmmb"] Dec 03 07:02:09 crc kubenswrapper[4475]: I1203 07:02:09.251585 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-5gv82" event={"ID":"08a09e1c-191b-46b4-92d7-dd92fb839342","Type":"ContainerStarted","Data":"a7c74056fcf164a6b7666d7fe9f0e04744b03a9e45ac1c3b7f550c0b142ad819"} Dec 03 07:02:09 crc kubenswrapper[4475]: I1203 07:02:09.266442 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"52e2f226-298b-4a85-98a5-abbb1a72320f","Type":"ContainerStarted","Data":"2974e92edfb0b79f84a191e3abd2e03c1824eb064edbd93f454abd7fe11ff3ba"} Dec 03 07:02:09 crc kubenswrapper[4475]: I1203 07:02:09.268426 4475 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-cell-mapping-5gv82" podStartSLOduration=3.26840946 podStartE2EDuration="3.26840946s" podCreationTimestamp="2025-12-03 07:02:06 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 07:02:09.265542909 +0000 UTC m=+1014.070441243" watchObservedRunningTime="2025-12-03 07:02:09.26840946 +0000 UTC m=+1014.073307794" Dec 03 07:02:09 crc kubenswrapper[4475]: I1203 07:02:09.288743 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/ceilometer-0" event={"ID":"206cd7f6-428b-4bcf-974c-74a1242401d1","Type":"ContainerStarted","Data":"acef07e05c489d38552ce792a6380a673769d90e4eafe57e1b60f5a84f4c9cef"} Dec 03 07:02:09 crc kubenswrapper[4475]: I1203 07:02:09.298157 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6c696b9c6c-wpmmb" event={"ID":"06823879-29ce-4d3b-bd43-61c0891eaa99","Type":"ContainerStarted","Data":"18d3683412f8546b6b2f323ae8f44a456c467a869e2333c80700c7568f56cade"} Dec 03 07:02:09 crc kubenswrapper[4475]: I1203 07:02:09.312207 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"1091a855-d763-4e97-aa03-66e64d9aae45","Type":"ContainerStarted","Data":"97462c3210d5b4ab38efb70aee484825f6072b81eeaaf5d7ed794aa5cbcd36b5"} Dec 03 07:02:09 crc kubenswrapper[4475]: I1203 07:02:09.316799 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"a9ae60aa-857f-48c8-80d0-a6f3d7490395","Type":"ContainerStarted","Data":"37690707ca7b02167e57f21a4813bae1885193e97d71bb7f4e3c02367c2e4459"} Dec 03 07:02:09 crc kubenswrapper[4475]: I1203 07:02:09.374257 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"e6e2659c-bdb8-4015-b749-9d1bfd70620f","Type":"ContainerStarted","Data":"f38adfa30fd4ebb78cee694486062d19d33d13cf2c50659bb9a0ff66b6d44c52"} Dec 03 07:02:09 crc kubenswrapper[4475]: I1203 07:02:09.375527 4475 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0" Dec 03 07:02:09 crc kubenswrapper[4475]: I1203 07:02:09.375780 4475 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0" Dec 03 07:02:10 crc kubenswrapper[4475]: I1203 07:02:10.188796 4475 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Dec 03 07:02:10 crc kubenswrapper[4475]: I1203 07:02:10.417152 4475 generic.go:334] "Generic 
(PLEG): container finished" podID="06823879-29ce-4d3b-bd43-61c0891eaa99" containerID="2d894f84e425ac28d916a64b73383e2bde66ae286d5963495998e9b37a054ff8" exitCode=0 Dec 03 07:02:10 crc kubenswrapper[4475]: I1203 07:02:10.417252 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6c696b9c6c-wpmmb" event={"ID":"06823879-29ce-4d3b-bd43-61c0891eaa99","Type":"ContainerDied","Data":"2d894f84e425ac28d916a64b73383e2bde66ae286d5963495998e9b37a054ff8"} Dec 03 07:02:10 crc kubenswrapper[4475]: I1203 07:02:10.458754 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"206cd7f6-428b-4bcf-974c-74a1242401d1","Type":"ContainerStarted","Data":"d94e655325a244f34243cbd065ab9d070535b023df6273dc446a12b0e55c6477"} Dec 03 07:02:10 crc kubenswrapper[4475]: I1203 07:02:10.459124 4475 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Dec 03 07:02:10 crc kubenswrapper[4475]: I1203 07:02:10.459152 4475 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Dec 03 07:02:10 crc kubenswrapper[4475]: I1203 07:02:10.588614 4475 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-conductor-db-sync-4x9fx"] Dec 03 07:02:10 crc kubenswrapper[4475]: I1203 07:02:10.590742 4475 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-4x9fx" Dec 03 07:02:10 crc kubenswrapper[4475]: I1203 07:02:10.595766 4475 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-conductor-config-data" Dec 03 07:02:10 crc kubenswrapper[4475]: I1203 07:02:10.597480 4475 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-conductor-scripts" Dec 03 07:02:10 crc kubenswrapper[4475]: I1203 07:02:10.627980 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b51cbc3f-c89a-4e16-814c-381aa017a61f-combined-ca-bundle\") pod \"nova-cell1-conductor-db-sync-4x9fx\" (UID: \"b51cbc3f-c89a-4e16-814c-381aa017a61f\") " pod="openstack/nova-cell1-conductor-db-sync-4x9fx" Dec 03 07:02:10 crc kubenswrapper[4475]: I1203 07:02:10.628294 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vgrgw\" (UniqueName: \"kubernetes.io/projected/b51cbc3f-c89a-4e16-814c-381aa017a61f-kube-api-access-vgrgw\") pod \"nova-cell1-conductor-db-sync-4x9fx\" (UID: \"b51cbc3f-c89a-4e16-814c-381aa017a61f\") " pod="openstack/nova-cell1-conductor-db-sync-4x9fx" Dec 03 07:02:10 crc kubenswrapper[4475]: I1203 07:02:10.628342 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b51cbc3f-c89a-4e16-814c-381aa017a61f-config-data\") pod \"nova-cell1-conductor-db-sync-4x9fx\" (UID: \"b51cbc3f-c89a-4e16-814c-381aa017a61f\") " pod="openstack/nova-cell1-conductor-db-sync-4x9fx" Dec 03 07:02:10 crc kubenswrapper[4475]: I1203 07:02:10.628371 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b51cbc3f-c89a-4e16-814c-381aa017a61f-scripts\") pod \"nova-cell1-conductor-db-sync-4x9fx\" 
(UID: \"b51cbc3f-c89a-4e16-814c-381aa017a61f\") " pod="openstack/nova-cell1-conductor-db-sync-4x9fx" Dec 03 07:02:10 crc kubenswrapper[4475]: I1203 07:02:10.643677 4475 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-4x9fx"] Dec 03 07:02:10 crc kubenswrapper[4475]: I1203 07:02:10.730519 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vgrgw\" (UniqueName: \"kubernetes.io/projected/b51cbc3f-c89a-4e16-814c-381aa017a61f-kube-api-access-vgrgw\") pod \"nova-cell1-conductor-db-sync-4x9fx\" (UID: \"b51cbc3f-c89a-4e16-814c-381aa017a61f\") " pod="openstack/nova-cell1-conductor-db-sync-4x9fx" Dec 03 07:02:10 crc kubenswrapper[4475]: I1203 07:02:10.730564 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b51cbc3f-c89a-4e16-814c-381aa017a61f-config-data\") pod \"nova-cell1-conductor-db-sync-4x9fx\" (UID: \"b51cbc3f-c89a-4e16-814c-381aa017a61f\") " pod="openstack/nova-cell1-conductor-db-sync-4x9fx" Dec 03 07:02:10 crc kubenswrapper[4475]: I1203 07:02:10.730589 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b51cbc3f-c89a-4e16-814c-381aa017a61f-scripts\") pod \"nova-cell1-conductor-db-sync-4x9fx\" (UID: \"b51cbc3f-c89a-4e16-814c-381aa017a61f\") " pod="openstack/nova-cell1-conductor-db-sync-4x9fx" Dec 03 07:02:10 crc kubenswrapper[4475]: I1203 07:02:10.730637 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b51cbc3f-c89a-4e16-814c-381aa017a61f-combined-ca-bundle\") pod \"nova-cell1-conductor-db-sync-4x9fx\" (UID: \"b51cbc3f-c89a-4e16-814c-381aa017a61f\") " pod="openstack/nova-cell1-conductor-db-sync-4x9fx" Dec 03 07:02:10 crc kubenswrapper[4475]: I1203 07:02:10.737972 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for 
volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b51cbc3f-c89a-4e16-814c-381aa017a61f-combined-ca-bundle\") pod \"nova-cell1-conductor-db-sync-4x9fx\" (UID: \"b51cbc3f-c89a-4e16-814c-381aa017a61f\") " pod="openstack/nova-cell1-conductor-db-sync-4x9fx" Dec 03 07:02:10 crc kubenswrapper[4475]: I1203 07:02:10.739845 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b51cbc3f-c89a-4e16-814c-381aa017a61f-config-data\") pod \"nova-cell1-conductor-db-sync-4x9fx\" (UID: \"b51cbc3f-c89a-4e16-814c-381aa017a61f\") " pod="openstack/nova-cell1-conductor-db-sync-4x9fx" Dec 03 07:02:10 crc kubenswrapper[4475]: I1203 07:02:10.744759 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b51cbc3f-c89a-4e16-814c-381aa017a61f-scripts\") pod \"nova-cell1-conductor-db-sync-4x9fx\" (UID: \"b51cbc3f-c89a-4e16-814c-381aa017a61f\") " pod="openstack/nova-cell1-conductor-db-sync-4x9fx" Dec 03 07:02:10 crc kubenswrapper[4475]: I1203 07:02:10.760922 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vgrgw\" (UniqueName: \"kubernetes.io/projected/b51cbc3f-c89a-4e16-814c-381aa017a61f-kube-api-access-vgrgw\") pod \"nova-cell1-conductor-db-sync-4x9fx\" (UID: \"b51cbc3f-c89a-4e16-814c-381aa017a61f\") " pod="openstack/nova-cell1-conductor-db-sync-4x9fx" Dec 03 07:02:10 crc kubenswrapper[4475]: I1203 07:02:10.909677 4475 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-4x9fx" Dec 03 07:02:11 crc kubenswrapper[4475]: I1203 07:02:11.476806 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6c696b9c6c-wpmmb" event={"ID":"06823879-29ce-4d3b-bd43-61c0891eaa99","Type":"ContainerStarted","Data":"446df6778e12102cfd2e1ce1bc9b07e5bb3e2524193513bc246db9b28fa20019"} Dec 03 07:02:11 crc kubenswrapper[4475]: I1203 07:02:11.477575 4475 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-6c696b9c6c-wpmmb" Dec 03 07:02:11 crc kubenswrapper[4475]: I1203 07:02:11.480826 4475 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-4x9fx"] Dec 03 07:02:11 crc kubenswrapper[4475]: I1203 07:02:11.498436 4475 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-6c696b9c6c-wpmmb" podStartSLOduration=4.4984158690000005 podStartE2EDuration="4.498415869s" podCreationTimestamp="2025-12-03 07:02:07 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 07:02:11.495240776 +0000 UTC m=+1016.300139130" watchObservedRunningTime="2025-12-03 07:02:11.498415869 +0000 UTC m=+1016.303314203" Dec 03 07:02:12 crc kubenswrapper[4475]: I1203 07:02:12.059983 4475 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Dec 03 07:02:12 crc kubenswrapper[4475]: I1203 07:02:12.089268 4475 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Dec 03 07:02:12 crc kubenswrapper[4475]: I1203 07:02:12.544205 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-4x9fx" event={"ID":"b51cbc3f-c89a-4e16-814c-381aa017a61f","Type":"ContainerStarted","Data":"339aebb2858f4681fbc7b76e2fb8459c2009b3951b391a0ec64a809d7d17f4ed"} Dec 03 07:02:12 crc kubenswrapper[4475]: I1203 
07:02:12.544263 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-4x9fx" event={"ID":"b51cbc3f-c89a-4e16-814c-381aa017a61f","Type":"ContainerStarted","Data":"1847957421a5e817acf0766395a5e6eccad9efbcae1a69505f6f68ca3776d767"} Dec 03 07:02:12 crc kubenswrapper[4475]: I1203 07:02:12.564776 4475 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-conductor-db-sync-4x9fx" podStartSLOduration=2.564763523 podStartE2EDuration="2.564763523s" podCreationTimestamp="2025-12-03 07:02:10 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 07:02:12.561727813 +0000 UTC m=+1017.366626147" watchObservedRunningTime="2025-12-03 07:02:12.564763523 +0000 UTC m=+1017.369661857" Dec 03 07:02:12 crc kubenswrapper[4475]: I1203 07:02:12.568947 4475 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="206cd7f6-428b-4bcf-974c-74a1242401d1" containerName="ceilometer-central-agent" containerID="cri-o://c05b79d184f65249a34b88581d9bff9b8bd8091273e140fd3731cd82eaa389bc" gracePeriod=30 Dec 03 07:02:12 crc kubenswrapper[4475]: I1203 07:02:12.569493 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"206cd7f6-428b-4bcf-974c-74a1242401d1","Type":"ContainerStarted","Data":"b456e80cebd3f1cb2834d881ccf2dc03f2fb2b8915ead4887890b643bcb93e24"} Dec 03 07:02:12 crc kubenswrapper[4475]: I1203 07:02:12.569554 4475 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Dec 03 07:02:12 crc kubenswrapper[4475]: I1203 07:02:12.569988 4475 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="206cd7f6-428b-4bcf-974c-74a1242401d1" containerName="proxy-httpd" containerID="cri-o://b456e80cebd3f1cb2834d881ccf2dc03f2fb2b8915ead4887890b643bcb93e24" 
gracePeriod=30 Dec 03 07:02:12 crc kubenswrapper[4475]: I1203 07:02:12.570098 4475 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="206cd7f6-428b-4bcf-974c-74a1242401d1" containerName="sg-core" containerID="cri-o://d94e655325a244f34243cbd065ab9d070535b023df6273dc446a12b0e55c6477" gracePeriod=30 Dec 03 07:02:12 crc kubenswrapper[4475]: I1203 07:02:12.570149 4475 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="206cd7f6-428b-4bcf-974c-74a1242401d1" containerName="ceilometer-notification-agent" containerID="cri-o://acef07e05c489d38552ce792a6380a673769d90e4eafe57e1b60f5a84f4c9cef" gracePeriod=30 Dec 03 07:02:12 crc kubenswrapper[4475]: I1203 07:02:12.611590 4475 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.365775364 podStartE2EDuration="7.611571732s" podCreationTimestamp="2025-12-03 07:02:05 +0000 UTC" firstStartedPulling="2025-12-03 07:02:06.354411235 +0000 UTC m=+1011.159309569" lastFinishedPulling="2025-12-03 07:02:11.600207613 +0000 UTC m=+1016.405105937" observedRunningTime="2025-12-03 07:02:12.598632602 +0000 UTC m=+1017.403530937" watchObservedRunningTime="2025-12-03 07:02:12.611571732 +0000 UTC m=+1017.416470066" Dec 03 07:02:13 crc kubenswrapper[4475]: I1203 07:02:13.098728 4475 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0" Dec 03 07:02:13 crc kubenswrapper[4475]: I1203 07:02:13.099045 4475 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Dec 03 07:02:13 crc kubenswrapper[4475]: I1203 07:02:13.104019 4475 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0" Dec 03 07:02:13 crc kubenswrapper[4475]: I1203 07:02:13.396532 4475 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openstack/glance-default-internal-api-0" Dec 03 07:02:13 crc kubenswrapper[4475]: I1203 07:02:13.396641 4475 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Dec 03 07:02:13 crc kubenswrapper[4475]: I1203 07:02:13.583605 4475 generic.go:334] "Generic (PLEG): container finished" podID="206cd7f6-428b-4bcf-974c-74a1242401d1" containerID="b456e80cebd3f1cb2834d881ccf2dc03f2fb2b8915ead4887890b643bcb93e24" exitCode=0 Dec 03 07:02:13 crc kubenswrapper[4475]: I1203 07:02:13.583647 4475 generic.go:334] "Generic (PLEG): container finished" podID="206cd7f6-428b-4bcf-974c-74a1242401d1" containerID="d94e655325a244f34243cbd065ab9d070535b023df6273dc446a12b0e55c6477" exitCode=2 Dec 03 07:02:13 crc kubenswrapper[4475]: I1203 07:02:13.583655 4475 generic.go:334] "Generic (PLEG): container finished" podID="206cd7f6-428b-4bcf-974c-74a1242401d1" containerID="acef07e05c489d38552ce792a6380a673769d90e4eafe57e1b60f5a84f4c9cef" exitCode=0 Dec 03 07:02:13 crc kubenswrapper[4475]: I1203 07:02:13.584244 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"206cd7f6-428b-4bcf-974c-74a1242401d1","Type":"ContainerDied","Data":"b456e80cebd3f1cb2834d881ccf2dc03f2fb2b8915ead4887890b643bcb93e24"} Dec 03 07:02:13 crc kubenswrapper[4475]: I1203 07:02:13.584270 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"206cd7f6-428b-4bcf-974c-74a1242401d1","Type":"ContainerDied","Data":"d94e655325a244f34243cbd065ab9d070535b023df6273dc446a12b0e55c6477"} Dec 03 07:02:13 crc kubenswrapper[4475]: I1203 07:02:13.584281 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"206cd7f6-428b-4bcf-974c-74a1242401d1","Type":"ContainerDied","Data":"acef07e05c489d38552ce792a6380a673769d90e4eafe57e1b60f5a84f4c9cef"} Dec 03 07:02:13 crc kubenswrapper[4475]: I1203 07:02:13.828567 4475 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openstack/glance-default-internal-api-0" Dec 03 07:02:15 crc kubenswrapper[4475]: I1203 07:02:15.619860 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"a9ae60aa-857f-48c8-80d0-a6f3d7490395","Type":"ContainerStarted","Data":"995eba915ebc5b5247cf60ddb9a9798927d684ac44c522f057d5c36edfbdce26"} Dec 03 07:02:15 crc kubenswrapper[4475]: I1203 07:02:15.623132 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"e6e2659c-bdb8-4015-b749-9d1bfd70620f","Type":"ContainerStarted","Data":"737bcd378dab2dbe69460b0840fe1bfe60f65e3193dd02ee5c886bd4a13e3371"} Dec 03 07:02:15 crc kubenswrapper[4475]: I1203 07:02:15.647669 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"52e2f226-298b-4a85-98a5-abbb1a72320f","Type":"ContainerStarted","Data":"2ed38d76695ea0fe62cd693f8f4977e088e10bb5abd97b20bb2c778f25c5dcec"} Dec 03 07:02:15 crc kubenswrapper[4475]: I1203 07:02:15.647927 4475 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-cell1-novncproxy-0" podUID="52e2f226-298b-4a85-98a5-abbb1a72320f" containerName="nova-cell1-novncproxy-novncproxy" containerID="cri-o://2ed38d76695ea0fe62cd693f8f4977e088e10bb5abd97b20bb2c778f25c5dcec" gracePeriod=30 Dec 03 07:02:15 crc kubenswrapper[4475]: I1203 07:02:15.734462 4475 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-scheduler-0" podStartSLOduration=2.175922238 podStartE2EDuration="8.734426595s" podCreationTimestamp="2025-12-03 07:02:07 +0000 UTC" firstStartedPulling="2025-12-03 07:02:08.516158895 +0000 UTC m=+1013.321057229" lastFinishedPulling="2025-12-03 07:02:15.074663253 +0000 UTC m=+1019.879561586" observedRunningTime="2025-12-03 07:02:15.683931572 +0000 UTC m=+1020.488829906" watchObservedRunningTime="2025-12-03 07:02:15.734426595 +0000 UTC m=+1020.539324930" Dec 03 07:02:15 crc kubenswrapper[4475]: I1203 
07:02:15.777567 4475 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-novncproxy-0" podStartSLOduration=2.442237127 podStartE2EDuration="8.777548359s" podCreationTimestamp="2025-12-03 07:02:07 +0000 UTC" firstStartedPulling="2025-12-03 07:02:08.74137721 +0000 UTC m=+1013.546275544" lastFinishedPulling="2025-12-03 07:02:15.076688442 +0000 UTC m=+1019.881586776" observedRunningTime="2025-12-03 07:02:15.716729396 +0000 UTC m=+1020.521627730" watchObservedRunningTime="2025-12-03 07:02:15.777548359 +0000 UTC m=+1020.582446703" Dec 03 07:02:16 crc kubenswrapper[4475]: I1203 07:02:16.657560 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"1091a855-d763-4e97-aa03-66e64d9aae45","Type":"ContainerStarted","Data":"90fd3222b68882aad0e8976a95f5831204b3e58a81842c1a8c4e1184ad0ccdf0"} Dec 03 07:02:16 crc kubenswrapper[4475]: I1203 07:02:16.657877 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"1091a855-d763-4e97-aa03-66e64d9aae45","Type":"ContainerStarted","Data":"d04fdf66539d5824449205434c334970601b18a7c9162d74555c064584188c64"} Dec 03 07:02:16 crc kubenswrapper[4475]: I1203 07:02:16.659630 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"e6e2659c-bdb8-4015-b749-9d1bfd70620f","Type":"ContainerStarted","Data":"f245f9ad977b3aebdc343616c6d31736509406d370ad2981b0e5ebf7d0e01d9a"} Dec 03 07:02:16 crc kubenswrapper[4475]: I1203 07:02:16.659730 4475 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="e6e2659c-bdb8-4015-b749-9d1bfd70620f" containerName="nova-metadata-log" containerID="cri-o://737bcd378dab2dbe69460b0840fe1bfe60f65e3193dd02ee5c886bd4a13e3371" gracePeriod=30 Dec 03 07:02:16 crc kubenswrapper[4475]: I1203 07:02:16.659773 4475 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" 
podUID="e6e2659c-bdb8-4015-b749-9d1bfd70620f" containerName="nova-metadata-metadata" containerID="cri-o://f245f9ad977b3aebdc343616c6d31736509406d370ad2981b0e5ebf7d0e01d9a" gracePeriod=30 Dec 03 07:02:16 crc kubenswrapper[4475]: I1203 07:02:16.708166 4475 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=3.303408618 podStartE2EDuration="9.708134452s" podCreationTimestamp="2025-12-03 07:02:07 +0000 UTC" firstStartedPulling="2025-12-03 07:02:08.670991482 +0000 UTC m=+1013.475889805" lastFinishedPulling="2025-12-03 07:02:15.075717305 +0000 UTC m=+1019.880615639" observedRunningTime="2025-12-03 07:02:16.683598448 +0000 UTC m=+1021.488496782" watchObservedRunningTime="2025-12-03 07:02:16.708134452 +0000 UTC m=+1021.513032786" Dec 03 07:02:16 crc kubenswrapper[4475]: I1203 07:02:16.712593 4475 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=3.286135868 podStartE2EDuration="9.712579984s" podCreationTimestamp="2025-12-03 07:02:07 +0000 UTC" firstStartedPulling="2025-12-03 07:02:08.645917186 +0000 UTC m=+1013.450815520" lastFinishedPulling="2025-12-03 07:02:15.072361302 +0000 UTC m=+1019.877259636" observedRunningTime="2025-12-03 07:02:16.703900598 +0000 UTC m=+1021.508798932" watchObservedRunningTime="2025-12-03 07:02:16.712579984 +0000 UTC m=+1021.517478318" Dec 03 07:02:17 crc kubenswrapper[4475]: I1203 07:02:17.518023 4475 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-scheduler-0" Dec 03 07:02:17 crc kubenswrapper[4475]: I1203 07:02:17.518292 4475 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-scheduler-0" Dec 03 07:02:17 crc kubenswrapper[4475]: I1203 07:02:17.532625 4475 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-scheduler-0" Dec 03 07:02:17 crc kubenswrapper[4475]: I1203 07:02:17.602361 4475 kubelet.go:2542] 
"SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell1-novncproxy-0" Dec 03 07:02:17 crc kubenswrapper[4475]: I1203 07:02:17.607797 4475 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Dec 03 07:02:17 crc kubenswrapper[4475]: I1203 07:02:17.668144 4475 generic.go:334] "Generic (PLEG): container finished" podID="e6e2659c-bdb8-4015-b749-9d1bfd70620f" containerID="f245f9ad977b3aebdc343616c6d31736509406d370ad2981b0e5ebf7d0e01d9a" exitCode=0 Dec 03 07:02:17 crc kubenswrapper[4475]: I1203 07:02:17.668171 4475 generic.go:334] "Generic (PLEG): container finished" podID="e6e2659c-bdb8-4015-b749-9d1bfd70620f" containerID="737bcd378dab2dbe69460b0840fe1bfe60f65e3193dd02ee5c886bd4a13e3371" exitCode=143 Dec 03 07:02:17 crc kubenswrapper[4475]: I1203 07:02:17.668190 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"e6e2659c-bdb8-4015-b749-9d1bfd70620f","Type":"ContainerDied","Data":"f245f9ad977b3aebdc343616c6d31736509406d370ad2981b0e5ebf7d0e01d9a"} Dec 03 07:02:17 crc kubenswrapper[4475]: I1203 07:02:17.668229 4475 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Dec 03 07:02:17 crc kubenswrapper[4475]: I1203 07:02:17.668251 4475 scope.go:117] "RemoveContainer" containerID="f245f9ad977b3aebdc343616c6d31736509406d370ad2981b0e5ebf7d0e01d9a" Dec 03 07:02:17 crc kubenswrapper[4475]: I1203 07:02:17.668235 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"e6e2659c-bdb8-4015-b749-9d1bfd70620f","Type":"ContainerDied","Data":"737bcd378dab2dbe69460b0840fe1bfe60f65e3193dd02ee5c886bd4a13e3371"} Dec 03 07:02:17 crc kubenswrapper[4475]: I1203 07:02:17.668292 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"e6e2659c-bdb8-4015-b749-9d1bfd70620f","Type":"ContainerDied","Data":"f38adfa30fd4ebb78cee694486062d19d33d13cf2c50659bb9a0ff66b6d44c52"} Dec 03 07:02:17 crc kubenswrapper[4475]: I1203 07:02:17.694506 4475 scope.go:117] "RemoveContainer" containerID="737bcd378dab2dbe69460b0840fe1bfe60f65e3193dd02ee5c886bd4a13e3371" Dec 03 07:02:17 crc kubenswrapper[4475]: I1203 07:02:17.699959 4475 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-scheduler-0" Dec 03 07:02:17 crc kubenswrapper[4475]: I1203 07:02:17.723416 4475 scope.go:117] "RemoveContainer" containerID="f245f9ad977b3aebdc343616c6d31736509406d370ad2981b0e5ebf7d0e01d9a" Dec 03 07:02:17 crc kubenswrapper[4475]: E1203 07:02:17.723927 4475 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f245f9ad977b3aebdc343616c6d31736509406d370ad2981b0e5ebf7d0e01d9a\": container with ID starting with f245f9ad977b3aebdc343616c6d31736509406d370ad2981b0e5ebf7d0e01d9a not found: ID does not exist" containerID="f245f9ad977b3aebdc343616c6d31736509406d370ad2981b0e5ebf7d0e01d9a" Dec 03 07:02:17 crc kubenswrapper[4475]: I1203 07:02:17.723961 4475 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"f245f9ad977b3aebdc343616c6d31736509406d370ad2981b0e5ebf7d0e01d9a"} err="failed to get container status \"f245f9ad977b3aebdc343616c6d31736509406d370ad2981b0e5ebf7d0e01d9a\": rpc error: code = NotFound desc = could not find container \"f245f9ad977b3aebdc343616c6d31736509406d370ad2981b0e5ebf7d0e01d9a\": container with ID starting with f245f9ad977b3aebdc343616c6d31736509406d370ad2981b0e5ebf7d0e01d9a not found: ID does not exist" Dec 03 07:02:17 crc kubenswrapper[4475]: I1203 07:02:17.723981 4475 scope.go:117] "RemoveContainer" containerID="737bcd378dab2dbe69460b0840fe1bfe60f65e3193dd02ee5c886bd4a13e3371" Dec 03 07:02:17 crc kubenswrapper[4475]: E1203 07:02:17.724215 4475 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"737bcd378dab2dbe69460b0840fe1bfe60f65e3193dd02ee5c886bd4a13e3371\": container with ID starting with 737bcd378dab2dbe69460b0840fe1bfe60f65e3193dd02ee5c886bd4a13e3371 not found: ID does not exist" containerID="737bcd378dab2dbe69460b0840fe1bfe60f65e3193dd02ee5c886bd4a13e3371" Dec 03 07:02:17 crc kubenswrapper[4475]: I1203 07:02:17.724237 4475 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"737bcd378dab2dbe69460b0840fe1bfe60f65e3193dd02ee5c886bd4a13e3371"} err="failed to get container status \"737bcd378dab2dbe69460b0840fe1bfe60f65e3193dd02ee5c886bd4a13e3371\": rpc error: code = NotFound desc = could not find container \"737bcd378dab2dbe69460b0840fe1bfe60f65e3193dd02ee5c886bd4a13e3371\": container with ID starting with 737bcd378dab2dbe69460b0840fe1bfe60f65e3193dd02ee5c886bd4a13e3371 not found: ID does not exist" Dec 03 07:02:17 crc kubenswrapper[4475]: I1203 07:02:17.724252 4475 scope.go:117] "RemoveContainer" containerID="f245f9ad977b3aebdc343616c6d31736509406d370ad2981b0e5ebf7d0e01d9a" Dec 03 07:02:17 crc kubenswrapper[4475]: I1203 07:02:17.724468 4475 pod_container_deletor.go:53] "DeleteContainer 
returned error" containerID={"Type":"cri-o","ID":"f245f9ad977b3aebdc343616c6d31736509406d370ad2981b0e5ebf7d0e01d9a"} err="failed to get container status \"f245f9ad977b3aebdc343616c6d31736509406d370ad2981b0e5ebf7d0e01d9a\": rpc error: code = NotFound desc = could not find container \"f245f9ad977b3aebdc343616c6d31736509406d370ad2981b0e5ebf7d0e01d9a\": container with ID starting with f245f9ad977b3aebdc343616c6d31736509406d370ad2981b0e5ebf7d0e01d9a not found: ID does not exist" Dec 03 07:02:17 crc kubenswrapper[4475]: I1203 07:02:17.724495 4475 scope.go:117] "RemoveContainer" containerID="737bcd378dab2dbe69460b0840fe1bfe60f65e3193dd02ee5c886bd4a13e3371" Dec 03 07:02:17 crc kubenswrapper[4475]: I1203 07:02:17.725562 4475 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"737bcd378dab2dbe69460b0840fe1bfe60f65e3193dd02ee5c886bd4a13e3371"} err="failed to get container status \"737bcd378dab2dbe69460b0840fe1bfe60f65e3193dd02ee5c886bd4a13e3371\": rpc error: code = NotFound desc = could not find container \"737bcd378dab2dbe69460b0840fe1bfe60f65e3193dd02ee5c886bd4a13e3371\": container with ID starting with 737bcd378dab2dbe69460b0840fe1bfe60f65e3193dd02ee5c886bd4a13e3371 not found: ID does not exist" Dec 03 07:02:17 crc kubenswrapper[4475]: I1203 07:02:17.765877 4475 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e6e2659c-bdb8-4015-b749-9d1bfd70620f-combined-ca-bundle\") pod \"e6e2659c-bdb8-4015-b749-9d1bfd70620f\" (UID: \"e6e2659c-bdb8-4015-b749-9d1bfd70620f\") " Dec 03 07:02:17 crc kubenswrapper[4475]: I1203 07:02:17.766047 4475 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e6e2659c-bdb8-4015-b749-9d1bfd70620f-logs\") pod \"e6e2659c-bdb8-4015-b749-9d1bfd70620f\" (UID: \"e6e2659c-bdb8-4015-b749-9d1bfd70620f\") " Dec 03 07:02:17 crc kubenswrapper[4475]: I1203 
07:02:17.766115 4475 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sp9gb\" (UniqueName: \"kubernetes.io/projected/e6e2659c-bdb8-4015-b749-9d1bfd70620f-kube-api-access-sp9gb\") pod \"e6e2659c-bdb8-4015-b749-9d1bfd70620f\" (UID: \"e6e2659c-bdb8-4015-b749-9d1bfd70620f\") " Dec 03 07:02:17 crc kubenswrapper[4475]: I1203 07:02:17.766153 4475 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e6e2659c-bdb8-4015-b749-9d1bfd70620f-config-data\") pod \"e6e2659c-bdb8-4015-b749-9d1bfd70620f\" (UID: \"e6e2659c-bdb8-4015-b749-9d1bfd70620f\") " Dec 03 07:02:17 crc kubenswrapper[4475]: I1203 07:02:17.766696 4475 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e6e2659c-bdb8-4015-b749-9d1bfd70620f-logs" (OuterVolumeSpecName: "logs") pod "e6e2659c-bdb8-4015-b749-9d1bfd70620f" (UID: "e6e2659c-bdb8-4015-b749-9d1bfd70620f"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 07:02:17 crc kubenswrapper[4475]: I1203 07:02:17.767327 4475 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Dec 03 07:02:17 crc kubenswrapper[4475]: I1203 07:02:17.767636 4475 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Dec 03 07:02:17 crc kubenswrapper[4475]: I1203 07:02:17.788564 4475 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e6e2659c-bdb8-4015-b749-9d1bfd70620f-kube-api-access-sp9gb" (OuterVolumeSpecName: "kube-api-access-sp9gb") pod "e6e2659c-bdb8-4015-b749-9d1bfd70620f" (UID: "e6e2659c-bdb8-4015-b749-9d1bfd70620f"). InnerVolumeSpecName "kube-api-access-sp9gb". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 07:02:17 crc kubenswrapper[4475]: I1203 07:02:17.799140 4475 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e6e2659c-bdb8-4015-b749-9d1bfd70620f-config-data" (OuterVolumeSpecName: "config-data") pod "e6e2659c-bdb8-4015-b749-9d1bfd70620f" (UID: "e6e2659c-bdb8-4015-b749-9d1bfd70620f"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 07:02:17 crc kubenswrapper[4475]: I1203 07:02:17.858524 4475 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e6e2659c-bdb8-4015-b749-9d1bfd70620f-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "e6e2659c-bdb8-4015-b749-9d1bfd70620f" (UID: "e6e2659c-bdb8-4015-b749-9d1bfd70620f"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 07:02:17 crc kubenswrapper[4475]: I1203 07:02:17.869242 4475 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e6e2659c-bdb8-4015-b749-9d1bfd70620f-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 03 07:02:17 crc kubenswrapper[4475]: I1203 07:02:17.869293 4475 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e6e2659c-bdb8-4015-b749-9d1bfd70620f-logs\") on node \"crc\" DevicePath \"\"" Dec 03 07:02:17 crc kubenswrapper[4475]: I1203 07:02:17.869305 4475 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sp9gb\" (UniqueName: \"kubernetes.io/projected/e6e2659c-bdb8-4015-b749-9d1bfd70620f-kube-api-access-sp9gb\") on node \"crc\" DevicePath \"\"" Dec 03 07:02:17 crc kubenswrapper[4475]: I1203 07:02:17.869317 4475 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e6e2659c-bdb8-4015-b749-9d1bfd70620f-config-data\") on node \"crc\" DevicePath \"\"" Dec 03 
07:02:17 crc kubenswrapper[4475]: I1203 07:02:17.994471 4475 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Dec 03 07:02:18 crc kubenswrapper[4475]: I1203 07:02:18.003417 4475 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-metadata-0"] Dec 03 07:02:18 crc kubenswrapper[4475]: I1203 07:02:18.017232 4475 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"] Dec 03 07:02:18 crc kubenswrapper[4475]: E1203 07:02:18.017565 4475 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e6e2659c-bdb8-4015-b749-9d1bfd70620f" containerName="nova-metadata-log" Dec 03 07:02:18 crc kubenswrapper[4475]: I1203 07:02:18.017583 4475 state_mem.go:107] "Deleted CPUSet assignment" podUID="e6e2659c-bdb8-4015-b749-9d1bfd70620f" containerName="nova-metadata-log" Dec 03 07:02:18 crc kubenswrapper[4475]: E1203 07:02:18.017597 4475 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e6e2659c-bdb8-4015-b749-9d1bfd70620f" containerName="nova-metadata-metadata" Dec 03 07:02:18 crc kubenswrapper[4475]: I1203 07:02:18.017602 4475 state_mem.go:107] "Deleted CPUSet assignment" podUID="e6e2659c-bdb8-4015-b749-9d1bfd70620f" containerName="nova-metadata-metadata" Dec 03 07:02:18 crc kubenswrapper[4475]: I1203 07:02:18.017772 4475 memory_manager.go:354] "RemoveStaleState removing state" podUID="e6e2659c-bdb8-4015-b749-9d1bfd70620f" containerName="nova-metadata-metadata" Dec 03 07:02:18 crc kubenswrapper[4475]: I1203 07:02:18.017800 4475 memory_manager.go:354] "RemoveStaleState removing state" podUID="e6e2659c-bdb8-4015-b749-9d1bfd70620f" containerName="nova-metadata-log" Dec 03 07:02:18 crc kubenswrapper[4475]: I1203 07:02:18.018630 4475 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Dec 03 07:02:18 crc kubenswrapper[4475]: I1203 07:02:18.020348 4475 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-metadata-internal-svc" Dec 03 07:02:18 crc kubenswrapper[4475]: I1203 07:02:18.022683 4475 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data" Dec 03 07:02:18 crc kubenswrapper[4475]: I1203 07:02:18.036476 4475 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Dec 03 07:02:18 crc kubenswrapper[4475]: I1203 07:02:18.182504 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4b47a8e8-a852-4c60-8ee0-a68fcdf6d91f-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"4b47a8e8-a852-4c60-8ee0-a68fcdf6d91f\") " pod="openstack/nova-metadata-0" Dec 03 07:02:18 crc kubenswrapper[4475]: I1203 07:02:18.182561 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4b47a8e8-a852-4c60-8ee0-a68fcdf6d91f-logs\") pod \"nova-metadata-0\" (UID: \"4b47a8e8-a852-4c60-8ee0-a68fcdf6d91f\") " pod="openstack/nova-metadata-0" Dec 03 07:02:18 crc kubenswrapper[4475]: I1203 07:02:18.182639 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t4448\" (UniqueName: \"kubernetes.io/projected/4b47a8e8-a852-4c60-8ee0-a68fcdf6d91f-kube-api-access-t4448\") pod \"nova-metadata-0\" (UID: \"4b47a8e8-a852-4c60-8ee0-a68fcdf6d91f\") " pod="openstack/nova-metadata-0" Dec 03 07:02:18 crc kubenswrapper[4475]: I1203 07:02:18.182683 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4b47a8e8-a852-4c60-8ee0-a68fcdf6d91f-config-data\") pod \"nova-metadata-0\" (UID: 
\"4b47a8e8-a852-4c60-8ee0-a68fcdf6d91f\") " pod="openstack/nova-metadata-0" Dec 03 07:02:18 crc kubenswrapper[4475]: I1203 07:02:18.182837 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/4b47a8e8-a852-4c60-8ee0-a68fcdf6d91f-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"4b47a8e8-a852-4c60-8ee0-a68fcdf6d91f\") " pod="openstack/nova-metadata-0" Dec 03 07:02:18 crc kubenswrapper[4475]: I1203 07:02:18.238620 4475 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-6c696b9c6c-wpmmb" Dec 03 07:02:18 crc kubenswrapper[4475]: I1203 07:02:18.286514 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/4b47a8e8-a852-4c60-8ee0-a68fcdf6d91f-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"4b47a8e8-a852-4c60-8ee0-a68fcdf6d91f\") " pod="openstack/nova-metadata-0" Dec 03 07:02:18 crc kubenswrapper[4475]: I1203 07:02:18.286630 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4b47a8e8-a852-4c60-8ee0-a68fcdf6d91f-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"4b47a8e8-a852-4c60-8ee0-a68fcdf6d91f\") " pod="openstack/nova-metadata-0" Dec 03 07:02:18 crc kubenswrapper[4475]: I1203 07:02:18.286652 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4b47a8e8-a852-4c60-8ee0-a68fcdf6d91f-logs\") pod \"nova-metadata-0\" (UID: \"4b47a8e8-a852-4c60-8ee0-a68fcdf6d91f\") " pod="openstack/nova-metadata-0" Dec 03 07:02:18 crc kubenswrapper[4475]: I1203 07:02:18.286680 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-t4448\" (UniqueName: 
\"kubernetes.io/projected/4b47a8e8-a852-4c60-8ee0-a68fcdf6d91f-kube-api-access-t4448\") pod \"nova-metadata-0\" (UID: \"4b47a8e8-a852-4c60-8ee0-a68fcdf6d91f\") " pod="openstack/nova-metadata-0" Dec 03 07:02:18 crc kubenswrapper[4475]: I1203 07:02:18.286695 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4b47a8e8-a852-4c60-8ee0-a68fcdf6d91f-config-data\") pod \"nova-metadata-0\" (UID: \"4b47a8e8-a852-4c60-8ee0-a68fcdf6d91f\") " pod="openstack/nova-metadata-0" Dec 03 07:02:18 crc kubenswrapper[4475]: I1203 07:02:18.287714 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4b47a8e8-a852-4c60-8ee0-a68fcdf6d91f-logs\") pod \"nova-metadata-0\" (UID: \"4b47a8e8-a852-4c60-8ee0-a68fcdf6d91f\") " pod="openstack/nova-metadata-0" Dec 03 07:02:18 crc kubenswrapper[4475]: I1203 07:02:18.304936 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-t4448\" (UniqueName: \"kubernetes.io/projected/4b47a8e8-a852-4c60-8ee0-a68fcdf6d91f-kube-api-access-t4448\") pod \"nova-metadata-0\" (UID: \"4b47a8e8-a852-4c60-8ee0-a68fcdf6d91f\") " pod="openstack/nova-metadata-0" Dec 03 07:02:18 crc kubenswrapper[4475]: I1203 07:02:18.306875 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/4b47a8e8-a852-4c60-8ee0-a68fcdf6d91f-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"4b47a8e8-a852-4c60-8ee0-a68fcdf6d91f\") " pod="openstack/nova-metadata-0" Dec 03 07:02:18 crc kubenswrapper[4475]: I1203 07:02:18.313967 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4b47a8e8-a852-4c60-8ee0-a68fcdf6d91f-config-data\") pod \"nova-metadata-0\" (UID: \"4b47a8e8-a852-4c60-8ee0-a68fcdf6d91f\") " pod="openstack/nova-metadata-0" Dec 03 07:02:18 crc 
kubenswrapper[4475]: I1203 07:02:18.318391 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4b47a8e8-a852-4c60-8ee0-a68fcdf6d91f-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"4b47a8e8-a852-4c60-8ee0-a68fcdf6d91f\") " pod="openstack/nova-metadata-0" Dec 03 07:02:18 crc kubenswrapper[4475]: I1203 07:02:18.332478 4475 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-664766cd5c-v6774"] Dec 03 07:02:18 crc kubenswrapper[4475]: I1203 07:02:18.332721 4475 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-664766cd5c-v6774" podUID="f5472efc-49e7-4b70-9084-da3a969617ab" containerName="dnsmasq-dns" containerID="cri-o://698e47c272c2b90493405777f693fd83df0b75195f388a0cc0e50059bbf82f9e" gracePeriod=10 Dec 03 07:02:18 crc kubenswrapper[4475]: I1203 07:02:18.332956 4475 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Dec 03 07:02:18 crc kubenswrapper[4475]: I1203 07:02:18.677897 4475 generic.go:334] "Generic (PLEG): container finished" podID="f5472efc-49e7-4b70-9084-da3a969617ab" containerID="698e47c272c2b90493405777f693fd83df0b75195f388a0cc0e50059bbf82f9e" exitCode=0 Dec 03 07:02:18 crc kubenswrapper[4475]: I1203 07:02:18.678283 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-664766cd5c-v6774" event={"ID":"f5472efc-49e7-4b70-9084-da3a969617ab","Type":"ContainerDied","Data":"698e47c272c2b90493405777f693fd83df0b75195f388a0cc0e50059bbf82f9e"} Dec 03 07:02:18 crc kubenswrapper[4475]: I1203 07:02:18.851628 4475 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="1091a855-d763-4e97-aa03-66e64d9aae45" containerName="nova-api-log" probeResult="failure" output="Get \"http://10.217.0.201:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Dec 03 07:02:18 crc 
kubenswrapper[4475]: I1203 07:02:18.851845 4475 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="1091a855-d763-4e97-aa03-66e64d9aae45" containerName="nova-api-api" probeResult="failure" output="Get \"http://10.217.0.201:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Dec 03 07:02:18 crc kubenswrapper[4475]: I1203 07:02:18.948181 4475 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Dec 03 07:02:19 crc kubenswrapper[4475]: I1203 07:02:19.191681 4475 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-664766cd5c-v6774" Dec 03 07:02:19 crc kubenswrapper[4475]: I1203 07:02:19.320095 4475 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f5472efc-49e7-4b70-9084-da3a969617ab-config\") pod \"f5472efc-49e7-4b70-9084-da3a969617ab\" (UID: \"f5472efc-49e7-4b70-9084-da3a969617ab\") " Dec 03 07:02:19 crc kubenswrapper[4475]: I1203 07:02:19.320196 4475 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/f5472efc-49e7-4b70-9084-da3a969617ab-ovsdbserver-nb\") pod \"f5472efc-49e7-4b70-9084-da3a969617ab\" (UID: \"f5472efc-49e7-4b70-9084-da3a969617ab\") " Dec 03 07:02:19 crc kubenswrapper[4475]: I1203 07:02:19.320277 4475 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/f5472efc-49e7-4b70-9084-da3a969617ab-dns-svc\") pod \"f5472efc-49e7-4b70-9084-da3a969617ab\" (UID: \"f5472efc-49e7-4b70-9084-da3a969617ab\") " Dec 03 07:02:19 crc kubenswrapper[4475]: I1203 07:02:19.320336 4475 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/f5472efc-49e7-4b70-9084-da3a969617ab-dns-swift-storage-0\") pod 
\"f5472efc-49e7-4b70-9084-da3a969617ab\" (UID: \"f5472efc-49e7-4b70-9084-da3a969617ab\") " Dec 03 07:02:19 crc kubenswrapper[4475]: I1203 07:02:19.320358 4475 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/f5472efc-49e7-4b70-9084-da3a969617ab-ovsdbserver-sb\") pod \"f5472efc-49e7-4b70-9084-da3a969617ab\" (UID: \"f5472efc-49e7-4b70-9084-da3a969617ab\") " Dec 03 07:02:19 crc kubenswrapper[4475]: I1203 07:02:19.320398 4475 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hkjhh\" (UniqueName: \"kubernetes.io/projected/f5472efc-49e7-4b70-9084-da3a969617ab-kube-api-access-hkjhh\") pod \"f5472efc-49e7-4b70-9084-da3a969617ab\" (UID: \"f5472efc-49e7-4b70-9084-da3a969617ab\") " Dec 03 07:02:19 crc kubenswrapper[4475]: I1203 07:02:19.326583 4475 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f5472efc-49e7-4b70-9084-da3a969617ab-kube-api-access-hkjhh" (OuterVolumeSpecName: "kube-api-access-hkjhh") pod "f5472efc-49e7-4b70-9084-da3a969617ab" (UID: "f5472efc-49e7-4b70-9084-da3a969617ab"). InnerVolumeSpecName "kube-api-access-hkjhh". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 07:02:19 crc kubenswrapper[4475]: I1203 07:02:19.370040 4475 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f5472efc-49e7-4b70-9084-da3a969617ab-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "f5472efc-49e7-4b70-9084-da3a969617ab" (UID: "f5472efc-49e7-4b70-9084-da3a969617ab"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 07:02:19 crc kubenswrapper[4475]: I1203 07:02:19.382109 4475 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f5472efc-49e7-4b70-9084-da3a969617ab-config" (OuterVolumeSpecName: "config") pod "f5472efc-49e7-4b70-9084-da3a969617ab" (UID: "f5472efc-49e7-4b70-9084-da3a969617ab"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 07:02:19 crc kubenswrapper[4475]: I1203 07:02:19.386658 4475 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f5472efc-49e7-4b70-9084-da3a969617ab-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "f5472efc-49e7-4b70-9084-da3a969617ab" (UID: "f5472efc-49e7-4b70-9084-da3a969617ab"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 07:02:19 crc kubenswrapper[4475]: I1203 07:02:19.392707 4475 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f5472efc-49e7-4b70-9084-da3a969617ab-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "f5472efc-49e7-4b70-9084-da3a969617ab" (UID: "f5472efc-49e7-4b70-9084-da3a969617ab"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 07:02:19 crc kubenswrapper[4475]: I1203 07:02:19.400629 4475 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f5472efc-49e7-4b70-9084-da3a969617ab-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "f5472efc-49e7-4b70-9084-da3a969617ab" (UID: "f5472efc-49e7-4b70-9084-da3a969617ab"). InnerVolumeSpecName "dns-swift-storage-0". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 07:02:19 crc kubenswrapper[4475]: I1203 07:02:19.422948 4475 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/f5472efc-49e7-4b70-9084-da3a969617ab-dns-svc\") on node \"crc\" DevicePath \"\"" Dec 03 07:02:19 crc kubenswrapper[4475]: I1203 07:02:19.423001 4475 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/f5472efc-49e7-4b70-9084-da3a969617ab-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Dec 03 07:02:19 crc kubenswrapper[4475]: I1203 07:02:19.423013 4475 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/f5472efc-49e7-4b70-9084-da3a969617ab-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Dec 03 07:02:19 crc kubenswrapper[4475]: I1203 07:02:19.423023 4475 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hkjhh\" (UniqueName: \"kubernetes.io/projected/f5472efc-49e7-4b70-9084-da3a969617ab-kube-api-access-hkjhh\") on node \"crc\" DevicePath \"\"" Dec 03 07:02:19 crc kubenswrapper[4475]: I1203 07:02:19.423031 4475 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f5472efc-49e7-4b70-9084-da3a969617ab-config\") on node \"crc\" DevicePath \"\"" Dec 03 07:02:19 crc kubenswrapper[4475]: I1203 07:02:19.423038 4475 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/f5472efc-49e7-4b70-9084-da3a969617ab-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Dec 03 07:02:19 crc kubenswrapper[4475]: I1203 07:02:19.499529 4475 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e6e2659c-bdb8-4015-b749-9d1bfd70620f" path="/var/lib/kubelet/pods/e6e2659c-bdb8-4015-b749-9d1bfd70620f/volumes" Dec 03 07:02:19 crc kubenswrapper[4475]: I1203 07:02:19.689858 4475 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-664766cd5c-v6774" event={"ID":"f5472efc-49e7-4b70-9084-da3a969617ab","Type":"ContainerDied","Data":"dbeb829628985db70eb888fc8bf22ac76850d1122607e8e409c074ead2853120"} Dec 03 07:02:19 crc kubenswrapper[4475]: I1203 07:02:19.689889 4475 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-664766cd5c-v6774" Dec 03 07:02:19 crc kubenswrapper[4475]: I1203 07:02:19.689908 4475 scope.go:117] "RemoveContainer" containerID="698e47c272c2b90493405777f693fd83df0b75195f388a0cc0e50059bbf82f9e" Dec 03 07:02:19 crc kubenswrapper[4475]: I1203 07:02:19.693796 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"4b47a8e8-a852-4c60-8ee0-a68fcdf6d91f","Type":"ContainerStarted","Data":"1cabd41ef15b2ea55b129b587118a4e4940c5cd22a5bb8d48071f082bade3f5a"} Dec 03 07:02:19 crc kubenswrapper[4475]: I1203 07:02:19.693838 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"4b47a8e8-a852-4c60-8ee0-a68fcdf6d91f","Type":"ContainerStarted","Data":"10ee170fc3e6750ec50a3854b3955a7387be665dce40e0def5f66bbf1675ce1d"} Dec 03 07:02:19 crc kubenswrapper[4475]: I1203 07:02:19.693852 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"4b47a8e8-a852-4c60-8ee0-a68fcdf6d91f","Type":"ContainerStarted","Data":"3686d18741a4ea840fb71a1c0e5e09f2f4caf415bcb91b221e6dbf6eb5a902c6"} Dec 03 07:02:19 crc kubenswrapper[4475]: I1203 07:02:19.696132 4475 generic.go:334] "Generic (PLEG): container finished" podID="b51cbc3f-c89a-4e16-814c-381aa017a61f" containerID="339aebb2858f4681fbc7b76e2fb8459c2009b3951b391a0ec64a809d7d17f4ed" exitCode=0 Dec 03 07:02:19 crc kubenswrapper[4475]: I1203 07:02:19.696189 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-4x9fx" 
event={"ID":"b51cbc3f-c89a-4e16-814c-381aa017a61f","Type":"ContainerDied","Data":"339aebb2858f4681fbc7b76e2fb8459c2009b3951b391a0ec64a809d7d17f4ed"} Dec 03 07:02:19 crc kubenswrapper[4475]: I1203 07:02:19.700623 4475 generic.go:334] "Generic (PLEG): container finished" podID="08a09e1c-191b-46b4-92d7-dd92fb839342" containerID="a7c74056fcf164a6b7666d7fe9f0e04744b03a9e45ac1c3b7f550c0b142ad819" exitCode=0 Dec 03 07:02:19 crc kubenswrapper[4475]: I1203 07:02:19.700688 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-5gv82" event={"ID":"08a09e1c-191b-46b4-92d7-dd92fb839342","Type":"ContainerDied","Data":"a7c74056fcf164a6b7666d7fe9f0e04744b03a9e45ac1c3b7f550c0b142ad819"} Dec 03 07:02:19 crc kubenswrapper[4475]: I1203 07:02:19.714339 4475 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=1.7143251149999998 podStartE2EDuration="1.714325115s" podCreationTimestamp="2025-12-03 07:02:18 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 07:02:19.71184524 +0000 UTC m=+1024.516743575" watchObservedRunningTime="2025-12-03 07:02:19.714325115 +0000 UTC m=+1024.519223449" Dec 03 07:02:19 crc kubenswrapper[4475]: I1203 07:02:19.733390 4475 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-664766cd5c-v6774"] Dec 03 07:02:19 crc kubenswrapper[4475]: I1203 07:02:19.735700 4475 scope.go:117] "RemoveContainer" containerID="7e56b591603e9abe007ee9aab5276272c4c13159114ac185674366f8e64a902a" Dec 03 07:02:19 crc kubenswrapper[4475]: I1203 07:02:19.747535 4475 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-664766cd5c-v6774"] Dec 03 07:02:21 crc kubenswrapper[4475]: I1203 07:02:21.078640 4475 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-4x9fx" Dec 03 07:02:21 crc kubenswrapper[4475]: I1203 07:02:21.086729 4475 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-cell-mapping-5gv82" Dec 03 07:02:21 crc kubenswrapper[4475]: I1203 07:02:21.155843 4475 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b51cbc3f-c89a-4e16-814c-381aa017a61f-scripts\") pod \"b51cbc3f-c89a-4e16-814c-381aa017a61f\" (UID: \"b51cbc3f-c89a-4e16-814c-381aa017a61f\") " Dec 03 07:02:21 crc kubenswrapper[4475]: I1203 07:02:21.155933 4475 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b51cbc3f-c89a-4e16-814c-381aa017a61f-config-data\") pod \"b51cbc3f-c89a-4e16-814c-381aa017a61f\" (UID: \"b51cbc3f-c89a-4e16-814c-381aa017a61f\") " Dec 03 07:02:21 crc kubenswrapper[4475]: I1203 07:02:21.155993 4475 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b51cbc3f-c89a-4e16-814c-381aa017a61f-combined-ca-bundle\") pod \"b51cbc3f-c89a-4e16-814c-381aa017a61f\" (UID: \"b51cbc3f-c89a-4e16-814c-381aa017a61f\") " Dec 03 07:02:21 crc kubenswrapper[4475]: I1203 07:02:21.156068 4475 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vgrgw\" (UniqueName: \"kubernetes.io/projected/b51cbc3f-c89a-4e16-814c-381aa017a61f-kube-api-access-vgrgw\") pod \"b51cbc3f-c89a-4e16-814c-381aa017a61f\" (UID: \"b51cbc3f-c89a-4e16-814c-381aa017a61f\") " Dec 03 07:02:21 crc kubenswrapper[4475]: I1203 07:02:21.160971 4475 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b51cbc3f-c89a-4e16-814c-381aa017a61f-kube-api-access-vgrgw" (OuterVolumeSpecName: "kube-api-access-vgrgw") pod "b51cbc3f-c89a-4e16-814c-381aa017a61f" (UID: 
"b51cbc3f-c89a-4e16-814c-381aa017a61f"). InnerVolumeSpecName "kube-api-access-vgrgw". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 07:02:21 crc kubenswrapper[4475]: I1203 07:02:21.165291 4475 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b51cbc3f-c89a-4e16-814c-381aa017a61f-scripts" (OuterVolumeSpecName: "scripts") pod "b51cbc3f-c89a-4e16-814c-381aa017a61f" (UID: "b51cbc3f-c89a-4e16-814c-381aa017a61f"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 07:02:21 crc kubenswrapper[4475]: I1203 07:02:21.178366 4475 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b51cbc3f-c89a-4e16-814c-381aa017a61f-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "b51cbc3f-c89a-4e16-814c-381aa017a61f" (UID: "b51cbc3f-c89a-4e16-814c-381aa017a61f"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 07:02:21 crc kubenswrapper[4475]: I1203 07:02:21.181577 4475 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b51cbc3f-c89a-4e16-814c-381aa017a61f-config-data" (OuterVolumeSpecName: "config-data") pod "b51cbc3f-c89a-4e16-814c-381aa017a61f" (UID: "b51cbc3f-c89a-4e16-814c-381aa017a61f"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 07:02:21 crc kubenswrapper[4475]: I1203 07:02:21.257985 4475 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/08a09e1c-191b-46b4-92d7-dd92fb839342-config-data\") pod \"08a09e1c-191b-46b4-92d7-dd92fb839342\" (UID: \"08a09e1c-191b-46b4-92d7-dd92fb839342\") " Dec 03 07:02:21 crc kubenswrapper[4475]: I1203 07:02:21.258183 4475 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/08a09e1c-191b-46b4-92d7-dd92fb839342-scripts\") pod \"08a09e1c-191b-46b4-92d7-dd92fb839342\" (UID: \"08a09e1c-191b-46b4-92d7-dd92fb839342\") " Dec 03 07:02:21 crc kubenswrapper[4475]: I1203 07:02:21.258800 4475 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x48xr\" (UniqueName: \"kubernetes.io/projected/08a09e1c-191b-46b4-92d7-dd92fb839342-kube-api-access-x48xr\") pod \"08a09e1c-191b-46b4-92d7-dd92fb839342\" (UID: \"08a09e1c-191b-46b4-92d7-dd92fb839342\") " Dec 03 07:02:21 crc kubenswrapper[4475]: I1203 07:02:21.259038 4475 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/08a09e1c-191b-46b4-92d7-dd92fb839342-combined-ca-bundle\") pod \"08a09e1c-191b-46b4-92d7-dd92fb839342\" (UID: \"08a09e1c-191b-46b4-92d7-dd92fb839342\") " Dec 03 07:02:21 crc kubenswrapper[4475]: I1203 07:02:21.259761 4475 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vgrgw\" (UniqueName: \"kubernetes.io/projected/b51cbc3f-c89a-4e16-814c-381aa017a61f-kube-api-access-vgrgw\") on node \"crc\" DevicePath \"\"" Dec 03 07:02:21 crc kubenswrapper[4475]: I1203 07:02:21.259903 4475 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b51cbc3f-c89a-4e16-814c-381aa017a61f-scripts\") on node \"crc\" DevicePath \"\"" Dec 
03 07:02:21 crc kubenswrapper[4475]: I1203 07:02:21.259982 4475 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b51cbc3f-c89a-4e16-814c-381aa017a61f-config-data\") on node \"crc\" DevicePath \"\"" Dec 03 07:02:21 crc kubenswrapper[4475]: I1203 07:02:21.260041 4475 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b51cbc3f-c89a-4e16-814c-381aa017a61f-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 03 07:02:21 crc kubenswrapper[4475]: I1203 07:02:21.260225 4475 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/08a09e1c-191b-46b4-92d7-dd92fb839342-scripts" (OuterVolumeSpecName: "scripts") pod "08a09e1c-191b-46b4-92d7-dd92fb839342" (UID: "08a09e1c-191b-46b4-92d7-dd92fb839342"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 07:02:21 crc kubenswrapper[4475]: I1203 07:02:21.261838 4475 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/08a09e1c-191b-46b4-92d7-dd92fb839342-kube-api-access-x48xr" (OuterVolumeSpecName: "kube-api-access-x48xr") pod "08a09e1c-191b-46b4-92d7-dd92fb839342" (UID: "08a09e1c-191b-46b4-92d7-dd92fb839342"). InnerVolumeSpecName "kube-api-access-x48xr". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 07:02:21 crc kubenswrapper[4475]: I1203 07:02:21.280403 4475 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/08a09e1c-191b-46b4-92d7-dd92fb839342-config-data" (OuterVolumeSpecName: "config-data") pod "08a09e1c-191b-46b4-92d7-dd92fb839342" (UID: "08a09e1c-191b-46b4-92d7-dd92fb839342"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 07:02:21 crc kubenswrapper[4475]: I1203 07:02:21.280805 4475 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/08a09e1c-191b-46b4-92d7-dd92fb839342-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "08a09e1c-191b-46b4-92d7-dd92fb839342" (UID: "08a09e1c-191b-46b4-92d7-dd92fb839342"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 07:02:21 crc kubenswrapper[4475]: I1203 07:02:21.361356 4475 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x48xr\" (UniqueName: \"kubernetes.io/projected/08a09e1c-191b-46b4-92d7-dd92fb839342-kube-api-access-x48xr\") on node \"crc\" DevicePath \"\"" Dec 03 07:02:21 crc kubenswrapper[4475]: I1203 07:02:21.361382 4475 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/08a09e1c-191b-46b4-92d7-dd92fb839342-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 03 07:02:21 crc kubenswrapper[4475]: I1203 07:02:21.361393 4475 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/08a09e1c-191b-46b4-92d7-dd92fb839342-config-data\") on node \"crc\" DevicePath \"\"" Dec 03 07:02:21 crc kubenswrapper[4475]: I1203 07:02:21.361401 4475 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/08a09e1c-191b-46b4-92d7-dd92fb839342-scripts\") on node \"crc\" DevicePath \"\"" Dec 03 07:02:21 crc kubenswrapper[4475]: I1203 07:02:21.500554 4475 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f5472efc-49e7-4b70-9084-da3a969617ab" path="/var/lib/kubelet/pods/f5472efc-49e7-4b70-9084-da3a969617ab/volumes" Dec 03 07:02:21 crc kubenswrapper[4475]: I1203 07:02:21.716791 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-4x9fx" 
event={"ID":"b51cbc3f-c89a-4e16-814c-381aa017a61f","Type":"ContainerDied","Data":"1847957421a5e817acf0766395a5e6eccad9efbcae1a69505f6f68ca3776d767"} Dec 03 07:02:21 crc kubenswrapper[4475]: I1203 07:02:21.716834 4475 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="1847957421a5e817acf0766395a5e6eccad9efbcae1a69505f6f68ca3776d767" Dec 03 07:02:21 crc kubenswrapper[4475]: I1203 07:02:21.716889 4475 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-4x9fx" Dec 03 07:02:21 crc kubenswrapper[4475]: I1203 07:02:21.721121 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-5gv82" event={"ID":"08a09e1c-191b-46b4-92d7-dd92fb839342","Type":"ContainerDied","Data":"6537775dc71d2cfe924826786bae0fa0522f0b9ab3ccb5ccd8dc4f04634eb5f7"} Dec 03 07:02:21 crc kubenswrapper[4475]: I1203 07:02:21.721162 4475 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="6537775dc71d2cfe924826786bae0fa0522f0b9ab3ccb5ccd8dc4f04634eb5f7" Dec 03 07:02:21 crc kubenswrapper[4475]: I1203 07:02:21.721704 4475 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-cell-mapping-5gv82" Dec 03 07:02:21 crc kubenswrapper[4475]: I1203 07:02:21.851906 4475 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-conductor-0"] Dec 03 07:02:21 crc kubenswrapper[4475]: E1203 07:02:21.852253 4475 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="08a09e1c-191b-46b4-92d7-dd92fb839342" containerName="nova-manage" Dec 03 07:02:21 crc kubenswrapper[4475]: I1203 07:02:21.852271 4475 state_mem.go:107] "Deleted CPUSet assignment" podUID="08a09e1c-191b-46b4-92d7-dd92fb839342" containerName="nova-manage" Dec 03 07:02:21 crc kubenswrapper[4475]: E1203 07:02:21.852302 4475 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b51cbc3f-c89a-4e16-814c-381aa017a61f" containerName="nova-cell1-conductor-db-sync" Dec 03 07:02:21 crc kubenswrapper[4475]: I1203 07:02:21.852309 4475 state_mem.go:107] "Deleted CPUSet assignment" podUID="b51cbc3f-c89a-4e16-814c-381aa017a61f" containerName="nova-cell1-conductor-db-sync" Dec 03 07:02:21 crc kubenswrapper[4475]: E1203 07:02:21.852323 4475 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f5472efc-49e7-4b70-9084-da3a969617ab" containerName="init" Dec 03 07:02:21 crc kubenswrapper[4475]: I1203 07:02:21.852329 4475 state_mem.go:107] "Deleted CPUSet assignment" podUID="f5472efc-49e7-4b70-9084-da3a969617ab" containerName="init" Dec 03 07:02:21 crc kubenswrapper[4475]: E1203 07:02:21.852338 4475 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f5472efc-49e7-4b70-9084-da3a969617ab" containerName="dnsmasq-dns" Dec 03 07:02:21 crc kubenswrapper[4475]: I1203 07:02:21.852344 4475 state_mem.go:107] "Deleted CPUSet assignment" podUID="f5472efc-49e7-4b70-9084-da3a969617ab" containerName="dnsmasq-dns" Dec 03 07:02:21 crc kubenswrapper[4475]: I1203 07:02:21.852506 4475 memory_manager.go:354] "RemoveStaleState removing state" podUID="08a09e1c-191b-46b4-92d7-dd92fb839342" 
containerName="nova-manage" Dec 03 07:02:21 crc kubenswrapper[4475]: I1203 07:02:21.852520 4475 memory_manager.go:354] "RemoveStaleState removing state" podUID="f5472efc-49e7-4b70-9084-da3a969617ab" containerName="dnsmasq-dns" Dec 03 07:02:21 crc kubenswrapper[4475]: I1203 07:02:21.852531 4475 memory_manager.go:354] "RemoveStaleState removing state" podUID="b51cbc3f-c89a-4e16-814c-381aa017a61f" containerName="nova-cell1-conductor-db-sync" Dec 03 07:02:21 crc kubenswrapper[4475]: I1203 07:02:21.853148 4475 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-0" Dec 03 07:02:21 crc kubenswrapper[4475]: I1203 07:02:21.854848 4475 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-conductor-config-data" Dec 03 07:02:21 crc kubenswrapper[4475]: I1203 07:02:21.860874 4475 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-0"] Dec 03 07:02:21 crc kubenswrapper[4475]: I1203 07:02:21.927439 4475 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Dec 03 07:02:21 crc kubenswrapper[4475]: I1203 07:02:21.931107 4475 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="1091a855-d763-4e97-aa03-66e64d9aae45" containerName="nova-api-api" containerID="cri-o://d04fdf66539d5824449205434c334970601b18a7c9162d74555c064584188c64" gracePeriod=30 Dec 03 07:02:21 crc kubenswrapper[4475]: I1203 07:02:21.927652 4475 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="1091a855-d763-4e97-aa03-66e64d9aae45" containerName="nova-api-log" containerID="cri-o://90fd3222b68882aad0e8976a95f5831204b3e58a81842c1a8c4e1184ad0ccdf0" gracePeriod=30 Dec 03 07:02:21 crc kubenswrapper[4475]: I1203 07:02:21.941241 4475 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Dec 03 07:02:21 crc kubenswrapper[4475]: I1203 
07:02:21.941442 4475 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-scheduler-0" podUID="a9ae60aa-857f-48c8-80d0-a6f3d7490395" containerName="nova-scheduler-scheduler" containerID="cri-o://995eba915ebc5b5247cf60ddb9a9798927d684ac44c522f057d5c36edfbdce26" gracePeriod=30 Dec 03 07:02:21 crc kubenswrapper[4475]: I1203 07:02:21.969658 4475 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Dec 03 07:02:21 crc kubenswrapper[4475]: I1203 07:02:21.969885 4475 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="4b47a8e8-a852-4c60-8ee0-a68fcdf6d91f" containerName="nova-metadata-log" containerID="cri-o://10ee170fc3e6750ec50a3854b3955a7387be665dce40e0def5f66bbf1675ce1d" gracePeriod=30 Dec 03 07:02:21 crc kubenswrapper[4475]: I1203 07:02:21.970387 4475 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="4b47a8e8-a852-4c60-8ee0-a68fcdf6d91f" containerName="nova-metadata-metadata" containerID="cri-o://1cabd41ef15b2ea55b129b587118a4e4940c5cd22a5bb8d48071f082bade3f5a" gracePeriod=30 Dec 03 07:02:21 crc kubenswrapper[4475]: I1203 07:02:21.975369 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/18820a7a-d74d-4451-b927-a7b199d02185-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"18820a7a-d74d-4451-b927-a7b199d02185\") " pod="openstack/nova-cell1-conductor-0" Dec 03 07:02:21 crc kubenswrapper[4475]: I1203 07:02:21.975427 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/18820a7a-d74d-4451-b927-a7b199d02185-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"18820a7a-d74d-4451-b927-a7b199d02185\") " pod="openstack/nova-cell1-conductor-0" Dec 03 07:02:21 crc 
kubenswrapper[4475]: I1203 07:02:21.975693 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qw5hg\" (UniqueName: \"kubernetes.io/projected/18820a7a-d74d-4451-b927-a7b199d02185-kube-api-access-qw5hg\") pod \"nova-cell1-conductor-0\" (UID: \"18820a7a-d74d-4451-b927-a7b199d02185\") " pod="openstack/nova-cell1-conductor-0" Dec 03 07:02:22 crc kubenswrapper[4475]: I1203 07:02:22.081198 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/18820a7a-d74d-4451-b927-a7b199d02185-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"18820a7a-d74d-4451-b927-a7b199d02185\") " pod="openstack/nova-cell1-conductor-0" Dec 03 07:02:22 crc kubenswrapper[4475]: I1203 07:02:22.081964 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qw5hg\" (UniqueName: \"kubernetes.io/projected/18820a7a-d74d-4451-b927-a7b199d02185-kube-api-access-qw5hg\") pod \"nova-cell1-conductor-0\" (UID: \"18820a7a-d74d-4451-b927-a7b199d02185\") " pod="openstack/nova-cell1-conductor-0" Dec 03 07:02:22 crc kubenswrapper[4475]: I1203 07:02:22.082131 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/18820a7a-d74d-4451-b927-a7b199d02185-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"18820a7a-d74d-4451-b927-a7b199d02185\") " pod="openstack/nova-cell1-conductor-0" Dec 03 07:02:22 crc kubenswrapper[4475]: I1203 07:02:22.092699 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/18820a7a-d74d-4451-b927-a7b199d02185-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"18820a7a-d74d-4451-b927-a7b199d02185\") " pod="openstack/nova-cell1-conductor-0" Dec 03 07:02:22 crc kubenswrapper[4475]: I1203 07:02:22.097568 4475 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/18820a7a-d74d-4451-b927-a7b199d02185-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"18820a7a-d74d-4451-b927-a7b199d02185\") " pod="openstack/nova-cell1-conductor-0" Dec 03 07:02:22 crc kubenswrapper[4475]: I1203 07:02:22.106438 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qw5hg\" (UniqueName: \"kubernetes.io/projected/18820a7a-d74d-4451-b927-a7b199d02185-kube-api-access-qw5hg\") pod \"nova-cell1-conductor-0\" (UID: \"18820a7a-d74d-4451-b927-a7b199d02185\") " pod="openstack/nova-cell1-conductor-0" Dec 03 07:02:22 crc kubenswrapper[4475]: I1203 07:02:22.169946 4475 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-0" Dec 03 07:02:22 crc kubenswrapper[4475]: E1203 07:02:22.258350 4475 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod206cd7f6_428b_4bcf_974c_74a1242401d1.slice/crio-c05b79d184f65249a34b88581d9bff9b8bd8091273e140fd3731cd82eaa389bc.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf5472efc_49e7_4b70_9084_da3a969617ab.slice\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod1091a855_d763_4e97_aa03_66e64d9aae45.slice/crio-conmon-90fd3222b68882aad0e8976a95f5831204b3e58a81842c1a8c4e1184ad0ccdf0.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod4b47a8e8_a852_4c60_8ee0_a68fcdf6d91f.slice/crio-conmon-10ee170fc3e6750ec50a3854b3955a7387be665dce40e0def5f66bbf1675ce1d.scope\": RecentStats: unable to find data in memory cache], 
[\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode6e2659c_bdb8_4015_b749_9d1bfd70620f.slice/crio-f38adfa30fd4ebb78cee694486062d19d33d13cf2c50659bb9a0ff66b6d44c52\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod08a09e1c_191b_46b4_92d7_dd92fb839342.slice/crio-6537775dc71d2cfe924826786bae0fa0522f0b9ab3ccb5ccd8dc4f04634eb5f7\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb51cbc3f_c89a_4e16_814c_381aa017a61f.slice/crio-1847957421a5e817acf0766395a5e6eccad9efbcae1a69505f6f68ca3776d767\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod08a09e1c_191b_46b4_92d7_dd92fb839342.slice\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb51cbc3f_c89a_4e16_814c_381aa017a61f.slice\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod206cd7f6_428b_4bcf_974c_74a1242401d1.slice/crio-conmon-c05b79d184f65249a34b88581d9bff9b8bd8091273e140fd3731cd82eaa389bc.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod1091a855_d763_4e97_aa03_66e64d9aae45.slice/crio-90fd3222b68882aad0e8976a95f5831204b3e58a81842c1a8c4e1184ad0ccdf0.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod4b47a8e8_a852_4c60_8ee0_a68fcdf6d91f.slice/crio-10ee170fc3e6750ec50a3854b3955a7387be665dce40e0def5f66bbf1675ce1d.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf5472efc_49e7_4b70_9084_da3a969617ab.slice/crio-dbeb829628985db70eb888fc8bf22ac76850d1122607e8e409c074ead2853120\": RecentStats: unable to find data in memory cache]" 
Dec 03 07:02:22 crc kubenswrapper[4475]: I1203 07:02:22.288555 4475 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Dec 03 07:02:22 crc kubenswrapper[4475]: I1203 07:02:22.405543 4475 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-f7tlr\" (UniqueName: \"kubernetes.io/projected/206cd7f6-428b-4bcf-974c-74a1242401d1-kube-api-access-f7tlr\") pod \"206cd7f6-428b-4bcf-974c-74a1242401d1\" (UID: \"206cd7f6-428b-4bcf-974c-74a1242401d1\") " Dec 03 07:02:22 crc kubenswrapper[4475]: I1203 07:02:22.405806 4475 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/206cd7f6-428b-4bcf-974c-74a1242401d1-combined-ca-bundle\") pod \"206cd7f6-428b-4bcf-974c-74a1242401d1\" (UID: \"206cd7f6-428b-4bcf-974c-74a1242401d1\") " Dec 03 07:02:22 crc kubenswrapper[4475]: I1203 07:02:22.406752 4475 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/206cd7f6-428b-4bcf-974c-74a1242401d1-config-data\") pod \"206cd7f6-428b-4bcf-974c-74a1242401d1\" (UID: \"206cd7f6-428b-4bcf-974c-74a1242401d1\") " Dec 03 07:02:22 crc kubenswrapper[4475]: I1203 07:02:22.406888 4475 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/206cd7f6-428b-4bcf-974c-74a1242401d1-scripts\") pod \"206cd7f6-428b-4bcf-974c-74a1242401d1\" (UID: \"206cd7f6-428b-4bcf-974c-74a1242401d1\") " Dec 03 07:02:22 crc kubenswrapper[4475]: I1203 07:02:22.406924 4475 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/206cd7f6-428b-4bcf-974c-74a1242401d1-log-httpd\") pod \"206cd7f6-428b-4bcf-974c-74a1242401d1\" (UID: \"206cd7f6-428b-4bcf-974c-74a1242401d1\") " Dec 03 07:02:22 crc kubenswrapper[4475]: I1203 07:02:22.407334 4475 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/206cd7f6-428b-4bcf-974c-74a1242401d1-sg-core-conf-yaml\") pod \"206cd7f6-428b-4bcf-974c-74a1242401d1\" (UID: \"206cd7f6-428b-4bcf-974c-74a1242401d1\") " Dec 03 07:02:22 crc kubenswrapper[4475]: I1203 07:02:22.407577 4475 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/206cd7f6-428b-4bcf-974c-74a1242401d1-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "206cd7f6-428b-4bcf-974c-74a1242401d1" (UID: "206cd7f6-428b-4bcf-974c-74a1242401d1"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 07:02:22 crc kubenswrapper[4475]: I1203 07:02:22.407843 4475 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/206cd7f6-428b-4bcf-974c-74a1242401d1-run-httpd\") pod \"206cd7f6-428b-4bcf-974c-74a1242401d1\" (UID: \"206cd7f6-428b-4bcf-974c-74a1242401d1\") " Dec 03 07:02:22 crc kubenswrapper[4475]: I1203 07:02:22.408143 4475 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/206cd7f6-428b-4bcf-974c-74a1242401d1-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "206cd7f6-428b-4bcf-974c-74a1242401d1" (UID: "206cd7f6-428b-4bcf-974c-74a1242401d1"). InnerVolumeSpecName "run-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 07:02:22 crc kubenswrapper[4475]: I1203 07:02:22.408842 4475 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/206cd7f6-428b-4bcf-974c-74a1242401d1-log-httpd\") on node \"crc\" DevicePath \"\"" Dec 03 07:02:22 crc kubenswrapper[4475]: I1203 07:02:22.408858 4475 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/206cd7f6-428b-4bcf-974c-74a1242401d1-run-httpd\") on node \"crc\" DevicePath \"\"" Dec 03 07:02:22 crc kubenswrapper[4475]: I1203 07:02:22.423843 4475 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/206cd7f6-428b-4bcf-974c-74a1242401d1-kube-api-access-f7tlr" (OuterVolumeSpecName: "kube-api-access-f7tlr") pod "206cd7f6-428b-4bcf-974c-74a1242401d1" (UID: "206cd7f6-428b-4bcf-974c-74a1242401d1"). InnerVolumeSpecName "kube-api-access-f7tlr". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 07:02:22 crc kubenswrapper[4475]: I1203 07:02:22.434844 4475 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/206cd7f6-428b-4bcf-974c-74a1242401d1-scripts" (OuterVolumeSpecName: "scripts") pod "206cd7f6-428b-4bcf-974c-74a1242401d1" (UID: "206cd7f6-428b-4bcf-974c-74a1242401d1"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 07:02:22 crc kubenswrapper[4475]: I1203 07:02:22.439836 4475 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/206cd7f6-428b-4bcf-974c-74a1242401d1-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "206cd7f6-428b-4bcf-974c-74a1242401d1" (UID: "206cd7f6-428b-4bcf-974c-74a1242401d1"). InnerVolumeSpecName "sg-core-conf-yaml". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 07:02:22 crc kubenswrapper[4475]: E1203 07:02:22.498769 4475 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="995eba915ebc5b5247cf60ddb9a9798927d684ac44c522f057d5c36edfbdce26" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Dec 03 07:02:22 crc kubenswrapper[4475]: E1203 07:02:22.507584 4475 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="995eba915ebc5b5247cf60ddb9a9798927d684ac44c522f057d5c36edfbdce26" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Dec 03 07:02:22 crc kubenswrapper[4475]: E1203 07:02:22.510989 4475 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="995eba915ebc5b5247cf60ddb9a9798927d684ac44c522f057d5c36edfbdce26" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Dec 03 07:02:22 crc kubenswrapper[4475]: E1203 07:02:22.511029 4475 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/nova-scheduler-0" podUID="a9ae60aa-857f-48c8-80d0-a6f3d7490395" containerName="nova-scheduler-scheduler" Dec 03 07:02:22 crc kubenswrapper[4475]: I1203 07:02:22.512132 4475 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Dec 03 07:02:22 crc kubenswrapper[4475]: I1203 07:02:22.514858 4475 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-f7tlr\" (UniqueName: \"kubernetes.io/projected/206cd7f6-428b-4bcf-974c-74a1242401d1-kube-api-access-f7tlr\") on node \"crc\" DevicePath \"\"" Dec 03 07:02:22 crc kubenswrapper[4475]: I1203 07:02:22.514873 4475 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/206cd7f6-428b-4bcf-974c-74a1242401d1-scripts\") on node \"crc\" DevicePath \"\"" Dec 03 07:02:22 crc kubenswrapper[4475]: I1203 07:02:22.514882 4475 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/206cd7f6-428b-4bcf-974c-74a1242401d1-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Dec 03 07:02:22 crc kubenswrapper[4475]: I1203 07:02:22.516827 4475 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/206cd7f6-428b-4bcf-974c-74a1242401d1-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "206cd7f6-428b-4bcf-974c-74a1242401d1" (UID: "206cd7f6-428b-4bcf-974c-74a1242401d1"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 07:02:22 crc kubenswrapper[4475]: I1203 07:02:22.552178 4475 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/206cd7f6-428b-4bcf-974c-74a1242401d1-config-data" (OuterVolumeSpecName: "config-data") pod "206cd7f6-428b-4bcf-974c-74a1242401d1" (UID: "206cd7f6-428b-4bcf-974c-74a1242401d1"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 07:02:22 crc kubenswrapper[4475]: I1203 07:02:22.615937 4475 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4b47a8e8-a852-4c60-8ee0-a68fcdf6d91f-config-data\") pod \"4b47a8e8-a852-4c60-8ee0-a68fcdf6d91f\" (UID: \"4b47a8e8-a852-4c60-8ee0-a68fcdf6d91f\") " Dec 03 07:02:22 crc kubenswrapper[4475]: I1203 07:02:22.616187 4475 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4b47a8e8-a852-4c60-8ee0-a68fcdf6d91f-combined-ca-bundle\") pod \"4b47a8e8-a852-4c60-8ee0-a68fcdf6d91f\" (UID: \"4b47a8e8-a852-4c60-8ee0-a68fcdf6d91f\") " Dec 03 07:02:22 crc kubenswrapper[4475]: I1203 07:02:22.616341 4475 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4b47a8e8-a852-4c60-8ee0-a68fcdf6d91f-logs\") pod \"4b47a8e8-a852-4c60-8ee0-a68fcdf6d91f\" (UID: \"4b47a8e8-a852-4c60-8ee0-a68fcdf6d91f\") " Dec 03 07:02:22 crc kubenswrapper[4475]: I1203 07:02:22.616494 4475 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/4b47a8e8-a852-4c60-8ee0-a68fcdf6d91f-nova-metadata-tls-certs\") pod \"4b47a8e8-a852-4c60-8ee0-a68fcdf6d91f\" (UID: \"4b47a8e8-a852-4c60-8ee0-a68fcdf6d91f\") " Dec 03 07:02:22 crc kubenswrapper[4475]: I1203 07:02:22.616787 4475 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4b47a8e8-a852-4c60-8ee0-a68fcdf6d91f-logs" (OuterVolumeSpecName: "logs") pod "4b47a8e8-a852-4c60-8ee0-a68fcdf6d91f" (UID: "4b47a8e8-a852-4c60-8ee0-a68fcdf6d91f"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 07:02:22 crc kubenswrapper[4475]: I1203 07:02:22.617021 4475 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-t4448\" (UniqueName: \"kubernetes.io/projected/4b47a8e8-a852-4c60-8ee0-a68fcdf6d91f-kube-api-access-t4448\") pod \"4b47a8e8-a852-4c60-8ee0-a68fcdf6d91f\" (UID: \"4b47a8e8-a852-4c60-8ee0-a68fcdf6d91f\") " Dec 03 07:02:22 crc kubenswrapper[4475]: I1203 07:02:22.618112 4475 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4b47a8e8-a852-4c60-8ee0-a68fcdf6d91f-logs\") on node \"crc\" DevicePath \"\"" Dec 03 07:02:22 crc kubenswrapper[4475]: I1203 07:02:22.618198 4475 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/206cd7f6-428b-4bcf-974c-74a1242401d1-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 03 07:02:22 crc kubenswrapper[4475]: I1203 07:02:22.618255 4475 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/206cd7f6-428b-4bcf-974c-74a1242401d1-config-data\") on node \"crc\" DevicePath \"\"" Dec 03 07:02:22 crc kubenswrapper[4475]: I1203 07:02:22.620861 4475 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4b47a8e8-a852-4c60-8ee0-a68fcdf6d91f-kube-api-access-t4448" (OuterVolumeSpecName: "kube-api-access-t4448") pod "4b47a8e8-a852-4c60-8ee0-a68fcdf6d91f" (UID: "4b47a8e8-a852-4c60-8ee0-a68fcdf6d91f"). InnerVolumeSpecName "kube-api-access-t4448". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 07:02:22 crc kubenswrapper[4475]: I1203 07:02:22.636548 4475 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4b47a8e8-a852-4c60-8ee0-a68fcdf6d91f-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "4b47a8e8-a852-4c60-8ee0-a68fcdf6d91f" (UID: "4b47a8e8-a852-4c60-8ee0-a68fcdf6d91f"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 07:02:22 crc kubenswrapper[4475]: I1203 07:02:22.637293 4475 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4b47a8e8-a852-4c60-8ee0-a68fcdf6d91f-config-data" (OuterVolumeSpecName: "config-data") pod "4b47a8e8-a852-4c60-8ee0-a68fcdf6d91f" (UID: "4b47a8e8-a852-4c60-8ee0-a68fcdf6d91f"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 07:02:22 crc kubenswrapper[4475]: I1203 07:02:22.653288 4475 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4b47a8e8-a852-4c60-8ee0-a68fcdf6d91f-nova-metadata-tls-certs" (OuterVolumeSpecName: "nova-metadata-tls-certs") pod "4b47a8e8-a852-4c60-8ee0-a68fcdf6d91f" (UID: "4b47a8e8-a852-4c60-8ee0-a68fcdf6d91f"). InnerVolumeSpecName "nova-metadata-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 07:02:22 crc kubenswrapper[4475]: W1203 07:02:22.693765 4475 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod18820a7a_d74d_4451_b927_a7b199d02185.slice/crio-1716a2f448d4dd606dc80196ca20bf5429ff2d985fb135bc24169399d21fc5b7 WatchSource:0}: Error finding container 1716a2f448d4dd606dc80196ca20bf5429ff2d985fb135bc24169399d21fc5b7: Status 404 returned error can't find the container with id 1716a2f448d4dd606dc80196ca20bf5429ff2d985fb135bc24169399d21fc5b7 Dec 03 07:02:22 crc kubenswrapper[4475]: I1203 07:02:22.702551 4475 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-0"] Dec 03 07:02:22 crc kubenswrapper[4475]: I1203 07:02:22.719703 4475 reconciler_common.go:293] "Volume detached for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/4b47a8e8-a852-4c60-8ee0-a68fcdf6d91f-nova-metadata-tls-certs\") on node \"crc\" DevicePath \"\"" Dec 03 07:02:22 crc kubenswrapper[4475]: I1203 07:02:22.719817 4475 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-t4448\" (UniqueName: \"kubernetes.io/projected/4b47a8e8-a852-4c60-8ee0-a68fcdf6d91f-kube-api-access-t4448\") on node \"crc\" DevicePath \"\"" Dec 03 07:02:22 crc kubenswrapper[4475]: I1203 07:02:22.719907 4475 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4b47a8e8-a852-4c60-8ee0-a68fcdf6d91f-config-data\") on node \"crc\" DevicePath \"\"" Dec 03 07:02:22 crc kubenswrapper[4475]: I1203 07:02:22.719983 4475 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4b47a8e8-a852-4c60-8ee0-a68fcdf6d91f-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 03 07:02:22 crc kubenswrapper[4475]: I1203 07:02:22.729399 4475 generic.go:334] "Generic (PLEG): container finished" 
podID="1091a855-d763-4e97-aa03-66e64d9aae45" containerID="90fd3222b68882aad0e8976a95f5831204b3e58a81842c1a8c4e1184ad0ccdf0" exitCode=143 Dec 03 07:02:22 crc kubenswrapper[4475]: I1203 07:02:22.729477 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"1091a855-d763-4e97-aa03-66e64d9aae45","Type":"ContainerDied","Data":"90fd3222b68882aad0e8976a95f5831204b3e58a81842c1a8c4e1184ad0ccdf0"} Dec 03 07:02:22 crc kubenswrapper[4475]: I1203 07:02:22.731364 4475 generic.go:334] "Generic (PLEG): container finished" podID="4b47a8e8-a852-4c60-8ee0-a68fcdf6d91f" containerID="1cabd41ef15b2ea55b129b587118a4e4940c5cd22a5bb8d48071f082bade3f5a" exitCode=0 Dec 03 07:02:22 crc kubenswrapper[4475]: I1203 07:02:22.731390 4475 generic.go:334] "Generic (PLEG): container finished" podID="4b47a8e8-a852-4c60-8ee0-a68fcdf6d91f" containerID="10ee170fc3e6750ec50a3854b3955a7387be665dce40e0def5f66bbf1675ce1d" exitCode=143 Dec 03 07:02:22 crc kubenswrapper[4475]: I1203 07:02:22.731424 4475 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Dec 03 07:02:22 crc kubenswrapper[4475]: I1203 07:02:22.731442 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"4b47a8e8-a852-4c60-8ee0-a68fcdf6d91f","Type":"ContainerDied","Data":"1cabd41ef15b2ea55b129b587118a4e4940c5cd22a5bb8d48071f082bade3f5a"} Dec 03 07:02:22 crc kubenswrapper[4475]: I1203 07:02:22.731486 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"4b47a8e8-a852-4c60-8ee0-a68fcdf6d91f","Type":"ContainerDied","Data":"10ee170fc3e6750ec50a3854b3955a7387be665dce40e0def5f66bbf1675ce1d"} Dec 03 07:02:22 crc kubenswrapper[4475]: I1203 07:02:22.731498 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"4b47a8e8-a852-4c60-8ee0-a68fcdf6d91f","Type":"ContainerDied","Data":"3686d18741a4ea840fb71a1c0e5e09f2f4caf415bcb91b221e6dbf6eb5a902c6"} Dec 03 07:02:22 crc kubenswrapper[4475]: I1203 07:02:22.731513 4475 scope.go:117] "RemoveContainer" containerID="1cabd41ef15b2ea55b129b587118a4e4940c5cd22a5bb8d48071f082bade3f5a" Dec 03 07:02:22 crc kubenswrapper[4475]: I1203 07:02:22.733118 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-0" event={"ID":"18820a7a-d74d-4451-b927-a7b199d02185","Type":"ContainerStarted","Data":"1716a2f448d4dd606dc80196ca20bf5429ff2d985fb135bc24169399d21fc5b7"} Dec 03 07:02:22 crc kubenswrapper[4475]: I1203 07:02:22.736266 4475 generic.go:334] "Generic (PLEG): container finished" podID="206cd7f6-428b-4bcf-974c-74a1242401d1" containerID="c05b79d184f65249a34b88581d9bff9b8bd8091273e140fd3731cd82eaa389bc" exitCode=0 Dec 03 07:02:22 crc kubenswrapper[4475]: I1203 07:02:22.736324 4475 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Dec 03 07:02:22 crc kubenswrapper[4475]: I1203 07:02:22.736335 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"206cd7f6-428b-4bcf-974c-74a1242401d1","Type":"ContainerDied","Data":"c05b79d184f65249a34b88581d9bff9b8bd8091273e140fd3731cd82eaa389bc"} Dec 03 07:02:22 crc kubenswrapper[4475]: I1203 07:02:22.736753 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"206cd7f6-428b-4bcf-974c-74a1242401d1","Type":"ContainerDied","Data":"e33bc16bcbac74b2e202fc4ca828cc5cc3ea8a2db3ade2b1e3cc5fd37068bd91"} Dec 03 07:02:22 crc kubenswrapper[4475]: I1203 07:02:22.760191 4475 scope.go:117] "RemoveContainer" containerID="10ee170fc3e6750ec50a3854b3955a7387be665dce40e0def5f66bbf1675ce1d" Dec 03 07:02:22 crc kubenswrapper[4475]: I1203 07:02:22.781221 4475 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Dec 03 07:02:22 crc kubenswrapper[4475]: I1203 07:02:22.781663 4475 scope.go:117] "RemoveContainer" containerID="1cabd41ef15b2ea55b129b587118a4e4940c5cd22a5bb8d48071f082bade3f5a" Dec 03 07:02:22 crc kubenswrapper[4475]: E1203 07:02:22.782053 4475 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1cabd41ef15b2ea55b129b587118a4e4940c5cd22a5bb8d48071f082bade3f5a\": container with ID starting with 1cabd41ef15b2ea55b129b587118a4e4940c5cd22a5bb8d48071f082bade3f5a not found: ID does not exist" containerID="1cabd41ef15b2ea55b129b587118a4e4940c5cd22a5bb8d48071f082bade3f5a" Dec 03 07:02:22 crc kubenswrapper[4475]: I1203 07:02:22.782099 4475 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1cabd41ef15b2ea55b129b587118a4e4940c5cd22a5bb8d48071f082bade3f5a"} err="failed to get container status \"1cabd41ef15b2ea55b129b587118a4e4940c5cd22a5bb8d48071f082bade3f5a\": rpc error: code = NotFound desc = could not 
find container \"1cabd41ef15b2ea55b129b587118a4e4940c5cd22a5bb8d48071f082bade3f5a\": container with ID starting with 1cabd41ef15b2ea55b129b587118a4e4940c5cd22a5bb8d48071f082bade3f5a not found: ID does not exist" Dec 03 07:02:22 crc kubenswrapper[4475]: I1203 07:02:22.782122 4475 scope.go:117] "RemoveContainer" containerID="10ee170fc3e6750ec50a3854b3955a7387be665dce40e0def5f66bbf1675ce1d" Dec 03 07:02:22 crc kubenswrapper[4475]: E1203 07:02:22.782360 4475 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"10ee170fc3e6750ec50a3854b3955a7387be665dce40e0def5f66bbf1675ce1d\": container with ID starting with 10ee170fc3e6750ec50a3854b3955a7387be665dce40e0def5f66bbf1675ce1d not found: ID does not exist" containerID="10ee170fc3e6750ec50a3854b3955a7387be665dce40e0def5f66bbf1675ce1d" Dec 03 07:02:22 crc kubenswrapper[4475]: I1203 07:02:22.782381 4475 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"10ee170fc3e6750ec50a3854b3955a7387be665dce40e0def5f66bbf1675ce1d"} err="failed to get container status \"10ee170fc3e6750ec50a3854b3955a7387be665dce40e0def5f66bbf1675ce1d\": rpc error: code = NotFound desc = could not find container \"10ee170fc3e6750ec50a3854b3955a7387be665dce40e0def5f66bbf1675ce1d\": container with ID starting with 10ee170fc3e6750ec50a3854b3955a7387be665dce40e0def5f66bbf1675ce1d not found: ID does not exist" Dec 03 07:02:22 crc kubenswrapper[4475]: I1203 07:02:22.782394 4475 scope.go:117] "RemoveContainer" containerID="1cabd41ef15b2ea55b129b587118a4e4940c5cd22a5bb8d48071f082bade3f5a" Dec 03 07:02:22 crc kubenswrapper[4475]: I1203 07:02:22.783344 4475 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1cabd41ef15b2ea55b129b587118a4e4940c5cd22a5bb8d48071f082bade3f5a"} err="failed to get container status \"1cabd41ef15b2ea55b129b587118a4e4940c5cd22a5bb8d48071f082bade3f5a\": rpc error: code = NotFound desc = 
could not find container \"1cabd41ef15b2ea55b129b587118a4e4940c5cd22a5bb8d48071f082bade3f5a\": container with ID starting with 1cabd41ef15b2ea55b129b587118a4e4940c5cd22a5bb8d48071f082bade3f5a not found: ID does not exist" Dec 03 07:02:22 crc kubenswrapper[4475]: I1203 07:02:22.783376 4475 scope.go:117] "RemoveContainer" containerID="10ee170fc3e6750ec50a3854b3955a7387be665dce40e0def5f66bbf1675ce1d" Dec 03 07:02:22 crc kubenswrapper[4475]: I1203 07:02:22.784343 4475 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"10ee170fc3e6750ec50a3854b3955a7387be665dce40e0def5f66bbf1675ce1d"} err="failed to get container status \"10ee170fc3e6750ec50a3854b3955a7387be665dce40e0def5f66bbf1675ce1d\": rpc error: code = NotFound desc = could not find container \"10ee170fc3e6750ec50a3854b3955a7387be665dce40e0def5f66bbf1675ce1d\": container with ID starting with 10ee170fc3e6750ec50a3854b3955a7387be665dce40e0def5f66bbf1675ce1d not found: ID does not exist" Dec 03 07:02:22 crc kubenswrapper[4475]: I1203 07:02:22.784371 4475 scope.go:117] "RemoveContainer" containerID="b456e80cebd3f1cb2834d881ccf2dc03f2fb2b8915ead4887890b643bcb93e24" Dec 03 07:02:22 crc kubenswrapper[4475]: I1203 07:02:22.793593 4475 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Dec 03 07:02:22 crc kubenswrapper[4475]: I1203 07:02:22.803640 4475 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Dec 03 07:02:22 crc kubenswrapper[4475]: I1203 07:02:22.807531 4475 scope.go:117] "RemoveContainer" containerID="d94e655325a244f34243cbd065ab9d070535b023df6273dc446a12b0e55c6477" Dec 03 07:02:22 crc kubenswrapper[4475]: I1203 07:02:22.810596 4475 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-metadata-0"] Dec 03 07:02:22 crc kubenswrapper[4475]: I1203 07:02:22.817849 4475 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Dec 03 07:02:22 crc kubenswrapper[4475]: E1203 
07:02:22.818293 4475 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="206cd7f6-428b-4bcf-974c-74a1242401d1" containerName="proxy-httpd" Dec 03 07:02:22 crc kubenswrapper[4475]: I1203 07:02:22.818310 4475 state_mem.go:107] "Deleted CPUSet assignment" podUID="206cd7f6-428b-4bcf-974c-74a1242401d1" containerName="proxy-httpd" Dec 03 07:02:22 crc kubenswrapper[4475]: E1203 07:02:22.818321 4475 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4b47a8e8-a852-4c60-8ee0-a68fcdf6d91f" containerName="nova-metadata-log" Dec 03 07:02:22 crc kubenswrapper[4475]: I1203 07:02:22.818328 4475 state_mem.go:107] "Deleted CPUSet assignment" podUID="4b47a8e8-a852-4c60-8ee0-a68fcdf6d91f" containerName="nova-metadata-log" Dec 03 07:02:22 crc kubenswrapper[4475]: E1203 07:02:22.818340 4475 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="206cd7f6-428b-4bcf-974c-74a1242401d1" containerName="ceilometer-central-agent" Dec 03 07:02:22 crc kubenswrapper[4475]: I1203 07:02:22.818345 4475 state_mem.go:107] "Deleted CPUSet assignment" podUID="206cd7f6-428b-4bcf-974c-74a1242401d1" containerName="ceilometer-central-agent" Dec 03 07:02:22 crc kubenswrapper[4475]: E1203 07:02:22.818354 4475 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="206cd7f6-428b-4bcf-974c-74a1242401d1" containerName="ceilometer-notification-agent" Dec 03 07:02:22 crc kubenswrapper[4475]: I1203 07:02:22.818360 4475 state_mem.go:107] "Deleted CPUSet assignment" podUID="206cd7f6-428b-4bcf-974c-74a1242401d1" containerName="ceilometer-notification-agent" Dec 03 07:02:22 crc kubenswrapper[4475]: E1203 07:02:22.818372 4475 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4b47a8e8-a852-4c60-8ee0-a68fcdf6d91f" containerName="nova-metadata-metadata" Dec 03 07:02:22 crc kubenswrapper[4475]: I1203 07:02:22.818377 4475 state_mem.go:107] "Deleted CPUSet assignment" podUID="4b47a8e8-a852-4c60-8ee0-a68fcdf6d91f" containerName="nova-metadata-metadata" Dec 
03 07:02:22 crc kubenswrapper[4475]: E1203 07:02:22.818386 4475 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="206cd7f6-428b-4bcf-974c-74a1242401d1" containerName="sg-core" Dec 03 07:02:22 crc kubenswrapper[4475]: I1203 07:02:22.818393 4475 state_mem.go:107] "Deleted CPUSet assignment" podUID="206cd7f6-428b-4bcf-974c-74a1242401d1" containerName="sg-core" Dec 03 07:02:22 crc kubenswrapper[4475]: I1203 07:02:22.818580 4475 memory_manager.go:354] "RemoveStaleState removing state" podUID="4b47a8e8-a852-4c60-8ee0-a68fcdf6d91f" containerName="nova-metadata-metadata" Dec 03 07:02:22 crc kubenswrapper[4475]: I1203 07:02:22.818593 4475 memory_manager.go:354] "RemoveStaleState removing state" podUID="4b47a8e8-a852-4c60-8ee0-a68fcdf6d91f" containerName="nova-metadata-log" Dec 03 07:02:22 crc kubenswrapper[4475]: I1203 07:02:22.818602 4475 memory_manager.go:354] "RemoveStaleState removing state" podUID="206cd7f6-428b-4bcf-974c-74a1242401d1" containerName="proxy-httpd" Dec 03 07:02:22 crc kubenswrapper[4475]: I1203 07:02:22.818622 4475 memory_manager.go:354] "RemoveStaleState removing state" podUID="206cd7f6-428b-4bcf-974c-74a1242401d1" containerName="ceilometer-central-agent" Dec 03 07:02:22 crc kubenswrapper[4475]: I1203 07:02:22.818629 4475 memory_manager.go:354] "RemoveStaleState removing state" podUID="206cd7f6-428b-4bcf-974c-74a1242401d1" containerName="sg-core" Dec 03 07:02:22 crc kubenswrapper[4475]: I1203 07:02:22.818636 4475 memory_manager.go:354] "RemoveStaleState removing state" podUID="206cd7f6-428b-4bcf-974c-74a1242401d1" containerName="ceilometer-notification-agent" Dec 03 07:02:22 crc kubenswrapper[4475]: I1203 07:02:22.820241 4475 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Dec 03 07:02:22 crc kubenswrapper[4475]: I1203 07:02:22.822244 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/a00cc5ea-a8e3-4936-b9cc-dc1bd20b7407-run-httpd\") pod \"ceilometer-0\" (UID: \"a00cc5ea-a8e3-4936-b9cc-dc1bd20b7407\") " pod="openstack/ceilometer-0" Dec 03 07:02:22 crc kubenswrapper[4475]: I1203 07:02:22.822298 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a00cc5ea-a8e3-4936-b9cc-dc1bd20b7407-config-data\") pod \"ceilometer-0\" (UID: \"a00cc5ea-a8e3-4936-b9cc-dc1bd20b7407\") " pod="openstack/ceilometer-0" Dec 03 07:02:22 crc kubenswrapper[4475]: I1203 07:02:22.822316 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a00cc5ea-a8e3-4936-b9cc-dc1bd20b7407-scripts\") pod \"ceilometer-0\" (UID: \"a00cc5ea-a8e3-4936-b9cc-dc1bd20b7407\") " pod="openstack/ceilometer-0" Dec 03 07:02:22 crc kubenswrapper[4475]: I1203 07:02:22.822381 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a00cc5ea-a8e3-4936-b9cc-dc1bd20b7407-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"a00cc5ea-a8e3-4936-b9cc-dc1bd20b7407\") " pod="openstack/ceilometer-0" Dec 03 07:02:22 crc kubenswrapper[4475]: I1203 07:02:22.822442 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-88lzf\" (UniqueName: \"kubernetes.io/projected/a00cc5ea-a8e3-4936-b9cc-dc1bd20b7407-kube-api-access-88lzf\") pod \"ceilometer-0\" (UID: \"a00cc5ea-a8e3-4936-b9cc-dc1bd20b7407\") " pod="openstack/ceilometer-0" Dec 03 07:02:22 crc kubenswrapper[4475]: I1203 07:02:22.822630 4475 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/a00cc5ea-a8e3-4936-b9cc-dc1bd20b7407-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"a00cc5ea-a8e3-4936-b9cc-dc1bd20b7407\") " pod="openstack/ceilometer-0" Dec 03 07:02:22 crc kubenswrapper[4475]: I1203 07:02:22.822713 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/a00cc5ea-a8e3-4936-b9cc-dc1bd20b7407-log-httpd\") pod \"ceilometer-0\" (UID: \"a00cc5ea-a8e3-4936-b9cc-dc1bd20b7407\") " pod="openstack/ceilometer-0" Dec 03 07:02:22 crc kubenswrapper[4475]: I1203 07:02:22.825311 4475 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Dec 03 07:02:22 crc kubenswrapper[4475]: I1203 07:02:22.825585 4475 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Dec 03 07:02:22 crc kubenswrapper[4475]: I1203 07:02:22.825741 4475 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Dec 03 07:02:22 crc kubenswrapper[4475]: I1203 07:02:22.844311 4475 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"] Dec 03 07:02:22 crc kubenswrapper[4475]: I1203 07:02:22.846188 4475 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Dec 03 07:02:22 crc kubenswrapper[4475]: I1203 07:02:22.846930 4475 scope.go:117] "RemoveContainer" containerID="acef07e05c489d38552ce792a6380a673769d90e4eafe57e1b60f5a84f4c9cef" Dec 03 07:02:22 crc kubenswrapper[4475]: I1203 07:02:22.848547 4475 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data" Dec 03 07:02:22 crc kubenswrapper[4475]: I1203 07:02:22.851504 4475 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-metadata-internal-svc" Dec 03 07:02:22 crc kubenswrapper[4475]: I1203 07:02:22.860112 4475 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Dec 03 07:02:22 crc kubenswrapper[4475]: I1203 07:02:22.897562 4475 scope.go:117] "RemoveContainer" containerID="c05b79d184f65249a34b88581d9bff9b8bd8091273e140fd3731cd82eaa389bc" Dec 03 07:02:22 crc kubenswrapper[4475]: I1203 07:02:22.916950 4475 scope.go:117] "RemoveContainer" containerID="b456e80cebd3f1cb2834d881ccf2dc03f2fb2b8915ead4887890b643bcb93e24" Dec 03 07:02:22 crc kubenswrapper[4475]: E1203 07:02:22.919222 4475 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b456e80cebd3f1cb2834d881ccf2dc03f2fb2b8915ead4887890b643bcb93e24\": container with ID starting with b456e80cebd3f1cb2834d881ccf2dc03f2fb2b8915ead4887890b643bcb93e24 not found: ID does not exist" containerID="b456e80cebd3f1cb2834d881ccf2dc03f2fb2b8915ead4887890b643bcb93e24" Dec 03 07:02:22 crc kubenswrapper[4475]: I1203 07:02:22.919263 4475 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b456e80cebd3f1cb2834d881ccf2dc03f2fb2b8915ead4887890b643bcb93e24"} err="failed to get container status \"b456e80cebd3f1cb2834d881ccf2dc03f2fb2b8915ead4887890b643bcb93e24\": rpc error: code = NotFound desc = could not find container 
\"b456e80cebd3f1cb2834d881ccf2dc03f2fb2b8915ead4887890b643bcb93e24\": container with ID starting with b456e80cebd3f1cb2834d881ccf2dc03f2fb2b8915ead4887890b643bcb93e24 not found: ID does not exist" Dec 03 07:02:22 crc kubenswrapper[4475]: I1203 07:02:22.919289 4475 scope.go:117] "RemoveContainer" containerID="d94e655325a244f34243cbd065ab9d070535b023df6273dc446a12b0e55c6477" Dec 03 07:02:22 crc kubenswrapper[4475]: E1203 07:02:22.919641 4475 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d94e655325a244f34243cbd065ab9d070535b023df6273dc446a12b0e55c6477\": container with ID starting with d94e655325a244f34243cbd065ab9d070535b023df6273dc446a12b0e55c6477 not found: ID does not exist" containerID="d94e655325a244f34243cbd065ab9d070535b023df6273dc446a12b0e55c6477" Dec 03 07:02:22 crc kubenswrapper[4475]: I1203 07:02:22.919712 4475 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d94e655325a244f34243cbd065ab9d070535b023df6273dc446a12b0e55c6477"} err="failed to get container status \"d94e655325a244f34243cbd065ab9d070535b023df6273dc446a12b0e55c6477\": rpc error: code = NotFound desc = could not find container \"d94e655325a244f34243cbd065ab9d070535b023df6273dc446a12b0e55c6477\": container with ID starting with d94e655325a244f34243cbd065ab9d070535b023df6273dc446a12b0e55c6477 not found: ID does not exist" Dec 03 07:02:22 crc kubenswrapper[4475]: I1203 07:02:22.919729 4475 scope.go:117] "RemoveContainer" containerID="acef07e05c489d38552ce792a6380a673769d90e4eafe57e1b60f5a84f4c9cef" Dec 03 07:02:22 crc kubenswrapper[4475]: E1203 07:02:22.919959 4475 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"acef07e05c489d38552ce792a6380a673769d90e4eafe57e1b60f5a84f4c9cef\": container with ID starting with acef07e05c489d38552ce792a6380a673769d90e4eafe57e1b60f5a84f4c9cef not found: ID does not exist" 
containerID="acef07e05c489d38552ce792a6380a673769d90e4eafe57e1b60f5a84f4c9cef" Dec 03 07:02:22 crc kubenswrapper[4475]: I1203 07:02:22.919981 4475 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"acef07e05c489d38552ce792a6380a673769d90e4eafe57e1b60f5a84f4c9cef"} err="failed to get container status \"acef07e05c489d38552ce792a6380a673769d90e4eafe57e1b60f5a84f4c9cef\": rpc error: code = NotFound desc = could not find container \"acef07e05c489d38552ce792a6380a673769d90e4eafe57e1b60f5a84f4c9cef\": container with ID starting with acef07e05c489d38552ce792a6380a673769d90e4eafe57e1b60f5a84f4c9cef not found: ID does not exist" Dec 03 07:02:22 crc kubenswrapper[4475]: I1203 07:02:22.919994 4475 scope.go:117] "RemoveContainer" containerID="c05b79d184f65249a34b88581d9bff9b8bd8091273e140fd3731cd82eaa389bc" Dec 03 07:02:22 crc kubenswrapper[4475]: E1203 07:02:22.920223 4475 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c05b79d184f65249a34b88581d9bff9b8bd8091273e140fd3731cd82eaa389bc\": container with ID starting with c05b79d184f65249a34b88581d9bff9b8bd8091273e140fd3731cd82eaa389bc not found: ID does not exist" containerID="c05b79d184f65249a34b88581d9bff9b8bd8091273e140fd3731cd82eaa389bc" Dec 03 07:02:22 crc kubenswrapper[4475]: I1203 07:02:22.920242 4475 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c05b79d184f65249a34b88581d9bff9b8bd8091273e140fd3731cd82eaa389bc"} err="failed to get container status \"c05b79d184f65249a34b88581d9bff9b8bd8091273e140fd3731cd82eaa389bc\": rpc error: code = NotFound desc = could not find container \"c05b79d184f65249a34b88581d9bff9b8bd8091273e140fd3731cd82eaa389bc\": container with ID starting with c05b79d184f65249a34b88581d9bff9b8bd8091273e140fd3731cd82eaa389bc not found: ID does not exist" Dec 03 07:02:22 crc kubenswrapper[4475]: I1203 07:02:22.924593 4475 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a00cc5ea-a8e3-4936-b9cc-dc1bd20b7407-config-data\") pod \"ceilometer-0\" (UID: \"a00cc5ea-a8e3-4936-b9cc-dc1bd20b7407\") " pod="openstack/ceilometer-0" Dec 03 07:02:22 crc kubenswrapper[4475]: I1203 07:02:22.924629 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a00cc5ea-a8e3-4936-b9cc-dc1bd20b7407-scripts\") pod \"ceilometer-0\" (UID: \"a00cc5ea-a8e3-4936-b9cc-dc1bd20b7407\") " pod="openstack/ceilometer-0" Dec 03 07:02:22 crc kubenswrapper[4475]: I1203 07:02:22.924652 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a00cc5ea-a8e3-4936-b9cc-dc1bd20b7407-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"a00cc5ea-a8e3-4936-b9cc-dc1bd20b7407\") " pod="openstack/ceilometer-0" Dec 03 07:02:22 crc kubenswrapper[4475]: I1203 07:02:22.924689 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-88lzf\" (UniqueName: \"kubernetes.io/projected/a00cc5ea-a8e3-4936-b9cc-dc1bd20b7407-kube-api-access-88lzf\") pod \"ceilometer-0\" (UID: \"a00cc5ea-a8e3-4936-b9cc-dc1bd20b7407\") " pod="openstack/ceilometer-0" Dec 03 07:02:22 crc kubenswrapper[4475]: I1203 07:02:22.924745 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d7nnd\" (UniqueName: \"kubernetes.io/projected/b52f22c6-66a3-4f58-aff9-40feb4dcbf8f-kube-api-access-d7nnd\") pod \"nova-metadata-0\" (UID: \"b52f22c6-66a3-4f58-aff9-40feb4dcbf8f\") " pod="openstack/nova-metadata-0" Dec 03 07:02:22 crc kubenswrapper[4475]: I1203 07:02:22.924768 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/b52f22c6-66a3-4f58-aff9-40feb4dcbf8f-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"b52f22c6-66a3-4f58-aff9-40feb4dcbf8f\") " pod="openstack/nova-metadata-0" Dec 03 07:02:22 crc kubenswrapper[4475]: I1203 07:02:22.924785 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/a00cc5ea-a8e3-4936-b9cc-dc1bd20b7407-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"a00cc5ea-a8e3-4936-b9cc-dc1bd20b7407\") " pod="openstack/ceilometer-0" Dec 03 07:02:22 crc kubenswrapper[4475]: I1203 07:02:22.924813 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/a00cc5ea-a8e3-4936-b9cc-dc1bd20b7407-log-httpd\") pod \"ceilometer-0\" (UID: \"a00cc5ea-a8e3-4936-b9cc-dc1bd20b7407\") " pod="openstack/ceilometer-0" Dec 03 07:02:22 crc kubenswrapper[4475]: I1203 07:02:22.924837 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b52f22c6-66a3-4f58-aff9-40feb4dcbf8f-config-data\") pod \"nova-metadata-0\" (UID: \"b52f22c6-66a3-4f58-aff9-40feb4dcbf8f\") " pod="openstack/nova-metadata-0" Dec 03 07:02:22 crc kubenswrapper[4475]: I1203 07:02:22.924859 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b52f22c6-66a3-4f58-aff9-40feb4dcbf8f-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"b52f22c6-66a3-4f58-aff9-40feb4dcbf8f\") " pod="openstack/nova-metadata-0" Dec 03 07:02:22 crc kubenswrapper[4475]: I1203 07:02:22.924879 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b52f22c6-66a3-4f58-aff9-40feb4dcbf8f-logs\") pod \"nova-metadata-0\" (UID: \"b52f22c6-66a3-4f58-aff9-40feb4dcbf8f\") " 
pod="openstack/nova-metadata-0" Dec 03 07:02:22 crc kubenswrapper[4475]: I1203 07:02:22.924897 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/a00cc5ea-a8e3-4936-b9cc-dc1bd20b7407-run-httpd\") pod \"ceilometer-0\" (UID: \"a00cc5ea-a8e3-4936-b9cc-dc1bd20b7407\") " pod="openstack/ceilometer-0" Dec 03 07:02:22 crc kubenswrapper[4475]: I1203 07:02:22.925340 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/a00cc5ea-a8e3-4936-b9cc-dc1bd20b7407-run-httpd\") pod \"ceilometer-0\" (UID: \"a00cc5ea-a8e3-4936-b9cc-dc1bd20b7407\") " pod="openstack/ceilometer-0" Dec 03 07:02:22 crc kubenswrapper[4475]: I1203 07:02:22.925610 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/a00cc5ea-a8e3-4936-b9cc-dc1bd20b7407-log-httpd\") pod \"ceilometer-0\" (UID: \"a00cc5ea-a8e3-4936-b9cc-dc1bd20b7407\") " pod="openstack/ceilometer-0" Dec 03 07:02:22 crc kubenswrapper[4475]: I1203 07:02:22.929648 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a00cc5ea-a8e3-4936-b9cc-dc1bd20b7407-config-data\") pod \"ceilometer-0\" (UID: \"a00cc5ea-a8e3-4936-b9cc-dc1bd20b7407\") " pod="openstack/ceilometer-0" Dec 03 07:02:22 crc kubenswrapper[4475]: I1203 07:02:22.930252 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a00cc5ea-a8e3-4936-b9cc-dc1bd20b7407-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"a00cc5ea-a8e3-4936-b9cc-dc1bd20b7407\") " pod="openstack/ceilometer-0" Dec 03 07:02:22 crc kubenswrapper[4475]: I1203 07:02:22.930757 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: 
\"kubernetes.io/secret/a00cc5ea-a8e3-4936-b9cc-dc1bd20b7407-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"a00cc5ea-a8e3-4936-b9cc-dc1bd20b7407\") " pod="openstack/ceilometer-0" Dec 03 07:02:22 crc kubenswrapper[4475]: I1203 07:02:22.937597 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a00cc5ea-a8e3-4936-b9cc-dc1bd20b7407-scripts\") pod \"ceilometer-0\" (UID: \"a00cc5ea-a8e3-4936-b9cc-dc1bd20b7407\") " pod="openstack/ceilometer-0" Dec 03 07:02:22 crc kubenswrapper[4475]: I1203 07:02:22.939273 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-88lzf\" (UniqueName: \"kubernetes.io/projected/a00cc5ea-a8e3-4936-b9cc-dc1bd20b7407-kube-api-access-88lzf\") pod \"ceilometer-0\" (UID: \"a00cc5ea-a8e3-4936-b9cc-dc1bd20b7407\") " pod="openstack/ceilometer-0" Dec 03 07:02:23 crc kubenswrapper[4475]: I1203 07:02:23.026020 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-d7nnd\" (UniqueName: \"kubernetes.io/projected/b52f22c6-66a3-4f58-aff9-40feb4dcbf8f-kube-api-access-d7nnd\") pod \"nova-metadata-0\" (UID: \"b52f22c6-66a3-4f58-aff9-40feb4dcbf8f\") " pod="openstack/nova-metadata-0" Dec 03 07:02:23 crc kubenswrapper[4475]: I1203 07:02:23.026409 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/b52f22c6-66a3-4f58-aff9-40feb4dcbf8f-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"b52f22c6-66a3-4f58-aff9-40feb4dcbf8f\") " pod="openstack/nova-metadata-0" Dec 03 07:02:23 crc kubenswrapper[4475]: I1203 07:02:23.026904 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b52f22c6-66a3-4f58-aff9-40feb4dcbf8f-config-data\") pod \"nova-metadata-0\" (UID: \"b52f22c6-66a3-4f58-aff9-40feb4dcbf8f\") " pod="openstack/nova-metadata-0" Dec 03 
07:02:23 crc kubenswrapper[4475]: I1203 07:02:23.027051 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b52f22c6-66a3-4f58-aff9-40feb4dcbf8f-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"b52f22c6-66a3-4f58-aff9-40feb4dcbf8f\") " pod="openstack/nova-metadata-0" Dec 03 07:02:23 crc kubenswrapper[4475]: I1203 07:02:23.027163 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b52f22c6-66a3-4f58-aff9-40feb4dcbf8f-logs\") pod \"nova-metadata-0\" (UID: \"b52f22c6-66a3-4f58-aff9-40feb4dcbf8f\") " pod="openstack/nova-metadata-0" Dec 03 07:02:23 crc kubenswrapper[4475]: I1203 07:02:23.027567 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b52f22c6-66a3-4f58-aff9-40feb4dcbf8f-logs\") pod \"nova-metadata-0\" (UID: \"b52f22c6-66a3-4f58-aff9-40feb4dcbf8f\") " pod="openstack/nova-metadata-0" Dec 03 07:02:23 crc kubenswrapper[4475]: I1203 07:02:23.030892 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/b52f22c6-66a3-4f58-aff9-40feb4dcbf8f-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"b52f22c6-66a3-4f58-aff9-40feb4dcbf8f\") " pod="openstack/nova-metadata-0" Dec 03 07:02:23 crc kubenswrapper[4475]: I1203 07:02:23.030977 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b52f22c6-66a3-4f58-aff9-40feb4dcbf8f-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"b52f22c6-66a3-4f58-aff9-40feb4dcbf8f\") " pod="openstack/nova-metadata-0" Dec 03 07:02:23 crc kubenswrapper[4475]: I1203 07:02:23.031435 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/b52f22c6-66a3-4f58-aff9-40feb4dcbf8f-config-data\") pod \"nova-metadata-0\" (UID: \"b52f22c6-66a3-4f58-aff9-40feb4dcbf8f\") " pod="openstack/nova-metadata-0" Dec 03 07:02:23 crc kubenswrapper[4475]: I1203 07:02:23.042921 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-d7nnd\" (UniqueName: \"kubernetes.io/projected/b52f22c6-66a3-4f58-aff9-40feb4dcbf8f-kube-api-access-d7nnd\") pod \"nova-metadata-0\" (UID: \"b52f22c6-66a3-4f58-aff9-40feb4dcbf8f\") " pod="openstack/nova-metadata-0" Dec 03 07:02:23 crc kubenswrapper[4475]: I1203 07:02:23.142022 4475 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Dec 03 07:02:23 crc kubenswrapper[4475]: I1203 07:02:23.170217 4475 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Dec 03 07:02:23 crc kubenswrapper[4475]: I1203 07:02:23.503718 4475 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="206cd7f6-428b-4bcf-974c-74a1242401d1" path="/var/lib/kubelet/pods/206cd7f6-428b-4bcf-974c-74a1242401d1/volumes" Dec 03 07:02:23 crc kubenswrapper[4475]: I1203 07:02:23.507271 4475 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4b47a8e8-a852-4c60-8ee0-a68fcdf6d91f" path="/var/lib/kubelet/pods/4b47a8e8-a852-4c60-8ee0-a68fcdf6d91f/volumes" Dec 03 07:02:23 crc kubenswrapper[4475]: I1203 07:02:23.559097 4475 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Dec 03 07:02:23 crc kubenswrapper[4475]: I1203 07:02:23.643475 4475 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Dec 03 07:02:23 crc kubenswrapper[4475]: I1203 07:02:23.746976 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-0" event={"ID":"18820a7a-d74d-4451-b927-a7b199d02185","Type":"ContainerStarted","Data":"ad25d761adeb711d02ab24f1274af7f6f89938d27a04a6d92b65659732e20c10"} 
Dec 03 07:02:23 crc kubenswrapper[4475]: I1203 07:02:23.747692 4475 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell1-conductor-0" Dec 03 07:02:23 crc kubenswrapper[4475]: I1203 07:02:23.751016 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"b52f22c6-66a3-4f58-aff9-40feb4dcbf8f","Type":"ContainerStarted","Data":"679a6d2d5e89a9c9a181117fe8aed189e021c9f36944c08c1de69710d2b6852e"} Dec 03 07:02:23 crc kubenswrapper[4475]: I1203 07:02:23.752276 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"a00cc5ea-a8e3-4936-b9cc-dc1bd20b7407","Type":"ContainerStarted","Data":"2007442f7b5b1ad6ae76ece314a6829d4f71c5ddf18bc6f0a4975fba742f5542"} Dec 03 07:02:23 crc kubenswrapper[4475]: I1203 07:02:23.762408 4475 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-conductor-0" podStartSLOduration=2.762393285 podStartE2EDuration="2.762393285s" podCreationTimestamp="2025-12-03 07:02:21 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 07:02:23.760090944 +0000 UTC m=+1028.564989278" watchObservedRunningTime="2025-12-03 07:02:23.762393285 +0000 UTC m=+1028.567291619" Dec 03 07:02:24 crc kubenswrapper[4475]: I1203 07:02:24.392201 4475 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Dec 03 07:02:24 crc kubenswrapper[4475]: I1203 07:02:24.563179 4475 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a9ae60aa-857f-48c8-80d0-a6f3d7490395-config-data\") pod \"a9ae60aa-857f-48c8-80d0-a6f3d7490395\" (UID: \"a9ae60aa-857f-48c8-80d0-a6f3d7490395\") " Dec 03 07:02:24 crc kubenswrapper[4475]: I1203 07:02:24.563255 4475 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4kmmz\" (UniqueName: \"kubernetes.io/projected/a9ae60aa-857f-48c8-80d0-a6f3d7490395-kube-api-access-4kmmz\") pod \"a9ae60aa-857f-48c8-80d0-a6f3d7490395\" (UID: \"a9ae60aa-857f-48c8-80d0-a6f3d7490395\") " Dec 03 07:02:24 crc kubenswrapper[4475]: I1203 07:02:24.563423 4475 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a9ae60aa-857f-48c8-80d0-a6f3d7490395-combined-ca-bundle\") pod \"a9ae60aa-857f-48c8-80d0-a6f3d7490395\" (UID: \"a9ae60aa-857f-48c8-80d0-a6f3d7490395\") " Dec 03 07:02:24 crc kubenswrapper[4475]: I1203 07:02:24.576585 4475 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a9ae60aa-857f-48c8-80d0-a6f3d7490395-kube-api-access-4kmmz" (OuterVolumeSpecName: "kube-api-access-4kmmz") pod "a9ae60aa-857f-48c8-80d0-a6f3d7490395" (UID: "a9ae60aa-857f-48c8-80d0-a6f3d7490395"). InnerVolumeSpecName "kube-api-access-4kmmz". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 07:02:24 crc kubenswrapper[4475]: I1203 07:02:24.583219 4475 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a9ae60aa-857f-48c8-80d0-a6f3d7490395-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "a9ae60aa-857f-48c8-80d0-a6f3d7490395" (UID: "a9ae60aa-857f-48c8-80d0-a6f3d7490395"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 07:02:24 crc kubenswrapper[4475]: I1203 07:02:24.591848 4475 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a9ae60aa-857f-48c8-80d0-a6f3d7490395-config-data" (OuterVolumeSpecName: "config-data") pod "a9ae60aa-857f-48c8-80d0-a6f3d7490395" (UID: "a9ae60aa-857f-48c8-80d0-a6f3d7490395"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 07:02:24 crc kubenswrapper[4475]: I1203 07:02:24.666124 4475 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a9ae60aa-857f-48c8-80d0-a6f3d7490395-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 03 07:02:24 crc kubenswrapper[4475]: I1203 07:02:24.667235 4475 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a9ae60aa-857f-48c8-80d0-a6f3d7490395-config-data\") on node \"crc\" DevicePath \"\"" Dec 03 07:02:24 crc kubenswrapper[4475]: I1203 07:02:24.667256 4475 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4kmmz\" (UniqueName: \"kubernetes.io/projected/a9ae60aa-857f-48c8-80d0-a6f3d7490395-kube-api-access-4kmmz\") on node \"crc\" DevicePath \"\"" Dec 03 07:02:24 crc kubenswrapper[4475]: I1203 07:02:24.761872 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"b52f22c6-66a3-4f58-aff9-40feb4dcbf8f","Type":"ContainerStarted","Data":"d8623dc920e9f565014e0552a5750cd818f349aa0e0efcf34cd5682da8d7fa09"} Dec 03 07:02:24 crc kubenswrapper[4475]: I1203 07:02:24.761912 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"b52f22c6-66a3-4f58-aff9-40feb4dcbf8f","Type":"ContainerStarted","Data":"6a47bc6f5f3d3c78c59650aff54b81c8cfbef488f4647b689f43016a9cb851c5"} Dec 03 07:02:24 crc kubenswrapper[4475]: I1203 07:02:24.764277 4475 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"a00cc5ea-a8e3-4936-b9cc-dc1bd20b7407","Type":"ContainerStarted","Data":"1c44707148557c46d7f9b0d05553f381cdc59ec68eb4d95e31e2dae4d9d22b3a"} Dec 03 07:02:24 crc kubenswrapper[4475]: I1203 07:02:24.765382 4475 generic.go:334] "Generic (PLEG): container finished" podID="a9ae60aa-857f-48c8-80d0-a6f3d7490395" containerID="995eba915ebc5b5247cf60ddb9a9798927d684ac44c522f057d5c36edfbdce26" exitCode=0 Dec 03 07:02:24 crc kubenswrapper[4475]: I1203 07:02:24.765966 4475 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Dec 03 07:02:24 crc kubenswrapper[4475]: I1203 07:02:24.768175 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"a9ae60aa-857f-48c8-80d0-a6f3d7490395","Type":"ContainerDied","Data":"995eba915ebc5b5247cf60ddb9a9798927d684ac44c522f057d5c36edfbdce26"} Dec 03 07:02:24 crc kubenswrapper[4475]: I1203 07:02:24.768207 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"a9ae60aa-857f-48c8-80d0-a6f3d7490395","Type":"ContainerDied","Data":"37690707ca7b02167e57f21a4813bae1885193e97d71bb7f4e3c02367c2e4459"} Dec 03 07:02:24 crc kubenswrapper[4475]: I1203 07:02:24.768226 4475 scope.go:117] "RemoveContainer" containerID="995eba915ebc5b5247cf60ddb9a9798927d684ac44c522f057d5c36edfbdce26" Dec 03 07:02:24 crc kubenswrapper[4475]: I1203 07:02:24.790591 4475 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=2.79055997 podStartE2EDuration="2.79055997s" podCreationTimestamp="2025-12-03 07:02:22 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 07:02:24.783419059 +0000 UTC m=+1029.588317393" watchObservedRunningTime="2025-12-03 07:02:24.79055997 +0000 UTC m=+1029.595458305" Dec 03 07:02:24 crc 
kubenswrapper[4475]: I1203 07:02:24.813192 4475 scope.go:117] "RemoveContainer" containerID="995eba915ebc5b5247cf60ddb9a9798927d684ac44c522f057d5c36edfbdce26" Dec 03 07:02:24 crc kubenswrapper[4475]: E1203 07:02:24.813685 4475 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"995eba915ebc5b5247cf60ddb9a9798927d684ac44c522f057d5c36edfbdce26\": container with ID starting with 995eba915ebc5b5247cf60ddb9a9798927d684ac44c522f057d5c36edfbdce26 not found: ID does not exist" containerID="995eba915ebc5b5247cf60ddb9a9798927d684ac44c522f057d5c36edfbdce26" Dec 03 07:02:24 crc kubenswrapper[4475]: I1203 07:02:24.813739 4475 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"995eba915ebc5b5247cf60ddb9a9798927d684ac44c522f057d5c36edfbdce26"} err="failed to get container status \"995eba915ebc5b5247cf60ddb9a9798927d684ac44c522f057d5c36edfbdce26\": rpc error: code = NotFound desc = could not find container \"995eba915ebc5b5247cf60ddb9a9798927d684ac44c522f057d5c36edfbdce26\": container with ID starting with 995eba915ebc5b5247cf60ddb9a9798927d684ac44c522f057d5c36edfbdce26 not found: ID does not exist" Dec 03 07:02:24 crc kubenswrapper[4475]: I1203 07:02:24.818439 4475 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Dec 03 07:02:24 crc kubenswrapper[4475]: I1203 07:02:24.831355 4475 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-scheduler-0"] Dec 03 07:02:24 crc kubenswrapper[4475]: I1203 07:02:24.837628 4475 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-scheduler-0"] Dec 03 07:02:24 crc kubenswrapper[4475]: E1203 07:02:24.838071 4475 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a9ae60aa-857f-48c8-80d0-a6f3d7490395" containerName="nova-scheduler-scheduler" Dec 03 07:02:24 crc kubenswrapper[4475]: I1203 07:02:24.838149 4475 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="a9ae60aa-857f-48c8-80d0-a6f3d7490395" containerName="nova-scheduler-scheduler" Dec 03 07:02:24 crc kubenswrapper[4475]: I1203 07:02:24.838321 4475 memory_manager.go:354] "RemoveStaleState removing state" podUID="a9ae60aa-857f-48c8-80d0-a6f3d7490395" containerName="nova-scheduler-scheduler" Dec 03 07:02:24 crc kubenswrapper[4475]: I1203 07:02:24.838993 4475 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Dec 03 07:02:24 crc kubenswrapper[4475]: I1203 07:02:24.843239 4475 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-scheduler-config-data" Dec 03 07:02:24 crc kubenswrapper[4475]: I1203 07:02:24.847426 4475 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Dec 03 07:02:24 crc kubenswrapper[4475]: I1203 07:02:24.973278 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/daf693e8-1494-4da1-afdd-2cd6dbef665d-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"daf693e8-1494-4da1-afdd-2cd6dbef665d\") " pod="openstack/nova-scheduler-0" Dec 03 07:02:24 crc kubenswrapper[4475]: I1203 07:02:24.973558 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/daf693e8-1494-4da1-afdd-2cd6dbef665d-config-data\") pod \"nova-scheduler-0\" (UID: \"daf693e8-1494-4da1-afdd-2cd6dbef665d\") " pod="openstack/nova-scheduler-0" Dec 03 07:02:24 crc kubenswrapper[4475]: I1203 07:02:24.973772 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xx5wm\" (UniqueName: \"kubernetes.io/projected/daf693e8-1494-4da1-afdd-2cd6dbef665d-kube-api-access-xx5wm\") pod \"nova-scheduler-0\" (UID: \"daf693e8-1494-4da1-afdd-2cd6dbef665d\") " pod="openstack/nova-scheduler-0" Dec 03 07:02:25 crc kubenswrapper[4475]: 
I1203 07:02:25.075317 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/daf693e8-1494-4da1-afdd-2cd6dbef665d-config-data\") pod \"nova-scheduler-0\" (UID: \"daf693e8-1494-4da1-afdd-2cd6dbef665d\") " pod="openstack/nova-scheduler-0" Dec 03 07:02:25 crc kubenswrapper[4475]: I1203 07:02:25.075405 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xx5wm\" (UniqueName: \"kubernetes.io/projected/daf693e8-1494-4da1-afdd-2cd6dbef665d-kube-api-access-xx5wm\") pod \"nova-scheduler-0\" (UID: \"daf693e8-1494-4da1-afdd-2cd6dbef665d\") " pod="openstack/nova-scheduler-0" Dec 03 07:02:25 crc kubenswrapper[4475]: I1203 07:02:25.075467 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/daf693e8-1494-4da1-afdd-2cd6dbef665d-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"daf693e8-1494-4da1-afdd-2cd6dbef665d\") " pod="openstack/nova-scheduler-0" Dec 03 07:02:25 crc kubenswrapper[4475]: I1203 07:02:25.078871 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/daf693e8-1494-4da1-afdd-2cd6dbef665d-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"daf693e8-1494-4da1-afdd-2cd6dbef665d\") " pod="openstack/nova-scheduler-0" Dec 03 07:02:25 crc kubenswrapper[4475]: I1203 07:02:25.079172 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/daf693e8-1494-4da1-afdd-2cd6dbef665d-config-data\") pod \"nova-scheduler-0\" (UID: \"daf693e8-1494-4da1-afdd-2cd6dbef665d\") " pod="openstack/nova-scheduler-0" Dec 03 07:02:25 crc kubenswrapper[4475]: I1203 07:02:25.093827 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xx5wm\" (UniqueName: 
\"kubernetes.io/projected/daf693e8-1494-4da1-afdd-2cd6dbef665d-kube-api-access-xx5wm\") pod \"nova-scheduler-0\" (UID: \"daf693e8-1494-4da1-afdd-2cd6dbef665d\") " pod="openstack/nova-scheduler-0" Dec 03 07:02:25 crc kubenswrapper[4475]: I1203 07:02:25.159779 4475 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Dec 03 07:02:25 crc kubenswrapper[4475]: I1203 07:02:25.425954 4475 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Dec 03 07:02:25 crc kubenswrapper[4475]: I1203 07:02:25.485035 4475 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1091a855-d763-4e97-aa03-66e64d9aae45-combined-ca-bundle\") pod \"1091a855-d763-4e97-aa03-66e64d9aae45\" (UID: \"1091a855-d763-4e97-aa03-66e64d9aae45\") " Dec 03 07:02:25 crc kubenswrapper[4475]: I1203 07:02:25.485131 4475 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1091a855-d763-4e97-aa03-66e64d9aae45-config-data\") pod \"1091a855-d763-4e97-aa03-66e64d9aae45\" (UID: \"1091a855-d763-4e97-aa03-66e64d9aae45\") " Dec 03 07:02:25 crc kubenswrapper[4475]: I1203 07:02:25.485171 4475 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tw578\" (UniqueName: \"kubernetes.io/projected/1091a855-d763-4e97-aa03-66e64d9aae45-kube-api-access-tw578\") pod \"1091a855-d763-4e97-aa03-66e64d9aae45\" (UID: \"1091a855-d763-4e97-aa03-66e64d9aae45\") " Dec 03 07:02:25 crc kubenswrapper[4475]: I1203 07:02:25.485278 4475 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1091a855-d763-4e97-aa03-66e64d9aae45-logs\") pod \"1091a855-d763-4e97-aa03-66e64d9aae45\" (UID: \"1091a855-d763-4e97-aa03-66e64d9aae45\") " Dec 03 07:02:25 crc kubenswrapper[4475]: I1203 
07:02:25.486277 4475 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1091a855-d763-4e97-aa03-66e64d9aae45-logs" (OuterVolumeSpecName: "logs") pod "1091a855-d763-4e97-aa03-66e64d9aae45" (UID: "1091a855-d763-4e97-aa03-66e64d9aae45"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 07:02:25 crc kubenswrapper[4475]: I1203 07:02:25.493987 4475 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1091a855-d763-4e97-aa03-66e64d9aae45-kube-api-access-tw578" (OuterVolumeSpecName: "kube-api-access-tw578") pod "1091a855-d763-4e97-aa03-66e64d9aae45" (UID: "1091a855-d763-4e97-aa03-66e64d9aae45"). InnerVolumeSpecName "kube-api-access-tw578". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 07:02:25 crc kubenswrapper[4475]: I1203 07:02:25.522764 4475 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a9ae60aa-857f-48c8-80d0-a6f3d7490395" path="/var/lib/kubelet/pods/a9ae60aa-857f-48c8-80d0-a6f3d7490395/volumes" Dec 03 07:02:25 crc kubenswrapper[4475]: I1203 07:02:25.545134 4475 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1091a855-d763-4e97-aa03-66e64d9aae45-config-data" (OuterVolumeSpecName: "config-data") pod "1091a855-d763-4e97-aa03-66e64d9aae45" (UID: "1091a855-d763-4e97-aa03-66e64d9aae45"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 07:02:25 crc kubenswrapper[4475]: I1203 07:02:25.546019 4475 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1091a855-d763-4e97-aa03-66e64d9aae45-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "1091a855-d763-4e97-aa03-66e64d9aae45" (UID: "1091a855-d763-4e97-aa03-66e64d9aae45"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 07:02:25 crc kubenswrapper[4475]: I1203 07:02:25.586728 4475 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1091a855-d763-4e97-aa03-66e64d9aae45-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 03 07:02:25 crc kubenswrapper[4475]: I1203 07:02:25.586751 4475 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1091a855-d763-4e97-aa03-66e64d9aae45-config-data\") on node \"crc\" DevicePath \"\"" Dec 03 07:02:25 crc kubenswrapper[4475]: I1203 07:02:25.586761 4475 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tw578\" (UniqueName: \"kubernetes.io/projected/1091a855-d763-4e97-aa03-66e64d9aae45-kube-api-access-tw578\") on node \"crc\" DevicePath \"\"" Dec 03 07:02:25 crc kubenswrapper[4475]: I1203 07:02:25.586769 4475 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1091a855-d763-4e97-aa03-66e64d9aae45-logs\") on node \"crc\" DevicePath \"\"" Dec 03 07:02:25 crc kubenswrapper[4475]: I1203 07:02:25.608754 4475 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Dec 03 07:02:25 crc kubenswrapper[4475]: I1203 07:02:25.779747 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"daf693e8-1494-4da1-afdd-2cd6dbef665d","Type":"ContainerStarted","Data":"d81f73ccbee54b25fda537d67990f2d5c792ef159a8ca28d206b23c7f3a014e4"} Dec 03 07:02:25 crc kubenswrapper[4475]: I1203 07:02:25.784445 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"a00cc5ea-a8e3-4936-b9cc-dc1bd20b7407","Type":"ContainerStarted","Data":"5361a2f551d6debb410f56107ce30cec6539acd8ae8951949d9e9fb080244c23"} Dec 03 07:02:25 crc kubenswrapper[4475]: I1203 07:02:25.787220 4475 generic.go:334] "Generic (PLEG): container finished" 
podID="1091a855-d763-4e97-aa03-66e64d9aae45" containerID="d04fdf66539d5824449205434c334970601b18a7c9162d74555c064584188c64" exitCode=0 Dec 03 07:02:25 crc kubenswrapper[4475]: I1203 07:02:25.787300 4475 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Dec 03 07:02:25 crc kubenswrapper[4475]: I1203 07:02:25.787320 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"1091a855-d763-4e97-aa03-66e64d9aae45","Type":"ContainerDied","Data":"d04fdf66539d5824449205434c334970601b18a7c9162d74555c064584188c64"} Dec 03 07:02:25 crc kubenswrapper[4475]: I1203 07:02:25.787840 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"1091a855-d763-4e97-aa03-66e64d9aae45","Type":"ContainerDied","Data":"97462c3210d5b4ab38efb70aee484825f6072b81eeaaf5d7ed794aa5cbcd36b5"} Dec 03 07:02:25 crc kubenswrapper[4475]: I1203 07:02:25.787927 4475 scope.go:117] "RemoveContainer" containerID="d04fdf66539d5824449205434c334970601b18a7c9162d74555c064584188c64" Dec 03 07:02:25 crc kubenswrapper[4475]: I1203 07:02:25.819268 4475 scope.go:117] "RemoveContainer" containerID="90fd3222b68882aad0e8976a95f5831204b3e58a81842c1a8c4e1184ad0ccdf0" Dec 03 07:02:25 crc kubenswrapper[4475]: I1203 07:02:25.824175 4475 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Dec 03 07:02:25 crc kubenswrapper[4475]: I1203 07:02:25.834292 4475 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-0"] Dec 03 07:02:25 crc kubenswrapper[4475]: I1203 07:02:25.847979 4475 scope.go:117] "RemoveContainer" containerID="d04fdf66539d5824449205434c334970601b18a7c9162d74555c064584188c64" Dec 03 07:02:25 crc kubenswrapper[4475]: E1203 07:02:25.848351 4475 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d04fdf66539d5824449205434c334970601b18a7c9162d74555c064584188c64\": container with 
ID starting with d04fdf66539d5824449205434c334970601b18a7c9162d74555c064584188c64 not found: ID does not exist" containerID="d04fdf66539d5824449205434c334970601b18a7c9162d74555c064584188c64" Dec 03 07:02:25 crc kubenswrapper[4475]: I1203 07:02:25.848435 4475 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d04fdf66539d5824449205434c334970601b18a7c9162d74555c064584188c64"} err="failed to get container status \"d04fdf66539d5824449205434c334970601b18a7c9162d74555c064584188c64\": rpc error: code = NotFound desc = could not find container \"d04fdf66539d5824449205434c334970601b18a7c9162d74555c064584188c64\": container with ID starting with d04fdf66539d5824449205434c334970601b18a7c9162d74555c064584188c64 not found: ID does not exist" Dec 03 07:02:25 crc kubenswrapper[4475]: I1203 07:02:25.848539 4475 scope.go:117] "RemoveContainer" containerID="90fd3222b68882aad0e8976a95f5831204b3e58a81842c1a8c4e1184ad0ccdf0" Dec 03 07:02:25 crc kubenswrapper[4475]: E1203 07:02:25.849717 4475 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"90fd3222b68882aad0e8976a95f5831204b3e58a81842c1a8c4e1184ad0ccdf0\": container with ID starting with 90fd3222b68882aad0e8976a95f5831204b3e58a81842c1a8c4e1184ad0ccdf0 not found: ID does not exist" containerID="90fd3222b68882aad0e8976a95f5831204b3e58a81842c1a8c4e1184ad0ccdf0" Dec 03 07:02:25 crc kubenswrapper[4475]: I1203 07:02:25.849809 4475 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"90fd3222b68882aad0e8976a95f5831204b3e58a81842c1a8c4e1184ad0ccdf0"} err="failed to get container status \"90fd3222b68882aad0e8976a95f5831204b3e58a81842c1a8c4e1184ad0ccdf0\": rpc error: code = NotFound desc = could not find container \"90fd3222b68882aad0e8976a95f5831204b3e58a81842c1a8c4e1184ad0ccdf0\": container with ID starting with 90fd3222b68882aad0e8976a95f5831204b3e58a81842c1a8c4e1184ad0ccdf0 not 
found: ID does not exist" Dec 03 07:02:25 crc kubenswrapper[4475]: I1203 07:02:25.856035 4475 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] Dec 03 07:02:25 crc kubenswrapper[4475]: E1203 07:02:25.856481 4475 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1091a855-d763-4e97-aa03-66e64d9aae45" containerName="nova-api-api" Dec 03 07:02:25 crc kubenswrapper[4475]: I1203 07:02:25.856563 4475 state_mem.go:107] "Deleted CPUSet assignment" podUID="1091a855-d763-4e97-aa03-66e64d9aae45" containerName="nova-api-api" Dec 03 07:02:25 crc kubenswrapper[4475]: E1203 07:02:25.856650 4475 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1091a855-d763-4e97-aa03-66e64d9aae45" containerName="nova-api-log" Dec 03 07:02:25 crc kubenswrapper[4475]: I1203 07:02:25.856710 4475 state_mem.go:107] "Deleted CPUSet assignment" podUID="1091a855-d763-4e97-aa03-66e64d9aae45" containerName="nova-api-log" Dec 03 07:02:25 crc kubenswrapper[4475]: I1203 07:02:25.856922 4475 memory_manager.go:354] "RemoveStaleState removing state" podUID="1091a855-d763-4e97-aa03-66e64d9aae45" containerName="nova-api-log" Dec 03 07:02:25 crc kubenswrapper[4475]: I1203 07:02:25.856982 4475 memory_manager.go:354] "RemoveStaleState removing state" podUID="1091a855-d763-4e97-aa03-66e64d9aae45" containerName="nova-api-api" Dec 03 07:02:25 crc kubenswrapper[4475]: I1203 07:02:25.858070 4475 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Dec 03 07:02:25 crc kubenswrapper[4475]: I1203 07:02:25.861220 4475 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Dec 03 07:02:25 crc kubenswrapper[4475]: I1203 07:02:25.875590 4475 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Dec 03 07:02:25 crc kubenswrapper[4475]: I1203 07:02:25.892650 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/da67b3db-e99c-4011-8071-0488afd0ab23-config-data\") pod \"nova-api-0\" (UID: \"da67b3db-e99c-4011-8071-0488afd0ab23\") " pod="openstack/nova-api-0" Dec 03 07:02:25 crc kubenswrapper[4475]: I1203 07:02:25.892704 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/da67b3db-e99c-4011-8071-0488afd0ab23-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"da67b3db-e99c-4011-8071-0488afd0ab23\") " pod="openstack/nova-api-0" Dec 03 07:02:25 crc kubenswrapper[4475]: I1203 07:02:25.892727 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/da67b3db-e99c-4011-8071-0488afd0ab23-logs\") pod \"nova-api-0\" (UID: \"da67b3db-e99c-4011-8071-0488afd0ab23\") " pod="openstack/nova-api-0" Dec 03 07:02:25 crc kubenswrapper[4475]: I1203 07:02:25.892805 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jxkl6\" (UniqueName: \"kubernetes.io/projected/da67b3db-e99c-4011-8071-0488afd0ab23-kube-api-access-jxkl6\") pod \"nova-api-0\" (UID: \"da67b3db-e99c-4011-8071-0488afd0ab23\") " pod="openstack/nova-api-0" Dec 03 07:02:25 crc kubenswrapper[4475]: I1203 07:02:25.993743 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access-jxkl6\" (UniqueName: \"kubernetes.io/projected/da67b3db-e99c-4011-8071-0488afd0ab23-kube-api-access-jxkl6\") pod \"nova-api-0\" (UID: \"da67b3db-e99c-4011-8071-0488afd0ab23\") " pod="openstack/nova-api-0" Dec 03 07:02:25 crc kubenswrapper[4475]: I1203 07:02:25.993844 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/da67b3db-e99c-4011-8071-0488afd0ab23-config-data\") pod \"nova-api-0\" (UID: \"da67b3db-e99c-4011-8071-0488afd0ab23\") " pod="openstack/nova-api-0" Dec 03 07:02:25 crc kubenswrapper[4475]: I1203 07:02:25.993879 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/da67b3db-e99c-4011-8071-0488afd0ab23-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"da67b3db-e99c-4011-8071-0488afd0ab23\") " pod="openstack/nova-api-0" Dec 03 07:02:25 crc kubenswrapper[4475]: I1203 07:02:25.993899 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/da67b3db-e99c-4011-8071-0488afd0ab23-logs\") pod \"nova-api-0\" (UID: \"da67b3db-e99c-4011-8071-0488afd0ab23\") " pod="openstack/nova-api-0" Dec 03 07:02:25 crc kubenswrapper[4475]: I1203 07:02:25.994254 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/da67b3db-e99c-4011-8071-0488afd0ab23-logs\") pod \"nova-api-0\" (UID: \"da67b3db-e99c-4011-8071-0488afd0ab23\") " pod="openstack/nova-api-0" Dec 03 07:02:25 crc kubenswrapper[4475]: I1203 07:02:25.996972 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/da67b3db-e99c-4011-8071-0488afd0ab23-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"da67b3db-e99c-4011-8071-0488afd0ab23\") " pod="openstack/nova-api-0" Dec 03 07:02:25 crc kubenswrapper[4475]: I1203 07:02:25.997605 
4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/da67b3db-e99c-4011-8071-0488afd0ab23-config-data\") pod \"nova-api-0\" (UID: \"da67b3db-e99c-4011-8071-0488afd0ab23\") " pod="openstack/nova-api-0" Dec 03 07:02:26 crc kubenswrapper[4475]: I1203 07:02:26.007881 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jxkl6\" (UniqueName: \"kubernetes.io/projected/da67b3db-e99c-4011-8071-0488afd0ab23-kube-api-access-jxkl6\") pod \"nova-api-0\" (UID: \"da67b3db-e99c-4011-8071-0488afd0ab23\") " pod="openstack/nova-api-0" Dec 03 07:02:26 crc kubenswrapper[4475]: I1203 07:02:26.172039 4475 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Dec 03 07:02:26 crc kubenswrapper[4475]: I1203 07:02:26.603272 4475 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Dec 03 07:02:26 crc kubenswrapper[4475]: W1203 07:02:26.611036 4475 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podda67b3db_e99c_4011_8071_0488afd0ab23.slice/crio-5ea1bf8e510b3b8e920bbf52c0b3e6f788add4fc8c0127884ff459dd680498a3 WatchSource:0}: Error finding container 5ea1bf8e510b3b8e920bbf52c0b3e6f788add4fc8c0127884ff459dd680498a3: Status 404 returned error can't find the container with id 5ea1bf8e510b3b8e920bbf52c0b3e6f788add4fc8c0127884ff459dd680498a3 Dec 03 07:02:26 crc kubenswrapper[4475]: I1203 07:02:26.805727 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"da67b3db-e99c-4011-8071-0488afd0ab23","Type":"ContainerStarted","Data":"a5d7cad60fd1a5dea546fe23eb52935b1ea09838d3ca514f997d77ffd0720d8b"} Dec 03 07:02:26 crc kubenswrapper[4475]: I1203 07:02:26.805907 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" 
event={"ID":"da67b3db-e99c-4011-8071-0488afd0ab23","Type":"ContainerStarted","Data":"5ea1bf8e510b3b8e920bbf52c0b3e6f788add4fc8c0127884ff459dd680498a3"} Dec 03 07:02:26 crc kubenswrapper[4475]: I1203 07:02:26.807839 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"a00cc5ea-a8e3-4936-b9cc-dc1bd20b7407","Type":"ContainerStarted","Data":"242388a9400bf5f290f7b6516d79f67e63d8af0a2c86e7bd7591850d56f47887"} Dec 03 07:02:26 crc kubenswrapper[4475]: I1203 07:02:26.809090 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"daf693e8-1494-4da1-afdd-2cd6dbef665d","Type":"ContainerStarted","Data":"72b4345cea1c1e15e8d6caa3308a752f06bbc7dfcac445cc393a33540724d882"} Dec 03 07:02:26 crc kubenswrapper[4475]: I1203 07:02:26.827541 4475 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-scheduler-0" podStartSLOduration=2.827522245 podStartE2EDuration="2.827522245s" podCreationTimestamp="2025-12-03 07:02:24 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 07:02:26.824571704 +0000 UTC m=+1031.629470038" watchObservedRunningTime="2025-12-03 07:02:26.827522245 +0000 UTC m=+1031.632420578" Dec 03 07:02:27 crc kubenswrapper[4475]: I1203 07:02:27.191863 4475 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell1-conductor-0" Dec 03 07:02:27 crc kubenswrapper[4475]: I1203 07:02:27.499850 4475 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1091a855-d763-4e97-aa03-66e64d9aae45" path="/var/lib/kubelet/pods/1091a855-d763-4e97-aa03-66e64d9aae45/volumes" Dec 03 07:02:27 crc kubenswrapper[4475]: I1203 07:02:27.817540 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" 
event={"ID":"a00cc5ea-a8e3-4936-b9cc-dc1bd20b7407","Type":"ContainerStarted","Data":"050203aa2a2be1328b0ccbec346783373ed0bc8d88f7a10b3737a95fbe4b4216"} Dec 03 07:02:27 crc kubenswrapper[4475]: I1203 07:02:27.818301 4475 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Dec 03 07:02:27 crc kubenswrapper[4475]: I1203 07:02:27.821578 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"da67b3db-e99c-4011-8071-0488afd0ab23","Type":"ContainerStarted","Data":"b06e9eb716a12048842f5de788a954e3ac022b377bbd45384eb486852c4f7ee7"} Dec 03 07:02:27 crc kubenswrapper[4475]: I1203 07:02:27.840598 4475 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.119905973 podStartE2EDuration="5.840580941s" podCreationTimestamp="2025-12-03 07:02:22 +0000 UTC" firstStartedPulling="2025-12-03 07:02:23.562286376 +0000 UTC m=+1028.367184710" lastFinishedPulling="2025-12-03 07:02:27.282961345 +0000 UTC m=+1032.087859678" observedRunningTime="2025-12-03 07:02:27.838331029 +0000 UTC m=+1032.643229373" watchObservedRunningTime="2025-12-03 07:02:27.840580941 +0000 UTC m=+1032.645479275" Dec 03 07:02:27 crc kubenswrapper[4475]: I1203 07:02:27.859044 4475 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=2.859030686 podStartE2EDuration="2.859030686s" podCreationTimestamp="2025-12-03 07:02:25 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 07:02:27.852923327 +0000 UTC m=+1032.657821661" watchObservedRunningTime="2025-12-03 07:02:27.859030686 +0000 UTC m=+1032.663929020" Dec 03 07:02:28 crc kubenswrapper[4475]: I1203 07:02:28.170902 4475 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Dec 03 07:02:28 crc kubenswrapper[4475]: I1203 07:02:28.172030 
4475 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Dec 03 07:02:28 crc kubenswrapper[4475]: I1203 07:02:28.933345 4475 patch_prober.go:28] interesting pod/machine-config-daemon-tjbzg container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 03 07:02:28 crc kubenswrapper[4475]: I1203 07:02:28.933590 4475 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-tjbzg" podUID="91aee7be-4a52-4598-803f-2deebe0674de" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 03 07:02:30 crc kubenswrapper[4475]: I1203 07:02:30.163018 4475 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-scheduler-0" Dec 03 07:02:32 crc kubenswrapper[4475]: E1203 07:02:32.468826 4475 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode6e2659c_bdb8_4015_b749_9d1bfd70620f.slice/crio-f38adfa30fd4ebb78cee694486062d19d33d13cf2c50659bb9a0ff66b6d44c52\": RecentStats: unable to find data in memory cache]" Dec 03 07:02:33 crc kubenswrapper[4475]: I1203 07:02:33.170588 4475 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Dec 03 07:02:33 crc kubenswrapper[4475]: I1203 07:02:33.170644 4475 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Dec 03 07:02:34 crc kubenswrapper[4475]: I1203 07:02:34.425615 4475 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="b52f22c6-66a3-4f58-aff9-40feb4dcbf8f" containerName="nova-metadata-log" 
probeResult="failure" output="Get \"https://10.217.0.207:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Dec 03 07:02:34 crc kubenswrapper[4475]: I1203 07:02:34.425608 4475 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="b52f22c6-66a3-4f58-aff9-40feb4dcbf8f" containerName="nova-metadata-metadata" probeResult="failure" output="Get \"https://10.217.0.207:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Dec 03 07:02:35 crc kubenswrapper[4475]: I1203 07:02:35.160827 4475 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-scheduler-0" Dec 03 07:02:35 crc kubenswrapper[4475]: I1203 07:02:35.188105 4475 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-scheduler-0" Dec 03 07:02:35 crc kubenswrapper[4475]: I1203 07:02:35.915268 4475 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-scheduler-0" Dec 03 07:02:36 crc kubenswrapper[4475]: I1203 07:02:36.172393 4475 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Dec 03 07:02:36 crc kubenswrapper[4475]: I1203 07:02:36.172439 4475 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Dec 03 07:02:37 crc kubenswrapper[4475]: I1203 07:02:37.256637 4475 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="da67b3db-e99c-4011-8071-0488afd0ab23" containerName="nova-api-api" probeResult="failure" output="Get \"http://10.217.0.209:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Dec 03 07:02:37 crc kubenswrapper[4475]: I1203 07:02:37.258281 4475 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="da67b3db-e99c-4011-8071-0488afd0ab23" containerName="nova-api-log" probeResult="failure" output="Get 
\"http://10.217.0.209:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Dec 03 07:02:42 crc kubenswrapper[4475]: E1203 07:02:42.671843 4475 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode6e2659c_bdb8_4015_b749_9d1bfd70620f.slice/crio-f38adfa30fd4ebb78cee694486062d19d33d13cf2c50659bb9a0ff66b6d44c52\": RecentStats: unable to find data in memory cache]" Dec 03 07:02:43 crc kubenswrapper[4475]: I1203 07:02:43.174599 4475 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0" Dec 03 07:02:43 crc kubenswrapper[4475]: I1203 07:02:43.176278 4475 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0" Dec 03 07:02:43 crc kubenswrapper[4475]: I1203 07:02:43.178735 4475 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0" Dec 03 07:02:43 crc kubenswrapper[4475]: I1203 07:02:43.951123 4475 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0" Dec 03 07:02:45 crc kubenswrapper[4475]: I1203 07:02:45.961999 4475 generic.go:334] "Generic (PLEG): container finished" podID="52e2f226-298b-4a85-98a5-abbb1a72320f" containerID="2ed38d76695ea0fe62cd693f8f4977e088e10bb5abd97b20bb2c778f25c5dcec" exitCode=137 Dec 03 07:02:45 crc kubenswrapper[4475]: I1203 07:02:45.962080 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"52e2f226-298b-4a85-98a5-abbb1a72320f","Type":"ContainerDied","Data":"2ed38d76695ea0fe62cd693f8f4977e088e10bb5abd97b20bb2c778f25c5dcec"} Dec 03 07:02:45 crc kubenswrapper[4475]: I1203 07:02:45.962508 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" 
event={"ID":"52e2f226-298b-4a85-98a5-abbb1a72320f","Type":"ContainerDied","Data":"2974e92edfb0b79f84a191e3abd2e03c1824eb064edbd93f454abd7fe11ff3ba"} Dec 03 07:02:45 crc kubenswrapper[4475]: I1203 07:02:45.962528 4475 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2974e92edfb0b79f84a191e3abd2e03c1824eb064edbd93f454abd7fe11ff3ba" Dec 03 07:02:45 crc kubenswrapper[4475]: I1203 07:02:45.969306 4475 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Dec 03 07:02:46 crc kubenswrapper[4475]: I1203 07:02:46.141161 4475 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/52e2f226-298b-4a85-98a5-abbb1a72320f-config-data\") pod \"52e2f226-298b-4a85-98a5-abbb1a72320f\" (UID: \"52e2f226-298b-4a85-98a5-abbb1a72320f\") " Dec 03 07:02:46 crc kubenswrapper[4475]: I1203 07:02:46.141294 4475 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/52e2f226-298b-4a85-98a5-abbb1a72320f-combined-ca-bundle\") pod \"52e2f226-298b-4a85-98a5-abbb1a72320f\" (UID: \"52e2f226-298b-4a85-98a5-abbb1a72320f\") " Dec 03 07:02:46 crc kubenswrapper[4475]: I1203 07:02:46.141314 4475 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8wzbf\" (UniqueName: \"kubernetes.io/projected/52e2f226-298b-4a85-98a5-abbb1a72320f-kube-api-access-8wzbf\") pod \"52e2f226-298b-4a85-98a5-abbb1a72320f\" (UID: \"52e2f226-298b-4a85-98a5-abbb1a72320f\") " Dec 03 07:02:46 crc kubenswrapper[4475]: I1203 07:02:46.148643 4475 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/52e2f226-298b-4a85-98a5-abbb1a72320f-kube-api-access-8wzbf" (OuterVolumeSpecName: "kube-api-access-8wzbf") pod "52e2f226-298b-4a85-98a5-abbb1a72320f" (UID: "52e2f226-298b-4a85-98a5-abbb1a72320f"). 
InnerVolumeSpecName "kube-api-access-8wzbf". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 07:02:46 crc kubenswrapper[4475]: I1203 07:02:46.164790 4475 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/52e2f226-298b-4a85-98a5-abbb1a72320f-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "52e2f226-298b-4a85-98a5-abbb1a72320f" (UID: "52e2f226-298b-4a85-98a5-abbb1a72320f"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 07:02:46 crc kubenswrapper[4475]: I1203 07:02:46.167595 4475 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/52e2f226-298b-4a85-98a5-abbb1a72320f-config-data" (OuterVolumeSpecName: "config-data") pod "52e2f226-298b-4a85-98a5-abbb1a72320f" (UID: "52e2f226-298b-4a85-98a5-abbb1a72320f"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 07:02:46 crc kubenswrapper[4475]: I1203 07:02:46.177942 4475 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Dec 03 07:02:46 crc kubenswrapper[4475]: I1203 07:02:46.178364 4475 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Dec 03 07:02:46 crc kubenswrapper[4475]: I1203 07:02:46.180435 4475 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Dec 03 07:02:46 crc kubenswrapper[4475]: I1203 07:02:46.181784 4475 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0" Dec 03 07:02:46 crc kubenswrapper[4475]: I1203 07:02:46.243843 4475 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/52e2f226-298b-4a85-98a5-abbb1a72320f-config-data\") on node \"crc\" DevicePath \"\"" Dec 03 07:02:46 crc kubenswrapper[4475]: I1203 07:02:46.243872 4475 reconciler_common.go:293] "Volume detached 
for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/52e2f226-298b-4a85-98a5-abbb1a72320f-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 03 07:02:46 crc kubenswrapper[4475]: I1203 07:02:46.243911 4475 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8wzbf\" (UniqueName: \"kubernetes.io/projected/52e2f226-298b-4a85-98a5-abbb1a72320f-kube-api-access-8wzbf\") on node \"crc\" DevicePath \"\"" Dec 03 07:02:46 crc kubenswrapper[4475]: I1203 07:02:46.968871 4475 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Dec 03 07:02:46 crc kubenswrapper[4475]: I1203 07:02:46.969280 4475 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Dec 03 07:02:46 crc kubenswrapper[4475]: I1203 07:02:46.972651 4475 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0" Dec 03 07:02:47 crc kubenswrapper[4475]: I1203 07:02:47.007175 4475 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Dec 03 07:02:47 crc kubenswrapper[4475]: I1203 07:02:47.014584 4475 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Dec 03 07:02:47 crc kubenswrapper[4475]: I1203 07:02:47.022927 4475 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Dec 03 07:02:47 crc kubenswrapper[4475]: E1203 07:02:47.023296 4475 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="52e2f226-298b-4a85-98a5-abbb1a72320f" containerName="nova-cell1-novncproxy-novncproxy" Dec 03 07:02:47 crc kubenswrapper[4475]: I1203 07:02:47.023309 4475 state_mem.go:107] "Deleted CPUSet assignment" podUID="52e2f226-298b-4a85-98a5-abbb1a72320f" containerName="nova-cell1-novncproxy-novncproxy" Dec 03 07:02:47 crc kubenswrapper[4475]: I1203 07:02:47.023744 4475 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="52e2f226-298b-4a85-98a5-abbb1a72320f" containerName="nova-cell1-novncproxy-novncproxy" Dec 03 07:02:47 crc kubenswrapper[4475]: I1203 07:02:47.024333 4475 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Dec 03 07:02:47 crc kubenswrapper[4475]: I1203 07:02:47.027427 4475 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-novncproxy-config-data" Dec 03 07:02:47 crc kubenswrapper[4475]: I1203 07:02:47.027696 4475 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-novncproxy-cell1-vencrypt" Dec 03 07:02:47 crc kubenswrapper[4475]: I1203 07:02:47.029096 4475 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-novncproxy-cell1-public-svc" Dec 03 07:02:47 crc kubenswrapper[4475]: I1203 07:02:47.035010 4475 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Dec 03 07:02:47 crc kubenswrapper[4475]: I1203 07:02:47.154857 4475 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-57bcd77bb7-6bvxl"] Dec 03 07:02:47 crc kubenswrapper[4475]: I1203 07:02:47.156548 4475 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-57bcd77bb7-6bvxl" Dec 03 07:02:47 crc kubenswrapper[4475]: I1203 07:02:47.158789 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kffsj\" (UniqueName: \"kubernetes.io/projected/324dfdb8-987b-4a87-a2ca-ab044deec5bc-kube-api-access-kffsj\") pod \"nova-cell1-novncproxy-0\" (UID: \"324dfdb8-987b-4a87-a2ca-ab044deec5bc\") " pod="openstack/nova-cell1-novncproxy-0" Dec 03 07:02:47 crc kubenswrapper[4475]: I1203 07:02:47.158839 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/324dfdb8-987b-4a87-a2ca-ab044deec5bc-nova-novncproxy-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"324dfdb8-987b-4a87-a2ca-ab044deec5bc\") " pod="openstack/nova-cell1-novncproxy-0" Dec 03 07:02:47 crc kubenswrapper[4475]: I1203 07:02:47.158861 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/324dfdb8-987b-4a87-a2ca-ab044deec5bc-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"324dfdb8-987b-4a87-a2ca-ab044deec5bc\") " pod="openstack/nova-cell1-novncproxy-0" Dec 03 07:02:47 crc kubenswrapper[4475]: I1203 07:02:47.158885 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/327cade9-5417-4841-8984-eb98c916f0c1-config\") pod \"dnsmasq-dns-57bcd77bb7-6bvxl\" (UID: \"327cade9-5417-4841-8984-eb98c916f0c1\") " pod="openstack/dnsmasq-dns-57bcd77bb7-6bvxl" Dec 03 07:02:47 crc kubenswrapper[4475]: I1203 07:02:47.158902 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p576g\" (UniqueName: \"kubernetes.io/projected/327cade9-5417-4841-8984-eb98c916f0c1-kube-api-access-p576g\") pod 
\"dnsmasq-dns-57bcd77bb7-6bvxl\" (UID: \"327cade9-5417-4841-8984-eb98c916f0c1\") " pod="openstack/dnsmasq-dns-57bcd77bb7-6bvxl" Dec 03 07:02:47 crc kubenswrapper[4475]: I1203 07:02:47.158917 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/327cade9-5417-4841-8984-eb98c916f0c1-dns-swift-storage-0\") pod \"dnsmasq-dns-57bcd77bb7-6bvxl\" (UID: \"327cade9-5417-4841-8984-eb98c916f0c1\") " pod="openstack/dnsmasq-dns-57bcd77bb7-6bvxl" Dec 03 07:02:47 crc kubenswrapper[4475]: I1203 07:02:47.158955 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/327cade9-5417-4841-8984-eb98c916f0c1-dns-svc\") pod \"dnsmasq-dns-57bcd77bb7-6bvxl\" (UID: \"327cade9-5417-4841-8984-eb98c916f0c1\") " pod="openstack/dnsmasq-dns-57bcd77bb7-6bvxl" Dec 03 07:02:47 crc kubenswrapper[4475]: I1203 07:02:47.158982 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/327cade9-5417-4841-8984-eb98c916f0c1-ovsdbserver-nb\") pod \"dnsmasq-dns-57bcd77bb7-6bvxl\" (UID: \"327cade9-5417-4841-8984-eb98c916f0c1\") " pod="openstack/dnsmasq-dns-57bcd77bb7-6bvxl" Dec 03 07:02:47 crc kubenswrapper[4475]: I1203 07:02:47.159048 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/324dfdb8-987b-4a87-a2ca-ab044deec5bc-vencrypt-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"324dfdb8-987b-4a87-a2ca-ab044deec5bc\") " pod="openstack/nova-cell1-novncproxy-0" Dec 03 07:02:47 crc kubenswrapper[4475]: I1203 07:02:47.159077 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/324dfdb8-987b-4a87-a2ca-ab044deec5bc-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"324dfdb8-987b-4a87-a2ca-ab044deec5bc\") " pod="openstack/nova-cell1-novncproxy-0" Dec 03 07:02:47 crc kubenswrapper[4475]: I1203 07:02:47.159110 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/327cade9-5417-4841-8984-eb98c916f0c1-ovsdbserver-sb\") pod \"dnsmasq-dns-57bcd77bb7-6bvxl\" (UID: \"327cade9-5417-4841-8984-eb98c916f0c1\") " pod="openstack/dnsmasq-dns-57bcd77bb7-6bvxl" Dec 03 07:02:47 crc kubenswrapper[4475]: I1203 07:02:47.184402 4475 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-57bcd77bb7-6bvxl"] Dec 03 07:02:47 crc kubenswrapper[4475]: I1203 07:02:47.266335 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kffsj\" (UniqueName: \"kubernetes.io/projected/324dfdb8-987b-4a87-a2ca-ab044deec5bc-kube-api-access-kffsj\") pod \"nova-cell1-novncproxy-0\" (UID: \"324dfdb8-987b-4a87-a2ca-ab044deec5bc\") " pod="openstack/nova-cell1-novncproxy-0" Dec 03 07:02:47 crc kubenswrapper[4475]: I1203 07:02:47.266393 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/324dfdb8-987b-4a87-a2ca-ab044deec5bc-nova-novncproxy-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"324dfdb8-987b-4a87-a2ca-ab044deec5bc\") " pod="openstack/nova-cell1-novncproxy-0" Dec 03 07:02:47 crc kubenswrapper[4475]: I1203 07:02:47.266424 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/324dfdb8-987b-4a87-a2ca-ab044deec5bc-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"324dfdb8-987b-4a87-a2ca-ab044deec5bc\") " pod="openstack/nova-cell1-novncproxy-0" Dec 03 07:02:47 crc kubenswrapper[4475]: I1203 07:02:47.266480 
4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/327cade9-5417-4841-8984-eb98c916f0c1-config\") pod \"dnsmasq-dns-57bcd77bb7-6bvxl\" (UID: \"327cade9-5417-4841-8984-eb98c916f0c1\") " pod="openstack/dnsmasq-dns-57bcd77bb7-6bvxl" Dec 03 07:02:47 crc kubenswrapper[4475]: I1203 07:02:47.266499 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-p576g\" (UniqueName: \"kubernetes.io/projected/327cade9-5417-4841-8984-eb98c916f0c1-kube-api-access-p576g\") pod \"dnsmasq-dns-57bcd77bb7-6bvxl\" (UID: \"327cade9-5417-4841-8984-eb98c916f0c1\") " pod="openstack/dnsmasq-dns-57bcd77bb7-6bvxl" Dec 03 07:02:47 crc kubenswrapper[4475]: I1203 07:02:47.266522 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/327cade9-5417-4841-8984-eb98c916f0c1-dns-swift-storage-0\") pod \"dnsmasq-dns-57bcd77bb7-6bvxl\" (UID: \"327cade9-5417-4841-8984-eb98c916f0c1\") " pod="openstack/dnsmasq-dns-57bcd77bb7-6bvxl" Dec 03 07:02:47 crc kubenswrapper[4475]: I1203 07:02:47.266598 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/327cade9-5417-4841-8984-eb98c916f0c1-dns-svc\") pod \"dnsmasq-dns-57bcd77bb7-6bvxl\" (UID: \"327cade9-5417-4841-8984-eb98c916f0c1\") " pod="openstack/dnsmasq-dns-57bcd77bb7-6bvxl" Dec 03 07:02:47 crc kubenswrapper[4475]: I1203 07:02:47.266617 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/327cade9-5417-4841-8984-eb98c916f0c1-ovsdbserver-nb\") pod \"dnsmasq-dns-57bcd77bb7-6bvxl\" (UID: \"327cade9-5417-4841-8984-eb98c916f0c1\") " pod="openstack/dnsmasq-dns-57bcd77bb7-6bvxl" Dec 03 07:02:47 crc kubenswrapper[4475]: I1203 07:02:47.266730 4475 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/324dfdb8-987b-4a87-a2ca-ab044deec5bc-vencrypt-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"324dfdb8-987b-4a87-a2ca-ab044deec5bc\") " pod="openstack/nova-cell1-novncproxy-0" Dec 03 07:02:47 crc kubenswrapper[4475]: I1203 07:02:47.266774 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/324dfdb8-987b-4a87-a2ca-ab044deec5bc-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"324dfdb8-987b-4a87-a2ca-ab044deec5bc\") " pod="openstack/nova-cell1-novncproxy-0" Dec 03 07:02:47 crc kubenswrapper[4475]: I1203 07:02:47.266810 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/327cade9-5417-4841-8984-eb98c916f0c1-ovsdbserver-sb\") pod \"dnsmasq-dns-57bcd77bb7-6bvxl\" (UID: \"327cade9-5417-4841-8984-eb98c916f0c1\") " pod="openstack/dnsmasq-dns-57bcd77bb7-6bvxl" Dec 03 07:02:47 crc kubenswrapper[4475]: I1203 07:02:47.267630 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/327cade9-5417-4841-8984-eb98c916f0c1-ovsdbserver-sb\") pod \"dnsmasq-dns-57bcd77bb7-6bvxl\" (UID: \"327cade9-5417-4841-8984-eb98c916f0c1\") " pod="openstack/dnsmasq-dns-57bcd77bb7-6bvxl" Dec 03 07:02:47 crc kubenswrapper[4475]: I1203 07:02:47.268943 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/327cade9-5417-4841-8984-eb98c916f0c1-ovsdbserver-nb\") pod \"dnsmasq-dns-57bcd77bb7-6bvxl\" (UID: \"327cade9-5417-4841-8984-eb98c916f0c1\") " pod="openstack/dnsmasq-dns-57bcd77bb7-6bvxl" Dec 03 07:02:47 crc kubenswrapper[4475]: I1203 07:02:47.268982 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: 
\"kubernetes.io/configmap/327cade9-5417-4841-8984-eb98c916f0c1-dns-swift-storage-0\") pod \"dnsmasq-dns-57bcd77bb7-6bvxl\" (UID: \"327cade9-5417-4841-8984-eb98c916f0c1\") " pod="openstack/dnsmasq-dns-57bcd77bb7-6bvxl" Dec 03 07:02:47 crc kubenswrapper[4475]: I1203 07:02:47.269475 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/327cade9-5417-4841-8984-eb98c916f0c1-dns-svc\") pod \"dnsmasq-dns-57bcd77bb7-6bvxl\" (UID: \"327cade9-5417-4841-8984-eb98c916f0c1\") " pod="openstack/dnsmasq-dns-57bcd77bb7-6bvxl" Dec 03 07:02:47 crc kubenswrapper[4475]: I1203 07:02:47.272381 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/324dfdb8-987b-4a87-a2ca-ab044deec5bc-nova-novncproxy-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"324dfdb8-987b-4a87-a2ca-ab044deec5bc\") " pod="openstack/nova-cell1-novncproxy-0" Dec 03 07:02:47 crc kubenswrapper[4475]: I1203 07:02:47.272731 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/327cade9-5417-4841-8984-eb98c916f0c1-config\") pod \"dnsmasq-dns-57bcd77bb7-6bvxl\" (UID: \"327cade9-5417-4841-8984-eb98c916f0c1\") " pod="openstack/dnsmasq-dns-57bcd77bb7-6bvxl" Dec 03 07:02:47 crc kubenswrapper[4475]: I1203 07:02:47.272960 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/324dfdb8-987b-4a87-a2ca-ab044deec5bc-vencrypt-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"324dfdb8-987b-4a87-a2ca-ab044deec5bc\") " pod="openstack/nova-cell1-novncproxy-0" Dec 03 07:02:47 crc kubenswrapper[4475]: I1203 07:02:47.282772 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/324dfdb8-987b-4a87-a2ca-ab044deec5bc-combined-ca-bundle\") pod 
\"nova-cell1-novncproxy-0\" (UID: \"324dfdb8-987b-4a87-a2ca-ab044deec5bc\") " pod="openstack/nova-cell1-novncproxy-0" Dec 03 07:02:47 crc kubenswrapper[4475]: I1203 07:02:47.286748 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/324dfdb8-987b-4a87-a2ca-ab044deec5bc-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"324dfdb8-987b-4a87-a2ca-ab044deec5bc\") " pod="openstack/nova-cell1-novncproxy-0" Dec 03 07:02:47 crc kubenswrapper[4475]: I1203 07:02:47.288583 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kffsj\" (UniqueName: \"kubernetes.io/projected/324dfdb8-987b-4a87-a2ca-ab044deec5bc-kube-api-access-kffsj\") pod \"nova-cell1-novncproxy-0\" (UID: \"324dfdb8-987b-4a87-a2ca-ab044deec5bc\") " pod="openstack/nova-cell1-novncproxy-0" Dec 03 07:02:47 crc kubenswrapper[4475]: I1203 07:02:47.296505 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-p576g\" (UniqueName: \"kubernetes.io/projected/327cade9-5417-4841-8984-eb98c916f0c1-kube-api-access-p576g\") pod \"dnsmasq-dns-57bcd77bb7-6bvxl\" (UID: \"327cade9-5417-4841-8984-eb98c916f0c1\") " pod="openstack/dnsmasq-dns-57bcd77bb7-6bvxl" Dec 03 07:02:47 crc kubenswrapper[4475]: I1203 07:02:47.353301 4475 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Dec 03 07:02:47 crc kubenswrapper[4475]: I1203 07:02:47.484439 4475 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-57bcd77bb7-6bvxl" Dec 03 07:02:47 crc kubenswrapper[4475]: I1203 07:02:47.553956 4475 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="52e2f226-298b-4a85-98a5-abbb1a72320f" path="/var/lib/kubelet/pods/52e2f226-298b-4a85-98a5-abbb1a72320f/volumes" Dec 03 07:02:47 crc kubenswrapper[4475]: I1203 07:02:47.795150 4475 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Dec 03 07:02:47 crc kubenswrapper[4475]: I1203 07:02:47.976873 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"324dfdb8-987b-4a87-a2ca-ab044deec5bc","Type":"ContainerStarted","Data":"b45fa9fcb8dda96cb6153a87fbc8cd81e22f09a056f0a16d8a21246439fb5cab"} Dec 03 07:02:47 crc kubenswrapper[4475]: I1203 07:02:47.977044 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"324dfdb8-987b-4a87-a2ca-ab044deec5bc","Type":"ContainerStarted","Data":"1db79ce98462b54e3952c6fd2ed817cf21b9b07ddb214b96b8ec6f3c08de1596"} Dec 03 07:02:47 crc kubenswrapper[4475]: I1203 07:02:47.986189 4475 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-57bcd77bb7-6bvxl"] Dec 03 07:02:48 crc kubenswrapper[4475]: I1203 07:02:48.001672 4475 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-novncproxy-0" podStartSLOduration=1.001656281 podStartE2EDuration="1.001656281s" podCreationTimestamp="2025-12-03 07:02:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 07:02:47.997059033 +0000 UTC m=+1052.801957367" watchObservedRunningTime="2025-12-03 07:02:48.001656281 +0000 UTC m=+1052.806554625" Dec 03 07:02:48 crc kubenswrapper[4475]: I1203 07:02:48.986233 4475 generic.go:334] "Generic (PLEG): container finished" podID="327cade9-5417-4841-8984-eb98c916f0c1" 
containerID="0b635505b253c154fad2b8e4a664f2f5302acc860e0c6b6459fbb74f33bc8bdb" exitCode=0 Dec 03 07:02:48 crc kubenswrapper[4475]: I1203 07:02:48.986338 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-57bcd77bb7-6bvxl" event={"ID":"327cade9-5417-4841-8984-eb98c916f0c1","Type":"ContainerDied","Data":"0b635505b253c154fad2b8e4a664f2f5302acc860e0c6b6459fbb74f33bc8bdb"} Dec 03 07:02:48 crc kubenswrapper[4475]: I1203 07:02:48.986509 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-57bcd77bb7-6bvxl" event={"ID":"327cade9-5417-4841-8984-eb98c916f0c1","Type":"ContainerStarted","Data":"fbe464ee9522c2ec52ae8a51376ce558ee38db66045734d02b4bc32d2adf6324"} Dec 03 07:02:49 crc kubenswrapper[4475]: I1203 07:02:49.314867 4475 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Dec 03 07:02:49 crc kubenswrapper[4475]: I1203 07:02:49.420655 4475 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Dec 03 07:02:49 crc kubenswrapper[4475]: I1203 07:02:49.420938 4475 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="a00cc5ea-a8e3-4936-b9cc-dc1bd20b7407" containerName="ceilometer-central-agent" containerID="cri-o://1c44707148557c46d7f9b0d05553f381cdc59ec68eb4d95e31e2dae4d9d22b3a" gracePeriod=30 Dec 03 07:02:49 crc kubenswrapper[4475]: I1203 07:02:49.421008 4475 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="a00cc5ea-a8e3-4936-b9cc-dc1bd20b7407" containerName="proxy-httpd" containerID="cri-o://050203aa2a2be1328b0ccbec346783373ed0bc8d88f7a10b3737a95fbe4b4216" gracePeriod=30 Dec 03 07:02:49 crc kubenswrapper[4475]: I1203 07:02:49.421131 4475 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="a00cc5ea-a8e3-4936-b9cc-dc1bd20b7407" containerName="sg-core" 
containerID="cri-o://242388a9400bf5f290f7b6516d79f67e63d8af0a2c86e7bd7591850d56f47887" gracePeriod=30 Dec 03 07:02:49 crc kubenswrapper[4475]: I1203 07:02:49.421183 4475 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="a00cc5ea-a8e3-4936-b9cc-dc1bd20b7407" containerName="ceilometer-notification-agent" containerID="cri-o://5361a2f551d6debb410f56107ce30cec6539acd8ae8951949d9e9fb080244c23" gracePeriod=30 Dec 03 07:02:49 crc kubenswrapper[4475]: I1203 07:02:49.432210 4475 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/ceilometer-0" podUID="a00cc5ea-a8e3-4936-b9cc-dc1bd20b7407" containerName="proxy-httpd" probeResult="failure" output="Get \"http://10.217.0.206:3000/\": EOF" Dec 03 07:02:49 crc kubenswrapper[4475]: I1203 07:02:49.997147 4475 generic.go:334] "Generic (PLEG): container finished" podID="a00cc5ea-a8e3-4936-b9cc-dc1bd20b7407" containerID="050203aa2a2be1328b0ccbec346783373ed0bc8d88f7a10b3737a95fbe4b4216" exitCode=0 Dec 03 07:02:49 crc kubenswrapper[4475]: I1203 07:02:49.997365 4475 generic.go:334] "Generic (PLEG): container finished" podID="a00cc5ea-a8e3-4936-b9cc-dc1bd20b7407" containerID="242388a9400bf5f290f7b6516d79f67e63d8af0a2c86e7bd7591850d56f47887" exitCode=2 Dec 03 07:02:49 crc kubenswrapper[4475]: I1203 07:02:49.997374 4475 generic.go:334] "Generic (PLEG): container finished" podID="a00cc5ea-a8e3-4936-b9cc-dc1bd20b7407" containerID="1c44707148557c46d7f9b0d05553f381cdc59ec68eb4d95e31e2dae4d9d22b3a" exitCode=0 Dec 03 07:02:49 crc kubenswrapper[4475]: I1203 07:02:49.997215 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"a00cc5ea-a8e3-4936-b9cc-dc1bd20b7407","Type":"ContainerDied","Data":"050203aa2a2be1328b0ccbec346783373ed0bc8d88f7a10b3737a95fbe4b4216"} Dec 03 07:02:49 crc kubenswrapper[4475]: I1203 07:02:49.997429 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" 
event={"ID":"a00cc5ea-a8e3-4936-b9cc-dc1bd20b7407","Type":"ContainerDied","Data":"242388a9400bf5f290f7b6516d79f67e63d8af0a2c86e7bd7591850d56f47887"} Dec 03 07:02:49 crc kubenswrapper[4475]: I1203 07:02:49.997444 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"a00cc5ea-a8e3-4936-b9cc-dc1bd20b7407","Type":"ContainerDied","Data":"1c44707148557c46d7f9b0d05553f381cdc59ec68eb4d95e31e2dae4d9d22b3a"} Dec 03 07:02:49 crc kubenswrapper[4475]: I1203 07:02:49.999645 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-57bcd77bb7-6bvxl" event={"ID":"327cade9-5417-4841-8984-eb98c916f0c1","Type":"ContainerStarted","Data":"7d8962a7a160f903125c5e0d0964f3c4f8df7cc02ae8230371b00a875a168f36"} Dec 03 07:02:49 crc kubenswrapper[4475]: I1203 07:02:49.999822 4475 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="da67b3db-e99c-4011-8071-0488afd0ab23" containerName="nova-api-api" containerID="cri-o://b06e9eb716a12048842f5de788a954e3ac022b377bbd45384eb486852c4f7ee7" gracePeriod=30 Dec 03 07:02:50 crc kubenswrapper[4475]: I1203 07:02:50.000047 4475 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="da67b3db-e99c-4011-8071-0488afd0ab23" containerName="nova-api-log" containerID="cri-o://a5d7cad60fd1a5dea546fe23eb52935b1ea09838d3ca514f997d77ffd0720d8b" gracePeriod=30 Dec 03 07:02:50 crc kubenswrapper[4475]: I1203 07:02:50.020392 4475 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-57bcd77bb7-6bvxl" podStartSLOduration=3.020378083 podStartE2EDuration="3.020378083s" podCreationTimestamp="2025-12-03 07:02:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 07:02:50.015364462 +0000 UTC m=+1054.820262806" watchObservedRunningTime="2025-12-03 07:02:50.020378083 +0000 UTC 
m=+1054.825276417" Dec 03 07:02:50 crc kubenswrapper[4475]: I1203 07:02:50.659637 4475 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Dec 03 07:02:50 crc kubenswrapper[4475]: I1203 07:02:50.844203 4475 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/a00cc5ea-a8e3-4936-b9cc-dc1bd20b7407-log-httpd\") pod \"a00cc5ea-a8e3-4936-b9cc-dc1bd20b7407\" (UID: \"a00cc5ea-a8e3-4936-b9cc-dc1bd20b7407\") " Dec 03 07:02:50 crc kubenswrapper[4475]: I1203 07:02:50.844295 4475 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a00cc5ea-a8e3-4936-b9cc-dc1bd20b7407-scripts\") pod \"a00cc5ea-a8e3-4936-b9cc-dc1bd20b7407\" (UID: \"a00cc5ea-a8e3-4936-b9cc-dc1bd20b7407\") " Dec 03 07:02:50 crc kubenswrapper[4475]: I1203 07:02:50.844665 4475 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a00cc5ea-a8e3-4936-b9cc-dc1bd20b7407-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "a00cc5ea-a8e3-4936-b9cc-dc1bd20b7407" (UID: "a00cc5ea-a8e3-4936-b9cc-dc1bd20b7407"). InnerVolumeSpecName "run-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 07:02:50 crc kubenswrapper[4475]: I1203 07:02:50.844401 4475 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/a00cc5ea-a8e3-4936-b9cc-dc1bd20b7407-run-httpd\") pod \"a00cc5ea-a8e3-4936-b9cc-dc1bd20b7407\" (UID: \"a00cc5ea-a8e3-4936-b9cc-dc1bd20b7407\") " Dec 03 07:02:50 crc kubenswrapper[4475]: I1203 07:02:50.844754 4475 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a00cc5ea-a8e3-4936-b9cc-dc1bd20b7407-config-data\") pod \"a00cc5ea-a8e3-4936-b9cc-dc1bd20b7407\" (UID: \"a00cc5ea-a8e3-4936-b9cc-dc1bd20b7407\") " Dec 03 07:02:50 crc kubenswrapper[4475]: I1203 07:02:50.844733 4475 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a00cc5ea-a8e3-4936-b9cc-dc1bd20b7407-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "a00cc5ea-a8e3-4936-b9cc-dc1bd20b7407" (UID: "a00cc5ea-a8e3-4936-b9cc-dc1bd20b7407"). InnerVolumeSpecName "log-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 07:02:50 crc kubenswrapper[4475]: I1203 07:02:50.844783 4475 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/a00cc5ea-a8e3-4936-b9cc-dc1bd20b7407-sg-core-conf-yaml\") pod \"a00cc5ea-a8e3-4936-b9cc-dc1bd20b7407\" (UID: \"a00cc5ea-a8e3-4936-b9cc-dc1bd20b7407\") " Dec 03 07:02:50 crc kubenswrapper[4475]: I1203 07:02:50.844840 4475 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-88lzf\" (UniqueName: \"kubernetes.io/projected/a00cc5ea-a8e3-4936-b9cc-dc1bd20b7407-kube-api-access-88lzf\") pod \"a00cc5ea-a8e3-4936-b9cc-dc1bd20b7407\" (UID: \"a00cc5ea-a8e3-4936-b9cc-dc1bd20b7407\") " Dec 03 07:02:50 crc kubenswrapper[4475]: I1203 07:02:50.845220 4475 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a00cc5ea-a8e3-4936-b9cc-dc1bd20b7407-combined-ca-bundle\") pod \"a00cc5ea-a8e3-4936-b9cc-dc1bd20b7407\" (UID: \"a00cc5ea-a8e3-4936-b9cc-dc1bd20b7407\") " Dec 03 07:02:50 crc kubenswrapper[4475]: I1203 07:02:50.845618 4475 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/a00cc5ea-a8e3-4936-b9cc-dc1bd20b7407-log-httpd\") on node \"crc\" DevicePath \"\"" Dec 03 07:02:50 crc kubenswrapper[4475]: I1203 07:02:50.845635 4475 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/a00cc5ea-a8e3-4936-b9cc-dc1bd20b7407-run-httpd\") on node \"crc\" DevicePath \"\"" Dec 03 07:02:50 crc kubenswrapper[4475]: I1203 07:02:50.851155 4475 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a00cc5ea-a8e3-4936-b9cc-dc1bd20b7407-kube-api-access-88lzf" (OuterVolumeSpecName: "kube-api-access-88lzf") pod "a00cc5ea-a8e3-4936-b9cc-dc1bd20b7407" (UID: 
"a00cc5ea-a8e3-4936-b9cc-dc1bd20b7407"). InnerVolumeSpecName "kube-api-access-88lzf". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 07:02:50 crc kubenswrapper[4475]: I1203 07:02:50.855561 4475 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a00cc5ea-a8e3-4936-b9cc-dc1bd20b7407-scripts" (OuterVolumeSpecName: "scripts") pod "a00cc5ea-a8e3-4936-b9cc-dc1bd20b7407" (UID: "a00cc5ea-a8e3-4936-b9cc-dc1bd20b7407"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 07:02:50 crc kubenswrapper[4475]: I1203 07:02:50.867979 4475 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a00cc5ea-a8e3-4936-b9cc-dc1bd20b7407-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "a00cc5ea-a8e3-4936-b9cc-dc1bd20b7407" (UID: "a00cc5ea-a8e3-4936-b9cc-dc1bd20b7407"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 07:02:50 crc kubenswrapper[4475]: I1203 07:02:50.929698 4475 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a00cc5ea-a8e3-4936-b9cc-dc1bd20b7407-config-data" (OuterVolumeSpecName: "config-data") pod "a00cc5ea-a8e3-4936-b9cc-dc1bd20b7407" (UID: "a00cc5ea-a8e3-4936-b9cc-dc1bd20b7407"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 07:02:50 crc kubenswrapper[4475]: I1203 07:02:50.934653 4475 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a00cc5ea-a8e3-4936-b9cc-dc1bd20b7407-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "a00cc5ea-a8e3-4936-b9cc-dc1bd20b7407" (UID: "a00cc5ea-a8e3-4936-b9cc-dc1bd20b7407"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 07:02:50 crc kubenswrapper[4475]: I1203 07:02:50.947083 4475 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a00cc5ea-a8e3-4936-b9cc-dc1bd20b7407-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 03 07:02:50 crc kubenswrapper[4475]: I1203 07:02:50.947120 4475 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a00cc5ea-a8e3-4936-b9cc-dc1bd20b7407-scripts\") on node \"crc\" DevicePath \"\"" Dec 03 07:02:50 crc kubenswrapper[4475]: I1203 07:02:50.947129 4475 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a00cc5ea-a8e3-4936-b9cc-dc1bd20b7407-config-data\") on node \"crc\" DevicePath \"\"" Dec 03 07:02:50 crc kubenswrapper[4475]: I1203 07:02:50.947160 4475 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/a00cc5ea-a8e3-4936-b9cc-dc1bd20b7407-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Dec 03 07:02:50 crc kubenswrapper[4475]: I1203 07:02:50.947170 4475 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-88lzf\" (UniqueName: \"kubernetes.io/projected/a00cc5ea-a8e3-4936-b9cc-dc1bd20b7407-kube-api-access-88lzf\") on node \"crc\" DevicePath \"\"" Dec 03 07:02:51 crc kubenswrapper[4475]: I1203 07:02:51.007911 4475 generic.go:334] "Generic (PLEG): container finished" podID="da67b3db-e99c-4011-8071-0488afd0ab23" containerID="a5d7cad60fd1a5dea546fe23eb52935b1ea09838d3ca514f997d77ffd0720d8b" exitCode=143 Dec 03 07:02:51 crc kubenswrapper[4475]: I1203 07:02:51.007969 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"da67b3db-e99c-4011-8071-0488afd0ab23","Type":"ContainerDied","Data":"a5d7cad60fd1a5dea546fe23eb52935b1ea09838d3ca514f997d77ffd0720d8b"} Dec 03 07:02:51 crc kubenswrapper[4475]: I1203 
07:02:51.009679 4475 generic.go:334] "Generic (PLEG): container finished" podID="a00cc5ea-a8e3-4936-b9cc-dc1bd20b7407" containerID="5361a2f551d6debb410f56107ce30cec6539acd8ae8951949d9e9fb080244c23" exitCode=0 Dec 03 07:02:51 crc kubenswrapper[4475]: I1203 07:02:51.010547 4475 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Dec 03 07:02:51 crc kubenswrapper[4475]: I1203 07:02:51.010558 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"a00cc5ea-a8e3-4936-b9cc-dc1bd20b7407","Type":"ContainerDied","Data":"5361a2f551d6debb410f56107ce30cec6539acd8ae8951949d9e9fb080244c23"} Dec 03 07:02:51 crc kubenswrapper[4475]: I1203 07:02:51.010780 4475 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-57bcd77bb7-6bvxl" Dec 03 07:02:51 crc kubenswrapper[4475]: I1203 07:02:51.010853 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"a00cc5ea-a8e3-4936-b9cc-dc1bd20b7407","Type":"ContainerDied","Data":"2007442f7b5b1ad6ae76ece314a6829d4f71c5ddf18bc6f0a4975fba742f5542"} Dec 03 07:02:51 crc kubenswrapper[4475]: I1203 07:02:51.010852 4475 scope.go:117] "RemoveContainer" containerID="050203aa2a2be1328b0ccbec346783373ed0bc8d88f7a10b3737a95fbe4b4216" Dec 03 07:02:51 crc kubenswrapper[4475]: I1203 07:02:51.027904 4475 scope.go:117] "RemoveContainer" containerID="242388a9400bf5f290f7b6516d79f67e63d8af0a2c86e7bd7591850d56f47887" Dec 03 07:02:51 crc kubenswrapper[4475]: I1203 07:02:51.040906 4475 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Dec 03 07:02:51 crc kubenswrapper[4475]: I1203 07:02:51.047096 4475 scope.go:117] "RemoveContainer" containerID="5361a2f551d6debb410f56107ce30cec6539acd8ae8951949d9e9fb080244c23" Dec 03 07:02:51 crc kubenswrapper[4475]: I1203 07:02:51.068312 4475 scope.go:117] "RemoveContainer" 
containerID="1c44707148557c46d7f9b0d05553f381cdc59ec68eb4d95e31e2dae4d9d22b3a" Dec 03 07:02:51 crc kubenswrapper[4475]: I1203 07:02:51.087822 4475 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Dec 03 07:02:51 crc kubenswrapper[4475]: I1203 07:02:51.111497 4475 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Dec 03 07:02:51 crc kubenswrapper[4475]: E1203 07:02:51.113441 4475 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a00cc5ea-a8e3-4936-b9cc-dc1bd20b7407" containerName="ceilometer-notification-agent" Dec 03 07:02:51 crc kubenswrapper[4475]: I1203 07:02:51.113477 4475 state_mem.go:107] "Deleted CPUSet assignment" podUID="a00cc5ea-a8e3-4936-b9cc-dc1bd20b7407" containerName="ceilometer-notification-agent" Dec 03 07:02:51 crc kubenswrapper[4475]: E1203 07:02:51.113507 4475 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a00cc5ea-a8e3-4936-b9cc-dc1bd20b7407" containerName="proxy-httpd" Dec 03 07:02:51 crc kubenswrapper[4475]: I1203 07:02:51.113515 4475 state_mem.go:107] "Deleted CPUSet assignment" podUID="a00cc5ea-a8e3-4936-b9cc-dc1bd20b7407" containerName="proxy-httpd" Dec 03 07:02:51 crc kubenswrapper[4475]: E1203 07:02:51.113881 4475 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a00cc5ea-a8e3-4936-b9cc-dc1bd20b7407" containerName="sg-core" Dec 03 07:02:51 crc kubenswrapper[4475]: I1203 07:02:51.113898 4475 state_mem.go:107] "Deleted CPUSet assignment" podUID="a00cc5ea-a8e3-4936-b9cc-dc1bd20b7407" containerName="sg-core" Dec 03 07:02:51 crc kubenswrapper[4475]: E1203 07:02:51.113929 4475 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a00cc5ea-a8e3-4936-b9cc-dc1bd20b7407" containerName="ceilometer-central-agent" Dec 03 07:02:51 crc kubenswrapper[4475]: I1203 07:02:51.113936 4475 state_mem.go:107] "Deleted CPUSet assignment" podUID="a00cc5ea-a8e3-4936-b9cc-dc1bd20b7407" containerName="ceilometer-central-agent" Dec 03 07:02:51 
crc kubenswrapper[4475]: I1203 07:02:51.114588 4475 memory_manager.go:354] "RemoveStaleState removing state" podUID="a00cc5ea-a8e3-4936-b9cc-dc1bd20b7407" containerName="proxy-httpd" Dec 03 07:02:51 crc kubenswrapper[4475]: I1203 07:02:51.114609 4475 memory_manager.go:354] "RemoveStaleState removing state" podUID="a00cc5ea-a8e3-4936-b9cc-dc1bd20b7407" containerName="sg-core" Dec 03 07:02:51 crc kubenswrapper[4475]: I1203 07:02:51.114649 4475 memory_manager.go:354] "RemoveStaleState removing state" podUID="a00cc5ea-a8e3-4936-b9cc-dc1bd20b7407" containerName="ceilometer-notification-agent" Dec 03 07:02:51 crc kubenswrapper[4475]: I1203 07:02:51.114664 4475 memory_manager.go:354] "RemoveStaleState removing state" podUID="a00cc5ea-a8e3-4936-b9cc-dc1bd20b7407" containerName="ceilometer-central-agent" Dec 03 07:02:51 crc kubenswrapper[4475]: I1203 07:02:51.114808 4475 scope.go:117] "RemoveContainer" containerID="050203aa2a2be1328b0ccbec346783373ed0bc8d88f7a10b3737a95fbe4b4216" Dec 03 07:02:51 crc kubenswrapper[4475]: E1203 07:02:51.117632 4475 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"050203aa2a2be1328b0ccbec346783373ed0bc8d88f7a10b3737a95fbe4b4216\": container with ID starting with 050203aa2a2be1328b0ccbec346783373ed0bc8d88f7a10b3737a95fbe4b4216 not found: ID does not exist" containerID="050203aa2a2be1328b0ccbec346783373ed0bc8d88f7a10b3737a95fbe4b4216" Dec 03 07:02:51 crc kubenswrapper[4475]: I1203 07:02:51.117675 4475 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"050203aa2a2be1328b0ccbec346783373ed0bc8d88f7a10b3737a95fbe4b4216"} err="failed to get container status \"050203aa2a2be1328b0ccbec346783373ed0bc8d88f7a10b3737a95fbe4b4216\": rpc error: code = NotFound desc = could not find container \"050203aa2a2be1328b0ccbec346783373ed0bc8d88f7a10b3737a95fbe4b4216\": container with ID starting with 
050203aa2a2be1328b0ccbec346783373ed0bc8d88f7a10b3737a95fbe4b4216 not found: ID does not exist" Dec 03 07:02:51 crc kubenswrapper[4475]: I1203 07:02:51.117703 4475 scope.go:117] "RemoveContainer" containerID="242388a9400bf5f290f7b6516d79f67e63d8af0a2c86e7bd7591850d56f47887" Dec 03 07:02:51 crc kubenswrapper[4475]: I1203 07:02:51.120717 4475 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Dec 03 07:02:51 crc kubenswrapper[4475]: E1203 07:02:51.121369 4475 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"242388a9400bf5f290f7b6516d79f67e63d8af0a2c86e7bd7591850d56f47887\": container with ID starting with 242388a9400bf5f290f7b6516d79f67e63d8af0a2c86e7bd7591850d56f47887 not found: ID does not exist" containerID="242388a9400bf5f290f7b6516d79f67e63d8af0a2c86e7bd7591850d56f47887" Dec 03 07:02:51 crc kubenswrapper[4475]: I1203 07:02:51.121398 4475 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"242388a9400bf5f290f7b6516d79f67e63d8af0a2c86e7bd7591850d56f47887"} err="failed to get container status \"242388a9400bf5f290f7b6516d79f67e63d8af0a2c86e7bd7591850d56f47887\": rpc error: code = NotFound desc = could not find container \"242388a9400bf5f290f7b6516d79f67e63d8af0a2c86e7bd7591850d56f47887\": container with ID starting with 242388a9400bf5f290f7b6516d79f67e63d8af0a2c86e7bd7591850d56f47887 not found: ID does not exist" Dec 03 07:02:51 crc kubenswrapper[4475]: I1203 07:02:51.121421 4475 scope.go:117] "RemoveContainer" containerID="5361a2f551d6debb410f56107ce30cec6539acd8ae8951949d9e9fb080244c23" Dec 03 07:02:51 crc kubenswrapper[4475]: I1203 07:02:51.123553 4475 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Dec 03 07:02:51 crc kubenswrapper[4475]: I1203 07:02:51.125830 4475 reflector.go:368] Caches populated for *v1.Secret from 
object-"openstack"/"ceilometer-config-data" Dec 03 07:02:51 crc kubenswrapper[4475]: E1203 07:02:51.128712 4475 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5361a2f551d6debb410f56107ce30cec6539acd8ae8951949d9e9fb080244c23\": container with ID starting with 5361a2f551d6debb410f56107ce30cec6539acd8ae8951949d9e9fb080244c23 not found: ID does not exist" containerID="5361a2f551d6debb410f56107ce30cec6539acd8ae8951949d9e9fb080244c23" Dec 03 07:02:51 crc kubenswrapper[4475]: I1203 07:02:51.128739 4475 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5361a2f551d6debb410f56107ce30cec6539acd8ae8951949d9e9fb080244c23"} err="failed to get container status \"5361a2f551d6debb410f56107ce30cec6539acd8ae8951949d9e9fb080244c23\": rpc error: code = NotFound desc = could not find container \"5361a2f551d6debb410f56107ce30cec6539acd8ae8951949d9e9fb080244c23\": container with ID starting with 5361a2f551d6debb410f56107ce30cec6539acd8ae8951949d9e9fb080244c23 not found: ID does not exist" Dec 03 07:02:51 crc kubenswrapper[4475]: I1203 07:02:51.128767 4475 scope.go:117] "RemoveContainer" containerID="1c44707148557c46d7f9b0d05553f381cdc59ec68eb4d95e31e2dae4d9d22b3a" Dec 03 07:02:51 crc kubenswrapper[4475]: E1203 07:02:51.129991 4475 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1c44707148557c46d7f9b0d05553f381cdc59ec68eb4d95e31e2dae4d9d22b3a\": container with ID starting with 1c44707148557c46d7f9b0d05553f381cdc59ec68eb4d95e31e2dae4d9d22b3a not found: ID does not exist" containerID="1c44707148557c46d7f9b0d05553f381cdc59ec68eb4d95e31e2dae4d9d22b3a" Dec 03 07:02:51 crc kubenswrapper[4475]: I1203 07:02:51.130019 4475 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1c44707148557c46d7f9b0d05553f381cdc59ec68eb4d95e31e2dae4d9d22b3a"} err="failed to get 
container status \"1c44707148557c46d7f9b0d05553f381cdc59ec68eb4d95e31e2dae4d9d22b3a\": rpc error: code = NotFound desc = could not find container \"1c44707148557c46d7f9b0d05553f381cdc59ec68eb4d95e31e2dae4d9d22b3a\": container with ID starting with 1c44707148557c46d7f9b0d05553f381cdc59ec68eb4d95e31e2dae4d9d22b3a not found: ID does not exist" Dec 03 07:02:51 crc kubenswrapper[4475]: I1203 07:02:51.145631 4475 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Dec 03 07:02:51 crc kubenswrapper[4475]: I1203 07:02:51.255414 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/5cd83272-b3cd-4a9e-b741-1d2e83d8416c-log-httpd\") pod \"ceilometer-0\" (UID: \"5cd83272-b3cd-4a9e-b741-1d2e83d8416c\") " pod="openstack/ceilometer-0" Dec 03 07:02:51 crc kubenswrapper[4475]: I1203 07:02:51.255513 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bbzns\" (UniqueName: \"kubernetes.io/projected/5cd83272-b3cd-4a9e-b741-1d2e83d8416c-kube-api-access-bbzns\") pod \"ceilometer-0\" (UID: \"5cd83272-b3cd-4a9e-b741-1d2e83d8416c\") " pod="openstack/ceilometer-0" Dec 03 07:02:51 crc kubenswrapper[4475]: I1203 07:02:51.255543 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/5cd83272-b3cd-4a9e-b741-1d2e83d8416c-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"5cd83272-b3cd-4a9e-b741-1d2e83d8416c\") " pod="openstack/ceilometer-0" Dec 03 07:02:51 crc kubenswrapper[4475]: I1203 07:02:51.255610 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/5cd83272-b3cd-4a9e-b741-1d2e83d8416c-run-httpd\") pod \"ceilometer-0\" (UID: \"5cd83272-b3cd-4a9e-b741-1d2e83d8416c\") " pod="openstack/ceilometer-0" Dec 03 
07:02:51 crc kubenswrapper[4475]: I1203 07:02:51.255674 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5cd83272-b3cd-4a9e-b741-1d2e83d8416c-scripts\") pod \"ceilometer-0\" (UID: \"5cd83272-b3cd-4a9e-b741-1d2e83d8416c\") " pod="openstack/ceilometer-0" Dec 03 07:02:51 crc kubenswrapper[4475]: I1203 07:02:51.255717 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5cd83272-b3cd-4a9e-b741-1d2e83d8416c-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"5cd83272-b3cd-4a9e-b741-1d2e83d8416c\") " pod="openstack/ceilometer-0" Dec 03 07:02:51 crc kubenswrapper[4475]: I1203 07:02:51.255752 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5cd83272-b3cd-4a9e-b741-1d2e83d8416c-config-data\") pod \"ceilometer-0\" (UID: \"5cd83272-b3cd-4a9e-b741-1d2e83d8416c\") " pod="openstack/ceilometer-0" Dec 03 07:02:51 crc kubenswrapper[4475]: I1203 07:02:51.331242 4475 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Dec 03 07:02:51 crc kubenswrapper[4475]: E1203 07:02:51.332045 4475 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[combined-ca-bundle config-data kube-api-access-bbzns log-httpd run-httpd scripts sg-core-conf-yaml], unattached volumes=[], failed to process volumes=[]: context canceled" pod="openstack/ceilometer-0" podUID="5cd83272-b3cd-4a9e-b741-1d2e83d8416c" Dec 03 07:02:51 crc kubenswrapper[4475]: I1203 07:02:51.357879 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/5cd83272-b3cd-4a9e-b741-1d2e83d8416c-log-httpd\") pod \"ceilometer-0\" (UID: \"5cd83272-b3cd-4a9e-b741-1d2e83d8416c\") " pod="openstack/ceilometer-0" Dec 03 
07:02:51 crc kubenswrapper[4475]: I1203 07:02:51.358025 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bbzns\" (UniqueName: \"kubernetes.io/projected/5cd83272-b3cd-4a9e-b741-1d2e83d8416c-kube-api-access-bbzns\") pod \"ceilometer-0\" (UID: \"5cd83272-b3cd-4a9e-b741-1d2e83d8416c\") " pod="openstack/ceilometer-0" Dec 03 07:02:51 crc kubenswrapper[4475]: I1203 07:02:51.358069 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/5cd83272-b3cd-4a9e-b741-1d2e83d8416c-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"5cd83272-b3cd-4a9e-b741-1d2e83d8416c\") " pod="openstack/ceilometer-0" Dec 03 07:02:51 crc kubenswrapper[4475]: I1203 07:02:51.358088 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/5cd83272-b3cd-4a9e-b741-1d2e83d8416c-run-httpd\") pod \"ceilometer-0\" (UID: \"5cd83272-b3cd-4a9e-b741-1d2e83d8416c\") " pod="openstack/ceilometer-0" Dec 03 07:02:51 crc kubenswrapper[4475]: I1203 07:02:51.358172 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5cd83272-b3cd-4a9e-b741-1d2e83d8416c-scripts\") pod \"ceilometer-0\" (UID: \"5cd83272-b3cd-4a9e-b741-1d2e83d8416c\") " pod="openstack/ceilometer-0" Dec 03 07:02:51 crc kubenswrapper[4475]: I1203 07:02:51.358203 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5cd83272-b3cd-4a9e-b741-1d2e83d8416c-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"5cd83272-b3cd-4a9e-b741-1d2e83d8416c\") " pod="openstack/ceilometer-0" Dec 03 07:02:51 crc kubenswrapper[4475]: I1203 07:02:51.358261 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/5cd83272-b3cd-4a9e-b741-1d2e83d8416c-config-data\") pod \"ceilometer-0\" (UID: \"5cd83272-b3cd-4a9e-b741-1d2e83d8416c\") " pod="openstack/ceilometer-0" Dec 03 07:02:51 crc kubenswrapper[4475]: I1203 07:02:51.358498 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/5cd83272-b3cd-4a9e-b741-1d2e83d8416c-log-httpd\") pod \"ceilometer-0\" (UID: \"5cd83272-b3cd-4a9e-b741-1d2e83d8416c\") " pod="openstack/ceilometer-0" Dec 03 07:02:51 crc kubenswrapper[4475]: I1203 07:02:51.358699 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/5cd83272-b3cd-4a9e-b741-1d2e83d8416c-run-httpd\") pod \"ceilometer-0\" (UID: \"5cd83272-b3cd-4a9e-b741-1d2e83d8416c\") " pod="openstack/ceilometer-0" Dec 03 07:02:51 crc kubenswrapper[4475]: I1203 07:02:51.364277 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/5cd83272-b3cd-4a9e-b741-1d2e83d8416c-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"5cd83272-b3cd-4a9e-b741-1d2e83d8416c\") " pod="openstack/ceilometer-0" Dec 03 07:02:51 crc kubenswrapper[4475]: I1203 07:02:51.364433 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5cd83272-b3cd-4a9e-b741-1d2e83d8416c-scripts\") pod \"ceilometer-0\" (UID: \"5cd83272-b3cd-4a9e-b741-1d2e83d8416c\") " pod="openstack/ceilometer-0" Dec 03 07:02:51 crc kubenswrapper[4475]: I1203 07:02:51.364754 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5cd83272-b3cd-4a9e-b741-1d2e83d8416c-config-data\") pod \"ceilometer-0\" (UID: \"5cd83272-b3cd-4a9e-b741-1d2e83d8416c\") " pod="openstack/ceilometer-0" Dec 03 07:02:51 crc kubenswrapper[4475]: I1203 07:02:51.365363 4475 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5cd83272-b3cd-4a9e-b741-1d2e83d8416c-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"5cd83272-b3cd-4a9e-b741-1d2e83d8416c\") " pod="openstack/ceilometer-0" Dec 03 07:02:51 crc kubenswrapper[4475]: I1203 07:02:51.372360 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bbzns\" (UniqueName: \"kubernetes.io/projected/5cd83272-b3cd-4a9e-b741-1d2e83d8416c-kube-api-access-bbzns\") pod \"ceilometer-0\" (UID: \"5cd83272-b3cd-4a9e-b741-1d2e83d8416c\") " pod="openstack/ceilometer-0" Dec 03 07:02:51 crc kubenswrapper[4475]: I1203 07:02:51.499971 4475 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a00cc5ea-a8e3-4936-b9cc-dc1bd20b7407" path="/var/lib/kubelet/pods/a00cc5ea-a8e3-4936-b9cc-dc1bd20b7407/volumes" Dec 03 07:02:52 crc kubenswrapper[4475]: I1203 07:02:52.018621 4475 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Dec 03 07:02:52 crc kubenswrapper[4475]: I1203 07:02:52.029947 4475 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Dec 03 07:02:52 crc kubenswrapper[4475]: I1203 07:02:52.180486 4475 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5cd83272-b3cd-4a9e-b741-1d2e83d8416c-combined-ca-bundle\") pod \"5cd83272-b3cd-4a9e-b741-1d2e83d8416c\" (UID: \"5cd83272-b3cd-4a9e-b741-1d2e83d8416c\") " Dec 03 07:02:52 crc kubenswrapper[4475]: I1203 07:02:52.180568 4475 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5cd83272-b3cd-4a9e-b741-1d2e83d8416c-config-data\") pod \"5cd83272-b3cd-4a9e-b741-1d2e83d8416c\" (UID: \"5cd83272-b3cd-4a9e-b741-1d2e83d8416c\") " Dec 03 07:02:52 crc kubenswrapper[4475]: I1203 07:02:52.180603 4475 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/5cd83272-b3cd-4a9e-b741-1d2e83d8416c-sg-core-conf-yaml\") pod \"5cd83272-b3cd-4a9e-b741-1d2e83d8416c\" (UID: \"5cd83272-b3cd-4a9e-b741-1d2e83d8416c\") " Dec 03 07:02:52 crc kubenswrapper[4475]: I1203 07:02:52.180672 4475 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/5cd83272-b3cd-4a9e-b741-1d2e83d8416c-log-httpd\") pod \"5cd83272-b3cd-4a9e-b741-1d2e83d8416c\" (UID: \"5cd83272-b3cd-4a9e-b741-1d2e83d8416c\") " Dec 03 07:02:52 crc kubenswrapper[4475]: I1203 07:02:52.180843 4475 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bbzns\" (UniqueName: \"kubernetes.io/projected/5cd83272-b3cd-4a9e-b741-1d2e83d8416c-kube-api-access-bbzns\") pod \"5cd83272-b3cd-4a9e-b741-1d2e83d8416c\" (UID: \"5cd83272-b3cd-4a9e-b741-1d2e83d8416c\") " Dec 03 07:02:52 crc kubenswrapper[4475]: I1203 07:02:52.180880 4475 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/5cd83272-b3cd-4a9e-b741-1d2e83d8416c-scripts\") pod \"5cd83272-b3cd-4a9e-b741-1d2e83d8416c\" (UID: \"5cd83272-b3cd-4a9e-b741-1d2e83d8416c\") " Dec 03 07:02:52 crc kubenswrapper[4475]: I1203 07:02:52.180941 4475 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/5cd83272-b3cd-4a9e-b741-1d2e83d8416c-run-httpd\") pod \"5cd83272-b3cd-4a9e-b741-1d2e83d8416c\" (UID: \"5cd83272-b3cd-4a9e-b741-1d2e83d8416c\") " Dec 03 07:02:52 crc kubenswrapper[4475]: I1203 07:02:52.181284 4475 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5cd83272-b3cd-4a9e-b741-1d2e83d8416c-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "5cd83272-b3cd-4a9e-b741-1d2e83d8416c" (UID: "5cd83272-b3cd-4a9e-b741-1d2e83d8416c"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 07:02:52 crc kubenswrapper[4475]: I1203 07:02:52.181637 4475 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/5cd83272-b3cd-4a9e-b741-1d2e83d8416c-log-httpd\") on node \"crc\" DevicePath \"\"" Dec 03 07:02:52 crc kubenswrapper[4475]: I1203 07:02:52.184147 4475 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5cd83272-b3cd-4a9e-b741-1d2e83d8416c-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "5cd83272-b3cd-4a9e-b741-1d2e83d8416c" (UID: "5cd83272-b3cd-4a9e-b741-1d2e83d8416c"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 07:02:52 crc kubenswrapper[4475]: I1203 07:02:52.185568 4475 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5cd83272-b3cd-4a9e-b741-1d2e83d8416c-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "5cd83272-b3cd-4a9e-b741-1d2e83d8416c" (UID: "5cd83272-b3cd-4a9e-b741-1d2e83d8416c"). 
InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 07:02:52 crc kubenswrapper[4475]: I1203 07:02:52.186321 4475 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5cd83272-b3cd-4a9e-b741-1d2e83d8416c-config-data" (OuterVolumeSpecName: "config-data") pod "5cd83272-b3cd-4a9e-b741-1d2e83d8416c" (UID: "5cd83272-b3cd-4a9e-b741-1d2e83d8416c"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 07:02:52 crc kubenswrapper[4475]: I1203 07:02:52.186898 4475 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5cd83272-b3cd-4a9e-b741-1d2e83d8416c-kube-api-access-bbzns" (OuterVolumeSpecName: "kube-api-access-bbzns") pod "5cd83272-b3cd-4a9e-b741-1d2e83d8416c" (UID: "5cd83272-b3cd-4a9e-b741-1d2e83d8416c"). InnerVolumeSpecName "kube-api-access-bbzns". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 07:02:52 crc kubenswrapper[4475]: I1203 07:02:52.188979 4475 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5cd83272-b3cd-4a9e-b741-1d2e83d8416c-scripts" (OuterVolumeSpecName: "scripts") pod "5cd83272-b3cd-4a9e-b741-1d2e83d8416c" (UID: "5cd83272-b3cd-4a9e-b741-1d2e83d8416c"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 07:02:52 crc kubenswrapper[4475]: I1203 07:02:52.199900 4475 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5cd83272-b3cd-4a9e-b741-1d2e83d8416c-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "5cd83272-b3cd-4a9e-b741-1d2e83d8416c" (UID: "5cd83272-b3cd-4a9e-b741-1d2e83d8416c"). InnerVolumeSpecName "sg-core-conf-yaml". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 07:02:52 crc kubenswrapper[4475]: I1203 07:02:52.283500 4475 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5cd83272-b3cd-4a9e-b741-1d2e83d8416c-config-data\") on node \"crc\" DevicePath \"\"" Dec 03 07:02:52 crc kubenswrapper[4475]: I1203 07:02:52.283765 4475 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/5cd83272-b3cd-4a9e-b741-1d2e83d8416c-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Dec 03 07:02:52 crc kubenswrapper[4475]: I1203 07:02:52.283778 4475 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bbzns\" (UniqueName: \"kubernetes.io/projected/5cd83272-b3cd-4a9e-b741-1d2e83d8416c-kube-api-access-bbzns\") on node \"crc\" DevicePath \"\"" Dec 03 07:02:52 crc kubenswrapper[4475]: I1203 07:02:52.283787 4475 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5cd83272-b3cd-4a9e-b741-1d2e83d8416c-scripts\") on node \"crc\" DevicePath \"\"" Dec 03 07:02:52 crc kubenswrapper[4475]: I1203 07:02:52.283795 4475 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/5cd83272-b3cd-4a9e-b741-1d2e83d8416c-run-httpd\") on node \"crc\" DevicePath \"\"" Dec 03 07:02:52 crc kubenswrapper[4475]: I1203 07:02:52.283802 4475 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5cd83272-b3cd-4a9e-b741-1d2e83d8416c-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 03 07:02:52 crc kubenswrapper[4475]: I1203 07:02:52.354343 4475 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell1-novncproxy-0" Dec 03 07:02:52 crc kubenswrapper[4475]: E1203 07:02:52.880840 4475 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: 
[\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode6e2659c_bdb8_4015_b749_9d1bfd70620f.slice/crio-f38adfa30fd4ebb78cee694486062d19d33d13cf2c50659bb9a0ff66b6d44c52\": RecentStats: unable to find data in memory cache]" Dec 03 07:02:53 crc kubenswrapper[4475]: I1203 07:02:53.026297 4475 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Dec 03 07:02:53 crc kubenswrapper[4475]: I1203 07:02:53.079242 4475 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Dec 03 07:02:53 crc kubenswrapper[4475]: I1203 07:02:53.112552 4475 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Dec 03 07:02:53 crc kubenswrapper[4475]: I1203 07:02:53.125681 4475 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Dec 03 07:02:53 crc kubenswrapper[4475]: I1203 07:02:53.128012 4475 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Dec 03 07:02:53 crc kubenswrapper[4475]: I1203 07:02:53.129972 4475 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Dec 03 07:02:53 crc kubenswrapper[4475]: I1203 07:02:53.136332 4475 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Dec 03 07:02:53 crc kubenswrapper[4475]: I1203 07:02:53.137315 4475 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Dec 03 07:02:53 crc kubenswrapper[4475]: I1203 07:02:53.202307 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/7f520204-42e9-45b7-ad44-a659f8c42b74-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"7f520204-42e9-45b7-ad44-a659f8c42b74\") " pod="openstack/ceilometer-0" Dec 03 07:02:53 crc kubenswrapper[4475]: I1203 07:02:53.202360 4475 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/7f520204-42e9-45b7-ad44-a659f8c42b74-log-httpd\") pod \"ceilometer-0\" (UID: \"7f520204-42e9-45b7-ad44-a659f8c42b74\") " pod="openstack/ceilometer-0" Dec 03 07:02:53 crc kubenswrapper[4475]: I1203 07:02:53.202390 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/7f520204-42e9-45b7-ad44-a659f8c42b74-run-httpd\") pod \"ceilometer-0\" (UID: \"7f520204-42e9-45b7-ad44-a659f8c42b74\") " pod="openstack/ceilometer-0" Dec 03 07:02:53 crc kubenswrapper[4475]: I1203 07:02:53.202420 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7f520204-42e9-45b7-ad44-a659f8c42b74-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"7f520204-42e9-45b7-ad44-a659f8c42b74\") " pod="openstack/ceilometer-0" Dec 03 07:02:53 crc kubenswrapper[4475]: I1203 07:02:53.202514 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ktjsf\" (UniqueName: \"kubernetes.io/projected/7f520204-42e9-45b7-ad44-a659f8c42b74-kube-api-access-ktjsf\") pod \"ceilometer-0\" (UID: \"7f520204-42e9-45b7-ad44-a659f8c42b74\") " pod="openstack/ceilometer-0" Dec 03 07:02:53 crc kubenswrapper[4475]: I1203 07:02:53.202554 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7f520204-42e9-45b7-ad44-a659f8c42b74-scripts\") pod \"ceilometer-0\" (UID: \"7f520204-42e9-45b7-ad44-a659f8c42b74\") " pod="openstack/ceilometer-0" Dec 03 07:02:53 crc kubenswrapper[4475]: I1203 07:02:53.202600 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/7f520204-42e9-45b7-ad44-a659f8c42b74-config-data\") pod \"ceilometer-0\" (UID: \"7f520204-42e9-45b7-ad44-a659f8c42b74\") " pod="openstack/ceilometer-0" Dec 03 07:02:53 crc kubenswrapper[4475]: I1203 07:02:53.304985 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ktjsf\" (UniqueName: \"kubernetes.io/projected/7f520204-42e9-45b7-ad44-a659f8c42b74-kube-api-access-ktjsf\") pod \"ceilometer-0\" (UID: \"7f520204-42e9-45b7-ad44-a659f8c42b74\") " pod="openstack/ceilometer-0" Dec 03 07:02:53 crc kubenswrapper[4475]: I1203 07:02:53.305366 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7f520204-42e9-45b7-ad44-a659f8c42b74-scripts\") pod \"ceilometer-0\" (UID: \"7f520204-42e9-45b7-ad44-a659f8c42b74\") " pod="openstack/ceilometer-0" Dec 03 07:02:53 crc kubenswrapper[4475]: I1203 07:02:53.305416 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7f520204-42e9-45b7-ad44-a659f8c42b74-config-data\") pod \"ceilometer-0\" (UID: \"7f520204-42e9-45b7-ad44-a659f8c42b74\") " pod="openstack/ceilometer-0" Dec 03 07:02:53 crc kubenswrapper[4475]: I1203 07:02:53.305446 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/7f520204-42e9-45b7-ad44-a659f8c42b74-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"7f520204-42e9-45b7-ad44-a659f8c42b74\") " pod="openstack/ceilometer-0" Dec 03 07:02:53 crc kubenswrapper[4475]: I1203 07:02:53.305488 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/7f520204-42e9-45b7-ad44-a659f8c42b74-log-httpd\") pod \"ceilometer-0\" (UID: \"7f520204-42e9-45b7-ad44-a659f8c42b74\") " pod="openstack/ceilometer-0" Dec 03 07:02:53 crc kubenswrapper[4475]: I1203 
07:02:53.305513 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/7f520204-42e9-45b7-ad44-a659f8c42b74-run-httpd\") pod \"ceilometer-0\" (UID: \"7f520204-42e9-45b7-ad44-a659f8c42b74\") " pod="openstack/ceilometer-0" Dec 03 07:02:53 crc kubenswrapper[4475]: I1203 07:02:53.305547 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7f520204-42e9-45b7-ad44-a659f8c42b74-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"7f520204-42e9-45b7-ad44-a659f8c42b74\") " pod="openstack/ceilometer-0" Dec 03 07:02:53 crc kubenswrapper[4475]: I1203 07:02:53.307823 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/7f520204-42e9-45b7-ad44-a659f8c42b74-log-httpd\") pod \"ceilometer-0\" (UID: \"7f520204-42e9-45b7-ad44-a659f8c42b74\") " pod="openstack/ceilometer-0" Dec 03 07:02:53 crc kubenswrapper[4475]: I1203 07:02:53.313575 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7f520204-42e9-45b7-ad44-a659f8c42b74-config-data\") pod \"ceilometer-0\" (UID: \"7f520204-42e9-45b7-ad44-a659f8c42b74\") " pod="openstack/ceilometer-0" Dec 03 07:02:53 crc kubenswrapper[4475]: I1203 07:02:53.313810 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/7f520204-42e9-45b7-ad44-a659f8c42b74-run-httpd\") pod \"ceilometer-0\" (UID: \"7f520204-42e9-45b7-ad44-a659f8c42b74\") " pod="openstack/ceilometer-0" Dec 03 07:02:53 crc kubenswrapper[4475]: I1203 07:02:53.314951 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/7f520204-42e9-45b7-ad44-a659f8c42b74-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: 
\"7f520204-42e9-45b7-ad44-a659f8c42b74\") " pod="openstack/ceilometer-0" Dec 03 07:02:53 crc kubenswrapper[4475]: I1203 07:02:53.318068 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7f520204-42e9-45b7-ad44-a659f8c42b74-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"7f520204-42e9-45b7-ad44-a659f8c42b74\") " pod="openstack/ceilometer-0" Dec 03 07:02:53 crc kubenswrapper[4475]: I1203 07:02:53.323983 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ktjsf\" (UniqueName: \"kubernetes.io/projected/7f520204-42e9-45b7-ad44-a659f8c42b74-kube-api-access-ktjsf\") pod \"ceilometer-0\" (UID: \"7f520204-42e9-45b7-ad44-a659f8c42b74\") " pod="openstack/ceilometer-0" Dec 03 07:02:53 crc kubenswrapper[4475]: I1203 07:02:53.330463 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7f520204-42e9-45b7-ad44-a659f8c42b74-scripts\") pod \"ceilometer-0\" (UID: \"7f520204-42e9-45b7-ad44-a659f8c42b74\") " pod="openstack/ceilometer-0" Dec 03 07:02:53 crc kubenswrapper[4475]: I1203 07:02:53.512945 4475 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Dec 03 07:02:53 crc kubenswrapper[4475]: I1203 07:02:53.516151 4475 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5cd83272-b3cd-4a9e-b741-1d2e83d8416c" path="/var/lib/kubelet/pods/5cd83272-b3cd-4a9e-b741-1d2e83d8416c/volumes" Dec 03 07:02:53 crc kubenswrapper[4475]: I1203 07:02:53.552974 4475 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Dec 03 07:02:53 crc kubenswrapper[4475]: I1203 07:02:53.716303 4475 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/da67b3db-e99c-4011-8071-0488afd0ab23-combined-ca-bundle\") pod \"da67b3db-e99c-4011-8071-0488afd0ab23\" (UID: \"da67b3db-e99c-4011-8071-0488afd0ab23\") " Dec 03 07:02:53 crc kubenswrapper[4475]: I1203 07:02:53.716618 4475 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/da67b3db-e99c-4011-8071-0488afd0ab23-config-data\") pod \"da67b3db-e99c-4011-8071-0488afd0ab23\" (UID: \"da67b3db-e99c-4011-8071-0488afd0ab23\") " Dec 03 07:02:53 crc kubenswrapper[4475]: I1203 07:02:53.716814 4475 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jxkl6\" (UniqueName: \"kubernetes.io/projected/da67b3db-e99c-4011-8071-0488afd0ab23-kube-api-access-jxkl6\") pod \"da67b3db-e99c-4011-8071-0488afd0ab23\" (UID: \"da67b3db-e99c-4011-8071-0488afd0ab23\") " Dec 03 07:02:53 crc kubenswrapper[4475]: I1203 07:02:53.716928 4475 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/da67b3db-e99c-4011-8071-0488afd0ab23-logs\") pod \"da67b3db-e99c-4011-8071-0488afd0ab23\" (UID: \"da67b3db-e99c-4011-8071-0488afd0ab23\") " Dec 03 07:02:53 crc kubenswrapper[4475]: I1203 07:02:53.717657 4475 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/da67b3db-e99c-4011-8071-0488afd0ab23-logs" (OuterVolumeSpecName: "logs") pod "da67b3db-e99c-4011-8071-0488afd0ab23" (UID: "da67b3db-e99c-4011-8071-0488afd0ab23"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 07:02:53 crc kubenswrapper[4475]: I1203 07:02:53.732359 4475 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/da67b3db-e99c-4011-8071-0488afd0ab23-kube-api-access-jxkl6" (OuterVolumeSpecName: "kube-api-access-jxkl6") pod "da67b3db-e99c-4011-8071-0488afd0ab23" (UID: "da67b3db-e99c-4011-8071-0488afd0ab23"). InnerVolumeSpecName "kube-api-access-jxkl6". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 07:02:53 crc kubenswrapper[4475]: I1203 07:02:53.754317 4475 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/da67b3db-e99c-4011-8071-0488afd0ab23-config-data" (OuterVolumeSpecName: "config-data") pod "da67b3db-e99c-4011-8071-0488afd0ab23" (UID: "da67b3db-e99c-4011-8071-0488afd0ab23"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 07:02:53 crc kubenswrapper[4475]: I1203 07:02:53.767466 4475 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/da67b3db-e99c-4011-8071-0488afd0ab23-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "da67b3db-e99c-4011-8071-0488afd0ab23" (UID: "da67b3db-e99c-4011-8071-0488afd0ab23"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 07:02:53 crc kubenswrapper[4475]: I1203 07:02:53.818746 4475 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/da67b3db-e99c-4011-8071-0488afd0ab23-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 03 07:02:53 crc kubenswrapper[4475]: I1203 07:02:53.819061 4475 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/da67b3db-e99c-4011-8071-0488afd0ab23-config-data\") on node \"crc\" DevicePath \"\"" Dec 03 07:02:53 crc kubenswrapper[4475]: I1203 07:02:53.819077 4475 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jxkl6\" (UniqueName: \"kubernetes.io/projected/da67b3db-e99c-4011-8071-0488afd0ab23-kube-api-access-jxkl6\") on node \"crc\" DevicePath \"\"" Dec 03 07:02:53 crc kubenswrapper[4475]: I1203 07:02:53.819092 4475 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/da67b3db-e99c-4011-8071-0488afd0ab23-logs\") on node \"crc\" DevicePath \"\"" Dec 03 07:02:54 crc kubenswrapper[4475]: I1203 07:02:54.009434 4475 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Dec 03 07:02:54 crc kubenswrapper[4475]: W1203 07:02:54.024261 4475 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod7f520204_42e9_45b7_ad44_a659f8c42b74.slice/crio-906af082340f7ee25d7df103d349f6d93e936aebce5bde415333c11820747cfb WatchSource:0}: Error finding container 906af082340f7ee25d7df103d349f6d93e936aebce5bde415333c11820747cfb: Status 404 returned error can't find the container with id 906af082340f7ee25d7df103d349f6d93e936aebce5bde415333c11820747cfb Dec 03 07:02:54 crc kubenswrapper[4475]: I1203 07:02:54.036211 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" 
event={"ID":"7f520204-42e9-45b7-ad44-a659f8c42b74","Type":"ContainerStarted","Data":"906af082340f7ee25d7df103d349f6d93e936aebce5bde415333c11820747cfb"} Dec 03 07:02:54 crc kubenswrapper[4475]: I1203 07:02:54.041759 4475 generic.go:334] "Generic (PLEG): container finished" podID="da67b3db-e99c-4011-8071-0488afd0ab23" containerID="b06e9eb716a12048842f5de788a954e3ac022b377bbd45384eb486852c4f7ee7" exitCode=0 Dec 03 07:02:54 crc kubenswrapper[4475]: I1203 07:02:54.041808 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"da67b3db-e99c-4011-8071-0488afd0ab23","Type":"ContainerDied","Data":"b06e9eb716a12048842f5de788a954e3ac022b377bbd45384eb486852c4f7ee7"} Dec 03 07:02:54 crc kubenswrapper[4475]: I1203 07:02:54.041834 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"da67b3db-e99c-4011-8071-0488afd0ab23","Type":"ContainerDied","Data":"5ea1bf8e510b3b8e920bbf52c0b3e6f788add4fc8c0127884ff459dd680498a3"} Dec 03 07:02:54 crc kubenswrapper[4475]: I1203 07:02:54.041844 4475 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Dec 03 07:02:54 crc kubenswrapper[4475]: I1203 07:02:54.041850 4475 scope.go:117] "RemoveContainer" containerID="b06e9eb716a12048842f5de788a954e3ac022b377bbd45384eb486852c4f7ee7" Dec 03 07:02:54 crc kubenswrapper[4475]: I1203 07:02:54.061853 4475 scope.go:117] "RemoveContainer" containerID="a5d7cad60fd1a5dea546fe23eb52935b1ea09838d3ca514f997d77ffd0720d8b" Dec 03 07:02:54 crc kubenswrapper[4475]: I1203 07:02:54.075410 4475 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Dec 03 07:02:54 crc kubenswrapper[4475]: I1203 07:02:54.085374 4475 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-0"] Dec 03 07:02:54 crc kubenswrapper[4475]: I1203 07:02:54.096787 4475 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] Dec 03 07:02:54 crc kubenswrapper[4475]: E1203 07:02:54.097168 4475 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="da67b3db-e99c-4011-8071-0488afd0ab23" containerName="nova-api-api" Dec 03 07:02:54 crc kubenswrapper[4475]: I1203 07:02:54.097184 4475 state_mem.go:107] "Deleted CPUSet assignment" podUID="da67b3db-e99c-4011-8071-0488afd0ab23" containerName="nova-api-api" Dec 03 07:02:54 crc kubenswrapper[4475]: E1203 07:02:54.097221 4475 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="da67b3db-e99c-4011-8071-0488afd0ab23" containerName="nova-api-log" Dec 03 07:02:54 crc kubenswrapper[4475]: I1203 07:02:54.097227 4475 state_mem.go:107] "Deleted CPUSet assignment" podUID="da67b3db-e99c-4011-8071-0488afd0ab23" containerName="nova-api-log" Dec 03 07:02:54 crc kubenswrapper[4475]: I1203 07:02:54.097402 4475 memory_manager.go:354] "RemoveStaleState removing state" podUID="da67b3db-e99c-4011-8071-0488afd0ab23" containerName="nova-api-api" Dec 03 07:02:54 crc kubenswrapper[4475]: I1203 07:02:54.097425 4475 memory_manager.go:354] "RemoveStaleState removing state" podUID="da67b3db-e99c-4011-8071-0488afd0ab23" 
containerName="nova-api-log" Dec 03 07:02:54 crc kubenswrapper[4475]: I1203 07:02:54.098398 4475 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Dec 03 07:02:54 crc kubenswrapper[4475]: I1203 07:02:54.102393 4475 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Dec 03 07:02:54 crc kubenswrapper[4475]: I1203 07:02:54.102702 4475 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-public-svc" Dec 03 07:02:54 crc kubenswrapper[4475]: I1203 07:02:54.102891 4475 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Dec 03 07:02:54 crc kubenswrapper[4475]: I1203 07:02:54.103073 4475 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-internal-svc" Dec 03 07:02:54 crc kubenswrapper[4475]: I1203 07:02:54.103268 4475 scope.go:117] "RemoveContainer" containerID="b06e9eb716a12048842f5de788a954e3ac022b377bbd45384eb486852c4f7ee7" Dec 03 07:02:54 crc kubenswrapper[4475]: E1203 07:02:54.103602 4475 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b06e9eb716a12048842f5de788a954e3ac022b377bbd45384eb486852c4f7ee7\": container with ID starting with b06e9eb716a12048842f5de788a954e3ac022b377bbd45384eb486852c4f7ee7 not found: ID does not exist" containerID="b06e9eb716a12048842f5de788a954e3ac022b377bbd45384eb486852c4f7ee7" Dec 03 07:02:54 crc kubenswrapper[4475]: I1203 07:02:54.103632 4475 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b06e9eb716a12048842f5de788a954e3ac022b377bbd45384eb486852c4f7ee7"} err="failed to get container status \"b06e9eb716a12048842f5de788a954e3ac022b377bbd45384eb486852c4f7ee7\": rpc error: code = NotFound desc = could not find container \"b06e9eb716a12048842f5de788a954e3ac022b377bbd45384eb486852c4f7ee7\": container with ID starting with 
b06e9eb716a12048842f5de788a954e3ac022b377bbd45384eb486852c4f7ee7 not found: ID does not exist" Dec 03 07:02:54 crc kubenswrapper[4475]: I1203 07:02:54.103649 4475 scope.go:117] "RemoveContainer" containerID="a5d7cad60fd1a5dea546fe23eb52935b1ea09838d3ca514f997d77ffd0720d8b" Dec 03 07:02:54 crc kubenswrapper[4475]: E1203 07:02:54.103974 4475 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a5d7cad60fd1a5dea546fe23eb52935b1ea09838d3ca514f997d77ffd0720d8b\": container with ID starting with a5d7cad60fd1a5dea546fe23eb52935b1ea09838d3ca514f997d77ffd0720d8b not found: ID does not exist" containerID="a5d7cad60fd1a5dea546fe23eb52935b1ea09838d3ca514f997d77ffd0720d8b" Dec 03 07:02:54 crc kubenswrapper[4475]: I1203 07:02:54.104000 4475 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a5d7cad60fd1a5dea546fe23eb52935b1ea09838d3ca514f997d77ffd0720d8b"} err="failed to get container status \"a5d7cad60fd1a5dea546fe23eb52935b1ea09838d3ca514f997d77ffd0720d8b\": rpc error: code = NotFound desc = could not find container \"a5d7cad60fd1a5dea546fe23eb52935b1ea09838d3ca514f997d77ffd0720d8b\": container with ID starting with a5d7cad60fd1a5dea546fe23eb52935b1ea09838d3ca514f997d77ffd0720d8b not found: ID does not exist" Dec 03 07:02:54 crc kubenswrapper[4475]: I1203 07:02:54.125139 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/60efc3c2-1541-4def-91d8-11fef3e401bd-internal-tls-certs\") pod \"nova-api-0\" (UID: \"60efc3c2-1541-4def-91d8-11fef3e401bd\") " pod="openstack/nova-api-0" Dec 03 07:02:54 crc kubenswrapper[4475]: I1203 07:02:54.125211 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/60efc3c2-1541-4def-91d8-11fef3e401bd-logs\") pod \"nova-api-0\" (UID: 
\"60efc3c2-1541-4def-91d8-11fef3e401bd\") " pod="openstack/nova-api-0" Dec 03 07:02:54 crc kubenswrapper[4475]: I1203 07:02:54.125289 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/60efc3c2-1541-4def-91d8-11fef3e401bd-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"60efc3c2-1541-4def-91d8-11fef3e401bd\") " pod="openstack/nova-api-0" Dec 03 07:02:54 crc kubenswrapper[4475]: I1203 07:02:54.125353 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/60efc3c2-1541-4def-91d8-11fef3e401bd-config-data\") pod \"nova-api-0\" (UID: \"60efc3c2-1541-4def-91d8-11fef3e401bd\") " pod="openstack/nova-api-0" Dec 03 07:02:54 crc kubenswrapper[4475]: I1203 07:02:54.125370 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/60efc3c2-1541-4def-91d8-11fef3e401bd-public-tls-certs\") pod \"nova-api-0\" (UID: \"60efc3c2-1541-4def-91d8-11fef3e401bd\") " pod="openstack/nova-api-0" Dec 03 07:02:54 crc kubenswrapper[4475]: I1203 07:02:54.125390 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jw96s\" (UniqueName: \"kubernetes.io/projected/60efc3c2-1541-4def-91d8-11fef3e401bd-kube-api-access-jw96s\") pod \"nova-api-0\" (UID: \"60efc3c2-1541-4def-91d8-11fef3e401bd\") " pod="openstack/nova-api-0" Dec 03 07:02:54 crc kubenswrapper[4475]: I1203 07:02:54.227236 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/60efc3c2-1541-4def-91d8-11fef3e401bd-internal-tls-certs\") pod \"nova-api-0\" (UID: \"60efc3c2-1541-4def-91d8-11fef3e401bd\") " pod="openstack/nova-api-0" Dec 03 07:02:54 crc kubenswrapper[4475]: I1203 07:02:54.227321 4475 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/60efc3c2-1541-4def-91d8-11fef3e401bd-logs\") pod \"nova-api-0\" (UID: \"60efc3c2-1541-4def-91d8-11fef3e401bd\") " pod="openstack/nova-api-0" Dec 03 07:02:54 crc kubenswrapper[4475]: I1203 07:02:54.227774 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/60efc3c2-1541-4def-91d8-11fef3e401bd-logs\") pod \"nova-api-0\" (UID: \"60efc3c2-1541-4def-91d8-11fef3e401bd\") " pod="openstack/nova-api-0" Dec 03 07:02:54 crc kubenswrapper[4475]: I1203 07:02:54.227826 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/60efc3c2-1541-4def-91d8-11fef3e401bd-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"60efc3c2-1541-4def-91d8-11fef3e401bd\") " pod="openstack/nova-api-0" Dec 03 07:02:54 crc kubenswrapper[4475]: I1203 07:02:54.227871 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/60efc3c2-1541-4def-91d8-11fef3e401bd-config-data\") pod \"nova-api-0\" (UID: \"60efc3c2-1541-4def-91d8-11fef3e401bd\") " pod="openstack/nova-api-0" Dec 03 07:02:54 crc kubenswrapper[4475]: I1203 07:02:54.227886 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/60efc3c2-1541-4def-91d8-11fef3e401bd-public-tls-certs\") pod \"nova-api-0\" (UID: \"60efc3c2-1541-4def-91d8-11fef3e401bd\") " pod="openstack/nova-api-0" Dec 03 07:02:54 crc kubenswrapper[4475]: I1203 07:02:54.227910 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jw96s\" (UniqueName: \"kubernetes.io/projected/60efc3c2-1541-4def-91d8-11fef3e401bd-kube-api-access-jw96s\") pod \"nova-api-0\" (UID: \"60efc3c2-1541-4def-91d8-11fef3e401bd\") " 
pod="openstack/nova-api-0" Dec 03 07:02:54 crc kubenswrapper[4475]: I1203 07:02:54.233335 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/60efc3c2-1541-4def-91d8-11fef3e401bd-internal-tls-certs\") pod \"nova-api-0\" (UID: \"60efc3c2-1541-4def-91d8-11fef3e401bd\") " pod="openstack/nova-api-0" Dec 03 07:02:54 crc kubenswrapper[4475]: I1203 07:02:54.234177 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/60efc3c2-1541-4def-91d8-11fef3e401bd-config-data\") pod \"nova-api-0\" (UID: \"60efc3c2-1541-4def-91d8-11fef3e401bd\") " pod="openstack/nova-api-0" Dec 03 07:02:54 crc kubenswrapper[4475]: I1203 07:02:54.234433 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/60efc3c2-1541-4def-91d8-11fef3e401bd-public-tls-certs\") pod \"nova-api-0\" (UID: \"60efc3c2-1541-4def-91d8-11fef3e401bd\") " pod="openstack/nova-api-0" Dec 03 07:02:54 crc kubenswrapper[4475]: I1203 07:02:54.234878 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/60efc3c2-1541-4def-91d8-11fef3e401bd-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"60efc3c2-1541-4def-91d8-11fef3e401bd\") " pod="openstack/nova-api-0" Dec 03 07:02:54 crc kubenswrapper[4475]: I1203 07:02:54.245815 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jw96s\" (UniqueName: \"kubernetes.io/projected/60efc3c2-1541-4def-91d8-11fef3e401bd-kube-api-access-jw96s\") pod \"nova-api-0\" (UID: \"60efc3c2-1541-4def-91d8-11fef3e401bd\") " pod="openstack/nova-api-0" Dec 03 07:02:54 crc kubenswrapper[4475]: I1203 07:02:54.414183 4475 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Dec 03 07:02:54 crc kubenswrapper[4475]: I1203 07:02:54.851131 4475 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Dec 03 07:02:54 crc kubenswrapper[4475]: W1203 07:02:54.863343 4475 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod60efc3c2_1541_4def_91d8_11fef3e401bd.slice/crio-fdc923247bb274d02a972667aab19038e3604e879ce047c04596b77502536deb WatchSource:0}: Error finding container fdc923247bb274d02a972667aab19038e3604e879ce047c04596b77502536deb: Status 404 returned error can't find the container with id fdc923247bb274d02a972667aab19038e3604e879ce047c04596b77502536deb Dec 03 07:02:55 crc kubenswrapper[4475]: I1203 07:02:55.053863 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"60efc3c2-1541-4def-91d8-11fef3e401bd","Type":"ContainerStarted","Data":"9034116b39a2c3aefd05c0486c5e13a90729aff3c5870afac0d3d1a3d06545ed"} Dec 03 07:02:55 crc kubenswrapper[4475]: I1203 07:02:55.054103 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"60efc3c2-1541-4def-91d8-11fef3e401bd","Type":"ContainerStarted","Data":"fdc923247bb274d02a972667aab19038e3604e879ce047c04596b77502536deb"} Dec 03 07:02:55 crc kubenswrapper[4475]: I1203 07:02:55.055478 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"7f520204-42e9-45b7-ad44-a659f8c42b74","Type":"ContainerStarted","Data":"90933beecd9c3bdd80aaa67b0eba27827b6c35709e716c04ee318d612502817b"} Dec 03 07:02:55 crc kubenswrapper[4475]: I1203 07:02:55.502897 4475 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="da67b3db-e99c-4011-8071-0488afd0ab23" path="/var/lib/kubelet/pods/da67b3db-e99c-4011-8071-0488afd0ab23/volumes" Dec 03 07:02:56 crc kubenswrapper[4475]: I1203 07:02:56.067872 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/ceilometer-0" event={"ID":"7f520204-42e9-45b7-ad44-a659f8c42b74","Type":"ContainerStarted","Data":"b0f8d3e9db4fc1b63a1161aa6cc89d63007a079832a2603ee5f796341f7bce09"} Dec 03 07:02:56 crc kubenswrapper[4475]: I1203 07:02:56.069746 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"60efc3c2-1541-4def-91d8-11fef3e401bd","Type":"ContainerStarted","Data":"ef74648e0ca0ec7989433f902f59f502f45bec6dcb1b89d9ac93660730f6d160"} Dec 03 07:02:56 crc kubenswrapper[4475]: I1203 07:02:56.090219 4475 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=2.090202643 podStartE2EDuration="2.090202643s" podCreationTimestamp="2025-12-03 07:02:54 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 07:02:56.084352378 +0000 UTC m=+1060.889250713" watchObservedRunningTime="2025-12-03 07:02:56.090202643 +0000 UTC m=+1060.895100976" Dec 03 07:02:57 crc kubenswrapper[4475]: I1203 07:02:57.078633 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"7f520204-42e9-45b7-ad44-a659f8c42b74","Type":"ContainerStarted","Data":"0d5870bd8992a9af4666f3e76017df6d3285ab3227e80f7b202f2cd8f3d6cc6b"} Dec 03 07:02:57 crc kubenswrapper[4475]: I1203 07:02:57.354007 4475 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-cell1-novncproxy-0" Dec 03 07:02:57 crc kubenswrapper[4475]: I1203 07:02:57.370254 4475 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-cell1-novncproxy-0" Dec 03 07:02:57 crc kubenswrapper[4475]: I1203 07:02:57.485584 4475 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-57bcd77bb7-6bvxl" Dec 03 07:02:57 crc kubenswrapper[4475]: I1203 07:02:57.556810 4475 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openstack/dnsmasq-dns-6c696b9c6c-wpmmb"] Dec 03 07:02:57 crc kubenswrapper[4475]: I1203 07:02:57.557038 4475 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-6c696b9c6c-wpmmb" podUID="06823879-29ce-4d3b-bd43-61c0891eaa99" containerName="dnsmasq-dns" containerID="cri-o://446df6778e12102cfd2e1ce1bc9b07e5bb3e2524193513bc246db9b28fa20019" gracePeriod=10 Dec 03 07:02:58 crc kubenswrapper[4475]: I1203 07:02:58.000796 4475 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6c696b9c6c-wpmmb" Dec 03 07:02:58 crc kubenswrapper[4475]: I1203 07:02:58.087337 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"7f520204-42e9-45b7-ad44-a659f8c42b74","Type":"ContainerStarted","Data":"0242fe01d1dcb20eac884a38656f690611f9f1d035f4ae6ebc981815e1047fc9"} Dec 03 07:02:58 crc kubenswrapper[4475]: I1203 07:02:58.088953 4475 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Dec 03 07:02:58 crc kubenswrapper[4475]: I1203 07:02:58.091606 4475 generic.go:334] "Generic (PLEG): container finished" podID="06823879-29ce-4d3b-bd43-61c0891eaa99" containerID="446df6778e12102cfd2e1ce1bc9b07e5bb3e2524193513bc246db9b28fa20019" exitCode=0 Dec 03 07:02:58 crc kubenswrapper[4475]: I1203 07:02:58.091856 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6c696b9c6c-wpmmb" event={"ID":"06823879-29ce-4d3b-bd43-61c0891eaa99","Type":"ContainerDied","Data":"446df6778e12102cfd2e1ce1bc9b07e5bb3e2524193513bc246db9b28fa20019"} Dec 03 07:02:58 crc kubenswrapper[4475]: I1203 07:02:58.091939 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6c696b9c6c-wpmmb" event={"ID":"06823879-29ce-4d3b-bd43-61c0891eaa99","Type":"ContainerDied","Data":"18d3683412f8546b6b2f323ae8f44a456c467a869e2333c80700c7568f56cade"} Dec 03 07:02:58 crc kubenswrapper[4475]: I1203 07:02:58.091969 
4475 scope.go:117] "RemoveContainer" containerID="446df6778e12102cfd2e1ce1bc9b07e5bb3e2524193513bc246db9b28fa20019" Dec 03 07:02:58 crc kubenswrapper[4475]: I1203 07:02:58.092743 4475 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6c696b9c6c-wpmmb" Dec 03 07:02:58 crc kubenswrapper[4475]: I1203 07:02:58.108432 4475 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=1.32761982 podStartE2EDuration="5.108409296s" podCreationTimestamp="2025-12-03 07:02:53 +0000 UTC" firstStartedPulling="2025-12-03 07:02:54.030254773 +0000 UTC m=+1058.835153107" lastFinishedPulling="2025-12-03 07:02:57.811044249 +0000 UTC m=+1062.615942583" observedRunningTime="2025-12-03 07:02:58.102940829 +0000 UTC m=+1062.907839164" watchObservedRunningTime="2025-12-03 07:02:58.108409296 +0000 UTC m=+1062.913307629" Dec 03 07:02:58 crc kubenswrapper[4475]: I1203 07:02:58.110350 4475 scope.go:117] "RemoveContainer" containerID="2d894f84e425ac28d916a64b73383e2bde66ae286d5963495998e9b37a054ff8" Dec 03 07:02:58 crc kubenswrapper[4475]: I1203 07:02:58.122966 4475 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell1-novncproxy-0" Dec 03 07:02:58 crc kubenswrapper[4475]: I1203 07:02:58.124712 4475 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/06823879-29ce-4d3b-bd43-61c0891eaa99-config\") pod \"06823879-29ce-4d3b-bd43-61c0891eaa99\" (UID: \"06823879-29ce-4d3b-bd43-61c0891eaa99\") " Dec 03 07:02:58 crc kubenswrapper[4475]: I1203 07:02:58.124817 4475 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-96r24\" (UniqueName: \"kubernetes.io/projected/06823879-29ce-4d3b-bd43-61c0891eaa99-kube-api-access-96r24\") pod \"06823879-29ce-4d3b-bd43-61c0891eaa99\" (UID: \"06823879-29ce-4d3b-bd43-61c0891eaa99\") " Dec 03 07:02:58 crc 
kubenswrapper[4475]: I1203 07:02:58.124855 4475 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/06823879-29ce-4d3b-bd43-61c0891eaa99-dns-swift-storage-0\") pod \"06823879-29ce-4d3b-bd43-61c0891eaa99\" (UID: \"06823879-29ce-4d3b-bd43-61c0891eaa99\") " Dec 03 07:02:58 crc kubenswrapper[4475]: I1203 07:02:58.124952 4475 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/06823879-29ce-4d3b-bd43-61c0891eaa99-dns-svc\") pod \"06823879-29ce-4d3b-bd43-61c0891eaa99\" (UID: \"06823879-29ce-4d3b-bd43-61c0891eaa99\") " Dec 03 07:02:58 crc kubenswrapper[4475]: I1203 07:02:58.125064 4475 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/06823879-29ce-4d3b-bd43-61c0891eaa99-ovsdbserver-sb\") pod \"06823879-29ce-4d3b-bd43-61c0891eaa99\" (UID: \"06823879-29ce-4d3b-bd43-61c0891eaa99\") " Dec 03 07:02:58 crc kubenswrapper[4475]: I1203 07:02:58.125287 4475 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/06823879-29ce-4d3b-bd43-61c0891eaa99-ovsdbserver-nb\") pod \"06823879-29ce-4d3b-bd43-61c0891eaa99\" (UID: \"06823879-29ce-4d3b-bd43-61c0891eaa99\") " Dec 03 07:02:58 crc kubenswrapper[4475]: I1203 07:02:58.133446 4475 scope.go:117] "RemoveContainer" containerID="446df6778e12102cfd2e1ce1bc9b07e5bb3e2524193513bc246db9b28fa20019" Dec 03 07:02:58 crc kubenswrapper[4475]: E1203 07:02:58.134152 4475 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"446df6778e12102cfd2e1ce1bc9b07e5bb3e2524193513bc246db9b28fa20019\": container with ID starting with 446df6778e12102cfd2e1ce1bc9b07e5bb3e2524193513bc246db9b28fa20019 not found: ID does not exist" 
containerID="446df6778e12102cfd2e1ce1bc9b07e5bb3e2524193513bc246db9b28fa20019" Dec 03 07:02:58 crc kubenswrapper[4475]: I1203 07:02:58.134183 4475 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"446df6778e12102cfd2e1ce1bc9b07e5bb3e2524193513bc246db9b28fa20019"} err="failed to get container status \"446df6778e12102cfd2e1ce1bc9b07e5bb3e2524193513bc246db9b28fa20019\": rpc error: code = NotFound desc = could not find container \"446df6778e12102cfd2e1ce1bc9b07e5bb3e2524193513bc246db9b28fa20019\": container with ID starting with 446df6778e12102cfd2e1ce1bc9b07e5bb3e2524193513bc246db9b28fa20019 not found: ID does not exist" Dec 03 07:02:58 crc kubenswrapper[4475]: I1203 07:02:58.134204 4475 scope.go:117] "RemoveContainer" containerID="2d894f84e425ac28d916a64b73383e2bde66ae286d5963495998e9b37a054ff8" Dec 03 07:02:58 crc kubenswrapper[4475]: E1203 07:02:58.135139 4475 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2d894f84e425ac28d916a64b73383e2bde66ae286d5963495998e9b37a054ff8\": container with ID starting with 2d894f84e425ac28d916a64b73383e2bde66ae286d5963495998e9b37a054ff8 not found: ID does not exist" containerID="2d894f84e425ac28d916a64b73383e2bde66ae286d5963495998e9b37a054ff8" Dec 03 07:02:58 crc kubenswrapper[4475]: I1203 07:02:58.135156 4475 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2d894f84e425ac28d916a64b73383e2bde66ae286d5963495998e9b37a054ff8"} err="failed to get container status \"2d894f84e425ac28d916a64b73383e2bde66ae286d5963495998e9b37a054ff8\": rpc error: code = NotFound desc = could not find container \"2d894f84e425ac28d916a64b73383e2bde66ae286d5963495998e9b37a054ff8\": container with ID starting with 2d894f84e425ac28d916a64b73383e2bde66ae286d5963495998e9b37a054ff8 not found: ID does not exist" Dec 03 07:02:58 crc kubenswrapper[4475]: I1203 07:02:58.135845 4475 
operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/06823879-29ce-4d3b-bd43-61c0891eaa99-kube-api-access-96r24" (OuterVolumeSpecName: "kube-api-access-96r24") pod "06823879-29ce-4d3b-bd43-61c0891eaa99" (UID: "06823879-29ce-4d3b-bd43-61c0891eaa99"). InnerVolumeSpecName "kube-api-access-96r24". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 07:02:58 crc kubenswrapper[4475]: I1203 07:02:58.224585 4475 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/06823879-29ce-4d3b-bd43-61c0891eaa99-config" (OuterVolumeSpecName: "config") pod "06823879-29ce-4d3b-bd43-61c0891eaa99" (UID: "06823879-29ce-4d3b-bd43-61c0891eaa99"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 07:02:58 crc kubenswrapper[4475]: I1203 07:02:58.227907 4475 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-96r24\" (UniqueName: \"kubernetes.io/projected/06823879-29ce-4d3b-bd43-61c0891eaa99-kube-api-access-96r24\") on node \"crc\" DevicePath \"\"" Dec 03 07:02:58 crc kubenswrapper[4475]: I1203 07:02:58.227934 4475 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/06823879-29ce-4d3b-bd43-61c0891eaa99-config\") on node \"crc\" DevicePath \"\"" Dec 03 07:02:58 crc kubenswrapper[4475]: I1203 07:02:58.234600 4475 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/06823879-29ce-4d3b-bd43-61c0891eaa99-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "06823879-29ce-4d3b-bd43-61c0891eaa99" (UID: "06823879-29ce-4d3b-bd43-61c0891eaa99"). InnerVolumeSpecName "ovsdbserver-nb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 07:02:58 crc kubenswrapper[4475]: I1203 07:02:58.238827 4475 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/06823879-29ce-4d3b-bd43-61c0891eaa99-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "06823879-29ce-4d3b-bd43-61c0891eaa99" (UID: "06823879-29ce-4d3b-bd43-61c0891eaa99"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 07:02:58 crc kubenswrapper[4475]: I1203 07:02:58.245480 4475 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/06823879-29ce-4d3b-bd43-61c0891eaa99-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "06823879-29ce-4d3b-bd43-61c0891eaa99" (UID: "06823879-29ce-4d3b-bd43-61c0891eaa99"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 07:02:58 crc kubenswrapper[4475]: I1203 07:02:58.246933 4475 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/06823879-29ce-4d3b-bd43-61c0891eaa99-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "06823879-29ce-4d3b-bd43-61c0891eaa99" (UID: "06823879-29ce-4d3b-bd43-61c0891eaa99"). InnerVolumeSpecName "ovsdbserver-sb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 07:02:58 crc kubenswrapper[4475]: I1203 07:02:58.311891 4475 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-cell-mapping-24dq8"] Dec 03 07:02:58 crc kubenswrapper[4475]: E1203 07:02:58.312544 4475 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="06823879-29ce-4d3b-bd43-61c0891eaa99" containerName="init" Dec 03 07:02:58 crc kubenswrapper[4475]: I1203 07:02:58.312563 4475 state_mem.go:107] "Deleted CPUSet assignment" podUID="06823879-29ce-4d3b-bd43-61c0891eaa99" containerName="init" Dec 03 07:02:58 crc kubenswrapper[4475]: E1203 07:02:58.312583 4475 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="06823879-29ce-4d3b-bd43-61c0891eaa99" containerName="dnsmasq-dns" Dec 03 07:02:58 crc kubenswrapper[4475]: I1203 07:02:58.312589 4475 state_mem.go:107] "Deleted CPUSet assignment" podUID="06823879-29ce-4d3b-bd43-61c0891eaa99" containerName="dnsmasq-dns" Dec 03 07:02:58 crc kubenswrapper[4475]: I1203 07:02:58.312801 4475 memory_manager.go:354] "RemoveStaleState removing state" podUID="06823879-29ce-4d3b-bd43-61c0891eaa99" containerName="dnsmasq-dns" Dec 03 07:02:58 crc kubenswrapper[4475]: I1203 07:02:58.313632 4475 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-cell-mapping-24dq8" Dec 03 07:02:58 crc kubenswrapper[4475]: I1203 07:02:58.317759 4475 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-manage-config-data" Dec 03 07:02:58 crc kubenswrapper[4475]: I1203 07:02:58.318254 4475 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-manage-scripts" Dec 03 07:02:58 crc kubenswrapper[4475]: I1203 07:02:58.326701 4475 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-cell-mapping-24dq8"] Dec 03 07:02:58 crc kubenswrapper[4475]: I1203 07:02:58.331072 4475 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/06823879-29ce-4d3b-bd43-61c0891eaa99-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Dec 03 07:02:58 crc kubenswrapper[4475]: I1203 07:02:58.331098 4475 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/06823879-29ce-4d3b-bd43-61c0891eaa99-dns-svc\") on node \"crc\" DevicePath \"\"" Dec 03 07:02:58 crc kubenswrapper[4475]: I1203 07:02:58.331121 4475 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/06823879-29ce-4d3b-bd43-61c0891eaa99-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Dec 03 07:02:58 crc kubenswrapper[4475]: I1203 07:02:58.331135 4475 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/06823879-29ce-4d3b-bd43-61c0891eaa99-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Dec 03 07:02:58 crc kubenswrapper[4475]: I1203 07:02:58.430379 4475 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6c696b9c6c-wpmmb"] Dec 03 07:02:58 crc kubenswrapper[4475]: I1203 07:02:58.432833 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"kube-api-access-xhszf\" (UniqueName: \"kubernetes.io/projected/bb4749d8-e5e2-488a-a007-028e214d6d95-kube-api-access-xhszf\") pod \"nova-cell1-cell-mapping-24dq8\" (UID: \"bb4749d8-e5e2-488a-a007-028e214d6d95\") " pod="openstack/nova-cell1-cell-mapping-24dq8" Dec 03 07:02:58 crc kubenswrapper[4475]: I1203 07:02:58.433374 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/bb4749d8-e5e2-488a-a007-028e214d6d95-scripts\") pod \"nova-cell1-cell-mapping-24dq8\" (UID: \"bb4749d8-e5e2-488a-a007-028e214d6d95\") " pod="openstack/nova-cell1-cell-mapping-24dq8" Dec 03 07:02:58 crc kubenswrapper[4475]: I1203 07:02:58.433463 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bb4749d8-e5e2-488a-a007-028e214d6d95-combined-ca-bundle\") pod \"nova-cell1-cell-mapping-24dq8\" (UID: \"bb4749d8-e5e2-488a-a007-028e214d6d95\") " pod="openstack/nova-cell1-cell-mapping-24dq8" Dec 03 07:02:58 crc kubenswrapper[4475]: I1203 07:02:58.433497 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bb4749d8-e5e2-488a-a007-028e214d6d95-config-data\") pod \"nova-cell1-cell-mapping-24dq8\" (UID: \"bb4749d8-e5e2-488a-a007-028e214d6d95\") " pod="openstack/nova-cell1-cell-mapping-24dq8" Dec 03 07:02:58 crc kubenswrapper[4475]: I1203 07:02:58.439714 4475 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-6c696b9c6c-wpmmb"] Dec 03 07:02:58 crc kubenswrapper[4475]: I1203 07:02:58.535939 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bb4749d8-e5e2-488a-a007-028e214d6d95-combined-ca-bundle\") pod \"nova-cell1-cell-mapping-24dq8\" (UID: \"bb4749d8-e5e2-488a-a007-028e214d6d95\") " 
pod="openstack/nova-cell1-cell-mapping-24dq8" Dec 03 07:02:58 crc kubenswrapper[4475]: I1203 07:02:58.536445 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bb4749d8-e5e2-488a-a007-028e214d6d95-config-data\") pod \"nova-cell1-cell-mapping-24dq8\" (UID: \"bb4749d8-e5e2-488a-a007-028e214d6d95\") " pod="openstack/nova-cell1-cell-mapping-24dq8" Dec 03 07:02:58 crc kubenswrapper[4475]: I1203 07:02:58.536611 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xhszf\" (UniqueName: \"kubernetes.io/projected/bb4749d8-e5e2-488a-a007-028e214d6d95-kube-api-access-xhszf\") pod \"nova-cell1-cell-mapping-24dq8\" (UID: \"bb4749d8-e5e2-488a-a007-028e214d6d95\") " pod="openstack/nova-cell1-cell-mapping-24dq8" Dec 03 07:02:58 crc kubenswrapper[4475]: I1203 07:02:58.536754 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/bb4749d8-e5e2-488a-a007-028e214d6d95-scripts\") pod \"nova-cell1-cell-mapping-24dq8\" (UID: \"bb4749d8-e5e2-488a-a007-028e214d6d95\") " pod="openstack/nova-cell1-cell-mapping-24dq8" Dec 03 07:02:58 crc kubenswrapper[4475]: I1203 07:02:58.540588 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/bb4749d8-e5e2-488a-a007-028e214d6d95-scripts\") pod \"nova-cell1-cell-mapping-24dq8\" (UID: \"bb4749d8-e5e2-488a-a007-028e214d6d95\") " pod="openstack/nova-cell1-cell-mapping-24dq8" Dec 03 07:02:58 crc kubenswrapper[4475]: I1203 07:02:58.541213 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bb4749d8-e5e2-488a-a007-028e214d6d95-combined-ca-bundle\") pod \"nova-cell1-cell-mapping-24dq8\" (UID: \"bb4749d8-e5e2-488a-a007-028e214d6d95\") " pod="openstack/nova-cell1-cell-mapping-24dq8" Dec 03 07:02:58 crc 
kubenswrapper[4475]: I1203 07:02:58.541639 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bb4749d8-e5e2-488a-a007-028e214d6d95-config-data\") pod \"nova-cell1-cell-mapping-24dq8\" (UID: \"bb4749d8-e5e2-488a-a007-028e214d6d95\") " pod="openstack/nova-cell1-cell-mapping-24dq8" Dec 03 07:02:58 crc kubenswrapper[4475]: I1203 07:02:58.552312 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xhszf\" (UniqueName: \"kubernetes.io/projected/bb4749d8-e5e2-488a-a007-028e214d6d95-kube-api-access-xhszf\") pod \"nova-cell1-cell-mapping-24dq8\" (UID: \"bb4749d8-e5e2-488a-a007-028e214d6d95\") " pod="openstack/nova-cell1-cell-mapping-24dq8" Dec 03 07:02:58 crc kubenswrapper[4475]: I1203 07:02:58.637087 4475 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-cell-mapping-24dq8" Dec 03 07:02:58 crc kubenswrapper[4475]: I1203 07:02:58.947623 4475 patch_prober.go:28] interesting pod/machine-config-daemon-tjbzg container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 03 07:02:58 crc kubenswrapper[4475]: I1203 07:02:58.947951 4475 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-tjbzg" podUID="91aee7be-4a52-4598-803f-2deebe0674de" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 03 07:02:58 crc kubenswrapper[4475]: I1203 07:02:58.948015 4475 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-tjbzg" Dec 03 07:02:58 crc kubenswrapper[4475]: I1203 07:02:58.948649 4475 kuberuntime_manager.go:1027] "Message 
for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"001eb8a40dd541fdfa62c93940e55ef947928ce582f2778a9f17df66253e35b4"} pod="openshift-machine-config-operator/machine-config-daemon-tjbzg" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Dec 03 07:02:58 crc kubenswrapper[4475]: I1203 07:02:58.948734 4475 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-tjbzg" podUID="91aee7be-4a52-4598-803f-2deebe0674de" containerName="machine-config-daemon" containerID="cri-o://001eb8a40dd541fdfa62c93940e55ef947928ce582f2778a9f17df66253e35b4" gracePeriod=600 Dec 03 07:02:59 crc kubenswrapper[4475]: I1203 07:02:59.058946 4475 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-cell-mapping-24dq8"] Dec 03 07:02:59 crc kubenswrapper[4475]: I1203 07:02:59.156147 4475 generic.go:334] "Generic (PLEG): container finished" podID="91aee7be-4a52-4598-803f-2deebe0674de" containerID="001eb8a40dd541fdfa62c93940e55ef947928ce582f2778a9f17df66253e35b4" exitCode=0 Dec 03 07:02:59 crc kubenswrapper[4475]: I1203 07:02:59.156190 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-tjbzg" event={"ID":"91aee7be-4a52-4598-803f-2deebe0674de","Type":"ContainerDied","Data":"001eb8a40dd541fdfa62c93940e55ef947928ce582f2778a9f17df66253e35b4"} Dec 03 07:02:59 crc kubenswrapper[4475]: I1203 07:02:59.157377 4475 scope.go:117] "RemoveContainer" containerID="9e442459db76920abc97188abe20663d3f8869ff7e3f567458064e516a3ad52c" Dec 03 07:02:59 crc kubenswrapper[4475]: I1203 07:02:59.501438 4475 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="06823879-29ce-4d3b-bd43-61c0891eaa99" path="/var/lib/kubelet/pods/06823879-29ce-4d3b-bd43-61c0891eaa99/volumes" Dec 03 07:03:00 crc kubenswrapper[4475]: I1203 07:03:00.178478 4475 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-24dq8" event={"ID":"bb4749d8-e5e2-488a-a007-028e214d6d95","Type":"ContainerStarted","Data":"51c1ad9b86c78f206c8c5b233b107533a9b98c853d733106cb113c9c4dd81e1e"} Dec 03 07:03:00 crc kubenswrapper[4475]: I1203 07:03:00.178905 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-24dq8" event={"ID":"bb4749d8-e5e2-488a-a007-028e214d6d95","Type":"ContainerStarted","Data":"2d66015e93b50beb5fb3bb65f19d5f01d24f1e947292df6db9f73bfeca5f4f60"} Dec 03 07:03:00 crc kubenswrapper[4475]: I1203 07:03:00.184364 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-tjbzg" event={"ID":"91aee7be-4a52-4598-803f-2deebe0674de","Type":"ContainerStarted","Data":"1464654d0e3e46198ab49244bf31d5b6b7a77079e850cf4c97ff1472e570dfc1"} Dec 03 07:03:00 crc kubenswrapper[4475]: I1203 07:03:00.198193 4475 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-cell-mapping-24dq8" podStartSLOduration=2.198174534 podStartE2EDuration="2.198174534s" podCreationTimestamp="2025-12-03 07:02:58 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 07:03:00.190944434 +0000 UTC m=+1064.995842767" watchObservedRunningTime="2025-12-03 07:03:00.198174534 +0000 UTC m=+1065.003072868" Dec 03 07:03:03 crc kubenswrapper[4475]: E1203 07:03:03.109098 4475 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode6e2659c_bdb8_4015_b749_9d1bfd70620f.slice/crio-f38adfa30fd4ebb78cee694486062d19d33d13cf2c50659bb9a0ff66b6d44c52\": RecentStats: unable to find data in memory cache]" Dec 03 07:03:03 crc kubenswrapper[4475]: I1203 07:03:03.209742 4475 generic.go:334] "Generic (PLEG): container finished" 
podID="bb4749d8-e5e2-488a-a007-028e214d6d95" containerID="51c1ad9b86c78f206c8c5b233b107533a9b98c853d733106cb113c9c4dd81e1e" exitCode=0 Dec 03 07:03:03 crc kubenswrapper[4475]: I1203 07:03:03.209834 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-24dq8" event={"ID":"bb4749d8-e5e2-488a-a007-028e214d6d95","Type":"ContainerDied","Data":"51c1ad9b86c78f206c8c5b233b107533a9b98c853d733106cb113c9c4dd81e1e"} Dec 03 07:03:04 crc kubenswrapper[4475]: I1203 07:03:04.415262 4475 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Dec 03 07:03:04 crc kubenswrapper[4475]: I1203 07:03:04.415324 4475 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Dec 03 07:03:04 crc kubenswrapper[4475]: I1203 07:03:04.485571 4475 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-cell-mapping-24dq8" Dec 03 07:03:04 crc kubenswrapper[4475]: I1203 07:03:04.570968 4475 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bb4749d8-e5e2-488a-a007-028e214d6d95-config-data\") pod \"bb4749d8-e5e2-488a-a007-028e214d6d95\" (UID: \"bb4749d8-e5e2-488a-a007-028e214d6d95\") " Dec 03 07:03:04 crc kubenswrapper[4475]: I1203 07:03:04.571127 4475 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bb4749d8-e5e2-488a-a007-028e214d6d95-combined-ca-bundle\") pod \"bb4749d8-e5e2-488a-a007-028e214d6d95\" (UID: \"bb4749d8-e5e2-488a-a007-028e214d6d95\") " Dec 03 07:03:04 crc kubenswrapper[4475]: I1203 07:03:04.571159 4475 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xhszf\" (UniqueName: \"kubernetes.io/projected/bb4749d8-e5e2-488a-a007-028e214d6d95-kube-api-access-xhszf\") pod \"bb4749d8-e5e2-488a-a007-028e214d6d95\" (UID: 
\"bb4749d8-e5e2-488a-a007-028e214d6d95\") " Dec 03 07:03:04 crc kubenswrapper[4475]: I1203 07:03:04.571186 4475 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/bb4749d8-e5e2-488a-a007-028e214d6d95-scripts\") pod \"bb4749d8-e5e2-488a-a007-028e214d6d95\" (UID: \"bb4749d8-e5e2-488a-a007-028e214d6d95\") " Dec 03 07:03:04 crc kubenswrapper[4475]: I1203 07:03:04.592421 4475 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bb4749d8-e5e2-488a-a007-028e214d6d95-kube-api-access-xhszf" (OuterVolumeSpecName: "kube-api-access-xhszf") pod "bb4749d8-e5e2-488a-a007-028e214d6d95" (UID: "bb4749d8-e5e2-488a-a007-028e214d6d95"). InnerVolumeSpecName "kube-api-access-xhszf". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 07:03:04 crc kubenswrapper[4475]: I1203 07:03:04.596419 4475 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bb4749d8-e5e2-488a-a007-028e214d6d95-scripts" (OuterVolumeSpecName: "scripts") pod "bb4749d8-e5e2-488a-a007-028e214d6d95" (UID: "bb4749d8-e5e2-488a-a007-028e214d6d95"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 07:03:04 crc kubenswrapper[4475]: I1203 07:03:04.600548 4475 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bb4749d8-e5e2-488a-a007-028e214d6d95-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "bb4749d8-e5e2-488a-a007-028e214d6d95" (UID: "bb4749d8-e5e2-488a-a007-028e214d6d95"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 07:03:04 crc kubenswrapper[4475]: I1203 07:03:04.612538 4475 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bb4749d8-e5e2-488a-a007-028e214d6d95-config-data" (OuterVolumeSpecName: "config-data") pod "bb4749d8-e5e2-488a-a007-028e214d6d95" (UID: "bb4749d8-e5e2-488a-a007-028e214d6d95"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 07:03:04 crc kubenswrapper[4475]: I1203 07:03:04.674227 4475 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xhszf\" (UniqueName: \"kubernetes.io/projected/bb4749d8-e5e2-488a-a007-028e214d6d95-kube-api-access-xhszf\") on node \"crc\" DevicePath \"\"" Dec 03 07:03:04 crc kubenswrapper[4475]: I1203 07:03:04.674257 4475 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/bb4749d8-e5e2-488a-a007-028e214d6d95-scripts\") on node \"crc\" DevicePath \"\"" Dec 03 07:03:04 crc kubenswrapper[4475]: I1203 07:03:04.674267 4475 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bb4749d8-e5e2-488a-a007-028e214d6d95-config-data\") on node \"crc\" DevicePath \"\"" Dec 03 07:03:04 crc kubenswrapper[4475]: I1203 07:03:04.674276 4475 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bb4749d8-e5e2-488a-a007-028e214d6d95-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 03 07:03:05 crc kubenswrapper[4475]: I1203 07:03:05.225921 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-24dq8" event={"ID":"bb4749d8-e5e2-488a-a007-028e214d6d95","Type":"ContainerDied","Data":"2d66015e93b50beb5fb3bb65f19d5f01d24f1e947292df6db9f73bfeca5f4f60"} Dec 03 07:03:05 crc kubenswrapper[4475]: I1203 07:03:05.226093 4475 pod_container_deletor.go:80] "Container not found in pod's containers" 
containerID="2d66015e93b50beb5fb3bb65f19d5f01d24f1e947292df6db9f73bfeca5f4f60" Dec 03 07:03:05 crc kubenswrapper[4475]: I1203 07:03:05.226166 4475 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-cell-mapping-24dq8" Dec 03 07:03:05 crc kubenswrapper[4475]: I1203 07:03:05.389599 4475 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Dec 03 07:03:05 crc kubenswrapper[4475]: I1203 07:03:05.389780 4475 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="60efc3c2-1541-4def-91d8-11fef3e401bd" containerName="nova-api-log" containerID="cri-o://9034116b39a2c3aefd05c0486c5e13a90729aff3c5870afac0d3d1a3d06545ed" gracePeriod=30 Dec 03 07:03:05 crc kubenswrapper[4475]: I1203 07:03:05.390050 4475 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="60efc3c2-1541-4def-91d8-11fef3e401bd" containerName="nova-api-api" containerID="cri-o://ef74648e0ca0ec7989433f902f59f502f45bec6dcb1b89d9ac93660730f6d160" gracePeriod=30 Dec 03 07:03:05 crc kubenswrapper[4475]: I1203 07:03:05.395315 4475 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="60efc3c2-1541-4def-91d8-11fef3e401bd" containerName="nova-api-log" probeResult="failure" output="Get \"https://10.217.0.214:8774/\": EOF" Dec 03 07:03:05 crc kubenswrapper[4475]: I1203 07:03:05.396325 4475 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="60efc3c2-1541-4def-91d8-11fef3e401bd" containerName="nova-api-api" probeResult="failure" output="Get \"https://10.217.0.214:8774/\": EOF" Dec 03 07:03:05 crc kubenswrapper[4475]: I1203 07:03:05.417417 4475 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Dec 03 07:03:05 crc kubenswrapper[4475]: I1203 07:03:05.417794 4475 kuberuntime_container.go:808] "Killing container with a grace period" 
pod="openstack/nova-metadata-0" podUID="b52f22c6-66a3-4f58-aff9-40feb4dcbf8f" containerName="nova-metadata-log" containerID="cri-o://6a47bc6f5f3d3c78c59650aff54b81c8cfbef488f4647b689f43016a9cb851c5" gracePeriod=30 Dec 03 07:03:05 crc kubenswrapper[4475]: I1203 07:03:05.417959 4475 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="b52f22c6-66a3-4f58-aff9-40feb4dcbf8f" containerName="nova-metadata-metadata" containerID="cri-o://d8623dc920e9f565014e0552a5750cd818f349aa0e0efcf34cd5682da8d7fa09" gracePeriod=30 Dec 03 07:03:05 crc kubenswrapper[4475]: I1203 07:03:05.428152 4475 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Dec 03 07:03:05 crc kubenswrapper[4475]: I1203 07:03:05.428421 4475 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-scheduler-0" podUID="daf693e8-1494-4da1-afdd-2cd6dbef665d" containerName="nova-scheduler-scheduler" containerID="cri-o://72b4345cea1c1e15e8d6caa3308a752f06bbc7dfcac445cc393a33540724d882" gracePeriod=30 Dec 03 07:03:06 crc kubenswrapper[4475]: I1203 07:03:06.251657 4475 generic.go:334] "Generic (PLEG): container finished" podID="60efc3c2-1541-4def-91d8-11fef3e401bd" containerID="9034116b39a2c3aefd05c0486c5e13a90729aff3c5870afac0d3d1a3d06545ed" exitCode=143 Dec 03 07:03:06 crc kubenswrapper[4475]: I1203 07:03:06.252203 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"60efc3c2-1541-4def-91d8-11fef3e401bd","Type":"ContainerDied","Data":"9034116b39a2c3aefd05c0486c5e13a90729aff3c5870afac0d3d1a3d06545ed"} Dec 03 07:03:06 crc kubenswrapper[4475]: I1203 07:03:06.259069 4475 generic.go:334] "Generic (PLEG): container finished" podID="b52f22c6-66a3-4f58-aff9-40feb4dcbf8f" containerID="6a47bc6f5f3d3c78c59650aff54b81c8cfbef488f4647b689f43016a9cb851c5" exitCode=143 Dec 03 07:03:06 crc kubenswrapper[4475]: I1203 07:03:06.259122 4475 kubelet.go:2453] "SyncLoop (PLEG): 
event for pod" pod="openstack/nova-metadata-0" event={"ID":"b52f22c6-66a3-4f58-aff9-40feb4dcbf8f","Type":"ContainerDied","Data":"6a47bc6f5f3d3c78c59650aff54b81c8cfbef488f4647b689f43016a9cb851c5"} Dec 03 07:03:08 crc kubenswrapper[4475]: I1203 07:03:08.547035 4475 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/nova-metadata-0" podUID="b52f22c6-66a3-4f58-aff9-40feb4dcbf8f" containerName="nova-metadata-metadata" probeResult="failure" output="Get \"https://10.217.0.207:8775/\": read tcp 10.217.0.2:58804->10.217.0.207:8775: read: connection reset by peer" Dec 03 07:03:08 crc kubenswrapper[4475]: I1203 07:03:08.547067 4475 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/nova-metadata-0" podUID="b52f22c6-66a3-4f58-aff9-40feb4dcbf8f" containerName="nova-metadata-log" probeResult="failure" output="Get \"https://10.217.0.207:8775/\": read tcp 10.217.0.2:58806->10.217.0.207:8775: read: connection reset by peer" Dec 03 07:03:08 crc kubenswrapper[4475]: I1203 07:03:08.981732 4475 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Dec 03 07:03:09 crc kubenswrapper[4475]: I1203 07:03:09.146216 4475 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/b52f22c6-66a3-4f58-aff9-40feb4dcbf8f-nova-metadata-tls-certs\") pod \"b52f22c6-66a3-4f58-aff9-40feb4dcbf8f\" (UID: \"b52f22c6-66a3-4f58-aff9-40feb4dcbf8f\") " Dec 03 07:03:09 crc kubenswrapper[4475]: I1203 07:03:09.146534 4475 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b52f22c6-66a3-4f58-aff9-40feb4dcbf8f-logs\") pod \"b52f22c6-66a3-4f58-aff9-40feb4dcbf8f\" (UID: \"b52f22c6-66a3-4f58-aff9-40feb4dcbf8f\") " Dec 03 07:03:09 crc kubenswrapper[4475]: I1203 07:03:09.146567 4475 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d7nnd\" (UniqueName: \"kubernetes.io/projected/b52f22c6-66a3-4f58-aff9-40feb4dcbf8f-kube-api-access-d7nnd\") pod \"b52f22c6-66a3-4f58-aff9-40feb4dcbf8f\" (UID: \"b52f22c6-66a3-4f58-aff9-40feb4dcbf8f\") " Dec 03 07:03:09 crc kubenswrapper[4475]: I1203 07:03:09.146650 4475 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b52f22c6-66a3-4f58-aff9-40feb4dcbf8f-combined-ca-bundle\") pod \"b52f22c6-66a3-4f58-aff9-40feb4dcbf8f\" (UID: \"b52f22c6-66a3-4f58-aff9-40feb4dcbf8f\") " Dec 03 07:03:09 crc kubenswrapper[4475]: I1203 07:03:09.146696 4475 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b52f22c6-66a3-4f58-aff9-40feb4dcbf8f-config-data\") pod \"b52f22c6-66a3-4f58-aff9-40feb4dcbf8f\" (UID: \"b52f22c6-66a3-4f58-aff9-40feb4dcbf8f\") " Dec 03 07:03:09 crc kubenswrapper[4475]: I1203 07:03:09.148914 4475 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/empty-dir/b52f22c6-66a3-4f58-aff9-40feb4dcbf8f-logs" (OuterVolumeSpecName: "logs") pod "b52f22c6-66a3-4f58-aff9-40feb4dcbf8f" (UID: "b52f22c6-66a3-4f58-aff9-40feb4dcbf8f"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 07:03:09 crc kubenswrapper[4475]: I1203 07:03:09.161688 4475 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b52f22c6-66a3-4f58-aff9-40feb4dcbf8f-kube-api-access-d7nnd" (OuterVolumeSpecName: "kube-api-access-d7nnd") pod "b52f22c6-66a3-4f58-aff9-40feb4dcbf8f" (UID: "b52f22c6-66a3-4f58-aff9-40feb4dcbf8f"). InnerVolumeSpecName "kube-api-access-d7nnd". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 07:03:09 crc kubenswrapper[4475]: I1203 07:03:09.169779 4475 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Dec 03 07:03:09 crc kubenswrapper[4475]: I1203 07:03:09.177303 4475 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b52f22c6-66a3-4f58-aff9-40feb4dcbf8f-config-data" (OuterVolumeSpecName: "config-data") pod "b52f22c6-66a3-4f58-aff9-40feb4dcbf8f" (UID: "b52f22c6-66a3-4f58-aff9-40feb4dcbf8f"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 07:03:09 crc kubenswrapper[4475]: I1203 07:03:09.206328 4475 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b52f22c6-66a3-4f58-aff9-40feb4dcbf8f-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "b52f22c6-66a3-4f58-aff9-40feb4dcbf8f" (UID: "b52f22c6-66a3-4f58-aff9-40feb4dcbf8f"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 07:03:09 crc kubenswrapper[4475]: I1203 07:03:09.216871 4475 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b52f22c6-66a3-4f58-aff9-40feb4dcbf8f-nova-metadata-tls-certs" (OuterVolumeSpecName: "nova-metadata-tls-certs") pod "b52f22c6-66a3-4f58-aff9-40feb4dcbf8f" (UID: "b52f22c6-66a3-4f58-aff9-40feb4dcbf8f"). InnerVolumeSpecName "nova-metadata-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 07:03:09 crc kubenswrapper[4475]: I1203 07:03:09.249382 4475 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b52f22c6-66a3-4f58-aff9-40feb4dcbf8f-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 03 07:03:09 crc kubenswrapper[4475]: I1203 07:03:09.249408 4475 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b52f22c6-66a3-4f58-aff9-40feb4dcbf8f-config-data\") on node \"crc\" DevicePath \"\"" Dec 03 07:03:09 crc kubenswrapper[4475]: I1203 07:03:09.249417 4475 reconciler_common.go:293] "Volume detached for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/b52f22c6-66a3-4f58-aff9-40feb4dcbf8f-nova-metadata-tls-certs\") on node \"crc\" DevicePath \"\"" Dec 03 07:03:09 crc kubenswrapper[4475]: I1203 07:03:09.249434 4475 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b52f22c6-66a3-4f58-aff9-40feb4dcbf8f-logs\") on node \"crc\" DevicePath \"\"" Dec 03 07:03:09 crc kubenswrapper[4475]: I1203 07:03:09.249441 4475 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d7nnd\" (UniqueName: \"kubernetes.io/projected/b52f22c6-66a3-4f58-aff9-40feb4dcbf8f-kube-api-access-d7nnd\") on node \"crc\" DevicePath \"\"" Dec 03 07:03:09 crc kubenswrapper[4475]: I1203 07:03:09.280502 4475 generic.go:334] "Generic (PLEG): container finished" 
podID="b52f22c6-66a3-4f58-aff9-40feb4dcbf8f" containerID="d8623dc920e9f565014e0552a5750cd818f349aa0e0efcf34cd5682da8d7fa09" exitCode=0 Dec 03 07:03:09 crc kubenswrapper[4475]: I1203 07:03:09.280551 4475 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Dec 03 07:03:09 crc kubenswrapper[4475]: I1203 07:03:09.280569 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"b52f22c6-66a3-4f58-aff9-40feb4dcbf8f","Type":"ContainerDied","Data":"d8623dc920e9f565014e0552a5750cd818f349aa0e0efcf34cd5682da8d7fa09"} Dec 03 07:03:09 crc kubenswrapper[4475]: I1203 07:03:09.280596 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"b52f22c6-66a3-4f58-aff9-40feb4dcbf8f","Type":"ContainerDied","Data":"679a6d2d5e89a9c9a181117fe8aed189e021c9f36944c08c1de69710d2b6852e"} Dec 03 07:03:09 crc kubenswrapper[4475]: I1203 07:03:09.280612 4475 scope.go:117] "RemoveContainer" containerID="d8623dc920e9f565014e0552a5750cd818f349aa0e0efcf34cd5682da8d7fa09" Dec 03 07:03:09 crc kubenswrapper[4475]: I1203 07:03:09.286916 4475 generic.go:334] "Generic (PLEG): container finished" podID="daf693e8-1494-4da1-afdd-2cd6dbef665d" containerID="72b4345cea1c1e15e8d6caa3308a752f06bbc7dfcac445cc393a33540724d882" exitCode=0 Dec 03 07:03:09 crc kubenswrapper[4475]: I1203 07:03:09.286953 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"daf693e8-1494-4da1-afdd-2cd6dbef665d","Type":"ContainerDied","Data":"72b4345cea1c1e15e8d6caa3308a752f06bbc7dfcac445cc393a33540724d882"} Dec 03 07:03:09 crc kubenswrapper[4475]: I1203 07:03:09.286979 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"daf693e8-1494-4da1-afdd-2cd6dbef665d","Type":"ContainerDied","Data":"d81f73ccbee54b25fda537d67990f2d5c792ef159a8ca28d206b23c7f3a014e4"} Dec 03 07:03:09 crc kubenswrapper[4475]: I1203 
07:03:09.287011 4475 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Dec 03 07:03:09 crc kubenswrapper[4475]: I1203 07:03:09.306133 4475 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Dec 03 07:03:09 crc kubenswrapper[4475]: I1203 07:03:09.308942 4475 scope.go:117] "RemoveContainer" containerID="6a47bc6f5f3d3c78c59650aff54b81c8cfbef488f4647b689f43016a9cb851c5" Dec 03 07:03:09 crc kubenswrapper[4475]: I1203 07:03:09.328203 4475 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-metadata-0"] Dec 03 07:03:09 crc kubenswrapper[4475]: I1203 07:03:09.334986 4475 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"] Dec 03 07:03:09 crc kubenswrapper[4475]: I1203 07:03:09.335386 4475 scope.go:117] "RemoveContainer" containerID="d8623dc920e9f565014e0552a5750cd818f349aa0e0efcf34cd5682da8d7fa09" Dec 03 07:03:09 crc kubenswrapper[4475]: E1203 07:03:09.335485 4475 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b52f22c6-66a3-4f58-aff9-40feb4dcbf8f" containerName="nova-metadata-metadata" Dec 03 07:03:09 crc kubenswrapper[4475]: I1203 07:03:09.335503 4475 state_mem.go:107] "Deleted CPUSet assignment" podUID="b52f22c6-66a3-4f58-aff9-40feb4dcbf8f" containerName="nova-metadata-metadata" Dec 03 07:03:09 crc kubenswrapper[4475]: E1203 07:03:09.335534 4475 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="daf693e8-1494-4da1-afdd-2cd6dbef665d" containerName="nova-scheduler-scheduler" Dec 03 07:03:09 crc kubenswrapper[4475]: I1203 07:03:09.335540 4475 state_mem.go:107] "Deleted CPUSet assignment" podUID="daf693e8-1494-4da1-afdd-2cd6dbef665d" containerName="nova-scheduler-scheduler" Dec 03 07:03:09 crc kubenswrapper[4475]: E1203 07:03:09.335548 4475 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b52f22c6-66a3-4f58-aff9-40feb4dcbf8f" containerName="nova-metadata-log" Dec 03 07:03:09 crc 
kubenswrapper[4475]: I1203 07:03:09.335554 4475 state_mem.go:107] "Deleted CPUSet assignment" podUID="b52f22c6-66a3-4f58-aff9-40feb4dcbf8f" containerName="nova-metadata-log" Dec 03 07:03:09 crc kubenswrapper[4475]: E1203 07:03:09.335563 4475 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bb4749d8-e5e2-488a-a007-028e214d6d95" containerName="nova-manage" Dec 03 07:03:09 crc kubenswrapper[4475]: I1203 07:03:09.335568 4475 state_mem.go:107] "Deleted CPUSet assignment" podUID="bb4749d8-e5e2-488a-a007-028e214d6d95" containerName="nova-manage" Dec 03 07:03:09 crc kubenswrapper[4475]: I1203 07:03:09.335738 4475 memory_manager.go:354] "RemoveStaleState removing state" podUID="b52f22c6-66a3-4f58-aff9-40feb4dcbf8f" containerName="nova-metadata-metadata" Dec 03 07:03:09 crc kubenswrapper[4475]: I1203 07:03:09.335755 4475 memory_manager.go:354] "RemoveStaleState removing state" podUID="b52f22c6-66a3-4f58-aff9-40feb4dcbf8f" containerName="nova-metadata-log" Dec 03 07:03:09 crc kubenswrapper[4475]: I1203 07:03:09.335767 4475 memory_manager.go:354] "RemoveStaleState removing state" podUID="daf693e8-1494-4da1-afdd-2cd6dbef665d" containerName="nova-scheduler-scheduler" Dec 03 07:03:09 crc kubenswrapper[4475]: I1203 07:03:09.335778 4475 memory_manager.go:354] "RemoveStaleState removing state" podUID="bb4749d8-e5e2-488a-a007-028e214d6d95" containerName="nova-manage" Dec 03 07:03:09 crc kubenswrapper[4475]: E1203 07:03:09.335792 4475 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d8623dc920e9f565014e0552a5750cd818f349aa0e0efcf34cd5682da8d7fa09\": container with ID starting with d8623dc920e9f565014e0552a5750cd818f349aa0e0efcf34cd5682da8d7fa09 not found: ID does not exist" containerID="d8623dc920e9f565014e0552a5750cd818f349aa0e0efcf34cd5682da8d7fa09" Dec 03 07:03:09 crc kubenswrapper[4475]: I1203 07:03:09.335816 4475 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"d8623dc920e9f565014e0552a5750cd818f349aa0e0efcf34cd5682da8d7fa09"} err="failed to get container status \"d8623dc920e9f565014e0552a5750cd818f349aa0e0efcf34cd5682da8d7fa09\": rpc error: code = NotFound desc = could not find container \"d8623dc920e9f565014e0552a5750cd818f349aa0e0efcf34cd5682da8d7fa09\": container with ID starting with d8623dc920e9f565014e0552a5750cd818f349aa0e0efcf34cd5682da8d7fa09 not found: ID does not exist" Dec 03 07:03:09 crc kubenswrapper[4475]: I1203 07:03:09.335834 4475 scope.go:117] "RemoveContainer" containerID="6a47bc6f5f3d3c78c59650aff54b81c8cfbef488f4647b689f43016a9cb851c5" Dec 03 07:03:09 crc kubenswrapper[4475]: E1203 07:03:09.336110 4475 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6a47bc6f5f3d3c78c59650aff54b81c8cfbef488f4647b689f43016a9cb851c5\": container with ID starting with 6a47bc6f5f3d3c78c59650aff54b81c8cfbef488f4647b689f43016a9cb851c5 not found: ID does not exist" containerID="6a47bc6f5f3d3c78c59650aff54b81c8cfbef488f4647b689f43016a9cb851c5" Dec 03 07:03:09 crc kubenswrapper[4475]: I1203 07:03:09.336143 4475 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6a47bc6f5f3d3c78c59650aff54b81c8cfbef488f4647b689f43016a9cb851c5"} err="failed to get container status \"6a47bc6f5f3d3c78c59650aff54b81c8cfbef488f4647b689f43016a9cb851c5\": rpc error: code = NotFound desc = could not find container \"6a47bc6f5f3d3c78c59650aff54b81c8cfbef488f4647b689f43016a9cb851c5\": container with ID starting with 6a47bc6f5f3d3c78c59650aff54b81c8cfbef488f4647b689f43016a9cb851c5 not found: ID does not exist" Dec 03 07:03:09 crc kubenswrapper[4475]: I1203 07:03:09.336160 4475 scope.go:117] "RemoveContainer" containerID="72b4345cea1c1e15e8d6caa3308a752f06bbc7dfcac445cc393a33540724d882" Dec 03 07:03:09 crc kubenswrapper[4475]: I1203 07:03:09.336750 4475 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Dec 03 07:03:09 crc kubenswrapper[4475]: I1203 07:03:09.340399 4475 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-metadata-internal-svc" Dec 03 07:03:09 crc kubenswrapper[4475]: I1203 07:03:09.340826 4475 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data" Dec 03 07:03:09 crc kubenswrapper[4475]: I1203 07:03:09.352089 4475 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xx5wm\" (UniqueName: \"kubernetes.io/projected/daf693e8-1494-4da1-afdd-2cd6dbef665d-kube-api-access-xx5wm\") pod \"daf693e8-1494-4da1-afdd-2cd6dbef665d\" (UID: \"daf693e8-1494-4da1-afdd-2cd6dbef665d\") " Dec 03 07:03:09 crc kubenswrapper[4475]: I1203 07:03:09.352174 4475 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/daf693e8-1494-4da1-afdd-2cd6dbef665d-config-data\") pod \"daf693e8-1494-4da1-afdd-2cd6dbef665d\" (UID: \"daf693e8-1494-4da1-afdd-2cd6dbef665d\") " Dec 03 07:03:09 crc kubenswrapper[4475]: I1203 07:03:09.352231 4475 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/daf693e8-1494-4da1-afdd-2cd6dbef665d-combined-ca-bundle\") pod \"daf693e8-1494-4da1-afdd-2cd6dbef665d\" (UID: \"daf693e8-1494-4da1-afdd-2cd6dbef665d\") " Dec 03 07:03:09 crc kubenswrapper[4475]: I1203 07:03:09.352420 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/84a9a05b-6714-48bd-af6c-297bbcfac2e5-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"84a9a05b-6714-48bd-af6c-297bbcfac2e5\") " pod="openstack/nova-metadata-0" Dec 03 07:03:09 crc kubenswrapper[4475]: I1203 07:03:09.352513 4475 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nk4xp\" (UniqueName: \"kubernetes.io/projected/84a9a05b-6714-48bd-af6c-297bbcfac2e5-kube-api-access-nk4xp\") pod \"nova-metadata-0\" (UID: \"84a9a05b-6714-48bd-af6c-297bbcfac2e5\") " pod="openstack/nova-metadata-0" Dec 03 07:03:09 crc kubenswrapper[4475]: I1203 07:03:09.352566 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/84a9a05b-6714-48bd-af6c-297bbcfac2e5-logs\") pod \"nova-metadata-0\" (UID: \"84a9a05b-6714-48bd-af6c-297bbcfac2e5\") " pod="openstack/nova-metadata-0" Dec 03 07:03:09 crc kubenswrapper[4475]: I1203 07:03:09.352593 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/84a9a05b-6714-48bd-af6c-297bbcfac2e5-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"84a9a05b-6714-48bd-af6c-297bbcfac2e5\") " pod="openstack/nova-metadata-0" Dec 03 07:03:09 crc kubenswrapper[4475]: I1203 07:03:09.352677 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/84a9a05b-6714-48bd-af6c-297bbcfac2e5-config-data\") pod \"nova-metadata-0\" (UID: \"84a9a05b-6714-48bd-af6c-297bbcfac2e5\") " pod="openstack/nova-metadata-0" Dec 03 07:03:09 crc kubenswrapper[4475]: I1203 07:03:09.359604 4475 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/daf693e8-1494-4da1-afdd-2cd6dbef665d-kube-api-access-xx5wm" (OuterVolumeSpecName: "kube-api-access-xx5wm") pod "daf693e8-1494-4da1-afdd-2cd6dbef665d" (UID: "daf693e8-1494-4da1-afdd-2cd6dbef665d"). InnerVolumeSpecName "kube-api-access-xx5wm". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 07:03:09 crc kubenswrapper[4475]: I1203 07:03:09.369127 4475 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Dec 03 07:03:09 crc kubenswrapper[4475]: I1203 07:03:09.371746 4475 scope.go:117] "RemoveContainer" containerID="72b4345cea1c1e15e8d6caa3308a752f06bbc7dfcac445cc393a33540724d882" Dec 03 07:03:09 crc kubenswrapper[4475]: E1203 07:03:09.372517 4475 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"72b4345cea1c1e15e8d6caa3308a752f06bbc7dfcac445cc393a33540724d882\": container with ID starting with 72b4345cea1c1e15e8d6caa3308a752f06bbc7dfcac445cc393a33540724d882 not found: ID does not exist" containerID="72b4345cea1c1e15e8d6caa3308a752f06bbc7dfcac445cc393a33540724d882" Dec 03 07:03:09 crc kubenswrapper[4475]: I1203 07:03:09.372552 4475 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"72b4345cea1c1e15e8d6caa3308a752f06bbc7dfcac445cc393a33540724d882"} err="failed to get container status \"72b4345cea1c1e15e8d6caa3308a752f06bbc7dfcac445cc393a33540724d882\": rpc error: code = NotFound desc = could not find container \"72b4345cea1c1e15e8d6caa3308a752f06bbc7dfcac445cc393a33540724d882\": container with ID starting with 72b4345cea1c1e15e8d6caa3308a752f06bbc7dfcac445cc393a33540724d882 not found: ID does not exist" Dec 03 07:03:09 crc kubenswrapper[4475]: I1203 07:03:09.392902 4475 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/daf693e8-1494-4da1-afdd-2cd6dbef665d-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "daf693e8-1494-4da1-afdd-2cd6dbef665d" (UID: "daf693e8-1494-4da1-afdd-2cd6dbef665d"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 07:03:09 crc kubenswrapper[4475]: I1203 07:03:09.397263 4475 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/daf693e8-1494-4da1-afdd-2cd6dbef665d-config-data" (OuterVolumeSpecName: "config-data") pod "daf693e8-1494-4da1-afdd-2cd6dbef665d" (UID: "daf693e8-1494-4da1-afdd-2cd6dbef665d"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 07:03:09 crc kubenswrapper[4475]: I1203 07:03:09.453839 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/84a9a05b-6714-48bd-af6c-297bbcfac2e5-config-data\") pod \"nova-metadata-0\" (UID: \"84a9a05b-6714-48bd-af6c-297bbcfac2e5\") " pod="openstack/nova-metadata-0" Dec 03 07:03:09 crc kubenswrapper[4475]: I1203 07:03:09.453903 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/84a9a05b-6714-48bd-af6c-297bbcfac2e5-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"84a9a05b-6714-48bd-af6c-297bbcfac2e5\") " pod="openstack/nova-metadata-0" Dec 03 07:03:09 crc kubenswrapper[4475]: I1203 07:03:09.453949 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nk4xp\" (UniqueName: \"kubernetes.io/projected/84a9a05b-6714-48bd-af6c-297bbcfac2e5-kube-api-access-nk4xp\") pod \"nova-metadata-0\" (UID: \"84a9a05b-6714-48bd-af6c-297bbcfac2e5\") " pod="openstack/nova-metadata-0" Dec 03 07:03:09 crc kubenswrapper[4475]: I1203 07:03:09.453991 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/84a9a05b-6714-48bd-af6c-297bbcfac2e5-logs\") pod \"nova-metadata-0\" (UID: \"84a9a05b-6714-48bd-af6c-297bbcfac2e5\") " pod="openstack/nova-metadata-0" Dec 03 07:03:09 crc kubenswrapper[4475]: I1203 07:03:09.454013 4475 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/84a9a05b-6714-48bd-af6c-297bbcfac2e5-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"84a9a05b-6714-48bd-af6c-297bbcfac2e5\") " pod="openstack/nova-metadata-0" Dec 03 07:03:09 crc kubenswrapper[4475]: I1203 07:03:09.454088 4475 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/daf693e8-1494-4da1-afdd-2cd6dbef665d-config-data\") on node \"crc\" DevicePath \"\"" Dec 03 07:03:09 crc kubenswrapper[4475]: I1203 07:03:09.454100 4475 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/daf693e8-1494-4da1-afdd-2cd6dbef665d-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 03 07:03:09 crc kubenswrapper[4475]: I1203 07:03:09.454109 4475 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xx5wm\" (UniqueName: \"kubernetes.io/projected/daf693e8-1494-4da1-afdd-2cd6dbef665d-kube-api-access-xx5wm\") on node \"crc\" DevicePath \"\"" Dec 03 07:03:09 crc kubenswrapper[4475]: I1203 07:03:09.454590 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/84a9a05b-6714-48bd-af6c-297bbcfac2e5-logs\") pod \"nova-metadata-0\" (UID: \"84a9a05b-6714-48bd-af6c-297bbcfac2e5\") " pod="openstack/nova-metadata-0" Dec 03 07:03:09 crc kubenswrapper[4475]: I1203 07:03:09.458061 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/84a9a05b-6714-48bd-af6c-297bbcfac2e5-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"84a9a05b-6714-48bd-af6c-297bbcfac2e5\") " pod="openstack/nova-metadata-0" Dec 03 07:03:09 crc kubenswrapper[4475]: I1203 07:03:09.463081 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/84a9a05b-6714-48bd-af6c-297bbcfac2e5-config-data\") pod \"nova-metadata-0\" (UID: \"84a9a05b-6714-48bd-af6c-297bbcfac2e5\") " pod="openstack/nova-metadata-0" Dec 03 07:03:09 crc kubenswrapper[4475]: I1203 07:03:09.463170 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/84a9a05b-6714-48bd-af6c-297bbcfac2e5-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"84a9a05b-6714-48bd-af6c-297bbcfac2e5\") " pod="openstack/nova-metadata-0" Dec 03 07:03:09 crc kubenswrapper[4475]: I1203 07:03:09.469928 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nk4xp\" (UniqueName: \"kubernetes.io/projected/84a9a05b-6714-48bd-af6c-297bbcfac2e5-kube-api-access-nk4xp\") pod \"nova-metadata-0\" (UID: \"84a9a05b-6714-48bd-af6c-297bbcfac2e5\") " pod="openstack/nova-metadata-0" Dec 03 07:03:09 crc kubenswrapper[4475]: I1203 07:03:09.499550 4475 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b52f22c6-66a3-4f58-aff9-40feb4dcbf8f" path="/var/lib/kubelet/pods/b52f22c6-66a3-4f58-aff9-40feb4dcbf8f/volumes" Dec 03 07:03:09 crc kubenswrapper[4475]: I1203 07:03:09.607372 4475 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Dec 03 07:03:09 crc kubenswrapper[4475]: I1203 07:03:09.615519 4475 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-scheduler-0"] Dec 03 07:03:09 crc kubenswrapper[4475]: I1203 07:03:09.646525 4475 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-scheduler-0"] Dec 03 07:03:09 crc kubenswrapper[4475]: I1203 07:03:09.650890 4475 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Dec 03 07:03:09 crc kubenswrapper[4475]: I1203 07:03:09.655991 4475 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-scheduler-config-data" Dec 03 07:03:09 crc kubenswrapper[4475]: I1203 07:03:09.673524 4475 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Dec 03 07:03:09 crc kubenswrapper[4475]: I1203 07:03:09.703309 4475 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Dec 03 07:03:09 crc kubenswrapper[4475]: I1203 07:03:09.774815 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/67d0eb6e-8db8-4af4-a6b8-6bb3516613d4-config-data\") pod \"nova-scheduler-0\" (UID: \"67d0eb6e-8db8-4af4-a6b8-6bb3516613d4\") " pod="openstack/nova-scheduler-0" Dec 03 07:03:09 crc kubenswrapper[4475]: I1203 07:03:09.775151 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cnd9z\" (UniqueName: \"kubernetes.io/projected/67d0eb6e-8db8-4af4-a6b8-6bb3516613d4-kube-api-access-cnd9z\") pod \"nova-scheduler-0\" (UID: \"67d0eb6e-8db8-4af4-a6b8-6bb3516613d4\") " pod="openstack/nova-scheduler-0" Dec 03 07:03:09 crc kubenswrapper[4475]: I1203 07:03:09.775401 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/67d0eb6e-8db8-4af4-a6b8-6bb3516613d4-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"67d0eb6e-8db8-4af4-a6b8-6bb3516613d4\") " pod="openstack/nova-scheduler-0" Dec 03 07:03:09 crc kubenswrapper[4475]: I1203 07:03:09.876882 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/67d0eb6e-8db8-4af4-a6b8-6bb3516613d4-config-data\") pod \"nova-scheduler-0\" (UID: 
\"67d0eb6e-8db8-4af4-a6b8-6bb3516613d4\") " pod="openstack/nova-scheduler-0" Dec 03 07:03:09 crc kubenswrapper[4475]: I1203 07:03:09.876961 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cnd9z\" (UniqueName: \"kubernetes.io/projected/67d0eb6e-8db8-4af4-a6b8-6bb3516613d4-kube-api-access-cnd9z\") pod \"nova-scheduler-0\" (UID: \"67d0eb6e-8db8-4af4-a6b8-6bb3516613d4\") " pod="openstack/nova-scheduler-0" Dec 03 07:03:09 crc kubenswrapper[4475]: I1203 07:03:09.877030 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/67d0eb6e-8db8-4af4-a6b8-6bb3516613d4-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"67d0eb6e-8db8-4af4-a6b8-6bb3516613d4\") " pod="openstack/nova-scheduler-0" Dec 03 07:03:09 crc kubenswrapper[4475]: I1203 07:03:09.881213 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/67d0eb6e-8db8-4af4-a6b8-6bb3516613d4-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"67d0eb6e-8db8-4af4-a6b8-6bb3516613d4\") " pod="openstack/nova-scheduler-0" Dec 03 07:03:09 crc kubenswrapper[4475]: I1203 07:03:09.881418 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/67d0eb6e-8db8-4af4-a6b8-6bb3516613d4-config-data\") pod \"nova-scheduler-0\" (UID: \"67d0eb6e-8db8-4af4-a6b8-6bb3516613d4\") " pod="openstack/nova-scheduler-0" Dec 03 07:03:09 crc kubenswrapper[4475]: I1203 07:03:09.896395 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cnd9z\" (UniqueName: \"kubernetes.io/projected/67d0eb6e-8db8-4af4-a6b8-6bb3516613d4-kube-api-access-cnd9z\") pod \"nova-scheduler-0\" (UID: \"67d0eb6e-8db8-4af4-a6b8-6bb3516613d4\") " pod="openstack/nova-scheduler-0" Dec 03 07:03:09 crc kubenswrapper[4475]: I1203 07:03:09.998495 4475 util.go:30] "No sandbox 
for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Dec 03 07:03:10 crc kubenswrapper[4475]: I1203 07:03:10.064946 4475 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Dec 03 07:03:10 crc kubenswrapper[4475]: I1203 07:03:10.297534 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"84a9a05b-6714-48bd-af6c-297bbcfac2e5","Type":"ContainerStarted","Data":"b8aab7ca649360f2c62d6b683c9b7c3c36093748414cb0b17f14c06d11262c85"} Dec 03 07:03:10 crc kubenswrapper[4475]: I1203 07:03:10.297672 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"84a9a05b-6714-48bd-af6c-297bbcfac2e5","Type":"ContainerStarted","Data":"6c9df30ca0c8f4bc9ae641029be9e8d42fa82306a97e0dad8f8d176be8128460"} Dec 03 07:03:10 crc kubenswrapper[4475]: I1203 07:03:10.384547 4475 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Dec 03 07:03:10 crc kubenswrapper[4475]: W1203 07:03:10.386361 4475 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod67d0eb6e_8db8_4af4_a6b8_6bb3516613d4.slice/crio-cb147bc8e8427227aa8456600b5635c3ad4b8a01ee7de61222aed03872495dea WatchSource:0}: Error finding container cb147bc8e8427227aa8456600b5635c3ad4b8a01ee7de61222aed03872495dea: Status 404 returned error can't find the container with id cb147bc8e8427227aa8456600b5635c3ad4b8a01ee7de61222aed03872495dea Dec 03 07:03:11 crc kubenswrapper[4475]: I1203 07:03:11.105169 4475 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Dec 03 07:03:11 crc kubenswrapper[4475]: I1203 07:03:11.299001 4475 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jw96s\" (UniqueName: \"kubernetes.io/projected/60efc3c2-1541-4def-91d8-11fef3e401bd-kube-api-access-jw96s\") pod \"60efc3c2-1541-4def-91d8-11fef3e401bd\" (UID: \"60efc3c2-1541-4def-91d8-11fef3e401bd\") " Dec 03 07:03:11 crc kubenswrapper[4475]: I1203 07:03:11.299090 4475 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/60efc3c2-1541-4def-91d8-11fef3e401bd-logs\") pod \"60efc3c2-1541-4def-91d8-11fef3e401bd\" (UID: \"60efc3c2-1541-4def-91d8-11fef3e401bd\") " Dec 03 07:03:11 crc kubenswrapper[4475]: I1203 07:03:11.299109 4475 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/60efc3c2-1541-4def-91d8-11fef3e401bd-internal-tls-certs\") pod \"60efc3c2-1541-4def-91d8-11fef3e401bd\" (UID: \"60efc3c2-1541-4def-91d8-11fef3e401bd\") " Dec 03 07:03:11 crc kubenswrapper[4475]: I1203 07:03:11.299149 4475 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/60efc3c2-1541-4def-91d8-11fef3e401bd-combined-ca-bundle\") pod \"60efc3c2-1541-4def-91d8-11fef3e401bd\" (UID: \"60efc3c2-1541-4def-91d8-11fef3e401bd\") " Dec 03 07:03:11 crc kubenswrapper[4475]: I1203 07:03:11.299171 4475 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/60efc3c2-1541-4def-91d8-11fef3e401bd-public-tls-certs\") pod \"60efc3c2-1541-4def-91d8-11fef3e401bd\" (UID: \"60efc3c2-1541-4def-91d8-11fef3e401bd\") " Dec 03 07:03:11 crc kubenswrapper[4475]: I1203 07:03:11.299199 4475 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" 
(UniqueName: \"kubernetes.io/secret/60efc3c2-1541-4def-91d8-11fef3e401bd-config-data\") pod \"60efc3c2-1541-4def-91d8-11fef3e401bd\" (UID: \"60efc3c2-1541-4def-91d8-11fef3e401bd\") " Dec 03 07:03:11 crc kubenswrapper[4475]: I1203 07:03:11.300332 4475 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/60efc3c2-1541-4def-91d8-11fef3e401bd-logs" (OuterVolumeSpecName: "logs") pod "60efc3c2-1541-4def-91d8-11fef3e401bd" (UID: "60efc3c2-1541-4def-91d8-11fef3e401bd"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 07:03:11 crc kubenswrapper[4475]: I1203 07:03:11.322489 4475 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/60efc3c2-1541-4def-91d8-11fef3e401bd-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "60efc3c2-1541-4def-91d8-11fef3e401bd" (UID: "60efc3c2-1541-4def-91d8-11fef3e401bd"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 07:03:11 crc kubenswrapper[4475]: I1203 07:03:11.323429 4475 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/60efc3c2-1541-4def-91d8-11fef3e401bd-kube-api-access-jw96s" (OuterVolumeSpecName: "kube-api-access-jw96s") pod "60efc3c2-1541-4def-91d8-11fef3e401bd" (UID: "60efc3c2-1541-4def-91d8-11fef3e401bd"). InnerVolumeSpecName "kube-api-access-jw96s". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 07:03:11 crc kubenswrapper[4475]: I1203 07:03:11.329621 4475 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/60efc3c2-1541-4def-91d8-11fef3e401bd-config-data" (OuterVolumeSpecName: "config-data") pod "60efc3c2-1541-4def-91d8-11fef3e401bd" (UID: "60efc3c2-1541-4def-91d8-11fef3e401bd"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 07:03:11 crc kubenswrapper[4475]: I1203 07:03:11.331586 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"84a9a05b-6714-48bd-af6c-297bbcfac2e5","Type":"ContainerStarted","Data":"9a9e28bef6a04a4a2745a8ceb37156938266ded5fb1a7db81a592401a6f80e39"} Dec 03 07:03:11 crc kubenswrapper[4475]: I1203 07:03:11.337384 4475 generic.go:334] "Generic (PLEG): container finished" podID="60efc3c2-1541-4def-91d8-11fef3e401bd" containerID="ef74648e0ca0ec7989433f902f59f502f45bec6dcb1b89d9ac93660730f6d160" exitCode=0 Dec 03 07:03:11 crc kubenswrapper[4475]: I1203 07:03:11.337437 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"60efc3c2-1541-4def-91d8-11fef3e401bd","Type":"ContainerDied","Data":"ef74648e0ca0ec7989433f902f59f502f45bec6dcb1b89d9ac93660730f6d160"} Dec 03 07:03:11 crc kubenswrapper[4475]: I1203 07:03:11.337463 4475 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Dec 03 07:03:11 crc kubenswrapper[4475]: I1203 07:03:11.337481 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"60efc3c2-1541-4def-91d8-11fef3e401bd","Type":"ContainerDied","Data":"fdc923247bb274d02a972667aab19038e3604e879ce047c04596b77502536deb"} Dec 03 07:03:11 crc kubenswrapper[4475]: I1203 07:03:11.337499 4475 scope.go:117] "RemoveContainer" containerID="ef74648e0ca0ec7989433f902f59f502f45bec6dcb1b89d9ac93660730f6d160" Dec 03 07:03:11 crc kubenswrapper[4475]: I1203 07:03:11.339274 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"67d0eb6e-8db8-4af4-a6b8-6bb3516613d4","Type":"ContainerStarted","Data":"e55c3f67101fabdbba8f6dd643ec6e12ebff8c646b7f17b8a69811529c26e95b"} Dec 03 07:03:11 crc kubenswrapper[4475]: I1203 07:03:11.339298 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"67d0eb6e-8db8-4af4-a6b8-6bb3516613d4","Type":"ContainerStarted","Data":"cb147bc8e8427227aa8456600b5635c3ad4b8a01ee7de61222aed03872495dea"} Dec 03 07:03:11 crc kubenswrapper[4475]: I1203 07:03:11.342823 4475 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/60efc3c2-1541-4def-91d8-11fef3e401bd-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "60efc3c2-1541-4def-91d8-11fef3e401bd" (UID: "60efc3c2-1541-4def-91d8-11fef3e401bd"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 07:03:11 crc kubenswrapper[4475]: I1203 07:03:11.344325 4475 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/60efc3c2-1541-4def-91d8-11fef3e401bd-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "60efc3c2-1541-4def-91d8-11fef3e401bd" (UID: "60efc3c2-1541-4def-91d8-11fef3e401bd"). InnerVolumeSpecName "internal-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 07:03:11 crc kubenswrapper[4475]: I1203 07:03:11.355279 4475 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=2.35526555 podStartE2EDuration="2.35526555s" podCreationTimestamp="2025-12-03 07:03:09 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 07:03:11.344001022 +0000 UTC m=+1076.148899366" watchObservedRunningTime="2025-12-03 07:03:11.35526555 +0000 UTC m=+1076.160163884" Dec 03 07:03:11 crc kubenswrapper[4475]: I1203 07:03:11.358182 4475 scope.go:117] "RemoveContainer" containerID="9034116b39a2c3aefd05c0486c5e13a90729aff3c5870afac0d3d1a3d06545ed" Dec 03 07:03:11 crc kubenswrapper[4475]: I1203 07:03:11.368911 4475 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-scheduler-0" podStartSLOduration=2.368898144 podStartE2EDuration="2.368898144s" podCreationTimestamp="2025-12-03 07:03:09 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 07:03:11.363184506 +0000 UTC m=+1076.168082860" watchObservedRunningTime="2025-12-03 07:03:11.368898144 +0000 UTC m=+1076.173796478" Dec 03 07:03:11 crc kubenswrapper[4475]: I1203 07:03:11.393906 4475 scope.go:117] "RemoveContainer" containerID="ef74648e0ca0ec7989433f902f59f502f45bec6dcb1b89d9ac93660730f6d160" Dec 03 07:03:11 crc kubenswrapper[4475]: E1203 07:03:11.394230 4475 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ef74648e0ca0ec7989433f902f59f502f45bec6dcb1b89d9ac93660730f6d160\": container with ID starting with ef74648e0ca0ec7989433f902f59f502f45bec6dcb1b89d9ac93660730f6d160 not found: ID does not exist" containerID="ef74648e0ca0ec7989433f902f59f502f45bec6dcb1b89d9ac93660730f6d160" Dec 03 
07:03:11 crc kubenswrapper[4475]: I1203 07:03:11.394256 4475 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ef74648e0ca0ec7989433f902f59f502f45bec6dcb1b89d9ac93660730f6d160"} err="failed to get container status \"ef74648e0ca0ec7989433f902f59f502f45bec6dcb1b89d9ac93660730f6d160\": rpc error: code = NotFound desc = could not find container \"ef74648e0ca0ec7989433f902f59f502f45bec6dcb1b89d9ac93660730f6d160\": container with ID starting with ef74648e0ca0ec7989433f902f59f502f45bec6dcb1b89d9ac93660730f6d160 not found: ID does not exist" Dec 03 07:03:11 crc kubenswrapper[4475]: I1203 07:03:11.394277 4475 scope.go:117] "RemoveContainer" containerID="9034116b39a2c3aefd05c0486c5e13a90729aff3c5870afac0d3d1a3d06545ed" Dec 03 07:03:11 crc kubenswrapper[4475]: E1203 07:03:11.394683 4475 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9034116b39a2c3aefd05c0486c5e13a90729aff3c5870afac0d3d1a3d06545ed\": container with ID starting with 9034116b39a2c3aefd05c0486c5e13a90729aff3c5870afac0d3d1a3d06545ed not found: ID does not exist" containerID="9034116b39a2c3aefd05c0486c5e13a90729aff3c5870afac0d3d1a3d06545ed" Dec 03 07:03:11 crc kubenswrapper[4475]: I1203 07:03:11.394716 4475 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9034116b39a2c3aefd05c0486c5e13a90729aff3c5870afac0d3d1a3d06545ed"} err="failed to get container status \"9034116b39a2c3aefd05c0486c5e13a90729aff3c5870afac0d3d1a3d06545ed\": rpc error: code = NotFound desc = could not find container \"9034116b39a2c3aefd05c0486c5e13a90729aff3c5870afac0d3d1a3d06545ed\": container with ID starting with 9034116b39a2c3aefd05c0486c5e13a90729aff3c5870afac0d3d1a3d06545ed not found: ID does not exist" Dec 03 07:03:11 crc kubenswrapper[4475]: I1203 07:03:11.401188 4475 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/60efc3c2-1541-4def-91d8-11fef3e401bd-public-tls-certs\") on node \"crc\" DevicePath \"\"" Dec 03 07:03:11 crc kubenswrapper[4475]: I1203 07:03:11.401208 4475 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/60efc3c2-1541-4def-91d8-11fef3e401bd-config-data\") on node \"crc\" DevicePath \"\"" Dec 03 07:03:11 crc kubenswrapper[4475]: I1203 07:03:11.401218 4475 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jw96s\" (UniqueName: \"kubernetes.io/projected/60efc3c2-1541-4def-91d8-11fef3e401bd-kube-api-access-jw96s\") on node \"crc\" DevicePath \"\"" Dec 03 07:03:11 crc kubenswrapper[4475]: I1203 07:03:11.401229 4475 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/60efc3c2-1541-4def-91d8-11fef3e401bd-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Dec 03 07:03:11 crc kubenswrapper[4475]: I1203 07:03:11.401238 4475 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/60efc3c2-1541-4def-91d8-11fef3e401bd-logs\") on node \"crc\" DevicePath \"\"" Dec 03 07:03:11 crc kubenswrapper[4475]: I1203 07:03:11.401246 4475 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/60efc3c2-1541-4def-91d8-11fef3e401bd-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 03 07:03:11 crc kubenswrapper[4475]: I1203 07:03:11.499156 4475 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="daf693e8-1494-4da1-afdd-2cd6dbef665d" path="/var/lib/kubelet/pods/daf693e8-1494-4da1-afdd-2cd6dbef665d/volumes" Dec 03 07:03:11 crc kubenswrapper[4475]: I1203 07:03:11.682234 4475 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Dec 03 07:03:11 crc kubenswrapper[4475]: I1203 07:03:11.688425 4475 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-0"] Dec 03 
07:03:11 crc kubenswrapper[4475]: I1203 07:03:11.703036 4475 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] Dec 03 07:03:11 crc kubenswrapper[4475]: E1203 07:03:11.703521 4475 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="60efc3c2-1541-4def-91d8-11fef3e401bd" containerName="nova-api-log" Dec 03 07:03:11 crc kubenswrapper[4475]: I1203 07:03:11.703579 4475 state_mem.go:107] "Deleted CPUSet assignment" podUID="60efc3c2-1541-4def-91d8-11fef3e401bd" containerName="nova-api-log" Dec 03 07:03:11 crc kubenswrapper[4475]: E1203 07:03:11.703640 4475 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="60efc3c2-1541-4def-91d8-11fef3e401bd" containerName="nova-api-api" Dec 03 07:03:11 crc kubenswrapper[4475]: I1203 07:03:11.703680 4475 state_mem.go:107] "Deleted CPUSet assignment" podUID="60efc3c2-1541-4def-91d8-11fef3e401bd" containerName="nova-api-api" Dec 03 07:03:11 crc kubenswrapper[4475]: I1203 07:03:11.703861 4475 memory_manager.go:354] "RemoveStaleState removing state" podUID="60efc3c2-1541-4def-91d8-11fef3e401bd" containerName="nova-api-api" Dec 03 07:03:11 crc kubenswrapper[4475]: I1203 07:03:11.703938 4475 memory_manager.go:354] "RemoveStaleState removing state" podUID="60efc3c2-1541-4def-91d8-11fef3e401bd" containerName="nova-api-log" Dec 03 07:03:11 crc kubenswrapper[4475]: I1203 07:03:11.704861 4475 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Dec 03 07:03:11 crc kubenswrapper[4475]: I1203 07:03:11.706437 4475 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-public-svc" Dec 03 07:03:11 crc kubenswrapper[4475]: I1203 07:03:11.706696 4475 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-internal-svc" Dec 03 07:03:11 crc kubenswrapper[4475]: I1203 07:03:11.706947 4475 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Dec 03 07:03:11 crc kubenswrapper[4475]: I1203 07:03:11.716093 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/03ca50a6-f6a1-4d6b-b7e7-0af317dd3a88-public-tls-certs\") pod \"nova-api-0\" (UID: \"03ca50a6-f6a1-4d6b-b7e7-0af317dd3a88\") " pod="openstack/nova-api-0" Dec 03 07:03:11 crc kubenswrapper[4475]: I1203 07:03:11.716166 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/03ca50a6-f6a1-4d6b-b7e7-0af317dd3a88-internal-tls-certs\") pod \"nova-api-0\" (UID: \"03ca50a6-f6a1-4d6b-b7e7-0af317dd3a88\") " pod="openstack/nova-api-0" Dec 03 07:03:11 crc kubenswrapper[4475]: I1203 07:03:11.716298 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/03ca50a6-f6a1-4d6b-b7e7-0af317dd3a88-logs\") pod \"nova-api-0\" (UID: \"03ca50a6-f6a1-4d6b-b7e7-0af317dd3a88\") " pod="openstack/nova-api-0" Dec 03 07:03:11 crc kubenswrapper[4475]: I1203 07:03:11.716391 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/03ca50a6-f6a1-4d6b-b7e7-0af317dd3a88-config-data\") pod \"nova-api-0\" (UID: \"03ca50a6-f6a1-4d6b-b7e7-0af317dd3a88\") " 
pod="openstack/nova-api-0" Dec 03 07:03:11 crc kubenswrapper[4475]: I1203 07:03:11.716422 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6v6zx\" (UniqueName: \"kubernetes.io/projected/03ca50a6-f6a1-4d6b-b7e7-0af317dd3a88-kube-api-access-6v6zx\") pod \"nova-api-0\" (UID: \"03ca50a6-f6a1-4d6b-b7e7-0af317dd3a88\") " pod="openstack/nova-api-0" Dec 03 07:03:11 crc kubenswrapper[4475]: I1203 07:03:11.716501 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/03ca50a6-f6a1-4d6b-b7e7-0af317dd3a88-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"03ca50a6-f6a1-4d6b-b7e7-0af317dd3a88\") " pod="openstack/nova-api-0" Dec 03 07:03:11 crc kubenswrapper[4475]: I1203 07:03:11.722411 4475 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Dec 03 07:03:11 crc kubenswrapper[4475]: I1203 07:03:11.818349 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/03ca50a6-f6a1-4d6b-b7e7-0af317dd3a88-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"03ca50a6-f6a1-4d6b-b7e7-0af317dd3a88\") " pod="openstack/nova-api-0" Dec 03 07:03:11 crc kubenswrapper[4475]: I1203 07:03:11.818395 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/03ca50a6-f6a1-4d6b-b7e7-0af317dd3a88-public-tls-certs\") pod \"nova-api-0\" (UID: \"03ca50a6-f6a1-4d6b-b7e7-0af317dd3a88\") " pod="openstack/nova-api-0" Dec 03 07:03:11 crc kubenswrapper[4475]: I1203 07:03:11.818445 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/03ca50a6-f6a1-4d6b-b7e7-0af317dd3a88-internal-tls-certs\") pod \"nova-api-0\" (UID: \"03ca50a6-f6a1-4d6b-b7e7-0af317dd3a88\") " 
pod="openstack/nova-api-0" Dec 03 07:03:11 crc kubenswrapper[4475]: I1203 07:03:11.818562 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/03ca50a6-f6a1-4d6b-b7e7-0af317dd3a88-logs\") pod \"nova-api-0\" (UID: \"03ca50a6-f6a1-4d6b-b7e7-0af317dd3a88\") " pod="openstack/nova-api-0" Dec 03 07:03:11 crc kubenswrapper[4475]: I1203 07:03:11.818601 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/03ca50a6-f6a1-4d6b-b7e7-0af317dd3a88-config-data\") pod \"nova-api-0\" (UID: \"03ca50a6-f6a1-4d6b-b7e7-0af317dd3a88\") " pod="openstack/nova-api-0" Dec 03 07:03:11 crc kubenswrapper[4475]: I1203 07:03:11.818620 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6v6zx\" (UniqueName: \"kubernetes.io/projected/03ca50a6-f6a1-4d6b-b7e7-0af317dd3a88-kube-api-access-6v6zx\") pod \"nova-api-0\" (UID: \"03ca50a6-f6a1-4d6b-b7e7-0af317dd3a88\") " pod="openstack/nova-api-0" Dec 03 07:03:11 crc kubenswrapper[4475]: I1203 07:03:11.819983 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/03ca50a6-f6a1-4d6b-b7e7-0af317dd3a88-logs\") pod \"nova-api-0\" (UID: \"03ca50a6-f6a1-4d6b-b7e7-0af317dd3a88\") " pod="openstack/nova-api-0" Dec 03 07:03:11 crc kubenswrapper[4475]: I1203 07:03:11.825370 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/03ca50a6-f6a1-4d6b-b7e7-0af317dd3a88-internal-tls-certs\") pod \"nova-api-0\" (UID: \"03ca50a6-f6a1-4d6b-b7e7-0af317dd3a88\") " pod="openstack/nova-api-0" Dec 03 07:03:11 crc kubenswrapper[4475]: I1203 07:03:11.835208 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/03ca50a6-f6a1-4d6b-b7e7-0af317dd3a88-public-tls-certs\") pod 
\"nova-api-0\" (UID: \"03ca50a6-f6a1-4d6b-b7e7-0af317dd3a88\") " pod="openstack/nova-api-0" Dec 03 07:03:11 crc kubenswrapper[4475]: I1203 07:03:11.835214 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/03ca50a6-f6a1-4d6b-b7e7-0af317dd3a88-config-data\") pod \"nova-api-0\" (UID: \"03ca50a6-f6a1-4d6b-b7e7-0af317dd3a88\") " pod="openstack/nova-api-0" Dec 03 07:03:11 crc kubenswrapper[4475]: I1203 07:03:11.835630 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/03ca50a6-f6a1-4d6b-b7e7-0af317dd3a88-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"03ca50a6-f6a1-4d6b-b7e7-0af317dd3a88\") " pod="openstack/nova-api-0" Dec 03 07:03:11 crc kubenswrapper[4475]: I1203 07:03:11.839913 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6v6zx\" (UniqueName: \"kubernetes.io/projected/03ca50a6-f6a1-4d6b-b7e7-0af317dd3a88-kube-api-access-6v6zx\") pod \"nova-api-0\" (UID: \"03ca50a6-f6a1-4d6b-b7e7-0af317dd3a88\") " pod="openstack/nova-api-0" Dec 03 07:03:12 crc kubenswrapper[4475]: I1203 07:03:12.026083 4475 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Dec 03 07:03:12 crc kubenswrapper[4475]: W1203 07:03:12.424349 4475 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod03ca50a6_f6a1_4d6b_b7e7_0af317dd3a88.slice/crio-1e300c17d81ada2090ab1be627a29b61efb14e65907257c4da07d1230c2fce32 WatchSource:0}: Error finding container 1e300c17d81ada2090ab1be627a29b61efb14e65907257c4da07d1230c2fce32: Status 404 returned error can't find the container with id 1e300c17d81ada2090ab1be627a29b61efb14e65907257c4da07d1230c2fce32 Dec 03 07:03:12 crc kubenswrapper[4475]: I1203 07:03:12.426194 4475 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Dec 03 07:03:13 crc kubenswrapper[4475]: E1203 07:03:13.319136 4475 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode6e2659c_bdb8_4015_b749_9d1bfd70620f.slice/crio-f38adfa30fd4ebb78cee694486062d19d33d13cf2c50659bb9a0ff66b6d44c52\": RecentStats: unable to find data in memory cache]" Dec 03 07:03:13 crc kubenswrapper[4475]: I1203 07:03:13.357472 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"03ca50a6-f6a1-4d6b-b7e7-0af317dd3a88","Type":"ContainerStarted","Data":"39348b39634d175b63bdce7734df608b4a26e3b8909fe08a7e140ea5847ad330"} Dec 03 07:03:13 crc kubenswrapper[4475]: I1203 07:03:13.357515 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"03ca50a6-f6a1-4d6b-b7e7-0af317dd3a88","Type":"ContainerStarted","Data":"1c686d3076989eb4ea2806ebe01568f419e693ceed70d6daf450c59047240532"} Dec 03 07:03:13 crc kubenswrapper[4475]: I1203 07:03:13.357525 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" 
event={"ID":"03ca50a6-f6a1-4d6b-b7e7-0af317dd3a88","Type":"ContainerStarted","Data":"1e300c17d81ada2090ab1be627a29b61efb14e65907257c4da07d1230c2fce32"} Dec 03 07:03:13 crc kubenswrapper[4475]: I1203 07:03:13.377326 4475 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=2.37730843 podStartE2EDuration="2.37730843s" podCreationTimestamp="2025-12-03 07:03:11 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 07:03:13.371029769 +0000 UTC m=+1078.175928103" watchObservedRunningTime="2025-12-03 07:03:13.37730843 +0000 UTC m=+1078.182206764" Dec 03 07:03:13 crc kubenswrapper[4475]: I1203 07:03:13.499519 4475 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="60efc3c2-1541-4def-91d8-11fef3e401bd" path="/var/lib/kubelet/pods/60efc3c2-1541-4def-91d8-11fef3e401bd/volumes" Dec 03 07:03:14 crc kubenswrapper[4475]: I1203 07:03:14.674660 4475 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Dec 03 07:03:14 crc kubenswrapper[4475]: I1203 07:03:14.674865 4475 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Dec 03 07:03:14 crc kubenswrapper[4475]: I1203 07:03:14.999663 4475 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-scheduler-0" Dec 03 07:03:19 crc kubenswrapper[4475]: I1203 07:03:19.675093 4475 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Dec 03 07:03:19 crc kubenswrapper[4475]: I1203 07:03:19.675528 4475 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Dec 03 07:03:19 crc kubenswrapper[4475]: I1203 07:03:19.999145 4475 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-scheduler-0" Dec 03 07:03:20 crc kubenswrapper[4475]: 
I1203 07:03:20.020736 4475 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-scheduler-0" Dec 03 07:03:20 crc kubenswrapper[4475]: I1203 07:03:20.428738 4475 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-scheduler-0" Dec 03 07:03:20 crc kubenswrapper[4475]: I1203 07:03:20.689586 4475 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="84a9a05b-6714-48bd-af6c-297bbcfac2e5" containerName="nova-metadata-log" probeResult="failure" output="Get \"https://10.217.0.216:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Dec 03 07:03:20 crc kubenswrapper[4475]: I1203 07:03:20.689614 4475 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="84a9a05b-6714-48bd-af6c-297bbcfac2e5" containerName="nova-metadata-metadata" probeResult="failure" output="Get \"https://10.217.0.216:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Dec 03 07:03:22 crc kubenswrapper[4475]: I1203 07:03:22.026575 4475 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Dec 03 07:03:22 crc kubenswrapper[4475]: I1203 07:03:22.026779 4475 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Dec 03 07:03:23 crc kubenswrapper[4475]: I1203 07:03:23.038571 4475 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="03ca50a6-f6a1-4d6b-b7e7-0af317dd3a88" containerName="nova-api-api" probeResult="failure" output="Get \"https://10.217.0.218:8774/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Dec 03 07:03:23 crc kubenswrapper[4475]: I1203 07:03:23.038583 4475 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="03ca50a6-f6a1-4d6b-b7e7-0af317dd3a88" containerName="nova-api-log" probeResult="failure" output="Get 
\"https://10.217.0.218:8774/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Dec 03 07:03:23 crc kubenswrapper[4475]: I1203 07:03:23.559702 4475 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ceilometer-0" Dec 03 07:03:26 crc kubenswrapper[4475]: I1203 07:03:26.542172 4475 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/kube-state-metrics-0"] Dec 03 07:03:26 crc kubenswrapper[4475]: I1203 07:03:26.542717 4475 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/kube-state-metrics-0" podUID="fd232cd1-6aca-43cc-9876-535cd9eb39eb" containerName="kube-state-metrics" containerID="cri-o://27b5d750e28b7a2b4a5d00fbd91c782b113e6570522ce95e330baa819b48925b" gracePeriod=30 Dec 03 07:03:26 crc kubenswrapper[4475]: I1203 07:03:26.971517 4475 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0" Dec 03 07:03:27 crc kubenswrapper[4475]: I1203 07:03:27.166794 4475 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kngg9\" (UniqueName: \"kubernetes.io/projected/fd232cd1-6aca-43cc-9876-535cd9eb39eb-kube-api-access-kngg9\") pod \"fd232cd1-6aca-43cc-9876-535cd9eb39eb\" (UID: \"fd232cd1-6aca-43cc-9876-535cd9eb39eb\") " Dec 03 07:03:27 crc kubenswrapper[4475]: I1203 07:03:27.172749 4475 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fd232cd1-6aca-43cc-9876-535cd9eb39eb-kube-api-access-kngg9" (OuterVolumeSpecName: "kube-api-access-kngg9") pod "fd232cd1-6aca-43cc-9876-535cd9eb39eb" (UID: "fd232cd1-6aca-43cc-9876-535cd9eb39eb"). InnerVolumeSpecName "kube-api-access-kngg9". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 07:03:27 crc kubenswrapper[4475]: I1203 07:03:27.269616 4475 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kngg9\" (UniqueName: \"kubernetes.io/projected/fd232cd1-6aca-43cc-9876-535cd9eb39eb-kube-api-access-kngg9\") on node \"crc\" DevicePath \"\"" Dec 03 07:03:27 crc kubenswrapper[4475]: I1203 07:03:27.459041 4475 generic.go:334] "Generic (PLEG): container finished" podID="fd232cd1-6aca-43cc-9876-535cd9eb39eb" containerID="27b5d750e28b7a2b4a5d00fbd91c782b113e6570522ce95e330baa819b48925b" exitCode=2 Dec 03 07:03:27 crc kubenswrapper[4475]: I1203 07:03:27.459088 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"fd232cd1-6aca-43cc-9876-535cd9eb39eb","Type":"ContainerDied","Data":"27b5d750e28b7a2b4a5d00fbd91c782b113e6570522ce95e330baa819b48925b"} Dec 03 07:03:27 crc kubenswrapper[4475]: I1203 07:03:27.459110 4475 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/kube-state-metrics-0" Dec 03 07:03:27 crc kubenswrapper[4475]: I1203 07:03:27.459130 4475 scope.go:117] "RemoveContainer" containerID="27b5d750e28b7a2b4a5d00fbd91c782b113e6570522ce95e330baa819b48925b" Dec 03 07:03:27 crc kubenswrapper[4475]: I1203 07:03:27.459119 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"fd232cd1-6aca-43cc-9876-535cd9eb39eb","Type":"ContainerDied","Data":"b90f6eef308feb8fb35c00508576513c29309ca5e6fd44ca1e369659d34d1726"} Dec 03 07:03:27 crc kubenswrapper[4475]: I1203 07:03:27.480534 4475 scope.go:117] "RemoveContainer" containerID="27b5d750e28b7a2b4a5d00fbd91c782b113e6570522ce95e330baa819b48925b" Dec 03 07:03:27 crc kubenswrapper[4475]: E1203 07:03:27.481914 4475 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"27b5d750e28b7a2b4a5d00fbd91c782b113e6570522ce95e330baa819b48925b\": container with ID starting with 27b5d750e28b7a2b4a5d00fbd91c782b113e6570522ce95e330baa819b48925b not found: ID does not exist" containerID="27b5d750e28b7a2b4a5d00fbd91c782b113e6570522ce95e330baa819b48925b" Dec 03 07:03:27 crc kubenswrapper[4475]: I1203 07:03:27.481939 4475 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"27b5d750e28b7a2b4a5d00fbd91c782b113e6570522ce95e330baa819b48925b"} err="failed to get container status \"27b5d750e28b7a2b4a5d00fbd91c782b113e6570522ce95e330baa819b48925b\": rpc error: code = NotFound desc = could not find container \"27b5d750e28b7a2b4a5d00fbd91c782b113e6570522ce95e330baa819b48925b\": container with ID starting with 27b5d750e28b7a2b4a5d00fbd91c782b113e6570522ce95e330baa819b48925b not found: ID does not exist" Dec 03 07:03:27 crc kubenswrapper[4475]: I1203 07:03:27.488199 4475 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/kube-state-metrics-0"] Dec 03 07:03:27 crc kubenswrapper[4475]: I1203 
07:03:27.504121 4475 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/kube-state-metrics-0"] Dec 03 07:03:27 crc kubenswrapper[4475]: I1203 07:03:27.520133 4475 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/kube-state-metrics-0"] Dec 03 07:03:27 crc kubenswrapper[4475]: E1203 07:03:27.520906 4475 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fd232cd1-6aca-43cc-9876-535cd9eb39eb" containerName="kube-state-metrics" Dec 03 07:03:27 crc kubenswrapper[4475]: I1203 07:03:27.520929 4475 state_mem.go:107] "Deleted CPUSet assignment" podUID="fd232cd1-6aca-43cc-9876-535cd9eb39eb" containerName="kube-state-metrics" Dec 03 07:03:27 crc kubenswrapper[4475]: I1203 07:03:27.521310 4475 memory_manager.go:354] "RemoveStaleState removing state" podUID="fd232cd1-6aca-43cc-9876-535cd9eb39eb" containerName="kube-state-metrics" Dec 03 07:03:27 crc kubenswrapper[4475]: I1203 07:03:27.523629 4475 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0" Dec 03 07:03:27 crc kubenswrapper[4475]: I1203 07:03:27.526420 4475 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"kube-state-metrics-tls-config" Dec 03 07:03:27 crc kubenswrapper[4475]: I1203 07:03:27.526712 4475 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-kube-state-metrics-svc" Dec 03 07:03:27 crc kubenswrapper[4475]: I1203 07:03:27.530527 4475 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"] Dec 03 07:03:27 crc kubenswrapper[4475]: I1203 07:03:27.677579 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-tls-config\" (UniqueName: \"kubernetes.io/secret/8e0ac5db-edf5-4b28-838f-338fdcb4b6fe-kube-state-metrics-tls-config\") pod \"kube-state-metrics-0\" (UID: \"8e0ac5db-edf5-4b28-838f-338fdcb4b6fe\") " pod="openstack/kube-state-metrics-0" Dec 03 07:03:27 crc 
kubenswrapper[4475]: I1203 07:03:27.677732 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-tls-certs\" (UniqueName: \"kubernetes.io/secret/8e0ac5db-edf5-4b28-838f-338fdcb4b6fe-kube-state-metrics-tls-certs\") pod \"kube-state-metrics-0\" (UID: \"8e0ac5db-edf5-4b28-838f-338fdcb4b6fe\") " pod="openstack/kube-state-metrics-0" Dec 03 07:03:27 crc kubenswrapper[4475]: I1203 07:03:27.677792 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wzrhg\" (UniqueName: \"kubernetes.io/projected/8e0ac5db-edf5-4b28-838f-338fdcb4b6fe-kube-api-access-wzrhg\") pod \"kube-state-metrics-0\" (UID: \"8e0ac5db-edf5-4b28-838f-338fdcb4b6fe\") " pod="openstack/kube-state-metrics-0" Dec 03 07:03:27 crc kubenswrapper[4475]: I1203 07:03:27.677867 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8e0ac5db-edf5-4b28-838f-338fdcb4b6fe-combined-ca-bundle\") pod \"kube-state-metrics-0\" (UID: \"8e0ac5db-edf5-4b28-838f-338fdcb4b6fe\") " pod="openstack/kube-state-metrics-0" Dec 03 07:03:27 crc kubenswrapper[4475]: I1203 07:03:27.779437 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8e0ac5db-edf5-4b28-838f-338fdcb4b6fe-combined-ca-bundle\") pod \"kube-state-metrics-0\" (UID: \"8e0ac5db-edf5-4b28-838f-338fdcb4b6fe\") " pod="openstack/kube-state-metrics-0" Dec 03 07:03:27 crc kubenswrapper[4475]: I1203 07:03:27.779522 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-state-metrics-tls-config\" (UniqueName: \"kubernetes.io/secret/8e0ac5db-edf5-4b28-838f-338fdcb4b6fe-kube-state-metrics-tls-config\") pod \"kube-state-metrics-0\" (UID: \"8e0ac5db-edf5-4b28-838f-338fdcb4b6fe\") " pod="openstack/kube-state-metrics-0" Dec 03 07:03:27 crc 
kubenswrapper[4475]: I1203 07:03:27.779622 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-state-metrics-tls-certs\" (UniqueName: \"kubernetes.io/secret/8e0ac5db-edf5-4b28-838f-338fdcb4b6fe-kube-state-metrics-tls-certs\") pod \"kube-state-metrics-0\" (UID: \"8e0ac5db-edf5-4b28-838f-338fdcb4b6fe\") " pod="openstack/kube-state-metrics-0" Dec 03 07:03:27 crc kubenswrapper[4475]: I1203 07:03:27.779672 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wzrhg\" (UniqueName: \"kubernetes.io/projected/8e0ac5db-edf5-4b28-838f-338fdcb4b6fe-kube-api-access-wzrhg\") pod \"kube-state-metrics-0\" (UID: \"8e0ac5db-edf5-4b28-838f-338fdcb4b6fe\") " pod="openstack/kube-state-metrics-0" Dec 03 07:03:27 crc kubenswrapper[4475]: I1203 07:03:27.784899 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-tls-config\" (UniqueName: \"kubernetes.io/secret/8e0ac5db-edf5-4b28-838f-338fdcb4b6fe-kube-state-metrics-tls-config\") pod \"kube-state-metrics-0\" (UID: \"8e0ac5db-edf5-4b28-838f-338fdcb4b6fe\") " pod="openstack/kube-state-metrics-0" Dec 03 07:03:27 crc kubenswrapper[4475]: I1203 07:03:27.785014 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-tls-certs\" (UniqueName: \"kubernetes.io/secret/8e0ac5db-edf5-4b28-838f-338fdcb4b6fe-kube-state-metrics-tls-certs\") pod \"kube-state-metrics-0\" (UID: \"8e0ac5db-edf5-4b28-838f-338fdcb4b6fe\") " pod="openstack/kube-state-metrics-0" Dec 03 07:03:27 crc kubenswrapper[4475]: I1203 07:03:27.786440 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8e0ac5db-edf5-4b28-838f-338fdcb4b6fe-combined-ca-bundle\") pod \"kube-state-metrics-0\" (UID: \"8e0ac5db-edf5-4b28-838f-338fdcb4b6fe\") " pod="openstack/kube-state-metrics-0" Dec 03 07:03:27 crc kubenswrapper[4475]: I1203 07:03:27.799520 4475 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wzrhg\" (UniqueName: \"kubernetes.io/projected/8e0ac5db-edf5-4b28-838f-338fdcb4b6fe-kube-api-access-wzrhg\") pod \"kube-state-metrics-0\" (UID: \"8e0ac5db-edf5-4b28-838f-338fdcb4b6fe\") " pod="openstack/kube-state-metrics-0" Dec 03 07:03:27 crc kubenswrapper[4475]: I1203 07:03:27.841701 4475 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0" Dec 03 07:03:28 crc kubenswrapper[4475]: I1203 07:03:28.205196 4475 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Dec 03 07:03:28 crc kubenswrapper[4475]: I1203 07:03:28.205655 4475 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="7f520204-42e9-45b7-ad44-a659f8c42b74" containerName="sg-core" containerID="cri-o://0d5870bd8992a9af4666f3e76017df6d3285ab3227e80f7b202f2cd8f3d6cc6b" gracePeriod=30 Dec 03 07:03:28 crc kubenswrapper[4475]: I1203 07:03:28.205703 4475 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="7f520204-42e9-45b7-ad44-a659f8c42b74" containerName="proxy-httpd" containerID="cri-o://0242fe01d1dcb20eac884a38656f690611f9f1d035f4ae6ebc981815e1047fc9" gracePeriod=30 Dec 03 07:03:28 crc kubenswrapper[4475]: I1203 07:03:28.205622 4475 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="7f520204-42e9-45b7-ad44-a659f8c42b74" containerName="ceilometer-central-agent" containerID="cri-o://90933beecd9c3bdd80aaa67b0eba27827b6c35709e716c04ee318d612502817b" gracePeriod=30 Dec 03 07:03:28 crc kubenswrapper[4475]: I1203 07:03:28.205909 4475 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="7f520204-42e9-45b7-ad44-a659f8c42b74" containerName="ceilometer-notification-agent" 
containerID="cri-o://b0f8d3e9db4fc1b63a1161aa6cc89d63007a079832a2603ee5f796341f7bce09" gracePeriod=30 Dec 03 07:03:28 crc kubenswrapper[4475]: I1203 07:03:28.243657 4475 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"] Dec 03 07:03:28 crc kubenswrapper[4475]: I1203 07:03:28.469269 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"8e0ac5db-edf5-4b28-838f-338fdcb4b6fe","Type":"ContainerStarted","Data":"5680747ffff43237251cb919c2d1b5b6b8f25364560d4370a6c45e508ab70300"} Dec 03 07:03:28 crc kubenswrapper[4475]: I1203 07:03:28.472229 4475 generic.go:334] "Generic (PLEG): container finished" podID="7f520204-42e9-45b7-ad44-a659f8c42b74" containerID="0242fe01d1dcb20eac884a38656f690611f9f1d035f4ae6ebc981815e1047fc9" exitCode=0 Dec 03 07:03:28 crc kubenswrapper[4475]: I1203 07:03:28.472257 4475 generic.go:334] "Generic (PLEG): container finished" podID="7f520204-42e9-45b7-ad44-a659f8c42b74" containerID="0d5870bd8992a9af4666f3e76017df6d3285ab3227e80f7b202f2cd8f3d6cc6b" exitCode=2 Dec 03 07:03:28 crc kubenswrapper[4475]: I1203 07:03:28.472294 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"7f520204-42e9-45b7-ad44-a659f8c42b74","Type":"ContainerDied","Data":"0242fe01d1dcb20eac884a38656f690611f9f1d035f4ae6ebc981815e1047fc9"} Dec 03 07:03:28 crc kubenswrapper[4475]: I1203 07:03:28.472320 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"7f520204-42e9-45b7-ad44-a659f8c42b74","Type":"ContainerDied","Data":"0d5870bd8992a9af4666f3e76017df6d3285ab3227e80f7b202f2cd8f3d6cc6b"} Dec 03 07:03:29 crc kubenswrapper[4475]: I1203 07:03:29.483187 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"8e0ac5db-edf5-4b28-838f-338fdcb4b6fe","Type":"ContainerStarted","Data":"18cd9dcbdd7271674bc4ee382c323521a529293825a744f8a943b0a97d070944"} Dec 03 07:03:29 crc 
kubenswrapper[4475]: I1203 07:03:29.483638 4475 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/kube-state-metrics-0" Dec 03 07:03:29 crc kubenswrapper[4475]: I1203 07:03:29.485976 4475 generic.go:334] "Generic (PLEG): container finished" podID="7f520204-42e9-45b7-ad44-a659f8c42b74" containerID="90933beecd9c3bdd80aaa67b0eba27827b6c35709e716c04ee318d612502817b" exitCode=0 Dec 03 07:03:29 crc kubenswrapper[4475]: I1203 07:03:29.486018 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"7f520204-42e9-45b7-ad44-a659f8c42b74","Type":"ContainerDied","Data":"90933beecd9c3bdd80aaa67b0eba27827b6c35709e716c04ee318d612502817b"} Dec 03 07:03:29 crc kubenswrapper[4475]: I1203 07:03:29.499877 4475 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fd232cd1-6aca-43cc-9876-535cd9eb39eb" path="/var/lib/kubelet/pods/fd232cd1-6aca-43cc-9876-535cd9eb39eb/volumes" Dec 03 07:03:29 crc kubenswrapper[4475]: I1203 07:03:29.501794 4475 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/kube-state-metrics-0" podStartSLOduration=2.212696254 podStartE2EDuration="2.501778269s" podCreationTimestamp="2025-12-03 07:03:27 +0000 UTC" firstStartedPulling="2025-12-03 07:03:28.24726713 +0000 UTC m=+1093.052165464" lastFinishedPulling="2025-12-03 07:03:28.536349145 +0000 UTC m=+1093.341247479" observedRunningTime="2025-12-03 07:03:29.497997728 +0000 UTC m=+1094.302896082" watchObservedRunningTime="2025-12-03 07:03:29.501778269 +0000 UTC m=+1094.306676603" Dec 03 07:03:29 crc kubenswrapper[4475]: I1203 07:03:29.679178 4475 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0" Dec 03 07:03:29 crc kubenswrapper[4475]: I1203 07:03:29.679797 4475 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0" Dec 03 07:03:29 crc kubenswrapper[4475]: I1203 07:03:29.690696 4475 kubelet.go:2542] 
"SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0" Dec 03 07:03:30 crc kubenswrapper[4475]: I1203 07:03:30.497031 4475 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0" Dec 03 07:03:31 crc kubenswrapper[4475]: I1203 07:03:31.096270 4475 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Dec 03 07:03:31 crc kubenswrapper[4475]: I1203 07:03:31.137150 4475 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ktjsf\" (UniqueName: \"kubernetes.io/projected/7f520204-42e9-45b7-ad44-a659f8c42b74-kube-api-access-ktjsf\") pod \"7f520204-42e9-45b7-ad44-a659f8c42b74\" (UID: \"7f520204-42e9-45b7-ad44-a659f8c42b74\") " Dec 03 07:03:31 crc kubenswrapper[4475]: I1203 07:03:31.137309 4475 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7f520204-42e9-45b7-ad44-a659f8c42b74-combined-ca-bundle\") pod \"7f520204-42e9-45b7-ad44-a659f8c42b74\" (UID: \"7f520204-42e9-45b7-ad44-a659f8c42b74\") " Dec 03 07:03:31 crc kubenswrapper[4475]: I1203 07:03:31.137388 4475 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7f520204-42e9-45b7-ad44-a659f8c42b74-scripts\") pod \"7f520204-42e9-45b7-ad44-a659f8c42b74\" (UID: \"7f520204-42e9-45b7-ad44-a659f8c42b74\") " Dec 03 07:03:31 crc kubenswrapper[4475]: I1203 07:03:31.137488 4475 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/7f520204-42e9-45b7-ad44-a659f8c42b74-run-httpd\") pod \"7f520204-42e9-45b7-ad44-a659f8c42b74\" (UID: \"7f520204-42e9-45b7-ad44-a659f8c42b74\") " Dec 03 07:03:31 crc kubenswrapper[4475]: I1203 07:03:31.137561 4475 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" 
(UniqueName: \"kubernetes.io/empty-dir/7f520204-42e9-45b7-ad44-a659f8c42b74-log-httpd\") pod \"7f520204-42e9-45b7-ad44-a659f8c42b74\" (UID: \"7f520204-42e9-45b7-ad44-a659f8c42b74\") " Dec 03 07:03:31 crc kubenswrapper[4475]: I1203 07:03:31.137595 4475 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7f520204-42e9-45b7-ad44-a659f8c42b74-config-data\") pod \"7f520204-42e9-45b7-ad44-a659f8c42b74\" (UID: \"7f520204-42e9-45b7-ad44-a659f8c42b74\") " Dec 03 07:03:31 crc kubenswrapper[4475]: I1203 07:03:31.137683 4475 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/7f520204-42e9-45b7-ad44-a659f8c42b74-sg-core-conf-yaml\") pod \"7f520204-42e9-45b7-ad44-a659f8c42b74\" (UID: \"7f520204-42e9-45b7-ad44-a659f8c42b74\") " Dec 03 07:03:31 crc kubenswrapper[4475]: I1203 07:03:31.139826 4475 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7f520204-42e9-45b7-ad44-a659f8c42b74-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "7f520204-42e9-45b7-ad44-a659f8c42b74" (UID: "7f520204-42e9-45b7-ad44-a659f8c42b74"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 07:03:31 crc kubenswrapper[4475]: I1203 07:03:31.140127 4475 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7f520204-42e9-45b7-ad44-a659f8c42b74-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "7f520204-42e9-45b7-ad44-a659f8c42b74" (UID: "7f520204-42e9-45b7-ad44-a659f8c42b74"). InnerVolumeSpecName "run-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 07:03:31 crc kubenswrapper[4475]: I1203 07:03:31.163175 4475 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7f520204-42e9-45b7-ad44-a659f8c42b74-scripts" (OuterVolumeSpecName: "scripts") pod "7f520204-42e9-45b7-ad44-a659f8c42b74" (UID: "7f520204-42e9-45b7-ad44-a659f8c42b74"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 07:03:31 crc kubenswrapper[4475]: I1203 07:03:31.170615 4475 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7f520204-42e9-45b7-ad44-a659f8c42b74-kube-api-access-ktjsf" (OuterVolumeSpecName: "kube-api-access-ktjsf") pod "7f520204-42e9-45b7-ad44-a659f8c42b74" (UID: "7f520204-42e9-45b7-ad44-a659f8c42b74"). InnerVolumeSpecName "kube-api-access-ktjsf". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 07:03:31 crc kubenswrapper[4475]: I1203 07:03:31.216303 4475 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7f520204-42e9-45b7-ad44-a659f8c42b74-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "7f520204-42e9-45b7-ad44-a659f8c42b74" (UID: "7f520204-42e9-45b7-ad44-a659f8c42b74"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 07:03:31 crc kubenswrapper[4475]: I1203 07:03:31.217535 4475 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7f520204-42e9-45b7-ad44-a659f8c42b74-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "7f520204-42e9-45b7-ad44-a659f8c42b74" (UID: "7f520204-42e9-45b7-ad44-a659f8c42b74"). InnerVolumeSpecName "sg-core-conf-yaml". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 07:03:31 crc kubenswrapper[4475]: I1203 07:03:31.252181 4475 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7f520204-42e9-45b7-ad44-a659f8c42b74-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 03 07:03:31 crc kubenswrapper[4475]: I1203 07:03:31.252210 4475 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7f520204-42e9-45b7-ad44-a659f8c42b74-scripts\") on node \"crc\" DevicePath \"\"" Dec 03 07:03:31 crc kubenswrapper[4475]: I1203 07:03:31.252221 4475 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/7f520204-42e9-45b7-ad44-a659f8c42b74-run-httpd\") on node \"crc\" DevicePath \"\"" Dec 03 07:03:31 crc kubenswrapper[4475]: I1203 07:03:31.252231 4475 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/7f520204-42e9-45b7-ad44-a659f8c42b74-log-httpd\") on node \"crc\" DevicePath \"\"" Dec 03 07:03:31 crc kubenswrapper[4475]: I1203 07:03:31.252239 4475 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/7f520204-42e9-45b7-ad44-a659f8c42b74-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Dec 03 07:03:31 crc kubenswrapper[4475]: I1203 07:03:31.252247 4475 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ktjsf\" (UniqueName: \"kubernetes.io/projected/7f520204-42e9-45b7-ad44-a659f8c42b74-kube-api-access-ktjsf\") on node \"crc\" DevicePath \"\"" Dec 03 07:03:31 crc kubenswrapper[4475]: I1203 07:03:31.269254 4475 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7f520204-42e9-45b7-ad44-a659f8c42b74-config-data" (OuterVolumeSpecName: "config-data") pod "7f520204-42e9-45b7-ad44-a659f8c42b74" (UID: "7f520204-42e9-45b7-ad44-a659f8c42b74"). 
InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 07:03:31 crc kubenswrapper[4475]: I1203 07:03:31.353777 4475 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7f520204-42e9-45b7-ad44-a659f8c42b74-config-data\") on node \"crc\" DevicePath \"\"" Dec 03 07:03:31 crc kubenswrapper[4475]: I1203 07:03:31.502171 4475 generic.go:334] "Generic (PLEG): container finished" podID="7f520204-42e9-45b7-ad44-a659f8c42b74" containerID="b0f8d3e9db4fc1b63a1161aa6cc89d63007a079832a2603ee5f796341f7bce09" exitCode=0 Dec 03 07:03:31 crc kubenswrapper[4475]: I1203 07:03:31.502217 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"7f520204-42e9-45b7-ad44-a659f8c42b74","Type":"ContainerDied","Data":"b0f8d3e9db4fc1b63a1161aa6cc89d63007a079832a2603ee5f796341f7bce09"} Dec 03 07:03:31 crc kubenswrapper[4475]: I1203 07:03:31.502238 4475 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Dec 03 07:03:31 crc kubenswrapper[4475]: I1203 07:03:31.502261 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"7f520204-42e9-45b7-ad44-a659f8c42b74","Type":"ContainerDied","Data":"906af082340f7ee25d7df103d349f6d93e936aebce5bde415333c11820747cfb"} Dec 03 07:03:31 crc kubenswrapper[4475]: I1203 07:03:31.502284 4475 scope.go:117] "RemoveContainer" containerID="0242fe01d1dcb20eac884a38656f690611f9f1d035f4ae6ebc981815e1047fc9" Dec 03 07:03:31 crc kubenswrapper[4475]: I1203 07:03:31.525815 4475 scope.go:117] "RemoveContainer" containerID="0d5870bd8992a9af4666f3e76017df6d3285ab3227e80f7b202f2cd8f3d6cc6b" Dec 03 07:03:31 crc kubenswrapper[4475]: I1203 07:03:31.533133 4475 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Dec 03 07:03:31 crc kubenswrapper[4475]: I1203 07:03:31.537221 4475 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Dec 03 07:03:31 crc kubenswrapper[4475]: I1203 07:03:31.550511 4475 scope.go:117] "RemoveContainer" containerID="b0f8d3e9db4fc1b63a1161aa6cc89d63007a079832a2603ee5f796341f7bce09" Dec 03 07:03:31 crc kubenswrapper[4475]: I1203 07:03:31.565554 4475 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Dec 03 07:03:31 crc kubenswrapper[4475]: E1203 07:03:31.565937 4475 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7f520204-42e9-45b7-ad44-a659f8c42b74" containerName="sg-core" Dec 03 07:03:31 crc kubenswrapper[4475]: I1203 07:03:31.565955 4475 state_mem.go:107] "Deleted CPUSet assignment" podUID="7f520204-42e9-45b7-ad44-a659f8c42b74" containerName="sg-core" Dec 03 07:03:31 crc kubenswrapper[4475]: E1203 07:03:31.565970 4475 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7f520204-42e9-45b7-ad44-a659f8c42b74" containerName="proxy-httpd" Dec 03 07:03:31 crc kubenswrapper[4475]: I1203 07:03:31.565977 4475 state_mem.go:107] 
"Deleted CPUSet assignment" podUID="7f520204-42e9-45b7-ad44-a659f8c42b74" containerName="proxy-httpd" Dec 03 07:03:31 crc kubenswrapper[4475]: E1203 07:03:31.566003 4475 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7f520204-42e9-45b7-ad44-a659f8c42b74" containerName="ceilometer-central-agent" Dec 03 07:03:31 crc kubenswrapper[4475]: I1203 07:03:31.566009 4475 state_mem.go:107] "Deleted CPUSet assignment" podUID="7f520204-42e9-45b7-ad44-a659f8c42b74" containerName="ceilometer-central-agent" Dec 03 07:03:31 crc kubenswrapper[4475]: E1203 07:03:31.566016 4475 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7f520204-42e9-45b7-ad44-a659f8c42b74" containerName="ceilometer-notification-agent" Dec 03 07:03:31 crc kubenswrapper[4475]: I1203 07:03:31.566022 4475 state_mem.go:107] "Deleted CPUSet assignment" podUID="7f520204-42e9-45b7-ad44-a659f8c42b74" containerName="ceilometer-notification-agent" Dec 03 07:03:31 crc kubenswrapper[4475]: I1203 07:03:31.566219 4475 memory_manager.go:354] "RemoveStaleState removing state" podUID="7f520204-42e9-45b7-ad44-a659f8c42b74" containerName="sg-core" Dec 03 07:03:31 crc kubenswrapper[4475]: I1203 07:03:31.566235 4475 memory_manager.go:354] "RemoveStaleState removing state" podUID="7f520204-42e9-45b7-ad44-a659f8c42b74" containerName="proxy-httpd" Dec 03 07:03:31 crc kubenswrapper[4475]: I1203 07:03:31.566249 4475 memory_manager.go:354] "RemoveStaleState removing state" podUID="7f520204-42e9-45b7-ad44-a659f8c42b74" containerName="ceilometer-central-agent" Dec 03 07:03:31 crc kubenswrapper[4475]: I1203 07:03:31.566269 4475 memory_manager.go:354] "RemoveStaleState removing state" podUID="7f520204-42e9-45b7-ad44-a659f8c42b74" containerName="ceilometer-notification-agent" Dec 03 07:03:31 crc kubenswrapper[4475]: I1203 07:03:31.567845 4475 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Dec 03 07:03:31 crc kubenswrapper[4475]: I1203 07:03:31.578336 4475 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Dec 03 07:03:31 crc kubenswrapper[4475]: I1203 07:03:31.582658 4475 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ceilometer-internal-svc" Dec 03 07:03:31 crc kubenswrapper[4475]: I1203 07:03:31.582769 4475 scope.go:117] "RemoveContainer" containerID="90933beecd9c3bdd80aaa67b0eba27827b6c35709e716c04ee318d612502817b" Dec 03 07:03:31 crc kubenswrapper[4475]: I1203 07:03:31.582824 4475 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Dec 03 07:03:31 crc kubenswrapper[4475]: I1203 07:03:31.582907 4475 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Dec 03 07:03:31 crc kubenswrapper[4475]: I1203 07:03:31.606399 4475 scope.go:117] "RemoveContainer" containerID="0242fe01d1dcb20eac884a38656f690611f9f1d035f4ae6ebc981815e1047fc9" Dec 03 07:03:31 crc kubenswrapper[4475]: E1203 07:03:31.607596 4475 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0242fe01d1dcb20eac884a38656f690611f9f1d035f4ae6ebc981815e1047fc9\": container with ID starting with 0242fe01d1dcb20eac884a38656f690611f9f1d035f4ae6ebc981815e1047fc9 not found: ID does not exist" containerID="0242fe01d1dcb20eac884a38656f690611f9f1d035f4ae6ebc981815e1047fc9" Dec 03 07:03:31 crc kubenswrapper[4475]: I1203 07:03:31.607642 4475 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0242fe01d1dcb20eac884a38656f690611f9f1d035f4ae6ebc981815e1047fc9"} err="failed to get container status \"0242fe01d1dcb20eac884a38656f690611f9f1d035f4ae6ebc981815e1047fc9\": rpc error: code = NotFound desc = could not find container \"0242fe01d1dcb20eac884a38656f690611f9f1d035f4ae6ebc981815e1047fc9\": 
container with ID starting with 0242fe01d1dcb20eac884a38656f690611f9f1d035f4ae6ebc981815e1047fc9 not found: ID does not exist" Dec 03 07:03:31 crc kubenswrapper[4475]: I1203 07:03:31.607666 4475 scope.go:117] "RemoveContainer" containerID="0d5870bd8992a9af4666f3e76017df6d3285ab3227e80f7b202f2cd8f3d6cc6b" Dec 03 07:03:31 crc kubenswrapper[4475]: E1203 07:03:31.607983 4475 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0d5870bd8992a9af4666f3e76017df6d3285ab3227e80f7b202f2cd8f3d6cc6b\": container with ID starting with 0d5870bd8992a9af4666f3e76017df6d3285ab3227e80f7b202f2cd8f3d6cc6b not found: ID does not exist" containerID="0d5870bd8992a9af4666f3e76017df6d3285ab3227e80f7b202f2cd8f3d6cc6b" Dec 03 07:03:31 crc kubenswrapper[4475]: I1203 07:03:31.608016 4475 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0d5870bd8992a9af4666f3e76017df6d3285ab3227e80f7b202f2cd8f3d6cc6b"} err="failed to get container status \"0d5870bd8992a9af4666f3e76017df6d3285ab3227e80f7b202f2cd8f3d6cc6b\": rpc error: code = NotFound desc = could not find container \"0d5870bd8992a9af4666f3e76017df6d3285ab3227e80f7b202f2cd8f3d6cc6b\": container with ID starting with 0d5870bd8992a9af4666f3e76017df6d3285ab3227e80f7b202f2cd8f3d6cc6b not found: ID does not exist" Dec 03 07:03:31 crc kubenswrapper[4475]: I1203 07:03:31.608037 4475 scope.go:117] "RemoveContainer" containerID="b0f8d3e9db4fc1b63a1161aa6cc89d63007a079832a2603ee5f796341f7bce09" Dec 03 07:03:31 crc kubenswrapper[4475]: E1203 07:03:31.608312 4475 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b0f8d3e9db4fc1b63a1161aa6cc89d63007a079832a2603ee5f796341f7bce09\": container with ID starting with b0f8d3e9db4fc1b63a1161aa6cc89d63007a079832a2603ee5f796341f7bce09 not found: ID does not exist" 
containerID="b0f8d3e9db4fc1b63a1161aa6cc89d63007a079832a2603ee5f796341f7bce09" Dec 03 07:03:31 crc kubenswrapper[4475]: I1203 07:03:31.608340 4475 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b0f8d3e9db4fc1b63a1161aa6cc89d63007a079832a2603ee5f796341f7bce09"} err="failed to get container status \"b0f8d3e9db4fc1b63a1161aa6cc89d63007a079832a2603ee5f796341f7bce09\": rpc error: code = NotFound desc = could not find container \"b0f8d3e9db4fc1b63a1161aa6cc89d63007a079832a2603ee5f796341f7bce09\": container with ID starting with b0f8d3e9db4fc1b63a1161aa6cc89d63007a079832a2603ee5f796341f7bce09 not found: ID does not exist" Dec 03 07:03:31 crc kubenswrapper[4475]: I1203 07:03:31.608356 4475 scope.go:117] "RemoveContainer" containerID="90933beecd9c3bdd80aaa67b0eba27827b6c35709e716c04ee318d612502817b" Dec 03 07:03:31 crc kubenswrapper[4475]: E1203 07:03:31.608634 4475 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"90933beecd9c3bdd80aaa67b0eba27827b6c35709e716c04ee318d612502817b\": container with ID starting with 90933beecd9c3bdd80aaa67b0eba27827b6c35709e716c04ee318d612502817b not found: ID does not exist" containerID="90933beecd9c3bdd80aaa67b0eba27827b6c35709e716c04ee318d612502817b" Dec 03 07:03:31 crc kubenswrapper[4475]: I1203 07:03:31.608662 4475 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"90933beecd9c3bdd80aaa67b0eba27827b6c35709e716c04ee318d612502817b"} err="failed to get container status \"90933beecd9c3bdd80aaa67b0eba27827b6c35709e716c04ee318d612502817b\": rpc error: code = NotFound desc = could not find container \"90933beecd9c3bdd80aaa67b0eba27827b6c35709e716c04ee318d612502817b\": container with ID starting with 90933beecd9c3bdd80aaa67b0eba27827b6c35709e716c04ee318d612502817b not found: ID does not exist" Dec 03 07:03:31 crc kubenswrapper[4475]: I1203 07:03:31.657060 4475 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d94bbf84-e435-4a04-8aeb-e071d45a1dc5-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"d94bbf84-e435-4a04-8aeb-e071d45a1dc5\") " pod="openstack/ceilometer-0" Dec 03 07:03:31 crc kubenswrapper[4475]: I1203 07:03:31.657103 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/d94bbf84-e435-4a04-8aeb-e071d45a1dc5-log-httpd\") pod \"ceilometer-0\" (UID: \"d94bbf84-e435-4a04-8aeb-e071d45a1dc5\") " pod="openstack/ceilometer-0" Dec 03 07:03:31 crc kubenswrapper[4475]: I1203 07:03:31.657130 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/d94bbf84-e435-4a04-8aeb-e071d45a1dc5-run-httpd\") pod \"ceilometer-0\" (UID: \"d94bbf84-e435-4a04-8aeb-e071d45a1dc5\") " pod="openstack/ceilometer-0" Dec 03 07:03:31 crc kubenswrapper[4475]: I1203 07:03:31.657324 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d94bbf84-e435-4a04-8aeb-e071d45a1dc5-scripts\") pod \"ceilometer-0\" (UID: \"d94bbf84-e435-4a04-8aeb-e071d45a1dc5\") " pod="openstack/ceilometer-0" Dec 03 07:03:31 crc kubenswrapper[4475]: I1203 07:03:31.657416 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/d94bbf84-e435-4a04-8aeb-e071d45a1dc5-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"d94bbf84-e435-4a04-8aeb-e071d45a1dc5\") " pod="openstack/ceilometer-0" Dec 03 07:03:31 crc kubenswrapper[4475]: I1203 07:03:31.657489 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/d94bbf84-e435-4a04-8aeb-e071d45a1dc5-config-data\") pod \"ceilometer-0\" (UID: \"d94bbf84-e435-4a04-8aeb-e071d45a1dc5\") " pod="openstack/ceilometer-0" Dec 03 07:03:31 crc kubenswrapper[4475]: I1203 07:03:31.657587 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/d94bbf84-e435-4a04-8aeb-e071d45a1dc5-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"d94bbf84-e435-4a04-8aeb-e071d45a1dc5\") " pod="openstack/ceilometer-0" Dec 03 07:03:31 crc kubenswrapper[4475]: I1203 07:03:31.657654 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6cffp\" (UniqueName: \"kubernetes.io/projected/d94bbf84-e435-4a04-8aeb-e071d45a1dc5-kube-api-access-6cffp\") pod \"ceilometer-0\" (UID: \"d94bbf84-e435-4a04-8aeb-e071d45a1dc5\") " pod="openstack/ceilometer-0" Dec 03 07:03:31 crc kubenswrapper[4475]: I1203 07:03:31.758760 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/d94bbf84-e435-4a04-8aeb-e071d45a1dc5-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"d94bbf84-e435-4a04-8aeb-e071d45a1dc5\") " pod="openstack/ceilometer-0" Dec 03 07:03:31 crc kubenswrapper[4475]: I1203 07:03:31.758820 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6cffp\" (UniqueName: \"kubernetes.io/projected/d94bbf84-e435-4a04-8aeb-e071d45a1dc5-kube-api-access-6cffp\") pod \"ceilometer-0\" (UID: \"d94bbf84-e435-4a04-8aeb-e071d45a1dc5\") " pod="openstack/ceilometer-0" Dec 03 07:03:31 crc kubenswrapper[4475]: I1203 07:03:31.758883 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d94bbf84-e435-4a04-8aeb-e071d45a1dc5-combined-ca-bundle\") pod \"ceilometer-0\" (UID: 
\"d94bbf84-e435-4a04-8aeb-e071d45a1dc5\") " pod="openstack/ceilometer-0" Dec 03 07:03:31 crc kubenswrapper[4475]: I1203 07:03:31.758913 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/d94bbf84-e435-4a04-8aeb-e071d45a1dc5-log-httpd\") pod \"ceilometer-0\" (UID: \"d94bbf84-e435-4a04-8aeb-e071d45a1dc5\") " pod="openstack/ceilometer-0" Dec 03 07:03:31 crc kubenswrapper[4475]: I1203 07:03:31.758943 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/d94bbf84-e435-4a04-8aeb-e071d45a1dc5-run-httpd\") pod \"ceilometer-0\" (UID: \"d94bbf84-e435-4a04-8aeb-e071d45a1dc5\") " pod="openstack/ceilometer-0" Dec 03 07:03:31 crc kubenswrapper[4475]: I1203 07:03:31.759005 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d94bbf84-e435-4a04-8aeb-e071d45a1dc5-scripts\") pod \"ceilometer-0\" (UID: \"d94bbf84-e435-4a04-8aeb-e071d45a1dc5\") " pod="openstack/ceilometer-0" Dec 03 07:03:31 crc kubenswrapper[4475]: I1203 07:03:31.759054 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/d94bbf84-e435-4a04-8aeb-e071d45a1dc5-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"d94bbf84-e435-4a04-8aeb-e071d45a1dc5\") " pod="openstack/ceilometer-0" Dec 03 07:03:31 crc kubenswrapper[4475]: I1203 07:03:31.759081 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d94bbf84-e435-4a04-8aeb-e071d45a1dc5-config-data\") pod \"ceilometer-0\" (UID: \"d94bbf84-e435-4a04-8aeb-e071d45a1dc5\") " pod="openstack/ceilometer-0" Dec 03 07:03:31 crc kubenswrapper[4475]: I1203 07:03:31.759764 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: 
\"kubernetes.io/empty-dir/d94bbf84-e435-4a04-8aeb-e071d45a1dc5-run-httpd\") pod \"ceilometer-0\" (UID: \"d94bbf84-e435-4a04-8aeb-e071d45a1dc5\") " pod="openstack/ceilometer-0" Dec 03 07:03:31 crc kubenswrapper[4475]: I1203 07:03:31.759875 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/d94bbf84-e435-4a04-8aeb-e071d45a1dc5-log-httpd\") pod \"ceilometer-0\" (UID: \"d94bbf84-e435-4a04-8aeb-e071d45a1dc5\") " pod="openstack/ceilometer-0" Dec 03 07:03:31 crc kubenswrapper[4475]: I1203 07:03:31.762370 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d94bbf84-e435-4a04-8aeb-e071d45a1dc5-config-data\") pod \"ceilometer-0\" (UID: \"d94bbf84-e435-4a04-8aeb-e071d45a1dc5\") " pod="openstack/ceilometer-0" Dec 03 07:03:31 crc kubenswrapper[4475]: I1203 07:03:31.763015 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/d94bbf84-e435-4a04-8aeb-e071d45a1dc5-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"d94bbf84-e435-4a04-8aeb-e071d45a1dc5\") " pod="openstack/ceilometer-0" Dec 03 07:03:31 crc kubenswrapper[4475]: I1203 07:03:31.763919 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d94bbf84-e435-4a04-8aeb-e071d45a1dc5-scripts\") pod \"ceilometer-0\" (UID: \"d94bbf84-e435-4a04-8aeb-e071d45a1dc5\") " pod="openstack/ceilometer-0" Dec 03 07:03:31 crc kubenswrapper[4475]: I1203 07:03:31.767055 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d94bbf84-e435-4a04-8aeb-e071d45a1dc5-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"d94bbf84-e435-4a04-8aeb-e071d45a1dc5\") " pod="openstack/ceilometer-0" Dec 03 07:03:31 crc kubenswrapper[4475]: I1203 07:03:31.767753 4475 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/d94bbf84-e435-4a04-8aeb-e071d45a1dc5-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"d94bbf84-e435-4a04-8aeb-e071d45a1dc5\") " pod="openstack/ceilometer-0" Dec 03 07:03:31 crc kubenswrapper[4475]: I1203 07:03:31.773905 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6cffp\" (UniqueName: \"kubernetes.io/projected/d94bbf84-e435-4a04-8aeb-e071d45a1dc5-kube-api-access-6cffp\") pod \"ceilometer-0\" (UID: \"d94bbf84-e435-4a04-8aeb-e071d45a1dc5\") " pod="openstack/ceilometer-0" Dec 03 07:03:31 crc kubenswrapper[4475]: I1203 07:03:31.895354 4475 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Dec 03 07:03:32 crc kubenswrapper[4475]: I1203 07:03:32.033549 4475 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Dec 03 07:03:32 crc kubenswrapper[4475]: I1203 07:03:32.034128 4475 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Dec 03 07:03:32 crc kubenswrapper[4475]: I1203 07:03:32.036670 4475 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Dec 03 07:03:32 crc kubenswrapper[4475]: I1203 07:03:32.045216 4475 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0" Dec 03 07:03:32 crc kubenswrapper[4475]: I1203 07:03:32.331888 4475 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Dec 03 07:03:32 crc kubenswrapper[4475]: W1203 07:03:32.333762 4475 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd94bbf84_e435_4a04_8aeb_e071d45a1dc5.slice/crio-28cd0a6d206e8286cc85d3d0bd64f2c1a018d544a6f11fd1d4eddf1a7e1fe6fa WatchSource:0}: Error finding container 
28cd0a6d206e8286cc85d3d0bd64f2c1a018d544a6f11fd1d4eddf1a7e1fe6fa: Status 404 returned error can't find the container with id 28cd0a6d206e8286cc85d3d0bd64f2c1a018d544a6f11fd1d4eddf1a7e1fe6fa Dec 03 07:03:32 crc kubenswrapper[4475]: I1203 07:03:32.509481 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"d94bbf84-e435-4a04-8aeb-e071d45a1dc5","Type":"ContainerStarted","Data":"28cd0a6d206e8286cc85d3d0bd64f2c1a018d544a6f11fd1d4eddf1a7e1fe6fa"} Dec 03 07:03:32 crc kubenswrapper[4475]: I1203 07:03:32.510770 4475 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Dec 03 07:03:32 crc kubenswrapper[4475]: I1203 07:03:32.516524 4475 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0" Dec 03 07:03:33 crc kubenswrapper[4475]: I1203 07:03:33.499381 4475 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7f520204-42e9-45b7-ad44-a659f8c42b74" path="/var/lib/kubelet/pods/7f520204-42e9-45b7-ad44-a659f8c42b74/volumes" Dec 03 07:03:33 crc kubenswrapper[4475]: I1203 07:03:33.517832 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"d94bbf84-e435-4a04-8aeb-e071d45a1dc5","Type":"ContainerStarted","Data":"420170cce12639efd166d7598478137cc46d6ae4b0281d51cae19f4bf2098a2b"} Dec 03 07:03:34 crc kubenswrapper[4475]: I1203 07:03:34.525549 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"d94bbf84-e435-4a04-8aeb-e071d45a1dc5","Type":"ContainerStarted","Data":"713cde5d2283b3ff757b836b3ad663a2fe9c40c3a1f7411fb899bdbad25b16b6"} Dec 03 07:03:35 crc kubenswrapper[4475]: I1203 07:03:35.533717 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"d94bbf84-e435-4a04-8aeb-e071d45a1dc5","Type":"ContainerStarted","Data":"5ebc555ee6b2e13710d3213e4d0be68aa8b77636f28d967af9aefbeb10deb48c"} Dec 03 07:03:36 crc kubenswrapper[4475]: 
I1203 07:03:36.543685 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"d94bbf84-e435-4a04-8aeb-e071d45a1dc5","Type":"ContainerStarted","Data":"e528ecaa1d3cd963f11b1b885b7cdc54b4f81b642634191f4038f2bde2ec2340"} Dec 03 07:03:36 crc kubenswrapper[4475]: I1203 07:03:36.544092 4475 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Dec 03 07:03:36 crc kubenswrapper[4475]: I1203 07:03:36.571224 4475 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=1.9414009669999999 podStartE2EDuration="5.571207408s" podCreationTimestamp="2025-12-03 07:03:31 +0000 UTC" firstStartedPulling="2025-12-03 07:03:32.335776371 +0000 UTC m=+1097.140674705" lastFinishedPulling="2025-12-03 07:03:35.965582812 +0000 UTC m=+1100.770481146" observedRunningTime="2025-12-03 07:03:36.567623427 +0000 UTC m=+1101.372521761" watchObservedRunningTime="2025-12-03 07:03:36.571207408 +0000 UTC m=+1101.376105742" Dec 03 07:03:37 crc kubenswrapper[4475]: I1203 07:03:37.850656 4475 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/kube-state-metrics-0" Dec 03 07:04:01 crc kubenswrapper[4475]: I1203 07:04:01.903359 4475 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ceilometer-0" Dec 03 07:04:09 crc kubenswrapper[4475]: I1203 07:04:09.800170 4475 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-server-0"] Dec 03 07:04:10 crc kubenswrapper[4475]: I1203 07:04:10.632810 4475 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Dec 03 07:04:13 crc kubenswrapper[4475]: I1203 07:04:13.464728 4475 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/rabbitmq-server-0" podUID="386645cd-74e5-45bc-b3e4-0a326e5349f1" containerName="rabbitmq" 
containerID="cri-o://ea4744fd3cea2ebbc5ee016691ec2d3fafcd209271fae422863e376aa08ff040" gracePeriod=604797 Dec 03 07:04:14 crc kubenswrapper[4475]: I1203 07:04:14.404106 4475 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/rabbitmq-cell1-server-0" podUID="6447be14-8b0d-4514-a7c2-53da228c70c2" containerName="rabbitmq" containerID="cri-o://96852b33430a63bdead797314fa5d22c06e41066c7f9fd702bba19528ed39843" gracePeriod=604797 Dec 03 07:04:19 crc kubenswrapper[4475]: I1203 07:04:19.830166 4475 generic.go:334] "Generic (PLEG): container finished" podID="386645cd-74e5-45bc-b3e4-0a326e5349f1" containerID="ea4744fd3cea2ebbc5ee016691ec2d3fafcd209271fae422863e376aa08ff040" exitCode=0 Dec 03 07:04:19 crc kubenswrapper[4475]: I1203 07:04:19.831086 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"386645cd-74e5-45bc-b3e4-0a326e5349f1","Type":"ContainerDied","Data":"ea4744fd3cea2ebbc5ee016691ec2d3fafcd209271fae422863e376aa08ff040"} Dec 03 07:04:19 crc kubenswrapper[4475]: I1203 07:04:19.831122 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"386645cd-74e5-45bc-b3e4-0a326e5349f1","Type":"ContainerDied","Data":"d9276793d2a0033e80141eadec5bd4c55cf789534249ec3e081fe04916a19426"} Dec 03 07:04:19 crc kubenswrapper[4475]: I1203 07:04:19.831133 4475 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d9276793d2a0033e80141eadec5bd4c55cf789534249ec3e081fe04916a19426" Dec 03 07:04:19 crc kubenswrapper[4475]: I1203 07:04:19.843326 4475 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-server-0" Dec 03 07:04:20 crc kubenswrapper[4475]: I1203 07:04:20.022318 4475 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/386645cd-74e5-45bc-b3e4-0a326e5349f1-plugins-conf\") pod \"386645cd-74e5-45bc-b3e4-0a326e5349f1\" (UID: \"386645cd-74e5-45bc-b3e4-0a326e5349f1\") " Dec 03 07:04:20 crc kubenswrapper[4475]: I1203 07:04:20.022370 4475 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/386645cd-74e5-45bc-b3e4-0a326e5349f1-rabbitmq-erlang-cookie\") pod \"386645cd-74e5-45bc-b3e4-0a326e5349f1\" (UID: \"386645cd-74e5-45bc-b3e4-0a326e5349f1\") " Dec 03 07:04:20 crc kubenswrapper[4475]: I1203 07:04:20.022394 4475 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dtn6k\" (UniqueName: \"kubernetes.io/projected/386645cd-74e5-45bc-b3e4-0a326e5349f1-kube-api-access-dtn6k\") pod \"386645cd-74e5-45bc-b3e4-0a326e5349f1\" (UID: \"386645cd-74e5-45bc-b3e4-0a326e5349f1\") " Dec 03 07:04:20 crc kubenswrapper[4475]: I1203 07:04:20.022444 4475 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/386645cd-74e5-45bc-b3e4-0a326e5349f1-pod-info\") pod \"386645cd-74e5-45bc-b3e4-0a326e5349f1\" (UID: \"386645cd-74e5-45bc-b3e4-0a326e5349f1\") " Dec 03 07:04:20 crc kubenswrapper[4475]: I1203 07:04:20.022566 4475 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/386645cd-74e5-45bc-b3e4-0a326e5349f1-rabbitmq-tls\") pod \"386645cd-74e5-45bc-b3e4-0a326e5349f1\" (UID: \"386645cd-74e5-45bc-b3e4-0a326e5349f1\") " Dec 03 07:04:20 crc kubenswrapper[4475]: I1203 07:04:20.022595 4475 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"config-data\" (UniqueName: \"kubernetes.io/configmap/386645cd-74e5-45bc-b3e4-0a326e5349f1-config-data\") pod \"386645cd-74e5-45bc-b3e4-0a326e5349f1\" (UID: \"386645cd-74e5-45bc-b3e4-0a326e5349f1\") " Dec 03 07:04:20 crc kubenswrapper[4475]: I1203 07:04:20.022628 4475 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/386645cd-74e5-45bc-b3e4-0a326e5349f1-erlang-cookie-secret\") pod \"386645cd-74e5-45bc-b3e4-0a326e5349f1\" (UID: \"386645cd-74e5-45bc-b3e4-0a326e5349f1\") " Dec 03 07:04:20 crc kubenswrapper[4475]: I1203 07:04:20.022648 4475 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/386645cd-74e5-45bc-b3e4-0a326e5349f1-rabbitmq-plugins\") pod \"386645cd-74e5-45bc-b3e4-0a326e5349f1\" (UID: \"386645cd-74e5-45bc-b3e4-0a326e5349f1\") " Dec 03 07:04:20 crc kubenswrapper[4475]: I1203 07:04:20.022683 4475 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/386645cd-74e5-45bc-b3e4-0a326e5349f1-server-conf\") pod \"386645cd-74e5-45bc-b3e4-0a326e5349f1\" (UID: \"386645cd-74e5-45bc-b3e4-0a326e5349f1\") " Dec 03 07:04:20 crc kubenswrapper[4475]: I1203 07:04:20.022725 4475 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/386645cd-74e5-45bc-b3e4-0a326e5349f1-rabbitmq-confd\") pod \"386645cd-74e5-45bc-b3e4-0a326e5349f1\" (UID: \"386645cd-74e5-45bc-b3e4-0a326e5349f1\") " Dec 03 07:04:20 crc kubenswrapper[4475]: I1203 07:04:20.022755 4475 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"persistence\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"386645cd-74e5-45bc-b3e4-0a326e5349f1\" (UID: \"386645cd-74e5-45bc-b3e4-0a326e5349f1\") " Dec 03 07:04:20 crc kubenswrapper[4475]: I1203 
07:04:20.026602 4475 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/386645cd-74e5-45bc-b3e4-0a326e5349f1-rabbitmq-erlang-cookie" (OuterVolumeSpecName: "rabbitmq-erlang-cookie") pod "386645cd-74e5-45bc-b3e4-0a326e5349f1" (UID: "386645cd-74e5-45bc-b3e4-0a326e5349f1"). InnerVolumeSpecName "rabbitmq-erlang-cookie". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 07:04:20 crc kubenswrapper[4475]: I1203 07:04:20.027295 4475 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/386645cd-74e5-45bc-b3e4-0a326e5349f1-rabbitmq-plugins" (OuterVolumeSpecName: "rabbitmq-plugins") pod "386645cd-74e5-45bc-b3e4-0a326e5349f1" (UID: "386645cd-74e5-45bc-b3e4-0a326e5349f1"). InnerVolumeSpecName "rabbitmq-plugins". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 07:04:20 crc kubenswrapper[4475]: I1203 07:04:20.027777 4475 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/386645cd-74e5-45bc-b3e4-0a326e5349f1-plugins-conf" (OuterVolumeSpecName: "plugins-conf") pod "386645cd-74e5-45bc-b3e4-0a326e5349f1" (UID: "386645cd-74e5-45bc-b3e4-0a326e5349f1"). InnerVolumeSpecName "plugins-conf". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 07:04:20 crc kubenswrapper[4475]: I1203 07:04:20.035022 4475 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/386645cd-74e5-45bc-b3e4-0a326e5349f1-rabbitmq-tls" (OuterVolumeSpecName: "rabbitmq-tls") pod "386645cd-74e5-45bc-b3e4-0a326e5349f1" (UID: "386645cd-74e5-45bc-b3e4-0a326e5349f1"). InnerVolumeSpecName "rabbitmq-tls". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 07:04:20 crc kubenswrapper[4475]: I1203 07:04:20.049035 4475 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage06-crc" (OuterVolumeSpecName: "persistence") pod "386645cd-74e5-45bc-b3e4-0a326e5349f1" (UID: "386645cd-74e5-45bc-b3e4-0a326e5349f1"). InnerVolumeSpecName "local-storage06-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Dec 03 07:04:20 crc kubenswrapper[4475]: I1203 07:04:20.049332 4475 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/386645cd-74e5-45bc-b3e4-0a326e5349f1-erlang-cookie-secret" (OuterVolumeSpecName: "erlang-cookie-secret") pod "386645cd-74e5-45bc-b3e4-0a326e5349f1" (UID: "386645cd-74e5-45bc-b3e4-0a326e5349f1"). InnerVolumeSpecName "erlang-cookie-secret". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 07:04:20 crc kubenswrapper[4475]: I1203 07:04:20.049369 4475 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/386645cd-74e5-45bc-b3e4-0a326e5349f1-kube-api-access-dtn6k" (OuterVolumeSpecName: "kube-api-access-dtn6k") pod "386645cd-74e5-45bc-b3e4-0a326e5349f1" (UID: "386645cd-74e5-45bc-b3e4-0a326e5349f1"). InnerVolumeSpecName "kube-api-access-dtn6k". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 07:04:20 crc kubenswrapper[4475]: I1203 07:04:20.049661 4475 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/downward-api/386645cd-74e5-45bc-b3e4-0a326e5349f1-pod-info" (OuterVolumeSpecName: "pod-info") pod "386645cd-74e5-45bc-b3e4-0a326e5349f1" (UID: "386645cd-74e5-45bc-b3e4-0a326e5349f1"). InnerVolumeSpecName "pod-info". 
PluginName "kubernetes.io/downward-api", VolumeGidValue "" Dec 03 07:04:20 crc kubenswrapper[4475]: I1203 07:04:20.076047 4475 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/386645cd-74e5-45bc-b3e4-0a326e5349f1-config-data" (OuterVolumeSpecName: "config-data") pod "386645cd-74e5-45bc-b3e4-0a326e5349f1" (UID: "386645cd-74e5-45bc-b3e4-0a326e5349f1"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 07:04:20 crc kubenswrapper[4475]: I1203 07:04:20.114536 4475 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-7c9f4ccf69-rd2jw"] Dec 03 07:04:20 crc kubenswrapper[4475]: E1203 07:04:20.114876 4475 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="386645cd-74e5-45bc-b3e4-0a326e5349f1" containerName="rabbitmq" Dec 03 07:04:20 crc kubenswrapper[4475]: I1203 07:04:20.114887 4475 state_mem.go:107] "Deleted CPUSet assignment" podUID="386645cd-74e5-45bc-b3e4-0a326e5349f1" containerName="rabbitmq" Dec 03 07:04:20 crc kubenswrapper[4475]: E1203 07:04:20.114904 4475 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="386645cd-74e5-45bc-b3e4-0a326e5349f1" containerName="setup-container" Dec 03 07:04:20 crc kubenswrapper[4475]: I1203 07:04:20.114909 4475 state_mem.go:107] "Deleted CPUSet assignment" podUID="386645cd-74e5-45bc-b3e4-0a326e5349f1" containerName="setup-container" Dec 03 07:04:20 crc kubenswrapper[4475]: I1203 07:04:20.115081 4475 memory_manager.go:354] "RemoveStaleState removing state" podUID="386645cd-74e5-45bc-b3e4-0a326e5349f1" containerName="rabbitmq" Dec 03 07:04:20 crc kubenswrapper[4475]: I1203 07:04:20.115959 4475 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-7c9f4ccf69-rd2jw" Dec 03 07:04:20 crc kubenswrapper[4475]: I1203 07:04:20.120278 4475 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-edpm-ipam" Dec 03 07:04:20 crc kubenswrapper[4475]: I1203 07:04:20.126900 4475 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/386645cd-74e5-45bc-b3e4-0a326e5349f1-rabbitmq-tls\") on node \"crc\" DevicePath \"\"" Dec 03 07:04:20 crc kubenswrapper[4475]: I1203 07:04:20.126922 4475 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/386645cd-74e5-45bc-b3e4-0a326e5349f1-config-data\") on node \"crc\" DevicePath \"\"" Dec 03 07:04:20 crc kubenswrapper[4475]: I1203 07:04:20.126930 4475 reconciler_common.go:293] "Volume detached for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/386645cd-74e5-45bc-b3e4-0a326e5349f1-erlang-cookie-secret\") on node \"crc\" DevicePath \"\"" Dec 03 07:04:20 crc kubenswrapper[4475]: I1203 07:04:20.126939 4475 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/386645cd-74e5-45bc-b3e4-0a326e5349f1-rabbitmq-plugins\") on node \"crc\" DevicePath \"\"" Dec 03 07:04:20 crc kubenswrapper[4475]: I1203 07:04:20.126965 4475 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") on node \"crc\" " Dec 03 07:04:20 crc kubenswrapper[4475]: I1203 07:04:20.126974 4475 reconciler_common.go:293] "Volume detached for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/386645cd-74e5-45bc-b3e4-0a326e5349f1-plugins-conf\") on node \"crc\" DevicePath \"\"" Dec 03 07:04:20 crc kubenswrapper[4475]: I1203 07:04:20.126982 4475 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-erlang-cookie\" (UniqueName: 
\"kubernetes.io/empty-dir/386645cd-74e5-45bc-b3e4-0a326e5349f1-rabbitmq-erlang-cookie\") on node \"crc\" DevicePath \"\"" Dec 03 07:04:20 crc kubenswrapper[4475]: I1203 07:04:20.126990 4475 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dtn6k\" (UniqueName: \"kubernetes.io/projected/386645cd-74e5-45bc-b3e4-0a326e5349f1-kube-api-access-dtn6k\") on node \"crc\" DevicePath \"\"" Dec 03 07:04:20 crc kubenswrapper[4475]: I1203 07:04:20.126997 4475 reconciler_common.go:293] "Volume detached for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/386645cd-74e5-45bc-b3e4-0a326e5349f1-pod-info\") on node \"crc\" DevicePath \"\"" Dec 03 07:04:20 crc kubenswrapper[4475]: I1203 07:04:20.127612 4475 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-7c9f4ccf69-rd2jw"] Dec 03 07:04:20 crc kubenswrapper[4475]: I1203 07:04:20.135052 4475 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/386645cd-74e5-45bc-b3e4-0a326e5349f1-server-conf" (OuterVolumeSpecName: "server-conf") pod "386645cd-74e5-45bc-b3e4-0a326e5349f1" (UID: "386645cd-74e5-45bc-b3e4-0a326e5349f1"). InnerVolumeSpecName "server-conf". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 07:04:20 crc kubenswrapper[4475]: I1203 07:04:20.155828 4475 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage06-crc" (UniqueName: "kubernetes.io/local-volume/local-storage06-crc") on node "crc" Dec 03 07:04:20 crc kubenswrapper[4475]: I1203 07:04:20.197668 4475 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/386645cd-74e5-45bc-b3e4-0a326e5349f1-rabbitmq-confd" (OuterVolumeSpecName: "rabbitmq-confd") pod "386645cd-74e5-45bc-b3e4-0a326e5349f1" (UID: "386645cd-74e5-45bc-b3e4-0a326e5349f1"). InnerVolumeSpecName "rabbitmq-confd". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 07:04:20 crc kubenswrapper[4475]: I1203 07:04:20.228552 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/28531a18-de2a-4747-aa7b-567eb681b435-config\") pod \"dnsmasq-dns-7c9f4ccf69-rd2jw\" (UID: \"28531a18-de2a-4747-aa7b-567eb681b435\") " pod="openstack/dnsmasq-dns-7c9f4ccf69-rd2jw" Dec 03 07:04:20 crc kubenswrapper[4475]: I1203 07:04:20.228604 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/28531a18-de2a-4747-aa7b-567eb681b435-dns-swift-storage-0\") pod \"dnsmasq-dns-7c9f4ccf69-rd2jw\" (UID: \"28531a18-de2a-4747-aa7b-567eb681b435\") " pod="openstack/dnsmasq-dns-7c9f4ccf69-rd2jw" Dec 03 07:04:20 crc kubenswrapper[4475]: I1203 07:04:20.228634 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/28531a18-de2a-4747-aa7b-567eb681b435-ovsdbserver-nb\") pod \"dnsmasq-dns-7c9f4ccf69-rd2jw\" (UID: \"28531a18-de2a-4747-aa7b-567eb681b435\") " pod="openstack/dnsmasq-dns-7c9f4ccf69-rd2jw" Dec 03 07:04:20 crc kubenswrapper[4475]: I1203 07:04:20.228728 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/28531a18-de2a-4747-aa7b-567eb681b435-ovsdbserver-sb\") pod \"dnsmasq-dns-7c9f4ccf69-rd2jw\" (UID: \"28531a18-de2a-4747-aa7b-567eb681b435\") " pod="openstack/dnsmasq-dns-7c9f4ccf69-rd2jw" Dec 03 07:04:20 crc kubenswrapper[4475]: I1203 07:04:20.228825 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/28531a18-de2a-4747-aa7b-567eb681b435-dns-svc\") pod \"dnsmasq-dns-7c9f4ccf69-rd2jw\" (UID: 
\"28531a18-de2a-4747-aa7b-567eb681b435\") " pod="openstack/dnsmasq-dns-7c9f4ccf69-rd2jw" Dec 03 07:04:20 crc kubenswrapper[4475]: I1203 07:04:20.228851 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/28531a18-de2a-4747-aa7b-567eb681b435-openstack-edpm-ipam\") pod \"dnsmasq-dns-7c9f4ccf69-rd2jw\" (UID: \"28531a18-de2a-4747-aa7b-567eb681b435\") " pod="openstack/dnsmasq-dns-7c9f4ccf69-rd2jw" Dec 03 07:04:20 crc kubenswrapper[4475]: I1203 07:04:20.229058 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-thdks\" (UniqueName: \"kubernetes.io/projected/28531a18-de2a-4747-aa7b-567eb681b435-kube-api-access-thdks\") pod \"dnsmasq-dns-7c9f4ccf69-rd2jw\" (UID: \"28531a18-de2a-4747-aa7b-567eb681b435\") " pod="openstack/dnsmasq-dns-7c9f4ccf69-rd2jw" Dec 03 07:04:20 crc kubenswrapper[4475]: I1203 07:04:20.229269 4475 reconciler_common.go:293] "Volume detached for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") on node \"crc\" DevicePath \"\"" Dec 03 07:04:20 crc kubenswrapper[4475]: I1203 07:04:20.229329 4475 reconciler_common.go:293] "Volume detached for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/386645cd-74e5-45bc-b3e4-0a326e5349f1-server-conf\") on node \"crc\" DevicePath \"\"" Dec 03 07:04:20 crc kubenswrapper[4475]: I1203 07:04:20.229385 4475 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/386645cd-74e5-45bc-b3e4-0a326e5349f1-rabbitmq-confd\") on node \"crc\" DevicePath \"\"" Dec 03 07:04:20 crc kubenswrapper[4475]: I1203 07:04:20.331217 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/28531a18-de2a-4747-aa7b-567eb681b435-ovsdbserver-sb\") pod \"dnsmasq-dns-7c9f4ccf69-rd2jw\" 
(UID: \"28531a18-de2a-4747-aa7b-567eb681b435\") " pod="openstack/dnsmasq-dns-7c9f4ccf69-rd2jw" Dec 03 07:04:20 crc kubenswrapper[4475]: I1203 07:04:20.331403 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/28531a18-de2a-4747-aa7b-567eb681b435-dns-svc\") pod \"dnsmasq-dns-7c9f4ccf69-rd2jw\" (UID: \"28531a18-de2a-4747-aa7b-567eb681b435\") " pod="openstack/dnsmasq-dns-7c9f4ccf69-rd2jw" Dec 03 07:04:20 crc kubenswrapper[4475]: I1203 07:04:20.331427 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/28531a18-de2a-4747-aa7b-567eb681b435-openstack-edpm-ipam\") pod \"dnsmasq-dns-7c9f4ccf69-rd2jw\" (UID: \"28531a18-de2a-4747-aa7b-567eb681b435\") " pod="openstack/dnsmasq-dns-7c9f4ccf69-rd2jw" Dec 03 07:04:20 crc kubenswrapper[4475]: I1203 07:04:20.331573 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-thdks\" (UniqueName: \"kubernetes.io/projected/28531a18-de2a-4747-aa7b-567eb681b435-kube-api-access-thdks\") pod \"dnsmasq-dns-7c9f4ccf69-rd2jw\" (UID: \"28531a18-de2a-4747-aa7b-567eb681b435\") " pod="openstack/dnsmasq-dns-7c9f4ccf69-rd2jw" Dec 03 07:04:20 crc kubenswrapper[4475]: I1203 07:04:20.331617 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/28531a18-de2a-4747-aa7b-567eb681b435-config\") pod \"dnsmasq-dns-7c9f4ccf69-rd2jw\" (UID: \"28531a18-de2a-4747-aa7b-567eb681b435\") " pod="openstack/dnsmasq-dns-7c9f4ccf69-rd2jw" Dec 03 07:04:20 crc kubenswrapper[4475]: I1203 07:04:20.331654 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/28531a18-de2a-4747-aa7b-567eb681b435-dns-swift-storage-0\") pod \"dnsmasq-dns-7c9f4ccf69-rd2jw\" (UID: \"28531a18-de2a-4747-aa7b-567eb681b435\") " 
pod="openstack/dnsmasq-dns-7c9f4ccf69-rd2jw" Dec 03 07:04:20 crc kubenswrapper[4475]: I1203 07:04:20.331703 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/28531a18-de2a-4747-aa7b-567eb681b435-ovsdbserver-nb\") pod \"dnsmasq-dns-7c9f4ccf69-rd2jw\" (UID: \"28531a18-de2a-4747-aa7b-567eb681b435\") " pod="openstack/dnsmasq-dns-7c9f4ccf69-rd2jw" Dec 03 07:04:20 crc kubenswrapper[4475]: I1203 07:04:20.332627 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/28531a18-de2a-4747-aa7b-567eb681b435-dns-swift-storage-0\") pod \"dnsmasq-dns-7c9f4ccf69-rd2jw\" (UID: \"28531a18-de2a-4747-aa7b-567eb681b435\") " pod="openstack/dnsmasq-dns-7c9f4ccf69-rd2jw" Dec 03 07:04:20 crc kubenswrapper[4475]: I1203 07:04:20.332643 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/28531a18-de2a-4747-aa7b-567eb681b435-config\") pod \"dnsmasq-dns-7c9f4ccf69-rd2jw\" (UID: \"28531a18-de2a-4747-aa7b-567eb681b435\") " pod="openstack/dnsmasq-dns-7c9f4ccf69-rd2jw" Dec 03 07:04:20 crc kubenswrapper[4475]: I1203 07:04:20.332788 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/28531a18-de2a-4747-aa7b-567eb681b435-dns-svc\") pod \"dnsmasq-dns-7c9f4ccf69-rd2jw\" (UID: \"28531a18-de2a-4747-aa7b-567eb681b435\") " pod="openstack/dnsmasq-dns-7c9f4ccf69-rd2jw" Dec 03 07:04:20 crc kubenswrapper[4475]: I1203 07:04:20.332886 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/28531a18-de2a-4747-aa7b-567eb681b435-ovsdbserver-sb\") pod \"dnsmasq-dns-7c9f4ccf69-rd2jw\" (UID: \"28531a18-de2a-4747-aa7b-567eb681b435\") " pod="openstack/dnsmasq-dns-7c9f4ccf69-rd2jw" Dec 03 07:04:20 crc kubenswrapper[4475]: I1203 
07:04:20.333223 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/28531a18-de2a-4747-aa7b-567eb681b435-openstack-edpm-ipam\") pod \"dnsmasq-dns-7c9f4ccf69-rd2jw\" (UID: \"28531a18-de2a-4747-aa7b-567eb681b435\") " pod="openstack/dnsmasq-dns-7c9f4ccf69-rd2jw" Dec 03 07:04:20 crc kubenswrapper[4475]: I1203 07:04:20.333400 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/28531a18-de2a-4747-aa7b-567eb681b435-ovsdbserver-nb\") pod \"dnsmasq-dns-7c9f4ccf69-rd2jw\" (UID: \"28531a18-de2a-4747-aa7b-567eb681b435\") " pod="openstack/dnsmasq-dns-7c9f4ccf69-rd2jw" Dec 03 07:04:20 crc kubenswrapper[4475]: I1203 07:04:20.347088 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-thdks\" (UniqueName: \"kubernetes.io/projected/28531a18-de2a-4747-aa7b-567eb681b435-kube-api-access-thdks\") pod \"dnsmasq-dns-7c9f4ccf69-rd2jw\" (UID: \"28531a18-de2a-4747-aa7b-567eb681b435\") " pod="openstack/dnsmasq-dns-7c9f4ccf69-rd2jw" Dec 03 07:04:20 crc kubenswrapper[4475]: I1203 07:04:20.437153 4475 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7c9f4ccf69-rd2jw" Dec 03 07:04:20 crc kubenswrapper[4475]: I1203 07:04:20.827872 4475 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Dec 03 07:04:20 crc kubenswrapper[4475]: I1203 07:04:20.836993 4475 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-7c9f4ccf69-rd2jw"] Dec 03 07:04:20 crc kubenswrapper[4475]: I1203 07:04:20.839566 4475 generic.go:334] "Generic (PLEG): container finished" podID="6447be14-8b0d-4514-a7c2-53da228c70c2" containerID="96852b33430a63bdead797314fa5d22c06e41066c7f9fd702bba19528ed39843" exitCode=0 Dec 03 07:04:20 crc kubenswrapper[4475]: I1203 07:04:20.839654 4475 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-0" Dec 03 07:04:20 crc kubenswrapper[4475]: I1203 07:04:20.843538 4475 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Dec 03 07:04:20 crc kubenswrapper[4475]: I1203 07:04:20.843868 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"6447be14-8b0d-4514-a7c2-53da228c70c2","Type":"ContainerDied","Data":"96852b33430a63bdead797314fa5d22c06e41066c7f9fd702bba19528ed39843"} Dec 03 07:04:20 crc kubenswrapper[4475]: I1203 07:04:20.843902 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"6447be14-8b0d-4514-a7c2-53da228c70c2","Type":"ContainerDied","Data":"6c51c5714877dda187663bf6897fc2521befecff47c42e29fac8a5564fc90959"} Dec 03 07:04:20 crc kubenswrapper[4475]: I1203 07:04:20.843919 4475 scope.go:117] "RemoveContainer" containerID="96852b33430a63bdead797314fa5d22c06e41066c7f9fd702bba19528ed39843" Dec 03 07:04:20 crc kubenswrapper[4475]: I1203 07:04:20.942529 4475 scope.go:117] "RemoveContainer" containerID="814640df107b4a181b7a3e64b7ebd3309204ef4f37192d2b3e5488422cc1410d" Dec 03 07:04:20 crc kubenswrapper[4475]: I1203 07:04:20.944814 4475 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"persistence\" (UniqueName: 
\"kubernetes.io/local-volume/local-storage12-crc\") pod \"6447be14-8b0d-4514-a7c2-53da228c70c2\" (UID: \"6447be14-8b0d-4514-a7c2-53da228c70c2\") " Dec 03 07:04:20 crc kubenswrapper[4475]: I1203 07:04:20.944873 4475 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/6447be14-8b0d-4514-a7c2-53da228c70c2-rabbitmq-plugins\") pod \"6447be14-8b0d-4514-a7c2-53da228c70c2\" (UID: \"6447be14-8b0d-4514-a7c2-53da228c70c2\") " Dec 03 07:04:20 crc kubenswrapper[4475]: I1203 07:04:20.944940 4475 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/6447be14-8b0d-4514-a7c2-53da228c70c2-erlang-cookie-secret\") pod \"6447be14-8b0d-4514-a7c2-53da228c70c2\" (UID: \"6447be14-8b0d-4514-a7c2-53da228c70c2\") " Dec 03 07:04:20 crc kubenswrapper[4475]: I1203 07:04:20.944966 4475 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/6447be14-8b0d-4514-a7c2-53da228c70c2-server-conf\") pod \"6447be14-8b0d-4514-a7c2-53da228c70c2\" (UID: \"6447be14-8b0d-4514-a7c2-53da228c70c2\") " Dec 03 07:04:20 crc kubenswrapper[4475]: I1203 07:04:20.944988 4475 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/6447be14-8b0d-4514-a7c2-53da228c70c2-pod-info\") pod \"6447be14-8b0d-4514-a7c2-53da228c70c2\" (UID: \"6447be14-8b0d-4514-a7c2-53da228c70c2\") " Dec 03 07:04:20 crc kubenswrapper[4475]: I1203 07:04:20.945057 4475 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/6447be14-8b0d-4514-a7c2-53da228c70c2-rabbitmq-tls\") pod \"6447be14-8b0d-4514-a7c2-53da228c70c2\" (UID: \"6447be14-8b0d-4514-a7c2-53da228c70c2\") " Dec 03 07:04:20 crc kubenswrapper[4475]: I1203 07:04:20.945080 4475 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/6447be14-8b0d-4514-a7c2-53da228c70c2-rabbitmq-confd\") pod \"6447be14-8b0d-4514-a7c2-53da228c70c2\" (UID: \"6447be14-8b0d-4514-a7c2-53da228c70c2\") " Dec 03 07:04:20 crc kubenswrapper[4475]: I1203 07:04:20.945099 4475 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/6447be14-8b0d-4514-a7c2-53da228c70c2-config-data\") pod \"6447be14-8b0d-4514-a7c2-53da228c70c2\" (UID: \"6447be14-8b0d-4514-a7c2-53da228c70c2\") " Dec 03 07:04:20 crc kubenswrapper[4475]: I1203 07:04:20.945112 4475 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/6447be14-8b0d-4514-a7c2-53da228c70c2-plugins-conf\") pod \"6447be14-8b0d-4514-a7c2-53da228c70c2\" (UID: \"6447be14-8b0d-4514-a7c2-53da228c70c2\") " Dec 03 07:04:20 crc kubenswrapper[4475]: I1203 07:04:20.945153 4475 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/6447be14-8b0d-4514-a7c2-53da228c70c2-rabbitmq-erlang-cookie\") pod \"6447be14-8b0d-4514-a7c2-53da228c70c2\" (UID: \"6447be14-8b0d-4514-a7c2-53da228c70c2\") " Dec 03 07:04:20 crc kubenswrapper[4475]: I1203 07:04:20.945171 4475 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jdhdr\" (UniqueName: \"kubernetes.io/projected/6447be14-8b0d-4514-a7c2-53da228c70c2-kube-api-access-jdhdr\") pod \"6447be14-8b0d-4514-a7c2-53da228c70c2\" (UID: \"6447be14-8b0d-4514-a7c2-53da228c70c2\") " Dec 03 07:04:20 crc kubenswrapper[4475]: I1203 07:04:20.948673 4475 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6447be14-8b0d-4514-a7c2-53da228c70c2-plugins-conf" (OuterVolumeSpecName: "plugins-conf") pod 
"6447be14-8b0d-4514-a7c2-53da228c70c2" (UID: "6447be14-8b0d-4514-a7c2-53da228c70c2"). InnerVolumeSpecName "plugins-conf". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 07:04:20 crc kubenswrapper[4475]: I1203 07:04:20.950765 4475 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6447be14-8b0d-4514-a7c2-53da228c70c2-rabbitmq-plugins" (OuterVolumeSpecName: "rabbitmq-plugins") pod "6447be14-8b0d-4514-a7c2-53da228c70c2" (UID: "6447be14-8b0d-4514-a7c2-53da228c70c2"). InnerVolumeSpecName "rabbitmq-plugins". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 07:04:20 crc kubenswrapper[4475]: I1203 07:04:20.952008 4475 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6447be14-8b0d-4514-a7c2-53da228c70c2-rabbitmq-erlang-cookie" (OuterVolumeSpecName: "rabbitmq-erlang-cookie") pod "6447be14-8b0d-4514-a7c2-53da228c70c2" (UID: "6447be14-8b0d-4514-a7c2-53da228c70c2"). InnerVolumeSpecName "rabbitmq-erlang-cookie". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 07:04:20 crc kubenswrapper[4475]: I1203 07:04:20.951604 4475 reconciler_common.go:293] "Volume detached for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/6447be14-8b0d-4514-a7c2-53da228c70c2-plugins-conf\") on node \"crc\" DevicePath \"\"" Dec 03 07:04:20 crc kubenswrapper[4475]: I1203 07:04:20.960437 4475 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6447be14-8b0d-4514-a7c2-53da228c70c2-rabbitmq-tls" (OuterVolumeSpecName: "rabbitmq-tls") pod "6447be14-8b0d-4514-a7c2-53da228c70c2" (UID: "6447be14-8b0d-4514-a7c2-53da228c70c2"). InnerVolumeSpecName "rabbitmq-tls". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 07:04:20 crc kubenswrapper[4475]: I1203 07:04:20.962550 4475 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage12-crc" (OuterVolumeSpecName: "persistence") pod "6447be14-8b0d-4514-a7c2-53da228c70c2" (UID: "6447be14-8b0d-4514-a7c2-53da228c70c2"). InnerVolumeSpecName "local-storage12-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Dec 03 07:04:20 crc kubenswrapper[4475]: I1203 07:04:20.963515 4475 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/downward-api/6447be14-8b0d-4514-a7c2-53da228c70c2-pod-info" (OuterVolumeSpecName: "pod-info") pod "6447be14-8b0d-4514-a7c2-53da228c70c2" (UID: "6447be14-8b0d-4514-a7c2-53da228c70c2"). InnerVolumeSpecName "pod-info". PluginName "kubernetes.io/downward-api", VolumeGidValue "" Dec 03 07:04:20 crc kubenswrapper[4475]: I1203 07:04:20.968809 4475 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-server-0"] Dec 03 07:04:20 crc kubenswrapper[4475]: I1203 07:04:20.973272 4475 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6447be14-8b0d-4514-a7c2-53da228c70c2-kube-api-access-jdhdr" (OuterVolumeSpecName: "kube-api-access-jdhdr") pod "6447be14-8b0d-4514-a7c2-53da228c70c2" (UID: "6447be14-8b0d-4514-a7c2-53da228c70c2"). InnerVolumeSpecName "kube-api-access-jdhdr". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 07:04:20 crc kubenswrapper[4475]: I1203 07:04:20.976565 4475 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6447be14-8b0d-4514-a7c2-53da228c70c2-erlang-cookie-secret" (OuterVolumeSpecName: "erlang-cookie-secret") pod "6447be14-8b0d-4514-a7c2-53da228c70c2" (UID: "6447be14-8b0d-4514-a7c2-53da228c70c2"). InnerVolumeSpecName "erlang-cookie-secret". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 07:04:20 crc kubenswrapper[4475]: I1203 07:04:20.979899 4475 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/rabbitmq-server-0"] Dec 03 07:04:20 crc kubenswrapper[4475]: I1203 07:04:20.988727 4475 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-server-0"] Dec 03 07:04:20 crc kubenswrapper[4475]: E1203 07:04:20.989083 4475 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6447be14-8b0d-4514-a7c2-53da228c70c2" containerName="setup-container" Dec 03 07:04:20 crc kubenswrapper[4475]: I1203 07:04:20.989096 4475 state_mem.go:107] "Deleted CPUSet assignment" podUID="6447be14-8b0d-4514-a7c2-53da228c70c2" containerName="setup-container" Dec 03 07:04:20 crc kubenswrapper[4475]: E1203 07:04:20.989128 4475 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6447be14-8b0d-4514-a7c2-53da228c70c2" containerName="rabbitmq" Dec 03 07:04:20 crc kubenswrapper[4475]: I1203 07:04:20.989133 4475 state_mem.go:107] "Deleted CPUSet assignment" podUID="6447be14-8b0d-4514-a7c2-53da228c70c2" containerName="rabbitmq" Dec 03 07:04:20 crc kubenswrapper[4475]: I1203 07:04:20.989311 4475 memory_manager.go:354] "RemoveStaleState removing state" podUID="6447be14-8b0d-4514-a7c2-53da228c70c2" containerName="rabbitmq" Dec 03 07:04:20 crc kubenswrapper[4475]: I1203 07:04:20.990283 4475 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-server-0" Dec 03 07:04:20 crc kubenswrapper[4475]: I1203 07:04:20.995820 4475 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-erlang-cookie" Dec 03 07:04:20 crc kubenswrapper[4475]: I1203 07:04:20.995992 4475 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-server-conf" Dec 03 07:04:20 crc kubenswrapper[4475]: I1203 07:04:20.996102 4475 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-config-data" Dec 03 07:04:20 crc kubenswrapper[4475]: I1203 07:04:20.996272 4475 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-plugins-conf" Dec 03 07:04:20 crc kubenswrapper[4475]: I1203 07:04:20.996380 4475 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-default-user" Dec 03 07:04:20 crc kubenswrapper[4475]: I1203 07:04:20.996592 4475 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-server-dockercfg-8rdmk" Dec 03 07:04:20 crc kubenswrapper[4475]: I1203 07:04:20.996930 4475 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-rabbitmq-svc" Dec 03 07:04:20 crc kubenswrapper[4475]: I1203 07:04:20.998246 4475 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"] Dec 03 07:04:21 crc kubenswrapper[4475]: I1203 07:04:21.011029 4475 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6447be14-8b0d-4514-a7c2-53da228c70c2-config-data" (OuterVolumeSpecName: "config-data") pod "6447be14-8b0d-4514-a7c2-53da228c70c2" (UID: "6447be14-8b0d-4514-a7c2-53da228c70c2"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 07:04:21 crc kubenswrapper[4475]: I1203 07:04:21.033754 4475 scope.go:117] "RemoveContainer" containerID="96852b33430a63bdead797314fa5d22c06e41066c7f9fd702bba19528ed39843" Dec 03 07:04:21 crc kubenswrapper[4475]: E1203 07:04:21.037913 4475 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"96852b33430a63bdead797314fa5d22c06e41066c7f9fd702bba19528ed39843\": container with ID starting with 96852b33430a63bdead797314fa5d22c06e41066c7f9fd702bba19528ed39843 not found: ID does not exist" containerID="96852b33430a63bdead797314fa5d22c06e41066c7f9fd702bba19528ed39843" Dec 03 07:04:21 crc kubenswrapper[4475]: I1203 07:04:21.037944 4475 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"96852b33430a63bdead797314fa5d22c06e41066c7f9fd702bba19528ed39843"} err="failed to get container status \"96852b33430a63bdead797314fa5d22c06e41066c7f9fd702bba19528ed39843\": rpc error: code = NotFound desc = could not find container \"96852b33430a63bdead797314fa5d22c06e41066c7f9fd702bba19528ed39843\": container with ID starting with 96852b33430a63bdead797314fa5d22c06e41066c7f9fd702bba19528ed39843 not found: ID does not exist" Dec 03 07:04:21 crc kubenswrapper[4475]: I1203 07:04:21.037965 4475 scope.go:117] "RemoveContainer" containerID="814640df107b4a181b7a3e64b7ebd3309204ef4f37192d2b3e5488422cc1410d" Dec 03 07:04:21 crc kubenswrapper[4475]: E1203 07:04:21.039377 4475 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"814640df107b4a181b7a3e64b7ebd3309204ef4f37192d2b3e5488422cc1410d\": container with ID starting with 814640df107b4a181b7a3e64b7ebd3309204ef4f37192d2b3e5488422cc1410d not found: ID does not exist" containerID="814640df107b4a181b7a3e64b7ebd3309204ef4f37192d2b3e5488422cc1410d" Dec 03 07:04:21 crc kubenswrapper[4475]: I1203 07:04:21.039402 
4475 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"814640df107b4a181b7a3e64b7ebd3309204ef4f37192d2b3e5488422cc1410d"} err="failed to get container status \"814640df107b4a181b7a3e64b7ebd3309204ef4f37192d2b3e5488422cc1410d\": rpc error: code = NotFound desc = could not find container \"814640df107b4a181b7a3e64b7ebd3309204ef4f37192d2b3e5488422cc1410d\": container with ID starting with 814640df107b4a181b7a3e64b7ebd3309204ef4f37192d2b3e5488422cc1410d not found: ID does not exist" Dec 03 07:04:21 crc kubenswrapper[4475]: I1203 07:04:21.055318 4475 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/6447be14-8b0d-4514-a7c2-53da228c70c2-rabbitmq-erlang-cookie\") on node \"crc\" DevicePath \"\"" Dec 03 07:04:21 crc kubenswrapper[4475]: I1203 07:04:21.055339 4475 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jdhdr\" (UniqueName: \"kubernetes.io/projected/6447be14-8b0d-4514-a7c2-53da228c70c2-kube-api-access-jdhdr\") on node \"crc\" DevicePath \"\"" Dec 03 07:04:21 crc kubenswrapper[4475]: I1203 07:04:21.055382 4475 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") on node \"crc\" " Dec 03 07:04:21 crc kubenswrapper[4475]: I1203 07:04:21.055392 4475 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/6447be14-8b0d-4514-a7c2-53da228c70c2-rabbitmq-plugins\") on node \"crc\" DevicePath \"\"" Dec 03 07:04:21 crc kubenswrapper[4475]: I1203 07:04:21.055401 4475 reconciler_common.go:293] "Volume detached for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/6447be14-8b0d-4514-a7c2-53da228c70c2-erlang-cookie-secret\") on node \"crc\" DevicePath \"\"" Dec 03 07:04:21 crc kubenswrapper[4475]: I1203 07:04:21.055409 4475 reconciler_common.go:293] "Volume 
detached for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/6447be14-8b0d-4514-a7c2-53da228c70c2-pod-info\") on node \"crc\" DevicePath \"\"" Dec 03 07:04:21 crc kubenswrapper[4475]: I1203 07:04:21.055416 4475 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/6447be14-8b0d-4514-a7c2-53da228c70c2-rabbitmq-tls\") on node \"crc\" DevicePath \"\"" Dec 03 07:04:21 crc kubenswrapper[4475]: I1203 07:04:21.055423 4475 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/6447be14-8b0d-4514-a7c2-53da228c70c2-config-data\") on node \"crc\" DevicePath \"\"" Dec 03 07:04:21 crc kubenswrapper[4475]: I1203 07:04:21.055540 4475 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6447be14-8b0d-4514-a7c2-53da228c70c2-server-conf" (OuterVolumeSpecName: "server-conf") pod "6447be14-8b0d-4514-a7c2-53da228c70c2" (UID: "6447be14-8b0d-4514-a7c2-53da228c70c2"). InnerVolumeSpecName "server-conf". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 07:04:21 crc kubenswrapper[4475]: I1203 07:04:21.077045 4475 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage12-crc" (UniqueName: "kubernetes.io/local-volume/local-storage12-crc") on node "crc" Dec 03 07:04:21 crc kubenswrapper[4475]: I1203 07:04:21.121626 4475 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6447be14-8b0d-4514-a7c2-53da228c70c2-rabbitmq-confd" (OuterVolumeSpecName: "rabbitmq-confd") pod "6447be14-8b0d-4514-a7c2-53da228c70c2" (UID: "6447be14-8b0d-4514-a7c2-53da228c70c2"). InnerVolumeSpecName "rabbitmq-confd". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 07:04:21 crc kubenswrapper[4475]: I1203 07:04:21.156948 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"rabbitmq-server-0\" (UID: \"0ebb2dfe-acf2-4f92-adf4-02476810eb51\") " pod="openstack/rabbitmq-server-0" Dec 03 07:04:21 crc kubenswrapper[4475]: I1203 07:04:21.156985 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/0ebb2dfe-acf2-4f92-adf4-02476810eb51-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"0ebb2dfe-acf2-4f92-adf4-02476810eb51\") " pod="openstack/rabbitmq-server-0" Dec 03 07:04:21 crc kubenswrapper[4475]: I1203 07:04:21.157037 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p8xj8\" (UniqueName: \"kubernetes.io/projected/0ebb2dfe-acf2-4f92-adf4-02476810eb51-kube-api-access-p8xj8\") pod \"rabbitmq-server-0\" (UID: \"0ebb2dfe-acf2-4f92-adf4-02476810eb51\") " pod="openstack/rabbitmq-server-0" Dec 03 07:04:21 crc kubenswrapper[4475]: I1203 07:04:21.157251 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/0ebb2dfe-acf2-4f92-adf4-02476810eb51-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"0ebb2dfe-acf2-4f92-adf4-02476810eb51\") " pod="openstack/rabbitmq-server-0" Dec 03 07:04:21 crc kubenswrapper[4475]: I1203 07:04:21.157321 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/0ebb2dfe-acf2-4f92-adf4-02476810eb51-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"0ebb2dfe-acf2-4f92-adf4-02476810eb51\") " pod="openstack/rabbitmq-server-0" Dec 
03 07:04:21 crc kubenswrapper[4475]: I1203 07:04:21.157403 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/0ebb2dfe-acf2-4f92-adf4-02476810eb51-pod-info\") pod \"rabbitmq-server-0\" (UID: \"0ebb2dfe-acf2-4f92-adf4-02476810eb51\") " pod="openstack/rabbitmq-server-0" Dec 03 07:04:21 crc kubenswrapper[4475]: I1203 07:04:21.157426 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/0ebb2dfe-acf2-4f92-adf4-02476810eb51-server-conf\") pod \"rabbitmq-server-0\" (UID: \"0ebb2dfe-acf2-4f92-adf4-02476810eb51\") " pod="openstack/rabbitmq-server-0" Dec 03 07:04:21 crc kubenswrapper[4475]: I1203 07:04:21.157469 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/0ebb2dfe-acf2-4f92-adf4-02476810eb51-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"0ebb2dfe-acf2-4f92-adf4-02476810eb51\") " pod="openstack/rabbitmq-server-0" Dec 03 07:04:21 crc kubenswrapper[4475]: I1203 07:04:21.157542 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/0ebb2dfe-acf2-4f92-adf4-02476810eb51-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"0ebb2dfe-acf2-4f92-adf4-02476810eb51\") " pod="openstack/rabbitmq-server-0" Dec 03 07:04:21 crc kubenswrapper[4475]: I1203 07:04:21.157560 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/0ebb2dfe-acf2-4f92-adf4-02476810eb51-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"0ebb2dfe-acf2-4f92-adf4-02476810eb51\") " pod="openstack/rabbitmq-server-0" Dec 03 07:04:21 crc kubenswrapper[4475]: I1203 07:04:21.157599 4475 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/0ebb2dfe-acf2-4f92-adf4-02476810eb51-config-data\") pod \"rabbitmq-server-0\" (UID: \"0ebb2dfe-acf2-4f92-adf4-02476810eb51\") " pod="openstack/rabbitmq-server-0" Dec 03 07:04:21 crc kubenswrapper[4475]: I1203 07:04:21.157665 4475 reconciler_common.go:293] "Volume detached for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") on node \"crc\" DevicePath \"\"" Dec 03 07:04:21 crc kubenswrapper[4475]: I1203 07:04:21.157679 4475 reconciler_common.go:293] "Volume detached for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/6447be14-8b0d-4514-a7c2-53da228c70c2-server-conf\") on node \"crc\" DevicePath \"\"" Dec 03 07:04:21 crc kubenswrapper[4475]: I1203 07:04:21.157689 4475 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/6447be14-8b0d-4514-a7c2-53da228c70c2-rabbitmq-confd\") on node \"crc\" DevicePath \"\"" Dec 03 07:04:21 crc kubenswrapper[4475]: I1203 07:04:21.185589 4475 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Dec 03 07:04:21 crc kubenswrapper[4475]: I1203 07:04:21.197591 4475 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Dec 03 07:04:21 crc kubenswrapper[4475]: I1203 07:04:21.253416 4475 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Dec 03 07:04:21 crc kubenswrapper[4475]: I1203 07:04:21.254948 4475 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Dec 03 07:04:21 crc kubenswrapper[4475]: I1203 07:04:21.258813 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"rabbitmq-server-0\" (UID: \"0ebb2dfe-acf2-4f92-adf4-02476810eb51\") " pod="openstack/rabbitmq-server-0" Dec 03 07:04:21 crc kubenswrapper[4475]: I1203 07:04:21.258845 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/0ebb2dfe-acf2-4f92-adf4-02476810eb51-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"0ebb2dfe-acf2-4f92-adf4-02476810eb51\") " pod="openstack/rabbitmq-server-0" Dec 03 07:04:21 crc kubenswrapper[4475]: I1203 07:04:21.258880 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-p8xj8\" (UniqueName: \"kubernetes.io/projected/0ebb2dfe-acf2-4f92-adf4-02476810eb51-kube-api-access-p8xj8\") pod \"rabbitmq-server-0\" (UID: \"0ebb2dfe-acf2-4f92-adf4-02476810eb51\") " pod="openstack/rabbitmq-server-0" Dec 03 07:04:21 crc kubenswrapper[4475]: I1203 07:04:21.258905 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/0ebb2dfe-acf2-4f92-adf4-02476810eb51-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"0ebb2dfe-acf2-4f92-adf4-02476810eb51\") " pod="openstack/rabbitmq-server-0" Dec 03 07:04:21 crc kubenswrapper[4475]: I1203 07:04:21.258955 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/0ebb2dfe-acf2-4f92-adf4-02476810eb51-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"0ebb2dfe-acf2-4f92-adf4-02476810eb51\") " pod="openstack/rabbitmq-server-0" Dec 03 07:04:21 crc kubenswrapper[4475]: I1203 07:04:21.259024 4475 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/0ebb2dfe-acf2-4f92-adf4-02476810eb51-pod-info\") pod \"rabbitmq-server-0\" (UID: \"0ebb2dfe-acf2-4f92-adf4-02476810eb51\") " pod="openstack/rabbitmq-server-0" Dec 03 07:04:21 crc kubenswrapper[4475]: I1203 07:04:21.259046 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/0ebb2dfe-acf2-4f92-adf4-02476810eb51-server-conf\") pod \"rabbitmq-server-0\" (UID: \"0ebb2dfe-acf2-4f92-adf4-02476810eb51\") " pod="openstack/rabbitmq-server-0" Dec 03 07:04:21 crc kubenswrapper[4475]: I1203 07:04:21.259059 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/0ebb2dfe-acf2-4f92-adf4-02476810eb51-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"0ebb2dfe-acf2-4f92-adf4-02476810eb51\") " pod="openstack/rabbitmq-server-0" Dec 03 07:04:21 crc kubenswrapper[4475]: I1203 07:04:21.259113 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/0ebb2dfe-acf2-4f92-adf4-02476810eb51-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"0ebb2dfe-acf2-4f92-adf4-02476810eb51\") " pod="openstack/rabbitmq-server-0" Dec 03 07:04:21 crc kubenswrapper[4475]: I1203 07:04:21.259130 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/0ebb2dfe-acf2-4f92-adf4-02476810eb51-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"0ebb2dfe-acf2-4f92-adf4-02476810eb51\") " pod="openstack/rabbitmq-server-0" Dec 03 07:04:21 crc kubenswrapper[4475]: I1203 07:04:21.259165 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/configmap/0ebb2dfe-acf2-4f92-adf4-02476810eb51-config-data\") pod \"rabbitmq-server-0\" (UID: \"0ebb2dfe-acf2-4f92-adf4-02476810eb51\") " pod="openstack/rabbitmq-server-0" Dec 03 07:04:21 crc kubenswrapper[4475]: I1203 07:04:21.259878 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/0ebb2dfe-acf2-4f92-adf4-02476810eb51-config-data\") pod \"rabbitmq-server-0\" (UID: \"0ebb2dfe-acf2-4f92-adf4-02476810eb51\") " pod="openstack/rabbitmq-server-0" Dec 03 07:04:21 crc kubenswrapper[4475]: I1203 07:04:21.260013 4475 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"rabbitmq-server-0\" (UID: \"0ebb2dfe-acf2-4f92-adf4-02476810eb51\") device mount path \"/mnt/openstack/pv06\"" pod="openstack/rabbitmq-server-0" Dec 03 07:04:21 crc kubenswrapper[4475]: I1203 07:04:21.263509 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/0ebb2dfe-acf2-4f92-adf4-02476810eb51-pod-info\") pod \"rabbitmq-server-0\" (UID: \"0ebb2dfe-acf2-4f92-adf4-02476810eb51\") " pod="openstack/rabbitmq-server-0" Dec 03 07:04:21 crc kubenswrapper[4475]: I1203 07:04:21.263947 4475 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Dec 03 07:04:21 crc kubenswrapper[4475]: I1203 07:04:21.264409 4475 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-plugins-conf" Dec 03 07:04:21 crc kubenswrapper[4475]: I1203 07:04:21.264658 4475 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-erlang-cookie" Dec 03 07:04:21 crc kubenswrapper[4475]: I1203 07:04:21.265149 4475 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-default-user" Dec 03 07:04:21 crc kubenswrapper[4475]: I1203 
07:04:21.265305 4475 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-rabbitmq-cell1-svc" Dec 03 07:04:21 crc kubenswrapper[4475]: I1203 07:04:21.265949 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/0ebb2dfe-acf2-4f92-adf4-02476810eb51-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"0ebb2dfe-acf2-4f92-adf4-02476810eb51\") " pod="openstack/rabbitmq-server-0" Dec 03 07:04:21 crc kubenswrapper[4475]: I1203 07:04:21.266288 4475 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-config-data" Dec 03 07:04:21 crc kubenswrapper[4475]: I1203 07:04:21.266418 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/0ebb2dfe-acf2-4f92-adf4-02476810eb51-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"0ebb2dfe-acf2-4f92-adf4-02476810eb51\") " pod="openstack/rabbitmq-server-0" Dec 03 07:04:21 crc kubenswrapper[4475]: I1203 07:04:21.266424 4475 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-server-dockercfg-8m6hj" Dec 03 07:04:21 crc kubenswrapper[4475]: I1203 07:04:21.266505 4475 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-server-conf" Dec 03 07:04:21 crc kubenswrapper[4475]: I1203 07:04:21.268442 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/0ebb2dfe-acf2-4f92-adf4-02476810eb51-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"0ebb2dfe-acf2-4f92-adf4-02476810eb51\") " pod="openstack/rabbitmq-server-0" Dec 03 07:04:21 crc kubenswrapper[4475]: I1203 07:04:21.269265 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/0ebb2dfe-acf2-4f92-adf4-02476810eb51-server-conf\") pod 
\"rabbitmq-server-0\" (UID: \"0ebb2dfe-acf2-4f92-adf4-02476810eb51\") " pod="openstack/rabbitmq-server-0" Dec 03 07:04:21 crc kubenswrapper[4475]: I1203 07:04:21.269373 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/0ebb2dfe-acf2-4f92-adf4-02476810eb51-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"0ebb2dfe-acf2-4f92-adf4-02476810eb51\") " pod="openstack/rabbitmq-server-0" Dec 03 07:04:21 crc kubenswrapper[4475]: I1203 07:04:21.276003 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/0ebb2dfe-acf2-4f92-adf4-02476810eb51-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"0ebb2dfe-acf2-4f92-adf4-02476810eb51\") " pod="openstack/rabbitmq-server-0" Dec 03 07:04:21 crc kubenswrapper[4475]: I1203 07:04:21.287178 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-p8xj8\" (UniqueName: \"kubernetes.io/projected/0ebb2dfe-acf2-4f92-adf4-02476810eb51-kube-api-access-p8xj8\") pod \"rabbitmq-server-0\" (UID: \"0ebb2dfe-acf2-4f92-adf4-02476810eb51\") " pod="openstack/rabbitmq-server-0" Dec 03 07:04:21 crc kubenswrapper[4475]: I1203 07:04:21.290409 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/0ebb2dfe-acf2-4f92-adf4-02476810eb51-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"0ebb2dfe-acf2-4f92-adf4-02476810eb51\") " pod="openstack/rabbitmq-server-0" Dec 03 07:04:21 crc kubenswrapper[4475]: I1203 07:04:21.312407 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"rabbitmq-server-0\" (UID: \"0ebb2dfe-acf2-4f92-adf4-02476810eb51\") " pod="openstack/rabbitmq-server-0" Dec 03 07:04:21 crc kubenswrapper[4475]: I1203 07:04:21.337356 4475 util.go:30] "No sandbox for pod 
can be found. Need to start a new one" pod="openstack/rabbitmq-server-0" Dec 03 07:04:21 crc kubenswrapper[4475]: I1203 07:04:21.360077 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/17e60604-8cc5-4052-a43f-f9c4bbea24cd-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"17e60604-8cc5-4052-a43f-f9c4bbea24cd\") " pod="openstack/rabbitmq-cell1-server-0" Dec 03 07:04:21 crc kubenswrapper[4475]: I1203 07:04:21.360129 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"17e60604-8cc5-4052-a43f-f9c4bbea24cd\") " pod="openstack/rabbitmq-cell1-server-0" Dec 03 07:04:21 crc kubenswrapper[4475]: I1203 07:04:21.360146 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/17e60604-8cc5-4052-a43f-f9c4bbea24cd-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"17e60604-8cc5-4052-a43f-f9c4bbea24cd\") " pod="openstack/rabbitmq-cell1-server-0" Dec 03 07:04:21 crc kubenswrapper[4475]: I1203 07:04:21.360169 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/17e60604-8cc5-4052-a43f-f9c4bbea24cd-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"17e60604-8cc5-4052-a43f-f9c4bbea24cd\") " pod="openstack/rabbitmq-cell1-server-0" Dec 03 07:04:21 crc kubenswrapper[4475]: I1203 07:04:21.360195 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/17e60604-8cc5-4052-a43f-f9c4bbea24cd-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"17e60604-8cc5-4052-a43f-f9c4bbea24cd\") " 
pod="openstack/rabbitmq-cell1-server-0" Dec 03 07:04:21 crc kubenswrapper[4475]: I1203 07:04:21.360210 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/17e60604-8cc5-4052-a43f-f9c4bbea24cd-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"17e60604-8cc5-4052-a43f-f9c4bbea24cd\") " pod="openstack/rabbitmq-cell1-server-0" Dec 03 07:04:21 crc kubenswrapper[4475]: I1203 07:04:21.360231 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/17e60604-8cc5-4052-a43f-f9c4bbea24cd-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"17e60604-8cc5-4052-a43f-f9c4bbea24cd\") " pod="openstack/rabbitmq-cell1-server-0" Dec 03 07:04:21 crc kubenswrapper[4475]: I1203 07:04:21.360252 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/17e60604-8cc5-4052-a43f-f9c4bbea24cd-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"17e60604-8cc5-4052-a43f-f9c4bbea24cd\") " pod="openstack/rabbitmq-cell1-server-0" Dec 03 07:04:21 crc kubenswrapper[4475]: I1203 07:04:21.360299 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/17e60604-8cc5-4052-a43f-f9c4bbea24cd-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"17e60604-8cc5-4052-a43f-f9c4bbea24cd\") " pod="openstack/rabbitmq-cell1-server-0" Dec 03 07:04:21 crc kubenswrapper[4475]: I1203 07:04:21.360323 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wznmn\" (UniqueName: \"kubernetes.io/projected/17e60604-8cc5-4052-a43f-f9c4bbea24cd-kube-api-access-wznmn\") pod \"rabbitmq-cell1-server-0\" (UID: 
\"17e60604-8cc5-4052-a43f-f9c4bbea24cd\") " pod="openstack/rabbitmq-cell1-server-0" Dec 03 07:04:21 crc kubenswrapper[4475]: I1203 07:04:21.360411 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/17e60604-8cc5-4052-a43f-f9c4bbea24cd-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"17e60604-8cc5-4052-a43f-f9c4bbea24cd\") " pod="openstack/rabbitmq-cell1-server-0" Dec 03 07:04:21 crc kubenswrapper[4475]: I1203 07:04:21.462395 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/17e60604-8cc5-4052-a43f-f9c4bbea24cd-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"17e60604-8cc5-4052-a43f-f9c4bbea24cd\") " pod="openstack/rabbitmq-cell1-server-0" Dec 03 07:04:21 crc kubenswrapper[4475]: I1203 07:04:21.462475 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wznmn\" (UniqueName: \"kubernetes.io/projected/17e60604-8cc5-4052-a43f-f9c4bbea24cd-kube-api-access-wznmn\") pod \"rabbitmq-cell1-server-0\" (UID: \"17e60604-8cc5-4052-a43f-f9c4bbea24cd\") " pod="openstack/rabbitmq-cell1-server-0" Dec 03 07:04:21 crc kubenswrapper[4475]: I1203 07:04:21.462534 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/17e60604-8cc5-4052-a43f-f9c4bbea24cd-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"17e60604-8cc5-4052-a43f-f9c4bbea24cd\") " pod="openstack/rabbitmq-cell1-server-0" Dec 03 07:04:21 crc kubenswrapper[4475]: I1203 07:04:21.462591 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/17e60604-8cc5-4052-a43f-f9c4bbea24cd-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"17e60604-8cc5-4052-a43f-f9c4bbea24cd\") " pod="openstack/rabbitmq-cell1-server-0" 
Dec 03 07:04:21 crc kubenswrapper[4475]: I1203 07:04:21.462630 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"17e60604-8cc5-4052-a43f-f9c4bbea24cd\") " pod="openstack/rabbitmq-cell1-server-0" Dec 03 07:04:21 crc kubenswrapper[4475]: I1203 07:04:21.462652 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/17e60604-8cc5-4052-a43f-f9c4bbea24cd-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"17e60604-8cc5-4052-a43f-f9c4bbea24cd\") " pod="openstack/rabbitmq-cell1-server-0" Dec 03 07:04:21 crc kubenswrapper[4475]: I1203 07:04:21.462672 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/17e60604-8cc5-4052-a43f-f9c4bbea24cd-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"17e60604-8cc5-4052-a43f-f9c4bbea24cd\") " pod="openstack/rabbitmq-cell1-server-0" Dec 03 07:04:21 crc kubenswrapper[4475]: I1203 07:04:21.462689 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/17e60604-8cc5-4052-a43f-f9c4bbea24cd-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"17e60604-8cc5-4052-a43f-f9c4bbea24cd\") " pod="openstack/rabbitmq-cell1-server-0" Dec 03 07:04:21 crc kubenswrapper[4475]: I1203 07:04:21.463001 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/17e60604-8cc5-4052-a43f-f9c4bbea24cd-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"17e60604-8cc5-4052-a43f-f9c4bbea24cd\") " pod="openstack/rabbitmq-cell1-server-0" Dec 03 07:04:21 crc kubenswrapper[4475]: I1203 07:04:21.463022 4475 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/17e60604-8cc5-4052-a43f-f9c4bbea24cd-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"17e60604-8cc5-4052-a43f-f9c4bbea24cd\") " pod="openstack/rabbitmq-cell1-server-0" Dec 03 07:04:21 crc kubenswrapper[4475]: I1203 07:04:21.463051 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/17e60604-8cc5-4052-a43f-f9c4bbea24cd-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"17e60604-8cc5-4052-a43f-f9c4bbea24cd\") " pod="openstack/rabbitmq-cell1-server-0" Dec 03 07:04:21 crc kubenswrapper[4475]: I1203 07:04:21.464018 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/17e60604-8cc5-4052-a43f-f9c4bbea24cd-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"17e60604-8cc5-4052-a43f-f9c4bbea24cd\") " pod="openstack/rabbitmq-cell1-server-0" Dec 03 07:04:21 crc kubenswrapper[4475]: I1203 07:04:21.464288 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/17e60604-8cc5-4052-a43f-f9c4bbea24cd-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"17e60604-8cc5-4052-a43f-f9c4bbea24cd\") " pod="openstack/rabbitmq-cell1-server-0" Dec 03 07:04:21 crc kubenswrapper[4475]: I1203 07:04:21.465107 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/17e60604-8cc5-4052-a43f-f9c4bbea24cd-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"17e60604-8cc5-4052-a43f-f9c4bbea24cd\") " pod="openstack/rabbitmq-cell1-server-0" Dec 03 07:04:21 crc kubenswrapper[4475]: I1203 07:04:21.468149 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/17e60604-8cc5-4052-a43f-f9c4bbea24cd-pod-info\") pod 
\"rabbitmq-cell1-server-0\" (UID: \"17e60604-8cc5-4052-a43f-f9c4bbea24cd\") " pod="openstack/rabbitmq-cell1-server-0" Dec 03 07:04:21 crc kubenswrapper[4475]: I1203 07:04:21.468385 4475 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"17e60604-8cc5-4052-a43f-f9c4bbea24cd\") device mount path \"/mnt/openstack/pv12\"" pod="openstack/rabbitmq-cell1-server-0" Dec 03 07:04:21 crc kubenswrapper[4475]: I1203 07:04:21.472142 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/17e60604-8cc5-4052-a43f-f9c4bbea24cd-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"17e60604-8cc5-4052-a43f-f9c4bbea24cd\") " pod="openstack/rabbitmq-cell1-server-0" Dec 03 07:04:21 crc kubenswrapper[4475]: I1203 07:04:21.472880 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/17e60604-8cc5-4052-a43f-f9c4bbea24cd-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"17e60604-8cc5-4052-a43f-f9c4bbea24cd\") " pod="openstack/rabbitmq-cell1-server-0" Dec 03 07:04:21 crc kubenswrapper[4475]: I1203 07:04:21.474244 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/17e60604-8cc5-4052-a43f-f9c4bbea24cd-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"17e60604-8cc5-4052-a43f-f9c4bbea24cd\") " pod="openstack/rabbitmq-cell1-server-0" Dec 03 07:04:21 crc kubenswrapper[4475]: I1203 07:04:21.474759 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/17e60604-8cc5-4052-a43f-f9c4bbea24cd-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"17e60604-8cc5-4052-a43f-f9c4bbea24cd\") " 
pod="openstack/rabbitmq-cell1-server-0" Dec 03 07:04:21 crc kubenswrapper[4475]: I1203 07:04:21.480567 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wznmn\" (UniqueName: \"kubernetes.io/projected/17e60604-8cc5-4052-a43f-f9c4bbea24cd-kube-api-access-wznmn\") pod \"rabbitmq-cell1-server-0\" (UID: \"17e60604-8cc5-4052-a43f-f9c4bbea24cd\") " pod="openstack/rabbitmq-cell1-server-0" Dec 03 07:04:21 crc kubenswrapper[4475]: I1203 07:04:21.480654 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/17e60604-8cc5-4052-a43f-f9c4bbea24cd-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"17e60604-8cc5-4052-a43f-f9c4bbea24cd\") " pod="openstack/rabbitmq-cell1-server-0" Dec 03 07:04:21 crc kubenswrapper[4475]: I1203 07:04:21.494343 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"17e60604-8cc5-4052-a43f-f9c4bbea24cd\") " pod="openstack/rabbitmq-cell1-server-0" Dec 03 07:04:21 crc kubenswrapper[4475]: I1203 07:04:21.503014 4475 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="386645cd-74e5-45bc-b3e4-0a326e5349f1" path="/var/lib/kubelet/pods/386645cd-74e5-45bc-b3e4-0a326e5349f1/volumes" Dec 03 07:04:21 crc kubenswrapper[4475]: I1203 07:04:21.503967 4475 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6447be14-8b0d-4514-a7c2-53da228c70c2" path="/var/lib/kubelet/pods/6447be14-8b0d-4514-a7c2-53da228c70c2/volumes" Dec 03 07:04:21 crc kubenswrapper[4475]: I1203 07:04:21.594172 4475 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Dec 03 07:04:21 crc kubenswrapper[4475]: I1203 07:04:21.748408 4475 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"] Dec 03 07:04:21 crc kubenswrapper[4475]: W1203 07:04:21.750388 4475 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0ebb2dfe_acf2_4f92_adf4_02476810eb51.slice/crio-8a378d11cc2ce1f3505d9ebcbc020ef7d4442d21d9199261100860c2e58b5b3f WatchSource:0}: Error finding container 8a378d11cc2ce1f3505d9ebcbc020ef7d4442d21d9199261100860c2e58b5b3f: Status 404 returned error can't find the container with id 8a378d11cc2ce1f3505d9ebcbc020ef7d4442d21d9199261100860c2e58b5b3f Dec 03 07:04:21 crc kubenswrapper[4475]: I1203 07:04:21.849478 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"0ebb2dfe-acf2-4f92-adf4-02476810eb51","Type":"ContainerStarted","Data":"8a378d11cc2ce1f3505d9ebcbc020ef7d4442d21d9199261100860c2e58b5b3f"} Dec 03 07:04:21 crc kubenswrapper[4475]: I1203 07:04:21.850979 4475 generic.go:334] "Generic (PLEG): container finished" podID="28531a18-de2a-4747-aa7b-567eb681b435" containerID="af2ad4c831ffd06be9ae66f6dcd9c4949ede2597582776531eb316fc71a78483" exitCode=0 Dec 03 07:04:21 crc kubenswrapper[4475]: I1203 07:04:21.851046 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7c9f4ccf69-rd2jw" event={"ID":"28531a18-de2a-4747-aa7b-567eb681b435","Type":"ContainerDied","Data":"af2ad4c831ffd06be9ae66f6dcd9c4949ede2597582776531eb316fc71a78483"} Dec 03 07:04:21 crc kubenswrapper[4475]: I1203 07:04:21.851075 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7c9f4ccf69-rd2jw" event={"ID":"28531a18-de2a-4747-aa7b-567eb681b435","Type":"ContainerStarted","Data":"bd03754f2b97ce655cec06bdc23f3e093342f767be0843bee5aefa52aa4bcec2"} Dec 03 07:04:21 crc kubenswrapper[4475]: W1203 
07:04:21.992902 4475 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod17e60604_8cc5_4052_a43f_f9c4bbea24cd.slice/crio-a305cb0fce3f5d3ce508baea16df69086a400af0fd88ace50e3ac267458c2640 WatchSource:0}: Error finding container a305cb0fce3f5d3ce508baea16df69086a400af0fd88ace50e3ac267458c2640: Status 404 returned error can't find the container with id a305cb0fce3f5d3ce508baea16df69086a400af0fd88ace50e3ac267458c2640 Dec 03 07:04:21 crc kubenswrapper[4475]: I1203 07:04:21.998238 4475 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Dec 03 07:04:22 crc kubenswrapper[4475]: I1203 07:04:22.870103 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7c9f4ccf69-rd2jw" event={"ID":"28531a18-de2a-4747-aa7b-567eb681b435","Type":"ContainerStarted","Data":"6616bbc29d07a9aef5574879b1e45e645f551ef43a7141b5727bd8aad3ef7c37"} Dec 03 07:04:22 crc kubenswrapper[4475]: I1203 07:04:22.870428 4475 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-7c9f4ccf69-rd2jw" Dec 03 07:04:22 crc kubenswrapper[4475]: I1203 07:04:22.871900 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"17e60604-8cc5-4052-a43f-f9c4bbea24cd","Type":"ContainerStarted","Data":"a305cb0fce3f5d3ce508baea16df69086a400af0fd88ace50e3ac267458c2640"} Dec 03 07:04:22 crc kubenswrapper[4475]: I1203 07:04:22.885152 4475 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-7c9f4ccf69-rd2jw" podStartSLOduration=2.885134817 podStartE2EDuration="2.885134817s" podCreationTimestamp="2025-12-03 07:04:20 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 07:04:22.884106542 +0000 UTC m=+1147.689004876" watchObservedRunningTime="2025-12-03 07:04:22.885134817 +0000 
UTC m=+1147.690033151" Dec 03 07:04:23 crc kubenswrapper[4475]: I1203 07:04:23.880546 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"17e60604-8cc5-4052-a43f-f9c4bbea24cd","Type":"ContainerStarted","Data":"fe9b2c71ae5b2223f337ecc985cfa68892f1ec8aee56b252221a3ceb0dcff97e"} Dec 03 07:04:23 crc kubenswrapper[4475]: I1203 07:04:23.882329 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"0ebb2dfe-acf2-4f92-adf4-02476810eb51","Type":"ContainerStarted","Data":"aead75449d8a2cb8a178375a57b5ee2bd74382690c3941bfa37ccdc6bc1eadf2"} Dec 03 07:04:25 crc kubenswrapper[4475]: I1203 07:04:25.776105 4475 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/rabbitmq-cell1-server-0" podUID="6447be14-8b0d-4514-a7c2-53da228c70c2" containerName="rabbitmq" probeResult="failure" output="dial tcp 10.217.0.94:5671: i/o timeout" Dec 03 07:04:30 crc kubenswrapper[4475]: I1203 07:04:30.438591 4475 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-7c9f4ccf69-rd2jw" Dec 03 07:04:30 crc kubenswrapper[4475]: I1203 07:04:30.487293 4475 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-57bcd77bb7-6bvxl"] Dec 03 07:04:30 crc kubenswrapper[4475]: I1203 07:04:30.487554 4475 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-57bcd77bb7-6bvxl" podUID="327cade9-5417-4841-8984-eb98c916f0c1" containerName="dnsmasq-dns" containerID="cri-o://7d8962a7a160f903125c5e0d0964f3c4f8df7cc02ae8230371b00a875a168f36" gracePeriod=10 Dec 03 07:04:30 crc kubenswrapper[4475]: I1203 07:04:30.673690 4475 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-68c8f576cf-ktt6x"] Dec 03 07:04:30 crc kubenswrapper[4475]: I1203 07:04:30.674984 4475 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-68c8f576cf-ktt6x" Dec 03 07:04:30 crc kubenswrapper[4475]: I1203 07:04:30.695072 4475 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-68c8f576cf-ktt6x"] Dec 03 07:04:30 crc kubenswrapper[4475]: I1203 07:04:30.824549 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/86397ace-1547-40cd-86c5-d8ce79513490-ovsdbserver-nb\") pod \"dnsmasq-dns-68c8f576cf-ktt6x\" (UID: \"86397ace-1547-40cd-86c5-d8ce79513490\") " pod="openstack/dnsmasq-dns-68c8f576cf-ktt6x" Dec 03 07:04:30 crc kubenswrapper[4475]: I1203 07:04:30.824785 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/86397ace-1547-40cd-86c5-d8ce79513490-ovsdbserver-sb\") pod \"dnsmasq-dns-68c8f576cf-ktt6x\" (UID: \"86397ace-1547-40cd-86c5-d8ce79513490\") " pod="openstack/dnsmasq-dns-68c8f576cf-ktt6x" Dec 03 07:04:30 crc kubenswrapper[4475]: I1203 07:04:30.824833 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/86397ace-1547-40cd-86c5-d8ce79513490-openstack-edpm-ipam\") pod \"dnsmasq-dns-68c8f576cf-ktt6x\" (UID: \"86397ace-1547-40cd-86c5-d8ce79513490\") " pod="openstack/dnsmasq-dns-68c8f576cf-ktt6x" Dec 03 07:04:30 crc kubenswrapper[4475]: I1203 07:04:30.824979 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/86397ace-1547-40cd-86c5-d8ce79513490-config\") pod \"dnsmasq-dns-68c8f576cf-ktt6x\" (UID: \"86397ace-1547-40cd-86c5-d8ce79513490\") " pod="openstack/dnsmasq-dns-68c8f576cf-ktt6x" Dec 03 07:04:30 crc kubenswrapper[4475]: I1203 07:04:30.825083 4475 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/86397ace-1547-40cd-86c5-d8ce79513490-dns-swift-storage-0\") pod \"dnsmasq-dns-68c8f576cf-ktt6x\" (UID: \"86397ace-1547-40cd-86c5-d8ce79513490\") " pod="openstack/dnsmasq-dns-68c8f576cf-ktt6x" Dec 03 07:04:30 crc kubenswrapper[4475]: I1203 07:04:30.825129 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/86397ace-1547-40cd-86c5-d8ce79513490-dns-svc\") pod \"dnsmasq-dns-68c8f576cf-ktt6x\" (UID: \"86397ace-1547-40cd-86c5-d8ce79513490\") " pod="openstack/dnsmasq-dns-68c8f576cf-ktt6x" Dec 03 07:04:30 crc kubenswrapper[4475]: I1203 07:04:30.825698 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t7hvf\" (UniqueName: \"kubernetes.io/projected/86397ace-1547-40cd-86c5-d8ce79513490-kube-api-access-t7hvf\") pod \"dnsmasq-dns-68c8f576cf-ktt6x\" (UID: \"86397ace-1547-40cd-86c5-d8ce79513490\") " pod="openstack/dnsmasq-dns-68c8f576cf-ktt6x" Dec 03 07:04:30 crc kubenswrapper[4475]: I1203 07:04:30.927472 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/86397ace-1547-40cd-86c5-d8ce79513490-dns-swift-storage-0\") pod \"dnsmasq-dns-68c8f576cf-ktt6x\" (UID: \"86397ace-1547-40cd-86c5-d8ce79513490\") " pod="openstack/dnsmasq-dns-68c8f576cf-ktt6x" Dec 03 07:04:30 crc kubenswrapper[4475]: I1203 07:04:30.927510 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/86397ace-1547-40cd-86c5-d8ce79513490-dns-svc\") pod \"dnsmasq-dns-68c8f576cf-ktt6x\" (UID: \"86397ace-1547-40cd-86c5-d8ce79513490\") " pod="openstack/dnsmasq-dns-68c8f576cf-ktt6x" Dec 03 07:04:30 crc kubenswrapper[4475]: I1203 07:04:30.927596 4475 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-t7hvf\" (UniqueName: \"kubernetes.io/projected/86397ace-1547-40cd-86c5-d8ce79513490-kube-api-access-t7hvf\") pod \"dnsmasq-dns-68c8f576cf-ktt6x\" (UID: \"86397ace-1547-40cd-86c5-d8ce79513490\") " pod="openstack/dnsmasq-dns-68c8f576cf-ktt6x" Dec 03 07:04:30 crc kubenswrapper[4475]: I1203 07:04:30.927720 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/86397ace-1547-40cd-86c5-d8ce79513490-ovsdbserver-nb\") pod \"dnsmasq-dns-68c8f576cf-ktt6x\" (UID: \"86397ace-1547-40cd-86c5-d8ce79513490\") " pod="openstack/dnsmasq-dns-68c8f576cf-ktt6x" Dec 03 07:04:30 crc kubenswrapper[4475]: I1203 07:04:30.927754 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/86397ace-1547-40cd-86c5-d8ce79513490-ovsdbserver-sb\") pod \"dnsmasq-dns-68c8f576cf-ktt6x\" (UID: \"86397ace-1547-40cd-86c5-d8ce79513490\") " pod="openstack/dnsmasq-dns-68c8f576cf-ktt6x" Dec 03 07:04:30 crc kubenswrapper[4475]: I1203 07:04:30.928232 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/86397ace-1547-40cd-86c5-d8ce79513490-dns-swift-storage-0\") pod \"dnsmasq-dns-68c8f576cf-ktt6x\" (UID: \"86397ace-1547-40cd-86c5-d8ce79513490\") " pod="openstack/dnsmasq-dns-68c8f576cf-ktt6x" Dec 03 07:04:30 crc kubenswrapper[4475]: I1203 07:04:30.928305 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/86397ace-1547-40cd-86c5-d8ce79513490-dns-svc\") pod \"dnsmasq-dns-68c8f576cf-ktt6x\" (UID: \"86397ace-1547-40cd-86c5-d8ce79513490\") " pod="openstack/dnsmasq-dns-68c8f576cf-ktt6x" Dec 03 07:04:30 crc kubenswrapper[4475]: I1203 07:04:30.928375 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for 
volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/86397ace-1547-40cd-86c5-d8ce79513490-ovsdbserver-nb\") pod \"dnsmasq-dns-68c8f576cf-ktt6x\" (UID: \"86397ace-1547-40cd-86c5-d8ce79513490\") " pod="openstack/dnsmasq-dns-68c8f576cf-ktt6x" Dec 03 07:04:30 crc kubenswrapper[4475]: I1203 07:04:30.928463 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/86397ace-1547-40cd-86c5-d8ce79513490-ovsdbserver-sb\") pod \"dnsmasq-dns-68c8f576cf-ktt6x\" (UID: \"86397ace-1547-40cd-86c5-d8ce79513490\") " pod="openstack/dnsmasq-dns-68c8f576cf-ktt6x" Dec 03 07:04:30 crc kubenswrapper[4475]: I1203 07:04:30.928496 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/86397ace-1547-40cd-86c5-d8ce79513490-openstack-edpm-ipam\") pod \"dnsmasq-dns-68c8f576cf-ktt6x\" (UID: \"86397ace-1547-40cd-86c5-d8ce79513490\") " pod="openstack/dnsmasq-dns-68c8f576cf-ktt6x" Dec 03 07:04:30 crc kubenswrapper[4475]: I1203 07:04:30.928566 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/86397ace-1547-40cd-86c5-d8ce79513490-config\") pod \"dnsmasq-dns-68c8f576cf-ktt6x\" (UID: \"86397ace-1547-40cd-86c5-d8ce79513490\") " pod="openstack/dnsmasq-dns-68c8f576cf-ktt6x" Dec 03 07:04:30 crc kubenswrapper[4475]: I1203 07:04:30.929069 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/86397ace-1547-40cd-86c5-d8ce79513490-openstack-edpm-ipam\") pod \"dnsmasq-dns-68c8f576cf-ktt6x\" (UID: \"86397ace-1547-40cd-86c5-d8ce79513490\") " pod="openstack/dnsmasq-dns-68c8f576cf-ktt6x" Dec 03 07:04:30 crc kubenswrapper[4475]: I1203 07:04:30.929415 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/86397ace-1547-40cd-86c5-d8ce79513490-config\") pod \"dnsmasq-dns-68c8f576cf-ktt6x\" (UID: \"86397ace-1547-40cd-86c5-d8ce79513490\") " pod="openstack/dnsmasq-dns-68c8f576cf-ktt6x" Dec 03 07:04:30 crc kubenswrapper[4475]: I1203 07:04:30.931054 4475 generic.go:334] "Generic (PLEG): container finished" podID="327cade9-5417-4841-8984-eb98c916f0c1" containerID="7d8962a7a160f903125c5e0d0964f3c4f8df7cc02ae8230371b00a875a168f36" exitCode=0 Dec 03 07:04:30 crc kubenswrapper[4475]: I1203 07:04:30.931078 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-57bcd77bb7-6bvxl" event={"ID":"327cade9-5417-4841-8984-eb98c916f0c1","Type":"ContainerDied","Data":"7d8962a7a160f903125c5e0d0964f3c4f8df7cc02ae8230371b00a875a168f36"} Dec 03 07:04:30 crc kubenswrapper[4475]: I1203 07:04:30.931114 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-57bcd77bb7-6bvxl" event={"ID":"327cade9-5417-4841-8984-eb98c916f0c1","Type":"ContainerDied","Data":"fbe464ee9522c2ec52ae8a51376ce558ee38db66045734d02b4bc32d2adf6324"} Dec 03 07:04:30 crc kubenswrapper[4475]: I1203 07:04:30.931125 4475 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="fbe464ee9522c2ec52ae8a51376ce558ee38db66045734d02b4bc32d2adf6324" Dec 03 07:04:30 crc kubenswrapper[4475]: I1203 07:04:30.938721 4475 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-57bcd77bb7-6bvxl" Dec 03 07:04:30 crc kubenswrapper[4475]: I1203 07:04:30.945419 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-t7hvf\" (UniqueName: \"kubernetes.io/projected/86397ace-1547-40cd-86c5-d8ce79513490-kube-api-access-t7hvf\") pod \"dnsmasq-dns-68c8f576cf-ktt6x\" (UID: \"86397ace-1547-40cd-86c5-d8ce79513490\") " pod="openstack/dnsmasq-dns-68c8f576cf-ktt6x" Dec 03 07:04:30 crc kubenswrapper[4475]: I1203 07:04:30.990289 4475 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-68c8f576cf-ktt6x" Dec 03 07:04:31 crc kubenswrapper[4475]: I1203 07:04:31.029484 4475 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/327cade9-5417-4841-8984-eb98c916f0c1-ovsdbserver-sb\") pod \"327cade9-5417-4841-8984-eb98c916f0c1\" (UID: \"327cade9-5417-4841-8984-eb98c916f0c1\") " Dec 03 07:04:31 crc kubenswrapper[4475]: I1203 07:04:31.029523 4475 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/327cade9-5417-4841-8984-eb98c916f0c1-dns-swift-storage-0\") pod \"327cade9-5417-4841-8984-eb98c916f0c1\" (UID: \"327cade9-5417-4841-8984-eb98c916f0c1\") " Dec 03 07:04:31 crc kubenswrapper[4475]: I1203 07:04:31.029613 4475 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/327cade9-5417-4841-8984-eb98c916f0c1-ovsdbserver-nb\") pod \"327cade9-5417-4841-8984-eb98c916f0c1\" (UID: \"327cade9-5417-4841-8984-eb98c916f0c1\") " Dec 03 07:04:31 crc kubenswrapper[4475]: I1203 07:04:31.029678 4475 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-p576g\" (UniqueName: \"kubernetes.io/projected/327cade9-5417-4841-8984-eb98c916f0c1-kube-api-access-p576g\") pod 
\"327cade9-5417-4841-8984-eb98c916f0c1\" (UID: \"327cade9-5417-4841-8984-eb98c916f0c1\") " Dec 03 07:04:31 crc kubenswrapper[4475]: I1203 07:04:31.029714 4475 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/327cade9-5417-4841-8984-eb98c916f0c1-config\") pod \"327cade9-5417-4841-8984-eb98c916f0c1\" (UID: \"327cade9-5417-4841-8984-eb98c916f0c1\") " Dec 03 07:04:31 crc kubenswrapper[4475]: I1203 07:04:31.029814 4475 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/327cade9-5417-4841-8984-eb98c916f0c1-dns-svc\") pod \"327cade9-5417-4841-8984-eb98c916f0c1\" (UID: \"327cade9-5417-4841-8984-eb98c916f0c1\") " Dec 03 07:04:31 crc kubenswrapper[4475]: I1203 07:04:31.037588 4475 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/327cade9-5417-4841-8984-eb98c916f0c1-kube-api-access-p576g" (OuterVolumeSpecName: "kube-api-access-p576g") pod "327cade9-5417-4841-8984-eb98c916f0c1" (UID: "327cade9-5417-4841-8984-eb98c916f0c1"). InnerVolumeSpecName "kube-api-access-p576g". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 07:04:31 crc kubenswrapper[4475]: I1203 07:04:31.074716 4475 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/327cade9-5417-4841-8984-eb98c916f0c1-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "327cade9-5417-4841-8984-eb98c916f0c1" (UID: "327cade9-5417-4841-8984-eb98c916f0c1"). InnerVolumeSpecName "ovsdbserver-nb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 07:04:31 crc kubenswrapper[4475]: I1203 07:04:31.091271 4475 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/327cade9-5417-4841-8984-eb98c916f0c1-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "327cade9-5417-4841-8984-eb98c916f0c1" (UID: "327cade9-5417-4841-8984-eb98c916f0c1"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 07:04:31 crc kubenswrapper[4475]: I1203 07:04:31.093097 4475 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/327cade9-5417-4841-8984-eb98c916f0c1-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "327cade9-5417-4841-8984-eb98c916f0c1" (UID: "327cade9-5417-4841-8984-eb98c916f0c1"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 07:04:31 crc kubenswrapper[4475]: I1203 07:04:31.094329 4475 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/327cade9-5417-4841-8984-eb98c916f0c1-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "327cade9-5417-4841-8984-eb98c916f0c1" (UID: "327cade9-5417-4841-8984-eb98c916f0c1"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 07:04:31 crc kubenswrapper[4475]: I1203 07:04:31.108814 4475 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/327cade9-5417-4841-8984-eb98c916f0c1-config" (OuterVolumeSpecName: "config") pod "327cade9-5417-4841-8984-eb98c916f0c1" (UID: "327cade9-5417-4841-8984-eb98c916f0c1"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 07:04:31 crc kubenswrapper[4475]: I1203 07:04:31.131417 4475 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/327cade9-5417-4841-8984-eb98c916f0c1-dns-svc\") on node \"crc\" DevicePath \"\"" Dec 03 07:04:31 crc kubenswrapper[4475]: I1203 07:04:31.131442 4475 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/327cade9-5417-4841-8984-eb98c916f0c1-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Dec 03 07:04:31 crc kubenswrapper[4475]: I1203 07:04:31.131468 4475 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/327cade9-5417-4841-8984-eb98c916f0c1-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Dec 03 07:04:31 crc kubenswrapper[4475]: I1203 07:04:31.131478 4475 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/327cade9-5417-4841-8984-eb98c916f0c1-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Dec 03 07:04:31 crc kubenswrapper[4475]: I1203 07:04:31.131487 4475 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-p576g\" (UniqueName: \"kubernetes.io/projected/327cade9-5417-4841-8984-eb98c916f0c1-kube-api-access-p576g\") on node \"crc\" DevicePath \"\"" Dec 03 07:04:31 crc kubenswrapper[4475]: I1203 07:04:31.131495 4475 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/327cade9-5417-4841-8984-eb98c916f0c1-config\") on node \"crc\" DevicePath \"\"" Dec 03 07:04:31 crc kubenswrapper[4475]: I1203 07:04:31.401585 4475 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-68c8f576cf-ktt6x"] Dec 03 07:04:31 crc kubenswrapper[4475]: W1203 07:04:31.403874 4475 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod86397ace_1547_40cd_86c5_d8ce79513490.slice/crio-27bde263a8b15a21e769d4b612f020a551a65753dbb9f86dfbca1d1303ef17d6 WatchSource:0}: Error finding container 27bde263a8b15a21e769d4b612f020a551a65753dbb9f86dfbca1d1303ef17d6: Status 404 returned error can't find the container with id 27bde263a8b15a21e769d4b612f020a551a65753dbb9f86dfbca1d1303ef17d6 Dec 03 07:04:31 crc kubenswrapper[4475]: I1203 07:04:31.938338 4475 generic.go:334] "Generic (PLEG): container finished" podID="86397ace-1547-40cd-86c5-d8ce79513490" containerID="77064fc234757dbf46082ad5918f436265a911806675dcbb06e941a3f41e0bc8" exitCode=0 Dec 03 07:04:31 crc kubenswrapper[4475]: I1203 07:04:31.938383 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-68c8f576cf-ktt6x" event={"ID":"86397ace-1547-40cd-86c5-d8ce79513490","Type":"ContainerDied","Data":"77064fc234757dbf46082ad5918f436265a911806675dcbb06e941a3f41e0bc8"} Dec 03 07:04:31 crc kubenswrapper[4475]: I1203 07:04:31.938583 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-68c8f576cf-ktt6x" event={"ID":"86397ace-1547-40cd-86c5-d8ce79513490","Type":"ContainerStarted","Data":"27bde263a8b15a21e769d4b612f020a551a65753dbb9f86dfbca1d1303ef17d6"} Dec 03 07:04:31 crc kubenswrapper[4475]: I1203 07:04:31.938605 4475 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-57bcd77bb7-6bvxl" Dec 03 07:04:31 crc kubenswrapper[4475]: I1203 07:04:31.978397 4475 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-57bcd77bb7-6bvxl"] Dec 03 07:04:31 crc kubenswrapper[4475]: I1203 07:04:31.985517 4475 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-57bcd77bb7-6bvxl"] Dec 03 07:04:32 crc kubenswrapper[4475]: I1203 07:04:32.946888 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-68c8f576cf-ktt6x" event={"ID":"86397ace-1547-40cd-86c5-d8ce79513490","Type":"ContainerStarted","Data":"ade07975616540ce89c8a7f0c06328e3c0565f84a21e684a8360bf5a370a9926"} Dec 03 07:04:32 crc kubenswrapper[4475]: I1203 07:04:32.947578 4475 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-68c8f576cf-ktt6x" Dec 03 07:04:32 crc kubenswrapper[4475]: I1203 07:04:32.966694 4475 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-68c8f576cf-ktt6x" podStartSLOduration=2.966678173 podStartE2EDuration="2.966678173s" podCreationTimestamp="2025-12-03 07:04:30 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 07:04:32.963581869 +0000 UTC m=+1157.768480203" watchObservedRunningTime="2025-12-03 07:04:32.966678173 +0000 UTC m=+1157.771576507" Dec 03 07:04:33 crc kubenswrapper[4475]: I1203 07:04:33.500100 4475 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="327cade9-5417-4841-8984-eb98c916f0c1" path="/var/lib/kubelet/pods/327cade9-5417-4841-8984-eb98c916f0c1/volumes" Dec 03 07:04:40 crc kubenswrapper[4475]: I1203 07:04:40.991571 4475 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-68c8f576cf-ktt6x" Dec 03 07:04:41 crc kubenswrapper[4475]: I1203 07:04:41.032766 4475 kubelet.go:2437] "SyncLoop DELETE" 
source="api" pods=["openstack/dnsmasq-dns-7c9f4ccf69-rd2jw"] Dec 03 07:04:41 crc kubenswrapper[4475]: I1203 07:04:41.033193 4475 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-7c9f4ccf69-rd2jw" podUID="28531a18-de2a-4747-aa7b-567eb681b435" containerName="dnsmasq-dns" containerID="cri-o://6616bbc29d07a9aef5574879b1e45e645f551ef43a7141b5727bd8aad3ef7c37" gracePeriod=10 Dec 03 07:04:41 crc kubenswrapper[4475]: I1203 07:04:41.563344 4475 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7c9f4ccf69-rd2jw" Dec 03 07:04:41 crc kubenswrapper[4475]: I1203 07:04:41.711100 4475 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/28531a18-de2a-4747-aa7b-567eb681b435-dns-swift-storage-0\") pod \"28531a18-de2a-4747-aa7b-567eb681b435\" (UID: \"28531a18-de2a-4747-aa7b-567eb681b435\") " Dec 03 07:04:41 crc kubenswrapper[4475]: I1203 07:04:41.711213 4475 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-thdks\" (UniqueName: \"kubernetes.io/projected/28531a18-de2a-4747-aa7b-567eb681b435-kube-api-access-thdks\") pod \"28531a18-de2a-4747-aa7b-567eb681b435\" (UID: \"28531a18-de2a-4747-aa7b-567eb681b435\") " Dec 03 07:04:41 crc kubenswrapper[4475]: I1203 07:04:41.711570 4475 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/28531a18-de2a-4747-aa7b-567eb681b435-dns-svc\") pod \"28531a18-de2a-4747-aa7b-567eb681b435\" (UID: \"28531a18-de2a-4747-aa7b-567eb681b435\") " Dec 03 07:04:41 crc kubenswrapper[4475]: I1203 07:04:41.711598 4475 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/28531a18-de2a-4747-aa7b-567eb681b435-openstack-edpm-ipam\") pod \"28531a18-de2a-4747-aa7b-567eb681b435\" 
(UID: \"28531a18-de2a-4747-aa7b-567eb681b435\") " Dec 03 07:04:41 crc kubenswrapper[4475]: I1203 07:04:41.711658 4475 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/28531a18-de2a-4747-aa7b-567eb681b435-ovsdbserver-sb\") pod \"28531a18-de2a-4747-aa7b-567eb681b435\" (UID: \"28531a18-de2a-4747-aa7b-567eb681b435\") " Dec 03 07:04:41 crc kubenswrapper[4475]: I1203 07:04:41.711710 4475 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/28531a18-de2a-4747-aa7b-567eb681b435-config\") pod \"28531a18-de2a-4747-aa7b-567eb681b435\" (UID: \"28531a18-de2a-4747-aa7b-567eb681b435\") " Dec 03 07:04:41 crc kubenswrapper[4475]: I1203 07:04:41.711803 4475 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/28531a18-de2a-4747-aa7b-567eb681b435-ovsdbserver-nb\") pod \"28531a18-de2a-4747-aa7b-567eb681b435\" (UID: \"28531a18-de2a-4747-aa7b-567eb681b435\") " Dec 03 07:04:41 crc kubenswrapper[4475]: I1203 07:04:41.716804 4475 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/28531a18-de2a-4747-aa7b-567eb681b435-kube-api-access-thdks" (OuterVolumeSpecName: "kube-api-access-thdks") pod "28531a18-de2a-4747-aa7b-567eb681b435" (UID: "28531a18-de2a-4747-aa7b-567eb681b435"). InnerVolumeSpecName "kube-api-access-thdks". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 07:04:41 crc kubenswrapper[4475]: I1203 07:04:41.752412 4475 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/28531a18-de2a-4747-aa7b-567eb681b435-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "28531a18-de2a-4747-aa7b-567eb681b435" (UID: "28531a18-de2a-4747-aa7b-567eb681b435"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 07:04:41 crc kubenswrapper[4475]: I1203 07:04:41.755828 4475 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/28531a18-de2a-4747-aa7b-567eb681b435-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "28531a18-de2a-4747-aa7b-567eb681b435" (UID: "28531a18-de2a-4747-aa7b-567eb681b435"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 07:04:41 crc kubenswrapper[4475]: I1203 07:04:41.762279 4475 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/28531a18-de2a-4747-aa7b-567eb681b435-config" (OuterVolumeSpecName: "config") pod "28531a18-de2a-4747-aa7b-567eb681b435" (UID: "28531a18-de2a-4747-aa7b-567eb681b435"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 07:04:41 crc kubenswrapper[4475]: I1203 07:04:41.762352 4475 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/28531a18-de2a-4747-aa7b-567eb681b435-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "28531a18-de2a-4747-aa7b-567eb681b435" (UID: "28531a18-de2a-4747-aa7b-567eb681b435"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 07:04:41 crc kubenswrapper[4475]: I1203 07:04:41.762894 4475 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/28531a18-de2a-4747-aa7b-567eb681b435-openstack-edpm-ipam" (OuterVolumeSpecName: "openstack-edpm-ipam") pod "28531a18-de2a-4747-aa7b-567eb681b435" (UID: "28531a18-de2a-4747-aa7b-567eb681b435"). InnerVolumeSpecName "openstack-edpm-ipam". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 07:04:41 crc kubenswrapper[4475]: I1203 07:04:41.764648 4475 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/28531a18-de2a-4747-aa7b-567eb681b435-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "28531a18-de2a-4747-aa7b-567eb681b435" (UID: "28531a18-de2a-4747-aa7b-567eb681b435"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 07:04:41 crc kubenswrapper[4475]: I1203 07:04:41.814625 4475 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/28531a18-de2a-4747-aa7b-567eb681b435-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Dec 03 07:04:41 crc kubenswrapper[4475]: I1203 07:04:41.814653 4475 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-thdks\" (UniqueName: \"kubernetes.io/projected/28531a18-de2a-4747-aa7b-567eb681b435-kube-api-access-thdks\") on node \"crc\" DevicePath \"\"" Dec 03 07:04:41 crc kubenswrapper[4475]: I1203 07:04:41.814665 4475 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/28531a18-de2a-4747-aa7b-567eb681b435-dns-svc\") on node \"crc\" DevicePath \"\"" Dec 03 07:04:41 crc kubenswrapper[4475]: I1203 07:04:41.814673 4475 reconciler_common.go:293] "Volume detached for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/28531a18-de2a-4747-aa7b-567eb681b435-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Dec 03 07:04:41 crc kubenswrapper[4475]: I1203 07:04:41.814681 4475 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/28531a18-de2a-4747-aa7b-567eb681b435-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Dec 03 07:04:41 crc kubenswrapper[4475]: I1203 07:04:41.814688 4475 reconciler_common.go:293] "Volume detached for volume 
\"config\" (UniqueName: \"kubernetes.io/configmap/28531a18-de2a-4747-aa7b-567eb681b435-config\") on node \"crc\" DevicePath \"\"" Dec 03 07:04:41 crc kubenswrapper[4475]: I1203 07:04:41.814695 4475 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/28531a18-de2a-4747-aa7b-567eb681b435-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Dec 03 07:04:42 crc kubenswrapper[4475]: I1203 07:04:42.009738 4475 generic.go:334] "Generic (PLEG): container finished" podID="28531a18-de2a-4747-aa7b-567eb681b435" containerID="6616bbc29d07a9aef5574879b1e45e645f551ef43a7141b5727bd8aad3ef7c37" exitCode=0 Dec 03 07:04:42 crc kubenswrapper[4475]: I1203 07:04:42.009783 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7c9f4ccf69-rd2jw" event={"ID":"28531a18-de2a-4747-aa7b-567eb681b435","Type":"ContainerDied","Data":"6616bbc29d07a9aef5574879b1e45e645f551ef43a7141b5727bd8aad3ef7c37"} Dec 03 07:04:42 crc kubenswrapper[4475]: I1203 07:04:42.009788 4475 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-7c9f4ccf69-rd2jw" Dec 03 07:04:42 crc kubenswrapper[4475]: I1203 07:04:42.009820 4475 scope.go:117] "RemoveContainer" containerID="6616bbc29d07a9aef5574879b1e45e645f551ef43a7141b5727bd8aad3ef7c37" Dec 03 07:04:42 crc kubenswrapper[4475]: I1203 07:04:42.009810 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7c9f4ccf69-rd2jw" event={"ID":"28531a18-de2a-4747-aa7b-567eb681b435","Type":"ContainerDied","Data":"bd03754f2b97ce655cec06bdc23f3e093342f767be0843bee5aefa52aa4bcec2"} Dec 03 07:04:42 crc kubenswrapper[4475]: I1203 07:04:42.026839 4475 scope.go:117] "RemoveContainer" containerID="af2ad4c831ffd06be9ae66f6dcd9c4949ede2597582776531eb316fc71a78483" Dec 03 07:04:42 crc kubenswrapper[4475]: I1203 07:04:42.043462 4475 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-7c9f4ccf69-rd2jw"] Dec 03 07:04:42 crc kubenswrapper[4475]: I1203 07:04:42.053798 4475 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-7c9f4ccf69-rd2jw"] Dec 03 07:04:42 crc kubenswrapper[4475]: I1203 07:04:42.064365 4475 scope.go:117] "RemoveContainer" containerID="6616bbc29d07a9aef5574879b1e45e645f551ef43a7141b5727bd8aad3ef7c37" Dec 03 07:04:42 crc kubenswrapper[4475]: E1203 07:04:42.064664 4475 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6616bbc29d07a9aef5574879b1e45e645f551ef43a7141b5727bd8aad3ef7c37\": container with ID starting with 6616bbc29d07a9aef5574879b1e45e645f551ef43a7141b5727bd8aad3ef7c37 not found: ID does not exist" containerID="6616bbc29d07a9aef5574879b1e45e645f551ef43a7141b5727bd8aad3ef7c37" Dec 03 07:04:42 crc kubenswrapper[4475]: I1203 07:04:42.064691 4475 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6616bbc29d07a9aef5574879b1e45e645f551ef43a7141b5727bd8aad3ef7c37"} err="failed to get container status 
\"6616bbc29d07a9aef5574879b1e45e645f551ef43a7141b5727bd8aad3ef7c37\": rpc error: code = NotFound desc = could not find container \"6616bbc29d07a9aef5574879b1e45e645f551ef43a7141b5727bd8aad3ef7c37\": container with ID starting with 6616bbc29d07a9aef5574879b1e45e645f551ef43a7141b5727bd8aad3ef7c37 not found: ID does not exist" Dec 03 07:04:42 crc kubenswrapper[4475]: I1203 07:04:42.064709 4475 scope.go:117] "RemoveContainer" containerID="af2ad4c831ffd06be9ae66f6dcd9c4949ede2597582776531eb316fc71a78483" Dec 03 07:04:42 crc kubenswrapper[4475]: E1203 07:04:42.065766 4475 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"af2ad4c831ffd06be9ae66f6dcd9c4949ede2597582776531eb316fc71a78483\": container with ID starting with af2ad4c831ffd06be9ae66f6dcd9c4949ede2597582776531eb316fc71a78483 not found: ID does not exist" containerID="af2ad4c831ffd06be9ae66f6dcd9c4949ede2597582776531eb316fc71a78483" Dec 03 07:04:42 crc kubenswrapper[4475]: I1203 07:04:42.065799 4475 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"af2ad4c831ffd06be9ae66f6dcd9c4949ede2597582776531eb316fc71a78483"} err="failed to get container status \"af2ad4c831ffd06be9ae66f6dcd9c4949ede2597582776531eb316fc71a78483\": rpc error: code = NotFound desc = could not find container \"af2ad4c831ffd06be9ae66f6dcd9c4949ede2597582776531eb316fc71a78483\": container with ID starting with af2ad4c831ffd06be9ae66f6dcd9c4949ede2597582776531eb316fc71a78483 not found: ID does not exist" Dec 03 07:04:43 crc kubenswrapper[4475]: I1203 07:04:43.498509 4475 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="28531a18-de2a-4747-aa7b-567eb681b435" path="/var/lib/kubelet/pods/28531a18-de2a-4747-aa7b-567eb681b435/volumes" Dec 03 07:04:55 crc kubenswrapper[4475]: I1203 07:04:55.093024 4475 generic.go:334] "Generic (PLEG): container finished" podID="17e60604-8cc5-4052-a43f-f9c4bbea24cd" 
containerID="fe9b2c71ae5b2223f337ecc985cfa68892f1ec8aee56b252221a3ceb0dcff97e" exitCode=0 Dec 03 07:04:55 crc kubenswrapper[4475]: I1203 07:04:55.093103 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"17e60604-8cc5-4052-a43f-f9c4bbea24cd","Type":"ContainerDied","Data":"fe9b2c71ae5b2223f337ecc985cfa68892f1ec8aee56b252221a3ceb0dcff97e"} Dec 03 07:04:55 crc kubenswrapper[4475]: I1203 07:04:55.095607 4475 generic.go:334] "Generic (PLEG): container finished" podID="0ebb2dfe-acf2-4f92-adf4-02476810eb51" containerID="aead75449d8a2cb8a178375a57b5ee2bd74382690c3941bfa37ccdc6bc1eadf2" exitCode=0 Dec 03 07:04:55 crc kubenswrapper[4475]: I1203 07:04:55.095644 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"0ebb2dfe-acf2-4f92-adf4-02476810eb51","Type":"ContainerDied","Data":"aead75449d8a2cb8a178375a57b5ee2bd74382690c3941bfa37ccdc6bc1eadf2"} Dec 03 07:04:56 crc kubenswrapper[4475]: I1203 07:04:56.108259 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"0ebb2dfe-acf2-4f92-adf4-02476810eb51","Type":"ContainerStarted","Data":"0c6eb22cb6cf7ed3c606b783215e73b6154559e54d6676203501b82428c7d4e8"} Dec 03 07:04:56 crc kubenswrapper[4475]: I1203 07:04:56.110052 4475 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-server-0" Dec 03 07:04:56 crc kubenswrapper[4475]: I1203 07:04:56.115186 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"17e60604-8cc5-4052-a43f-f9c4bbea24cd","Type":"ContainerStarted","Data":"bc62ccfc37a2e4b61aa76f1c934da161f80010d8bdaf340bdb0cd81023a0988e"} Dec 03 07:04:56 crc kubenswrapper[4475]: I1203 07:04:56.116240 4475 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-cell1-server-0" Dec 03 07:04:56 crc kubenswrapper[4475]: I1203 07:04:56.145176 4475 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-server-0" podStartSLOduration=36.145149424 podStartE2EDuration="36.145149424s" podCreationTimestamp="2025-12-03 07:04:20 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 07:04:56.134660884 +0000 UTC m=+1180.939559238" watchObservedRunningTime="2025-12-03 07:04:56.145149424 +0000 UTC m=+1180.950047758" Dec 03 07:04:56 crc kubenswrapper[4475]: I1203 07:04:56.157464 4475 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-cell1-server-0" podStartSLOduration=35.157436967 podStartE2EDuration="35.157436967s" podCreationTimestamp="2025-12-03 07:04:21 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 07:04:56.155250915 +0000 UTC m=+1180.960149249" watchObservedRunningTime="2025-12-03 07:04:56.157436967 +0000 UTC m=+1180.962335300" Dec 03 07:04:59 crc kubenswrapper[4475]: I1203 07:04:59.029778 4475 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-fqk8f"] Dec 03 07:04:59 crc kubenswrapper[4475]: E1203 07:04:59.030303 4475 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="28531a18-de2a-4747-aa7b-567eb681b435" containerName="init" Dec 03 07:04:59 crc kubenswrapper[4475]: I1203 07:04:59.030315 4475 state_mem.go:107] "Deleted CPUSet assignment" podUID="28531a18-de2a-4747-aa7b-567eb681b435" containerName="init" Dec 03 07:04:59 crc kubenswrapper[4475]: E1203 07:04:59.030327 4475 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="327cade9-5417-4841-8984-eb98c916f0c1" containerName="init" Dec 03 07:04:59 crc kubenswrapper[4475]: I1203 07:04:59.030332 4475 state_mem.go:107] "Deleted CPUSet assignment" podUID="327cade9-5417-4841-8984-eb98c916f0c1" containerName="init" Dec 
03 07:04:59 crc kubenswrapper[4475]: E1203 07:04:59.030350 4475 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="28531a18-de2a-4747-aa7b-567eb681b435" containerName="dnsmasq-dns" Dec 03 07:04:59 crc kubenswrapper[4475]: I1203 07:04:59.030356 4475 state_mem.go:107] "Deleted CPUSet assignment" podUID="28531a18-de2a-4747-aa7b-567eb681b435" containerName="dnsmasq-dns" Dec 03 07:04:59 crc kubenswrapper[4475]: E1203 07:04:59.030370 4475 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="327cade9-5417-4841-8984-eb98c916f0c1" containerName="dnsmasq-dns" Dec 03 07:04:59 crc kubenswrapper[4475]: I1203 07:04:59.030375 4475 state_mem.go:107] "Deleted CPUSet assignment" podUID="327cade9-5417-4841-8984-eb98c916f0c1" containerName="dnsmasq-dns" Dec 03 07:04:59 crc kubenswrapper[4475]: I1203 07:04:59.030550 4475 memory_manager.go:354] "RemoveStaleState removing state" podUID="327cade9-5417-4841-8984-eb98c916f0c1" containerName="dnsmasq-dns" Dec 03 07:04:59 crc kubenswrapper[4475]: I1203 07:04:59.030572 4475 memory_manager.go:354] "RemoveStaleState removing state" podUID="28531a18-de2a-4747-aa7b-567eb681b435" containerName="dnsmasq-dns" Dec 03 07:04:59 crc kubenswrapper[4475]: I1203 07:04:59.031095 4475 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-fqk8f" Dec 03 07:04:59 crc kubenswrapper[4475]: I1203 07:04:59.033063 4475 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Dec 03 07:04:59 crc kubenswrapper[4475]: I1203 07:04:59.033804 4475 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Dec 03 07:04:59 crc kubenswrapper[4475]: I1203 07:04:59.033822 4475 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-gxv6j" Dec 03 07:04:59 crc kubenswrapper[4475]: I1203 07:04:59.034141 4475 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Dec 03 07:04:59 crc kubenswrapper[4475]: I1203 07:04:59.046113 4475 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-fqk8f"] Dec 03 07:04:59 crc kubenswrapper[4475]: I1203 07:04:59.113896 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/619f88e0-70f2-42b6-b174-f57e6447f803-repo-setup-combined-ca-bundle\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-fqk8f\" (UID: \"619f88e0-70f2-42b6-b174-f57e6447f803\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-fqk8f" Dec 03 07:04:59 crc kubenswrapper[4475]: I1203 07:04:59.114134 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/619f88e0-70f2-42b6-b174-f57e6447f803-inventory\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-fqk8f\" (UID: \"619f88e0-70f2-42b6-b174-f57e6447f803\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-fqk8f" Dec 03 07:04:59 crc kubenswrapper[4475]: I1203 07:04:59.114171 4475 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/619f88e0-70f2-42b6-b174-f57e6447f803-ssh-key\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-fqk8f\" (UID: \"619f88e0-70f2-42b6-b174-f57e6447f803\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-fqk8f" Dec 03 07:04:59 crc kubenswrapper[4475]: I1203 07:04:59.114628 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nshbj\" (UniqueName: \"kubernetes.io/projected/619f88e0-70f2-42b6-b174-f57e6447f803-kube-api-access-nshbj\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-fqk8f\" (UID: \"619f88e0-70f2-42b6-b174-f57e6447f803\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-fqk8f" Dec 03 07:04:59 crc kubenswrapper[4475]: I1203 07:04:59.216554 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/619f88e0-70f2-42b6-b174-f57e6447f803-inventory\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-fqk8f\" (UID: \"619f88e0-70f2-42b6-b174-f57e6447f803\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-fqk8f" Dec 03 07:04:59 crc kubenswrapper[4475]: I1203 07:04:59.216598 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/619f88e0-70f2-42b6-b174-f57e6447f803-ssh-key\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-fqk8f\" (UID: \"619f88e0-70f2-42b6-b174-f57e6447f803\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-fqk8f" Dec 03 07:04:59 crc kubenswrapper[4475]: I1203 07:04:59.216699 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nshbj\" (UniqueName: \"kubernetes.io/projected/619f88e0-70f2-42b6-b174-f57e6447f803-kube-api-access-nshbj\") pod 
\"repo-setup-edpm-deployment-openstack-edpm-ipam-fqk8f\" (UID: \"619f88e0-70f2-42b6-b174-f57e6447f803\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-fqk8f" Dec 03 07:04:59 crc kubenswrapper[4475]: I1203 07:04:59.216766 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/619f88e0-70f2-42b6-b174-f57e6447f803-repo-setup-combined-ca-bundle\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-fqk8f\" (UID: \"619f88e0-70f2-42b6-b174-f57e6447f803\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-fqk8f" Dec 03 07:04:59 crc kubenswrapper[4475]: I1203 07:04:59.221921 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/619f88e0-70f2-42b6-b174-f57e6447f803-ssh-key\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-fqk8f\" (UID: \"619f88e0-70f2-42b6-b174-f57e6447f803\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-fqk8f" Dec 03 07:04:59 crc kubenswrapper[4475]: I1203 07:04:59.222814 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/619f88e0-70f2-42b6-b174-f57e6447f803-inventory\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-fqk8f\" (UID: \"619f88e0-70f2-42b6-b174-f57e6447f803\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-fqk8f" Dec 03 07:04:59 crc kubenswrapper[4475]: I1203 07:04:59.227988 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/619f88e0-70f2-42b6-b174-f57e6447f803-repo-setup-combined-ca-bundle\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-fqk8f\" (UID: \"619f88e0-70f2-42b6-b174-f57e6447f803\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-fqk8f" Dec 03 07:04:59 crc kubenswrapper[4475]: I1203 07:04:59.232332 4475 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nshbj\" (UniqueName: \"kubernetes.io/projected/619f88e0-70f2-42b6-b174-f57e6447f803-kube-api-access-nshbj\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-fqk8f\" (UID: \"619f88e0-70f2-42b6-b174-f57e6447f803\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-fqk8f" Dec 03 07:04:59 crc kubenswrapper[4475]: I1203 07:04:59.350866 4475 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-fqk8f" Dec 03 07:04:59 crc kubenswrapper[4475]: I1203 07:04:59.800898 4475 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-fqk8f"] Dec 03 07:04:59 crc kubenswrapper[4475]: I1203 07:04:59.815919 4475 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Dec 03 07:05:00 crc kubenswrapper[4475]: I1203 07:05:00.140603 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-fqk8f" event={"ID":"619f88e0-70f2-42b6-b174-f57e6447f803","Type":"ContainerStarted","Data":"0d877253163112d47585e94a2128a0efe497be66603d675615ef2b1a4e11e2be"} Dec 03 07:05:08 crc kubenswrapper[4475]: I1203 07:05:08.198269 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-fqk8f" event={"ID":"619f88e0-70f2-42b6-b174-f57e6447f803","Type":"ContainerStarted","Data":"c6a2c3ded8be48b029221cd424c83d4b7c97a41639f046fc9d53b1c002dda1f7"} Dec 03 07:05:11 crc kubenswrapper[4475]: I1203 07:05:11.339619 4475 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-server-0" Dec 03 07:05:11 crc kubenswrapper[4475]: I1203 07:05:11.360771 4475 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-fqk8f" 
podStartSLOduration=4.403631041 podStartE2EDuration="12.360753977s" podCreationTimestamp="2025-12-03 07:04:59 +0000 UTC" firstStartedPulling="2025-12-03 07:04:59.815683841 +0000 UTC m=+1184.620582175" lastFinishedPulling="2025-12-03 07:05:07.772806777 +0000 UTC m=+1192.577705111" observedRunningTime="2025-12-03 07:05:08.218717241 +0000 UTC m=+1193.023615575" watchObservedRunningTime="2025-12-03 07:05:11.360753977 +0000 UTC m=+1196.165652311" Dec 03 07:05:11 crc kubenswrapper[4475]: I1203 07:05:11.596607 4475 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-cell1-server-0" Dec 03 07:05:19 crc kubenswrapper[4475]: I1203 07:05:19.280097 4475 generic.go:334] "Generic (PLEG): container finished" podID="619f88e0-70f2-42b6-b174-f57e6447f803" containerID="c6a2c3ded8be48b029221cd424c83d4b7c97a41639f046fc9d53b1c002dda1f7" exitCode=0 Dec 03 07:05:19 crc kubenswrapper[4475]: I1203 07:05:19.280154 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-fqk8f" event={"ID":"619f88e0-70f2-42b6-b174-f57e6447f803","Type":"ContainerDied","Data":"c6a2c3ded8be48b029221cd424c83d4b7c97a41639f046fc9d53b1c002dda1f7"} Dec 03 07:05:20 crc kubenswrapper[4475]: I1203 07:05:20.611354 4475 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-fqk8f" Dec 03 07:05:20 crc kubenswrapper[4475]: I1203 07:05:20.785389 4475 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/619f88e0-70f2-42b6-b174-f57e6447f803-inventory\") pod \"619f88e0-70f2-42b6-b174-f57e6447f803\" (UID: \"619f88e0-70f2-42b6-b174-f57e6447f803\") " Dec 03 07:05:20 crc kubenswrapper[4475]: I1203 07:05:20.785425 4475 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/619f88e0-70f2-42b6-b174-f57e6447f803-ssh-key\") pod \"619f88e0-70f2-42b6-b174-f57e6447f803\" (UID: \"619f88e0-70f2-42b6-b174-f57e6447f803\") " Dec 03 07:05:20 crc kubenswrapper[4475]: I1203 07:05:20.785475 4475 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/619f88e0-70f2-42b6-b174-f57e6447f803-repo-setup-combined-ca-bundle\") pod \"619f88e0-70f2-42b6-b174-f57e6447f803\" (UID: \"619f88e0-70f2-42b6-b174-f57e6447f803\") " Dec 03 07:05:20 crc kubenswrapper[4475]: I1203 07:05:20.785532 4475 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nshbj\" (UniqueName: \"kubernetes.io/projected/619f88e0-70f2-42b6-b174-f57e6447f803-kube-api-access-nshbj\") pod \"619f88e0-70f2-42b6-b174-f57e6447f803\" (UID: \"619f88e0-70f2-42b6-b174-f57e6447f803\") " Dec 03 07:05:20 crc kubenswrapper[4475]: I1203 07:05:20.790475 4475 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/619f88e0-70f2-42b6-b174-f57e6447f803-kube-api-access-nshbj" (OuterVolumeSpecName: "kube-api-access-nshbj") pod "619f88e0-70f2-42b6-b174-f57e6447f803" (UID: "619f88e0-70f2-42b6-b174-f57e6447f803"). InnerVolumeSpecName "kube-api-access-nshbj". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 07:05:20 crc kubenswrapper[4475]: I1203 07:05:20.796492 4475 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/619f88e0-70f2-42b6-b174-f57e6447f803-repo-setup-combined-ca-bundle" (OuterVolumeSpecName: "repo-setup-combined-ca-bundle") pod "619f88e0-70f2-42b6-b174-f57e6447f803" (UID: "619f88e0-70f2-42b6-b174-f57e6447f803"). InnerVolumeSpecName "repo-setup-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 07:05:20 crc kubenswrapper[4475]: I1203 07:05:20.807034 4475 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/619f88e0-70f2-42b6-b174-f57e6447f803-inventory" (OuterVolumeSpecName: "inventory") pod "619f88e0-70f2-42b6-b174-f57e6447f803" (UID: "619f88e0-70f2-42b6-b174-f57e6447f803"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 07:05:20 crc kubenswrapper[4475]: I1203 07:05:20.807060 4475 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/619f88e0-70f2-42b6-b174-f57e6447f803-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "619f88e0-70f2-42b6-b174-f57e6447f803" (UID: "619f88e0-70f2-42b6-b174-f57e6447f803"). InnerVolumeSpecName "ssh-key". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 07:05:20 crc kubenswrapper[4475]: I1203 07:05:20.888014 4475 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/619f88e0-70f2-42b6-b174-f57e6447f803-inventory\") on node \"crc\" DevicePath \"\"" Dec 03 07:05:20 crc kubenswrapper[4475]: I1203 07:05:20.888041 4475 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/619f88e0-70f2-42b6-b174-f57e6447f803-ssh-key\") on node \"crc\" DevicePath \"\"" Dec 03 07:05:20 crc kubenswrapper[4475]: I1203 07:05:20.888051 4475 reconciler_common.go:293] "Volume detached for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/619f88e0-70f2-42b6-b174-f57e6447f803-repo-setup-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 03 07:05:20 crc kubenswrapper[4475]: I1203 07:05:20.888061 4475 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nshbj\" (UniqueName: \"kubernetes.io/projected/619f88e0-70f2-42b6-b174-f57e6447f803-kube-api-access-nshbj\") on node \"crc\" DevicePath \"\"" Dec 03 07:05:21 crc kubenswrapper[4475]: I1203 07:05:21.297537 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-fqk8f" event={"ID":"619f88e0-70f2-42b6-b174-f57e6447f803","Type":"ContainerDied","Data":"0d877253163112d47585e94a2128a0efe497be66603d675615ef2b1a4e11e2be"} Dec 03 07:05:21 crc kubenswrapper[4475]: I1203 07:05:21.297751 4475 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="0d877253163112d47585e94a2128a0efe497be66603d675615ef2b1a4e11e2be" Dec 03 07:05:21 crc kubenswrapper[4475]: I1203 07:05:21.297758 4475 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-fqk8f" Dec 03 07:05:21 crc kubenswrapper[4475]: I1203 07:05:21.425369 4475 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/redhat-edpm-deployment-openstack-edpm-ipam-8zxrv"] Dec 03 07:05:21 crc kubenswrapper[4475]: E1203 07:05:21.425786 4475 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="619f88e0-70f2-42b6-b174-f57e6447f803" containerName="repo-setup-edpm-deployment-openstack-edpm-ipam" Dec 03 07:05:21 crc kubenswrapper[4475]: I1203 07:05:21.425805 4475 state_mem.go:107] "Deleted CPUSet assignment" podUID="619f88e0-70f2-42b6-b174-f57e6447f803" containerName="repo-setup-edpm-deployment-openstack-edpm-ipam" Dec 03 07:05:21 crc kubenswrapper[4475]: I1203 07:05:21.425995 4475 memory_manager.go:354] "RemoveStaleState removing state" podUID="619f88e0-70f2-42b6-b174-f57e6447f803" containerName="repo-setup-edpm-deployment-openstack-edpm-ipam" Dec 03 07:05:21 crc kubenswrapper[4475]: I1203 07:05:21.426610 4475 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-8zxrv" Dec 03 07:05:21 crc kubenswrapper[4475]: I1203 07:05:21.427916 4475 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Dec 03 07:05:21 crc kubenswrapper[4475]: I1203 07:05:21.428585 4475 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Dec 03 07:05:21 crc kubenswrapper[4475]: I1203 07:05:21.428757 4475 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-gxv6j" Dec 03 07:05:21 crc kubenswrapper[4475]: I1203 07:05:21.428949 4475 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Dec 03 07:05:21 crc kubenswrapper[4475]: I1203 07:05:21.435773 4475 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/redhat-edpm-deployment-openstack-edpm-ipam-8zxrv"] Dec 03 07:05:21 crc kubenswrapper[4475]: I1203 07:05:21.497366 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/29ac02ae-85b8-43df-a82e-7dca13e5a967-ssh-key\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-8zxrv\" (UID: \"29ac02ae-85b8-43df-a82e-7dca13e5a967\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-8zxrv" Dec 03 07:05:21 crc kubenswrapper[4475]: I1203 07:05:21.497479 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mtsmv\" (UniqueName: \"kubernetes.io/projected/29ac02ae-85b8-43df-a82e-7dca13e5a967-kube-api-access-mtsmv\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-8zxrv\" (UID: \"29ac02ae-85b8-43df-a82e-7dca13e5a967\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-8zxrv" Dec 03 07:05:21 crc kubenswrapper[4475]: I1203 07:05:21.497580 4475 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/29ac02ae-85b8-43df-a82e-7dca13e5a967-inventory\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-8zxrv\" (UID: \"29ac02ae-85b8-43df-a82e-7dca13e5a967\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-8zxrv" Dec 03 07:05:21 crc kubenswrapper[4475]: I1203 07:05:21.598555 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mtsmv\" (UniqueName: \"kubernetes.io/projected/29ac02ae-85b8-43df-a82e-7dca13e5a967-kube-api-access-mtsmv\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-8zxrv\" (UID: \"29ac02ae-85b8-43df-a82e-7dca13e5a967\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-8zxrv" Dec 03 07:05:21 crc kubenswrapper[4475]: I1203 07:05:21.598674 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/29ac02ae-85b8-43df-a82e-7dca13e5a967-inventory\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-8zxrv\" (UID: \"29ac02ae-85b8-43df-a82e-7dca13e5a967\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-8zxrv" Dec 03 07:05:21 crc kubenswrapper[4475]: I1203 07:05:21.598734 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/29ac02ae-85b8-43df-a82e-7dca13e5a967-ssh-key\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-8zxrv\" (UID: \"29ac02ae-85b8-43df-a82e-7dca13e5a967\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-8zxrv" Dec 03 07:05:21 crc kubenswrapper[4475]: I1203 07:05:21.602105 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/29ac02ae-85b8-43df-a82e-7dca13e5a967-inventory\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-8zxrv\" (UID: \"29ac02ae-85b8-43df-a82e-7dca13e5a967\") " 
pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-8zxrv" Dec 03 07:05:21 crc kubenswrapper[4475]: I1203 07:05:21.602195 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/29ac02ae-85b8-43df-a82e-7dca13e5a967-ssh-key\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-8zxrv\" (UID: \"29ac02ae-85b8-43df-a82e-7dca13e5a967\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-8zxrv" Dec 03 07:05:21 crc kubenswrapper[4475]: I1203 07:05:21.612324 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mtsmv\" (UniqueName: \"kubernetes.io/projected/29ac02ae-85b8-43df-a82e-7dca13e5a967-kube-api-access-mtsmv\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-8zxrv\" (UID: \"29ac02ae-85b8-43df-a82e-7dca13e5a967\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-8zxrv" Dec 03 07:05:21 crc kubenswrapper[4475]: I1203 07:05:21.742745 4475 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-8zxrv" Dec 03 07:05:22 crc kubenswrapper[4475]: I1203 07:05:22.164312 4475 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/redhat-edpm-deployment-openstack-edpm-ipam-8zxrv"] Dec 03 07:05:22 crc kubenswrapper[4475]: I1203 07:05:22.305357 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-8zxrv" event={"ID":"29ac02ae-85b8-43df-a82e-7dca13e5a967","Type":"ContainerStarted","Data":"7a9b5ebf82fce1117b0f1e3a9b050d8a03f2e68614e1ed877ff4c80404ef56d5"} Dec 03 07:05:23 crc kubenswrapper[4475]: I1203 07:05:23.316401 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-8zxrv" event={"ID":"29ac02ae-85b8-43df-a82e-7dca13e5a967","Type":"ContainerStarted","Data":"656ba103e21488a20a5ea84455a2086a187f78df64ed876645849591d4767d8d"} Dec 03 07:05:23 crc kubenswrapper[4475]: I1203 07:05:23.329981 4475 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-8zxrv" podStartSLOduration=1.775118304 podStartE2EDuration="2.329965922s" podCreationTimestamp="2025-12-03 07:05:21 +0000 UTC" firstStartedPulling="2025-12-03 07:05:22.171041884 +0000 UTC m=+1206.975940218" lastFinishedPulling="2025-12-03 07:05:22.725889502 +0000 UTC m=+1207.530787836" observedRunningTime="2025-12-03 07:05:23.327128985 +0000 UTC m=+1208.132027318" watchObservedRunningTime="2025-12-03 07:05:23.329965922 +0000 UTC m=+1208.134864255" Dec 03 07:05:25 crc kubenswrapper[4475]: I1203 07:05:25.335200 4475 generic.go:334] "Generic (PLEG): container finished" podID="29ac02ae-85b8-43df-a82e-7dca13e5a967" containerID="656ba103e21488a20a5ea84455a2086a187f78df64ed876645849591d4767d8d" exitCode=0 Dec 03 07:05:25 crc kubenswrapper[4475]: I1203 07:05:25.335440 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-8zxrv" event={"ID":"29ac02ae-85b8-43df-a82e-7dca13e5a967","Type":"ContainerDied","Data":"656ba103e21488a20a5ea84455a2086a187f78df64ed876645849591d4767d8d"} Dec 03 07:05:26 crc kubenswrapper[4475]: I1203 07:05:26.634048 4475 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-8zxrv" Dec 03 07:05:26 crc kubenswrapper[4475]: I1203 07:05:26.783871 4475 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/29ac02ae-85b8-43df-a82e-7dca13e5a967-inventory\") pod \"29ac02ae-85b8-43df-a82e-7dca13e5a967\" (UID: \"29ac02ae-85b8-43df-a82e-7dca13e5a967\") " Dec 03 07:05:26 crc kubenswrapper[4475]: I1203 07:05:26.783918 4475 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mtsmv\" (UniqueName: \"kubernetes.io/projected/29ac02ae-85b8-43df-a82e-7dca13e5a967-kube-api-access-mtsmv\") pod \"29ac02ae-85b8-43df-a82e-7dca13e5a967\" (UID: \"29ac02ae-85b8-43df-a82e-7dca13e5a967\") " Dec 03 07:05:26 crc kubenswrapper[4475]: I1203 07:05:26.783946 4475 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/29ac02ae-85b8-43df-a82e-7dca13e5a967-ssh-key\") pod \"29ac02ae-85b8-43df-a82e-7dca13e5a967\" (UID: \"29ac02ae-85b8-43df-a82e-7dca13e5a967\") " Dec 03 07:05:26 crc kubenswrapper[4475]: I1203 07:05:26.789529 4475 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/29ac02ae-85b8-43df-a82e-7dca13e5a967-kube-api-access-mtsmv" (OuterVolumeSpecName: "kube-api-access-mtsmv") pod "29ac02ae-85b8-43df-a82e-7dca13e5a967" (UID: "29ac02ae-85b8-43df-a82e-7dca13e5a967"). InnerVolumeSpecName "kube-api-access-mtsmv". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 07:05:26 crc kubenswrapper[4475]: I1203 07:05:26.806820 4475 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/29ac02ae-85b8-43df-a82e-7dca13e5a967-inventory" (OuterVolumeSpecName: "inventory") pod "29ac02ae-85b8-43df-a82e-7dca13e5a967" (UID: "29ac02ae-85b8-43df-a82e-7dca13e5a967"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 07:05:26 crc kubenswrapper[4475]: I1203 07:05:26.807545 4475 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/29ac02ae-85b8-43df-a82e-7dca13e5a967-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "29ac02ae-85b8-43df-a82e-7dca13e5a967" (UID: "29ac02ae-85b8-43df-a82e-7dca13e5a967"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 07:05:26 crc kubenswrapper[4475]: I1203 07:05:26.885757 4475 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/29ac02ae-85b8-43df-a82e-7dca13e5a967-inventory\") on node \"crc\" DevicePath \"\"" Dec 03 07:05:26 crc kubenswrapper[4475]: I1203 07:05:26.885909 4475 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mtsmv\" (UniqueName: \"kubernetes.io/projected/29ac02ae-85b8-43df-a82e-7dca13e5a967-kube-api-access-mtsmv\") on node \"crc\" DevicePath \"\"" Dec 03 07:05:26 crc kubenswrapper[4475]: I1203 07:05:26.885967 4475 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/29ac02ae-85b8-43df-a82e-7dca13e5a967-ssh-key\") on node \"crc\" DevicePath \"\"" Dec 03 07:05:27 crc kubenswrapper[4475]: I1203 07:05:27.100975 4475 scope.go:117] "RemoveContainer" containerID="ee598506c3416b4517e59d2ed9d555835bc5aaa108285dc206bf5b914fcfd214" Dec 03 07:05:27 crc kubenswrapper[4475]: I1203 07:05:27.123559 4475 scope.go:117] "RemoveContainer" 
containerID="ea4744fd3cea2ebbc5ee016691ec2d3fafcd209271fae422863e376aa08ff040" Dec 03 07:05:27 crc kubenswrapper[4475]: I1203 07:05:27.160443 4475 scope.go:117] "RemoveContainer" containerID="36757543935b64ce2011e8269e6c8c33fcefe33820f887496e8233a2cbb685bf" Dec 03 07:05:27 crc kubenswrapper[4475]: I1203 07:05:27.351364 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-8zxrv" event={"ID":"29ac02ae-85b8-43df-a82e-7dca13e5a967","Type":"ContainerDied","Data":"7a9b5ebf82fce1117b0f1e3a9b050d8a03f2e68614e1ed877ff4c80404ef56d5"} Dec 03 07:05:27 crc kubenswrapper[4475]: I1203 07:05:27.351398 4475 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-8zxrv" Dec 03 07:05:27 crc kubenswrapper[4475]: I1203 07:05:27.351404 4475 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="7a9b5ebf82fce1117b0f1e3a9b050d8a03f2e68614e1ed877ff4c80404ef56d5" Dec 03 07:05:27 crc kubenswrapper[4475]: I1203 07:05:27.402024 4475 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-8x2hs"] Dec 03 07:05:27 crc kubenswrapper[4475]: E1203 07:05:27.402381 4475 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="29ac02ae-85b8-43df-a82e-7dca13e5a967" containerName="redhat-edpm-deployment-openstack-edpm-ipam" Dec 03 07:05:27 crc kubenswrapper[4475]: I1203 07:05:27.402400 4475 state_mem.go:107] "Deleted CPUSet assignment" podUID="29ac02ae-85b8-43df-a82e-7dca13e5a967" containerName="redhat-edpm-deployment-openstack-edpm-ipam" Dec 03 07:05:27 crc kubenswrapper[4475]: I1203 07:05:27.402603 4475 memory_manager.go:354] "RemoveStaleState removing state" podUID="29ac02ae-85b8-43df-a82e-7dca13e5a967" containerName="redhat-edpm-deployment-openstack-edpm-ipam" Dec 03 07:05:27 crc kubenswrapper[4475]: I1203 07:05:27.403134 4475 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-8x2hs" Dec 03 07:05:27 crc kubenswrapper[4475]: I1203 07:05:27.406485 4475 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Dec 03 07:05:27 crc kubenswrapper[4475]: I1203 07:05:27.406669 4475 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Dec 03 07:05:27 crc kubenswrapper[4475]: I1203 07:05:27.408553 4475 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Dec 03 07:05:27 crc kubenswrapper[4475]: I1203 07:05:27.408665 4475 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-gxv6j" Dec 03 07:05:27 crc kubenswrapper[4475]: I1203 07:05:27.414438 4475 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-8x2hs"] Dec 03 07:05:27 crc kubenswrapper[4475]: I1203 07:05:27.495598 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rlwfx\" (UniqueName: \"kubernetes.io/projected/a68effcf-167a-423d-9998-3f0cbc1f36a9-kube-api-access-rlwfx\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-8x2hs\" (UID: \"a68effcf-167a-423d-9998-3f0cbc1f36a9\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-8x2hs" Dec 03 07:05:27 crc kubenswrapper[4475]: I1203 07:05:27.495673 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a68effcf-167a-423d-9998-3f0cbc1f36a9-bootstrap-combined-ca-bundle\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-8x2hs\" (UID: \"a68effcf-167a-423d-9998-3f0cbc1f36a9\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-8x2hs" Dec 03 07:05:27 crc kubenswrapper[4475]: I1203 
07:05:27.495721 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/a68effcf-167a-423d-9998-3f0cbc1f36a9-inventory\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-8x2hs\" (UID: \"a68effcf-167a-423d-9998-3f0cbc1f36a9\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-8x2hs" Dec 03 07:05:27 crc kubenswrapper[4475]: I1203 07:05:27.495807 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/a68effcf-167a-423d-9998-3f0cbc1f36a9-ssh-key\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-8x2hs\" (UID: \"a68effcf-167a-423d-9998-3f0cbc1f36a9\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-8x2hs" Dec 03 07:05:27 crc kubenswrapper[4475]: I1203 07:05:27.597338 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/a68effcf-167a-423d-9998-3f0cbc1f36a9-inventory\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-8x2hs\" (UID: \"a68effcf-167a-423d-9998-3f0cbc1f36a9\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-8x2hs" Dec 03 07:05:27 crc kubenswrapper[4475]: I1203 07:05:27.597581 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/a68effcf-167a-423d-9998-3f0cbc1f36a9-ssh-key\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-8x2hs\" (UID: \"a68effcf-167a-423d-9998-3f0cbc1f36a9\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-8x2hs" Dec 03 07:05:27 crc kubenswrapper[4475]: I1203 07:05:27.597718 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rlwfx\" (UniqueName: \"kubernetes.io/projected/a68effcf-167a-423d-9998-3f0cbc1f36a9-kube-api-access-rlwfx\") pod 
\"bootstrap-edpm-deployment-openstack-edpm-ipam-8x2hs\" (UID: \"a68effcf-167a-423d-9998-3f0cbc1f36a9\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-8x2hs" Dec 03 07:05:27 crc kubenswrapper[4475]: I1203 07:05:27.597865 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a68effcf-167a-423d-9998-3f0cbc1f36a9-bootstrap-combined-ca-bundle\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-8x2hs\" (UID: \"a68effcf-167a-423d-9998-3f0cbc1f36a9\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-8x2hs" Dec 03 07:05:27 crc kubenswrapper[4475]: I1203 07:05:27.600852 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/a68effcf-167a-423d-9998-3f0cbc1f36a9-inventory\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-8x2hs\" (UID: \"a68effcf-167a-423d-9998-3f0cbc1f36a9\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-8x2hs" Dec 03 07:05:27 crc kubenswrapper[4475]: I1203 07:05:27.601139 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/a68effcf-167a-423d-9998-3f0cbc1f36a9-ssh-key\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-8x2hs\" (UID: \"a68effcf-167a-423d-9998-3f0cbc1f36a9\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-8x2hs" Dec 03 07:05:27 crc kubenswrapper[4475]: I1203 07:05:27.602741 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a68effcf-167a-423d-9998-3f0cbc1f36a9-bootstrap-combined-ca-bundle\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-8x2hs\" (UID: \"a68effcf-167a-423d-9998-3f0cbc1f36a9\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-8x2hs" Dec 03 07:05:27 crc kubenswrapper[4475]: I1203 07:05:27.611932 4475 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rlwfx\" (UniqueName: \"kubernetes.io/projected/a68effcf-167a-423d-9998-3f0cbc1f36a9-kube-api-access-rlwfx\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-8x2hs\" (UID: \"a68effcf-167a-423d-9998-3f0cbc1f36a9\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-8x2hs" Dec 03 07:05:27 crc kubenswrapper[4475]: I1203 07:05:27.730176 4475 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-8x2hs" Dec 03 07:05:28 crc kubenswrapper[4475]: I1203 07:05:28.140856 4475 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-8x2hs"] Dec 03 07:05:28 crc kubenswrapper[4475]: I1203 07:05:28.358944 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-8x2hs" event={"ID":"a68effcf-167a-423d-9998-3f0cbc1f36a9","Type":"ContainerStarted","Data":"bfbe5b6cde0ec893823b81b35476c945a319886cdee27282ea3011b7843a30b3"} Dec 03 07:05:28 crc kubenswrapper[4475]: I1203 07:05:28.933093 4475 patch_prober.go:28] interesting pod/machine-config-daemon-tjbzg container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 03 07:05:28 crc kubenswrapper[4475]: I1203 07:05:28.933318 4475 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-tjbzg" podUID="91aee7be-4a52-4598-803f-2deebe0674de" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 03 07:05:29 crc kubenswrapper[4475]: I1203 07:05:29.366431 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-8x2hs" event={"ID":"a68effcf-167a-423d-9998-3f0cbc1f36a9","Type":"ContainerStarted","Data":"64cd99427d1bfb8085729b95f49242283542467318673b56ae5caf697fa931db"} Dec 03 07:05:29 crc kubenswrapper[4475]: I1203 07:05:29.378821 4475 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-8x2hs" podStartSLOduration=1.8911767689999999 podStartE2EDuration="2.378810185s" podCreationTimestamp="2025-12-03 07:05:27 +0000 UTC" firstStartedPulling="2025-12-03 07:05:28.148682931 +0000 UTC m=+1212.953581265" lastFinishedPulling="2025-12-03 07:05:28.636316347 +0000 UTC m=+1213.441214681" observedRunningTime="2025-12-03 07:05:29.377006042 +0000 UTC m=+1214.181904376" watchObservedRunningTime="2025-12-03 07:05:29.378810185 +0000 UTC m=+1214.183708520" Dec 03 07:05:58 crc kubenswrapper[4475]: I1203 07:05:58.933526 4475 patch_prober.go:28] interesting pod/machine-config-daemon-tjbzg container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 03 07:05:58 crc kubenswrapper[4475]: I1203 07:05:58.933873 4475 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-tjbzg" podUID="91aee7be-4a52-4598-803f-2deebe0674de" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 03 07:06:27 crc kubenswrapper[4475]: I1203 07:06:27.234919 4475 scope.go:117] "RemoveContainer" containerID="cd64158ca58245973db7aeaca00a78a4ddf0872adbe2a1047c5ce26228debe0d" Dec 03 07:06:28 crc kubenswrapper[4475]: I1203 07:06:28.933004 4475 patch_prober.go:28] interesting pod/machine-config-daemon-tjbzg container/machine-config-daemon 
namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 03 07:06:28 crc kubenswrapper[4475]: I1203 07:06:28.933221 4475 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-tjbzg" podUID="91aee7be-4a52-4598-803f-2deebe0674de" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 03 07:06:28 crc kubenswrapper[4475]: I1203 07:06:28.933266 4475 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-tjbzg" Dec 03 07:06:28 crc kubenswrapper[4475]: I1203 07:06:28.933822 4475 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"1464654d0e3e46198ab49244bf31d5b6b7a77079e850cf4c97ff1472e570dfc1"} pod="openshift-machine-config-operator/machine-config-daemon-tjbzg" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Dec 03 07:06:28 crc kubenswrapper[4475]: I1203 07:06:28.933869 4475 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-tjbzg" podUID="91aee7be-4a52-4598-803f-2deebe0674de" containerName="machine-config-daemon" containerID="cri-o://1464654d0e3e46198ab49244bf31d5b6b7a77079e850cf4c97ff1472e570dfc1" gracePeriod=600 Dec 03 07:06:29 crc kubenswrapper[4475]: I1203 07:06:29.759617 4475 generic.go:334] "Generic (PLEG): container finished" podID="91aee7be-4a52-4598-803f-2deebe0674de" containerID="1464654d0e3e46198ab49244bf31d5b6b7a77079e850cf4c97ff1472e570dfc1" exitCode=0 Dec 03 07:06:29 crc kubenswrapper[4475]: I1203 07:06:29.759690 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-machine-config-operator/machine-config-daemon-tjbzg" event={"ID":"91aee7be-4a52-4598-803f-2deebe0674de","Type":"ContainerDied","Data":"1464654d0e3e46198ab49244bf31d5b6b7a77079e850cf4c97ff1472e570dfc1"} Dec 03 07:06:29 crc kubenswrapper[4475]: I1203 07:06:29.759975 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-tjbzg" event={"ID":"91aee7be-4a52-4598-803f-2deebe0674de","Type":"ContainerStarted","Data":"3e6ae2a6419a9cebd1f1dffa9cdf8c8a5acdb2bcb9ceb927f45c4e93564f5359"} Dec 03 07:06:29 crc kubenswrapper[4475]: I1203 07:06:29.759996 4475 scope.go:117] "RemoveContainer" containerID="001eb8a40dd541fdfa62c93940e55ef947928ce582f2778a9f17df66253e35b4" Dec 03 07:07:27 crc kubenswrapper[4475]: I1203 07:07:27.292753 4475 scope.go:117] "RemoveContainer" containerID="b829e8bae3ecfb33d0039b80148cb037437b695ce94d68cbd40f21e1a3e56779" Dec 03 07:08:27 crc kubenswrapper[4475]: I1203 07:08:27.364601 4475 scope.go:117] "RemoveContainer" containerID="2ed38d76695ea0fe62cd693f8f4977e088e10bb5abd97b20bb2c778f25c5dcec" Dec 03 07:08:33 crc kubenswrapper[4475]: I1203 07:08:33.573231 4475 generic.go:334] "Generic (PLEG): container finished" podID="a68effcf-167a-423d-9998-3f0cbc1f36a9" containerID="64cd99427d1bfb8085729b95f49242283542467318673b56ae5caf697fa931db" exitCode=0 Dec 03 07:08:33 crc kubenswrapper[4475]: I1203 07:08:33.573572 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-8x2hs" event={"ID":"a68effcf-167a-423d-9998-3f0cbc1f36a9","Type":"ContainerDied","Data":"64cd99427d1bfb8085729b95f49242283542467318673b56ae5caf697fa931db"} Dec 03 07:08:34 crc kubenswrapper[4475]: I1203 07:08:34.896779 4475 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-8x2hs" Dec 03 07:08:35 crc kubenswrapper[4475]: I1203 07:08:35.091118 4475 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a68effcf-167a-423d-9998-3f0cbc1f36a9-bootstrap-combined-ca-bundle\") pod \"a68effcf-167a-423d-9998-3f0cbc1f36a9\" (UID: \"a68effcf-167a-423d-9998-3f0cbc1f36a9\") " Dec 03 07:08:35 crc kubenswrapper[4475]: I1203 07:08:35.091320 4475 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/a68effcf-167a-423d-9998-3f0cbc1f36a9-ssh-key\") pod \"a68effcf-167a-423d-9998-3f0cbc1f36a9\" (UID: \"a68effcf-167a-423d-9998-3f0cbc1f36a9\") " Dec 03 07:08:35 crc kubenswrapper[4475]: I1203 07:08:35.091356 4475 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/a68effcf-167a-423d-9998-3f0cbc1f36a9-inventory\") pod \"a68effcf-167a-423d-9998-3f0cbc1f36a9\" (UID: \"a68effcf-167a-423d-9998-3f0cbc1f36a9\") " Dec 03 07:08:35 crc kubenswrapper[4475]: I1203 07:08:35.091430 4475 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rlwfx\" (UniqueName: \"kubernetes.io/projected/a68effcf-167a-423d-9998-3f0cbc1f36a9-kube-api-access-rlwfx\") pod \"a68effcf-167a-423d-9998-3f0cbc1f36a9\" (UID: \"a68effcf-167a-423d-9998-3f0cbc1f36a9\") " Dec 03 07:08:35 crc kubenswrapper[4475]: I1203 07:08:35.096239 4475 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a68effcf-167a-423d-9998-3f0cbc1f36a9-bootstrap-combined-ca-bundle" (OuterVolumeSpecName: "bootstrap-combined-ca-bundle") pod "a68effcf-167a-423d-9998-3f0cbc1f36a9" (UID: "a68effcf-167a-423d-9998-3f0cbc1f36a9"). InnerVolumeSpecName "bootstrap-combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 07:08:35 crc kubenswrapper[4475]: I1203 07:08:35.097036 4475 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a68effcf-167a-423d-9998-3f0cbc1f36a9-kube-api-access-rlwfx" (OuterVolumeSpecName: "kube-api-access-rlwfx") pod "a68effcf-167a-423d-9998-3f0cbc1f36a9" (UID: "a68effcf-167a-423d-9998-3f0cbc1f36a9"). InnerVolumeSpecName "kube-api-access-rlwfx". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 07:08:35 crc kubenswrapper[4475]: I1203 07:08:35.114541 4475 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a68effcf-167a-423d-9998-3f0cbc1f36a9-inventory" (OuterVolumeSpecName: "inventory") pod "a68effcf-167a-423d-9998-3f0cbc1f36a9" (UID: "a68effcf-167a-423d-9998-3f0cbc1f36a9"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 07:08:35 crc kubenswrapper[4475]: I1203 07:08:35.116823 4475 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a68effcf-167a-423d-9998-3f0cbc1f36a9-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "a68effcf-167a-423d-9998-3f0cbc1f36a9" (UID: "a68effcf-167a-423d-9998-3f0cbc1f36a9"). InnerVolumeSpecName "ssh-key". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 07:08:35 crc kubenswrapper[4475]: I1203 07:08:35.193138 4475 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/a68effcf-167a-423d-9998-3f0cbc1f36a9-ssh-key\") on node \"crc\" DevicePath \"\"" Dec 03 07:08:35 crc kubenswrapper[4475]: I1203 07:08:35.193161 4475 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/a68effcf-167a-423d-9998-3f0cbc1f36a9-inventory\") on node \"crc\" DevicePath \"\"" Dec 03 07:08:35 crc kubenswrapper[4475]: I1203 07:08:35.193171 4475 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rlwfx\" (UniqueName: \"kubernetes.io/projected/a68effcf-167a-423d-9998-3f0cbc1f36a9-kube-api-access-rlwfx\") on node \"crc\" DevicePath \"\"" Dec 03 07:08:35 crc kubenswrapper[4475]: I1203 07:08:35.193180 4475 reconciler_common.go:293] "Volume detached for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a68effcf-167a-423d-9998-3f0cbc1f36a9-bootstrap-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 03 07:08:35 crc kubenswrapper[4475]: I1203 07:08:35.587091 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-8x2hs" event={"ID":"a68effcf-167a-423d-9998-3f0cbc1f36a9","Type":"ContainerDied","Data":"bfbe5b6cde0ec893823b81b35476c945a319886cdee27282ea3011b7843a30b3"} Dec 03 07:08:35 crc kubenswrapper[4475]: I1203 07:08:35.587126 4475 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="bfbe5b6cde0ec893823b81b35476c945a319886cdee27282ea3011b7843a30b3" Dec 03 07:08:35 crc kubenswrapper[4475]: I1203 07:08:35.587169 4475 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-8x2hs" Dec 03 07:08:35 crc kubenswrapper[4475]: I1203 07:08:35.653826 4475 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/download-cache-edpm-deployment-openstack-edpm-ipam-zq5dq"] Dec 03 07:08:35 crc kubenswrapper[4475]: E1203 07:08:35.654360 4475 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a68effcf-167a-423d-9998-3f0cbc1f36a9" containerName="bootstrap-edpm-deployment-openstack-edpm-ipam" Dec 03 07:08:35 crc kubenswrapper[4475]: I1203 07:08:35.654378 4475 state_mem.go:107] "Deleted CPUSet assignment" podUID="a68effcf-167a-423d-9998-3f0cbc1f36a9" containerName="bootstrap-edpm-deployment-openstack-edpm-ipam" Dec 03 07:08:35 crc kubenswrapper[4475]: I1203 07:08:35.654600 4475 memory_manager.go:354] "RemoveStaleState removing state" podUID="a68effcf-167a-423d-9998-3f0cbc1f36a9" containerName="bootstrap-edpm-deployment-openstack-edpm-ipam" Dec 03 07:08:35 crc kubenswrapper[4475]: I1203 07:08:35.656166 4475 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-zq5dq" Dec 03 07:08:35 crc kubenswrapper[4475]: I1203 07:08:35.666732 4475 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-gxv6j" Dec 03 07:08:35 crc kubenswrapper[4475]: I1203 07:08:35.666931 4475 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Dec 03 07:08:35 crc kubenswrapper[4475]: I1203 07:08:35.667417 4475 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Dec 03 07:08:35 crc kubenswrapper[4475]: I1203 07:08:35.668744 4475 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Dec 03 07:08:35 crc kubenswrapper[4475]: I1203 07:08:35.675800 4475 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/download-cache-edpm-deployment-openstack-edpm-ipam-zq5dq"] Dec 03 07:08:35 crc kubenswrapper[4475]: I1203 07:08:35.701110 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/ef8827bc-648a-41fe-a322-bd2c85e0d535-inventory\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-zq5dq\" (UID: \"ef8827bc-648a-41fe-a322-bd2c85e0d535\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-zq5dq" Dec 03 07:08:35 crc kubenswrapper[4475]: I1203 07:08:35.701286 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/ef8827bc-648a-41fe-a322-bd2c85e0d535-ssh-key\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-zq5dq\" (UID: \"ef8827bc-648a-41fe-a322-bd2c85e0d535\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-zq5dq" Dec 03 07:08:35 crc kubenswrapper[4475]: I1203 07:08:35.701329 4475 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mrkdj\" (UniqueName: \"kubernetes.io/projected/ef8827bc-648a-41fe-a322-bd2c85e0d535-kube-api-access-mrkdj\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-zq5dq\" (UID: \"ef8827bc-648a-41fe-a322-bd2c85e0d535\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-zq5dq" Dec 03 07:08:35 crc kubenswrapper[4475]: I1203 07:08:35.802405 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/ef8827bc-648a-41fe-a322-bd2c85e0d535-inventory\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-zq5dq\" (UID: \"ef8827bc-648a-41fe-a322-bd2c85e0d535\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-zq5dq" Dec 03 07:08:35 crc kubenswrapper[4475]: I1203 07:08:35.802553 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/ef8827bc-648a-41fe-a322-bd2c85e0d535-ssh-key\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-zq5dq\" (UID: \"ef8827bc-648a-41fe-a322-bd2c85e0d535\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-zq5dq" Dec 03 07:08:35 crc kubenswrapper[4475]: I1203 07:08:35.802590 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mrkdj\" (UniqueName: \"kubernetes.io/projected/ef8827bc-648a-41fe-a322-bd2c85e0d535-kube-api-access-mrkdj\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-zq5dq\" (UID: \"ef8827bc-648a-41fe-a322-bd2c85e0d535\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-zq5dq" Dec 03 07:08:35 crc kubenswrapper[4475]: I1203 07:08:35.809002 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/ef8827bc-648a-41fe-a322-bd2c85e0d535-inventory\") pod 
\"download-cache-edpm-deployment-openstack-edpm-ipam-zq5dq\" (UID: \"ef8827bc-648a-41fe-a322-bd2c85e0d535\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-zq5dq" Dec 03 07:08:35 crc kubenswrapper[4475]: I1203 07:08:35.809889 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/ef8827bc-648a-41fe-a322-bd2c85e0d535-ssh-key\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-zq5dq\" (UID: \"ef8827bc-648a-41fe-a322-bd2c85e0d535\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-zq5dq" Dec 03 07:08:35 crc kubenswrapper[4475]: I1203 07:08:35.816150 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mrkdj\" (UniqueName: \"kubernetes.io/projected/ef8827bc-648a-41fe-a322-bd2c85e0d535-kube-api-access-mrkdj\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-zq5dq\" (UID: \"ef8827bc-648a-41fe-a322-bd2c85e0d535\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-zq5dq" Dec 03 07:08:35 crc kubenswrapper[4475]: I1203 07:08:35.975498 4475 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-zq5dq" Dec 03 07:08:36 crc kubenswrapper[4475]: I1203 07:08:36.418208 4475 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/download-cache-edpm-deployment-openstack-edpm-ipam-zq5dq"] Dec 03 07:08:36 crc kubenswrapper[4475]: I1203 07:08:36.596216 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-zq5dq" event={"ID":"ef8827bc-648a-41fe-a322-bd2c85e0d535","Type":"ContainerStarted","Data":"bd1671e719171ebe9632b55dc25e9ca6225897ee63123d42cbae420efa316dae"} Dec 03 07:08:37 crc kubenswrapper[4475]: I1203 07:08:37.604491 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-zq5dq" event={"ID":"ef8827bc-648a-41fe-a322-bd2c85e0d535","Type":"ContainerStarted","Data":"ef420b60343afef57d2e6ae316142037f3753bbc511a1bbeec8d8cb22529a434"} Dec 03 07:08:37 crc kubenswrapper[4475]: I1203 07:08:37.621547 4475 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-zq5dq" podStartSLOduration=2.093661348 podStartE2EDuration="2.621532419s" podCreationTimestamp="2025-12-03 07:08:35 +0000 UTC" firstStartedPulling="2025-12-03 07:08:36.426313796 +0000 UTC m=+1401.231212120" lastFinishedPulling="2025-12-03 07:08:36.954184856 +0000 UTC m=+1401.759083191" observedRunningTime="2025-12-03 07:08:37.615618436 +0000 UTC m=+1402.420516770" watchObservedRunningTime="2025-12-03 07:08:37.621532419 +0000 UTC m=+1402.426430753" Dec 03 07:08:58 crc kubenswrapper[4475]: I1203 07:08:58.933330 4475 patch_prober.go:28] interesting pod/machine-config-daemon-tjbzg container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 03 07:08:58 
crc kubenswrapper[4475]: I1203 07:08:58.933744 4475 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-tjbzg" podUID="91aee7be-4a52-4598-803f-2deebe0674de" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 03 07:09:03 crc kubenswrapper[4475]: I1203 07:09:03.036714 4475 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-b67d-account-create-update-xx754"] Dec 03 07:09:03 crc kubenswrapper[4475]: I1203 07:09:03.043184 4475 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-db-create-nvtnr"] Dec 03 07:09:03 crc kubenswrapper[4475]: I1203 07:09:03.050182 4475 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-db-create-nvtnr"] Dec 03 07:09:03 crc kubenswrapper[4475]: I1203 07:09:03.055760 4475 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-b67d-account-create-update-xx754"] Dec 03 07:09:03 crc kubenswrapper[4475]: I1203 07:09:03.500231 4475 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="45c3aef8-81fe-4320-afbd-c83eb00e861a" path="/var/lib/kubelet/pods/45c3aef8-81fe-4320-afbd-c83eb00e861a/volumes" Dec 03 07:09:03 crc kubenswrapper[4475]: I1203 07:09:03.502370 4475 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="95381607-f11e-478e-b1fc-7ea77035f03f" path="/var/lib/kubelet/pods/95381607-f11e-478e-b1fc-7ea77035f03f/volumes" Dec 03 07:09:07 crc kubenswrapper[4475]: I1203 07:09:07.029640 4475 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-db-create-5b8fz"] Dec 03 07:09:07 crc kubenswrapper[4475]: I1203 07:09:07.042833 4475 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-3d6e-account-create-update-gtfgv"] Dec 03 07:09:07 crc kubenswrapper[4475]: I1203 07:09:07.052132 4475 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openstack/placement-33d5-account-create-update-h4hzd"] Dec 03 07:09:07 crc kubenswrapper[4475]: I1203 07:09:07.058674 4475 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-db-create-6dfhs"] Dec 03 07:09:07 crc kubenswrapper[4475]: I1203 07:09:07.065215 4475 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-db-create-5b8fz"] Dec 03 07:09:07 crc kubenswrapper[4475]: I1203 07:09:07.071177 4475 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-3d6e-account-create-update-gtfgv"] Dec 03 07:09:07 crc kubenswrapper[4475]: I1203 07:09:07.078727 4475 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-db-create-6dfhs"] Dec 03 07:09:07 crc kubenswrapper[4475]: I1203 07:09:07.085162 4475 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-33d5-account-create-update-h4hzd"] Dec 03 07:09:07 crc kubenswrapper[4475]: I1203 07:09:07.500486 4475 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1b649c00-045b-474a-aa22-71e06f5d454f" path="/var/lib/kubelet/pods/1b649c00-045b-474a-aa22-71e06f5d454f/volumes" Dec 03 07:09:07 crc kubenswrapper[4475]: I1203 07:09:07.501694 4475 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9276d039-20eb-4651-b987-97f393cbc59a" path="/var/lib/kubelet/pods/9276d039-20eb-4651-b987-97f393cbc59a/volumes" Dec 03 07:09:07 crc kubenswrapper[4475]: I1203 07:09:07.503755 4475 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e76aadcc-0222-442c-8c9a-d0b197c92978" path="/var/lib/kubelet/pods/e76aadcc-0222-442c-8c9a-d0b197c92978/volumes" Dec 03 07:09:07 crc kubenswrapper[4475]: I1203 07:09:07.505571 4475 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f131782f-4d8b-48fb-9eff-1a2c1f7b859b" path="/var/lib/kubelet/pods/f131782f-4d8b-48fb-9eff-1a2c1f7b859b/volumes" Dec 03 07:09:27 crc kubenswrapper[4475]: I1203 07:09:27.403781 4475 scope.go:117] "RemoveContainer" 
containerID="3a98eb016a766e7141bf0c937f80c29768bca8ce334972bcb076f1ee0d04a8c5" Dec 03 07:09:27 crc kubenswrapper[4475]: I1203 07:09:27.427169 4475 scope.go:117] "RemoveContainer" containerID="a3c176e5a040dee9ac021ab10fa5deec4f496ebd9e227bcb62c68a7ef95d6624" Dec 03 07:09:27 crc kubenswrapper[4475]: I1203 07:09:27.459928 4475 scope.go:117] "RemoveContainer" containerID="78e434cfebb1c0da4cc0193ccaced0ca0fea26f9446299013209707b8e70b844" Dec 03 07:09:27 crc kubenswrapper[4475]: I1203 07:09:27.488578 4475 scope.go:117] "RemoveContainer" containerID="7504efc45e27104247e8090104bccba3a0eb453488cfda69abdd899d8b13c4bc" Dec 03 07:09:27 crc kubenswrapper[4475]: I1203 07:09:27.522273 4475 scope.go:117] "RemoveContainer" containerID="0b635505b253c154fad2b8e4a664f2f5302acc860e0c6b6459fbb74f33bc8bdb" Dec 03 07:09:27 crc kubenswrapper[4475]: I1203 07:09:27.550951 4475 scope.go:117] "RemoveContainer" containerID="7d8962a7a160f903125c5e0d0964f3c4f8df7cc02ae8230371b00a875a168f36" Dec 03 07:09:27 crc kubenswrapper[4475]: I1203 07:09:27.581676 4475 scope.go:117] "RemoveContainer" containerID="6e7a6fc7215fff9a2ce501076749a575f1b754686798d3d087892b7aa8a70f01" Dec 03 07:09:27 crc kubenswrapper[4475]: I1203 07:09:27.597339 4475 scope.go:117] "RemoveContainer" containerID="215c348b1f650cdac200c94f2e35071fba83509aa1b351f56ccc2e99fefc45d8" Dec 03 07:09:28 crc kubenswrapper[4475]: I1203 07:09:28.932871 4475 patch_prober.go:28] interesting pod/machine-config-daemon-tjbzg container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 03 07:09:28 crc kubenswrapper[4475]: I1203 07:09:28.932914 4475 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-tjbzg" podUID="91aee7be-4a52-4598-803f-2deebe0674de" containerName="machine-config-daemon" probeResult="failure" output="Get 
\"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 03 07:09:33 crc kubenswrapper[4475]: I1203 07:09:33.037244 4475 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-db-sync-psvbz"] Dec 03 07:09:33 crc kubenswrapper[4475]: I1203 07:09:33.048605 4475 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-db-sync-psvbz"] Dec 03 07:09:33 crc kubenswrapper[4475]: I1203 07:09:33.501017 4475 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3e136d3d-bbe5-44b0-ad48-4e560507aeac" path="/var/lib/kubelet/pods/3e136d3d-bbe5-44b0-ad48-4e560507aeac/volumes" Dec 03 07:09:36 crc kubenswrapper[4475]: I1203 07:09:36.025368 4475 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/heat-db-create-zdkhj"] Dec 03 07:09:36 crc kubenswrapper[4475]: I1203 07:09:36.032593 4475 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-db-create-vtkm6"] Dec 03 07:09:36 crc kubenswrapper[4475]: I1203 07:09:36.039054 4475 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-df9c-account-create-update-hsg55"] Dec 03 07:09:36 crc kubenswrapper[4475]: I1203 07:09:36.044384 4475 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/heat-db-create-zdkhj"] Dec 03 07:09:36 crc kubenswrapper[4475]: I1203 07:09:36.049596 4475 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-db-create-vtkm6"] Dec 03 07:09:36 crc kubenswrapper[4475]: I1203 07:09:36.054739 4475 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-df9c-account-create-update-hsg55"] Dec 03 07:09:36 crc kubenswrapper[4475]: I1203 07:09:36.059825 4475 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-db-create-nkqhw"] Dec 03 07:09:36 crc kubenswrapper[4475]: I1203 07:09:36.065021 4475 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-9a21-account-create-update-jxlth"] Dec 03 07:09:36 crc 
kubenswrapper[4475]: I1203 07:09:36.070030 4475 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-db-create-nkqhw"] Dec 03 07:09:36 crc kubenswrapper[4475]: I1203 07:09:36.075124 4475 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-9a21-account-create-update-jxlth"] Dec 03 07:09:36 crc kubenswrapper[4475]: I1203 07:09:36.080326 4475 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-db-create-qqg9j"] Dec 03 07:09:36 crc kubenswrapper[4475]: I1203 07:09:36.085428 4475 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/heat-1b56-account-create-update-t9h78"] Dec 03 07:09:36 crc kubenswrapper[4475]: I1203 07:09:36.090575 4475 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-db-create-qqg9j"] Dec 03 07:09:36 crc kubenswrapper[4475]: I1203 07:09:36.095404 4475 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/heat-1b56-account-create-update-t9h78"] Dec 03 07:09:37 crc kubenswrapper[4475]: I1203 07:09:37.020465 4475 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-c2f3-account-create-update-r8pw9"] Dec 03 07:09:37 crc kubenswrapper[4475]: I1203 07:09:37.028598 4475 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-c2f3-account-create-update-r8pw9"] Dec 03 07:09:37 crc kubenswrapper[4475]: I1203 07:09:37.499466 4475 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5045b615-e537-4115-9af6-764a3969ac1b" path="/var/lib/kubelet/pods/5045b615-e537-4115-9af6-764a3969ac1b/volumes" Dec 03 07:09:37 crc kubenswrapper[4475]: I1203 07:09:37.501614 4475 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5f4dac41-4d1c-4ad1-a5ee-802ce88143d3" path="/var/lib/kubelet/pods/5f4dac41-4d1c-4ad1-a5ee-802ce88143d3/volumes" Dec 03 07:09:37 crc kubenswrapper[4475]: I1203 07:09:37.503536 4475 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" 
podUID="6a4c017b-d333-4421-be59-552865c2b025" path="/var/lib/kubelet/pods/6a4c017b-d333-4421-be59-552865c2b025/volumes" Dec 03 07:09:37 crc kubenswrapper[4475]: I1203 07:09:37.504949 4475 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="790b2af7-c661-4e6e-9579-f338835ff45a" path="/var/lib/kubelet/pods/790b2af7-c661-4e6e-9579-f338835ff45a/volumes" Dec 03 07:09:37 crc kubenswrapper[4475]: I1203 07:09:37.506524 4475 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c92b6e31-02a9-4c39-8505-d8c3a9224862" path="/var/lib/kubelet/pods/c92b6e31-02a9-4c39-8505-d8c3a9224862/volumes" Dec 03 07:09:37 crc kubenswrapper[4475]: I1203 07:09:37.508051 4475 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e4684c84-5da0-44d3-a47e-0cd3e2cba943" path="/var/lib/kubelet/pods/e4684c84-5da0-44d3-a47e-0cd3e2cba943/volumes" Dec 03 07:09:37 crc kubenswrapper[4475]: I1203 07:09:37.509546 4475 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f50cb109-7030-4a9a-9401-78f0296c1d4e" path="/var/lib/kubelet/pods/f50cb109-7030-4a9a-9401-78f0296c1d4e/volumes" Dec 03 07:09:37 crc kubenswrapper[4475]: I1203 07:09:37.511532 4475 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f5915d05-922b-4df4-b6da-beadb7537e57" path="/var/lib/kubelet/pods/f5915d05-922b-4df4-b6da-beadb7537e57/volumes" Dec 03 07:09:44 crc kubenswrapper[4475]: I1203 07:09:44.032057 4475 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-db-sync-qvcvf"] Dec 03 07:09:44 crc kubenswrapper[4475]: I1203 07:09:44.038093 4475 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-db-sync-qvcvf"] Dec 03 07:09:45 crc kubenswrapper[4475]: I1203 07:09:45.500610 4475 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6ab43dfd-1c80-4922-aab7-93dc3d3b7d27" path="/var/lib/kubelet/pods/6ab43dfd-1c80-4922-aab7-93dc3d3b7d27/volumes" Dec 03 07:09:58 crc kubenswrapper[4475]: I1203 07:09:58.933365 
4475 patch_prober.go:28] interesting pod/machine-config-daemon-tjbzg container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 03 07:09:58 crc kubenswrapper[4475]: I1203 07:09:58.933790 4475 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-tjbzg" podUID="91aee7be-4a52-4598-803f-2deebe0674de" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 03 07:09:58 crc kubenswrapper[4475]: I1203 07:09:58.933829 4475 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-tjbzg" Dec 03 07:09:58 crc kubenswrapper[4475]: I1203 07:09:58.934424 4475 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"3e6ae2a6419a9cebd1f1dffa9cdf8c8a5acdb2bcb9ceb927f45c4e93564f5359"} pod="openshift-machine-config-operator/machine-config-daemon-tjbzg" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Dec 03 07:09:58 crc kubenswrapper[4475]: I1203 07:09:58.934492 4475 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-tjbzg" podUID="91aee7be-4a52-4598-803f-2deebe0674de" containerName="machine-config-daemon" containerID="cri-o://3e6ae2a6419a9cebd1f1dffa9cdf8c8a5acdb2bcb9ceb927f45c4e93564f5359" gracePeriod=600 Dec 03 07:09:59 crc kubenswrapper[4475]: I1203 07:09:59.140333 4475 generic.go:334] "Generic (PLEG): container finished" podID="91aee7be-4a52-4598-803f-2deebe0674de" containerID="3e6ae2a6419a9cebd1f1dffa9cdf8c8a5acdb2bcb9ceb927f45c4e93564f5359" exitCode=0 Dec 03 07:09:59 crc 
kubenswrapper[4475]: I1203 07:09:59.140377 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-tjbzg" event={"ID":"91aee7be-4a52-4598-803f-2deebe0674de","Type":"ContainerDied","Data":"3e6ae2a6419a9cebd1f1dffa9cdf8c8a5acdb2bcb9ceb927f45c4e93564f5359"} Dec 03 07:09:59 crc kubenswrapper[4475]: I1203 07:09:59.140440 4475 scope.go:117] "RemoveContainer" containerID="1464654d0e3e46198ab49244bf31d5b6b7a77079e850cf4c97ff1472e570dfc1" Dec 03 07:10:00 crc kubenswrapper[4475]: I1203 07:10:00.149081 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-tjbzg" event={"ID":"91aee7be-4a52-4598-803f-2deebe0674de","Type":"ContainerStarted","Data":"8c0d1f1df6ca180fa9eee37943bf61d5a3966b5f7a3ed0b6213a7a47c187d104"} Dec 03 07:10:04 crc kubenswrapper[4475]: I1203 07:10:04.022240 4475 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-db-sync-hk2ck"] Dec 03 07:10:04 crc kubenswrapper[4475]: I1203 07:10:04.028254 4475 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-db-sync-hk2ck"] Dec 03 07:10:05 crc kubenswrapper[4475]: I1203 07:10:05.500722 4475 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="26e8d9b9-9ab9-428f-9b0c-78a50bd71e7a" path="/var/lib/kubelet/pods/26e8d9b9-9ab9-428f-9b0c-78a50bd71e7a/volumes" Dec 03 07:10:16 crc kubenswrapper[4475]: I1203 07:10:16.581643 4475 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-mmkfj"] Dec 03 07:10:16 crc kubenswrapper[4475]: I1203 07:10:16.583573 4475 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-mmkfj" Dec 03 07:10:16 crc kubenswrapper[4475]: I1203 07:10:16.590728 4475 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-mmkfj"] Dec 03 07:10:16 crc kubenswrapper[4475]: I1203 07:10:16.744760 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e3c123e0-7b2e-4710-a0d0-1cafe80a50c4-utilities\") pod \"redhat-marketplace-mmkfj\" (UID: \"e3c123e0-7b2e-4710-a0d0-1cafe80a50c4\") " pod="openshift-marketplace/redhat-marketplace-mmkfj" Dec 03 07:10:16 crc kubenswrapper[4475]: I1203 07:10:16.744986 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n5bjf\" (UniqueName: \"kubernetes.io/projected/e3c123e0-7b2e-4710-a0d0-1cafe80a50c4-kube-api-access-n5bjf\") pod \"redhat-marketplace-mmkfj\" (UID: \"e3c123e0-7b2e-4710-a0d0-1cafe80a50c4\") " pod="openshift-marketplace/redhat-marketplace-mmkfj" Dec 03 07:10:16 crc kubenswrapper[4475]: I1203 07:10:16.745128 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e3c123e0-7b2e-4710-a0d0-1cafe80a50c4-catalog-content\") pod \"redhat-marketplace-mmkfj\" (UID: \"e3c123e0-7b2e-4710-a0d0-1cafe80a50c4\") " pod="openshift-marketplace/redhat-marketplace-mmkfj" Dec 03 07:10:16 crc kubenswrapper[4475]: I1203 07:10:16.846929 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e3c123e0-7b2e-4710-a0d0-1cafe80a50c4-catalog-content\") pod \"redhat-marketplace-mmkfj\" (UID: \"e3c123e0-7b2e-4710-a0d0-1cafe80a50c4\") " pod="openshift-marketplace/redhat-marketplace-mmkfj" Dec 03 07:10:16 crc kubenswrapper[4475]: I1203 07:10:16.847061 4475 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e3c123e0-7b2e-4710-a0d0-1cafe80a50c4-utilities\") pod \"redhat-marketplace-mmkfj\" (UID: \"e3c123e0-7b2e-4710-a0d0-1cafe80a50c4\") " pod="openshift-marketplace/redhat-marketplace-mmkfj" Dec 03 07:10:16 crc kubenswrapper[4475]: I1203 07:10:16.847103 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-n5bjf\" (UniqueName: \"kubernetes.io/projected/e3c123e0-7b2e-4710-a0d0-1cafe80a50c4-kube-api-access-n5bjf\") pod \"redhat-marketplace-mmkfj\" (UID: \"e3c123e0-7b2e-4710-a0d0-1cafe80a50c4\") " pod="openshift-marketplace/redhat-marketplace-mmkfj" Dec 03 07:10:16 crc kubenswrapper[4475]: I1203 07:10:16.847330 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e3c123e0-7b2e-4710-a0d0-1cafe80a50c4-catalog-content\") pod \"redhat-marketplace-mmkfj\" (UID: \"e3c123e0-7b2e-4710-a0d0-1cafe80a50c4\") " pod="openshift-marketplace/redhat-marketplace-mmkfj" Dec 03 07:10:16 crc kubenswrapper[4475]: I1203 07:10:16.847517 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e3c123e0-7b2e-4710-a0d0-1cafe80a50c4-utilities\") pod \"redhat-marketplace-mmkfj\" (UID: \"e3c123e0-7b2e-4710-a0d0-1cafe80a50c4\") " pod="openshift-marketplace/redhat-marketplace-mmkfj" Dec 03 07:10:16 crc kubenswrapper[4475]: I1203 07:10:16.866370 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-n5bjf\" (UniqueName: \"kubernetes.io/projected/e3c123e0-7b2e-4710-a0d0-1cafe80a50c4-kube-api-access-n5bjf\") pod \"redhat-marketplace-mmkfj\" (UID: \"e3c123e0-7b2e-4710-a0d0-1cafe80a50c4\") " pod="openshift-marketplace/redhat-marketplace-mmkfj" Dec 03 07:10:16 crc kubenswrapper[4475]: I1203 07:10:16.896410 4475 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-mmkfj" Dec 03 07:10:17 crc kubenswrapper[4475]: I1203 07:10:17.360236 4475 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-mmkfj"] Dec 03 07:10:18 crc kubenswrapper[4475]: I1203 07:10:18.267013 4475 generic.go:334] "Generic (PLEG): container finished" podID="e3c123e0-7b2e-4710-a0d0-1cafe80a50c4" containerID="c39ba853b58710607d3d233ed611ef2573ab1c29c14ee3164b3a031c41e9c53d" exitCode=0 Dec 03 07:10:18 crc kubenswrapper[4475]: I1203 07:10:18.267051 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-mmkfj" event={"ID":"e3c123e0-7b2e-4710-a0d0-1cafe80a50c4","Type":"ContainerDied","Data":"c39ba853b58710607d3d233ed611ef2573ab1c29c14ee3164b3a031c41e9c53d"} Dec 03 07:10:18 crc kubenswrapper[4475]: I1203 07:10:18.267207 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-mmkfj" event={"ID":"e3c123e0-7b2e-4710-a0d0-1cafe80a50c4","Type":"ContainerStarted","Data":"3e91f6e0003b1f71dd42306e64dc147ef8890411c4e2c30ace43e76fd29349ec"} Dec 03 07:10:18 crc kubenswrapper[4475]: I1203 07:10:18.268944 4475 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Dec 03 07:10:20 crc kubenswrapper[4475]: I1203 07:10:20.283586 4475 generic.go:334] "Generic (PLEG): container finished" podID="e3c123e0-7b2e-4710-a0d0-1cafe80a50c4" containerID="ed6e1e3211047ba70e4bf2a2eda4f7e2d3340c1a8231a0b3b2071a27529a47c1" exitCode=0 Dec 03 07:10:20 crc kubenswrapper[4475]: I1203 07:10:20.283619 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-mmkfj" event={"ID":"e3c123e0-7b2e-4710-a0d0-1cafe80a50c4","Type":"ContainerDied","Data":"ed6e1e3211047ba70e4bf2a2eda4f7e2d3340c1a8231a0b3b2071a27529a47c1"} Dec 03 07:10:21 crc kubenswrapper[4475]: I1203 07:10:21.292702 4475 kubelet.go:2453] "SyncLoop (PLEG): event 
for pod" pod="openshift-marketplace/redhat-marketplace-mmkfj" event={"ID":"e3c123e0-7b2e-4710-a0d0-1cafe80a50c4","Type":"ContainerStarted","Data":"54bd45e9098e25b56e5c174dd4a4e28b4006dcf188222ee32cd151baeb852b04"} Dec 03 07:10:21 crc kubenswrapper[4475]: I1203 07:10:21.308728 4475 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-mmkfj" podStartSLOduration=2.7520085119999997 podStartE2EDuration="5.30871288s" podCreationTimestamp="2025-12-03 07:10:16 +0000 UTC" firstStartedPulling="2025-12-03 07:10:18.268663595 +0000 UTC m=+1503.073561930" lastFinishedPulling="2025-12-03 07:10:20.825367964 +0000 UTC m=+1505.630266298" observedRunningTime="2025-12-03 07:10:21.307906064 +0000 UTC m=+1506.112804398" watchObservedRunningTime="2025-12-03 07:10:21.30871288 +0000 UTC m=+1506.113611214" Dec 03 07:10:26 crc kubenswrapper[4475]: I1203 07:10:26.897489 4475 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-mmkfj" Dec 03 07:10:26 crc kubenswrapper[4475]: I1203 07:10:26.897893 4475 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-mmkfj" Dec 03 07:10:26 crc kubenswrapper[4475]: I1203 07:10:26.941184 4475 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-mmkfj" Dec 03 07:10:27 crc kubenswrapper[4475]: I1203 07:10:27.367722 4475 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-mmkfj" Dec 03 07:10:27 crc kubenswrapper[4475]: I1203 07:10:27.404876 4475 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-mmkfj"] Dec 03 07:10:27 crc kubenswrapper[4475]: I1203 07:10:27.702603 4475 scope.go:117] "RemoveContainer" containerID="6e4bffe4be48739788a9819685203738ee99fe18002f69aefa2e25639ad8edf3" Dec 03 07:10:27 crc 
kubenswrapper[4475]: I1203 07:10:27.721707 4475 scope.go:117] "RemoveContainer" containerID="9a183ceea336a168b469e2f45b5e8914344ea1fd13ea594f0f0b576d57f8457c" Dec 03 07:10:27 crc kubenswrapper[4475]: I1203 07:10:27.758147 4475 scope.go:117] "RemoveContainer" containerID="bb62d6c562cddec704fa5803e771661ac36eadf00ca764823d7d39fc6b47aa5e" Dec 03 07:10:27 crc kubenswrapper[4475]: I1203 07:10:27.792007 4475 scope.go:117] "RemoveContainer" containerID="69b72a6166c127d3fabe2dd5a580369f0ef9a9aae7532f8873500d08d4b089b0" Dec 03 07:10:27 crc kubenswrapper[4475]: I1203 07:10:27.824123 4475 scope.go:117] "RemoveContainer" containerID="94fa3e8d211ebc4a7c67b8c8b7a3e9967b0c98327ed8147162979b3a8935e113" Dec 03 07:10:27 crc kubenswrapper[4475]: I1203 07:10:27.862984 4475 scope.go:117] "RemoveContainer" containerID="65e7abf5c94ef3ae2ade2a98b65eb05e19378e09ecd86c0233a0cc46629fb33f" Dec 03 07:10:27 crc kubenswrapper[4475]: I1203 07:10:27.893363 4475 scope.go:117] "RemoveContainer" containerID="90100f7a1425b499192d4527a35a5d3de7bd153303b05fb114235c18870b6a98" Dec 03 07:10:27 crc kubenswrapper[4475]: I1203 07:10:27.908634 4475 scope.go:117] "RemoveContainer" containerID="4710f6a47201f53eb06128e76c9a9e68c7031c7d594239172a8e1090b4ec8184" Dec 03 07:10:27 crc kubenswrapper[4475]: I1203 07:10:27.921809 4475 scope.go:117] "RemoveContainer" containerID="f252b0908f155e49923598575a58960b2eb53adf2a97b6290a841f2180c1da50" Dec 03 07:10:27 crc kubenswrapper[4475]: I1203 07:10:27.937970 4475 scope.go:117] "RemoveContainer" containerID="ae3b4b724cb3f2d7bb3629dc55c6a6fe81ddcd07954065b00cf111595fe2b905" Dec 03 07:10:27 crc kubenswrapper[4475]: I1203 07:10:27.953553 4475 scope.go:117] "RemoveContainer" containerID="409ac52d52ca48240fcbb42875bd144f05939dd9a77cc52eee561d5b28656b00" Dec 03 07:10:29 crc kubenswrapper[4475]: I1203 07:10:29.350206 4475 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-mmkfj" 
podUID="e3c123e0-7b2e-4710-a0d0-1cafe80a50c4" containerName="registry-server" containerID="cri-o://54bd45e9098e25b56e5c174dd4a4e28b4006dcf188222ee32cd151baeb852b04" gracePeriod=2 Dec 03 07:10:29 crc kubenswrapper[4475]: I1203 07:10:29.779372 4475 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-mmkfj" Dec 03 07:10:29 crc kubenswrapper[4475]: I1203 07:10:29.888036 4475 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e3c123e0-7b2e-4710-a0d0-1cafe80a50c4-catalog-content\") pod \"e3c123e0-7b2e-4710-a0d0-1cafe80a50c4\" (UID: \"e3c123e0-7b2e-4710-a0d0-1cafe80a50c4\") " Dec 03 07:10:29 crc kubenswrapper[4475]: I1203 07:10:29.888137 4475 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-n5bjf\" (UniqueName: \"kubernetes.io/projected/e3c123e0-7b2e-4710-a0d0-1cafe80a50c4-kube-api-access-n5bjf\") pod \"e3c123e0-7b2e-4710-a0d0-1cafe80a50c4\" (UID: \"e3c123e0-7b2e-4710-a0d0-1cafe80a50c4\") " Dec 03 07:10:29 crc kubenswrapper[4475]: I1203 07:10:29.888288 4475 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e3c123e0-7b2e-4710-a0d0-1cafe80a50c4-utilities\") pod \"e3c123e0-7b2e-4710-a0d0-1cafe80a50c4\" (UID: \"e3c123e0-7b2e-4710-a0d0-1cafe80a50c4\") " Dec 03 07:10:29 crc kubenswrapper[4475]: I1203 07:10:29.889089 4475 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e3c123e0-7b2e-4710-a0d0-1cafe80a50c4-utilities" (OuterVolumeSpecName: "utilities") pod "e3c123e0-7b2e-4710-a0d0-1cafe80a50c4" (UID: "e3c123e0-7b2e-4710-a0d0-1cafe80a50c4"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 07:10:29 crc kubenswrapper[4475]: I1203 07:10:29.901072 4475 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e3c123e0-7b2e-4710-a0d0-1cafe80a50c4-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "e3c123e0-7b2e-4710-a0d0-1cafe80a50c4" (UID: "e3c123e0-7b2e-4710-a0d0-1cafe80a50c4"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 07:10:29 crc kubenswrapper[4475]: I1203 07:10:29.902647 4475 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e3c123e0-7b2e-4710-a0d0-1cafe80a50c4-kube-api-access-n5bjf" (OuterVolumeSpecName: "kube-api-access-n5bjf") pod "e3c123e0-7b2e-4710-a0d0-1cafe80a50c4" (UID: "e3c123e0-7b2e-4710-a0d0-1cafe80a50c4"). InnerVolumeSpecName "kube-api-access-n5bjf". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 07:10:29 crc kubenswrapper[4475]: I1203 07:10:29.990536 4475 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e3c123e0-7b2e-4710-a0d0-1cafe80a50c4-utilities\") on node \"crc\" DevicePath \"\"" Dec 03 07:10:29 crc kubenswrapper[4475]: I1203 07:10:29.990558 4475 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e3c123e0-7b2e-4710-a0d0-1cafe80a50c4-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 03 07:10:29 crc kubenswrapper[4475]: I1203 07:10:29.990569 4475 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-n5bjf\" (UniqueName: \"kubernetes.io/projected/e3c123e0-7b2e-4710-a0d0-1cafe80a50c4-kube-api-access-n5bjf\") on node \"crc\" DevicePath \"\"" Dec 03 07:10:30 crc kubenswrapper[4475]: I1203 07:10:30.358728 4475 generic.go:334] "Generic (PLEG): container finished" podID="e3c123e0-7b2e-4710-a0d0-1cafe80a50c4" 
containerID="54bd45e9098e25b56e5c174dd4a4e28b4006dcf188222ee32cd151baeb852b04" exitCode=0 Dec 03 07:10:30 crc kubenswrapper[4475]: I1203 07:10:30.358762 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-mmkfj" event={"ID":"e3c123e0-7b2e-4710-a0d0-1cafe80a50c4","Type":"ContainerDied","Data":"54bd45e9098e25b56e5c174dd4a4e28b4006dcf188222ee32cd151baeb852b04"} Dec 03 07:10:30 crc kubenswrapper[4475]: I1203 07:10:30.358770 4475 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-mmkfj" Dec 03 07:10:30 crc kubenswrapper[4475]: I1203 07:10:30.358783 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-mmkfj" event={"ID":"e3c123e0-7b2e-4710-a0d0-1cafe80a50c4","Type":"ContainerDied","Data":"3e91f6e0003b1f71dd42306e64dc147ef8890411c4e2c30ace43e76fd29349ec"} Dec 03 07:10:30 crc kubenswrapper[4475]: I1203 07:10:30.358800 4475 scope.go:117] "RemoveContainer" containerID="54bd45e9098e25b56e5c174dd4a4e28b4006dcf188222ee32cd151baeb852b04" Dec 03 07:10:30 crc kubenswrapper[4475]: I1203 07:10:30.382598 4475 scope.go:117] "RemoveContainer" containerID="ed6e1e3211047ba70e4bf2a2eda4f7e2d3340c1a8231a0b3b2071a27529a47c1" Dec 03 07:10:30 crc kubenswrapper[4475]: I1203 07:10:30.386548 4475 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-mmkfj"] Dec 03 07:10:30 crc kubenswrapper[4475]: I1203 07:10:30.392769 4475 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-mmkfj"] Dec 03 07:10:30 crc kubenswrapper[4475]: I1203 07:10:30.401501 4475 scope.go:117] "RemoveContainer" containerID="c39ba853b58710607d3d233ed611ef2573ab1c29c14ee3164b3a031c41e9c53d" Dec 03 07:10:30 crc kubenswrapper[4475]: I1203 07:10:30.435996 4475 scope.go:117] "RemoveContainer" containerID="54bd45e9098e25b56e5c174dd4a4e28b4006dcf188222ee32cd151baeb852b04" Dec 03 
07:10:30 crc kubenswrapper[4475]: E1203 07:10:30.436392 4475 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"54bd45e9098e25b56e5c174dd4a4e28b4006dcf188222ee32cd151baeb852b04\": container with ID starting with 54bd45e9098e25b56e5c174dd4a4e28b4006dcf188222ee32cd151baeb852b04 not found: ID does not exist" containerID="54bd45e9098e25b56e5c174dd4a4e28b4006dcf188222ee32cd151baeb852b04" Dec 03 07:10:30 crc kubenswrapper[4475]: I1203 07:10:30.436426 4475 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"54bd45e9098e25b56e5c174dd4a4e28b4006dcf188222ee32cd151baeb852b04"} err="failed to get container status \"54bd45e9098e25b56e5c174dd4a4e28b4006dcf188222ee32cd151baeb852b04\": rpc error: code = NotFound desc = could not find container \"54bd45e9098e25b56e5c174dd4a4e28b4006dcf188222ee32cd151baeb852b04\": container with ID starting with 54bd45e9098e25b56e5c174dd4a4e28b4006dcf188222ee32cd151baeb852b04 not found: ID does not exist" Dec 03 07:10:30 crc kubenswrapper[4475]: I1203 07:10:30.436443 4475 scope.go:117] "RemoveContainer" containerID="ed6e1e3211047ba70e4bf2a2eda4f7e2d3340c1a8231a0b3b2071a27529a47c1" Dec 03 07:10:30 crc kubenswrapper[4475]: E1203 07:10:30.436688 4475 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ed6e1e3211047ba70e4bf2a2eda4f7e2d3340c1a8231a0b3b2071a27529a47c1\": container with ID starting with ed6e1e3211047ba70e4bf2a2eda4f7e2d3340c1a8231a0b3b2071a27529a47c1 not found: ID does not exist" containerID="ed6e1e3211047ba70e4bf2a2eda4f7e2d3340c1a8231a0b3b2071a27529a47c1" Dec 03 07:10:30 crc kubenswrapper[4475]: I1203 07:10:30.436793 4475 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ed6e1e3211047ba70e4bf2a2eda4f7e2d3340c1a8231a0b3b2071a27529a47c1"} err="failed to get container status 
\"ed6e1e3211047ba70e4bf2a2eda4f7e2d3340c1a8231a0b3b2071a27529a47c1\": rpc error: code = NotFound desc = could not find container \"ed6e1e3211047ba70e4bf2a2eda4f7e2d3340c1a8231a0b3b2071a27529a47c1\": container with ID starting with ed6e1e3211047ba70e4bf2a2eda4f7e2d3340c1a8231a0b3b2071a27529a47c1 not found: ID does not exist" Dec 03 07:10:30 crc kubenswrapper[4475]: I1203 07:10:30.436866 4475 scope.go:117] "RemoveContainer" containerID="c39ba853b58710607d3d233ed611ef2573ab1c29c14ee3164b3a031c41e9c53d" Dec 03 07:10:30 crc kubenswrapper[4475]: E1203 07:10:30.437179 4475 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c39ba853b58710607d3d233ed611ef2573ab1c29c14ee3164b3a031c41e9c53d\": container with ID starting with c39ba853b58710607d3d233ed611ef2573ab1c29c14ee3164b3a031c41e9c53d not found: ID does not exist" containerID="c39ba853b58710607d3d233ed611ef2573ab1c29c14ee3164b3a031c41e9c53d" Dec 03 07:10:30 crc kubenswrapper[4475]: I1203 07:10:30.437205 4475 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c39ba853b58710607d3d233ed611ef2573ab1c29c14ee3164b3a031c41e9c53d"} err="failed to get container status \"c39ba853b58710607d3d233ed611ef2573ab1c29c14ee3164b3a031c41e9c53d\": rpc error: code = NotFound desc = could not find container \"c39ba853b58710607d3d233ed611ef2573ab1c29c14ee3164b3a031c41e9c53d\": container with ID starting with c39ba853b58710607d3d233ed611ef2573ab1c29c14ee3164b3a031c41e9c53d not found: ID does not exist" Dec 03 07:10:31 crc kubenswrapper[4475]: I1203 07:10:31.498737 4475 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e3c123e0-7b2e-4710-a0d0-1cafe80a50c4" path="/var/lib/kubelet/pods/e3c123e0-7b2e-4710-a0d0-1cafe80a50c4/volumes" Dec 03 07:10:33 crc kubenswrapper[4475]: I1203 07:10:33.031195 4475 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-bootstrap-65f94"] Dec 03 07:10:33 crc 
kubenswrapper[4475]: I1203 07:10:33.036895 4475 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-bootstrap-65f94"] Dec 03 07:10:33 crc kubenswrapper[4475]: I1203 07:10:33.042135 4475 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-db-sync-pqss2"] Dec 03 07:10:33 crc kubenswrapper[4475]: I1203 07:10:33.047896 4475 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-db-sync-pqss2"] Dec 03 07:10:33 crc kubenswrapper[4475]: I1203 07:10:33.499611 4475 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="18fecb56-f151-4f5a-aac3-30785def9653" path="/var/lib/kubelet/pods/18fecb56-f151-4f5a-aac3-30785def9653/volumes" Dec 03 07:10:33 crc kubenswrapper[4475]: I1203 07:10:33.501121 4475 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="50149f3f-08c3-4fd9-9590-b13fcd787897" path="/var/lib/kubelet/pods/50149f3f-08c3-4fd9-9590-b13fcd787897/volumes" Dec 03 07:10:40 crc kubenswrapper[4475]: I1203 07:10:40.949664 4475 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-kbvk5"] Dec 03 07:10:40 crc kubenswrapper[4475]: E1203 07:10:40.950329 4475 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e3c123e0-7b2e-4710-a0d0-1cafe80a50c4" containerName="extract-content" Dec 03 07:10:40 crc kubenswrapper[4475]: I1203 07:10:40.950341 4475 state_mem.go:107] "Deleted CPUSet assignment" podUID="e3c123e0-7b2e-4710-a0d0-1cafe80a50c4" containerName="extract-content" Dec 03 07:10:40 crc kubenswrapper[4475]: E1203 07:10:40.950355 4475 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e3c123e0-7b2e-4710-a0d0-1cafe80a50c4" containerName="registry-server" Dec 03 07:10:40 crc kubenswrapper[4475]: I1203 07:10:40.950361 4475 state_mem.go:107] "Deleted CPUSet assignment" podUID="e3c123e0-7b2e-4710-a0d0-1cafe80a50c4" containerName="registry-server" Dec 03 07:10:40 crc kubenswrapper[4475]: E1203 07:10:40.950390 4475 
cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e3c123e0-7b2e-4710-a0d0-1cafe80a50c4" containerName="extract-utilities" Dec 03 07:10:40 crc kubenswrapper[4475]: I1203 07:10:40.950397 4475 state_mem.go:107] "Deleted CPUSet assignment" podUID="e3c123e0-7b2e-4710-a0d0-1cafe80a50c4" containerName="extract-utilities" Dec 03 07:10:40 crc kubenswrapper[4475]: I1203 07:10:40.950584 4475 memory_manager.go:354] "RemoveStaleState removing state" podUID="e3c123e0-7b2e-4710-a0d0-1cafe80a50c4" containerName="registry-server" Dec 03 07:10:40 crc kubenswrapper[4475]: I1203 07:10:40.951724 4475 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-kbvk5" Dec 03 07:10:40 crc kubenswrapper[4475]: I1203 07:10:40.961214 4475 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-kbvk5"] Dec 03 07:10:41 crc kubenswrapper[4475]: I1203 07:10:41.060190 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/facc1808-e75e-4c5e-917b-54ed5965a68c-utilities\") pod \"certified-operators-kbvk5\" (UID: \"facc1808-e75e-4c5e-917b-54ed5965a68c\") " pod="openshift-marketplace/certified-operators-kbvk5" Dec 03 07:10:41 crc kubenswrapper[4475]: I1203 07:10:41.060470 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qjdbl\" (UniqueName: \"kubernetes.io/projected/facc1808-e75e-4c5e-917b-54ed5965a68c-kube-api-access-qjdbl\") pod \"certified-operators-kbvk5\" (UID: \"facc1808-e75e-4c5e-917b-54ed5965a68c\") " pod="openshift-marketplace/certified-operators-kbvk5" Dec 03 07:10:41 crc kubenswrapper[4475]: I1203 07:10:41.060584 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: 
\"kubernetes.io/empty-dir/facc1808-e75e-4c5e-917b-54ed5965a68c-catalog-content\") pod \"certified-operators-kbvk5\" (UID: \"facc1808-e75e-4c5e-917b-54ed5965a68c\") " pod="openshift-marketplace/certified-operators-kbvk5" Dec 03 07:10:41 crc kubenswrapper[4475]: I1203 07:10:41.162642 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/facc1808-e75e-4c5e-917b-54ed5965a68c-utilities\") pod \"certified-operators-kbvk5\" (UID: \"facc1808-e75e-4c5e-917b-54ed5965a68c\") " pod="openshift-marketplace/certified-operators-kbvk5" Dec 03 07:10:41 crc kubenswrapper[4475]: I1203 07:10:41.162764 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qjdbl\" (UniqueName: \"kubernetes.io/projected/facc1808-e75e-4c5e-917b-54ed5965a68c-kube-api-access-qjdbl\") pod \"certified-operators-kbvk5\" (UID: \"facc1808-e75e-4c5e-917b-54ed5965a68c\") " pod="openshift-marketplace/certified-operators-kbvk5" Dec 03 07:10:41 crc kubenswrapper[4475]: I1203 07:10:41.162796 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/facc1808-e75e-4c5e-917b-54ed5965a68c-catalog-content\") pod \"certified-operators-kbvk5\" (UID: \"facc1808-e75e-4c5e-917b-54ed5965a68c\") " pod="openshift-marketplace/certified-operators-kbvk5" Dec 03 07:10:41 crc kubenswrapper[4475]: I1203 07:10:41.163546 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/facc1808-e75e-4c5e-917b-54ed5965a68c-catalog-content\") pod \"certified-operators-kbvk5\" (UID: \"facc1808-e75e-4c5e-917b-54ed5965a68c\") " pod="openshift-marketplace/certified-operators-kbvk5" Dec 03 07:10:41 crc kubenswrapper[4475]: I1203 07:10:41.163543 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: 
\"kubernetes.io/empty-dir/facc1808-e75e-4c5e-917b-54ed5965a68c-utilities\") pod \"certified-operators-kbvk5\" (UID: \"facc1808-e75e-4c5e-917b-54ed5965a68c\") " pod="openshift-marketplace/certified-operators-kbvk5" Dec 03 07:10:41 crc kubenswrapper[4475]: I1203 07:10:41.180244 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qjdbl\" (UniqueName: \"kubernetes.io/projected/facc1808-e75e-4c5e-917b-54ed5965a68c-kube-api-access-qjdbl\") pod \"certified-operators-kbvk5\" (UID: \"facc1808-e75e-4c5e-917b-54ed5965a68c\") " pod="openshift-marketplace/certified-operators-kbvk5" Dec 03 07:10:41 crc kubenswrapper[4475]: I1203 07:10:41.266576 4475 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-kbvk5" Dec 03 07:10:41 crc kubenswrapper[4475]: I1203 07:10:41.730103 4475 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-kbvk5"] Dec 03 07:10:42 crc kubenswrapper[4475]: I1203 07:10:42.445444 4475 generic.go:334] "Generic (PLEG): container finished" podID="facc1808-e75e-4c5e-917b-54ed5965a68c" containerID="4c37d7ffa88de4b1cd6a4f97ad5d40f21b4902677018042c424989cfa911c6fe" exitCode=0 Dec 03 07:10:42 crc kubenswrapper[4475]: I1203 07:10:42.445547 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-kbvk5" event={"ID":"facc1808-e75e-4c5e-917b-54ed5965a68c","Type":"ContainerDied","Data":"4c37d7ffa88de4b1cd6a4f97ad5d40f21b4902677018042c424989cfa911c6fe"} Dec 03 07:10:42 crc kubenswrapper[4475]: I1203 07:10:42.445689 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-kbvk5" event={"ID":"facc1808-e75e-4c5e-917b-54ed5965a68c","Type":"ContainerStarted","Data":"16776d492023dacb9291c0f2f7de0e31c25f51a22d317792b8c817b731e3812a"} Dec 03 07:10:43 crc kubenswrapper[4475]: I1203 07:10:43.453368 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/certified-operators-kbvk5" event={"ID":"facc1808-e75e-4c5e-917b-54ed5965a68c","Type":"ContainerStarted","Data":"667546bb7e7ddf83043364737e82fa79b0f83e4544502a0e53defe02c2dd5bde"} Dec 03 07:10:44 crc kubenswrapper[4475]: I1203 07:10:44.460621 4475 generic.go:334] "Generic (PLEG): container finished" podID="facc1808-e75e-4c5e-917b-54ed5965a68c" containerID="667546bb7e7ddf83043364737e82fa79b0f83e4544502a0e53defe02c2dd5bde" exitCode=0 Dec 03 07:10:44 crc kubenswrapper[4475]: I1203 07:10:44.460877 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-kbvk5" event={"ID":"facc1808-e75e-4c5e-917b-54ed5965a68c","Type":"ContainerDied","Data":"667546bb7e7ddf83043364737e82fa79b0f83e4544502a0e53defe02c2dd5bde"} Dec 03 07:10:45 crc kubenswrapper[4475]: I1203 07:10:45.468159 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-kbvk5" event={"ID":"facc1808-e75e-4c5e-917b-54ed5965a68c","Type":"ContainerStarted","Data":"2fb562f65f32c6955c4dd4dabb9aa93d347e74fe9a9f2516c4b367199a79b3ed"} Dec 03 07:10:45 crc kubenswrapper[4475]: I1203 07:10:45.483989 4475 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-kbvk5" podStartSLOduration=2.955850154 podStartE2EDuration="5.483974743s" podCreationTimestamp="2025-12-03 07:10:40 +0000 UTC" firstStartedPulling="2025-12-03 07:10:42.448394535 +0000 UTC m=+1527.253292868" lastFinishedPulling="2025-12-03 07:10:44.976519123 +0000 UTC m=+1529.781417457" observedRunningTime="2025-12-03 07:10:45.481434611 +0000 UTC m=+1530.286332955" watchObservedRunningTime="2025-12-03 07:10:45.483974743 +0000 UTC m=+1530.288873077" Dec 03 07:10:46 crc kubenswrapper[4475]: I1203 07:10:46.032022 4475 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-db-sync-6ntz5"] Dec 03 07:10:46 crc kubenswrapper[4475]: I1203 07:10:46.037583 4475 kubelet.go:2437] "SyncLoop 
DELETE" source="api" pods=["openstack/barbican-db-sync-zch8h"] Dec 03 07:10:46 crc kubenswrapper[4475]: I1203 07:10:46.042630 4475 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/heat-db-sync-bxx54"] Dec 03 07:10:46 crc kubenswrapper[4475]: I1203 07:10:46.048089 4475 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-db-sync-6ntz5"] Dec 03 07:10:46 crc kubenswrapper[4475]: I1203 07:10:46.052935 4475 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-db-sync-zch8h"] Dec 03 07:10:46 crc kubenswrapper[4475]: I1203 07:10:46.057739 4475 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/heat-db-sync-bxx54"] Dec 03 07:10:47 crc kubenswrapper[4475]: I1203 07:10:47.499445 4475 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2a298c73-a9bf-496a-9192-dcbf3e2417cd" path="/var/lib/kubelet/pods/2a298c73-a9bf-496a-9192-dcbf3e2417cd/volumes" Dec 03 07:10:47 crc kubenswrapper[4475]: I1203 07:10:47.500344 4475 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7105b12e-7df5-42e5-b0cc-27ea52ea7b1c" path="/var/lib/kubelet/pods/7105b12e-7df5-42e5-b0cc-27ea52ea7b1c/volumes" Dec 03 07:10:47 crc kubenswrapper[4475]: I1203 07:10:47.501377 4475 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8c7df369-c49c-4d2a-842a-a8bd41944f1b" path="/var/lib/kubelet/pods/8c7df369-c49c-4d2a-842a-a8bd41944f1b/volumes" Dec 03 07:10:51 crc kubenswrapper[4475]: I1203 07:10:51.267095 4475 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-kbvk5" Dec 03 07:10:51 crc kubenswrapper[4475]: I1203 07:10:51.267442 4475 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-kbvk5" Dec 03 07:10:51 crc kubenswrapper[4475]: I1203 07:10:51.297918 4475 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" 
pod="openshift-marketplace/certified-operators-kbvk5"
Dec 03 07:10:51 crc kubenswrapper[4475]: I1203 07:10:51.535961 4475 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-kbvk5"
Dec 03 07:10:51 crc kubenswrapper[4475]: I1203 07:10:51.569989 4475 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-kbvk5"]
Dec 03 07:10:53 crc kubenswrapper[4475]: I1203 07:10:53.516258 4475 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-kbvk5" podUID="facc1808-e75e-4c5e-917b-54ed5965a68c" containerName="registry-server" containerID="cri-o://2fb562f65f32c6955c4dd4dabb9aa93d347e74fe9a9f2516c4b367199a79b3ed" gracePeriod=2
Dec 03 07:10:53 crc kubenswrapper[4475]: I1203 07:10:53.731313 4475 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-jlzfr"]
Dec 03 07:10:53 crc kubenswrapper[4475]: I1203 07:10:53.733111 4475 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-jlzfr"
Dec 03 07:10:53 crc kubenswrapper[4475]: I1203 07:10:53.750138 4475 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-jlzfr"]
Dec 03 07:10:53 crc kubenswrapper[4475]: I1203 07:10:53.861196 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d27e05a2-7d4c-4acb-b31a-084675221a73-catalog-content\") pod \"community-operators-jlzfr\" (UID: \"d27e05a2-7d4c-4acb-b31a-084675221a73\") " pod="openshift-marketplace/community-operators-jlzfr"
Dec 03 07:10:53 crc kubenswrapper[4475]: I1203 07:10:53.861278 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d27e05a2-7d4c-4acb-b31a-084675221a73-utilities\") pod \"community-operators-jlzfr\" (UID: \"d27e05a2-7d4c-4acb-b31a-084675221a73\") " pod="openshift-marketplace/community-operators-jlzfr"
Dec 03 07:10:53 crc kubenswrapper[4475]: I1203 07:10:53.861364 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6vjth\" (UniqueName: \"kubernetes.io/projected/d27e05a2-7d4c-4acb-b31a-084675221a73-kube-api-access-6vjth\") pod \"community-operators-jlzfr\" (UID: \"d27e05a2-7d4c-4acb-b31a-084675221a73\") " pod="openshift-marketplace/community-operators-jlzfr"
Dec 03 07:10:53 crc kubenswrapper[4475]: I1203 07:10:53.913429 4475 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-kbvk5"
Dec 03 07:10:53 crc kubenswrapper[4475]: I1203 07:10:53.963103 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d27e05a2-7d4c-4acb-b31a-084675221a73-catalog-content\") pod \"community-operators-jlzfr\" (UID: \"d27e05a2-7d4c-4acb-b31a-084675221a73\") " pod="openshift-marketplace/community-operators-jlzfr"
Dec 03 07:10:53 crc kubenswrapper[4475]: I1203 07:10:53.963160 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d27e05a2-7d4c-4acb-b31a-084675221a73-utilities\") pod \"community-operators-jlzfr\" (UID: \"d27e05a2-7d4c-4acb-b31a-084675221a73\") " pod="openshift-marketplace/community-operators-jlzfr"
Dec 03 07:10:53 crc kubenswrapper[4475]: I1203 07:10:53.963223 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6vjth\" (UniqueName: \"kubernetes.io/projected/d27e05a2-7d4c-4acb-b31a-084675221a73-kube-api-access-6vjth\") pod \"community-operators-jlzfr\" (UID: \"d27e05a2-7d4c-4acb-b31a-084675221a73\") " pod="openshift-marketplace/community-operators-jlzfr"
Dec 03 07:10:53 crc kubenswrapper[4475]: I1203 07:10:53.963970 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d27e05a2-7d4c-4acb-b31a-084675221a73-catalog-content\") pod \"community-operators-jlzfr\" (UID: \"d27e05a2-7d4c-4acb-b31a-084675221a73\") " pod="openshift-marketplace/community-operators-jlzfr"
Dec 03 07:10:53 crc kubenswrapper[4475]: I1203 07:10:53.964001 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d27e05a2-7d4c-4acb-b31a-084675221a73-utilities\") pod \"community-operators-jlzfr\" (UID: \"d27e05a2-7d4c-4acb-b31a-084675221a73\") " pod="openshift-marketplace/community-operators-jlzfr"
Dec 03 07:10:53 crc kubenswrapper[4475]: I1203 07:10:53.980761 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6vjth\" (UniqueName: \"kubernetes.io/projected/d27e05a2-7d4c-4acb-b31a-084675221a73-kube-api-access-6vjth\") pod \"community-operators-jlzfr\" (UID: \"d27e05a2-7d4c-4acb-b31a-084675221a73\") " pod="openshift-marketplace/community-operators-jlzfr"
Dec 03 07:10:54 crc kubenswrapper[4475]: I1203 07:10:54.051046 4475 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-jlzfr"
Dec 03 07:10:54 crc kubenswrapper[4475]: I1203 07:10:54.065340 4475 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/facc1808-e75e-4c5e-917b-54ed5965a68c-utilities\") pod \"facc1808-e75e-4c5e-917b-54ed5965a68c\" (UID: \"facc1808-e75e-4c5e-917b-54ed5965a68c\") "
Dec 03 07:10:54 crc kubenswrapper[4475]: I1203 07:10:54.065409 4475 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/facc1808-e75e-4c5e-917b-54ed5965a68c-catalog-content\") pod \"facc1808-e75e-4c5e-917b-54ed5965a68c\" (UID: \"facc1808-e75e-4c5e-917b-54ed5965a68c\") "
Dec 03 07:10:54 crc kubenswrapper[4475]: I1203 07:10:54.065536 4475 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qjdbl\" (UniqueName: \"kubernetes.io/projected/facc1808-e75e-4c5e-917b-54ed5965a68c-kube-api-access-qjdbl\") pod \"facc1808-e75e-4c5e-917b-54ed5965a68c\" (UID: \"facc1808-e75e-4c5e-917b-54ed5965a68c\") "
Dec 03 07:10:54 crc kubenswrapper[4475]: I1203 07:10:54.065878 4475 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/facc1808-e75e-4c5e-917b-54ed5965a68c-utilities" (OuterVolumeSpecName: "utilities") pod "facc1808-e75e-4c5e-917b-54ed5965a68c" (UID: "facc1808-e75e-4c5e-917b-54ed5965a68c"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Dec 03 07:10:54 crc kubenswrapper[4475]: I1203 07:10:54.066031 4475 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/facc1808-e75e-4c5e-917b-54ed5965a68c-utilities\") on node \"crc\" DevicePath \"\""
Dec 03 07:10:54 crc kubenswrapper[4475]: I1203 07:10:54.068989 4475 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/facc1808-e75e-4c5e-917b-54ed5965a68c-kube-api-access-qjdbl" (OuterVolumeSpecName: "kube-api-access-qjdbl") pod "facc1808-e75e-4c5e-917b-54ed5965a68c" (UID: "facc1808-e75e-4c5e-917b-54ed5965a68c"). InnerVolumeSpecName "kube-api-access-qjdbl". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 03 07:10:54 crc kubenswrapper[4475]: I1203 07:10:54.132831 4475 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/facc1808-e75e-4c5e-917b-54ed5965a68c-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "facc1808-e75e-4c5e-917b-54ed5965a68c" (UID: "facc1808-e75e-4c5e-917b-54ed5965a68c"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Dec 03 07:10:54 crc kubenswrapper[4475]: I1203 07:10:54.167102 4475 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/facc1808-e75e-4c5e-917b-54ed5965a68c-catalog-content\") on node \"crc\" DevicePath \"\""
Dec 03 07:10:54 crc kubenswrapper[4475]: I1203 07:10:54.167125 4475 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qjdbl\" (UniqueName: \"kubernetes.io/projected/facc1808-e75e-4c5e-917b-54ed5965a68c-kube-api-access-qjdbl\") on node \"crc\" DevicePath \"\""
Dec 03 07:10:54 crc kubenswrapper[4475]: I1203 07:10:54.524363 4475 generic.go:334] "Generic (PLEG): container finished" podID="facc1808-e75e-4c5e-917b-54ed5965a68c" containerID="2fb562f65f32c6955c4dd4dabb9aa93d347e74fe9a9f2516c4b367199a79b3ed" exitCode=0
Dec 03 07:10:54 crc kubenswrapper[4475]: I1203 07:10:54.524612 4475 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-kbvk5"
Dec 03 07:10:54 crc kubenswrapper[4475]: I1203 07:10:54.527249 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-kbvk5" event={"ID":"facc1808-e75e-4c5e-917b-54ed5965a68c","Type":"ContainerDied","Data":"2fb562f65f32c6955c4dd4dabb9aa93d347e74fe9a9f2516c4b367199a79b3ed"}
Dec 03 07:10:54 crc kubenswrapper[4475]: I1203 07:10:54.527302 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-kbvk5" event={"ID":"facc1808-e75e-4c5e-917b-54ed5965a68c","Type":"ContainerDied","Data":"16776d492023dacb9291c0f2f7de0e31c25f51a22d317792b8c817b731e3812a"}
Dec 03 07:10:54 crc kubenswrapper[4475]: I1203 07:10:54.527330 4475 scope.go:117] "RemoveContainer" containerID="2fb562f65f32c6955c4dd4dabb9aa93d347e74fe9a9f2516c4b367199a79b3ed"
Dec 03 07:10:54 crc kubenswrapper[4475]: I1203 07:10:54.545498 4475 scope.go:117] "RemoveContainer" containerID="667546bb7e7ddf83043364737e82fa79b0f83e4544502a0e53defe02c2dd5bde"
Dec 03 07:10:54 crc kubenswrapper[4475]: I1203 07:10:54.554431 4475 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-jlzfr"]
Dec 03 07:10:54 crc kubenswrapper[4475]: W1203 07:10:54.558679 4475 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd27e05a2_7d4c_4acb_b31a_084675221a73.slice/crio-2fe44d234206702f43afdeb290808ecf8a4f9cb5956e039f17a7dd7cff56383b WatchSource:0}: Error finding container 2fe44d234206702f43afdeb290808ecf8a4f9cb5956e039f17a7dd7cff56383b: Status 404 returned error can't find the container with id 2fe44d234206702f43afdeb290808ecf8a4f9cb5956e039f17a7dd7cff56383b
Dec 03 07:10:54 crc kubenswrapper[4475]: I1203 07:10:54.567183 4475 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-kbvk5"]
Dec 03 07:10:54 crc kubenswrapper[4475]: I1203 07:10:54.567479 4475 scope.go:117] "RemoveContainer" containerID="4c37d7ffa88de4b1cd6a4f97ad5d40f21b4902677018042c424989cfa911c6fe"
Dec 03 07:10:54 crc kubenswrapper[4475]: I1203 07:10:54.571789 4475 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-kbvk5"]
Dec 03 07:10:54 crc kubenswrapper[4475]: I1203 07:10:54.588391 4475 scope.go:117] "RemoveContainer" containerID="2fb562f65f32c6955c4dd4dabb9aa93d347e74fe9a9f2516c4b367199a79b3ed"
Dec 03 07:10:54 crc kubenswrapper[4475]: E1203 07:10:54.588763 4475 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2fb562f65f32c6955c4dd4dabb9aa93d347e74fe9a9f2516c4b367199a79b3ed\": container with ID starting with 2fb562f65f32c6955c4dd4dabb9aa93d347e74fe9a9f2516c4b367199a79b3ed not found: ID does not exist" containerID="2fb562f65f32c6955c4dd4dabb9aa93d347e74fe9a9f2516c4b367199a79b3ed"
Dec 03 07:10:54 crc kubenswrapper[4475]: I1203 07:10:54.588815 4475 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2fb562f65f32c6955c4dd4dabb9aa93d347e74fe9a9f2516c4b367199a79b3ed"} err="failed to get container status \"2fb562f65f32c6955c4dd4dabb9aa93d347e74fe9a9f2516c4b367199a79b3ed\": rpc error: code = NotFound desc = could not find container \"2fb562f65f32c6955c4dd4dabb9aa93d347e74fe9a9f2516c4b367199a79b3ed\": container with ID starting with 2fb562f65f32c6955c4dd4dabb9aa93d347e74fe9a9f2516c4b367199a79b3ed not found: ID does not exist"
Dec 03 07:10:54 crc kubenswrapper[4475]: I1203 07:10:54.588834 4475 scope.go:117] "RemoveContainer" containerID="667546bb7e7ddf83043364737e82fa79b0f83e4544502a0e53defe02c2dd5bde"
Dec 03 07:10:54 crc kubenswrapper[4475]: E1203 07:10:54.589115 4475 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"667546bb7e7ddf83043364737e82fa79b0f83e4544502a0e53defe02c2dd5bde\": container with ID starting with 667546bb7e7ddf83043364737e82fa79b0f83e4544502a0e53defe02c2dd5bde not found: ID does not exist" containerID="667546bb7e7ddf83043364737e82fa79b0f83e4544502a0e53defe02c2dd5bde"
Dec 03 07:10:54 crc kubenswrapper[4475]: I1203 07:10:54.589139 4475 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"667546bb7e7ddf83043364737e82fa79b0f83e4544502a0e53defe02c2dd5bde"} err="failed to get container status \"667546bb7e7ddf83043364737e82fa79b0f83e4544502a0e53defe02c2dd5bde\": rpc error: code = NotFound desc = could not find container \"667546bb7e7ddf83043364737e82fa79b0f83e4544502a0e53defe02c2dd5bde\": container with ID starting with 667546bb7e7ddf83043364737e82fa79b0f83e4544502a0e53defe02c2dd5bde not found: ID does not exist"
Dec 03 07:10:54 crc kubenswrapper[4475]: I1203 07:10:54.589213 4475 scope.go:117] "RemoveContainer" containerID="4c37d7ffa88de4b1cd6a4f97ad5d40f21b4902677018042c424989cfa911c6fe"
Dec 03 07:10:54 crc kubenswrapper[4475]: E1203 07:10:54.589616 4475 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4c37d7ffa88de4b1cd6a4f97ad5d40f21b4902677018042c424989cfa911c6fe\": container with ID starting with 4c37d7ffa88de4b1cd6a4f97ad5d40f21b4902677018042c424989cfa911c6fe not found: ID does not exist" containerID="4c37d7ffa88de4b1cd6a4f97ad5d40f21b4902677018042c424989cfa911c6fe"
Dec 03 07:10:54 crc kubenswrapper[4475]: I1203 07:10:54.589637 4475 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4c37d7ffa88de4b1cd6a4f97ad5d40f21b4902677018042c424989cfa911c6fe"} err="failed to get container status \"4c37d7ffa88de4b1cd6a4f97ad5d40f21b4902677018042c424989cfa911c6fe\": rpc error: code = NotFound desc = could not find container \"4c37d7ffa88de4b1cd6a4f97ad5d40f21b4902677018042c424989cfa911c6fe\": container with ID starting with 4c37d7ffa88de4b1cd6a4f97ad5d40f21b4902677018042c424989cfa911c6fe not found: ID does not exist"
Dec 03 07:10:55 crc kubenswrapper[4475]: I1203 07:10:55.499945 4475 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="facc1808-e75e-4c5e-917b-54ed5965a68c" path="/var/lib/kubelet/pods/facc1808-e75e-4c5e-917b-54ed5965a68c/volumes"
Dec 03 07:10:55 crc kubenswrapper[4475]: I1203 07:10:55.532393 4475 generic.go:334] "Generic (PLEG): container finished" podID="d27e05a2-7d4c-4acb-b31a-084675221a73" containerID="748d415c922151fb1deb71a073c65199494569d60f9c5de94ec9dc2a667b13e6" exitCode=0
Dec 03 07:10:55 crc kubenswrapper[4475]: I1203 07:10:55.532418 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-jlzfr" event={"ID":"d27e05a2-7d4c-4acb-b31a-084675221a73","Type":"ContainerDied","Data":"748d415c922151fb1deb71a073c65199494569d60f9c5de94ec9dc2a667b13e6"}
Dec 03 07:10:55 crc kubenswrapper[4475]: I1203 07:10:55.532480 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-jlzfr" event={"ID":"d27e05a2-7d4c-4acb-b31a-084675221a73","Type":"ContainerStarted","Data":"2fe44d234206702f43afdeb290808ecf8a4f9cb5956e039f17a7dd7cff56383b"}
Dec 03 07:10:56 crc kubenswrapper[4475]: I1203 07:10:56.542028 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-jlzfr" event={"ID":"d27e05a2-7d4c-4acb-b31a-084675221a73","Type":"ContainerStarted","Data":"d6b57cf1cf81e913f26f613cde1b27a49043808447bd9ffc0369a189a3ef1cfb"}
Dec 03 07:10:57 crc kubenswrapper[4475]: I1203 07:10:57.550469 4475 generic.go:334] "Generic (PLEG): container finished" podID="d27e05a2-7d4c-4acb-b31a-084675221a73" containerID="d6b57cf1cf81e913f26f613cde1b27a49043808447bd9ffc0369a189a3ef1cfb" exitCode=0
Dec 03 07:10:57 crc kubenswrapper[4475]: I1203 07:10:57.550649 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-jlzfr" event={"ID":"d27e05a2-7d4c-4acb-b31a-084675221a73","Type":"ContainerDied","Data":"d6b57cf1cf81e913f26f613cde1b27a49043808447bd9ffc0369a189a3ef1cfb"}
Dec 03 07:10:58 crc kubenswrapper[4475]: I1203 07:10:58.559135 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-jlzfr" event={"ID":"d27e05a2-7d4c-4acb-b31a-084675221a73","Type":"ContainerStarted","Data":"f6c00d6bf2a4c2accdb5daa35d7cce5229b8c01e1426167fcd5132dfcf29fbde"}
Dec 03 07:10:58 crc kubenswrapper[4475]: I1203 07:10:58.574492 4475 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-jlzfr" podStartSLOduration=2.835343392 podStartE2EDuration="5.574475992s" podCreationTimestamp="2025-12-03 07:10:53 +0000 UTC" firstStartedPulling="2025-12-03 07:10:55.534110542 +0000 UTC m=+1540.339008876" lastFinishedPulling="2025-12-03 07:10:58.273243142 +0000 UTC m=+1543.078141476" observedRunningTime="2025-12-03 07:10:58.570248028 +0000 UTC m=+1543.375146362" watchObservedRunningTime="2025-12-03 07:10:58.574475992 +0000 UTC m=+1543.379374326"
Dec 03 07:11:04 crc kubenswrapper[4475]: I1203 07:11:04.052085 4475 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-jlzfr"
Dec 03 07:11:04 crc kubenswrapper[4475]: I1203 07:11:04.052446 4475 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-jlzfr"
Dec 03 07:11:04 crc kubenswrapper[4475]: I1203 07:11:04.084758 4475 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-jlzfr"
Dec 03 07:11:04 crc kubenswrapper[4475]: I1203 07:11:04.625521 4475 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-jlzfr"
Dec 03 07:11:04 crc kubenswrapper[4475]: I1203 07:11:04.667980 4475 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-jlzfr"]
Dec 03 07:11:06 crc kubenswrapper[4475]: I1203 07:11:06.603333 4475 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-jlzfr" podUID="d27e05a2-7d4c-4acb-b31a-084675221a73" containerName="registry-server" containerID="cri-o://f6c00d6bf2a4c2accdb5daa35d7cce5229b8c01e1426167fcd5132dfcf29fbde" gracePeriod=2
Dec 03 07:11:06 crc kubenswrapper[4475]: I1203 07:11:06.961162 4475 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-jlzfr"
Dec 03 07:11:07 crc kubenswrapper[4475]: I1203 07:11:07.068494 4475 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d27e05a2-7d4c-4acb-b31a-084675221a73-utilities\") pod \"d27e05a2-7d4c-4acb-b31a-084675221a73\" (UID: \"d27e05a2-7d4c-4acb-b31a-084675221a73\") "
Dec 03 07:11:07 crc kubenswrapper[4475]: I1203 07:11:07.068639 4475 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6vjth\" (UniqueName: \"kubernetes.io/projected/d27e05a2-7d4c-4acb-b31a-084675221a73-kube-api-access-6vjth\") pod \"d27e05a2-7d4c-4acb-b31a-084675221a73\" (UID: \"d27e05a2-7d4c-4acb-b31a-084675221a73\") "
Dec 03 07:11:07 crc kubenswrapper[4475]: I1203 07:11:07.068707 4475 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d27e05a2-7d4c-4acb-b31a-084675221a73-catalog-content\") pod \"d27e05a2-7d4c-4acb-b31a-084675221a73\" (UID: \"d27e05a2-7d4c-4acb-b31a-084675221a73\") "
Dec 03 07:11:07 crc kubenswrapper[4475]: I1203 07:11:07.069120 4475 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d27e05a2-7d4c-4acb-b31a-084675221a73-utilities" (OuterVolumeSpecName: "utilities") pod "d27e05a2-7d4c-4acb-b31a-084675221a73" (UID: "d27e05a2-7d4c-4acb-b31a-084675221a73"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Dec 03 07:11:07 crc kubenswrapper[4475]: I1203 07:11:07.072526 4475 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d27e05a2-7d4c-4acb-b31a-084675221a73-kube-api-access-6vjth" (OuterVolumeSpecName: "kube-api-access-6vjth") pod "d27e05a2-7d4c-4acb-b31a-084675221a73" (UID: "d27e05a2-7d4c-4acb-b31a-084675221a73"). InnerVolumeSpecName "kube-api-access-6vjth". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 03 07:11:07 crc kubenswrapper[4475]: I1203 07:11:07.114364 4475 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d27e05a2-7d4c-4acb-b31a-084675221a73-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "d27e05a2-7d4c-4acb-b31a-084675221a73" (UID: "d27e05a2-7d4c-4acb-b31a-084675221a73"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Dec 03 07:11:07 crc kubenswrapper[4475]: I1203 07:11:07.170786 4475 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d27e05a2-7d4c-4acb-b31a-084675221a73-utilities\") on node \"crc\" DevicePath \"\""
Dec 03 07:11:07 crc kubenswrapper[4475]: I1203 07:11:07.170816 4475 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6vjth\" (UniqueName: \"kubernetes.io/projected/d27e05a2-7d4c-4acb-b31a-084675221a73-kube-api-access-6vjth\") on node \"crc\" DevicePath \"\""
Dec 03 07:11:07 crc kubenswrapper[4475]: I1203 07:11:07.170829 4475 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d27e05a2-7d4c-4acb-b31a-084675221a73-catalog-content\") on node \"crc\" DevicePath \"\""
Dec 03 07:11:07 crc kubenswrapper[4475]: I1203 07:11:07.613148 4475 generic.go:334] "Generic (PLEG): container finished" podID="d27e05a2-7d4c-4acb-b31a-084675221a73" containerID="f6c00d6bf2a4c2accdb5daa35d7cce5229b8c01e1426167fcd5132dfcf29fbde" exitCode=0
Dec 03 07:11:07 crc kubenswrapper[4475]: I1203 07:11:07.613343 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-jlzfr" event={"ID":"d27e05a2-7d4c-4acb-b31a-084675221a73","Type":"ContainerDied","Data":"f6c00d6bf2a4c2accdb5daa35d7cce5229b8c01e1426167fcd5132dfcf29fbde"}
Dec 03 07:11:07 crc kubenswrapper[4475]: I1203 07:11:07.613381 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-jlzfr" event={"ID":"d27e05a2-7d4c-4acb-b31a-084675221a73","Type":"ContainerDied","Data":"2fe44d234206702f43afdeb290808ecf8a4f9cb5956e039f17a7dd7cff56383b"}
Dec 03 07:11:07 crc kubenswrapper[4475]: I1203 07:11:07.613402 4475 scope.go:117] "RemoveContainer" containerID="f6c00d6bf2a4c2accdb5daa35d7cce5229b8c01e1426167fcd5132dfcf29fbde"
Dec 03 07:11:07 crc kubenswrapper[4475]: I1203 07:11:07.613403 4475 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-jlzfr"
Dec 03 07:11:07 crc kubenswrapper[4475]: I1203 07:11:07.632619 4475 scope.go:117] "RemoveContainer" containerID="d6b57cf1cf81e913f26f613cde1b27a49043808447bd9ffc0369a189a3ef1cfb"
Dec 03 07:11:07 crc kubenswrapper[4475]: I1203 07:11:07.633021 4475 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-jlzfr"]
Dec 03 07:11:07 crc kubenswrapper[4475]: I1203 07:11:07.638496 4475 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-jlzfr"]
Dec 03 07:11:07 crc kubenswrapper[4475]: I1203 07:11:07.651068 4475 scope.go:117] "RemoveContainer" containerID="748d415c922151fb1deb71a073c65199494569d60f9c5de94ec9dc2a667b13e6"
Dec 03 07:11:07 crc kubenswrapper[4475]: I1203 07:11:07.676830 4475 scope.go:117] "RemoveContainer" containerID="f6c00d6bf2a4c2accdb5daa35d7cce5229b8c01e1426167fcd5132dfcf29fbde"
Dec 03 07:11:07 crc kubenswrapper[4475]: E1203 07:11:07.677124 4475 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f6c00d6bf2a4c2accdb5daa35d7cce5229b8c01e1426167fcd5132dfcf29fbde\": container with ID starting with f6c00d6bf2a4c2accdb5daa35d7cce5229b8c01e1426167fcd5132dfcf29fbde not found: ID does not exist" containerID="f6c00d6bf2a4c2accdb5daa35d7cce5229b8c01e1426167fcd5132dfcf29fbde"
Dec 03 07:11:07 crc kubenswrapper[4475]: I1203 07:11:07.677156 4475 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f6c00d6bf2a4c2accdb5daa35d7cce5229b8c01e1426167fcd5132dfcf29fbde"} err="failed to get container status \"f6c00d6bf2a4c2accdb5daa35d7cce5229b8c01e1426167fcd5132dfcf29fbde\": rpc error: code = NotFound desc = could not find container \"f6c00d6bf2a4c2accdb5daa35d7cce5229b8c01e1426167fcd5132dfcf29fbde\": container with ID starting with f6c00d6bf2a4c2accdb5daa35d7cce5229b8c01e1426167fcd5132dfcf29fbde not found: ID does not exist"
Dec 03 07:11:07 crc kubenswrapper[4475]: I1203 07:11:07.677174 4475 scope.go:117] "RemoveContainer" containerID="d6b57cf1cf81e913f26f613cde1b27a49043808447bd9ffc0369a189a3ef1cfb"
Dec 03 07:11:07 crc kubenswrapper[4475]: E1203 07:11:07.677418 4475 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d6b57cf1cf81e913f26f613cde1b27a49043808447bd9ffc0369a189a3ef1cfb\": container with ID starting with d6b57cf1cf81e913f26f613cde1b27a49043808447bd9ffc0369a189a3ef1cfb not found: ID does not exist" containerID="d6b57cf1cf81e913f26f613cde1b27a49043808447bd9ffc0369a189a3ef1cfb"
Dec 03 07:11:07 crc kubenswrapper[4475]: I1203 07:11:07.677439 4475 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d6b57cf1cf81e913f26f613cde1b27a49043808447bd9ffc0369a189a3ef1cfb"} err="failed to get container status \"d6b57cf1cf81e913f26f613cde1b27a49043808447bd9ffc0369a189a3ef1cfb\": rpc error: code = NotFound desc = could not find container \"d6b57cf1cf81e913f26f613cde1b27a49043808447bd9ffc0369a189a3ef1cfb\": container with ID starting with d6b57cf1cf81e913f26f613cde1b27a49043808447bd9ffc0369a189a3ef1cfb not found: ID does not exist"
Dec 03 07:11:07 crc kubenswrapper[4475]: I1203 07:11:07.677470 4475 scope.go:117] "RemoveContainer" containerID="748d415c922151fb1deb71a073c65199494569d60f9c5de94ec9dc2a667b13e6"
Dec 03 07:11:07 crc kubenswrapper[4475]: E1203 07:11:07.677774 4475 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"748d415c922151fb1deb71a073c65199494569d60f9c5de94ec9dc2a667b13e6\": container with ID starting with 748d415c922151fb1deb71a073c65199494569d60f9c5de94ec9dc2a667b13e6 not found: ID does not exist" containerID="748d415c922151fb1deb71a073c65199494569d60f9c5de94ec9dc2a667b13e6"
Dec 03 07:11:07 crc kubenswrapper[4475]: I1203 07:11:07.677809 4475 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"748d415c922151fb1deb71a073c65199494569d60f9c5de94ec9dc2a667b13e6"} err="failed to get container status \"748d415c922151fb1deb71a073c65199494569d60f9c5de94ec9dc2a667b13e6\": rpc error: code = NotFound desc = could not find container \"748d415c922151fb1deb71a073c65199494569d60f9c5de94ec9dc2a667b13e6\": container with ID starting with 748d415c922151fb1deb71a073c65199494569d60f9c5de94ec9dc2a667b13e6 not found: ID does not exist"
Dec 03 07:11:09 crc kubenswrapper[4475]: I1203 07:11:09.498563 4475 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d27e05a2-7d4c-4acb-b31a-084675221a73" path="/var/lib/kubelet/pods/d27e05a2-7d4c-4acb-b31a-084675221a73/volumes"
Dec 03 07:11:09 crc kubenswrapper[4475]: I1203 07:11:09.628478 4475 generic.go:334] "Generic (PLEG): container finished" podID="ef8827bc-648a-41fe-a322-bd2c85e0d535" containerID="ef420b60343afef57d2e6ae316142037f3753bbc511a1bbeec8d8cb22529a434" exitCode=0
Dec 03 07:11:09 crc kubenswrapper[4475]: I1203 07:11:09.628515 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-zq5dq" event={"ID":"ef8827bc-648a-41fe-a322-bd2c85e0d535","Type":"ContainerDied","Data":"ef420b60343afef57d2e6ae316142037f3753bbc511a1bbeec8d8cb22529a434"}
Dec 03 07:11:11 crc kubenswrapper[4475]: I1203 07:11:11.035881 4475 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-zq5dq"
Dec 03 07:11:11 crc kubenswrapper[4475]: I1203 07:11:11.228741 4475 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/ef8827bc-648a-41fe-a322-bd2c85e0d535-ssh-key\") pod \"ef8827bc-648a-41fe-a322-bd2c85e0d535\" (UID: \"ef8827bc-648a-41fe-a322-bd2c85e0d535\") "
Dec 03 07:11:11 crc kubenswrapper[4475]: I1203 07:11:11.228933 4475 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mrkdj\" (UniqueName: \"kubernetes.io/projected/ef8827bc-648a-41fe-a322-bd2c85e0d535-kube-api-access-mrkdj\") pod \"ef8827bc-648a-41fe-a322-bd2c85e0d535\" (UID: \"ef8827bc-648a-41fe-a322-bd2c85e0d535\") "
Dec 03 07:11:11 crc kubenswrapper[4475]: I1203 07:11:11.228999 4475 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/ef8827bc-648a-41fe-a322-bd2c85e0d535-inventory\") pod \"ef8827bc-648a-41fe-a322-bd2c85e0d535\" (UID: \"ef8827bc-648a-41fe-a322-bd2c85e0d535\") "
Dec 03 07:11:11 crc kubenswrapper[4475]: I1203 07:11:11.241537 4475 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ef8827bc-648a-41fe-a322-bd2c85e0d535-kube-api-access-mrkdj" (OuterVolumeSpecName: "kube-api-access-mrkdj") pod "ef8827bc-648a-41fe-a322-bd2c85e0d535" (UID: "ef8827bc-648a-41fe-a322-bd2c85e0d535"). InnerVolumeSpecName "kube-api-access-mrkdj". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 03 07:11:11 crc kubenswrapper[4475]: I1203 07:11:11.250539 4475 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ef8827bc-648a-41fe-a322-bd2c85e0d535-inventory" (OuterVolumeSpecName: "inventory") pod "ef8827bc-648a-41fe-a322-bd2c85e0d535" (UID: "ef8827bc-648a-41fe-a322-bd2c85e0d535"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 03 07:11:11 crc kubenswrapper[4475]: I1203 07:11:11.259406 4475 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ef8827bc-648a-41fe-a322-bd2c85e0d535-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "ef8827bc-648a-41fe-a322-bd2c85e0d535" (UID: "ef8827bc-648a-41fe-a322-bd2c85e0d535"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 03 07:11:11 crc kubenswrapper[4475]: I1203 07:11:11.331512 4475 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mrkdj\" (UniqueName: \"kubernetes.io/projected/ef8827bc-648a-41fe-a322-bd2c85e0d535-kube-api-access-mrkdj\") on node \"crc\" DevicePath \"\""
Dec 03 07:11:11 crc kubenswrapper[4475]: I1203 07:11:11.331546 4475 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/ef8827bc-648a-41fe-a322-bd2c85e0d535-inventory\") on node \"crc\" DevicePath \"\""
Dec 03 07:11:11 crc kubenswrapper[4475]: I1203 07:11:11.331557 4475 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/ef8827bc-648a-41fe-a322-bd2c85e0d535-ssh-key\") on node \"crc\" DevicePath \"\""
Dec 03 07:11:11 crc kubenswrapper[4475]: I1203 07:11:11.647916 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-zq5dq" event={"ID":"ef8827bc-648a-41fe-a322-bd2c85e0d535","Type":"ContainerDied","Data":"bd1671e719171ebe9632b55dc25e9ca6225897ee63123d42cbae420efa316dae"}
Dec 03 07:11:11 crc kubenswrapper[4475]: I1203 07:11:11.647953 4475 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="bd1671e719171ebe9632b55dc25e9ca6225897ee63123d42cbae420efa316dae"
Dec 03 07:11:11 crc kubenswrapper[4475]: I1203 07:11:11.648008 4475 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-zq5dq"
Dec 03 07:11:11 crc kubenswrapper[4475]: I1203 07:11:11.711996 4475 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/configure-network-edpm-deployment-openstack-edpm-ipam-7wl2c"]
Dec 03 07:11:11 crc kubenswrapper[4475]: E1203 07:11:11.712326 4475 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="facc1808-e75e-4c5e-917b-54ed5965a68c" containerName="registry-server"
Dec 03 07:11:11 crc kubenswrapper[4475]: I1203 07:11:11.712343 4475 state_mem.go:107] "Deleted CPUSet assignment" podUID="facc1808-e75e-4c5e-917b-54ed5965a68c" containerName="registry-server"
Dec 03 07:11:11 crc kubenswrapper[4475]: E1203 07:11:11.712362 4475 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ef8827bc-648a-41fe-a322-bd2c85e0d535" containerName="download-cache-edpm-deployment-openstack-edpm-ipam"
Dec 03 07:11:11 crc kubenswrapper[4475]: I1203 07:11:11.712368 4475 state_mem.go:107] "Deleted CPUSet assignment" podUID="ef8827bc-648a-41fe-a322-bd2c85e0d535" containerName="download-cache-edpm-deployment-openstack-edpm-ipam"
Dec 03 07:11:11 crc kubenswrapper[4475]: E1203 07:11:11.712391 4475 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="facc1808-e75e-4c5e-917b-54ed5965a68c" containerName="extract-content"
Dec 03 07:11:11 crc kubenswrapper[4475]: I1203 07:11:11.712396 4475 state_mem.go:107] "Deleted CPUSet assignment" podUID="facc1808-e75e-4c5e-917b-54ed5965a68c" containerName="extract-content"
Dec 03 07:11:11 crc kubenswrapper[4475]: E1203 07:11:11.712410 4475 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d27e05a2-7d4c-4acb-b31a-084675221a73" containerName="extract-utilities"
Dec 03 07:11:11 crc kubenswrapper[4475]: I1203 07:11:11.712417 4475 state_mem.go:107] "Deleted CPUSet assignment" podUID="d27e05a2-7d4c-4acb-b31a-084675221a73" containerName="extract-utilities"
Dec 03 07:11:11 crc kubenswrapper[4475]: E1203 07:11:11.712430 4475 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="facc1808-e75e-4c5e-917b-54ed5965a68c" containerName="extract-utilities"
Dec 03 07:11:11 crc kubenswrapper[4475]: I1203 07:11:11.712435 4475 state_mem.go:107] "Deleted CPUSet assignment" podUID="facc1808-e75e-4c5e-917b-54ed5965a68c" containerName="extract-utilities"
Dec 03 07:11:11 crc kubenswrapper[4475]: E1203 07:11:11.712445 4475 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d27e05a2-7d4c-4acb-b31a-084675221a73" containerName="registry-server"
Dec 03 07:11:11 crc kubenswrapper[4475]: I1203 07:11:11.712466 4475 state_mem.go:107] "Deleted CPUSet assignment" podUID="d27e05a2-7d4c-4acb-b31a-084675221a73" containerName="registry-server"
Dec 03 07:11:11 crc kubenswrapper[4475]: E1203 07:11:11.712476 4475 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d27e05a2-7d4c-4acb-b31a-084675221a73" containerName="extract-content"
Dec 03 07:11:11 crc kubenswrapper[4475]: I1203 07:11:11.712481 4475 state_mem.go:107] "Deleted CPUSet assignment" podUID="d27e05a2-7d4c-4acb-b31a-084675221a73" containerName="extract-content"
Dec 03 07:11:11 crc kubenswrapper[4475]: I1203 07:11:11.712639 4475 memory_manager.go:354] "RemoveStaleState removing state" podUID="ef8827bc-648a-41fe-a322-bd2c85e0d535" containerName="download-cache-edpm-deployment-openstack-edpm-ipam"
Dec 03 07:11:11 crc kubenswrapper[4475]: I1203 07:11:11.712664 4475 memory_manager.go:354] "RemoveStaleState removing state" podUID="d27e05a2-7d4c-4acb-b31a-084675221a73" containerName="registry-server"
Dec 03 07:11:11 crc kubenswrapper[4475]: I1203 07:11:11.712675 4475 memory_manager.go:354] "RemoveStaleState removing state" podUID="facc1808-e75e-4c5e-917b-54ed5965a68c" containerName="registry-server"
Dec 03 07:11:11 crc kubenswrapper[4475]: I1203 07:11:11.713206 4475 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-7wl2c"
Dec 03 07:11:11 crc kubenswrapper[4475]: I1203 07:11:11.717699 4475 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env"
Dec 03 07:11:11 crc kubenswrapper[4475]: I1203 07:11:11.717728 4475 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret"
Dec 03 07:11:11 crc kubenswrapper[4475]: I1203 07:11:11.719263 4475 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-network-edpm-deployment-openstack-edpm-ipam-7wl2c"]
Dec 03 07:11:11 crc kubenswrapper[4475]: I1203 07:11:11.719583 4475 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-gxv6j"
Dec 03 07:11:11 crc kubenswrapper[4475]: I1203 07:11:11.719685 4475 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam"
Dec 03 07:11:11 crc kubenswrapper[4475]: I1203 07:11:11.840989 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/19742c79-eb50-4882-b54f-25b8de1c9131-ssh-key\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-7wl2c\" (UID: \"19742c79-eb50-4882-b54f-25b8de1c9131\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-7wl2c"
Dec 03 07:11:11 crc kubenswrapper[4475]: I1203 07:11:11.841119 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/19742c79-eb50-4882-b54f-25b8de1c9131-inventory\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-7wl2c\" (UID: \"19742c79-eb50-4882-b54f-25b8de1c9131\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-7wl2c"
Dec 03 07:11:11 crc kubenswrapper[4475]: I1203 07:11:11.841304 4475
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5p5fj\" (UniqueName: \"kubernetes.io/projected/19742c79-eb50-4882-b54f-25b8de1c9131-kube-api-access-5p5fj\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-7wl2c\" (UID: \"19742c79-eb50-4882-b54f-25b8de1c9131\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-7wl2c" Dec 03 07:11:11 crc kubenswrapper[4475]: I1203 07:11:11.942179 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5p5fj\" (UniqueName: \"kubernetes.io/projected/19742c79-eb50-4882-b54f-25b8de1c9131-kube-api-access-5p5fj\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-7wl2c\" (UID: \"19742c79-eb50-4882-b54f-25b8de1c9131\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-7wl2c" Dec 03 07:11:11 crc kubenswrapper[4475]: I1203 07:11:11.942584 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/19742c79-eb50-4882-b54f-25b8de1c9131-ssh-key\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-7wl2c\" (UID: \"19742c79-eb50-4882-b54f-25b8de1c9131\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-7wl2c" Dec 03 07:11:11 crc kubenswrapper[4475]: I1203 07:11:11.942674 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/19742c79-eb50-4882-b54f-25b8de1c9131-inventory\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-7wl2c\" (UID: \"19742c79-eb50-4882-b54f-25b8de1c9131\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-7wl2c" Dec 03 07:11:11 crc kubenswrapper[4475]: I1203 07:11:11.946547 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/19742c79-eb50-4882-b54f-25b8de1c9131-inventory\") pod 
\"configure-network-edpm-deployment-openstack-edpm-ipam-7wl2c\" (UID: \"19742c79-eb50-4882-b54f-25b8de1c9131\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-7wl2c" Dec 03 07:11:11 crc kubenswrapper[4475]: I1203 07:11:11.948147 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/19742c79-eb50-4882-b54f-25b8de1c9131-ssh-key\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-7wl2c\" (UID: \"19742c79-eb50-4882-b54f-25b8de1c9131\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-7wl2c" Dec 03 07:11:11 crc kubenswrapper[4475]: I1203 07:11:11.955951 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5p5fj\" (UniqueName: \"kubernetes.io/projected/19742c79-eb50-4882-b54f-25b8de1c9131-kube-api-access-5p5fj\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-7wl2c\" (UID: \"19742c79-eb50-4882-b54f-25b8de1c9131\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-7wl2c" Dec 03 07:11:12 crc kubenswrapper[4475]: I1203 07:11:12.025784 4475 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-7wl2c" Dec 03 07:11:12 crc kubenswrapper[4475]: I1203 07:11:12.451164 4475 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-network-edpm-deployment-openstack-edpm-ipam-7wl2c"] Dec 03 07:11:12 crc kubenswrapper[4475]: I1203 07:11:12.654994 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-7wl2c" event={"ID":"19742c79-eb50-4882-b54f-25b8de1c9131","Type":"ContainerStarted","Data":"e42c6f4555026663a3f6f9e621cb93f498694b6378f058d8176a1a870fb5043c"} Dec 03 07:11:13 crc kubenswrapper[4475]: I1203 07:11:13.661556 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-7wl2c" event={"ID":"19742c79-eb50-4882-b54f-25b8de1c9131","Type":"ContainerStarted","Data":"5753d5dfac89aa37d6446dbbf3456fcfdfbe9d56d499ffffdffd2cf4a45e16b1"} Dec 03 07:11:13 crc kubenswrapper[4475]: I1203 07:11:13.674437 4475 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-7wl2c" podStartSLOduration=2.159288647 podStartE2EDuration="2.674424196s" podCreationTimestamp="2025-12-03 07:11:11 +0000 UTC" firstStartedPulling="2025-12-03 07:11:12.45887325 +0000 UTC m=+1557.263771584" lastFinishedPulling="2025-12-03 07:11:12.974008799 +0000 UTC m=+1557.778907133" observedRunningTime="2025-12-03 07:11:13.671545889 +0000 UTC m=+1558.476444223" watchObservedRunningTime="2025-12-03 07:11:13.674424196 +0000 UTC m=+1558.479322530" Dec 03 07:11:28 crc kubenswrapper[4475]: I1203 07:11:28.102102 4475 scope.go:117] "RemoveContainer" containerID="33e25a10b7c252046437cebe3572fd74821f908db6872722174c5b44aedb9992" Dec 03 07:11:28 crc kubenswrapper[4475]: I1203 07:11:28.121231 4475 scope.go:117] "RemoveContainer" 
containerID="4c9e9074106173f250a545c8cbd9265aa037db239d83cf6168e0bcc58ac22e8b" Dec 03 07:11:28 crc kubenswrapper[4475]: I1203 07:11:28.155631 4475 scope.go:117] "RemoveContainer" containerID="f02bdef2568fe1d00657c29fe666cc8ff0d786e5650b0c96bef80a7cb8b0c4ca" Dec 03 07:11:28 crc kubenswrapper[4475]: I1203 07:11:28.186756 4475 scope.go:117] "RemoveContainer" containerID="9aaaa0307bdfcafb53ad99cb2f2a47626244bec1003af7412e10a590a9b7103d" Dec 03 07:11:28 crc kubenswrapper[4475]: I1203 07:11:28.219172 4475 scope.go:117] "RemoveContainer" containerID="74038a224524ed26b69b79829df0410aa86f857470a3f1410e97408d05a1f23c" Dec 03 07:11:31 crc kubenswrapper[4475]: I1203 07:11:31.756161 4475 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-5dxvt"] Dec 03 07:11:31 crc kubenswrapper[4475]: I1203 07:11:31.757981 4475 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-5dxvt" Dec 03 07:11:31 crc kubenswrapper[4475]: I1203 07:11:31.762403 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/76b8febd-26a9-4c40-a778-8668d7cbeca7-utilities\") pod \"redhat-operators-5dxvt\" (UID: \"76b8febd-26a9-4c40-a778-8668d7cbeca7\") " pod="openshift-marketplace/redhat-operators-5dxvt" Dec 03 07:11:31 crc kubenswrapper[4475]: I1203 07:11:31.762533 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lvd46\" (UniqueName: \"kubernetes.io/projected/76b8febd-26a9-4c40-a778-8668d7cbeca7-kube-api-access-lvd46\") pod \"redhat-operators-5dxvt\" (UID: \"76b8febd-26a9-4c40-a778-8668d7cbeca7\") " pod="openshift-marketplace/redhat-operators-5dxvt" Dec 03 07:11:31 crc kubenswrapper[4475]: I1203 07:11:31.762621 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: 
\"kubernetes.io/empty-dir/76b8febd-26a9-4c40-a778-8668d7cbeca7-catalog-content\") pod \"redhat-operators-5dxvt\" (UID: \"76b8febd-26a9-4c40-a778-8668d7cbeca7\") " pod="openshift-marketplace/redhat-operators-5dxvt" Dec 03 07:11:31 crc kubenswrapper[4475]: I1203 07:11:31.763423 4475 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-5dxvt"] Dec 03 07:11:31 crc kubenswrapper[4475]: I1203 07:11:31.864312 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/76b8febd-26a9-4c40-a778-8668d7cbeca7-utilities\") pod \"redhat-operators-5dxvt\" (UID: \"76b8febd-26a9-4c40-a778-8668d7cbeca7\") " pod="openshift-marketplace/redhat-operators-5dxvt" Dec 03 07:11:31 crc kubenswrapper[4475]: I1203 07:11:31.864403 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lvd46\" (UniqueName: \"kubernetes.io/projected/76b8febd-26a9-4c40-a778-8668d7cbeca7-kube-api-access-lvd46\") pod \"redhat-operators-5dxvt\" (UID: \"76b8febd-26a9-4c40-a778-8668d7cbeca7\") " pod="openshift-marketplace/redhat-operators-5dxvt" Dec 03 07:11:31 crc kubenswrapper[4475]: I1203 07:11:31.864493 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/76b8febd-26a9-4c40-a778-8668d7cbeca7-catalog-content\") pod \"redhat-operators-5dxvt\" (UID: \"76b8febd-26a9-4c40-a778-8668d7cbeca7\") " pod="openshift-marketplace/redhat-operators-5dxvt" Dec 03 07:11:31 crc kubenswrapper[4475]: I1203 07:11:31.864828 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/76b8febd-26a9-4c40-a778-8668d7cbeca7-utilities\") pod \"redhat-operators-5dxvt\" (UID: \"76b8febd-26a9-4c40-a778-8668d7cbeca7\") " pod="openshift-marketplace/redhat-operators-5dxvt" Dec 03 07:11:31 crc kubenswrapper[4475]: I1203 07:11:31.864862 4475 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/76b8febd-26a9-4c40-a778-8668d7cbeca7-catalog-content\") pod \"redhat-operators-5dxvt\" (UID: \"76b8febd-26a9-4c40-a778-8668d7cbeca7\") " pod="openshift-marketplace/redhat-operators-5dxvt" Dec 03 07:11:31 crc kubenswrapper[4475]: I1203 07:11:31.879548 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lvd46\" (UniqueName: \"kubernetes.io/projected/76b8febd-26a9-4c40-a778-8668d7cbeca7-kube-api-access-lvd46\") pod \"redhat-operators-5dxvt\" (UID: \"76b8febd-26a9-4c40-a778-8668d7cbeca7\") " pod="openshift-marketplace/redhat-operators-5dxvt" Dec 03 07:11:32 crc kubenswrapper[4475]: I1203 07:11:32.038499 4475 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-db-create-tbtn5"] Dec 03 07:11:32 crc kubenswrapper[4475]: I1203 07:11:32.045610 4475 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-db-create-t8lwx"] Dec 03 07:11:32 crc kubenswrapper[4475]: I1203 07:11:32.054723 4475 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-db-create-l6t9x"] Dec 03 07:11:32 crc kubenswrapper[4475]: I1203 07:11:32.061344 4475 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-6ad2-account-create-update-phxm5"] Dec 03 07:11:32 crc kubenswrapper[4475]: I1203 07:11:32.071914 4475 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-7af4-account-create-update-mpdn4"] Dec 03 07:11:32 crc kubenswrapper[4475]: I1203 07:11:32.072398 4475 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-5dxvt" Dec 03 07:11:32 crc kubenswrapper[4475]: I1203 07:11:32.091883 4475 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-db-create-tbtn5"] Dec 03 07:11:32 crc kubenswrapper[4475]: I1203 07:11:32.098138 4475 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-4b91-account-create-update-rv6lt"] Dec 03 07:11:32 crc kubenswrapper[4475]: I1203 07:11:32.107494 4475 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-db-create-l6t9x"] Dec 03 07:11:32 crc kubenswrapper[4475]: I1203 07:11:32.113559 4475 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-db-create-t8lwx"] Dec 03 07:11:32 crc kubenswrapper[4475]: I1203 07:11:32.118825 4475 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-4b91-account-create-update-rv6lt"] Dec 03 07:11:32 crc kubenswrapper[4475]: I1203 07:11:32.125295 4475 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-7af4-account-create-update-mpdn4"] Dec 03 07:11:32 crc kubenswrapper[4475]: I1203 07:11:32.130137 4475 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-6ad2-account-create-update-phxm5"] Dec 03 07:11:32 crc kubenswrapper[4475]: I1203 07:11:32.527440 4475 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-5dxvt"] Dec 03 07:11:32 crc kubenswrapper[4475]: W1203 07:11:32.530322 4475 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod76b8febd_26a9_4c40_a778_8668d7cbeca7.slice/crio-908856cadca580bffaea42c4e93ea2701de0834aa007e8fbe960bb553cae88d6 WatchSource:0}: Error finding container 908856cadca580bffaea42c4e93ea2701de0834aa007e8fbe960bb553cae88d6: Status 404 returned error can't find the container with id 908856cadca580bffaea42c4e93ea2701de0834aa007e8fbe960bb553cae88d6 Dec 03 07:11:32 
crc kubenswrapper[4475]: I1203 07:11:32.772738 4475 generic.go:334] "Generic (PLEG): container finished" podID="76b8febd-26a9-4c40-a778-8668d7cbeca7" containerID="6ba6525da460b349eabc13f9c0228909e612efb0700e2c67491c47404b2f2989" exitCode=0 Dec 03 07:11:32 crc kubenswrapper[4475]: I1203 07:11:32.772832 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-5dxvt" event={"ID":"76b8febd-26a9-4c40-a778-8668d7cbeca7","Type":"ContainerDied","Data":"6ba6525da460b349eabc13f9c0228909e612efb0700e2c67491c47404b2f2989"} Dec 03 07:11:32 crc kubenswrapper[4475]: I1203 07:11:32.772937 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-5dxvt" event={"ID":"76b8febd-26a9-4c40-a778-8668d7cbeca7","Type":"ContainerStarted","Data":"908856cadca580bffaea42c4e93ea2701de0834aa007e8fbe960bb553cae88d6"} Dec 03 07:11:33 crc kubenswrapper[4475]: I1203 07:11:33.499111 4475 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="161d281f-1845-400c-bbda-691d6681cc69" path="/var/lib/kubelet/pods/161d281f-1845-400c-bbda-691d6681cc69/volumes" Dec 03 07:11:33 crc kubenswrapper[4475]: I1203 07:11:33.499859 4475 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="37cea503-9fad-48b6-9ec0-b8957e5420f1" path="/var/lib/kubelet/pods/37cea503-9fad-48b6-9ec0-b8957e5420f1/volumes" Dec 03 07:11:33 crc kubenswrapper[4475]: I1203 07:11:33.500854 4475 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4ad132bf-e35e-48e6-b406-0aaca969a684" path="/var/lib/kubelet/pods/4ad132bf-e35e-48e6-b406-0aaca969a684/volumes" Dec 03 07:11:33 crc kubenswrapper[4475]: I1203 07:11:33.501409 4475 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d5b7dc01-eee2-4b3a-b834-3092016603d3" path="/var/lib/kubelet/pods/d5b7dc01-eee2-4b3a-b834-3092016603d3/volumes" Dec 03 07:11:33 crc kubenswrapper[4475]: I1203 07:11:33.502897 4475 kubelet_volumes.go:163] "Cleaned up orphaned pod 
volumes dir" podUID="ece011ea-6da3-49ca-8dd8-b014f2796157" path="/var/lib/kubelet/pods/ece011ea-6da3-49ca-8dd8-b014f2796157/volumes" Dec 03 07:11:33 crc kubenswrapper[4475]: I1203 07:11:33.504148 4475 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fae26675-9566-4902-af26-0247c3f6164b" path="/var/lib/kubelet/pods/fae26675-9566-4902-af26-0247c3f6164b/volumes" Dec 03 07:11:33 crc kubenswrapper[4475]: I1203 07:11:33.780532 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-5dxvt" event={"ID":"76b8febd-26a9-4c40-a778-8668d7cbeca7","Type":"ContainerStarted","Data":"c270802f519fa6993a981491ab881cc3ffdda596c2172aa44b2d5cd1ba1f59f6"} Dec 03 07:11:35 crc kubenswrapper[4475]: I1203 07:11:35.794024 4475 generic.go:334] "Generic (PLEG): container finished" podID="76b8febd-26a9-4c40-a778-8668d7cbeca7" containerID="c270802f519fa6993a981491ab881cc3ffdda596c2172aa44b2d5cd1ba1f59f6" exitCode=0 Dec 03 07:11:35 crc kubenswrapper[4475]: I1203 07:11:35.794096 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-5dxvt" event={"ID":"76b8febd-26a9-4c40-a778-8668d7cbeca7","Type":"ContainerDied","Data":"c270802f519fa6993a981491ab881cc3ffdda596c2172aa44b2d5cd1ba1f59f6"} Dec 03 07:11:36 crc kubenswrapper[4475]: I1203 07:11:36.802702 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-5dxvt" event={"ID":"76b8febd-26a9-4c40-a778-8668d7cbeca7","Type":"ContainerStarted","Data":"9599a21a85c75a45be8c7281d95473a0ad3982b39aad68e2ab10da2043c19584"} Dec 03 07:11:36 crc kubenswrapper[4475]: I1203 07:11:36.818740 4475 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-5dxvt" podStartSLOduration=2.324586022 podStartE2EDuration="5.818730215s" podCreationTimestamp="2025-12-03 07:11:31 +0000 UTC" firstStartedPulling="2025-12-03 07:11:32.774297075 +0000 UTC m=+1577.579195409" 
lastFinishedPulling="2025-12-03 07:11:36.268441268 +0000 UTC m=+1581.073339602" observedRunningTime="2025-12-03 07:11:36.813409951 +0000 UTC m=+1581.618308285" watchObservedRunningTime="2025-12-03 07:11:36.818730215 +0000 UTC m=+1581.623628549" Dec 03 07:11:42 crc kubenswrapper[4475]: I1203 07:11:42.072966 4475 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-5dxvt" Dec 03 07:11:42 crc kubenswrapper[4475]: I1203 07:11:42.073308 4475 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-5dxvt" Dec 03 07:11:43 crc kubenswrapper[4475]: I1203 07:11:43.103506 4475 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-5dxvt" podUID="76b8febd-26a9-4c40-a778-8668d7cbeca7" containerName="registry-server" probeResult="failure" output=< Dec 03 07:11:43 crc kubenswrapper[4475]: timeout: failed to connect service ":50051" within 1s Dec 03 07:11:43 crc kubenswrapper[4475]: > Dec 03 07:11:52 crc kubenswrapper[4475]: I1203 07:11:52.107368 4475 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-5dxvt" Dec 03 07:11:52 crc kubenswrapper[4475]: I1203 07:11:52.141984 4475 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-5dxvt" Dec 03 07:11:52 crc kubenswrapper[4475]: I1203 07:11:52.334082 4475 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-5dxvt"] Dec 03 07:11:53 crc kubenswrapper[4475]: I1203 07:11:53.911895 4475 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-5dxvt" podUID="76b8febd-26a9-4c40-a778-8668d7cbeca7" containerName="registry-server" containerID="cri-o://9599a21a85c75a45be8c7281d95473a0ad3982b39aad68e2ab10da2043c19584" gracePeriod=2 Dec 03 07:11:54 crc kubenswrapper[4475]: I1203 
07:11:54.277321 4475 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-5dxvt" Dec 03 07:11:54 crc kubenswrapper[4475]: I1203 07:11:54.425313 4475 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/76b8febd-26a9-4c40-a778-8668d7cbeca7-catalog-content\") pod \"76b8febd-26a9-4c40-a778-8668d7cbeca7\" (UID: \"76b8febd-26a9-4c40-a778-8668d7cbeca7\") " Dec 03 07:11:54 crc kubenswrapper[4475]: I1203 07:11:54.425400 4475 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/76b8febd-26a9-4c40-a778-8668d7cbeca7-utilities\") pod \"76b8febd-26a9-4c40-a778-8668d7cbeca7\" (UID: \"76b8febd-26a9-4c40-a778-8668d7cbeca7\") " Dec 03 07:11:54 crc kubenswrapper[4475]: I1203 07:11:54.425688 4475 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lvd46\" (UniqueName: \"kubernetes.io/projected/76b8febd-26a9-4c40-a778-8668d7cbeca7-kube-api-access-lvd46\") pod \"76b8febd-26a9-4c40-a778-8668d7cbeca7\" (UID: \"76b8febd-26a9-4c40-a778-8668d7cbeca7\") " Dec 03 07:11:54 crc kubenswrapper[4475]: I1203 07:11:54.426166 4475 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/76b8febd-26a9-4c40-a778-8668d7cbeca7-utilities" (OuterVolumeSpecName: "utilities") pod "76b8febd-26a9-4c40-a778-8668d7cbeca7" (UID: "76b8febd-26a9-4c40-a778-8668d7cbeca7"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 07:11:54 crc kubenswrapper[4475]: I1203 07:11:54.431544 4475 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/76b8febd-26a9-4c40-a778-8668d7cbeca7-kube-api-access-lvd46" (OuterVolumeSpecName: "kube-api-access-lvd46") pod "76b8febd-26a9-4c40-a778-8668d7cbeca7" (UID: "76b8febd-26a9-4c40-a778-8668d7cbeca7"). InnerVolumeSpecName "kube-api-access-lvd46". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 07:11:54 crc kubenswrapper[4475]: I1203 07:11:54.511727 4475 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/76b8febd-26a9-4c40-a778-8668d7cbeca7-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "76b8febd-26a9-4c40-a778-8668d7cbeca7" (UID: "76b8febd-26a9-4c40-a778-8668d7cbeca7"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 07:11:54 crc kubenswrapper[4475]: I1203 07:11:54.527910 4475 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lvd46\" (UniqueName: \"kubernetes.io/projected/76b8febd-26a9-4c40-a778-8668d7cbeca7-kube-api-access-lvd46\") on node \"crc\" DevicePath \"\"" Dec 03 07:11:54 crc kubenswrapper[4475]: I1203 07:11:54.527953 4475 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/76b8febd-26a9-4c40-a778-8668d7cbeca7-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 03 07:11:54 crc kubenswrapper[4475]: I1203 07:11:54.527962 4475 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/76b8febd-26a9-4c40-a778-8668d7cbeca7-utilities\") on node \"crc\" DevicePath \"\"" Dec 03 07:11:54 crc kubenswrapper[4475]: I1203 07:11:54.919791 4475 generic.go:334] "Generic (PLEG): container finished" podID="76b8febd-26a9-4c40-a778-8668d7cbeca7" 
containerID="9599a21a85c75a45be8c7281d95473a0ad3982b39aad68e2ab10da2043c19584" exitCode=0 Dec 03 07:11:54 crc kubenswrapper[4475]: I1203 07:11:54.919830 4475 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-5dxvt" Dec 03 07:11:54 crc kubenswrapper[4475]: I1203 07:11:54.919842 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-5dxvt" event={"ID":"76b8febd-26a9-4c40-a778-8668d7cbeca7","Type":"ContainerDied","Data":"9599a21a85c75a45be8c7281d95473a0ad3982b39aad68e2ab10da2043c19584"} Dec 03 07:11:54 crc kubenswrapper[4475]: I1203 07:11:54.919872 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-5dxvt" event={"ID":"76b8febd-26a9-4c40-a778-8668d7cbeca7","Type":"ContainerDied","Data":"908856cadca580bffaea42c4e93ea2701de0834aa007e8fbe960bb553cae88d6"} Dec 03 07:11:54 crc kubenswrapper[4475]: I1203 07:11:54.919888 4475 scope.go:117] "RemoveContainer" containerID="9599a21a85c75a45be8c7281d95473a0ad3982b39aad68e2ab10da2043c19584" Dec 03 07:11:54 crc kubenswrapper[4475]: I1203 07:11:54.938604 4475 scope.go:117] "RemoveContainer" containerID="c270802f519fa6993a981491ab881cc3ffdda596c2172aa44b2d5cd1ba1f59f6" Dec 03 07:11:54 crc kubenswrapper[4475]: I1203 07:11:54.945757 4475 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-5dxvt"] Dec 03 07:11:54 crc kubenswrapper[4475]: I1203 07:11:54.952365 4475 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-5dxvt"] Dec 03 07:11:54 crc kubenswrapper[4475]: I1203 07:11:54.961050 4475 scope.go:117] "RemoveContainer" containerID="6ba6525da460b349eabc13f9c0228909e612efb0700e2c67491c47404b2f2989" Dec 03 07:11:54 crc kubenswrapper[4475]: I1203 07:11:54.988073 4475 scope.go:117] "RemoveContainer" containerID="9599a21a85c75a45be8c7281d95473a0ad3982b39aad68e2ab10da2043c19584" Dec 03 07:11:54 crc 
kubenswrapper[4475]: E1203 07:11:54.988602 4475 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9599a21a85c75a45be8c7281d95473a0ad3982b39aad68e2ab10da2043c19584\": container with ID starting with 9599a21a85c75a45be8c7281d95473a0ad3982b39aad68e2ab10da2043c19584 not found: ID does not exist" containerID="9599a21a85c75a45be8c7281d95473a0ad3982b39aad68e2ab10da2043c19584" Dec 03 07:11:54 crc kubenswrapper[4475]: I1203 07:11:54.988634 4475 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9599a21a85c75a45be8c7281d95473a0ad3982b39aad68e2ab10da2043c19584"} err="failed to get container status \"9599a21a85c75a45be8c7281d95473a0ad3982b39aad68e2ab10da2043c19584\": rpc error: code = NotFound desc = could not find container \"9599a21a85c75a45be8c7281d95473a0ad3982b39aad68e2ab10da2043c19584\": container with ID starting with 9599a21a85c75a45be8c7281d95473a0ad3982b39aad68e2ab10da2043c19584 not found: ID does not exist" Dec 03 07:11:54 crc kubenswrapper[4475]: I1203 07:11:54.988658 4475 scope.go:117] "RemoveContainer" containerID="c270802f519fa6993a981491ab881cc3ffdda596c2172aa44b2d5cd1ba1f59f6" Dec 03 07:11:54 crc kubenswrapper[4475]: E1203 07:11:54.989029 4475 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c270802f519fa6993a981491ab881cc3ffdda596c2172aa44b2d5cd1ba1f59f6\": container with ID starting with c270802f519fa6993a981491ab881cc3ffdda596c2172aa44b2d5cd1ba1f59f6 not found: ID does not exist" containerID="c270802f519fa6993a981491ab881cc3ffdda596c2172aa44b2d5cd1ba1f59f6" Dec 03 07:11:54 crc kubenswrapper[4475]: I1203 07:11:54.989768 4475 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c270802f519fa6993a981491ab881cc3ffdda596c2172aa44b2d5cd1ba1f59f6"} err="failed to get container status 
\"c270802f519fa6993a981491ab881cc3ffdda596c2172aa44b2d5cd1ba1f59f6\": rpc error: code = NotFound desc = could not find container \"c270802f519fa6993a981491ab881cc3ffdda596c2172aa44b2d5cd1ba1f59f6\": container with ID starting with c270802f519fa6993a981491ab881cc3ffdda596c2172aa44b2d5cd1ba1f59f6 not found: ID does not exist" Dec 03 07:11:54 crc kubenswrapper[4475]: I1203 07:11:54.989800 4475 scope.go:117] "RemoveContainer" containerID="6ba6525da460b349eabc13f9c0228909e612efb0700e2c67491c47404b2f2989" Dec 03 07:11:54 crc kubenswrapper[4475]: E1203 07:11:54.990017 4475 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6ba6525da460b349eabc13f9c0228909e612efb0700e2c67491c47404b2f2989\": container with ID starting with 6ba6525da460b349eabc13f9c0228909e612efb0700e2c67491c47404b2f2989 not found: ID does not exist" containerID="6ba6525da460b349eabc13f9c0228909e612efb0700e2c67491c47404b2f2989" Dec 03 07:11:54 crc kubenswrapper[4475]: I1203 07:11:54.990040 4475 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6ba6525da460b349eabc13f9c0228909e612efb0700e2c67491c47404b2f2989"} err="failed to get container status \"6ba6525da460b349eabc13f9c0228909e612efb0700e2c67491c47404b2f2989\": rpc error: code = NotFound desc = could not find container \"6ba6525da460b349eabc13f9c0228909e612efb0700e2c67491c47404b2f2989\": container with ID starting with 6ba6525da460b349eabc13f9c0228909e612efb0700e2c67491c47404b2f2989 not found: ID does not exist" Dec 03 07:11:55 crc kubenswrapper[4475]: I1203 07:11:55.502817 4475 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="76b8febd-26a9-4c40-a778-8668d7cbeca7" path="/var/lib/kubelet/pods/76b8febd-26a9-4c40-a778-8668d7cbeca7/volumes" Dec 03 07:12:00 crc kubenswrapper[4475]: I1203 07:12:00.032885 4475 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-m4crf"] Dec 03 07:12:00 
crc kubenswrapper[4475]: I1203 07:12:00.038465 4475 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-m4crf"] Dec 03 07:12:01 crc kubenswrapper[4475]: I1203 07:12:01.499362 4475 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4afae9fe-ad3f-48d5-a095-9474568f956c" path="/var/lib/kubelet/pods/4afae9fe-ad3f-48d5-a095-9474568f956c/volumes" Dec 03 07:12:13 crc kubenswrapper[4475]: I1203 07:12:13.028442 4475 generic.go:334] "Generic (PLEG): container finished" podID="19742c79-eb50-4882-b54f-25b8de1c9131" containerID="5753d5dfac89aa37d6446dbbf3456fcfdfbe9d56d499ffffdffd2cf4a45e16b1" exitCode=0 Dec 03 07:12:13 crc kubenswrapper[4475]: I1203 07:12:13.028529 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-7wl2c" event={"ID":"19742c79-eb50-4882-b54f-25b8de1c9131","Type":"ContainerDied","Data":"5753d5dfac89aa37d6446dbbf3456fcfdfbe9d56d499ffffdffd2cf4a45e16b1"} Dec 03 07:12:14 crc kubenswrapper[4475]: I1203 07:12:14.348718 4475 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-7wl2c" Dec 03 07:12:14 crc kubenswrapper[4475]: I1203 07:12:14.528655 4475 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/19742c79-eb50-4882-b54f-25b8de1c9131-inventory\") pod \"19742c79-eb50-4882-b54f-25b8de1c9131\" (UID: \"19742c79-eb50-4882-b54f-25b8de1c9131\") " Dec 03 07:12:14 crc kubenswrapper[4475]: I1203 07:12:14.528710 4475 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/19742c79-eb50-4882-b54f-25b8de1c9131-ssh-key\") pod \"19742c79-eb50-4882-b54f-25b8de1c9131\" (UID: \"19742c79-eb50-4882-b54f-25b8de1c9131\") " Dec 03 07:12:14 crc kubenswrapper[4475]: I1203 07:12:14.528859 4475 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5p5fj\" (UniqueName: \"kubernetes.io/projected/19742c79-eb50-4882-b54f-25b8de1c9131-kube-api-access-5p5fj\") pod \"19742c79-eb50-4882-b54f-25b8de1c9131\" (UID: \"19742c79-eb50-4882-b54f-25b8de1c9131\") " Dec 03 07:12:14 crc kubenswrapper[4475]: I1203 07:12:14.533490 4475 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/19742c79-eb50-4882-b54f-25b8de1c9131-kube-api-access-5p5fj" (OuterVolumeSpecName: "kube-api-access-5p5fj") pod "19742c79-eb50-4882-b54f-25b8de1c9131" (UID: "19742c79-eb50-4882-b54f-25b8de1c9131"). InnerVolumeSpecName "kube-api-access-5p5fj". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 07:12:14 crc kubenswrapper[4475]: I1203 07:12:14.554201 4475 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/19742c79-eb50-4882-b54f-25b8de1c9131-inventory" (OuterVolumeSpecName: "inventory") pod "19742c79-eb50-4882-b54f-25b8de1c9131" (UID: "19742c79-eb50-4882-b54f-25b8de1c9131"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 07:12:14 crc kubenswrapper[4475]: I1203 07:12:14.564018 4475 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/19742c79-eb50-4882-b54f-25b8de1c9131-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "19742c79-eb50-4882-b54f-25b8de1c9131" (UID: "19742c79-eb50-4882-b54f-25b8de1c9131"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 07:12:14 crc kubenswrapper[4475]: I1203 07:12:14.635690 4475 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/19742c79-eb50-4882-b54f-25b8de1c9131-inventory\") on node \"crc\" DevicePath \"\"" Dec 03 07:12:14 crc kubenswrapper[4475]: I1203 07:12:14.635721 4475 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/19742c79-eb50-4882-b54f-25b8de1c9131-ssh-key\") on node \"crc\" DevicePath \"\"" Dec 03 07:12:14 crc kubenswrapper[4475]: I1203 07:12:14.635731 4475 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5p5fj\" (UniqueName: \"kubernetes.io/projected/19742c79-eb50-4882-b54f-25b8de1c9131-kube-api-access-5p5fj\") on node \"crc\" DevicePath \"\"" Dec 03 07:12:15 crc kubenswrapper[4475]: I1203 07:12:15.041903 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-7wl2c" event={"ID":"19742c79-eb50-4882-b54f-25b8de1c9131","Type":"ContainerDied","Data":"e42c6f4555026663a3f6f9e621cb93f498694b6378f058d8176a1a870fb5043c"} Dec 03 07:12:15 crc kubenswrapper[4475]: I1203 07:12:15.042389 4475 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e42c6f4555026663a3f6f9e621cb93f498694b6378f058d8176a1a870fb5043c" Dec 03 07:12:15 crc kubenswrapper[4475]: I1203 07:12:15.041950 4475 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-7wl2c" Dec 03 07:12:15 crc kubenswrapper[4475]: I1203 07:12:15.106229 4475 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/validate-network-edpm-deployment-openstack-edpm-ipam-sjrkn"] Dec 03 07:12:15 crc kubenswrapper[4475]: E1203 07:12:15.106526 4475 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="19742c79-eb50-4882-b54f-25b8de1c9131" containerName="configure-network-edpm-deployment-openstack-edpm-ipam" Dec 03 07:12:15 crc kubenswrapper[4475]: I1203 07:12:15.106542 4475 state_mem.go:107] "Deleted CPUSet assignment" podUID="19742c79-eb50-4882-b54f-25b8de1c9131" containerName="configure-network-edpm-deployment-openstack-edpm-ipam" Dec 03 07:12:15 crc kubenswrapper[4475]: E1203 07:12:15.106554 4475 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="76b8febd-26a9-4c40-a778-8668d7cbeca7" containerName="extract-utilities" Dec 03 07:12:15 crc kubenswrapper[4475]: I1203 07:12:15.106560 4475 state_mem.go:107] "Deleted CPUSet assignment" podUID="76b8febd-26a9-4c40-a778-8668d7cbeca7" containerName="extract-utilities" Dec 03 07:12:15 crc kubenswrapper[4475]: E1203 07:12:15.106583 4475 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="76b8febd-26a9-4c40-a778-8668d7cbeca7" containerName="extract-content" Dec 03 07:12:15 crc kubenswrapper[4475]: I1203 07:12:15.106588 4475 state_mem.go:107] "Deleted CPUSet assignment" podUID="76b8febd-26a9-4c40-a778-8668d7cbeca7" containerName="extract-content" Dec 03 07:12:15 crc kubenswrapper[4475]: E1203 07:12:15.106603 4475 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="76b8febd-26a9-4c40-a778-8668d7cbeca7" containerName="registry-server" Dec 03 07:12:15 crc kubenswrapper[4475]: I1203 07:12:15.106608 4475 state_mem.go:107] "Deleted CPUSet assignment" podUID="76b8febd-26a9-4c40-a778-8668d7cbeca7" containerName="registry-server" Dec 03 07:12:15 crc 
kubenswrapper[4475]: I1203 07:12:15.106760 4475 memory_manager.go:354] "RemoveStaleState removing state" podUID="19742c79-eb50-4882-b54f-25b8de1c9131" containerName="configure-network-edpm-deployment-openstack-edpm-ipam" Dec 03 07:12:15 crc kubenswrapper[4475]: I1203 07:12:15.106772 4475 memory_manager.go:354] "RemoveStaleState removing state" podUID="76b8febd-26a9-4c40-a778-8668d7cbeca7" containerName="registry-server" Dec 03 07:12:15 crc kubenswrapper[4475]: I1203 07:12:15.107285 4475 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-sjrkn" Dec 03 07:12:15 crc kubenswrapper[4475]: I1203 07:12:15.108948 4475 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-gxv6j" Dec 03 07:12:15 crc kubenswrapper[4475]: I1203 07:12:15.109175 4475 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Dec 03 07:12:15 crc kubenswrapper[4475]: I1203 07:12:15.109307 4475 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Dec 03 07:12:15 crc kubenswrapper[4475]: I1203 07:12:15.109475 4475 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Dec 03 07:12:15 crc kubenswrapper[4475]: I1203 07:12:15.116925 4475 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/validate-network-edpm-deployment-openstack-edpm-ipam-sjrkn"] Dec 03 07:12:15 crc kubenswrapper[4475]: I1203 07:12:15.245513 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/11da941e-7a7f-4df1-bf92-6158cb4e67c5-ssh-key\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-sjrkn\" (UID: \"11da941e-7a7f-4df1-bf92-6158cb4e67c5\") " 
pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-sjrkn" Dec 03 07:12:15 crc kubenswrapper[4475]: I1203 07:12:15.245676 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/11da941e-7a7f-4df1-bf92-6158cb4e67c5-inventory\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-sjrkn\" (UID: \"11da941e-7a7f-4df1-bf92-6158cb4e67c5\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-sjrkn" Dec 03 07:12:15 crc kubenswrapper[4475]: I1203 07:12:15.245858 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h2tc6\" (UniqueName: \"kubernetes.io/projected/11da941e-7a7f-4df1-bf92-6158cb4e67c5-kube-api-access-h2tc6\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-sjrkn\" (UID: \"11da941e-7a7f-4df1-bf92-6158cb4e67c5\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-sjrkn" Dec 03 07:12:15 crc kubenswrapper[4475]: I1203 07:12:15.347395 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/11da941e-7a7f-4df1-bf92-6158cb4e67c5-ssh-key\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-sjrkn\" (UID: \"11da941e-7a7f-4df1-bf92-6158cb4e67c5\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-sjrkn" Dec 03 07:12:15 crc kubenswrapper[4475]: I1203 07:12:15.347503 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/11da941e-7a7f-4df1-bf92-6158cb4e67c5-inventory\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-sjrkn\" (UID: \"11da941e-7a7f-4df1-bf92-6158cb4e67c5\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-sjrkn" Dec 03 07:12:15 crc kubenswrapper[4475]: I1203 07:12:15.347580 4475 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-h2tc6\" (UniqueName: \"kubernetes.io/projected/11da941e-7a7f-4df1-bf92-6158cb4e67c5-kube-api-access-h2tc6\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-sjrkn\" (UID: \"11da941e-7a7f-4df1-bf92-6158cb4e67c5\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-sjrkn" Dec 03 07:12:15 crc kubenswrapper[4475]: I1203 07:12:15.351663 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/11da941e-7a7f-4df1-bf92-6158cb4e67c5-inventory\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-sjrkn\" (UID: \"11da941e-7a7f-4df1-bf92-6158cb4e67c5\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-sjrkn" Dec 03 07:12:15 crc kubenswrapper[4475]: I1203 07:12:15.351698 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/11da941e-7a7f-4df1-bf92-6158cb4e67c5-ssh-key\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-sjrkn\" (UID: \"11da941e-7a7f-4df1-bf92-6158cb4e67c5\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-sjrkn" Dec 03 07:12:15 crc kubenswrapper[4475]: I1203 07:12:15.361598 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-h2tc6\" (UniqueName: \"kubernetes.io/projected/11da941e-7a7f-4df1-bf92-6158cb4e67c5-kube-api-access-h2tc6\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-sjrkn\" (UID: \"11da941e-7a7f-4df1-bf92-6158cb4e67c5\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-sjrkn" Dec 03 07:12:15 crc kubenswrapper[4475]: I1203 07:12:15.422926 4475 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-sjrkn" Dec 03 07:12:15 crc kubenswrapper[4475]: I1203 07:12:15.846091 4475 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/validate-network-edpm-deployment-openstack-edpm-ipam-sjrkn"] Dec 03 07:12:16 crc kubenswrapper[4475]: I1203 07:12:16.049595 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-sjrkn" event={"ID":"11da941e-7a7f-4df1-bf92-6158cb4e67c5","Type":"ContainerStarted","Data":"78664319020ebf14c28eab8939f30cfa074d2ffbf105696366b75c284edfce69"} Dec 03 07:12:16 crc kubenswrapper[4475]: I1203 07:12:16.423310 4475 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Dec 03 07:12:17 crc kubenswrapper[4475]: I1203 07:12:17.062868 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-sjrkn" event={"ID":"11da941e-7a7f-4df1-bf92-6158cb4e67c5","Type":"ContainerStarted","Data":"a491f7bfd256929ebd959a64c12e8094ed7fa93f75f63291f9491b477a5e9116"} Dec 03 07:12:17 crc kubenswrapper[4475]: I1203 07:12:17.079792 4475 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-sjrkn" podStartSLOduration=1.513528112 podStartE2EDuration="2.079776042s" podCreationTimestamp="2025-12-03 07:12:15 +0000 UTC" firstStartedPulling="2025-12-03 07:12:15.852730804 +0000 UTC m=+1620.657629138" lastFinishedPulling="2025-12-03 07:12:16.418978733 +0000 UTC m=+1621.223877068" observedRunningTime="2025-12-03 07:12:17.074271412 +0000 UTC m=+1621.879169746" watchObservedRunningTime="2025-12-03 07:12:17.079776042 +0000 UTC m=+1621.884674377" Dec 03 07:12:21 crc kubenswrapper[4475]: I1203 07:12:21.036194 4475 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-cell-mapping-5gv82"] Dec 03 07:12:21 crc 
kubenswrapper[4475]: I1203 07:12:21.041753 4475 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-4x9fx"] Dec 03 07:12:21 crc kubenswrapper[4475]: I1203 07:12:21.046776 4475 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-cell-mapping-5gv82"] Dec 03 07:12:21 crc kubenswrapper[4475]: I1203 07:12:21.053692 4475 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-4x9fx"] Dec 03 07:12:21 crc kubenswrapper[4475]: I1203 07:12:21.089949 4475 generic.go:334] "Generic (PLEG): container finished" podID="11da941e-7a7f-4df1-bf92-6158cb4e67c5" containerID="a491f7bfd256929ebd959a64c12e8094ed7fa93f75f63291f9491b477a5e9116" exitCode=0 Dec 03 07:12:21 crc kubenswrapper[4475]: I1203 07:12:21.089985 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-sjrkn" event={"ID":"11da941e-7a7f-4df1-bf92-6158cb4e67c5","Type":"ContainerDied","Data":"a491f7bfd256929ebd959a64c12e8094ed7fa93f75f63291f9491b477a5e9116"} Dec 03 07:12:21 crc kubenswrapper[4475]: I1203 07:12:21.499141 4475 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="08a09e1c-191b-46b4-92d7-dd92fb839342" path="/var/lib/kubelet/pods/08a09e1c-191b-46b4-92d7-dd92fb839342/volumes" Dec 03 07:12:21 crc kubenswrapper[4475]: I1203 07:12:21.501182 4475 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b51cbc3f-c89a-4e16-814c-381aa017a61f" path="/var/lib/kubelet/pods/b51cbc3f-c89a-4e16-814c-381aa017a61f/volumes" Dec 03 07:12:22 crc kubenswrapper[4475]: I1203 07:12:22.399219 4475 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-sjrkn" Dec 03 07:12:22 crc kubenswrapper[4475]: I1203 07:12:22.462835 4475 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/11da941e-7a7f-4df1-bf92-6158cb4e67c5-ssh-key\") pod \"11da941e-7a7f-4df1-bf92-6158cb4e67c5\" (UID: \"11da941e-7a7f-4df1-bf92-6158cb4e67c5\") " Dec 03 07:12:22 crc kubenswrapper[4475]: I1203 07:12:22.463012 4475 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-h2tc6\" (UniqueName: \"kubernetes.io/projected/11da941e-7a7f-4df1-bf92-6158cb4e67c5-kube-api-access-h2tc6\") pod \"11da941e-7a7f-4df1-bf92-6158cb4e67c5\" (UID: \"11da941e-7a7f-4df1-bf92-6158cb4e67c5\") " Dec 03 07:12:22 crc kubenswrapper[4475]: I1203 07:12:22.463096 4475 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/11da941e-7a7f-4df1-bf92-6158cb4e67c5-inventory\") pod \"11da941e-7a7f-4df1-bf92-6158cb4e67c5\" (UID: \"11da941e-7a7f-4df1-bf92-6158cb4e67c5\") " Dec 03 07:12:22 crc kubenswrapper[4475]: I1203 07:12:22.477338 4475 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/11da941e-7a7f-4df1-bf92-6158cb4e67c5-kube-api-access-h2tc6" (OuterVolumeSpecName: "kube-api-access-h2tc6") pod "11da941e-7a7f-4df1-bf92-6158cb4e67c5" (UID: "11da941e-7a7f-4df1-bf92-6158cb4e67c5"). InnerVolumeSpecName "kube-api-access-h2tc6". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 07:12:22 crc kubenswrapper[4475]: I1203 07:12:22.483806 4475 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/11da941e-7a7f-4df1-bf92-6158cb4e67c5-inventory" (OuterVolumeSpecName: "inventory") pod "11da941e-7a7f-4df1-bf92-6158cb4e67c5" (UID: "11da941e-7a7f-4df1-bf92-6158cb4e67c5"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 07:12:22 crc kubenswrapper[4475]: I1203 07:12:22.484077 4475 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/11da941e-7a7f-4df1-bf92-6158cb4e67c5-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "11da941e-7a7f-4df1-bf92-6158cb4e67c5" (UID: "11da941e-7a7f-4df1-bf92-6158cb4e67c5"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 07:12:22 crc kubenswrapper[4475]: I1203 07:12:22.565740 4475 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-h2tc6\" (UniqueName: \"kubernetes.io/projected/11da941e-7a7f-4df1-bf92-6158cb4e67c5-kube-api-access-h2tc6\") on node \"crc\" DevicePath \"\"" Dec 03 07:12:22 crc kubenswrapper[4475]: I1203 07:12:22.565765 4475 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/11da941e-7a7f-4df1-bf92-6158cb4e67c5-inventory\") on node \"crc\" DevicePath \"\"" Dec 03 07:12:22 crc kubenswrapper[4475]: I1203 07:12:22.565775 4475 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/11da941e-7a7f-4df1-bf92-6158cb4e67c5-ssh-key\") on node \"crc\" DevicePath \"\"" Dec 03 07:12:23 crc kubenswrapper[4475]: I1203 07:12:23.102811 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-sjrkn" event={"ID":"11da941e-7a7f-4df1-bf92-6158cb4e67c5","Type":"ContainerDied","Data":"78664319020ebf14c28eab8939f30cfa074d2ffbf105696366b75c284edfce69"} Dec 03 07:12:23 crc kubenswrapper[4475]: I1203 07:12:23.102850 4475 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="78664319020ebf14c28eab8939f30cfa074d2ffbf105696366b75c284edfce69" Dec 03 07:12:23 crc kubenswrapper[4475]: I1203 07:12:23.103071 4475 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-sjrkn" Dec 03 07:12:23 crc kubenswrapper[4475]: I1203 07:12:23.162364 4475 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/install-os-edpm-deployment-openstack-edpm-ipam-7crgz"] Dec 03 07:12:23 crc kubenswrapper[4475]: E1203 07:12:23.162721 4475 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="11da941e-7a7f-4df1-bf92-6158cb4e67c5" containerName="validate-network-edpm-deployment-openstack-edpm-ipam" Dec 03 07:12:23 crc kubenswrapper[4475]: I1203 07:12:23.162739 4475 state_mem.go:107] "Deleted CPUSet assignment" podUID="11da941e-7a7f-4df1-bf92-6158cb4e67c5" containerName="validate-network-edpm-deployment-openstack-edpm-ipam" Dec 03 07:12:23 crc kubenswrapper[4475]: I1203 07:12:23.162943 4475 memory_manager.go:354] "RemoveStaleState removing state" podUID="11da941e-7a7f-4df1-bf92-6158cb4e67c5" containerName="validate-network-edpm-deployment-openstack-edpm-ipam" Dec 03 07:12:23 crc kubenswrapper[4475]: I1203 07:12:23.163511 4475 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-7crgz" Dec 03 07:12:23 crc kubenswrapper[4475]: I1203 07:12:23.168919 4475 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-gxv6j" Dec 03 07:12:23 crc kubenswrapper[4475]: I1203 07:12:23.169092 4475 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Dec 03 07:12:23 crc kubenswrapper[4475]: I1203 07:12:23.169380 4475 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Dec 03 07:12:23 crc kubenswrapper[4475]: I1203 07:12:23.169630 4475 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Dec 03 07:12:23 crc kubenswrapper[4475]: I1203 07:12:23.173062 4475 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/install-os-edpm-deployment-openstack-edpm-ipam-7crgz"] Dec 03 07:12:23 crc kubenswrapper[4475]: I1203 07:12:23.187239 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/7713dba2-dd8c-4104-ba43-08a6aaafe290-ssh-key\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-7crgz\" (UID: \"7713dba2-dd8c-4104-ba43-08a6aaafe290\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-7crgz" Dec 03 07:12:23 crc kubenswrapper[4475]: I1203 07:12:23.187283 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/7713dba2-dd8c-4104-ba43-08a6aaafe290-inventory\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-7crgz\" (UID: \"7713dba2-dd8c-4104-ba43-08a6aaafe290\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-7crgz" Dec 03 07:12:23 crc kubenswrapper[4475]: I1203 07:12:23.187316 4475 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j7kx7\" (UniqueName: \"kubernetes.io/projected/7713dba2-dd8c-4104-ba43-08a6aaafe290-kube-api-access-j7kx7\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-7crgz\" (UID: \"7713dba2-dd8c-4104-ba43-08a6aaafe290\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-7crgz" Dec 03 07:12:23 crc kubenswrapper[4475]: I1203 07:12:23.288606 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/7713dba2-dd8c-4104-ba43-08a6aaafe290-ssh-key\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-7crgz\" (UID: \"7713dba2-dd8c-4104-ba43-08a6aaafe290\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-7crgz" Dec 03 07:12:23 crc kubenswrapper[4475]: I1203 07:12:23.288669 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/7713dba2-dd8c-4104-ba43-08a6aaafe290-inventory\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-7crgz\" (UID: \"7713dba2-dd8c-4104-ba43-08a6aaafe290\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-7crgz" Dec 03 07:12:23 crc kubenswrapper[4475]: I1203 07:12:23.288703 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-j7kx7\" (UniqueName: \"kubernetes.io/projected/7713dba2-dd8c-4104-ba43-08a6aaafe290-kube-api-access-j7kx7\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-7crgz\" (UID: \"7713dba2-dd8c-4104-ba43-08a6aaafe290\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-7crgz" Dec 03 07:12:23 crc kubenswrapper[4475]: I1203 07:12:23.291507 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/7713dba2-dd8c-4104-ba43-08a6aaafe290-inventory\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-7crgz\" (UID: 
\"7713dba2-dd8c-4104-ba43-08a6aaafe290\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-7crgz" Dec 03 07:12:23 crc kubenswrapper[4475]: I1203 07:12:23.291940 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/7713dba2-dd8c-4104-ba43-08a6aaafe290-ssh-key\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-7crgz\" (UID: \"7713dba2-dd8c-4104-ba43-08a6aaafe290\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-7crgz" Dec 03 07:12:23 crc kubenswrapper[4475]: I1203 07:12:23.301800 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-j7kx7\" (UniqueName: \"kubernetes.io/projected/7713dba2-dd8c-4104-ba43-08a6aaafe290-kube-api-access-j7kx7\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-7crgz\" (UID: \"7713dba2-dd8c-4104-ba43-08a6aaafe290\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-7crgz" Dec 03 07:12:23 crc kubenswrapper[4475]: I1203 07:12:23.475322 4475 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-7crgz" Dec 03 07:12:23 crc kubenswrapper[4475]: I1203 07:12:23.876032 4475 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/install-os-edpm-deployment-openstack-edpm-ipam-7crgz"] Dec 03 07:12:24 crc kubenswrapper[4475]: I1203 07:12:24.108851 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-7crgz" event={"ID":"7713dba2-dd8c-4104-ba43-08a6aaafe290","Type":"ContainerStarted","Data":"591a9d4f49792761756f19614cfcb33dbee4b504ba6f88d5b322cb18fa411286"} Dec 03 07:12:25 crc kubenswrapper[4475]: I1203 07:12:25.117327 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-7crgz" event={"ID":"7713dba2-dd8c-4104-ba43-08a6aaafe290","Type":"ContainerStarted","Data":"eacd2ec46e7b2f8fa44110e24a449562ff5a5b79a8ec02bd904674de0904e66a"} Dec 03 07:12:25 crc kubenswrapper[4475]: I1203 07:12:25.133166 4475 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-7crgz" podStartSLOduration=1.6469478720000001 podStartE2EDuration="2.133149896s" podCreationTimestamp="2025-12-03 07:12:23 +0000 UTC" firstStartedPulling="2025-12-03 07:12:23.881165998 +0000 UTC m=+1628.686064332" lastFinishedPulling="2025-12-03 07:12:24.367368022 +0000 UTC m=+1629.172266356" observedRunningTime="2025-12-03 07:12:25.129486304 +0000 UTC m=+1629.934384638" watchObservedRunningTime="2025-12-03 07:12:25.133149896 +0000 UTC m=+1629.938048230" Dec 03 07:12:28 crc kubenswrapper[4475]: I1203 07:12:28.370195 4475 scope.go:117] "RemoveContainer" containerID="c990880b947c6666048aaa8c294078b714269dfaf584ac073181ad2c2924030d" Dec 03 07:12:28 crc kubenswrapper[4475]: I1203 07:12:28.386265 4475 scope.go:117] "RemoveContainer" containerID="1c58d9d93461ebc04d6cbedbd057f82787a224606cc13bf3d2487e0d96fb94b9" Dec 03 07:12:28 crc 
kubenswrapper[4475]: I1203 07:12:28.420145 4475 scope.go:117] "RemoveContainer" containerID="a7c74056fcf164a6b7666d7fe9f0e04744b03a9e45ac1c3b7f550c0b142ad819" Dec 03 07:12:28 crc kubenswrapper[4475]: I1203 07:12:28.451621 4475 scope.go:117] "RemoveContainer" containerID="0c07a33c01541f1d2756e0fd7c843af2ed8229aa4620309e658291066f619218" Dec 03 07:12:28 crc kubenswrapper[4475]: I1203 07:12:28.480247 4475 scope.go:117] "RemoveContainer" containerID="b5be679bc8c01203210fe0862b509d3b6f47528e3debb780752340d07fd241dc" Dec 03 07:12:28 crc kubenswrapper[4475]: I1203 07:12:28.512185 4475 scope.go:117] "RemoveContainer" containerID="339aebb2858f4681fbc7b76e2fb8459c2009b3951b391a0ec64a809d7d17f4ed" Dec 03 07:12:28 crc kubenswrapper[4475]: I1203 07:12:28.541872 4475 scope.go:117] "RemoveContainer" containerID="507404adf0993a3824029651fe87cc72be1488576a6bd17fdafdb492ac344d85" Dec 03 07:12:28 crc kubenswrapper[4475]: I1203 07:12:28.560346 4475 scope.go:117] "RemoveContainer" containerID="36e8e62b17059dfec7d5f9200f2dd8bf1e79c80ecfbeaacd0191064276c95c27" Dec 03 07:12:28 crc kubenswrapper[4475]: I1203 07:12:28.577962 4475 scope.go:117] "RemoveContainer" containerID="ec83ea70093335e49687050a5b3b17b7bd1bc87e08b470bf3d3167448aaa275a" Dec 03 07:12:28 crc kubenswrapper[4475]: I1203 07:12:28.933128 4475 patch_prober.go:28] interesting pod/machine-config-daemon-tjbzg container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 03 07:12:28 crc kubenswrapper[4475]: I1203 07:12:28.933340 4475 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-tjbzg" podUID="91aee7be-4a52-4598-803f-2deebe0674de" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 03 07:12:51 crc 
kubenswrapper[4475]: I1203 07:12:51.265396 4475 generic.go:334] "Generic (PLEG): container finished" podID="7713dba2-dd8c-4104-ba43-08a6aaafe290" containerID="eacd2ec46e7b2f8fa44110e24a449562ff5a5b79a8ec02bd904674de0904e66a" exitCode=0 Dec 03 07:12:51 crc kubenswrapper[4475]: I1203 07:12:51.265479 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-7crgz" event={"ID":"7713dba2-dd8c-4104-ba43-08a6aaafe290","Type":"ContainerDied","Data":"eacd2ec46e7b2f8fa44110e24a449562ff5a5b79a8ec02bd904674de0904e66a"} Dec 03 07:12:52 crc kubenswrapper[4475]: I1203 07:12:52.580329 4475 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-7crgz" Dec 03 07:12:52 crc kubenswrapper[4475]: I1203 07:12:52.639519 4475 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/7713dba2-dd8c-4104-ba43-08a6aaafe290-inventory\") pod \"7713dba2-dd8c-4104-ba43-08a6aaafe290\" (UID: \"7713dba2-dd8c-4104-ba43-08a6aaafe290\") " Dec 03 07:12:52 crc kubenswrapper[4475]: I1203 07:12:52.639589 4475 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-j7kx7\" (UniqueName: \"kubernetes.io/projected/7713dba2-dd8c-4104-ba43-08a6aaafe290-kube-api-access-j7kx7\") pod \"7713dba2-dd8c-4104-ba43-08a6aaafe290\" (UID: \"7713dba2-dd8c-4104-ba43-08a6aaafe290\") " Dec 03 07:12:52 crc kubenswrapper[4475]: I1203 07:12:52.639644 4475 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/7713dba2-dd8c-4104-ba43-08a6aaafe290-ssh-key\") pod \"7713dba2-dd8c-4104-ba43-08a6aaafe290\" (UID: \"7713dba2-dd8c-4104-ba43-08a6aaafe290\") " Dec 03 07:12:52 crc kubenswrapper[4475]: I1203 07:12:52.644228 4475 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/projected/7713dba2-dd8c-4104-ba43-08a6aaafe290-kube-api-access-j7kx7" (OuterVolumeSpecName: "kube-api-access-j7kx7") pod "7713dba2-dd8c-4104-ba43-08a6aaafe290" (UID: "7713dba2-dd8c-4104-ba43-08a6aaafe290"). InnerVolumeSpecName "kube-api-access-j7kx7". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 07:12:52 crc kubenswrapper[4475]: I1203 07:12:52.661113 4475 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7713dba2-dd8c-4104-ba43-08a6aaafe290-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "7713dba2-dd8c-4104-ba43-08a6aaafe290" (UID: "7713dba2-dd8c-4104-ba43-08a6aaafe290"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 07:12:52 crc kubenswrapper[4475]: I1203 07:12:52.661666 4475 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7713dba2-dd8c-4104-ba43-08a6aaafe290-inventory" (OuterVolumeSpecName: "inventory") pod "7713dba2-dd8c-4104-ba43-08a6aaafe290" (UID: "7713dba2-dd8c-4104-ba43-08a6aaafe290"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 07:12:52 crc kubenswrapper[4475]: I1203 07:12:52.741143 4475 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/7713dba2-dd8c-4104-ba43-08a6aaafe290-inventory\") on node \"crc\" DevicePath \"\"" Dec 03 07:12:52 crc kubenswrapper[4475]: I1203 07:12:52.741171 4475 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-j7kx7\" (UniqueName: \"kubernetes.io/projected/7713dba2-dd8c-4104-ba43-08a6aaafe290-kube-api-access-j7kx7\") on node \"crc\" DevicePath \"\"" Dec 03 07:12:52 crc kubenswrapper[4475]: I1203 07:12:52.741183 4475 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/7713dba2-dd8c-4104-ba43-08a6aaafe290-ssh-key\") on node \"crc\" DevicePath \"\"" Dec 03 07:12:53 crc kubenswrapper[4475]: I1203 07:12:53.278829 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-7crgz" event={"ID":"7713dba2-dd8c-4104-ba43-08a6aaafe290","Type":"ContainerDied","Data":"591a9d4f49792761756f19614cfcb33dbee4b504ba6f88d5b322cb18fa411286"} Dec 03 07:12:53 crc kubenswrapper[4475]: I1203 07:12:53.278868 4475 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="591a9d4f49792761756f19614cfcb33dbee4b504ba6f88d5b322cb18fa411286" Dec 03 07:12:53 crc kubenswrapper[4475]: I1203 07:12:53.278874 4475 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-7crgz" Dec 03 07:12:53 crc kubenswrapper[4475]: I1203 07:12:53.341429 4475 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/configure-os-edpm-deployment-openstack-edpm-ipam-kd27d"] Dec 03 07:12:53 crc kubenswrapper[4475]: E1203 07:12:53.341765 4475 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7713dba2-dd8c-4104-ba43-08a6aaafe290" containerName="install-os-edpm-deployment-openstack-edpm-ipam" Dec 03 07:12:53 crc kubenswrapper[4475]: I1203 07:12:53.341782 4475 state_mem.go:107] "Deleted CPUSet assignment" podUID="7713dba2-dd8c-4104-ba43-08a6aaafe290" containerName="install-os-edpm-deployment-openstack-edpm-ipam" Dec 03 07:12:53 crc kubenswrapper[4475]: I1203 07:12:53.341968 4475 memory_manager.go:354] "RemoveStaleState removing state" podUID="7713dba2-dd8c-4104-ba43-08a6aaafe290" containerName="install-os-edpm-deployment-openstack-edpm-ipam" Dec 03 07:12:53 crc kubenswrapper[4475]: I1203 07:12:53.342572 4475 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-kd27d" Dec 03 07:12:53 crc kubenswrapper[4475]: I1203 07:12:53.344536 4475 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Dec 03 07:12:53 crc kubenswrapper[4475]: I1203 07:12:53.345178 4475 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Dec 03 07:12:53 crc kubenswrapper[4475]: I1203 07:12:53.345683 4475 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-gxv6j" Dec 03 07:12:53 crc kubenswrapper[4475]: I1203 07:12:53.347032 4475 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Dec 03 07:12:53 crc kubenswrapper[4475]: I1203 07:12:53.348921 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/16e37384-2f4c-42e6-be24-2666f1399808-inventory\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-kd27d\" (UID: \"16e37384-2f4c-42e6-be24-2666f1399808\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-kd27d" Dec 03 07:12:53 crc kubenswrapper[4475]: I1203 07:12:53.348965 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/16e37384-2f4c-42e6-be24-2666f1399808-ssh-key\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-kd27d\" (UID: \"16e37384-2f4c-42e6-be24-2666f1399808\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-kd27d" Dec 03 07:12:53 crc kubenswrapper[4475]: I1203 07:12:53.349026 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7vpj4\" (UniqueName: \"kubernetes.io/projected/16e37384-2f4c-42e6-be24-2666f1399808-kube-api-access-7vpj4\") pod 
\"configure-os-edpm-deployment-openstack-edpm-ipam-kd27d\" (UID: \"16e37384-2f4c-42e6-be24-2666f1399808\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-kd27d" Dec 03 07:12:53 crc kubenswrapper[4475]: I1203 07:12:53.354186 4475 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-os-edpm-deployment-openstack-edpm-ipam-kd27d"] Dec 03 07:12:53 crc kubenswrapper[4475]: I1203 07:12:53.450676 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/16e37384-2f4c-42e6-be24-2666f1399808-inventory\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-kd27d\" (UID: \"16e37384-2f4c-42e6-be24-2666f1399808\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-kd27d" Dec 03 07:12:53 crc kubenswrapper[4475]: I1203 07:12:53.450712 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/16e37384-2f4c-42e6-be24-2666f1399808-ssh-key\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-kd27d\" (UID: \"16e37384-2f4c-42e6-be24-2666f1399808\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-kd27d" Dec 03 07:12:53 crc kubenswrapper[4475]: I1203 07:12:53.450744 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7vpj4\" (UniqueName: \"kubernetes.io/projected/16e37384-2f4c-42e6-be24-2666f1399808-kube-api-access-7vpj4\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-kd27d\" (UID: \"16e37384-2f4c-42e6-be24-2666f1399808\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-kd27d" Dec 03 07:12:53 crc kubenswrapper[4475]: I1203 07:12:53.453938 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/16e37384-2f4c-42e6-be24-2666f1399808-ssh-key\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-kd27d\" (UID: 
\"16e37384-2f4c-42e6-be24-2666f1399808\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-kd27d" Dec 03 07:12:53 crc kubenswrapper[4475]: I1203 07:12:53.454116 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/16e37384-2f4c-42e6-be24-2666f1399808-inventory\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-kd27d\" (UID: \"16e37384-2f4c-42e6-be24-2666f1399808\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-kd27d" Dec 03 07:12:53 crc kubenswrapper[4475]: I1203 07:12:53.464500 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7vpj4\" (UniqueName: \"kubernetes.io/projected/16e37384-2f4c-42e6-be24-2666f1399808-kube-api-access-7vpj4\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-kd27d\" (UID: \"16e37384-2f4c-42e6-be24-2666f1399808\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-kd27d" Dec 03 07:12:53 crc kubenswrapper[4475]: I1203 07:12:53.658357 4475 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-kd27d" Dec 03 07:12:54 crc kubenswrapper[4475]: I1203 07:12:54.089040 4475 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-os-edpm-deployment-openstack-edpm-ipam-kd27d"] Dec 03 07:12:54 crc kubenswrapper[4475]: I1203 07:12:54.285403 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-kd27d" event={"ID":"16e37384-2f4c-42e6-be24-2666f1399808","Type":"ContainerStarted","Data":"aede639a775109d7c4fe790f83b02000dd5f46ae8c9f4e09f1a8e9164a54f4a2"} Dec 03 07:12:55 crc kubenswrapper[4475]: I1203 07:12:55.292726 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-kd27d" event={"ID":"16e37384-2f4c-42e6-be24-2666f1399808","Type":"ContainerStarted","Data":"52b44c6a62808930f054ea8a4279a202c3962727e2dac3ed4e64c6642186f46b"} Dec 03 07:12:55 crc kubenswrapper[4475]: I1203 07:12:55.308982 4475 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-kd27d" podStartSLOduration=1.8242017640000001 podStartE2EDuration="2.308968858s" podCreationTimestamp="2025-12-03 07:12:53 +0000 UTC" firstStartedPulling="2025-12-03 07:12:54.095739528 +0000 UTC m=+1658.900637862" lastFinishedPulling="2025-12-03 07:12:54.580506622 +0000 UTC m=+1659.385404956" observedRunningTime="2025-12-03 07:12:55.304576858 +0000 UTC m=+1660.109475192" watchObservedRunningTime="2025-12-03 07:12:55.308968858 +0000 UTC m=+1660.113867192" Dec 03 07:12:58 crc kubenswrapper[4475]: I1203 07:12:58.933240 4475 patch_prober.go:28] interesting pod/machine-config-daemon-tjbzg container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 03 07:12:58 
crc kubenswrapper[4475]: I1203 07:12:58.933674 4475 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-tjbzg" podUID="91aee7be-4a52-4598-803f-2deebe0674de" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 03 07:13:05 crc kubenswrapper[4475]: I1203 07:13:05.031613 4475 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-cell-mapping-24dq8"] Dec 03 07:13:05 crc kubenswrapper[4475]: I1203 07:13:05.037995 4475 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-cell-mapping-24dq8"] Dec 03 07:13:05 crc kubenswrapper[4475]: I1203 07:13:05.498568 4475 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bb4749d8-e5e2-488a-a007-028e214d6d95" path="/var/lib/kubelet/pods/bb4749d8-e5e2-488a-a007-028e214d6d95/volumes" Dec 03 07:13:28 crc kubenswrapper[4475]: I1203 07:13:28.704844 4475 scope.go:117] "RemoveContainer" containerID="51c1ad9b86c78f206c8c5b233b107533a9b98c853d733106cb113c9c4dd81e1e" Dec 03 07:13:28 crc kubenswrapper[4475]: I1203 07:13:28.933299 4475 patch_prober.go:28] interesting pod/machine-config-daemon-tjbzg container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 03 07:13:28 crc kubenswrapper[4475]: I1203 07:13:28.933352 4475 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-tjbzg" podUID="91aee7be-4a52-4598-803f-2deebe0674de" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 03 07:13:28 crc kubenswrapper[4475]: I1203 07:13:28.933392 4475 kubelet.go:2542] "SyncLoop (probe)" 
probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-tjbzg" Dec 03 07:13:28 crc kubenswrapper[4475]: I1203 07:13:28.934042 4475 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"8c0d1f1df6ca180fa9eee37943bf61d5a3966b5f7a3ed0b6213a7a47c187d104"} pod="openshift-machine-config-operator/machine-config-daemon-tjbzg" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Dec 03 07:13:28 crc kubenswrapper[4475]: I1203 07:13:28.934104 4475 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-tjbzg" podUID="91aee7be-4a52-4598-803f-2deebe0674de" containerName="machine-config-daemon" containerID="cri-o://8c0d1f1df6ca180fa9eee37943bf61d5a3966b5f7a3ed0b6213a7a47c187d104" gracePeriod=600 Dec 03 07:13:29 crc kubenswrapper[4475]: E1203 07:13:29.055151 4475 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-tjbzg_openshift-machine-config-operator(91aee7be-4a52-4598-803f-2deebe0674de)\"" pod="openshift-machine-config-operator/machine-config-daemon-tjbzg" podUID="91aee7be-4a52-4598-803f-2deebe0674de" Dec 03 07:13:29 crc kubenswrapper[4475]: I1203 07:13:29.499807 4475 generic.go:334] "Generic (PLEG): container finished" podID="91aee7be-4a52-4598-803f-2deebe0674de" containerID="8c0d1f1df6ca180fa9eee37943bf61d5a3966b5f7a3ed0b6213a7a47c187d104" exitCode=0 Dec 03 07:13:29 crc kubenswrapper[4475]: I1203 07:13:29.500304 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-tjbzg" 
event={"ID":"91aee7be-4a52-4598-803f-2deebe0674de","Type":"ContainerDied","Data":"8c0d1f1df6ca180fa9eee37943bf61d5a3966b5f7a3ed0b6213a7a47c187d104"} Dec 03 07:13:29 crc kubenswrapper[4475]: I1203 07:13:29.500353 4475 scope.go:117] "RemoveContainer" containerID="3e6ae2a6419a9cebd1f1dffa9cdf8c8a5acdb2bcb9ceb927f45c4e93564f5359" Dec 03 07:13:29 crc kubenswrapper[4475]: I1203 07:13:29.500733 4475 scope.go:117] "RemoveContainer" containerID="8c0d1f1df6ca180fa9eee37943bf61d5a3966b5f7a3ed0b6213a7a47c187d104" Dec 03 07:13:29 crc kubenswrapper[4475]: E1203 07:13:29.501067 4475 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-tjbzg_openshift-machine-config-operator(91aee7be-4a52-4598-803f-2deebe0674de)\"" pod="openshift-machine-config-operator/machine-config-daemon-tjbzg" podUID="91aee7be-4a52-4598-803f-2deebe0674de" Dec 03 07:13:30 crc kubenswrapper[4475]: I1203 07:13:30.508661 4475 generic.go:334] "Generic (PLEG): container finished" podID="16e37384-2f4c-42e6-be24-2666f1399808" containerID="52b44c6a62808930f054ea8a4279a202c3962727e2dac3ed4e64c6642186f46b" exitCode=0 Dec 03 07:13:30 crc kubenswrapper[4475]: I1203 07:13:30.508686 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-kd27d" event={"ID":"16e37384-2f4c-42e6-be24-2666f1399808","Type":"ContainerDied","Data":"52b44c6a62808930f054ea8a4279a202c3962727e2dac3ed4e64c6642186f46b"} Dec 03 07:13:31 crc kubenswrapper[4475]: I1203 07:13:31.801527 4475 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-kd27d" Dec 03 07:13:31 crc kubenswrapper[4475]: I1203 07:13:31.889202 4475 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7vpj4\" (UniqueName: \"kubernetes.io/projected/16e37384-2f4c-42e6-be24-2666f1399808-kube-api-access-7vpj4\") pod \"16e37384-2f4c-42e6-be24-2666f1399808\" (UID: \"16e37384-2f4c-42e6-be24-2666f1399808\") " Dec 03 07:13:31 crc kubenswrapper[4475]: I1203 07:13:31.889308 4475 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/16e37384-2f4c-42e6-be24-2666f1399808-inventory\") pod \"16e37384-2f4c-42e6-be24-2666f1399808\" (UID: \"16e37384-2f4c-42e6-be24-2666f1399808\") " Dec 03 07:13:31 crc kubenswrapper[4475]: I1203 07:13:31.889338 4475 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/16e37384-2f4c-42e6-be24-2666f1399808-ssh-key\") pod \"16e37384-2f4c-42e6-be24-2666f1399808\" (UID: \"16e37384-2f4c-42e6-be24-2666f1399808\") " Dec 03 07:13:31 crc kubenswrapper[4475]: I1203 07:13:31.893295 4475 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/16e37384-2f4c-42e6-be24-2666f1399808-kube-api-access-7vpj4" (OuterVolumeSpecName: "kube-api-access-7vpj4") pod "16e37384-2f4c-42e6-be24-2666f1399808" (UID: "16e37384-2f4c-42e6-be24-2666f1399808"). InnerVolumeSpecName "kube-api-access-7vpj4". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 07:13:31 crc kubenswrapper[4475]: I1203 07:13:31.910916 4475 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/16e37384-2f4c-42e6-be24-2666f1399808-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "16e37384-2f4c-42e6-be24-2666f1399808" (UID: "16e37384-2f4c-42e6-be24-2666f1399808"). InnerVolumeSpecName "ssh-key". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 07:13:31 crc kubenswrapper[4475]: I1203 07:13:31.912681 4475 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/16e37384-2f4c-42e6-be24-2666f1399808-inventory" (OuterVolumeSpecName: "inventory") pod "16e37384-2f4c-42e6-be24-2666f1399808" (UID: "16e37384-2f4c-42e6-be24-2666f1399808"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 07:13:31 crc kubenswrapper[4475]: I1203 07:13:31.991330 4475 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7vpj4\" (UniqueName: \"kubernetes.io/projected/16e37384-2f4c-42e6-be24-2666f1399808-kube-api-access-7vpj4\") on node \"crc\" DevicePath \"\"" Dec 03 07:13:31 crc kubenswrapper[4475]: I1203 07:13:31.991357 4475 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/16e37384-2f4c-42e6-be24-2666f1399808-inventory\") on node \"crc\" DevicePath \"\"" Dec 03 07:13:31 crc kubenswrapper[4475]: I1203 07:13:31.991367 4475 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/16e37384-2f4c-42e6-be24-2666f1399808-ssh-key\") on node \"crc\" DevicePath \"\"" Dec 03 07:13:32 crc kubenswrapper[4475]: I1203 07:13:32.521504 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-kd27d" event={"ID":"16e37384-2f4c-42e6-be24-2666f1399808","Type":"ContainerDied","Data":"aede639a775109d7c4fe790f83b02000dd5f46ae8c9f4e09f1a8e9164a54f4a2"} Dec 03 07:13:32 crc kubenswrapper[4475]: I1203 07:13:32.521706 4475 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="aede639a775109d7c4fe790f83b02000dd5f46ae8c9f4e09f1a8e9164a54f4a2" Dec 03 07:13:32 crc kubenswrapper[4475]: I1203 07:13:32.521571 4475 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-kd27d" Dec 03 07:13:32 crc kubenswrapper[4475]: I1203 07:13:32.582320 4475 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ssh-known-hosts-edpm-deployment-db59h"] Dec 03 07:13:32 crc kubenswrapper[4475]: E1203 07:13:32.582722 4475 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="16e37384-2f4c-42e6-be24-2666f1399808" containerName="configure-os-edpm-deployment-openstack-edpm-ipam" Dec 03 07:13:32 crc kubenswrapper[4475]: I1203 07:13:32.582739 4475 state_mem.go:107] "Deleted CPUSet assignment" podUID="16e37384-2f4c-42e6-be24-2666f1399808" containerName="configure-os-edpm-deployment-openstack-edpm-ipam" Dec 03 07:13:32 crc kubenswrapper[4475]: I1203 07:13:32.582951 4475 memory_manager.go:354] "RemoveStaleState removing state" podUID="16e37384-2f4c-42e6-be24-2666f1399808" containerName="configure-os-edpm-deployment-openstack-edpm-ipam" Dec 03 07:13:32 crc kubenswrapper[4475]: I1203 07:13:32.583593 4475 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ssh-known-hosts-edpm-deployment-db59h" Dec 03 07:13:32 crc kubenswrapper[4475]: I1203 07:13:32.585990 4475 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Dec 03 07:13:32 crc kubenswrapper[4475]: I1203 07:13:32.586960 4475 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Dec 03 07:13:32 crc kubenswrapper[4475]: I1203 07:13:32.587146 4475 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-gxv6j" Dec 03 07:13:32 crc kubenswrapper[4475]: I1203 07:13:32.587187 4475 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Dec 03 07:13:32 crc kubenswrapper[4475]: I1203 07:13:32.591345 4475 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ssh-known-hosts-edpm-deployment-db59h"] Dec 03 07:13:32 crc kubenswrapper[4475]: I1203 07:13:32.600857 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gsncp\" (UniqueName: \"kubernetes.io/projected/4821d991-aeb9-479a-8340-640cef33b9ae-kube-api-access-gsncp\") pod \"ssh-known-hosts-edpm-deployment-db59h\" (UID: \"4821d991-aeb9-479a-8340-640cef33b9ae\") " pod="openstack/ssh-known-hosts-edpm-deployment-db59h" Dec 03 07:13:32 crc kubenswrapper[4475]: I1203 07:13:32.601040 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/4821d991-aeb9-479a-8340-640cef33b9ae-inventory-0\") pod \"ssh-known-hosts-edpm-deployment-db59h\" (UID: \"4821d991-aeb9-479a-8340-640cef33b9ae\") " pod="openstack/ssh-known-hosts-edpm-deployment-db59h" Dec 03 07:13:32 crc kubenswrapper[4475]: I1203 07:13:32.601143 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/4821d991-aeb9-479a-8340-640cef33b9ae-ssh-key-openstack-edpm-ipam\") pod \"ssh-known-hosts-edpm-deployment-db59h\" (UID: \"4821d991-aeb9-479a-8340-640cef33b9ae\") " pod="openstack/ssh-known-hosts-edpm-deployment-db59h" Dec 03 07:13:32 crc kubenswrapper[4475]: I1203 07:13:32.702939 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/4821d991-aeb9-479a-8340-640cef33b9ae-inventory-0\") pod \"ssh-known-hosts-edpm-deployment-db59h\" (UID: \"4821d991-aeb9-479a-8340-640cef33b9ae\") " pod="openstack/ssh-known-hosts-edpm-deployment-db59h" Dec 03 07:13:32 crc kubenswrapper[4475]: I1203 07:13:32.703013 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/4821d991-aeb9-479a-8340-640cef33b9ae-ssh-key-openstack-edpm-ipam\") pod \"ssh-known-hosts-edpm-deployment-db59h\" (UID: \"4821d991-aeb9-479a-8340-640cef33b9ae\") " pod="openstack/ssh-known-hosts-edpm-deployment-db59h" Dec 03 07:13:32 crc kubenswrapper[4475]: I1203 07:13:32.703088 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gsncp\" (UniqueName: \"kubernetes.io/projected/4821d991-aeb9-479a-8340-640cef33b9ae-kube-api-access-gsncp\") pod \"ssh-known-hosts-edpm-deployment-db59h\" (UID: \"4821d991-aeb9-479a-8340-640cef33b9ae\") " pod="openstack/ssh-known-hosts-edpm-deployment-db59h" Dec 03 07:13:32 crc kubenswrapper[4475]: I1203 07:13:32.705953 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/4821d991-aeb9-479a-8340-640cef33b9ae-inventory-0\") pod \"ssh-known-hosts-edpm-deployment-db59h\" (UID: \"4821d991-aeb9-479a-8340-640cef33b9ae\") " pod="openstack/ssh-known-hosts-edpm-deployment-db59h" Dec 03 07:13:32 crc kubenswrapper[4475]: I1203 07:13:32.706387 4475 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/4821d991-aeb9-479a-8340-640cef33b9ae-ssh-key-openstack-edpm-ipam\") pod \"ssh-known-hosts-edpm-deployment-db59h\" (UID: \"4821d991-aeb9-479a-8340-640cef33b9ae\") " pod="openstack/ssh-known-hosts-edpm-deployment-db59h" Dec 03 07:13:32 crc kubenswrapper[4475]: I1203 07:13:32.716826 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gsncp\" (UniqueName: \"kubernetes.io/projected/4821d991-aeb9-479a-8340-640cef33b9ae-kube-api-access-gsncp\") pod \"ssh-known-hosts-edpm-deployment-db59h\" (UID: \"4821d991-aeb9-479a-8340-640cef33b9ae\") " pod="openstack/ssh-known-hosts-edpm-deployment-db59h" Dec 03 07:13:32 crc kubenswrapper[4475]: I1203 07:13:32.897570 4475 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ssh-known-hosts-edpm-deployment-db59h" Dec 03 07:13:33 crc kubenswrapper[4475]: I1203 07:13:33.306866 4475 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ssh-known-hosts-edpm-deployment-db59h"] Dec 03 07:13:33 crc kubenswrapper[4475]: I1203 07:13:33.528213 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ssh-known-hosts-edpm-deployment-db59h" event={"ID":"4821d991-aeb9-479a-8340-640cef33b9ae","Type":"ContainerStarted","Data":"5937df74435a8a22f1294a3db2f7d74acda24a832b32905728008d99646e9f7e"} Dec 03 07:13:34 crc kubenswrapper[4475]: I1203 07:13:34.550544 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ssh-known-hosts-edpm-deployment-db59h" event={"ID":"4821d991-aeb9-479a-8340-640cef33b9ae","Type":"ContainerStarted","Data":"7b12552f44deec01c628585f1072dbd5bbed0346f18fde712bf9e25bb6c69a85"} Dec 03 07:13:34 crc kubenswrapper[4475]: I1203 07:13:34.568507 4475 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ssh-known-hosts-edpm-deployment-db59h" 
podStartSLOduration=2.117860288 podStartE2EDuration="2.568491619s" podCreationTimestamp="2025-12-03 07:13:32 +0000 UTC" firstStartedPulling="2025-12-03 07:13:33.308424731 +0000 UTC m=+1698.113323065" lastFinishedPulling="2025-12-03 07:13:33.759056062 +0000 UTC m=+1698.563954396" observedRunningTime="2025-12-03 07:13:34.563489562 +0000 UTC m=+1699.368387906" watchObservedRunningTime="2025-12-03 07:13:34.568491619 +0000 UTC m=+1699.373389954" Dec 03 07:13:39 crc kubenswrapper[4475]: I1203 07:13:39.580713 4475 generic.go:334] "Generic (PLEG): container finished" podID="4821d991-aeb9-479a-8340-640cef33b9ae" containerID="7b12552f44deec01c628585f1072dbd5bbed0346f18fde712bf9e25bb6c69a85" exitCode=0 Dec 03 07:13:39 crc kubenswrapper[4475]: I1203 07:13:39.580777 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ssh-known-hosts-edpm-deployment-db59h" event={"ID":"4821d991-aeb9-479a-8340-640cef33b9ae","Type":"ContainerDied","Data":"7b12552f44deec01c628585f1072dbd5bbed0346f18fde712bf9e25bb6c69a85"} Dec 03 07:13:40 crc kubenswrapper[4475]: I1203 07:13:40.874542 4475 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ssh-known-hosts-edpm-deployment-db59h" Dec 03 07:13:41 crc kubenswrapper[4475]: I1203 07:13:41.038609 4475 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/4821d991-aeb9-479a-8340-640cef33b9ae-inventory-0\") pod \"4821d991-aeb9-479a-8340-640cef33b9ae\" (UID: \"4821d991-aeb9-479a-8340-640cef33b9ae\") " Dec 03 07:13:41 crc kubenswrapper[4475]: I1203 07:13:41.038758 4475 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/4821d991-aeb9-479a-8340-640cef33b9ae-ssh-key-openstack-edpm-ipam\") pod \"4821d991-aeb9-479a-8340-640cef33b9ae\" (UID: \"4821d991-aeb9-479a-8340-640cef33b9ae\") " Dec 03 07:13:41 crc kubenswrapper[4475]: I1203 07:13:41.038906 4475 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gsncp\" (UniqueName: \"kubernetes.io/projected/4821d991-aeb9-479a-8340-640cef33b9ae-kube-api-access-gsncp\") pod \"4821d991-aeb9-479a-8340-640cef33b9ae\" (UID: \"4821d991-aeb9-479a-8340-640cef33b9ae\") " Dec 03 07:13:41 crc kubenswrapper[4475]: I1203 07:13:41.043337 4475 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4821d991-aeb9-479a-8340-640cef33b9ae-kube-api-access-gsncp" (OuterVolumeSpecName: "kube-api-access-gsncp") pod "4821d991-aeb9-479a-8340-640cef33b9ae" (UID: "4821d991-aeb9-479a-8340-640cef33b9ae"). InnerVolumeSpecName "kube-api-access-gsncp". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 07:13:41 crc kubenswrapper[4475]: I1203 07:13:41.059007 4475 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4821d991-aeb9-479a-8340-640cef33b9ae-inventory-0" (OuterVolumeSpecName: "inventory-0") pod "4821d991-aeb9-479a-8340-640cef33b9ae" (UID: "4821d991-aeb9-479a-8340-640cef33b9ae"). 
InnerVolumeSpecName "inventory-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 07:13:41 crc kubenswrapper[4475]: I1203 07:13:41.059300 4475 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4821d991-aeb9-479a-8340-640cef33b9ae-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "4821d991-aeb9-479a-8340-640cef33b9ae" (UID: "4821d991-aeb9-479a-8340-640cef33b9ae"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 07:13:41 crc kubenswrapper[4475]: I1203 07:13:41.140746 4475 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/4821d991-aeb9-479a-8340-640cef33b9ae-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Dec 03 07:13:41 crc kubenswrapper[4475]: I1203 07:13:41.140785 4475 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gsncp\" (UniqueName: \"kubernetes.io/projected/4821d991-aeb9-479a-8340-640cef33b9ae-kube-api-access-gsncp\") on node \"crc\" DevicePath \"\"" Dec 03 07:13:41 crc kubenswrapper[4475]: I1203 07:13:41.140797 4475 reconciler_common.go:293] "Volume detached for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/4821d991-aeb9-479a-8340-640cef33b9ae-inventory-0\") on node \"crc\" DevicePath \"\"" Dec 03 07:13:41 crc kubenswrapper[4475]: I1203 07:13:41.592824 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ssh-known-hosts-edpm-deployment-db59h" event={"ID":"4821d991-aeb9-479a-8340-640cef33b9ae","Type":"ContainerDied","Data":"5937df74435a8a22f1294a3db2f7d74acda24a832b32905728008d99646e9f7e"} Dec 03 07:13:41 crc kubenswrapper[4475]: I1203 07:13:41.593036 4475 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="5937df74435a8a22f1294a3db2f7d74acda24a832b32905728008d99646e9f7e" Dec 03 07:13:41 crc kubenswrapper[4475]: I1203 07:13:41.592851 
4475 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ssh-known-hosts-edpm-deployment-db59h" Dec 03 07:13:41 crc kubenswrapper[4475]: I1203 07:13:41.644193 4475 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/run-os-edpm-deployment-openstack-edpm-ipam-x268p"] Dec 03 07:13:41 crc kubenswrapper[4475]: E1203 07:13:41.644551 4475 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4821d991-aeb9-479a-8340-640cef33b9ae" containerName="ssh-known-hosts-edpm-deployment" Dec 03 07:13:41 crc kubenswrapper[4475]: I1203 07:13:41.644568 4475 state_mem.go:107] "Deleted CPUSet assignment" podUID="4821d991-aeb9-479a-8340-640cef33b9ae" containerName="ssh-known-hosts-edpm-deployment" Dec 03 07:13:41 crc kubenswrapper[4475]: I1203 07:13:41.644757 4475 memory_manager.go:354] "RemoveStaleState removing state" podUID="4821d991-aeb9-479a-8340-640cef33b9ae" containerName="ssh-known-hosts-edpm-deployment" Dec 03 07:13:41 crc kubenswrapper[4475]: I1203 07:13:41.645272 4475 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-x268p" Dec 03 07:13:41 crc kubenswrapper[4475]: I1203 07:13:41.648003 4475 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Dec 03 07:13:41 crc kubenswrapper[4475]: I1203 07:13:41.648410 4475 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Dec 03 07:13:41 crc kubenswrapper[4475]: I1203 07:13:41.648727 4475 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Dec 03 07:13:41 crc kubenswrapper[4475]: I1203 07:13:41.650874 4475 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-gxv6j" Dec 03 07:13:41 crc kubenswrapper[4475]: I1203 07:13:41.651039 4475 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/run-os-edpm-deployment-openstack-edpm-ipam-x268p"] Dec 03 07:13:41 crc kubenswrapper[4475]: I1203 07:13:41.751025 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/5a731f88-ae84-4c9c-bde9-029db703de0c-inventory\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-x268p\" (UID: \"5a731f88-ae84-4c9c-bde9-029db703de0c\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-x268p" Dec 03 07:13:41 crc kubenswrapper[4475]: I1203 07:13:41.751403 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4hg7f\" (UniqueName: \"kubernetes.io/projected/5a731f88-ae84-4c9c-bde9-029db703de0c-kube-api-access-4hg7f\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-x268p\" (UID: \"5a731f88-ae84-4c9c-bde9-029db703de0c\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-x268p" Dec 03 07:13:41 crc kubenswrapper[4475]: I1203 07:13:41.751578 4475 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/5a731f88-ae84-4c9c-bde9-029db703de0c-ssh-key\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-x268p\" (UID: \"5a731f88-ae84-4c9c-bde9-029db703de0c\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-x268p" Dec 03 07:13:41 crc kubenswrapper[4475]: I1203 07:13:41.852972 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/5a731f88-ae84-4c9c-bde9-029db703de0c-ssh-key\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-x268p\" (UID: \"5a731f88-ae84-4c9c-bde9-029db703de0c\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-x268p" Dec 03 07:13:41 crc kubenswrapper[4475]: I1203 07:13:41.853076 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/5a731f88-ae84-4c9c-bde9-029db703de0c-inventory\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-x268p\" (UID: \"5a731f88-ae84-4c9c-bde9-029db703de0c\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-x268p" Dec 03 07:13:41 crc kubenswrapper[4475]: I1203 07:13:41.853204 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4hg7f\" (UniqueName: \"kubernetes.io/projected/5a731f88-ae84-4c9c-bde9-029db703de0c-kube-api-access-4hg7f\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-x268p\" (UID: \"5a731f88-ae84-4c9c-bde9-029db703de0c\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-x268p" Dec 03 07:13:41 crc kubenswrapper[4475]: I1203 07:13:41.856710 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/5a731f88-ae84-4c9c-bde9-029db703de0c-inventory\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-x268p\" (UID: \"5a731f88-ae84-4c9c-bde9-029db703de0c\") " 
pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-x268p" Dec 03 07:13:41 crc kubenswrapper[4475]: I1203 07:13:41.858924 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/5a731f88-ae84-4c9c-bde9-029db703de0c-ssh-key\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-x268p\" (UID: \"5a731f88-ae84-4c9c-bde9-029db703de0c\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-x268p" Dec 03 07:13:41 crc kubenswrapper[4475]: I1203 07:13:41.866162 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4hg7f\" (UniqueName: \"kubernetes.io/projected/5a731f88-ae84-4c9c-bde9-029db703de0c-kube-api-access-4hg7f\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-x268p\" (UID: \"5a731f88-ae84-4c9c-bde9-029db703de0c\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-x268p" Dec 03 07:13:41 crc kubenswrapper[4475]: I1203 07:13:41.960554 4475 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-x268p" Dec 03 07:13:42 crc kubenswrapper[4475]: I1203 07:13:42.376023 4475 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/run-os-edpm-deployment-openstack-edpm-ipam-x268p"] Dec 03 07:13:42 crc kubenswrapper[4475]: I1203 07:13:42.607336 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-x268p" event={"ID":"5a731f88-ae84-4c9c-bde9-029db703de0c","Type":"ContainerStarted","Data":"071db8801ef951e12093d1093e57bf8a5a7d94bc1b4c055070d1136394fe9cac"} Dec 03 07:13:43 crc kubenswrapper[4475]: I1203 07:13:43.492589 4475 scope.go:117] "RemoveContainer" containerID="8c0d1f1df6ca180fa9eee37943bf61d5a3966b5f7a3ed0b6213a7a47c187d104" Dec 03 07:13:43 crc kubenswrapper[4475]: E1203 07:13:43.493602 4475 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-tjbzg_openshift-machine-config-operator(91aee7be-4a52-4598-803f-2deebe0674de)\"" pod="openshift-machine-config-operator/machine-config-daemon-tjbzg" podUID="91aee7be-4a52-4598-803f-2deebe0674de" Dec 03 07:13:43 crc kubenswrapper[4475]: I1203 07:13:43.616714 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-x268p" event={"ID":"5a731f88-ae84-4c9c-bde9-029db703de0c","Type":"ContainerStarted","Data":"9c7bec5cf83a5f20b4e28424e00ad6d0f02223c263310bf37af1486f8fea52e0"} Dec 03 07:13:43 crc kubenswrapper[4475]: I1203 07:13:43.633637 4475 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-x268p" podStartSLOduration=2.186337844 podStartE2EDuration="2.633624245s" podCreationTimestamp="2025-12-03 07:13:41 +0000 UTC" firstStartedPulling="2025-12-03 07:13:42.381745315 +0000 UTC 
m=+1707.186643639" lastFinishedPulling="2025-12-03 07:13:42.829031706 +0000 UTC m=+1707.633930040" observedRunningTime="2025-12-03 07:13:43.628211305 +0000 UTC m=+1708.433109639" watchObservedRunningTime="2025-12-03 07:13:43.633624245 +0000 UTC m=+1708.438522579" Dec 03 07:13:49 crc kubenswrapper[4475]: I1203 07:13:49.659759 4475 generic.go:334] "Generic (PLEG): container finished" podID="5a731f88-ae84-4c9c-bde9-029db703de0c" containerID="9c7bec5cf83a5f20b4e28424e00ad6d0f02223c263310bf37af1486f8fea52e0" exitCode=0 Dec 03 07:13:49 crc kubenswrapper[4475]: I1203 07:13:49.659842 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-x268p" event={"ID":"5a731f88-ae84-4c9c-bde9-029db703de0c","Type":"ContainerDied","Data":"9c7bec5cf83a5f20b4e28424e00ad6d0f02223c263310bf37af1486f8fea52e0"} Dec 03 07:13:50 crc kubenswrapper[4475]: I1203 07:13:50.981190 4475 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-x268p" Dec 03 07:13:51 crc kubenswrapper[4475]: I1203 07:13:51.117672 4475 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4hg7f\" (UniqueName: \"kubernetes.io/projected/5a731f88-ae84-4c9c-bde9-029db703de0c-kube-api-access-4hg7f\") pod \"5a731f88-ae84-4c9c-bde9-029db703de0c\" (UID: \"5a731f88-ae84-4c9c-bde9-029db703de0c\") " Dec 03 07:13:51 crc kubenswrapper[4475]: I1203 07:13:51.117950 4475 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/5a731f88-ae84-4c9c-bde9-029db703de0c-inventory\") pod \"5a731f88-ae84-4c9c-bde9-029db703de0c\" (UID: \"5a731f88-ae84-4c9c-bde9-029db703de0c\") " Dec 03 07:13:51 crc kubenswrapper[4475]: I1203 07:13:51.118129 4475 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: 
\"kubernetes.io/secret/5a731f88-ae84-4c9c-bde9-029db703de0c-ssh-key\") pod \"5a731f88-ae84-4c9c-bde9-029db703de0c\" (UID: \"5a731f88-ae84-4c9c-bde9-029db703de0c\") " Dec 03 07:13:51 crc kubenswrapper[4475]: I1203 07:13:51.124550 4475 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5a731f88-ae84-4c9c-bde9-029db703de0c-kube-api-access-4hg7f" (OuterVolumeSpecName: "kube-api-access-4hg7f") pod "5a731f88-ae84-4c9c-bde9-029db703de0c" (UID: "5a731f88-ae84-4c9c-bde9-029db703de0c"). InnerVolumeSpecName "kube-api-access-4hg7f". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 07:13:51 crc kubenswrapper[4475]: I1203 07:13:51.138980 4475 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5a731f88-ae84-4c9c-bde9-029db703de0c-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "5a731f88-ae84-4c9c-bde9-029db703de0c" (UID: "5a731f88-ae84-4c9c-bde9-029db703de0c"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 07:13:51 crc kubenswrapper[4475]: I1203 07:13:51.139990 4475 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5a731f88-ae84-4c9c-bde9-029db703de0c-inventory" (OuterVolumeSpecName: "inventory") pod "5a731f88-ae84-4c9c-bde9-029db703de0c" (UID: "5a731f88-ae84-4c9c-bde9-029db703de0c"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 07:13:51 crc kubenswrapper[4475]: I1203 07:13:51.220861 4475 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/5a731f88-ae84-4c9c-bde9-029db703de0c-ssh-key\") on node \"crc\" DevicePath \"\"" Dec 03 07:13:51 crc kubenswrapper[4475]: I1203 07:13:51.220893 4475 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4hg7f\" (UniqueName: \"kubernetes.io/projected/5a731f88-ae84-4c9c-bde9-029db703de0c-kube-api-access-4hg7f\") on node \"crc\" DevicePath \"\"" Dec 03 07:13:51 crc kubenswrapper[4475]: I1203 07:13:51.220904 4475 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/5a731f88-ae84-4c9c-bde9-029db703de0c-inventory\") on node \"crc\" DevicePath \"\"" Dec 03 07:13:51 crc kubenswrapper[4475]: I1203 07:13:51.673497 4475 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-x268p" Dec 03 07:13:51 crc kubenswrapper[4475]: I1203 07:13:51.673370 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-x268p" event={"ID":"5a731f88-ae84-4c9c-bde9-029db703de0c","Type":"ContainerDied","Data":"071db8801ef951e12093d1093e57bf8a5a7d94bc1b4c055070d1136394fe9cac"} Dec 03 07:13:51 crc kubenswrapper[4475]: I1203 07:13:51.674552 4475 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="071db8801ef951e12093d1093e57bf8a5a7d94bc1b4c055070d1136394fe9cac" Dec 03 07:13:51 crc kubenswrapper[4475]: I1203 07:13:51.738896 4475 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-ls5n4"] Dec 03 07:13:51 crc kubenswrapper[4475]: E1203 07:13:51.739216 4475 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5a731f88-ae84-4c9c-bde9-029db703de0c" 
containerName="run-os-edpm-deployment-openstack-edpm-ipam" Dec 03 07:13:51 crc kubenswrapper[4475]: I1203 07:13:51.739232 4475 state_mem.go:107] "Deleted CPUSet assignment" podUID="5a731f88-ae84-4c9c-bde9-029db703de0c" containerName="run-os-edpm-deployment-openstack-edpm-ipam" Dec 03 07:13:51 crc kubenswrapper[4475]: I1203 07:13:51.739411 4475 memory_manager.go:354] "RemoveStaleState removing state" podUID="5a731f88-ae84-4c9c-bde9-029db703de0c" containerName="run-os-edpm-deployment-openstack-edpm-ipam" Dec 03 07:13:51 crc kubenswrapper[4475]: I1203 07:13:51.739918 4475 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-ls5n4" Dec 03 07:13:51 crc kubenswrapper[4475]: I1203 07:13:51.742069 4475 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Dec 03 07:13:51 crc kubenswrapper[4475]: I1203 07:13:51.743054 4475 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-gxv6j" Dec 03 07:13:51 crc kubenswrapper[4475]: I1203 07:13:51.743474 4475 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Dec 03 07:13:51 crc kubenswrapper[4475]: I1203 07:13:51.743569 4475 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Dec 03 07:13:51 crc kubenswrapper[4475]: I1203 07:13:51.755094 4475 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-ls5n4"] Dec 03 07:13:51 crc kubenswrapper[4475]: I1203 07:13:51.831318 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/03c7bf7a-4f44-4232-81c2-8f7f31236781-ssh-key\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-ls5n4\" (UID: \"03c7bf7a-4f44-4232-81c2-8f7f31236781\") " 
pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-ls5n4" Dec 03 07:13:51 crc kubenswrapper[4475]: I1203 07:13:51.831516 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cwmk5\" (UniqueName: \"kubernetes.io/projected/03c7bf7a-4f44-4232-81c2-8f7f31236781-kube-api-access-cwmk5\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-ls5n4\" (UID: \"03c7bf7a-4f44-4232-81c2-8f7f31236781\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-ls5n4" Dec 03 07:13:51 crc kubenswrapper[4475]: I1203 07:13:51.831653 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/03c7bf7a-4f44-4232-81c2-8f7f31236781-inventory\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-ls5n4\" (UID: \"03c7bf7a-4f44-4232-81c2-8f7f31236781\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-ls5n4" Dec 03 07:13:51 crc kubenswrapper[4475]: I1203 07:13:51.933438 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/03c7bf7a-4f44-4232-81c2-8f7f31236781-ssh-key\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-ls5n4\" (UID: \"03c7bf7a-4f44-4232-81c2-8f7f31236781\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-ls5n4" Dec 03 07:13:51 crc kubenswrapper[4475]: I1203 07:13:51.933599 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cwmk5\" (UniqueName: \"kubernetes.io/projected/03c7bf7a-4f44-4232-81c2-8f7f31236781-kube-api-access-cwmk5\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-ls5n4\" (UID: \"03c7bf7a-4f44-4232-81c2-8f7f31236781\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-ls5n4" Dec 03 07:13:51 crc kubenswrapper[4475]: I1203 07:13:51.933660 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"inventory\" (UniqueName: \"kubernetes.io/secret/03c7bf7a-4f44-4232-81c2-8f7f31236781-inventory\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-ls5n4\" (UID: \"03c7bf7a-4f44-4232-81c2-8f7f31236781\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-ls5n4" Dec 03 07:13:51 crc kubenswrapper[4475]: I1203 07:13:51.936499 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/03c7bf7a-4f44-4232-81c2-8f7f31236781-ssh-key\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-ls5n4\" (UID: \"03c7bf7a-4f44-4232-81c2-8f7f31236781\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-ls5n4" Dec 03 07:13:51 crc kubenswrapper[4475]: I1203 07:13:51.946625 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/03c7bf7a-4f44-4232-81c2-8f7f31236781-inventory\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-ls5n4\" (UID: \"03c7bf7a-4f44-4232-81c2-8f7f31236781\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-ls5n4" Dec 03 07:13:51 crc kubenswrapper[4475]: I1203 07:13:51.946712 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cwmk5\" (UniqueName: \"kubernetes.io/projected/03c7bf7a-4f44-4232-81c2-8f7f31236781-kube-api-access-cwmk5\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-ls5n4\" (UID: \"03c7bf7a-4f44-4232-81c2-8f7f31236781\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-ls5n4" Dec 03 07:13:52 crc kubenswrapper[4475]: I1203 07:13:52.052763 4475 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-ls5n4" Dec 03 07:13:52 crc kubenswrapper[4475]: I1203 07:13:52.492178 4475 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-ls5n4"] Dec 03 07:13:52 crc kubenswrapper[4475]: I1203 07:13:52.679230 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-ls5n4" event={"ID":"03c7bf7a-4f44-4232-81c2-8f7f31236781","Type":"ContainerStarted","Data":"dc5730cc006dc2684713bf2f87fd6e8d9df617935e32e00062bee89cc3ccddeb"} Dec 03 07:13:53 crc kubenswrapper[4475]: I1203 07:13:53.687175 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-ls5n4" event={"ID":"03c7bf7a-4f44-4232-81c2-8f7f31236781","Type":"ContainerStarted","Data":"7799c14a44d43aab19df114217c7addba9130a6fae2d4b4710903c2282a4040f"} Dec 03 07:13:53 crc kubenswrapper[4475]: I1203 07:13:53.701579 4475 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-ls5n4" podStartSLOduration=2.276749408 podStartE2EDuration="2.70156689s" podCreationTimestamp="2025-12-03 07:13:51 +0000 UTC" firstStartedPulling="2025-12-03 07:13:52.498803505 +0000 UTC m=+1717.303701839" lastFinishedPulling="2025-12-03 07:13:52.923620987 +0000 UTC m=+1717.728519321" observedRunningTime="2025-12-03 07:13:53.697048162 +0000 UTC m=+1718.501946496" watchObservedRunningTime="2025-12-03 07:13:53.70156689 +0000 UTC m=+1718.506465223" Dec 03 07:13:57 crc kubenswrapper[4475]: I1203 07:13:57.491959 4475 scope.go:117] "RemoveContainer" containerID="8c0d1f1df6ca180fa9eee37943bf61d5a3966b5f7a3ed0b6213a7a47c187d104" Dec 03 07:13:57 crc kubenswrapper[4475]: E1203 07:13:57.492510 4475 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 
5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-tjbzg_openshift-machine-config-operator(91aee7be-4a52-4598-803f-2deebe0674de)\"" pod="openshift-machine-config-operator/machine-config-daemon-tjbzg" podUID="91aee7be-4a52-4598-803f-2deebe0674de" Dec 03 07:14:00 crc kubenswrapper[4475]: I1203 07:14:00.726845 4475 generic.go:334] "Generic (PLEG): container finished" podID="03c7bf7a-4f44-4232-81c2-8f7f31236781" containerID="7799c14a44d43aab19df114217c7addba9130a6fae2d4b4710903c2282a4040f" exitCode=0 Dec 03 07:14:00 crc kubenswrapper[4475]: I1203 07:14:00.727031 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-ls5n4" event={"ID":"03c7bf7a-4f44-4232-81c2-8f7f31236781","Type":"ContainerDied","Data":"7799c14a44d43aab19df114217c7addba9130a6fae2d4b4710903c2282a4040f"} Dec 03 07:14:02 crc kubenswrapper[4475]: I1203 07:14:02.023158 4475 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-ls5n4" Dec 03 07:14:02 crc kubenswrapper[4475]: I1203 07:14:02.192696 4475 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cwmk5\" (UniqueName: \"kubernetes.io/projected/03c7bf7a-4f44-4232-81c2-8f7f31236781-kube-api-access-cwmk5\") pod \"03c7bf7a-4f44-4232-81c2-8f7f31236781\" (UID: \"03c7bf7a-4f44-4232-81c2-8f7f31236781\") " Dec 03 07:14:02 crc kubenswrapper[4475]: I1203 07:14:02.192880 4475 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/03c7bf7a-4f44-4232-81c2-8f7f31236781-ssh-key\") pod \"03c7bf7a-4f44-4232-81c2-8f7f31236781\" (UID: \"03c7bf7a-4f44-4232-81c2-8f7f31236781\") " Dec 03 07:14:02 crc kubenswrapper[4475]: I1203 07:14:02.192981 4475 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: 
\"kubernetes.io/secret/03c7bf7a-4f44-4232-81c2-8f7f31236781-inventory\") pod \"03c7bf7a-4f44-4232-81c2-8f7f31236781\" (UID: \"03c7bf7a-4f44-4232-81c2-8f7f31236781\") " Dec 03 07:14:02 crc kubenswrapper[4475]: I1203 07:14:02.196847 4475 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/03c7bf7a-4f44-4232-81c2-8f7f31236781-kube-api-access-cwmk5" (OuterVolumeSpecName: "kube-api-access-cwmk5") pod "03c7bf7a-4f44-4232-81c2-8f7f31236781" (UID: "03c7bf7a-4f44-4232-81c2-8f7f31236781"). InnerVolumeSpecName "kube-api-access-cwmk5". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 07:14:02 crc kubenswrapper[4475]: I1203 07:14:02.213222 4475 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/03c7bf7a-4f44-4232-81c2-8f7f31236781-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "03c7bf7a-4f44-4232-81c2-8f7f31236781" (UID: "03c7bf7a-4f44-4232-81c2-8f7f31236781"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 07:14:02 crc kubenswrapper[4475]: I1203 07:14:02.213921 4475 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/03c7bf7a-4f44-4232-81c2-8f7f31236781-inventory" (OuterVolumeSpecName: "inventory") pod "03c7bf7a-4f44-4232-81c2-8f7f31236781" (UID: "03c7bf7a-4f44-4232-81c2-8f7f31236781"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 07:14:02 crc kubenswrapper[4475]: I1203 07:14:02.295036 4475 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/03c7bf7a-4f44-4232-81c2-8f7f31236781-ssh-key\") on node \"crc\" DevicePath \"\"" Dec 03 07:14:02 crc kubenswrapper[4475]: I1203 07:14:02.295207 4475 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cwmk5\" (UniqueName: \"kubernetes.io/projected/03c7bf7a-4f44-4232-81c2-8f7f31236781-kube-api-access-cwmk5\") on node \"crc\" DevicePath \"\"" Dec 03 07:14:02 crc kubenswrapper[4475]: I1203 07:14:02.295288 4475 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/03c7bf7a-4f44-4232-81c2-8f7f31236781-inventory\") on node \"crc\" DevicePath \"\"" Dec 03 07:14:02 crc kubenswrapper[4475]: I1203 07:14:02.740019 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-ls5n4" event={"ID":"03c7bf7a-4f44-4232-81c2-8f7f31236781","Type":"ContainerDied","Data":"dc5730cc006dc2684713bf2f87fd6e8d9df617935e32e00062bee89cc3ccddeb"} Dec 03 07:14:02 crc kubenswrapper[4475]: I1203 07:14:02.740054 4475 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="dc5730cc006dc2684713bf2f87fd6e8d9df617935e32e00062bee89cc3ccddeb" Dec 03 07:14:02 crc kubenswrapper[4475]: I1203 07:14:02.740130 4475 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-ls5n4" Dec 03 07:14:02 crc kubenswrapper[4475]: I1203 07:14:02.804981 4475 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/install-certs-edpm-deployment-openstack-edpm-ipam-xbkjw"] Dec 03 07:14:02 crc kubenswrapper[4475]: E1203 07:14:02.805466 4475 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="03c7bf7a-4f44-4232-81c2-8f7f31236781" containerName="reboot-os-edpm-deployment-openstack-edpm-ipam" Dec 03 07:14:02 crc kubenswrapper[4475]: I1203 07:14:02.805550 4475 state_mem.go:107] "Deleted CPUSet assignment" podUID="03c7bf7a-4f44-4232-81c2-8f7f31236781" containerName="reboot-os-edpm-deployment-openstack-edpm-ipam" Dec 03 07:14:02 crc kubenswrapper[4475]: I1203 07:14:02.805829 4475 memory_manager.go:354] "RemoveStaleState removing state" podUID="03c7bf7a-4f44-4232-81c2-8f7f31236781" containerName="reboot-os-edpm-deployment-openstack-edpm-ipam" Dec 03 07:14:02 crc kubenswrapper[4475]: I1203 07:14:02.806443 4475 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-xbkjw" Dec 03 07:14:02 crc kubenswrapper[4475]: I1203 07:14:02.812184 4475 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Dec 03 07:14:02 crc kubenswrapper[4475]: I1203 07:14:02.812188 4475 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-neutron-metadata-default-certs-0" Dec 03 07:14:02 crc kubenswrapper[4475]: I1203 07:14:02.812393 4475 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-telemetry-default-certs-0" Dec 03 07:14:02 crc kubenswrapper[4475]: I1203 07:14:02.812396 4475 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-gxv6j" Dec 03 07:14:02 crc kubenswrapper[4475]: I1203 07:14:02.812443 4475 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-libvirt-default-certs-0" Dec 03 07:14:02 crc kubenswrapper[4475]: I1203 07:14:02.812573 4475 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Dec 03 07:14:02 crc kubenswrapper[4475]: I1203 07:14:02.812763 4475 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-ovn-default-certs-0" Dec 03 07:14:02 crc kubenswrapper[4475]: I1203 07:14:02.812906 4475 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Dec 03 07:14:02 crc kubenswrapper[4475]: I1203 07:14:02.826437 4475 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/install-certs-edpm-deployment-openstack-edpm-ipam-xbkjw"] Dec 03 07:14:02 crc kubenswrapper[4475]: I1203 07:14:02.902256 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam-ovn-default-certs-0\" (UniqueName: 
\"kubernetes.io/projected/7b0b38b7-d49c-4a53-aa77-08aef7b4b059-openstack-edpm-ipam-ovn-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-xbkjw\" (UID: \"7b0b38b7-d49c-4a53-aa77-08aef7b4b059\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-xbkjw" Dec 03 07:14:02 crc kubenswrapper[4475]: I1203 07:14:02.902321 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/7b0b38b7-d49c-4a53-aa77-08aef7b4b059-inventory\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-xbkjw\" (UID: \"7b0b38b7-d49c-4a53-aa77-08aef7b4b059\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-xbkjw" Dec 03 07:14:02 crc kubenswrapper[4475]: I1203 07:14:02.902402 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7b0b38b7-d49c-4a53-aa77-08aef7b4b059-libvirt-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-xbkjw\" (UID: \"7b0b38b7-d49c-4a53-aa77-08aef7b4b059\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-xbkjw" Dec 03 07:14:02 crc kubenswrapper[4475]: I1203 07:14:02.902446 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7b0b38b7-d49c-4a53-aa77-08aef7b4b059-repo-setup-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-xbkjw\" (UID: \"7b0b38b7-d49c-4a53-aa77-08aef7b4b059\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-xbkjw" Dec 03 07:14:02 crc kubenswrapper[4475]: I1203 07:14:02.902532 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam-neutron-metadata-default-certs-0\" (UniqueName: 
\"kubernetes.io/projected/7b0b38b7-d49c-4a53-aa77-08aef7b4b059-openstack-edpm-ipam-neutron-metadata-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-xbkjw\" (UID: \"7b0b38b7-d49c-4a53-aa77-08aef7b4b059\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-xbkjw" Dec 03 07:14:02 crc kubenswrapper[4475]: I1203 07:14:02.902589 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7b0b38b7-d49c-4a53-aa77-08aef7b4b059-neutron-metadata-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-xbkjw\" (UID: \"7b0b38b7-d49c-4a53-aa77-08aef7b4b059\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-xbkjw" Dec 03 07:14:02 crc kubenswrapper[4475]: I1203 07:14:02.902633 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pwkx2\" (UniqueName: \"kubernetes.io/projected/7b0b38b7-d49c-4a53-aa77-08aef7b4b059-kube-api-access-pwkx2\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-xbkjw\" (UID: \"7b0b38b7-d49c-4a53-aa77-08aef7b4b059\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-xbkjw" Dec 03 07:14:02 crc kubenswrapper[4475]: I1203 07:14:02.902679 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7b0b38b7-d49c-4a53-aa77-08aef7b4b059-telemetry-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-xbkjw\" (UID: \"7b0b38b7-d49c-4a53-aa77-08aef7b4b059\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-xbkjw" Dec 03 07:14:02 crc kubenswrapper[4475]: I1203 07:14:02.902706 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam-libvirt-default-certs-0\" (UniqueName: 
\"kubernetes.io/projected/7b0b38b7-d49c-4a53-aa77-08aef7b4b059-openstack-edpm-ipam-libvirt-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-xbkjw\" (UID: \"7b0b38b7-d49c-4a53-aa77-08aef7b4b059\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-xbkjw" Dec 03 07:14:02 crc kubenswrapper[4475]: I1203 07:14:02.902724 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7b0b38b7-d49c-4a53-aa77-08aef7b4b059-ovn-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-xbkjw\" (UID: \"7b0b38b7-d49c-4a53-aa77-08aef7b4b059\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-xbkjw" Dec 03 07:14:02 crc kubenswrapper[4475]: I1203 07:14:02.902756 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7b0b38b7-d49c-4a53-aa77-08aef7b4b059-nova-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-xbkjw\" (UID: \"7b0b38b7-d49c-4a53-aa77-08aef7b4b059\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-xbkjw" Dec 03 07:14:02 crc kubenswrapper[4475]: I1203 07:14:02.902775 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/7b0b38b7-d49c-4a53-aa77-08aef7b4b059-ssh-key\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-xbkjw\" (UID: \"7b0b38b7-d49c-4a53-aa77-08aef7b4b059\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-xbkjw" Dec 03 07:14:02 crc kubenswrapper[4475]: I1203 07:14:02.902869 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7b0b38b7-d49c-4a53-aa77-08aef7b4b059-bootstrap-combined-ca-bundle\") pod 
\"install-certs-edpm-deployment-openstack-edpm-ipam-xbkjw\" (UID: \"7b0b38b7-d49c-4a53-aa77-08aef7b4b059\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-xbkjw" Dec 03 07:14:02 crc kubenswrapper[4475]: I1203 07:14:02.902918 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam-telemetry-default-certs-0\" (UniqueName: \"kubernetes.io/projected/7b0b38b7-d49c-4a53-aa77-08aef7b4b059-openstack-edpm-ipam-telemetry-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-xbkjw\" (UID: \"7b0b38b7-d49c-4a53-aa77-08aef7b4b059\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-xbkjw" Dec 03 07:14:03 crc kubenswrapper[4475]: I1203 07:14:03.003803 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7b0b38b7-d49c-4a53-aa77-08aef7b4b059-bootstrap-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-xbkjw\" (UID: \"7b0b38b7-d49c-4a53-aa77-08aef7b4b059\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-xbkjw" Dec 03 07:14:03 crc kubenswrapper[4475]: I1203 07:14:03.003841 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam-telemetry-default-certs-0\" (UniqueName: \"kubernetes.io/projected/7b0b38b7-d49c-4a53-aa77-08aef7b4b059-openstack-edpm-ipam-telemetry-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-xbkjw\" (UID: \"7b0b38b7-d49c-4a53-aa77-08aef7b4b059\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-xbkjw" Dec 03 07:14:03 crc kubenswrapper[4475]: I1203 07:14:03.003877 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam-ovn-default-certs-0\" (UniqueName: \"kubernetes.io/projected/7b0b38b7-d49c-4a53-aa77-08aef7b4b059-openstack-edpm-ipam-ovn-default-certs-0\") pod 
\"install-certs-edpm-deployment-openstack-edpm-ipam-xbkjw\" (UID: \"7b0b38b7-d49c-4a53-aa77-08aef7b4b059\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-xbkjw" Dec 03 07:14:03 crc kubenswrapper[4475]: I1203 07:14:03.003914 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/7b0b38b7-d49c-4a53-aa77-08aef7b4b059-inventory\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-xbkjw\" (UID: \"7b0b38b7-d49c-4a53-aa77-08aef7b4b059\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-xbkjw" Dec 03 07:14:03 crc kubenswrapper[4475]: I1203 07:14:03.003955 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7b0b38b7-d49c-4a53-aa77-08aef7b4b059-libvirt-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-xbkjw\" (UID: \"7b0b38b7-d49c-4a53-aa77-08aef7b4b059\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-xbkjw" Dec 03 07:14:03 crc kubenswrapper[4475]: I1203 07:14:03.003986 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7b0b38b7-d49c-4a53-aa77-08aef7b4b059-repo-setup-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-xbkjw\" (UID: \"7b0b38b7-d49c-4a53-aa77-08aef7b4b059\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-xbkjw" Dec 03 07:14:03 crc kubenswrapper[4475]: I1203 07:14:03.004024 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam-neutron-metadata-default-certs-0\" (UniqueName: \"kubernetes.io/projected/7b0b38b7-d49c-4a53-aa77-08aef7b4b059-openstack-edpm-ipam-neutron-metadata-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-xbkjw\" (UID: \"7b0b38b7-d49c-4a53-aa77-08aef7b4b059\") " 
pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-xbkjw" Dec 03 07:14:03 crc kubenswrapper[4475]: I1203 07:14:03.004066 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7b0b38b7-d49c-4a53-aa77-08aef7b4b059-neutron-metadata-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-xbkjw\" (UID: \"7b0b38b7-d49c-4a53-aa77-08aef7b4b059\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-xbkjw" Dec 03 07:14:03 crc kubenswrapper[4475]: I1203 07:14:03.004098 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pwkx2\" (UniqueName: \"kubernetes.io/projected/7b0b38b7-d49c-4a53-aa77-08aef7b4b059-kube-api-access-pwkx2\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-xbkjw\" (UID: \"7b0b38b7-d49c-4a53-aa77-08aef7b4b059\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-xbkjw" Dec 03 07:14:03 crc kubenswrapper[4475]: I1203 07:14:03.004147 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7b0b38b7-d49c-4a53-aa77-08aef7b4b059-telemetry-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-xbkjw\" (UID: \"7b0b38b7-d49c-4a53-aa77-08aef7b4b059\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-xbkjw" Dec 03 07:14:03 crc kubenswrapper[4475]: I1203 07:14:03.004171 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam-libvirt-default-certs-0\" (UniqueName: \"kubernetes.io/projected/7b0b38b7-d49c-4a53-aa77-08aef7b4b059-openstack-edpm-ipam-libvirt-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-xbkjw\" (UID: \"7b0b38b7-d49c-4a53-aa77-08aef7b4b059\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-xbkjw" Dec 03 07:14:03 crc 
kubenswrapper[4475]: I1203 07:14:03.004189 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7b0b38b7-d49c-4a53-aa77-08aef7b4b059-ovn-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-xbkjw\" (UID: \"7b0b38b7-d49c-4a53-aa77-08aef7b4b059\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-xbkjw" Dec 03 07:14:03 crc kubenswrapper[4475]: I1203 07:14:03.004213 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7b0b38b7-d49c-4a53-aa77-08aef7b4b059-nova-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-xbkjw\" (UID: \"7b0b38b7-d49c-4a53-aa77-08aef7b4b059\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-xbkjw" Dec 03 07:14:03 crc kubenswrapper[4475]: I1203 07:14:03.004239 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/7b0b38b7-d49c-4a53-aa77-08aef7b4b059-ssh-key\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-xbkjw\" (UID: \"7b0b38b7-d49c-4a53-aa77-08aef7b4b059\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-xbkjw" Dec 03 07:14:03 crc kubenswrapper[4475]: I1203 07:14:03.007596 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam-neutron-metadata-default-certs-0\" (UniqueName: \"kubernetes.io/projected/7b0b38b7-d49c-4a53-aa77-08aef7b4b059-openstack-edpm-ipam-neutron-metadata-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-xbkjw\" (UID: \"7b0b38b7-d49c-4a53-aa77-08aef7b4b059\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-xbkjw" Dec 03 07:14:03 crc kubenswrapper[4475]: I1203 07:14:03.008282 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bootstrap-combined-ca-bundle\" 
(UniqueName: \"kubernetes.io/secret/7b0b38b7-d49c-4a53-aa77-08aef7b4b059-bootstrap-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-xbkjw\" (UID: \"7b0b38b7-d49c-4a53-aa77-08aef7b4b059\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-xbkjw" Dec 03 07:14:03 crc kubenswrapper[4475]: I1203 07:14:03.008378 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/7b0b38b7-d49c-4a53-aa77-08aef7b4b059-inventory\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-xbkjw\" (UID: \"7b0b38b7-d49c-4a53-aa77-08aef7b4b059\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-xbkjw" Dec 03 07:14:03 crc kubenswrapper[4475]: I1203 07:14:03.008732 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam-ovn-default-certs-0\" (UniqueName: \"kubernetes.io/projected/7b0b38b7-d49c-4a53-aa77-08aef7b4b059-openstack-edpm-ipam-ovn-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-xbkjw\" (UID: \"7b0b38b7-d49c-4a53-aa77-08aef7b4b059\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-xbkjw" Dec 03 07:14:03 crc kubenswrapper[4475]: I1203 07:14:03.008834 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7b0b38b7-d49c-4a53-aa77-08aef7b4b059-neutron-metadata-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-xbkjw\" (UID: \"7b0b38b7-d49c-4a53-aa77-08aef7b4b059\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-xbkjw" Dec 03 07:14:03 crc kubenswrapper[4475]: I1203 07:14:03.009381 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam-libvirt-default-certs-0\" (UniqueName: \"kubernetes.io/projected/7b0b38b7-d49c-4a53-aa77-08aef7b4b059-openstack-edpm-ipam-libvirt-default-certs-0\") pod 
\"install-certs-edpm-deployment-openstack-edpm-ipam-xbkjw\" (UID: \"7b0b38b7-d49c-4a53-aa77-08aef7b4b059\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-xbkjw" Dec 03 07:14:03 crc kubenswrapper[4475]: I1203 07:14:03.009437 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam-telemetry-default-certs-0\" (UniqueName: \"kubernetes.io/projected/7b0b38b7-d49c-4a53-aa77-08aef7b4b059-openstack-edpm-ipam-telemetry-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-xbkjw\" (UID: \"7b0b38b7-d49c-4a53-aa77-08aef7b4b059\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-xbkjw" Dec 03 07:14:03 crc kubenswrapper[4475]: I1203 07:14:03.009914 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7b0b38b7-d49c-4a53-aa77-08aef7b4b059-repo-setup-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-xbkjw\" (UID: \"7b0b38b7-d49c-4a53-aa77-08aef7b4b059\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-xbkjw" Dec 03 07:14:03 crc kubenswrapper[4475]: I1203 07:14:03.010061 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/7b0b38b7-d49c-4a53-aa77-08aef7b4b059-ssh-key\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-xbkjw\" (UID: \"7b0b38b7-d49c-4a53-aa77-08aef7b4b059\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-xbkjw" Dec 03 07:14:03 crc kubenswrapper[4475]: I1203 07:14:03.010572 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7b0b38b7-d49c-4a53-aa77-08aef7b4b059-ovn-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-xbkjw\" (UID: \"7b0b38b7-d49c-4a53-aa77-08aef7b4b059\") " 
pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-xbkjw" Dec 03 07:14:03 crc kubenswrapper[4475]: I1203 07:14:03.011108 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7b0b38b7-d49c-4a53-aa77-08aef7b4b059-nova-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-xbkjw\" (UID: \"7b0b38b7-d49c-4a53-aa77-08aef7b4b059\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-xbkjw" Dec 03 07:14:03 crc kubenswrapper[4475]: I1203 07:14:03.011777 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7b0b38b7-d49c-4a53-aa77-08aef7b4b059-telemetry-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-xbkjw\" (UID: \"7b0b38b7-d49c-4a53-aa77-08aef7b4b059\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-xbkjw" Dec 03 07:14:03 crc kubenswrapper[4475]: I1203 07:14:03.012726 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7b0b38b7-d49c-4a53-aa77-08aef7b4b059-libvirt-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-xbkjw\" (UID: \"7b0b38b7-d49c-4a53-aa77-08aef7b4b059\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-xbkjw" Dec 03 07:14:03 crc kubenswrapper[4475]: I1203 07:14:03.017333 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pwkx2\" (UniqueName: \"kubernetes.io/projected/7b0b38b7-d49c-4a53-aa77-08aef7b4b059-kube-api-access-pwkx2\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-xbkjw\" (UID: \"7b0b38b7-d49c-4a53-aa77-08aef7b4b059\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-xbkjw" Dec 03 07:14:03 crc kubenswrapper[4475]: I1203 07:14:03.125627 4475 util.go:30] "No sandbox for pod can be 
found. Need to start a new one" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-xbkjw" Dec 03 07:14:03 crc kubenswrapper[4475]: I1203 07:14:03.586391 4475 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/install-certs-edpm-deployment-openstack-edpm-ipam-xbkjw"] Dec 03 07:14:03 crc kubenswrapper[4475]: I1203 07:14:03.746582 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-xbkjw" event={"ID":"7b0b38b7-d49c-4a53-aa77-08aef7b4b059","Type":"ContainerStarted","Data":"bfdeb4dce7ddcaebcfebdbef0b37f3975714c5c1ea8fb9b76066d8d669c69323"} Dec 03 07:14:04 crc kubenswrapper[4475]: I1203 07:14:04.752860 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-xbkjw" event={"ID":"7b0b38b7-d49c-4a53-aa77-08aef7b4b059","Type":"ContainerStarted","Data":"c496000ee59c8d5c86e73f5d6ca772d45c491e8833d85e350460528d312cbe38"} Dec 03 07:14:04 crc kubenswrapper[4475]: I1203 07:14:04.772020 4475 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-xbkjw" podStartSLOduration=2.278046944 podStartE2EDuration="2.772006686s" podCreationTimestamp="2025-12-03 07:14:02 +0000 UTC" firstStartedPulling="2025-12-03 07:14:03.588255189 +0000 UTC m=+1728.393153522" lastFinishedPulling="2025-12-03 07:14:04.08221493 +0000 UTC m=+1728.887113264" observedRunningTime="2025-12-03 07:14:04.768675911 +0000 UTC m=+1729.573574246" watchObservedRunningTime="2025-12-03 07:14:04.772006686 +0000 UTC m=+1729.576905021" Dec 03 07:14:11 crc kubenswrapper[4475]: I1203 07:14:11.490768 4475 scope.go:117] "RemoveContainer" containerID="8c0d1f1df6ca180fa9eee37943bf61d5a3966b5f7a3ed0b6213a7a47c187d104" Dec 03 07:14:11 crc kubenswrapper[4475]: E1203 07:14:11.491221 4475 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with 
CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-tjbzg_openshift-machine-config-operator(91aee7be-4a52-4598-803f-2deebe0674de)\"" pod="openshift-machine-config-operator/machine-config-daemon-tjbzg" podUID="91aee7be-4a52-4598-803f-2deebe0674de" Dec 03 07:14:24 crc kubenswrapper[4475]: I1203 07:14:24.491950 4475 scope.go:117] "RemoveContainer" containerID="8c0d1f1df6ca180fa9eee37943bf61d5a3966b5f7a3ed0b6213a7a47c187d104" Dec 03 07:14:24 crc kubenswrapper[4475]: E1203 07:14:24.492483 4475 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-tjbzg_openshift-machine-config-operator(91aee7be-4a52-4598-803f-2deebe0674de)\"" pod="openshift-machine-config-operator/machine-config-daemon-tjbzg" podUID="91aee7be-4a52-4598-803f-2deebe0674de" Dec 03 07:14:30 crc kubenswrapper[4475]: I1203 07:14:30.907891 4475 generic.go:334] "Generic (PLEG): container finished" podID="7b0b38b7-d49c-4a53-aa77-08aef7b4b059" containerID="c496000ee59c8d5c86e73f5d6ca772d45c491e8833d85e350460528d312cbe38" exitCode=0 Dec 03 07:14:30 crc kubenswrapper[4475]: I1203 07:14:30.907967 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-xbkjw" event={"ID":"7b0b38b7-d49c-4a53-aa77-08aef7b4b059","Type":"ContainerDied","Data":"c496000ee59c8d5c86e73f5d6ca772d45c491e8833d85e350460528d312cbe38"} Dec 03 07:14:32 crc kubenswrapper[4475]: I1203 07:14:32.206352 4475 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-xbkjw" Dec 03 07:14:32 crc kubenswrapper[4475]: I1203 07:14:32.368910 4475 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-edpm-ipam-libvirt-default-certs-0\" (UniqueName: \"kubernetes.io/projected/7b0b38b7-d49c-4a53-aa77-08aef7b4b059-openstack-edpm-ipam-libvirt-default-certs-0\") pod \"7b0b38b7-d49c-4a53-aa77-08aef7b4b059\" (UID: \"7b0b38b7-d49c-4a53-aa77-08aef7b4b059\") " Dec 03 07:14:32 crc kubenswrapper[4475]: I1203 07:14:32.368956 4475 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7b0b38b7-d49c-4a53-aa77-08aef7b4b059-neutron-metadata-combined-ca-bundle\") pod \"7b0b38b7-d49c-4a53-aa77-08aef7b4b059\" (UID: \"7b0b38b7-d49c-4a53-aa77-08aef7b4b059\") " Dec 03 07:14:32 crc kubenswrapper[4475]: I1203 07:14:32.368985 4475 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7b0b38b7-d49c-4a53-aa77-08aef7b4b059-nova-combined-ca-bundle\") pod \"7b0b38b7-d49c-4a53-aa77-08aef7b4b059\" (UID: \"7b0b38b7-d49c-4a53-aa77-08aef7b4b059\") " Dec 03 07:14:32 crc kubenswrapper[4475]: I1203 07:14:32.369016 4475 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7b0b38b7-d49c-4a53-aa77-08aef7b4b059-libvirt-combined-ca-bundle\") pod \"7b0b38b7-d49c-4a53-aa77-08aef7b4b059\" (UID: \"7b0b38b7-d49c-4a53-aa77-08aef7b4b059\") " Dec 03 07:14:32 crc kubenswrapper[4475]: I1203 07:14:32.369045 4475 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-edpm-ipam-ovn-default-certs-0\" (UniqueName: \"kubernetes.io/projected/7b0b38b7-d49c-4a53-aa77-08aef7b4b059-openstack-edpm-ipam-ovn-default-certs-0\") pod \"7b0b38b7-d49c-4a53-aa77-08aef7b4b059\" 
(UID: \"7b0b38b7-d49c-4a53-aa77-08aef7b4b059\") " Dec 03 07:14:32 crc kubenswrapper[4475]: I1203 07:14:32.369073 4475 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-edpm-ipam-neutron-metadata-default-certs-0\" (UniqueName: \"kubernetes.io/projected/7b0b38b7-d49c-4a53-aa77-08aef7b4b059-openstack-edpm-ipam-neutron-metadata-default-certs-0\") pod \"7b0b38b7-d49c-4a53-aa77-08aef7b4b059\" (UID: \"7b0b38b7-d49c-4a53-aa77-08aef7b4b059\") " Dec 03 07:14:32 crc kubenswrapper[4475]: I1203 07:14:32.369106 4475 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/7b0b38b7-d49c-4a53-aa77-08aef7b4b059-ssh-key\") pod \"7b0b38b7-d49c-4a53-aa77-08aef7b4b059\" (UID: \"7b0b38b7-d49c-4a53-aa77-08aef7b4b059\") " Dec 03 07:14:32 crc kubenswrapper[4475]: I1203 07:14:32.369150 4475 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-edpm-ipam-telemetry-default-certs-0\" (UniqueName: \"kubernetes.io/projected/7b0b38b7-d49c-4a53-aa77-08aef7b4b059-openstack-edpm-ipam-telemetry-default-certs-0\") pod \"7b0b38b7-d49c-4a53-aa77-08aef7b4b059\" (UID: \"7b0b38b7-d49c-4a53-aa77-08aef7b4b059\") " Dec 03 07:14:32 crc kubenswrapper[4475]: I1203 07:14:32.369184 4475 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7b0b38b7-d49c-4a53-aa77-08aef7b4b059-ovn-combined-ca-bundle\") pod \"7b0b38b7-d49c-4a53-aa77-08aef7b4b059\" (UID: \"7b0b38b7-d49c-4a53-aa77-08aef7b4b059\") " Dec 03 07:14:32 crc kubenswrapper[4475]: I1203 07:14:32.369235 4475 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pwkx2\" (UniqueName: \"kubernetes.io/projected/7b0b38b7-d49c-4a53-aa77-08aef7b4b059-kube-api-access-pwkx2\") pod \"7b0b38b7-d49c-4a53-aa77-08aef7b4b059\" (UID: \"7b0b38b7-d49c-4a53-aa77-08aef7b4b059\") " Dec 03 07:14:32 crc 
kubenswrapper[4475]: I1203 07:14:32.369329 4475 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7b0b38b7-d49c-4a53-aa77-08aef7b4b059-repo-setup-combined-ca-bundle\") pod \"7b0b38b7-d49c-4a53-aa77-08aef7b4b059\" (UID: \"7b0b38b7-d49c-4a53-aa77-08aef7b4b059\") " Dec 03 07:14:32 crc kubenswrapper[4475]: I1203 07:14:32.369362 4475 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7b0b38b7-d49c-4a53-aa77-08aef7b4b059-bootstrap-combined-ca-bundle\") pod \"7b0b38b7-d49c-4a53-aa77-08aef7b4b059\" (UID: \"7b0b38b7-d49c-4a53-aa77-08aef7b4b059\") " Dec 03 07:14:32 crc kubenswrapper[4475]: I1203 07:14:32.369383 4475 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7b0b38b7-d49c-4a53-aa77-08aef7b4b059-telemetry-combined-ca-bundle\") pod \"7b0b38b7-d49c-4a53-aa77-08aef7b4b059\" (UID: \"7b0b38b7-d49c-4a53-aa77-08aef7b4b059\") " Dec 03 07:14:32 crc kubenswrapper[4475]: I1203 07:14:32.369413 4475 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/7b0b38b7-d49c-4a53-aa77-08aef7b4b059-inventory\") pod \"7b0b38b7-d49c-4a53-aa77-08aef7b4b059\" (UID: \"7b0b38b7-d49c-4a53-aa77-08aef7b4b059\") " Dec 03 07:14:32 crc kubenswrapper[4475]: I1203 07:14:32.375845 4475 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7b0b38b7-d49c-4a53-aa77-08aef7b4b059-bootstrap-combined-ca-bundle" (OuterVolumeSpecName: "bootstrap-combined-ca-bundle") pod "7b0b38b7-d49c-4a53-aa77-08aef7b4b059" (UID: "7b0b38b7-d49c-4a53-aa77-08aef7b4b059"). InnerVolumeSpecName "bootstrap-combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 07:14:32 crc kubenswrapper[4475]: I1203 07:14:32.376053 4475 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7b0b38b7-d49c-4a53-aa77-08aef7b4b059-ovn-combined-ca-bundle" (OuterVolumeSpecName: "ovn-combined-ca-bundle") pod "7b0b38b7-d49c-4a53-aa77-08aef7b4b059" (UID: "7b0b38b7-d49c-4a53-aa77-08aef7b4b059"). InnerVolumeSpecName "ovn-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 07:14:32 crc kubenswrapper[4475]: I1203 07:14:32.376129 4475 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7b0b38b7-d49c-4a53-aa77-08aef7b4b059-repo-setup-combined-ca-bundle" (OuterVolumeSpecName: "repo-setup-combined-ca-bundle") pod "7b0b38b7-d49c-4a53-aa77-08aef7b4b059" (UID: "7b0b38b7-d49c-4a53-aa77-08aef7b4b059"). InnerVolumeSpecName "repo-setup-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 07:14:32 crc kubenswrapper[4475]: I1203 07:14:32.376253 4475 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7b0b38b7-d49c-4a53-aa77-08aef7b4b059-openstack-edpm-ipam-neutron-metadata-default-certs-0" (OuterVolumeSpecName: "openstack-edpm-ipam-neutron-metadata-default-certs-0") pod "7b0b38b7-d49c-4a53-aa77-08aef7b4b059" (UID: "7b0b38b7-d49c-4a53-aa77-08aef7b4b059"). InnerVolumeSpecName "openstack-edpm-ipam-neutron-metadata-default-certs-0". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 07:14:32 crc kubenswrapper[4475]: I1203 07:14:32.376357 4475 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7b0b38b7-d49c-4a53-aa77-08aef7b4b059-openstack-edpm-ipam-ovn-default-certs-0" (OuterVolumeSpecName: "openstack-edpm-ipam-ovn-default-certs-0") pod "7b0b38b7-d49c-4a53-aa77-08aef7b4b059" (UID: "7b0b38b7-d49c-4a53-aa77-08aef7b4b059"). 
InnerVolumeSpecName "openstack-edpm-ipam-ovn-default-certs-0". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 07:14:32 crc kubenswrapper[4475]: I1203 07:14:32.376653 4475 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7b0b38b7-d49c-4a53-aa77-08aef7b4b059-openstack-edpm-ipam-libvirt-default-certs-0" (OuterVolumeSpecName: "openstack-edpm-ipam-libvirt-default-certs-0") pod "7b0b38b7-d49c-4a53-aa77-08aef7b4b059" (UID: "7b0b38b7-d49c-4a53-aa77-08aef7b4b059"). InnerVolumeSpecName "openstack-edpm-ipam-libvirt-default-certs-0". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 07:14:32 crc kubenswrapper[4475]: I1203 07:14:32.376958 4475 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7b0b38b7-d49c-4a53-aa77-08aef7b4b059-neutron-metadata-combined-ca-bundle" (OuterVolumeSpecName: "neutron-metadata-combined-ca-bundle") pod "7b0b38b7-d49c-4a53-aa77-08aef7b4b059" (UID: "7b0b38b7-d49c-4a53-aa77-08aef7b4b059"). InnerVolumeSpecName "neutron-metadata-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 07:14:32 crc kubenswrapper[4475]: I1203 07:14:32.377389 4475 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7b0b38b7-d49c-4a53-aa77-08aef7b4b059-telemetry-combined-ca-bundle" (OuterVolumeSpecName: "telemetry-combined-ca-bundle") pod "7b0b38b7-d49c-4a53-aa77-08aef7b4b059" (UID: "7b0b38b7-d49c-4a53-aa77-08aef7b4b059"). InnerVolumeSpecName "telemetry-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 07:14:32 crc kubenswrapper[4475]: I1203 07:14:32.377713 4475 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7b0b38b7-d49c-4a53-aa77-08aef7b4b059-kube-api-access-pwkx2" (OuterVolumeSpecName: "kube-api-access-pwkx2") pod "7b0b38b7-d49c-4a53-aa77-08aef7b4b059" (UID: "7b0b38b7-d49c-4a53-aa77-08aef7b4b059"). 
InnerVolumeSpecName "kube-api-access-pwkx2". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 07:14:32 crc kubenswrapper[4475]: I1203 07:14:32.378536 4475 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7b0b38b7-d49c-4a53-aa77-08aef7b4b059-openstack-edpm-ipam-telemetry-default-certs-0" (OuterVolumeSpecName: "openstack-edpm-ipam-telemetry-default-certs-0") pod "7b0b38b7-d49c-4a53-aa77-08aef7b4b059" (UID: "7b0b38b7-d49c-4a53-aa77-08aef7b4b059"). InnerVolumeSpecName "openstack-edpm-ipam-telemetry-default-certs-0". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 07:14:32 crc kubenswrapper[4475]: I1203 07:14:32.378626 4475 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7b0b38b7-d49c-4a53-aa77-08aef7b4b059-nova-combined-ca-bundle" (OuterVolumeSpecName: "nova-combined-ca-bundle") pod "7b0b38b7-d49c-4a53-aa77-08aef7b4b059" (UID: "7b0b38b7-d49c-4a53-aa77-08aef7b4b059"). InnerVolumeSpecName "nova-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 07:14:32 crc kubenswrapper[4475]: I1203 07:14:32.382388 4475 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7b0b38b7-d49c-4a53-aa77-08aef7b4b059-libvirt-combined-ca-bundle" (OuterVolumeSpecName: "libvirt-combined-ca-bundle") pod "7b0b38b7-d49c-4a53-aa77-08aef7b4b059" (UID: "7b0b38b7-d49c-4a53-aa77-08aef7b4b059"). InnerVolumeSpecName "libvirt-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 07:14:32 crc kubenswrapper[4475]: I1203 07:14:32.396691 4475 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7b0b38b7-d49c-4a53-aa77-08aef7b4b059-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "7b0b38b7-d49c-4a53-aa77-08aef7b4b059" (UID: "7b0b38b7-d49c-4a53-aa77-08aef7b4b059"). InnerVolumeSpecName "ssh-key". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 07:14:32 crc kubenswrapper[4475]: I1203 07:14:32.396883 4475 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7b0b38b7-d49c-4a53-aa77-08aef7b4b059-inventory" (OuterVolumeSpecName: "inventory") pod "7b0b38b7-d49c-4a53-aa77-08aef7b4b059" (UID: "7b0b38b7-d49c-4a53-aa77-08aef7b4b059"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 07:14:32 crc kubenswrapper[4475]: I1203 07:14:32.471109 4475 reconciler_common.go:293] "Volume detached for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7b0b38b7-d49c-4a53-aa77-08aef7b4b059-neutron-metadata-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 03 07:14:32 crc kubenswrapper[4475]: I1203 07:14:32.471140 4475 reconciler_common.go:293] "Volume detached for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7b0b38b7-d49c-4a53-aa77-08aef7b4b059-nova-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 03 07:14:32 crc kubenswrapper[4475]: I1203 07:14:32.471150 4475 reconciler_common.go:293] "Volume detached for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7b0b38b7-d49c-4a53-aa77-08aef7b4b059-libvirt-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 03 07:14:32 crc kubenswrapper[4475]: I1203 07:14:32.471159 4475 reconciler_common.go:293] "Volume detached for volume \"openstack-edpm-ipam-ovn-default-certs-0\" (UniqueName: \"kubernetes.io/projected/7b0b38b7-d49c-4a53-aa77-08aef7b4b059-openstack-edpm-ipam-ovn-default-certs-0\") on node \"crc\" DevicePath \"\"" Dec 03 07:14:32 crc kubenswrapper[4475]: I1203 07:14:32.471169 4475 reconciler_common.go:293] "Volume detached for volume \"openstack-edpm-ipam-neutron-metadata-default-certs-0\" (UniqueName: \"kubernetes.io/projected/7b0b38b7-d49c-4a53-aa77-08aef7b4b059-openstack-edpm-ipam-neutron-metadata-default-certs-0\") on node \"crc\" 
DevicePath \"\"" Dec 03 07:14:32 crc kubenswrapper[4475]: I1203 07:14:32.471178 4475 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/7b0b38b7-d49c-4a53-aa77-08aef7b4b059-ssh-key\") on node \"crc\" DevicePath \"\"" Dec 03 07:14:32 crc kubenswrapper[4475]: I1203 07:14:32.471186 4475 reconciler_common.go:293] "Volume detached for volume \"openstack-edpm-ipam-telemetry-default-certs-0\" (UniqueName: \"kubernetes.io/projected/7b0b38b7-d49c-4a53-aa77-08aef7b4b059-openstack-edpm-ipam-telemetry-default-certs-0\") on node \"crc\" DevicePath \"\"" Dec 03 07:14:32 crc kubenswrapper[4475]: I1203 07:14:32.471195 4475 reconciler_common.go:293] "Volume detached for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7b0b38b7-d49c-4a53-aa77-08aef7b4b059-ovn-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 03 07:14:32 crc kubenswrapper[4475]: I1203 07:14:32.471202 4475 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pwkx2\" (UniqueName: \"kubernetes.io/projected/7b0b38b7-d49c-4a53-aa77-08aef7b4b059-kube-api-access-pwkx2\") on node \"crc\" DevicePath \"\"" Dec 03 07:14:32 crc kubenswrapper[4475]: I1203 07:14:32.471217 4475 reconciler_common.go:293] "Volume detached for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7b0b38b7-d49c-4a53-aa77-08aef7b4b059-repo-setup-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 03 07:14:32 crc kubenswrapper[4475]: I1203 07:14:32.471225 4475 reconciler_common.go:293] "Volume detached for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7b0b38b7-d49c-4a53-aa77-08aef7b4b059-bootstrap-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 03 07:14:32 crc kubenswrapper[4475]: I1203 07:14:32.471233 4475 reconciler_common.go:293] "Volume detached for volume \"telemetry-combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/7b0b38b7-d49c-4a53-aa77-08aef7b4b059-telemetry-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 03 07:14:32 crc kubenswrapper[4475]: I1203 07:14:32.471242 4475 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/7b0b38b7-d49c-4a53-aa77-08aef7b4b059-inventory\") on node \"crc\" DevicePath \"\"" Dec 03 07:14:32 crc kubenswrapper[4475]: I1203 07:14:32.471249 4475 reconciler_common.go:293] "Volume detached for volume \"openstack-edpm-ipam-libvirt-default-certs-0\" (UniqueName: \"kubernetes.io/projected/7b0b38b7-d49c-4a53-aa77-08aef7b4b059-openstack-edpm-ipam-libvirt-default-certs-0\") on node \"crc\" DevicePath \"\"" Dec 03 07:14:32 crc kubenswrapper[4475]: I1203 07:14:32.923277 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-xbkjw" event={"ID":"7b0b38b7-d49c-4a53-aa77-08aef7b4b059","Type":"ContainerDied","Data":"bfdeb4dce7ddcaebcfebdbef0b37f3975714c5c1ea8fb9b76066d8d669c69323"} Dec 03 07:14:32 crc kubenswrapper[4475]: I1203 07:14:32.923310 4475 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="bfdeb4dce7ddcaebcfebdbef0b37f3975714c5c1ea8fb9b76066d8d669c69323" Dec 03 07:14:32 crc kubenswrapper[4475]: I1203 07:14:32.923340 4475 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-xbkjw" Dec 03 07:14:32 crc kubenswrapper[4475]: I1203 07:14:32.999508 4475 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-edpm-deployment-openstack-edpm-ipam-f9z7p"] Dec 03 07:14:32 crc kubenswrapper[4475]: E1203 07:14:32.999819 4475 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7b0b38b7-d49c-4a53-aa77-08aef7b4b059" containerName="install-certs-edpm-deployment-openstack-edpm-ipam" Dec 03 07:14:32 crc kubenswrapper[4475]: I1203 07:14:32.999841 4475 state_mem.go:107] "Deleted CPUSet assignment" podUID="7b0b38b7-d49c-4a53-aa77-08aef7b4b059" containerName="install-certs-edpm-deployment-openstack-edpm-ipam" Dec 03 07:14:33 crc kubenswrapper[4475]: I1203 07:14:33.000041 4475 memory_manager.go:354] "RemoveStaleState removing state" podUID="7b0b38b7-d49c-4a53-aa77-08aef7b4b059" containerName="install-certs-edpm-deployment-openstack-edpm-ipam" Dec 03 07:14:33 crc kubenswrapper[4475]: I1203 07:14:33.000591 4475 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-f9z7p" Dec 03 07:14:33 crc kubenswrapper[4475]: I1203 07:14:33.002022 4475 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Dec 03 07:14:33 crc kubenswrapper[4475]: I1203 07:14:33.003011 4475 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Dec 03 07:14:33 crc kubenswrapper[4475]: I1203 07:14:33.003346 4475 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-config" Dec 03 07:14:33 crc kubenswrapper[4475]: I1203 07:14:33.003677 4475 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Dec 03 07:14:33 crc kubenswrapper[4475]: I1203 07:14:33.004673 4475 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-gxv6j" Dec 03 07:14:33 crc kubenswrapper[4475]: I1203 07:14:33.019264 4475 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-edpm-deployment-openstack-edpm-ipam-f9z7p"] Dec 03 07:14:33 crc kubenswrapper[4475]: I1203 07:14:33.079888 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/ea23edf8-c790-42f0-80cd-b62fdd382faa-inventory\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-f9z7p\" (UID: \"ea23edf8-c790-42f0-80cd-b62fdd382faa\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-f9z7p" Dec 03 07:14:33 crc kubenswrapper[4475]: I1203 07:14:33.079931 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/ea23edf8-c790-42f0-80cd-b62fdd382faa-ovncontroller-config-0\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-f9z7p\" (UID: \"ea23edf8-c790-42f0-80cd-b62fdd382faa\") " 
pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-f9z7p" Dec 03 07:14:33 crc kubenswrapper[4475]: I1203 07:14:33.080008 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qnbgd\" (UniqueName: \"kubernetes.io/projected/ea23edf8-c790-42f0-80cd-b62fdd382faa-kube-api-access-qnbgd\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-f9z7p\" (UID: \"ea23edf8-c790-42f0-80cd-b62fdd382faa\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-f9z7p" Dec 03 07:14:33 crc kubenswrapper[4475]: I1203 07:14:33.080172 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ea23edf8-c790-42f0-80cd-b62fdd382faa-ovn-combined-ca-bundle\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-f9z7p\" (UID: \"ea23edf8-c790-42f0-80cd-b62fdd382faa\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-f9z7p" Dec 03 07:14:33 crc kubenswrapper[4475]: I1203 07:14:33.080337 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/ea23edf8-c790-42f0-80cd-b62fdd382faa-ssh-key\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-f9z7p\" (UID: \"ea23edf8-c790-42f0-80cd-b62fdd382faa\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-f9z7p" Dec 03 07:14:33 crc kubenswrapper[4475]: I1203 07:14:33.181858 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/ea23edf8-c790-42f0-80cd-b62fdd382faa-inventory\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-f9z7p\" (UID: \"ea23edf8-c790-42f0-80cd-b62fdd382faa\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-f9z7p" Dec 03 07:14:33 crc kubenswrapper[4475]: I1203 07:14:33.181907 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovncontroller-config-0\" (UniqueName: 
\"kubernetes.io/configmap/ea23edf8-c790-42f0-80cd-b62fdd382faa-ovncontroller-config-0\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-f9z7p\" (UID: \"ea23edf8-c790-42f0-80cd-b62fdd382faa\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-f9z7p" Dec 03 07:14:33 crc kubenswrapper[4475]: I1203 07:14:33.181943 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qnbgd\" (UniqueName: \"kubernetes.io/projected/ea23edf8-c790-42f0-80cd-b62fdd382faa-kube-api-access-qnbgd\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-f9z7p\" (UID: \"ea23edf8-c790-42f0-80cd-b62fdd382faa\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-f9z7p" Dec 03 07:14:33 crc kubenswrapper[4475]: I1203 07:14:33.181976 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ea23edf8-c790-42f0-80cd-b62fdd382faa-ovn-combined-ca-bundle\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-f9z7p\" (UID: \"ea23edf8-c790-42f0-80cd-b62fdd382faa\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-f9z7p" Dec 03 07:14:33 crc kubenswrapper[4475]: I1203 07:14:33.182311 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/ea23edf8-c790-42f0-80cd-b62fdd382faa-ssh-key\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-f9z7p\" (UID: \"ea23edf8-c790-42f0-80cd-b62fdd382faa\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-f9z7p" Dec 03 07:14:33 crc kubenswrapper[4475]: I1203 07:14:33.182785 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/ea23edf8-c790-42f0-80cd-b62fdd382faa-ovncontroller-config-0\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-f9z7p\" (UID: \"ea23edf8-c790-42f0-80cd-b62fdd382faa\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-f9z7p" Dec 03 07:14:33 crc 
kubenswrapper[4475]: I1203 07:14:33.184756 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/ea23edf8-c790-42f0-80cd-b62fdd382faa-inventory\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-f9z7p\" (UID: \"ea23edf8-c790-42f0-80cd-b62fdd382faa\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-f9z7p" Dec 03 07:14:33 crc kubenswrapper[4475]: I1203 07:14:33.185405 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/ea23edf8-c790-42f0-80cd-b62fdd382faa-ssh-key\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-f9z7p\" (UID: \"ea23edf8-c790-42f0-80cd-b62fdd382faa\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-f9z7p" Dec 03 07:14:33 crc kubenswrapper[4475]: I1203 07:14:33.186091 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ea23edf8-c790-42f0-80cd-b62fdd382faa-ovn-combined-ca-bundle\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-f9z7p\" (UID: \"ea23edf8-c790-42f0-80cd-b62fdd382faa\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-f9z7p" Dec 03 07:14:33 crc kubenswrapper[4475]: I1203 07:14:33.195965 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qnbgd\" (UniqueName: \"kubernetes.io/projected/ea23edf8-c790-42f0-80cd-b62fdd382faa-kube-api-access-qnbgd\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-f9z7p\" (UID: \"ea23edf8-c790-42f0-80cd-b62fdd382faa\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-f9z7p" Dec 03 07:14:33 crc kubenswrapper[4475]: I1203 07:14:33.318554 4475 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-f9z7p" Dec 03 07:14:33 crc kubenswrapper[4475]: I1203 07:14:33.732439 4475 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-edpm-deployment-openstack-edpm-ipam-f9z7p"] Dec 03 07:14:33 crc kubenswrapper[4475]: I1203 07:14:33.930717 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-f9z7p" event={"ID":"ea23edf8-c790-42f0-80cd-b62fdd382faa","Type":"ContainerStarted","Data":"24b92e464f9a000d3c527f8638f124dbdae3898d5e07cbc972ec037cd15f1235"} Dec 03 07:14:34 crc kubenswrapper[4475]: I1203 07:14:34.938405 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-f9z7p" event={"ID":"ea23edf8-c790-42f0-80cd-b62fdd382faa","Type":"ContainerStarted","Data":"4d65ec2e9f1db710d7ebaa89138bc99573fd15e2e1f668ef8550b96880c9021f"} Dec 03 07:14:37 crc kubenswrapper[4475]: I1203 07:14:37.490923 4475 scope.go:117] "RemoveContainer" containerID="8c0d1f1df6ca180fa9eee37943bf61d5a3966b5f7a3ed0b6213a7a47c187d104" Dec 03 07:14:37 crc kubenswrapper[4475]: E1203 07:14:37.491630 4475 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-tjbzg_openshift-machine-config-operator(91aee7be-4a52-4598-803f-2deebe0674de)\"" pod="openshift-machine-config-operator/machine-config-daemon-tjbzg" podUID="91aee7be-4a52-4598-803f-2deebe0674de" Dec 03 07:14:48 crc kubenswrapper[4475]: I1203 07:14:48.491710 4475 scope.go:117] "RemoveContainer" containerID="8c0d1f1df6ca180fa9eee37943bf61d5a3966b5f7a3ed0b6213a7a47c187d104" Dec 03 07:14:48 crc kubenswrapper[4475]: E1203 07:14:48.492228 4475 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 
5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-tjbzg_openshift-machine-config-operator(91aee7be-4a52-4598-803f-2deebe0674de)\"" pod="openshift-machine-config-operator/machine-config-daemon-tjbzg" podUID="91aee7be-4a52-4598-803f-2deebe0674de" Dec 03 07:14:59 crc kubenswrapper[4475]: I1203 07:14:59.491626 4475 scope.go:117] "RemoveContainer" containerID="8c0d1f1df6ca180fa9eee37943bf61d5a3966b5f7a3ed0b6213a7a47c187d104" Dec 03 07:14:59 crc kubenswrapper[4475]: E1203 07:14:59.492164 4475 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-tjbzg_openshift-machine-config-operator(91aee7be-4a52-4598-803f-2deebe0674de)\"" pod="openshift-machine-config-operator/machine-config-daemon-tjbzg" podUID="91aee7be-4a52-4598-803f-2deebe0674de" Dec 03 07:15:00 crc kubenswrapper[4475]: I1203 07:15:00.135018 4475 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-f9z7p" podStartSLOduration=27.56069541 podStartE2EDuration="28.135004816s" podCreationTimestamp="2025-12-03 07:14:32 +0000 UTC" firstStartedPulling="2025-12-03 07:14:33.737821128 +0000 UTC m=+1758.542719462" lastFinishedPulling="2025-12-03 07:14:34.312130533 +0000 UTC m=+1759.117028868" observedRunningTime="2025-12-03 07:14:34.966409672 +0000 UTC m=+1759.771308006" watchObservedRunningTime="2025-12-03 07:15:00.135004816 +0000 UTC m=+1784.939903150" Dec 03 07:15:00 crc kubenswrapper[4475]: I1203 07:15:00.136859 4475 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29412435-rzbxs"] Dec 03 07:15:00 crc kubenswrapper[4475]: I1203 07:15:00.137831 4475 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29412435-rzbxs" Dec 03 07:15:00 crc kubenswrapper[4475]: I1203 07:15:00.140072 4475 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Dec 03 07:15:00 crc kubenswrapper[4475]: I1203 07:15:00.146162 4475 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Dec 03 07:15:00 crc kubenswrapper[4475]: I1203 07:15:00.147290 4475 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29412435-rzbxs"] Dec 03 07:15:00 crc kubenswrapper[4475]: I1203 07:15:00.328161 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/fa7323af-2ea5-42bf-8a4b-2ddebc8d768a-config-volume\") pod \"collect-profiles-29412435-rzbxs\" (UID: \"fa7323af-2ea5-42bf-8a4b-2ddebc8d768a\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29412435-rzbxs" Dec 03 07:15:00 crc kubenswrapper[4475]: I1203 07:15:00.328211 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/fa7323af-2ea5-42bf-8a4b-2ddebc8d768a-secret-volume\") pod \"collect-profiles-29412435-rzbxs\" (UID: \"fa7323af-2ea5-42bf-8a4b-2ddebc8d768a\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29412435-rzbxs" Dec 03 07:15:00 crc kubenswrapper[4475]: I1203 07:15:00.328406 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dk7fl\" (UniqueName: \"kubernetes.io/projected/fa7323af-2ea5-42bf-8a4b-2ddebc8d768a-kube-api-access-dk7fl\") pod \"collect-profiles-29412435-rzbxs\" (UID: \"fa7323af-2ea5-42bf-8a4b-2ddebc8d768a\") " 
pod="openshift-operator-lifecycle-manager/collect-profiles-29412435-rzbxs" Dec 03 07:15:00 crc kubenswrapper[4475]: I1203 07:15:00.430382 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dk7fl\" (UniqueName: \"kubernetes.io/projected/fa7323af-2ea5-42bf-8a4b-2ddebc8d768a-kube-api-access-dk7fl\") pod \"collect-profiles-29412435-rzbxs\" (UID: \"fa7323af-2ea5-42bf-8a4b-2ddebc8d768a\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29412435-rzbxs" Dec 03 07:15:00 crc kubenswrapper[4475]: I1203 07:15:00.430584 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/fa7323af-2ea5-42bf-8a4b-2ddebc8d768a-config-volume\") pod \"collect-profiles-29412435-rzbxs\" (UID: \"fa7323af-2ea5-42bf-8a4b-2ddebc8d768a\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29412435-rzbxs" Dec 03 07:15:00 crc kubenswrapper[4475]: I1203 07:15:00.430613 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/fa7323af-2ea5-42bf-8a4b-2ddebc8d768a-secret-volume\") pod \"collect-profiles-29412435-rzbxs\" (UID: \"fa7323af-2ea5-42bf-8a4b-2ddebc8d768a\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29412435-rzbxs" Dec 03 07:15:00 crc kubenswrapper[4475]: I1203 07:15:00.431304 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/fa7323af-2ea5-42bf-8a4b-2ddebc8d768a-config-volume\") pod \"collect-profiles-29412435-rzbxs\" (UID: \"fa7323af-2ea5-42bf-8a4b-2ddebc8d768a\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29412435-rzbxs" Dec 03 07:15:00 crc kubenswrapper[4475]: I1203 07:15:00.436714 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: 
\"kubernetes.io/secret/fa7323af-2ea5-42bf-8a4b-2ddebc8d768a-secret-volume\") pod \"collect-profiles-29412435-rzbxs\" (UID: \"fa7323af-2ea5-42bf-8a4b-2ddebc8d768a\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29412435-rzbxs" Dec 03 07:15:00 crc kubenswrapper[4475]: I1203 07:15:00.443729 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dk7fl\" (UniqueName: \"kubernetes.io/projected/fa7323af-2ea5-42bf-8a4b-2ddebc8d768a-kube-api-access-dk7fl\") pod \"collect-profiles-29412435-rzbxs\" (UID: \"fa7323af-2ea5-42bf-8a4b-2ddebc8d768a\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29412435-rzbxs" Dec 03 07:15:00 crc kubenswrapper[4475]: I1203 07:15:00.455785 4475 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29412435-rzbxs" Dec 03 07:15:00 crc kubenswrapper[4475]: I1203 07:15:00.830784 4475 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29412435-rzbxs"] Dec 03 07:15:01 crc kubenswrapper[4475]: I1203 07:15:01.097481 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29412435-rzbxs" event={"ID":"fa7323af-2ea5-42bf-8a4b-2ddebc8d768a","Type":"ContainerStarted","Data":"87e3d0ebb402d584bab645096ae1ecfa5c68480559f38d9f1b94649a4623759f"} Dec 03 07:15:01 crc kubenswrapper[4475]: I1203 07:15:01.097962 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29412435-rzbxs" event={"ID":"fa7323af-2ea5-42bf-8a4b-2ddebc8d768a","Type":"ContainerStarted","Data":"b2ed84262f8cce04eb6303ab1e8696478c8251e65b20c24bd1d3ed2211ea2377"} Dec 03 07:15:01 crc kubenswrapper[4475]: I1203 07:15:01.122011 4475 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/collect-profiles-29412435-rzbxs" 
podStartSLOduration=1.121996849 podStartE2EDuration="1.121996849s" podCreationTimestamp="2025-12-03 07:15:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 07:15:01.118858798 +0000 UTC m=+1785.923757132" watchObservedRunningTime="2025-12-03 07:15:01.121996849 +0000 UTC m=+1785.926895184" Dec 03 07:15:02 crc kubenswrapper[4475]: I1203 07:15:02.103860 4475 generic.go:334] "Generic (PLEG): container finished" podID="fa7323af-2ea5-42bf-8a4b-2ddebc8d768a" containerID="87e3d0ebb402d584bab645096ae1ecfa5c68480559f38d9f1b94649a4623759f" exitCode=0 Dec 03 07:15:02 crc kubenswrapper[4475]: I1203 07:15:02.103905 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29412435-rzbxs" event={"ID":"fa7323af-2ea5-42bf-8a4b-2ddebc8d768a","Type":"ContainerDied","Data":"87e3d0ebb402d584bab645096ae1ecfa5c68480559f38d9f1b94649a4623759f"} Dec 03 07:15:03 crc kubenswrapper[4475]: I1203 07:15:03.365896 4475 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29412435-rzbxs" Dec 03 07:15:03 crc kubenswrapper[4475]: I1203 07:15:03.478820 4475 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dk7fl\" (UniqueName: \"kubernetes.io/projected/fa7323af-2ea5-42bf-8a4b-2ddebc8d768a-kube-api-access-dk7fl\") pod \"fa7323af-2ea5-42bf-8a4b-2ddebc8d768a\" (UID: \"fa7323af-2ea5-42bf-8a4b-2ddebc8d768a\") " Dec 03 07:15:03 crc kubenswrapper[4475]: I1203 07:15:03.478978 4475 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/fa7323af-2ea5-42bf-8a4b-2ddebc8d768a-secret-volume\") pod \"fa7323af-2ea5-42bf-8a4b-2ddebc8d768a\" (UID: \"fa7323af-2ea5-42bf-8a4b-2ddebc8d768a\") " Dec 03 07:15:03 crc kubenswrapper[4475]: I1203 07:15:03.479023 4475 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/fa7323af-2ea5-42bf-8a4b-2ddebc8d768a-config-volume\") pod \"fa7323af-2ea5-42bf-8a4b-2ddebc8d768a\" (UID: \"fa7323af-2ea5-42bf-8a4b-2ddebc8d768a\") " Dec 03 07:15:03 crc kubenswrapper[4475]: I1203 07:15:03.479588 4475 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fa7323af-2ea5-42bf-8a4b-2ddebc8d768a-config-volume" (OuterVolumeSpecName: "config-volume") pod "fa7323af-2ea5-42bf-8a4b-2ddebc8d768a" (UID: "fa7323af-2ea5-42bf-8a4b-2ddebc8d768a"). InnerVolumeSpecName "config-volume". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 07:15:03 crc kubenswrapper[4475]: I1203 07:15:03.479874 4475 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/fa7323af-2ea5-42bf-8a4b-2ddebc8d768a-config-volume\") on node \"crc\" DevicePath \"\"" Dec 03 07:15:03 crc kubenswrapper[4475]: I1203 07:15:03.483426 4475 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fa7323af-2ea5-42bf-8a4b-2ddebc8d768a-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "fa7323af-2ea5-42bf-8a4b-2ddebc8d768a" (UID: "fa7323af-2ea5-42bf-8a4b-2ddebc8d768a"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 07:15:03 crc kubenswrapper[4475]: I1203 07:15:03.484976 4475 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fa7323af-2ea5-42bf-8a4b-2ddebc8d768a-kube-api-access-dk7fl" (OuterVolumeSpecName: "kube-api-access-dk7fl") pod "fa7323af-2ea5-42bf-8a4b-2ddebc8d768a" (UID: "fa7323af-2ea5-42bf-8a4b-2ddebc8d768a"). InnerVolumeSpecName "kube-api-access-dk7fl". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 07:15:03 crc kubenswrapper[4475]: I1203 07:15:03.581610 4475 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dk7fl\" (UniqueName: \"kubernetes.io/projected/fa7323af-2ea5-42bf-8a4b-2ddebc8d768a-kube-api-access-dk7fl\") on node \"crc\" DevicePath \"\"" Dec 03 07:15:03 crc kubenswrapper[4475]: I1203 07:15:03.581740 4475 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/fa7323af-2ea5-42bf-8a4b-2ddebc8d768a-secret-volume\") on node \"crc\" DevicePath \"\"" Dec 03 07:15:04 crc kubenswrapper[4475]: I1203 07:15:04.118161 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29412435-rzbxs" event={"ID":"fa7323af-2ea5-42bf-8a4b-2ddebc8d768a","Type":"ContainerDied","Data":"b2ed84262f8cce04eb6303ab1e8696478c8251e65b20c24bd1d3ed2211ea2377"} Dec 03 07:15:04 crc kubenswrapper[4475]: I1203 07:15:04.118373 4475 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b2ed84262f8cce04eb6303ab1e8696478c8251e65b20c24bd1d3ed2211ea2377" Dec 03 07:15:04 crc kubenswrapper[4475]: I1203 07:15:04.118238 4475 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29412435-rzbxs" Dec 03 07:15:12 crc kubenswrapper[4475]: I1203 07:15:12.491759 4475 scope.go:117] "RemoveContainer" containerID="8c0d1f1df6ca180fa9eee37943bf61d5a3966b5f7a3ed0b6213a7a47c187d104" Dec 03 07:15:12 crc kubenswrapper[4475]: E1203 07:15:12.492136 4475 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-tjbzg_openshift-machine-config-operator(91aee7be-4a52-4598-803f-2deebe0674de)\"" pod="openshift-machine-config-operator/machine-config-daemon-tjbzg" podUID="91aee7be-4a52-4598-803f-2deebe0674de" Dec 03 07:15:19 crc kubenswrapper[4475]: I1203 07:15:19.215312 4475 generic.go:334] "Generic (PLEG): container finished" podID="ea23edf8-c790-42f0-80cd-b62fdd382faa" containerID="4d65ec2e9f1db710d7ebaa89138bc99573fd15e2e1f668ef8550b96880c9021f" exitCode=0 Dec 03 07:15:19 crc kubenswrapper[4475]: I1203 07:15:19.215383 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-f9z7p" event={"ID":"ea23edf8-c790-42f0-80cd-b62fdd382faa","Type":"ContainerDied","Data":"4d65ec2e9f1db710d7ebaa89138bc99573fd15e2e1f668ef8550b96880c9021f"} Dec 03 07:15:20 crc kubenswrapper[4475]: I1203 07:15:20.525627 4475 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-f9z7p" Dec 03 07:15:20 crc kubenswrapper[4475]: I1203 07:15:20.686018 4475 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/ea23edf8-c790-42f0-80cd-b62fdd382faa-ovncontroller-config-0\") pod \"ea23edf8-c790-42f0-80cd-b62fdd382faa\" (UID: \"ea23edf8-c790-42f0-80cd-b62fdd382faa\") " Dec 03 07:15:20 crc kubenswrapper[4475]: I1203 07:15:20.686140 4475 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/ea23edf8-c790-42f0-80cd-b62fdd382faa-inventory\") pod \"ea23edf8-c790-42f0-80cd-b62fdd382faa\" (UID: \"ea23edf8-c790-42f0-80cd-b62fdd382faa\") " Dec 03 07:15:20 crc kubenswrapper[4475]: I1203 07:15:20.686257 4475 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/ea23edf8-c790-42f0-80cd-b62fdd382faa-ssh-key\") pod \"ea23edf8-c790-42f0-80cd-b62fdd382faa\" (UID: \"ea23edf8-c790-42f0-80cd-b62fdd382faa\") " Dec 03 07:15:20 crc kubenswrapper[4475]: I1203 07:15:20.686281 4475 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qnbgd\" (UniqueName: \"kubernetes.io/projected/ea23edf8-c790-42f0-80cd-b62fdd382faa-kube-api-access-qnbgd\") pod \"ea23edf8-c790-42f0-80cd-b62fdd382faa\" (UID: \"ea23edf8-c790-42f0-80cd-b62fdd382faa\") " Dec 03 07:15:20 crc kubenswrapper[4475]: I1203 07:15:20.686351 4475 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ea23edf8-c790-42f0-80cd-b62fdd382faa-ovn-combined-ca-bundle\") pod \"ea23edf8-c790-42f0-80cd-b62fdd382faa\" (UID: \"ea23edf8-c790-42f0-80cd-b62fdd382faa\") " Dec 03 07:15:20 crc kubenswrapper[4475]: I1203 07:15:20.690406 4475 operation_generator.go:803] UnmountVolume.TearDown succeeded 
for volume "kubernetes.io/projected/ea23edf8-c790-42f0-80cd-b62fdd382faa-kube-api-access-qnbgd" (OuterVolumeSpecName: "kube-api-access-qnbgd") pod "ea23edf8-c790-42f0-80cd-b62fdd382faa" (UID: "ea23edf8-c790-42f0-80cd-b62fdd382faa"). InnerVolumeSpecName "kube-api-access-qnbgd". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 07:15:20 crc kubenswrapper[4475]: I1203 07:15:20.690830 4475 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ea23edf8-c790-42f0-80cd-b62fdd382faa-ovn-combined-ca-bundle" (OuterVolumeSpecName: "ovn-combined-ca-bundle") pod "ea23edf8-c790-42f0-80cd-b62fdd382faa" (UID: "ea23edf8-c790-42f0-80cd-b62fdd382faa"). InnerVolumeSpecName "ovn-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 07:15:20 crc kubenswrapper[4475]: I1203 07:15:20.707442 4475 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ea23edf8-c790-42f0-80cd-b62fdd382faa-inventory" (OuterVolumeSpecName: "inventory") pod "ea23edf8-c790-42f0-80cd-b62fdd382faa" (UID: "ea23edf8-c790-42f0-80cd-b62fdd382faa"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 07:15:20 crc kubenswrapper[4475]: E1203 07:15:20.707545 4475 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/ea23edf8-c790-42f0-80cd-b62fdd382faa-ovncontroller-config-0 podName:ea23edf8-c790-42f0-80cd-b62fdd382faa nodeName:}" failed. No retries permitted until 2025-12-03 07:15:21.206667189 +0000 UTC m=+1806.011565543 (durationBeforeRetry 500ms). 
Error: error cleaning subPath mounts for volume "ovncontroller-config-0" (UniqueName: "kubernetes.io/configmap/ea23edf8-c790-42f0-80cd-b62fdd382faa-ovncontroller-config-0") pod "ea23edf8-c790-42f0-80cd-b62fdd382faa" (UID: "ea23edf8-c790-42f0-80cd-b62fdd382faa") : error deleting /var/lib/kubelet/pods/ea23edf8-c790-42f0-80cd-b62fdd382faa/volume-subpaths: remove /var/lib/kubelet/pods/ea23edf8-c790-42f0-80cd-b62fdd382faa/volume-subpaths: no such file or directory Dec 03 07:15:20 crc kubenswrapper[4475]: I1203 07:15:20.708202 4475 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ea23edf8-c790-42f0-80cd-b62fdd382faa-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "ea23edf8-c790-42f0-80cd-b62fdd382faa" (UID: "ea23edf8-c790-42f0-80cd-b62fdd382faa"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 07:15:20 crc kubenswrapper[4475]: I1203 07:15:20.788151 4475 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/ea23edf8-c790-42f0-80cd-b62fdd382faa-inventory\") on node \"crc\" DevicePath \"\"" Dec 03 07:15:20 crc kubenswrapper[4475]: I1203 07:15:20.788191 4475 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/ea23edf8-c790-42f0-80cd-b62fdd382faa-ssh-key\") on node \"crc\" DevicePath \"\"" Dec 03 07:15:20 crc kubenswrapper[4475]: I1203 07:15:20.788200 4475 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qnbgd\" (UniqueName: \"kubernetes.io/projected/ea23edf8-c790-42f0-80cd-b62fdd382faa-kube-api-access-qnbgd\") on node \"crc\" DevicePath \"\"" Dec 03 07:15:20 crc kubenswrapper[4475]: I1203 07:15:20.788211 4475 reconciler_common.go:293] "Volume detached for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ea23edf8-c790-42f0-80cd-b62fdd382faa-ovn-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 03 07:15:21 crc kubenswrapper[4475]: I1203 
07:15:21.230418 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-f9z7p" event={"ID":"ea23edf8-c790-42f0-80cd-b62fdd382faa","Type":"ContainerDied","Data":"24b92e464f9a000d3c527f8638f124dbdae3898d5e07cbc972ec037cd15f1235"} Dec 03 07:15:21 crc kubenswrapper[4475]: I1203 07:15:21.230469 4475 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="24b92e464f9a000d3c527f8638f124dbdae3898d5e07cbc972ec037cd15f1235" Dec 03 07:15:21 crc kubenswrapper[4475]: I1203 07:15:21.230480 4475 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-f9z7p" Dec 03 07:15:21 crc kubenswrapper[4475]: I1203 07:15:21.297018 4475 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-wrx4h"] Dec 03 07:15:21 crc kubenswrapper[4475]: I1203 07:15:21.297087 4475 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/ea23edf8-c790-42f0-80cd-b62fdd382faa-ovncontroller-config-0\") pod \"ea23edf8-c790-42f0-80cd-b62fdd382faa\" (UID: \"ea23edf8-c790-42f0-80cd-b62fdd382faa\") " Dec 03 07:15:21 crc kubenswrapper[4475]: E1203 07:15:21.297348 4475 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ea23edf8-c790-42f0-80cd-b62fdd382faa" containerName="ovn-edpm-deployment-openstack-edpm-ipam" Dec 03 07:15:21 crc kubenswrapper[4475]: I1203 07:15:21.297364 4475 state_mem.go:107] "Deleted CPUSet assignment" podUID="ea23edf8-c790-42f0-80cd-b62fdd382faa" containerName="ovn-edpm-deployment-openstack-edpm-ipam" Dec 03 07:15:21 crc kubenswrapper[4475]: E1203 07:15:21.297384 4475 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fa7323af-2ea5-42bf-8a4b-2ddebc8d768a" containerName="collect-profiles" Dec 03 07:15:21 crc kubenswrapper[4475]: I1203 07:15:21.297390 4475 
state_mem.go:107] "Deleted CPUSet assignment" podUID="fa7323af-2ea5-42bf-8a4b-2ddebc8d768a" containerName="collect-profiles" Dec 03 07:15:21 crc kubenswrapper[4475]: I1203 07:15:21.297574 4475 memory_manager.go:354] "RemoveStaleState removing state" podUID="fa7323af-2ea5-42bf-8a4b-2ddebc8d768a" containerName="collect-profiles" Dec 03 07:15:21 crc kubenswrapper[4475]: I1203 07:15:21.297590 4475 memory_manager.go:354] "RemoveStaleState removing state" podUID="ea23edf8-c790-42f0-80cd-b62fdd382faa" containerName="ovn-edpm-deployment-openstack-edpm-ipam" Dec 03 07:15:21 crc kubenswrapper[4475]: I1203 07:15:21.298098 4475 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-wrx4h" Dec 03 07:15:21 crc kubenswrapper[4475]: I1203 07:15:21.298222 4475 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ea23edf8-c790-42f0-80cd-b62fdd382faa-ovncontroller-config-0" (OuterVolumeSpecName: "ovncontroller-config-0") pod "ea23edf8-c790-42f0-80cd-b62fdd382faa" (UID: "ea23edf8-c790-42f0-80cd-b62fdd382faa"). InnerVolumeSpecName "ovncontroller-config-0". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 07:15:21 crc kubenswrapper[4475]: I1203 07:15:21.303773 4475 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-ovn-metadata-agent-neutron-config" Dec 03 07:15:21 crc kubenswrapper[4475]: I1203 07:15:21.303820 4475 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-neutron-config" Dec 03 07:15:21 crc kubenswrapper[4475]: I1203 07:15:21.306216 4475 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-wrx4h"] Dec 03 07:15:21 crc kubenswrapper[4475]: I1203 07:15:21.399719 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/c31c6e8e-876d-4e95-9932-7aaa0b3c405f-inventory\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-wrx4h\" (UID: \"c31c6e8e-876d-4e95-9932-7aaa0b3c405f\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-wrx4h" Dec 03 07:15:21 crc kubenswrapper[4475]: I1203 07:15:21.399950 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/c31c6e8e-876d-4e95-9932-7aaa0b3c405f-nova-metadata-neutron-config-0\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-wrx4h\" (UID: \"c31c6e8e-876d-4e95-9932-7aaa0b3c405f\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-wrx4h" Dec 03 07:15:21 crc kubenswrapper[4475]: I1203 07:15:21.400056 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c31c6e8e-876d-4e95-9932-7aaa0b3c405f-neutron-metadata-combined-ca-bundle\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-wrx4h\" (UID: \"c31c6e8e-876d-4e95-9932-7aaa0b3c405f\") " 
pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-wrx4h" Dec 03 07:15:21 crc kubenswrapper[4475]: I1203 07:15:21.400137 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d2d4b\" (UniqueName: \"kubernetes.io/projected/c31c6e8e-876d-4e95-9932-7aaa0b3c405f-kube-api-access-d2d4b\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-wrx4h\" (UID: \"c31c6e8e-876d-4e95-9932-7aaa0b3c405f\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-wrx4h" Dec 03 07:15:21 crc kubenswrapper[4475]: I1203 07:15:21.400238 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/c31c6e8e-876d-4e95-9932-7aaa0b3c405f-neutron-ovn-metadata-agent-neutron-config-0\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-wrx4h\" (UID: \"c31c6e8e-876d-4e95-9932-7aaa0b3c405f\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-wrx4h" Dec 03 07:15:21 crc kubenswrapper[4475]: I1203 07:15:21.400413 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/c31c6e8e-876d-4e95-9932-7aaa0b3c405f-ssh-key\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-wrx4h\" (UID: \"c31c6e8e-876d-4e95-9932-7aaa0b3c405f\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-wrx4h" Dec 03 07:15:21 crc kubenswrapper[4475]: I1203 07:15:21.400586 4475 reconciler_common.go:293] "Volume detached for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/ea23edf8-c790-42f0-80cd-b62fdd382faa-ovncontroller-config-0\") on node \"crc\" DevicePath \"\"" Dec 03 07:15:21 crc kubenswrapper[4475]: I1203 07:15:21.501729 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: 
\"kubernetes.io/secret/c31c6e8e-876d-4e95-9932-7aaa0b3c405f-ssh-key\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-wrx4h\" (UID: \"c31c6e8e-876d-4e95-9932-7aaa0b3c405f\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-wrx4h" Dec 03 07:15:21 crc kubenswrapper[4475]: I1203 07:15:21.501832 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/c31c6e8e-876d-4e95-9932-7aaa0b3c405f-inventory\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-wrx4h\" (UID: \"c31c6e8e-876d-4e95-9932-7aaa0b3c405f\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-wrx4h" Dec 03 07:15:21 crc kubenswrapper[4475]: I1203 07:15:21.501851 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/c31c6e8e-876d-4e95-9932-7aaa0b3c405f-nova-metadata-neutron-config-0\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-wrx4h\" (UID: \"c31c6e8e-876d-4e95-9932-7aaa0b3c405f\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-wrx4h" Dec 03 07:15:21 crc kubenswrapper[4475]: I1203 07:15:21.501892 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c31c6e8e-876d-4e95-9932-7aaa0b3c405f-neutron-metadata-combined-ca-bundle\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-wrx4h\" (UID: \"c31c6e8e-876d-4e95-9932-7aaa0b3c405f\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-wrx4h" Dec 03 07:15:21 crc kubenswrapper[4475]: I1203 07:15:21.501911 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-d2d4b\" (UniqueName: \"kubernetes.io/projected/c31c6e8e-876d-4e95-9932-7aaa0b3c405f-kube-api-access-d2d4b\") pod 
\"neutron-metadata-edpm-deployment-openstack-edpm-ipam-wrx4h\" (UID: \"c31c6e8e-876d-4e95-9932-7aaa0b3c405f\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-wrx4h" Dec 03 07:15:21 crc kubenswrapper[4475]: I1203 07:15:21.501935 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/c31c6e8e-876d-4e95-9932-7aaa0b3c405f-neutron-ovn-metadata-agent-neutron-config-0\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-wrx4h\" (UID: \"c31c6e8e-876d-4e95-9932-7aaa0b3c405f\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-wrx4h" Dec 03 07:15:21 crc kubenswrapper[4475]: I1203 07:15:21.505034 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/c31c6e8e-876d-4e95-9932-7aaa0b3c405f-inventory\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-wrx4h\" (UID: \"c31c6e8e-876d-4e95-9932-7aaa0b3c405f\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-wrx4h" Dec 03 07:15:21 crc kubenswrapper[4475]: I1203 07:15:21.505320 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/c31c6e8e-876d-4e95-9932-7aaa0b3c405f-ssh-key\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-wrx4h\" (UID: \"c31c6e8e-876d-4e95-9932-7aaa0b3c405f\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-wrx4h" Dec 03 07:15:21 crc kubenswrapper[4475]: I1203 07:15:21.505434 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c31c6e8e-876d-4e95-9932-7aaa0b3c405f-neutron-metadata-combined-ca-bundle\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-wrx4h\" (UID: \"c31c6e8e-876d-4e95-9932-7aaa0b3c405f\") " 
pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-wrx4h" Dec 03 07:15:21 crc kubenswrapper[4475]: I1203 07:15:21.506618 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/c31c6e8e-876d-4e95-9932-7aaa0b3c405f-neutron-ovn-metadata-agent-neutron-config-0\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-wrx4h\" (UID: \"c31c6e8e-876d-4e95-9932-7aaa0b3c405f\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-wrx4h" Dec 03 07:15:21 crc kubenswrapper[4475]: I1203 07:15:21.509039 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/c31c6e8e-876d-4e95-9932-7aaa0b3c405f-nova-metadata-neutron-config-0\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-wrx4h\" (UID: \"c31c6e8e-876d-4e95-9932-7aaa0b3c405f\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-wrx4h" Dec 03 07:15:21 crc kubenswrapper[4475]: I1203 07:15:21.516672 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-d2d4b\" (UniqueName: \"kubernetes.io/projected/c31c6e8e-876d-4e95-9932-7aaa0b3c405f-kube-api-access-d2d4b\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-wrx4h\" (UID: \"c31c6e8e-876d-4e95-9932-7aaa0b3c405f\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-wrx4h" Dec 03 07:15:21 crc kubenswrapper[4475]: I1203 07:15:21.614841 4475 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-wrx4h" Dec 03 07:15:22 crc kubenswrapper[4475]: I1203 07:15:22.046495 4475 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-wrx4h"] Dec 03 07:15:22 crc kubenswrapper[4475]: I1203 07:15:22.065145 4475 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Dec 03 07:15:22 crc kubenswrapper[4475]: I1203 07:15:22.237283 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-wrx4h" event={"ID":"c31c6e8e-876d-4e95-9932-7aaa0b3c405f","Type":"ContainerStarted","Data":"7bdb6cc6990aa5cfc337670252ffabe637150cdfd338d35f75bd475286d0f786"} Dec 03 07:15:23 crc kubenswrapper[4475]: I1203 07:15:23.245811 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-wrx4h" event={"ID":"c31c6e8e-876d-4e95-9932-7aaa0b3c405f","Type":"ContainerStarted","Data":"f7c2c2aa864529b005eea58abb210c3bc5ae5524ebb0d7aab181a0aa07e3bc82"} Dec 03 07:15:23 crc kubenswrapper[4475]: I1203 07:15:23.264270 4475 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-wrx4h" podStartSLOduration=1.783206298 podStartE2EDuration="2.264257569s" podCreationTimestamp="2025-12-03 07:15:21 +0000 UTC" firstStartedPulling="2025-12-03 07:15:22.064930046 +0000 UTC m=+1806.869828371" lastFinishedPulling="2025-12-03 07:15:22.545981308 +0000 UTC m=+1807.350879642" observedRunningTime="2025-12-03 07:15:23.258921934 +0000 UTC m=+1808.063820267" watchObservedRunningTime="2025-12-03 07:15:23.264257569 +0000 UTC m=+1808.069155902" Dec 03 07:15:25 crc kubenswrapper[4475]: I1203 07:15:25.495758 4475 scope.go:117] "RemoveContainer" containerID="8c0d1f1df6ca180fa9eee37943bf61d5a3966b5f7a3ed0b6213a7a47c187d104" Dec 03 07:15:25 
crc kubenswrapper[4475]: E1203 07:15:25.496195 4475 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-tjbzg_openshift-machine-config-operator(91aee7be-4a52-4598-803f-2deebe0674de)\"" pod="openshift-machine-config-operator/machine-config-daemon-tjbzg" podUID="91aee7be-4a52-4598-803f-2deebe0674de" Dec 03 07:15:38 crc kubenswrapper[4475]: I1203 07:15:38.491532 4475 scope.go:117] "RemoveContainer" containerID="8c0d1f1df6ca180fa9eee37943bf61d5a3966b5f7a3ed0b6213a7a47c187d104" Dec 03 07:15:38 crc kubenswrapper[4475]: E1203 07:15:38.492130 4475 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-tjbzg_openshift-machine-config-operator(91aee7be-4a52-4598-803f-2deebe0674de)\"" pod="openshift-machine-config-operator/machine-config-daemon-tjbzg" podUID="91aee7be-4a52-4598-803f-2deebe0674de" Dec 03 07:15:52 crc kubenswrapper[4475]: I1203 07:15:52.491800 4475 scope.go:117] "RemoveContainer" containerID="8c0d1f1df6ca180fa9eee37943bf61d5a3966b5f7a3ed0b6213a7a47c187d104" Dec 03 07:15:52 crc kubenswrapper[4475]: E1203 07:15:52.492373 4475 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-tjbzg_openshift-machine-config-operator(91aee7be-4a52-4598-803f-2deebe0674de)\"" pod="openshift-machine-config-operator/machine-config-daemon-tjbzg" podUID="91aee7be-4a52-4598-803f-2deebe0674de" Dec 03 07:15:55 crc kubenswrapper[4475]: I1203 07:15:55.449094 4475 generic.go:334] "Generic (PLEG): container finished" podID="c31c6e8e-876d-4e95-9932-7aaa0b3c405f" 
containerID="f7c2c2aa864529b005eea58abb210c3bc5ae5524ebb0d7aab181a0aa07e3bc82" exitCode=0 Dec 03 07:15:55 crc kubenswrapper[4475]: I1203 07:15:55.449177 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-wrx4h" event={"ID":"c31c6e8e-876d-4e95-9932-7aaa0b3c405f","Type":"ContainerDied","Data":"f7c2c2aa864529b005eea58abb210c3bc5ae5524ebb0d7aab181a0aa07e3bc82"} Dec 03 07:15:56 crc kubenswrapper[4475]: I1203 07:15:56.735928 4475 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-wrx4h" Dec 03 07:15:56 crc kubenswrapper[4475]: I1203 07:15:56.850810 4475 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c31c6e8e-876d-4e95-9932-7aaa0b3c405f-neutron-metadata-combined-ca-bundle\") pod \"c31c6e8e-876d-4e95-9932-7aaa0b3c405f\" (UID: \"c31c6e8e-876d-4e95-9932-7aaa0b3c405f\") " Dec 03 07:15:56 crc kubenswrapper[4475]: I1203 07:15:56.850888 4475 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/c31c6e8e-876d-4e95-9932-7aaa0b3c405f-ssh-key\") pod \"c31c6e8e-876d-4e95-9932-7aaa0b3c405f\" (UID: \"c31c6e8e-876d-4e95-9932-7aaa0b3c405f\") " Dec 03 07:15:56 crc kubenswrapper[4475]: I1203 07:15:56.850923 4475 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/c31c6e8e-876d-4e95-9932-7aaa0b3c405f-inventory\") pod \"c31c6e8e-876d-4e95-9932-7aaa0b3c405f\" (UID: \"c31c6e8e-876d-4e95-9932-7aaa0b3c405f\") " Dec 03 07:15:56 crc kubenswrapper[4475]: I1203 07:15:56.850943 4475 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-metadata-neutron-config-0\" (UniqueName: 
\"kubernetes.io/secret/c31c6e8e-876d-4e95-9932-7aaa0b3c405f-nova-metadata-neutron-config-0\") pod \"c31c6e8e-876d-4e95-9932-7aaa0b3c405f\" (UID: \"c31c6e8e-876d-4e95-9932-7aaa0b3c405f\") " Dec 03 07:15:56 crc kubenswrapper[4475]: I1203 07:15:56.851060 4475 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/c31c6e8e-876d-4e95-9932-7aaa0b3c405f-neutron-ovn-metadata-agent-neutron-config-0\") pod \"c31c6e8e-876d-4e95-9932-7aaa0b3c405f\" (UID: \"c31c6e8e-876d-4e95-9932-7aaa0b3c405f\") " Dec 03 07:15:56 crc kubenswrapper[4475]: I1203 07:15:56.851102 4475 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d2d4b\" (UniqueName: \"kubernetes.io/projected/c31c6e8e-876d-4e95-9932-7aaa0b3c405f-kube-api-access-d2d4b\") pod \"c31c6e8e-876d-4e95-9932-7aaa0b3c405f\" (UID: \"c31c6e8e-876d-4e95-9932-7aaa0b3c405f\") " Dec 03 07:15:56 crc kubenswrapper[4475]: I1203 07:15:56.855408 4475 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c31c6e8e-876d-4e95-9932-7aaa0b3c405f-neutron-metadata-combined-ca-bundle" (OuterVolumeSpecName: "neutron-metadata-combined-ca-bundle") pod "c31c6e8e-876d-4e95-9932-7aaa0b3c405f" (UID: "c31c6e8e-876d-4e95-9932-7aaa0b3c405f"). InnerVolumeSpecName "neutron-metadata-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 07:15:56 crc kubenswrapper[4475]: I1203 07:15:56.857033 4475 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c31c6e8e-876d-4e95-9932-7aaa0b3c405f-kube-api-access-d2d4b" (OuterVolumeSpecName: "kube-api-access-d2d4b") pod "c31c6e8e-876d-4e95-9932-7aaa0b3c405f" (UID: "c31c6e8e-876d-4e95-9932-7aaa0b3c405f"). InnerVolumeSpecName "kube-api-access-d2d4b". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 07:15:56 crc kubenswrapper[4475]: I1203 07:15:56.871974 4475 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c31c6e8e-876d-4e95-9932-7aaa0b3c405f-inventory" (OuterVolumeSpecName: "inventory") pod "c31c6e8e-876d-4e95-9932-7aaa0b3c405f" (UID: "c31c6e8e-876d-4e95-9932-7aaa0b3c405f"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 07:15:56 crc kubenswrapper[4475]: I1203 07:15:56.872008 4475 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c31c6e8e-876d-4e95-9932-7aaa0b3c405f-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "c31c6e8e-876d-4e95-9932-7aaa0b3c405f" (UID: "c31c6e8e-876d-4e95-9932-7aaa0b3c405f"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 07:15:56 crc kubenswrapper[4475]: I1203 07:15:56.872536 4475 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c31c6e8e-876d-4e95-9932-7aaa0b3c405f-nova-metadata-neutron-config-0" (OuterVolumeSpecName: "nova-metadata-neutron-config-0") pod "c31c6e8e-876d-4e95-9932-7aaa0b3c405f" (UID: "c31c6e8e-876d-4e95-9932-7aaa0b3c405f"). InnerVolumeSpecName "nova-metadata-neutron-config-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 07:15:56 crc kubenswrapper[4475]: I1203 07:15:56.873041 4475 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c31c6e8e-876d-4e95-9932-7aaa0b3c405f-neutron-ovn-metadata-agent-neutron-config-0" (OuterVolumeSpecName: "neutron-ovn-metadata-agent-neutron-config-0") pod "c31c6e8e-876d-4e95-9932-7aaa0b3c405f" (UID: "c31c6e8e-876d-4e95-9932-7aaa0b3c405f"). InnerVolumeSpecName "neutron-ovn-metadata-agent-neutron-config-0". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 07:15:56 crc kubenswrapper[4475]: I1203 07:15:56.952937 4475 reconciler_common.go:293] "Volume detached for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/c31c6e8e-876d-4e95-9932-7aaa0b3c405f-nova-metadata-neutron-config-0\") on node \"crc\" DevicePath \"\"" Dec 03 07:15:56 crc kubenswrapper[4475]: I1203 07:15:56.952959 4475 reconciler_common.go:293] "Volume detached for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/c31c6e8e-876d-4e95-9932-7aaa0b3c405f-neutron-ovn-metadata-agent-neutron-config-0\") on node \"crc\" DevicePath \"\"" Dec 03 07:15:56 crc kubenswrapper[4475]: I1203 07:15:56.952971 4475 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d2d4b\" (UniqueName: \"kubernetes.io/projected/c31c6e8e-876d-4e95-9932-7aaa0b3c405f-kube-api-access-d2d4b\") on node \"crc\" DevicePath \"\"" Dec 03 07:15:56 crc kubenswrapper[4475]: I1203 07:15:56.952984 4475 reconciler_common.go:293] "Volume detached for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c31c6e8e-876d-4e95-9932-7aaa0b3c405f-neutron-metadata-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 03 07:15:56 crc kubenswrapper[4475]: I1203 07:15:56.952992 4475 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/c31c6e8e-876d-4e95-9932-7aaa0b3c405f-ssh-key\") on node \"crc\" DevicePath \"\"" Dec 03 07:15:56 crc kubenswrapper[4475]: I1203 07:15:56.952999 4475 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/c31c6e8e-876d-4e95-9932-7aaa0b3c405f-inventory\") on node \"crc\" DevicePath \"\"" Dec 03 07:15:57 crc kubenswrapper[4475]: I1203 07:15:57.462309 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-wrx4h" 
event={"ID":"c31c6e8e-876d-4e95-9932-7aaa0b3c405f","Type":"ContainerDied","Data":"7bdb6cc6990aa5cfc337670252ffabe637150cdfd338d35f75bd475286d0f786"} Dec 03 07:15:57 crc kubenswrapper[4475]: I1203 07:15:57.462340 4475 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="7bdb6cc6990aa5cfc337670252ffabe637150cdfd338d35f75bd475286d0f786" Dec 03 07:15:57 crc kubenswrapper[4475]: I1203 07:15:57.462355 4475 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-wrx4h" Dec 03 07:15:57 crc kubenswrapper[4475]: I1203 07:15:57.535544 4475 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/libvirt-edpm-deployment-openstack-edpm-ipam-bk6fp"] Dec 03 07:15:57 crc kubenswrapper[4475]: E1203 07:15:57.535843 4475 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c31c6e8e-876d-4e95-9932-7aaa0b3c405f" containerName="neutron-metadata-edpm-deployment-openstack-edpm-ipam" Dec 03 07:15:57 crc kubenswrapper[4475]: I1203 07:15:57.535859 4475 state_mem.go:107] "Deleted CPUSet assignment" podUID="c31c6e8e-876d-4e95-9932-7aaa0b3c405f" containerName="neutron-metadata-edpm-deployment-openstack-edpm-ipam" Dec 03 07:15:57 crc kubenswrapper[4475]: I1203 07:15:57.536040 4475 memory_manager.go:354] "RemoveStaleState removing state" podUID="c31c6e8e-876d-4e95-9932-7aaa0b3c405f" containerName="neutron-metadata-edpm-deployment-openstack-edpm-ipam" Dec 03 07:15:57 crc kubenswrapper[4475]: I1203 07:15:57.536651 4475 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-bk6fp" Dec 03 07:15:57 crc kubenswrapper[4475]: I1203 07:15:57.543367 4475 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"libvirt-secret" Dec 03 07:15:57 crc kubenswrapper[4475]: I1203 07:15:57.543381 4475 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Dec 03 07:15:57 crc kubenswrapper[4475]: I1203 07:15:57.543525 4475 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Dec 03 07:15:57 crc kubenswrapper[4475]: I1203 07:15:57.543598 4475 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-gxv6j" Dec 03 07:15:57 crc kubenswrapper[4475]: I1203 07:15:57.543823 4475 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Dec 03 07:15:57 crc kubenswrapper[4475]: I1203 07:15:57.543840 4475 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/libvirt-edpm-deployment-openstack-edpm-ipam-bk6fp"] Dec 03 07:15:57 crc kubenswrapper[4475]: I1203 07:15:57.563379 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z67sh\" (UniqueName: \"kubernetes.io/projected/4cfb7f1e-95d0-4374-85e1-0f0600ede996-kube-api-access-z67sh\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-bk6fp\" (UID: \"4cfb7f1e-95d0-4374-85e1-0f0600ede996\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-bk6fp" Dec 03 07:15:57 crc kubenswrapper[4475]: I1203 07:15:57.563427 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4cfb7f1e-95d0-4374-85e1-0f0600ede996-libvirt-combined-ca-bundle\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-bk6fp\" (UID: 
\"4cfb7f1e-95d0-4374-85e1-0f0600ede996\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-bk6fp" Dec 03 07:15:57 crc kubenswrapper[4475]: I1203 07:15:57.563527 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/4cfb7f1e-95d0-4374-85e1-0f0600ede996-ssh-key\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-bk6fp\" (UID: \"4cfb7f1e-95d0-4374-85e1-0f0600ede996\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-bk6fp" Dec 03 07:15:57 crc kubenswrapper[4475]: I1203 07:15:57.563557 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/4cfb7f1e-95d0-4374-85e1-0f0600ede996-libvirt-secret-0\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-bk6fp\" (UID: \"4cfb7f1e-95d0-4374-85e1-0f0600ede996\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-bk6fp" Dec 03 07:15:57 crc kubenswrapper[4475]: I1203 07:15:57.563586 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/4cfb7f1e-95d0-4374-85e1-0f0600ede996-inventory\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-bk6fp\" (UID: \"4cfb7f1e-95d0-4374-85e1-0f0600ede996\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-bk6fp" Dec 03 07:15:57 crc kubenswrapper[4475]: I1203 07:15:57.664888 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-z67sh\" (UniqueName: \"kubernetes.io/projected/4cfb7f1e-95d0-4374-85e1-0f0600ede996-kube-api-access-z67sh\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-bk6fp\" (UID: \"4cfb7f1e-95d0-4374-85e1-0f0600ede996\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-bk6fp" Dec 03 07:15:57 crc kubenswrapper[4475]: I1203 07:15:57.664940 4475 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4cfb7f1e-95d0-4374-85e1-0f0600ede996-libvirt-combined-ca-bundle\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-bk6fp\" (UID: \"4cfb7f1e-95d0-4374-85e1-0f0600ede996\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-bk6fp" Dec 03 07:15:57 crc kubenswrapper[4475]: I1203 07:15:57.664982 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/4cfb7f1e-95d0-4374-85e1-0f0600ede996-ssh-key\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-bk6fp\" (UID: \"4cfb7f1e-95d0-4374-85e1-0f0600ede996\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-bk6fp" Dec 03 07:15:57 crc kubenswrapper[4475]: I1203 07:15:57.665006 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/4cfb7f1e-95d0-4374-85e1-0f0600ede996-libvirt-secret-0\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-bk6fp\" (UID: \"4cfb7f1e-95d0-4374-85e1-0f0600ede996\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-bk6fp" Dec 03 07:15:57 crc kubenswrapper[4475]: I1203 07:15:57.665032 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/4cfb7f1e-95d0-4374-85e1-0f0600ede996-inventory\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-bk6fp\" (UID: \"4cfb7f1e-95d0-4374-85e1-0f0600ede996\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-bk6fp" Dec 03 07:15:57 crc kubenswrapper[4475]: I1203 07:15:57.668444 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/4cfb7f1e-95d0-4374-85e1-0f0600ede996-ssh-key\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-bk6fp\" (UID: \"4cfb7f1e-95d0-4374-85e1-0f0600ede996\") " 
pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-bk6fp" Dec 03 07:15:57 crc kubenswrapper[4475]: I1203 07:15:57.668613 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4cfb7f1e-95d0-4374-85e1-0f0600ede996-libvirt-combined-ca-bundle\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-bk6fp\" (UID: \"4cfb7f1e-95d0-4374-85e1-0f0600ede996\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-bk6fp" Dec 03 07:15:57 crc kubenswrapper[4475]: I1203 07:15:57.668746 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/4cfb7f1e-95d0-4374-85e1-0f0600ede996-libvirt-secret-0\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-bk6fp\" (UID: \"4cfb7f1e-95d0-4374-85e1-0f0600ede996\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-bk6fp" Dec 03 07:15:57 crc kubenswrapper[4475]: I1203 07:15:57.670096 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/4cfb7f1e-95d0-4374-85e1-0f0600ede996-inventory\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-bk6fp\" (UID: \"4cfb7f1e-95d0-4374-85e1-0f0600ede996\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-bk6fp" Dec 03 07:15:57 crc kubenswrapper[4475]: I1203 07:15:57.680630 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-z67sh\" (UniqueName: \"kubernetes.io/projected/4cfb7f1e-95d0-4374-85e1-0f0600ede996-kube-api-access-z67sh\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-bk6fp\" (UID: \"4cfb7f1e-95d0-4374-85e1-0f0600ede996\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-bk6fp" Dec 03 07:15:57 crc kubenswrapper[4475]: I1203 07:15:57.853335 4475 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-bk6fp" Dec 03 07:15:58 crc kubenswrapper[4475]: I1203 07:15:58.278262 4475 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/libvirt-edpm-deployment-openstack-edpm-ipam-bk6fp"] Dec 03 07:15:58 crc kubenswrapper[4475]: I1203 07:15:58.469813 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-bk6fp" event={"ID":"4cfb7f1e-95d0-4374-85e1-0f0600ede996","Type":"ContainerStarted","Data":"a106b9e136d8d075f92c0aa512ec4fd3df238f5133e08b79b19897db63b96de9"} Dec 03 07:15:59 crc kubenswrapper[4475]: I1203 07:15:59.476446 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-bk6fp" event={"ID":"4cfb7f1e-95d0-4374-85e1-0f0600ede996","Type":"ContainerStarted","Data":"06c8d16683bd14d0fcb87781ae84ec392987cd89a35a7d9598950aa2c05ec98b"} Dec 03 07:16:07 crc kubenswrapper[4475]: I1203 07:16:07.491895 4475 scope.go:117] "RemoveContainer" containerID="8c0d1f1df6ca180fa9eee37943bf61d5a3966b5f7a3ed0b6213a7a47c187d104" Dec 03 07:16:07 crc kubenswrapper[4475]: E1203 07:16:07.492629 4475 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-tjbzg_openshift-machine-config-operator(91aee7be-4a52-4598-803f-2deebe0674de)\"" pod="openshift-machine-config-operator/machine-config-daemon-tjbzg" podUID="91aee7be-4a52-4598-803f-2deebe0674de" Dec 03 07:16:19 crc kubenswrapper[4475]: I1203 07:16:19.491347 4475 scope.go:117] "RemoveContainer" containerID="8c0d1f1df6ca180fa9eee37943bf61d5a3966b5f7a3ed0b6213a7a47c187d104" Dec 03 07:16:19 crc kubenswrapper[4475]: E1203 07:16:19.492548 4475 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with 
CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-tjbzg_openshift-machine-config-operator(91aee7be-4a52-4598-803f-2deebe0674de)\"" pod="openshift-machine-config-operator/machine-config-daemon-tjbzg" podUID="91aee7be-4a52-4598-803f-2deebe0674de" Dec 03 07:16:34 crc kubenswrapper[4475]: I1203 07:16:34.491261 4475 scope.go:117] "RemoveContainer" containerID="8c0d1f1df6ca180fa9eee37943bf61d5a3966b5f7a3ed0b6213a7a47c187d104" Dec 03 07:16:34 crc kubenswrapper[4475]: E1203 07:16:34.492386 4475 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-tjbzg_openshift-machine-config-operator(91aee7be-4a52-4598-803f-2deebe0674de)\"" pod="openshift-machine-config-operator/machine-config-daemon-tjbzg" podUID="91aee7be-4a52-4598-803f-2deebe0674de" Dec 03 07:16:49 crc kubenswrapper[4475]: I1203 07:16:49.491036 4475 scope.go:117] "RemoveContainer" containerID="8c0d1f1df6ca180fa9eee37943bf61d5a3966b5f7a3ed0b6213a7a47c187d104" Dec 03 07:16:49 crc kubenswrapper[4475]: E1203 07:16:49.491611 4475 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-tjbzg_openshift-machine-config-operator(91aee7be-4a52-4598-803f-2deebe0674de)\"" pod="openshift-machine-config-operator/machine-config-daemon-tjbzg" podUID="91aee7be-4a52-4598-803f-2deebe0674de" Dec 03 07:17:02 crc kubenswrapper[4475]: I1203 07:17:02.491386 4475 scope.go:117] "RemoveContainer" containerID="8c0d1f1df6ca180fa9eee37943bf61d5a3966b5f7a3ed0b6213a7a47c187d104" Dec 03 07:17:02 crc kubenswrapper[4475]: E1203 07:17:02.491958 4475 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for 
\"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-tjbzg_openshift-machine-config-operator(91aee7be-4a52-4598-803f-2deebe0674de)\"" pod="openshift-machine-config-operator/machine-config-daemon-tjbzg" podUID="91aee7be-4a52-4598-803f-2deebe0674de" Dec 03 07:17:15 crc kubenswrapper[4475]: I1203 07:17:15.495635 4475 scope.go:117] "RemoveContainer" containerID="8c0d1f1df6ca180fa9eee37943bf61d5a3966b5f7a3ed0b6213a7a47c187d104" Dec 03 07:17:15 crc kubenswrapper[4475]: E1203 07:17:15.496178 4475 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-tjbzg_openshift-machine-config-operator(91aee7be-4a52-4598-803f-2deebe0674de)\"" pod="openshift-machine-config-operator/machine-config-daemon-tjbzg" podUID="91aee7be-4a52-4598-803f-2deebe0674de" Dec 03 07:17:28 crc kubenswrapper[4475]: I1203 07:17:28.491023 4475 scope.go:117] "RemoveContainer" containerID="8c0d1f1df6ca180fa9eee37943bf61d5a3966b5f7a3ed0b6213a7a47c187d104" Dec 03 07:17:28 crc kubenswrapper[4475]: E1203 07:17:28.491555 4475 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-tjbzg_openshift-machine-config-operator(91aee7be-4a52-4598-803f-2deebe0674de)\"" pod="openshift-machine-config-operator/machine-config-daemon-tjbzg" podUID="91aee7be-4a52-4598-803f-2deebe0674de" Dec 03 07:17:42 crc kubenswrapper[4475]: I1203 07:17:42.491280 4475 scope.go:117] "RemoveContainer" containerID="8c0d1f1df6ca180fa9eee37943bf61d5a3966b5f7a3ed0b6213a7a47c187d104" Dec 03 07:17:42 crc kubenswrapper[4475]: E1203 07:17:42.491897 4475 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to 
\"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-tjbzg_openshift-machine-config-operator(91aee7be-4a52-4598-803f-2deebe0674de)\"" pod="openshift-machine-config-operator/machine-config-daemon-tjbzg" podUID="91aee7be-4a52-4598-803f-2deebe0674de" Dec 03 07:17:56 crc kubenswrapper[4475]: I1203 07:17:56.490925 4475 scope.go:117] "RemoveContainer" containerID="8c0d1f1df6ca180fa9eee37943bf61d5a3966b5f7a3ed0b6213a7a47c187d104" Dec 03 07:17:56 crc kubenswrapper[4475]: E1203 07:17:56.491415 4475 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-tjbzg_openshift-machine-config-operator(91aee7be-4a52-4598-803f-2deebe0674de)\"" pod="openshift-machine-config-operator/machine-config-daemon-tjbzg" podUID="91aee7be-4a52-4598-803f-2deebe0674de" Dec 03 07:18:11 crc kubenswrapper[4475]: I1203 07:18:11.491531 4475 scope.go:117] "RemoveContainer" containerID="8c0d1f1df6ca180fa9eee37943bf61d5a3966b5f7a3ed0b6213a7a47c187d104" Dec 03 07:18:11 crc kubenswrapper[4475]: E1203 07:18:11.493911 4475 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-tjbzg_openshift-machine-config-operator(91aee7be-4a52-4598-803f-2deebe0674de)\"" pod="openshift-machine-config-operator/machine-config-daemon-tjbzg" podUID="91aee7be-4a52-4598-803f-2deebe0674de" Dec 03 07:18:23 crc kubenswrapper[4475]: I1203 07:18:23.491643 4475 scope.go:117] "RemoveContainer" containerID="8c0d1f1df6ca180fa9eee37943bf61d5a3966b5f7a3ed0b6213a7a47c187d104" Dec 03 07:18:23 crc kubenswrapper[4475]: E1203 07:18:23.492228 4475 pod_workers.go:1301] "Error syncing pod, 
skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-tjbzg_openshift-machine-config-operator(91aee7be-4a52-4598-803f-2deebe0674de)\"" pod="openshift-machine-config-operator/machine-config-daemon-tjbzg" podUID="91aee7be-4a52-4598-803f-2deebe0674de" Dec 03 07:18:35 crc kubenswrapper[4475]: I1203 07:18:35.495915 4475 scope.go:117] "RemoveContainer" containerID="8c0d1f1df6ca180fa9eee37943bf61d5a3966b5f7a3ed0b6213a7a47c187d104" Dec 03 07:18:36 crc kubenswrapper[4475]: I1203 07:18:36.442538 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-tjbzg" event={"ID":"91aee7be-4a52-4598-803f-2deebe0674de","Type":"ContainerStarted","Data":"25465579b0091f19d7fac6131d69c8b26cb89f7f3ec3a2030f1c445fd208db5e"} Dec 03 07:18:36 crc kubenswrapper[4475]: I1203 07:18:36.463308 4475 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-bk6fp" podStartSLOduration=158.956198584 podStartE2EDuration="2m39.463292435s" podCreationTimestamp="2025-12-03 07:15:57 +0000 UTC" firstStartedPulling="2025-12-03 07:15:58.287922423 +0000 UTC m=+1843.092820758" lastFinishedPulling="2025-12-03 07:15:58.795016275 +0000 UTC m=+1843.599914609" observedRunningTime="2025-12-03 07:15:59.493682764 +0000 UTC m=+1844.298581098" watchObservedRunningTime="2025-12-03 07:18:36.463292435 +0000 UTC m=+2001.268190769" Dec 03 07:19:05 crc kubenswrapper[4475]: I1203 07:19:05.622395 4475 generic.go:334] "Generic (PLEG): container finished" podID="4cfb7f1e-95d0-4374-85e1-0f0600ede996" containerID="06c8d16683bd14d0fcb87781ae84ec392987cd89a35a7d9598950aa2c05ec98b" exitCode=0 Dec 03 07:19:05 crc kubenswrapper[4475]: I1203 07:19:05.622495 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-bk6fp" 
event={"ID":"4cfb7f1e-95d0-4374-85e1-0f0600ede996","Type":"ContainerDied","Data":"06c8d16683bd14d0fcb87781ae84ec392987cd89a35a7d9598950aa2c05ec98b"} Dec 03 07:19:06 crc kubenswrapper[4475]: I1203 07:19:06.914431 4475 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-bk6fp" Dec 03 07:19:07 crc kubenswrapper[4475]: I1203 07:19:07.097227 4475 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4cfb7f1e-95d0-4374-85e1-0f0600ede996-libvirt-combined-ca-bundle\") pod \"4cfb7f1e-95d0-4374-85e1-0f0600ede996\" (UID: \"4cfb7f1e-95d0-4374-85e1-0f0600ede996\") " Dec 03 07:19:07 crc kubenswrapper[4475]: I1203 07:19:07.097267 4475 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/4cfb7f1e-95d0-4374-85e1-0f0600ede996-inventory\") pod \"4cfb7f1e-95d0-4374-85e1-0f0600ede996\" (UID: \"4cfb7f1e-95d0-4374-85e1-0f0600ede996\") " Dec 03 07:19:07 crc kubenswrapper[4475]: I1203 07:19:07.097353 4475 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/4cfb7f1e-95d0-4374-85e1-0f0600ede996-ssh-key\") pod \"4cfb7f1e-95d0-4374-85e1-0f0600ede996\" (UID: \"4cfb7f1e-95d0-4374-85e1-0f0600ede996\") " Dec 03 07:19:07 crc kubenswrapper[4475]: I1203 07:19:07.097483 4475 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-z67sh\" (UniqueName: \"kubernetes.io/projected/4cfb7f1e-95d0-4374-85e1-0f0600ede996-kube-api-access-z67sh\") pod \"4cfb7f1e-95d0-4374-85e1-0f0600ede996\" (UID: \"4cfb7f1e-95d0-4374-85e1-0f0600ede996\") " Dec 03 07:19:07 crc kubenswrapper[4475]: I1203 07:19:07.097549 4475 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"libvirt-secret-0\" (UniqueName: 
\"kubernetes.io/secret/4cfb7f1e-95d0-4374-85e1-0f0600ede996-libvirt-secret-0\") pod \"4cfb7f1e-95d0-4374-85e1-0f0600ede996\" (UID: \"4cfb7f1e-95d0-4374-85e1-0f0600ede996\") " Dec 03 07:19:07 crc kubenswrapper[4475]: I1203 07:19:07.101381 4475 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4cfb7f1e-95d0-4374-85e1-0f0600ede996-libvirt-combined-ca-bundle" (OuterVolumeSpecName: "libvirt-combined-ca-bundle") pod "4cfb7f1e-95d0-4374-85e1-0f0600ede996" (UID: "4cfb7f1e-95d0-4374-85e1-0f0600ede996"). InnerVolumeSpecName "libvirt-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 07:19:07 crc kubenswrapper[4475]: I1203 07:19:07.103119 4475 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4cfb7f1e-95d0-4374-85e1-0f0600ede996-kube-api-access-z67sh" (OuterVolumeSpecName: "kube-api-access-z67sh") pod "4cfb7f1e-95d0-4374-85e1-0f0600ede996" (UID: "4cfb7f1e-95d0-4374-85e1-0f0600ede996"). InnerVolumeSpecName "kube-api-access-z67sh". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 07:19:07 crc kubenswrapper[4475]: I1203 07:19:07.119260 4475 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4cfb7f1e-95d0-4374-85e1-0f0600ede996-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "4cfb7f1e-95d0-4374-85e1-0f0600ede996" (UID: "4cfb7f1e-95d0-4374-85e1-0f0600ede996"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 07:19:07 crc kubenswrapper[4475]: I1203 07:19:07.119822 4475 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4cfb7f1e-95d0-4374-85e1-0f0600ede996-inventory" (OuterVolumeSpecName: "inventory") pod "4cfb7f1e-95d0-4374-85e1-0f0600ede996" (UID: "4cfb7f1e-95d0-4374-85e1-0f0600ede996"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 07:19:07 crc kubenswrapper[4475]: I1203 07:19:07.123204 4475 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4cfb7f1e-95d0-4374-85e1-0f0600ede996-libvirt-secret-0" (OuterVolumeSpecName: "libvirt-secret-0") pod "4cfb7f1e-95d0-4374-85e1-0f0600ede996" (UID: "4cfb7f1e-95d0-4374-85e1-0f0600ede996"). InnerVolumeSpecName "libvirt-secret-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 07:19:07 crc kubenswrapper[4475]: I1203 07:19:07.199670 4475 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/4cfb7f1e-95d0-4374-85e1-0f0600ede996-ssh-key\") on node \"crc\" DevicePath \"\"" Dec 03 07:19:07 crc kubenswrapper[4475]: I1203 07:19:07.199697 4475 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-z67sh\" (UniqueName: \"kubernetes.io/projected/4cfb7f1e-95d0-4374-85e1-0f0600ede996-kube-api-access-z67sh\") on node \"crc\" DevicePath \"\"" Dec 03 07:19:07 crc kubenswrapper[4475]: I1203 07:19:07.199707 4475 reconciler_common.go:293] "Volume detached for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/4cfb7f1e-95d0-4374-85e1-0f0600ede996-libvirt-secret-0\") on node \"crc\" DevicePath \"\"" Dec 03 07:19:07 crc kubenswrapper[4475]: I1203 07:19:07.199716 4475 reconciler_common.go:293] "Volume detached for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4cfb7f1e-95d0-4374-85e1-0f0600ede996-libvirt-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 03 07:19:07 crc kubenswrapper[4475]: I1203 07:19:07.199726 4475 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/4cfb7f1e-95d0-4374-85e1-0f0600ede996-inventory\") on node \"crc\" DevicePath \"\"" Dec 03 07:19:07 crc kubenswrapper[4475]: I1203 07:19:07.635636 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-bk6fp" event={"ID":"4cfb7f1e-95d0-4374-85e1-0f0600ede996","Type":"ContainerDied","Data":"a106b9e136d8d075f92c0aa512ec4fd3df238f5133e08b79b19897db63b96de9"} Dec 03 07:19:07 crc kubenswrapper[4475]: I1203 07:19:07.635669 4475 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a106b9e136d8d075f92c0aa512ec4fd3df238f5133e08b79b19897db63b96de9" Dec 03 07:19:07 crc kubenswrapper[4475]: I1203 07:19:07.635679 4475 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-bk6fp" Dec 03 07:19:07 crc kubenswrapper[4475]: I1203 07:19:07.719379 4475 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-edpm-deployment-openstack-edpm-ipam-n6rg9"] Dec 03 07:19:07 crc kubenswrapper[4475]: E1203 07:19:07.719916 4475 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4cfb7f1e-95d0-4374-85e1-0f0600ede996" containerName="libvirt-edpm-deployment-openstack-edpm-ipam" Dec 03 07:19:07 crc kubenswrapper[4475]: I1203 07:19:07.719934 4475 state_mem.go:107] "Deleted CPUSet assignment" podUID="4cfb7f1e-95d0-4374-85e1-0f0600ede996" containerName="libvirt-edpm-deployment-openstack-edpm-ipam" Dec 03 07:19:07 crc kubenswrapper[4475]: I1203 07:19:07.720142 4475 memory_manager.go:354] "RemoveStaleState removing state" podUID="4cfb7f1e-95d0-4374-85e1-0f0600ede996" containerName="libvirt-edpm-deployment-openstack-edpm-ipam" Dec 03 07:19:07 crc kubenswrapper[4475]: I1203 07:19:07.720687 4475 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-n6rg9" Dec 03 07:19:07 crc kubenswrapper[4475]: I1203 07:19:07.722423 4475 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-gxv6j" Dec 03 07:19:07 crc kubenswrapper[4475]: I1203 07:19:07.722874 4475 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-migration-ssh-key" Dec 03 07:19:07 crc kubenswrapper[4475]: I1203 07:19:07.723060 4475 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Dec 03 07:19:07 crc kubenswrapper[4475]: I1203 07:19:07.723217 4475 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Dec 03 07:19:07 crc kubenswrapper[4475]: I1203 07:19:07.723327 4475 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"nova-extra-config" Dec 03 07:19:07 crc kubenswrapper[4475]: I1203 07:19:07.723517 4475 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Dec 03 07:19:07 crc kubenswrapper[4475]: I1203 07:19:07.723641 4475 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-compute-config" Dec 03 07:19:07 crc kubenswrapper[4475]: I1203 07:19:07.727097 4475 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-edpm-deployment-openstack-edpm-ipam-n6rg9"] Dec 03 07:19:07 crc kubenswrapper[4475]: I1203 07:19:07.807956 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/a4f8ac49-019c-45c9-a059-603686bff1a7-nova-cell1-compute-config-1\") pod \"nova-edpm-deployment-openstack-edpm-ipam-n6rg9\" (UID: \"a4f8ac49-019c-45c9-a059-603686bff1a7\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-n6rg9" Dec 03 07:19:07 crc kubenswrapper[4475]: 
I1203 07:19:07.808001 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/a4f8ac49-019c-45c9-a059-603686bff1a7-nova-migration-ssh-key-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-n6rg9\" (UID: \"a4f8ac49-019c-45c9-a059-603686bff1a7\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-n6rg9" Dec 03 07:19:07 crc kubenswrapper[4475]: I1203 07:19:07.808026 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-extra-config-0\" (UniqueName: \"kubernetes.io/configmap/a4f8ac49-019c-45c9-a059-603686bff1a7-nova-extra-config-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-n6rg9\" (UID: \"a4f8ac49-019c-45c9-a059-603686bff1a7\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-n6rg9" Dec 03 07:19:07 crc kubenswrapper[4475]: I1203 07:19:07.808059 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/a4f8ac49-019c-45c9-a059-603686bff1a7-ssh-key\") pod \"nova-edpm-deployment-openstack-edpm-ipam-n6rg9\" (UID: \"a4f8ac49-019c-45c9-a059-603686bff1a7\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-n6rg9" Dec 03 07:19:07 crc kubenswrapper[4475]: I1203 07:19:07.808127 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/a4f8ac49-019c-45c9-a059-603686bff1a7-inventory\") pod \"nova-edpm-deployment-openstack-edpm-ipam-n6rg9\" (UID: \"a4f8ac49-019c-45c9-a059-603686bff1a7\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-n6rg9" Dec 03 07:19:07 crc kubenswrapper[4475]: I1203 07:19:07.808167 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/a4f8ac49-019c-45c9-a059-603686bff1a7-nova-combined-ca-bundle\") pod \"nova-edpm-deployment-openstack-edpm-ipam-n6rg9\" (UID: \"a4f8ac49-019c-45c9-a059-603686bff1a7\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-n6rg9" Dec 03 07:19:07 crc kubenswrapper[4475]: I1203 07:19:07.808202 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jq2l4\" (UniqueName: \"kubernetes.io/projected/a4f8ac49-019c-45c9-a059-603686bff1a7-kube-api-access-jq2l4\") pod \"nova-edpm-deployment-openstack-edpm-ipam-n6rg9\" (UID: \"a4f8ac49-019c-45c9-a059-603686bff1a7\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-n6rg9" Dec 03 07:19:07 crc kubenswrapper[4475]: I1203 07:19:07.808244 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/a4f8ac49-019c-45c9-a059-603686bff1a7-nova-migration-ssh-key-1\") pod \"nova-edpm-deployment-openstack-edpm-ipam-n6rg9\" (UID: \"a4f8ac49-019c-45c9-a059-603686bff1a7\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-n6rg9" Dec 03 07:19:07 crc kubenswrapper[4475]: I1203 07:19:07.808272 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/a4f8ac49-019c-45c9-a059-603686bff1a7-nova-cell1-compute-config-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-n6rg9\" (UID: \"a4f8ac49-019c-45c9-a059-603686bff1a7\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-n6rg9" Dec 03 07:19:07 crc kubenswrapper[4475]: I1203 07:19:07.910546 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/a4f8ac49-019c-45c9-a059-603686bff1a7-ssh-key\") pod \"nova-edpm-deployment-openstack-edpm-ipam-n6rg9\" (UID: \"a4f8ac49-019c-45c9-a059-603686bff1a7\") " 
pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-n6rg9" Dec 03 07:19:07 crc kubenswrapper[4475]: I1203 07:19:07.910647 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/a4f8ac49-019c-45c9-a059-603686bff1a7-inventory\") pod \"nova-edpm-deployment-openstack-edpm-ipam-n6rg9\" (UID: \"a4f8ac49-019c-45c9-a059-603686bff1a7\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-n6rg9" Dec 03 07:19:07 crc kubenswrapper[4475]: I1203 07:19:07.910696 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a4f8ac49-019c-45c9-a059-603686bff1a7-nova-combined-ca-bundle\") pod \"nova-edpm-deployment-openstack-edpm-ipam-n6rg9\" (UID: \"a4f8ac49-019c-45c9-a059-603686bff1a7\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-n6rg9" Dec 03 07:19:07 crc kubenswrapper[4475]: I1203 07:19:07.910742 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jq2l4\" (UniqueName: \"kubernetes.io/projected/a4f8ac49-019c-45c9-a059-603686bff1a7-kube-api-access-jq2l4\") pod \"nova-edpm-deployment-openstack-edpm-ipam-n6rg9\" (UID: \"a4f8ac49-019c-45c9-a059-603686bff1a7\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-n6rg9" Dec 03 07:19:07 crc kubenswrapper[4475]: I1203 07:19:07.910779 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/a4f8ac49-019c-45c9-a059-603686bff1a7-nova-migration-ssh-key-1\") pod \"nova-edpm-deployment-openstack-edpm-ipam-n6rg9\" (UID: \"a4f8ac49-019c-45c9-a059-603686bff1a7\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-n6rg9" Dec 03 07:19:07 crc kubenswrapper[4475]: I1203 07:19:07.910819 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-cell1-compute-config-0\" (UniqueName: 
\"kubernetes.io/secret/a4f8ac49-019c-45c9-a059-603686bff1a7-nova-cell1-compute-config-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-n6rg9\" (UID: \"a4f8ac49-019c-45c9-a059-603686bff1a7\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-n6rg9" Dec 03 07:19:07 crc kubenswrapper[4475]: I1203 07:19:07.910854 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/a4f8ac49-019c-45c9-a059-603686bff1a7-nova-cell1-compute-config-1\") pod \"nova-edpm-deployment-openstack-edpm-ipam-n6rg9\" (UID: \"a4f8ac49-019c-45c9-a059-603686bff1a7\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-n6rg9" Dec 03 07:19:07 crc kubenswrapper[4475]: I1203 07:19:07.910888 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/a4f8ac49-019c-45c9-a059-603686bff1a7-nova-migration-ssh-key-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-n6rg9\" (UID: \"a4f8ac49-019c-45c9-a059-603686bff1a7\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-n6rg9" Dec 03 07:19:07 crc kubenswrapper[4475]: I1203 07:19:07.910919 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-extra-config-0\" (UniqueName: \"kubernetes.io/configmap/a4f8ac49-019c-45c9-a059-603686bff1a7-nova-extra-config-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-n6rg9\" (UID: \"a4f8ac49-019c-45c9-a059-603686bff1a7\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-n6rg9" Dec 03 07:19:07 crc kubenswrapper[4475]: I1203 07:19:07.911759 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-extra-config-0\" (UniqueName: \"kubernetes.io/configmap/a4f8ac49-019c-45c9-a059-603686bff1a7-nova-extra-config-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-n6rg9\" (UID: \"a4f8ac49-019c-45c9-a059-603686bff1a7\") " 
pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-n6rg9" Dec 03 07:19:07 crc kubenswrapper[4475]: I1203 07:19:07.914125 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/a4f8ac49-019c-45c9-a059-603686bff1a7-ssh-key\") pod \"nova-edpm-deployment-openstack-edpm-ipam-n6rg9\" (UID: \"a4f8ac49-019c-45c9-a059-603686bff1a7\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-n6rg9" Dec 03 07:19:07 crc kubenswrapper[4475]: I1203 07:19:07.914221 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/a4f8ac49-019c-45c9-a059-603686bff1a7-nova-cell1-compute-config-1\") pod \"nova-edpm-deployment-openstack-edpm-ipam-n6rg9\" (UID: \"a4f8ac49-019c-45c9-a059-603686bff1a7\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-n6rg9" Dec 03 07:19:07 crc kubenswrapper[4475]: I1203 07:19:07.914415 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/a4f8ac49-019c-45c9-a059-603686bff1a7-nova-migration-ssh-key-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-n6rg9\" (UID: \"a4f8ac49-019c-45c9-a059-603686bff1a7\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-n6rg9" Dec 03 07:19:07 crc kubenswrapper[4475]: I1203 07:19:07.914612 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/a4f8ac49-019c-45c9-a059-603686bff1a7-nova-cell1-compute-config-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-n6rg9\" (UID: \"a4f8ac49-019c-45c9-a059-603686bff1a7\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-n6rg9" Dec 03 07:19:07 crc kubenswrapper[4475]: I1203 07:19:07.914980 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: 
\"kubernetes.io/secret/a4f8ac49-019c-45c9-a059-603686bff1a7-inventory\") pod \"nova-edpm-deployment-openstack-edpm-ipam-n6rg9\" (UID: \"a4f8ac49-019c-45c9-a059-603686bff1a7\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-n6rg9" Dec 03 07:19:07 crc kubenswrapper[4475]: I1203 07:19:07.915586 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/a4f8ac49-019c-45c9-a059-603686bff1a7-nova-migration-ssh-key-1\") pod \"nova-edpm-deployment-openstack-edpm-ipam-n6rg9\" (UID: \"a4f8ac49-019c-45c9-a059-603686bff1a7\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-n6rg9" Dec 03 07:19:07 crc kubenswrapper[4475]: I1203 07:19:07.915795 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a4f8ac49-019c-45c9-a059-603686bff1a7-nova-combined-ca-bundle\") pod \"nova-edpm-deployment-openstack-edpm-ipam-n6rg9\" (UID: \"a4f8ac49-019c-45c9-a059-603686bff1a7\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-n6rg9" Dec 03 07:19:07 crc kubenswrapper[4475]: I1203 07:19:07.927112 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jq2l4\" (UniqueName: \"kubernetes.io/projected/a4f8ac49-019c-45c9-a059-603686bff1a7-kube-api-access-jq2l4\") pod \"nova-edpm-deployment-openstack-edpm-ipam-n6rg9\" (UID: \"a4f8ac49-019c-45c9-a059-603686bff1a7\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-n6rg9" Dec 03 07:19:08 crc kubenswrapper[4475]: I1203 07:19:08.032390 4475 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-n6rg9" Dec 03 07:19:08 crc kubenswrapper[4475]: I1203 07:19:08.454308 4475 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-edpm-deployment-openstack-edpm-ipam-n6rg9"] Dec 03 07:19:08 crc kubenswrapper[4475]: I1203 07:19:08.643147 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-n6rg9" event={"ID":"a4f8ac49-019c-45c9-a059-603686bff1a7","Type":"ContainerStarted","Data":"9396105f4c5f07299b63cee1d356427626339371c69d9187e6f209255dc13e6d"} Dec 03 07:19:09 crc kubenswrapper[4475]: I1203 07:19:09.650667 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-n6rg9" event={"ID":"a4f8ac49-019c-45c9-a059-603686bff1a7","Type":"ContainerStarted","Data":"cd88c041e34c2837f189ff1277a381728913c53887a2764b1ca74f9ff23512fe"} Dec 03 07:19:09 crc kubenswrapper[4475]: I1203 07:19:09.665585 4475 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-n6rg9" podStartSLOduration=2.082968075 podStartE2EDuration="2.665573471s" podCreationTimestamp="2025-12-03 07:19:07 +0000 UTC" firstStartedPulling="2025-12-03 07:19:08.455024944 +0000 UTC m=+2033.259923278" lastFinishedPulling="2025-12-03 07:19:09.037630339 +0000 UTC m=+2033.842528674" observedRunningTime="2025-12-03 07:19:09.663751735 +0000 UTC m=+2034.468650068" watchObservedRunningTime="2025-12-03 07:19:09.665573471 +0000 UTC m=+2034.470471805" Dec 03 07:20:58 crc kubenswrapper[4475]: I1203 07:20:58.933888 4475 patch_prober.go:28] interesting pod/machine-config-daemon-tjbzg container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 03 07:20:58 crc kubenswrapper[4475]: I1203 07:20:58.934219 
4475 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-tjbzg" podUID="91aee7be-4a52-4598-803f-2deebe0674de" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 03 07:21:06 crc kubenswrapper[4475]: I1203 07:21:06.485446 4475 generic.go:334] "Generic (PLEG): container finished" podID="a4f8ac49-019c-45c9-a059-603686bff1a7" containerID="cd88c041e34c2837f189ff1277a381728913c53887a2764b1ca74f9ff23512fe" exitCode=0 Dec 03 07:21:06 crc kubenswrapper[4475]: I1203 07:21:06.485514 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-n6rg9" event={"ID":"a4f8ac49-019c-45c9-a059-603686bff1a7","Type":"ContainerDied","Data":"cd88c041e34c2837f189ff1277a381728913c53887a2764b1ca74f9ff23512fe"} Dec 03 07:21:07 crc kubenswrapper[4475]: I1203 07:21:07.850257 4475 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-n6rg9" Dec 03 07:21:08 crc kubenswrapper[4475]: I1203 07:21:08.045530 4475 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/a4f8ac49-019c-45c9-a059-603686bff1a7-nova-cell1-compute-config-0\") pod \"a4f8ac49-019c-45c9-a059-603686bff1a7\" (UID: \"a4f8ac49-019c-45c9-a059-603686bff1a7\") " Dec 03 07:21:08 crc kubenswrapper[4475]: I1203 07:21:08.045616 4475 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/a4f8ac49-019c-45c9-a059-603686bff1a7-nova-cell1-compute-config-1\") pod \"a4f8ac49-019c-45c9-a059-603686bff1a7\" (UID: \"a4f8ac49-019c-45c9-a059-603686bff1a7\") " Dec 03 07:21:08 crc kubenswrapper[4475]: I1203 07:21:08.045636 4475 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jq2l4\" (UniqueName: \"kubernetes.io/projected/a4f8ac49-019c-45c9-a059-603686bff1a7-kube-api-access-jq2l4\") pod \"a4f8ac49-019c-45c9-a059-603686bff1a7\" (UID: \"a4f8ac49-019c-45c9-a059-603686bff1a7\") " Dec 03 07:21:08 crc kubenswrapper[4475]: I1203 07:21:08.045657 4475 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/a4f8ac49-019c-45c9-a059-603686bff1a7-nova-migration-ssh-key-1\") pod \"a4f8ac49-019c-45c9-a059-603686bff1a7\" (UID: \"a4f8ac49-019c-45c9-a059-603686bff1a7\") " Dec 03 07:21:08 crc kubenswrapper[4475]: I1203 07:21:08.045684 4475 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/a4f8ac49-019c-45c9-a059-603686bff1a7-nova-migration-ssh-key-0\") pod \"a4f8ac49-019c-45c9-a059-603686bff1a7\" (UID: \"a4f8ac49-019c-45c9-a059-603686bff1a7\") " Dec 03 07:21:08 crc kubenswrapper[4475]: I1203 
07:21:08.045702 4475 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a4f8ac49-019c-45c9-a059-603686bff1a7-nova-combined-ca-bundle\") pod \"a4f8ac49-019c-45c9-a059-603686bff1a7\" (UID: \"a4f8ac49-019c-45c9-a059-603686bff1a7\") " Dec 03 07:21:08 crc kubenswrapper[4475]: I1203 07:21:08.045724 4475 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-extra-config-0\" (UniqueName: \"kubernetes.io/configmap/a4f8ac49-019c-45c9-a059-603686bff1a7-nova-extra-config-0\") pod \"a4f8ac49-019c-45c9-a059-603686bff1a7\" (UID: \"a4f8ac49-019c-45c9-a059-603686bff1a7\") " Dec 03 07:21:08 crc kubenswrapper[4475]: I1203 07:21:08.045765 4475 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/a4f8ac49-019c-45c9-a059-603686bff1a7-ssh-key\") pod \"a4f8ac49-019c-45c9-a059-603686bff1a7\" (UID: \"a4f8ac49-019c-45c9-a059-603686bff1a7\") " Dec 03 07:21:08 crc kubenswrapper[4475]: I1203 07:21:08.045783 4475 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/a4f8ac49-019c-45c9-a059-603686bff1a7-inventory\") pod \"a4f8ac49-019c-45c9-a059-603686bff1a7\" (UID: \"a4f8ac49-019c-45c9-a059-603686bff1a7\") " Dec 03 07:21:08 crc kubenswrapper[4475]: I1203 07:21:08.051779 4475 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a4f8ac49-019c-45c9-a059-603686bff1a7-kube-api-access-jq2l4" (OuterVolumeSpecName: "kube-api-access-jq2l4") pod "a4f8ac49-019c-45c9-a059-603686bff1a7" (UID: "a4f8ac49-019c-45c9-a059-603686bff1a7"). InnerVolumeSpecName "kube-api-access-jq2l4". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 07:21:08 crc kubenswrapper[4475]: I1203 07:21:08.053129 4475 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a4f8ac49-019c-45c9-a059-603686bff1a7-nova-combined-ca-bundle" (OuterVolumeSpecName: "nova-combined-ca-bundle") pod "a4f8ac49-019c-45c9-a059-603686bff1a7" (UID: "a4f8ac49-019c-45c9-a059-603686bff1a7"). InnerVolumeSpecName "nova-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 07:21:08 crc kubenswrapper[4475]: I1203 07:21:08.073733 4475 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a4f8ac49-019c-45c9-a059-603686bff1a7-nova-cell1-compute-config-0" (OuterVolumeSpecName: "nova-cell1-compute-config-0") pod "a4f8ac49-019c-45c9-a059-603686bff1a7" (UID: "a4f8ac49-019c-45c9-a059-603686bff1a7"). InnerVolumeSpecName "nova-cell1-compute-config-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 07:21:08 crc kubenswrapper[4475]: I1203 07:21:08.075754 4475 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a4f8ac49-019c-45c9-a059-603686bff1a7-nova-migration-ssh-key-1" (OuterVolumeSpecName: "nova-migration-ssh-key-1") pod "a4f8ac49-019c-45c9-a059-603686bff1a7" (UID: "a4f8ac49-019c-45c9-a059-603686bff1a7"). InnerVolumeSpecName "nova-migration-ssh-key-1". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 07:21:08 crc kubenswrapper[4475]: I1203 07:21:08.076797 4475 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a4f8ac49-019c-45c9-a059-603686bff1a7-inventory" (OuterVolumeSpecName: "inventory") pod "a4f8ac49-019c-45c9-a059-603686bff1a7" (UID: "a4f8ac49-019c-45c9-a059-603686bff1a7"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 07:21:08 crc kubenswrapper[4475]: I1203 07:21:08.078935 4475 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a4f8ac49-019c-45c9-a059-603686bff1a7-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "a4f8ac49-019c-45c9-a059-603686bff1a7" (UID: "a4f8ac49-019c-45c9-a059-603686bff1a7"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 07:21:08 crc kubenswrapper[4475]: I1203 07:21:08.085790 4475 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a4f8ac49-019c-45c9-a059-603686bff1a7-nova-extra-config-0" (OuterVolumeSpecName: "nova-extra-config-0") pod "a4f8ac49-019c-45c9-a059-603686bff1a7" (UID: "a4f8ac49-019c-45c9-a059-603686bff1a7"). InnerVolumeSpecName "nova-extra-config-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 07:21:08 crc kubenswrapper[4475]: I1203 07:21:08.093998 4475 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a4f8ac49-019c-45c9-a059-603686bff1a7-nova-migration-ssh-key-0" (OuterVolumeSpecName: "nova-migration-ssh-key-0") pod "a4f8ac49-019c-45c9-a059-603686bff1a7" (UID: "a4f8ac49-019c-45c9-a059-603686bff1a7"). InnerVolumeSpecName "nova-migration-ssh-key-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 07:21:08 crc kubenswrapper[4475]: I1203 07:21:08.096223 4475 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a4f8ac49-019c-45c9-a059-603686bff1a7-nova-cell1-compute-config-1" (OuterVolumeSpecName: "nova-cell1-compute-config-1") pod "a4f8ac49-019c-45c9-a059-603686bff1a7" (UID: "a4f8ac49-019c-45c9-a059-603686bff1a7"). InnerVolumeSpecName "nova-cell1-compute-config-1". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 07:21:08 crc kubenswrapper[4475]: I1203 07:21:08.148243 4475 reconciler_common.go:293] "Volume detached for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/a4f8ac49-019c-45c9-a059-603686bff1a7-nova-cell1-compute-config-1\") on node \"crc\" DevicePath \"\"" Dec 03 07:21:08 crc kubenswrapper[4475]: I1203 07:21:08.148382 4475 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jq2l4\" (UniqueName: \"kubernetes.io/projected/a4f8ac49-019c-45c9-a059-603686bff1a7-kube-api-access-jq2l4\") on node \"crc\" DevicePath \"\"" Dec 03 07:21:08 crc kubenswrapper[4475]: I1203 07:21:08.148502 4475 reconciler_common.go:293] "Volume detached for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/a4f8ac49-019c-45c9-a059-603686bff1a7-nova-migration-ssh-key-1\") on node \"crc\" DevicePath \"\"" Dec 03 07:21:08 crc kubenswrapper[4475]: I1203 07:21:08.148624 4475 reconciler_common.go:293] "Volume detached for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/a4f8ac49-019c-45c9-a059-603686bff1a7-nova-migration-ssh-key-0\") on node \"crc\" DevicePath \"\"" Dec 03 07:21:08 crc kubenswrapper[4475]: I1203 07:21:08.148709 4475 reconciler_common.go:293] "Volume detached for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a4f8ac49-019c-45c9-a059-603686bff1a7-nova-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 03 07:21:08 crc kubenswrapper[4475]: I1203 07:21:08.148761 4475 reconciler_common.go:293] "Volume detached for volume \"nova-extra-config-0\" (UniqueName: \"kubernetes.io/configmap/a4f8ac49-019c-45c9-a059-603686bff1a7-nova-extra-config-0\") on node \"crc\" DevicePath \"\"" Dec 03 07:21:08 crc kubenswrapper[4475]: I1203 07:21:08.148835 4475 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/a4f8ac49-019c-45c9-a059-603686bff1a7-ssh-key\") on node \"crc\" 
DevicePath \"\"" Dec 03 07:21:08 crc kubenswrapper[4475]: I1203 07:21:08.148913 4475 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/a4f8ac49-019c-45c9-a059-603686bff1a7-inventory\") on node \"crc\" DevicePath \"\"" Dec 03 07:21:08 crc kubenswrapper[4475]: I1203 07:21:08.148982 4475 reconciler_common.go:293] "Volume detached for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/a4f8ac49-019c-45c9-a059-603686bff1a7-nova-cell1-compute-config-0\") on node \"crc\" DevicePath \"\"" Dec 03 07:21:08 crc kubenswrapper[4475]: I1203 07:21:08.500889 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-n6rg9" event={"ID":"a4f8ac49-019c-45c9-a059-603686bff1a7","Type":"ContainerDied","Data":"9396105f4c5f07299b63cee1d356427626339371c69d9187e6f209255dc13e6d"} Dec 03 07:21:08 crc kubenswrapper[4475]: I1203 07:21:08.500932 4475 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="9396105f4c5f07299b63cee1d356427626339371c69d9187e6f209255dc13e6d" Dec 03 07:21:08 crc kubenswrapper[4475]: I1203 07:21:08.500975 4475 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-n6rg9" Dec 03 07:21:08 crc kubenswrapper[4475]: I1203 07:21:08.580285 4475 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/telemetry-edpm-deployment-openstack-edpm-ipam-qnpdc"] Dec 03 07:21:08 crc kubenswrapper[4475]: E1203 07:21:08.581016 4475 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a4f8ac49-019c-45c9-a059-603686bff1a7" containerName="nova-edpm-deployment-openstack-edpm-ipam" Dec 03 07:21:08 crc kubenswrapper[4475]: I1203 07:21:08.581034 4475 state_mem.go:107] "Deleted CPUSet assignment" podUID="a4f8ac49-019c-45c9-a059-603686bff1a7" containerName="nova-edpm-deployment-openstack-edpm-ipam" Dec 03 07:21:08 crc kubenswrapper[4475]: I1203 07:21:08.581239 4475 memory_manager.go:354] "RemoveStaleState removing state" podUID="a4f8ac49-019c-45c9-a059-603686bff1a7" containerName="nova-edpm-deployment-openstack-edpm-ipam" Dec 03 07:21:08 crc kubenswrapper[4475]: I1203 07:21:08.581840 4475 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-qnpdc" Dec 03 07:21:08 crc kubenswrapper[4475]: I1203 07:21:08.584049 4475 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-compute-config-data" Dec 03 07:21:08 crc kubenswrapper[4475]: I1203 07:21:08.584516 4475 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Dec 03 07:21:08 crc kubenswrapper[4475]: I1203 07:21:08.584667 4475 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-gxv6j" Dec 03 07:21:08 crc kubenswrapper[4475]: I1203 07:21:08.584794 4475 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Dec 03 07:21:08 crc kubenswrapper[4475]: I1203 07:21:08.596544 4475 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/telemetry-edpm-deployment-openstack-edpm-ipam-qnpdc"] Dec 03 07:21:08 crc kubenswrapper[4475]: I1203 07:21:08.596634 4475 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Dec 03 07:21:08 crc kubenswrapper[4475]: I1203 07:21:08.656266 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/054de259-96ce-4771-9e25-c1f170b93160-inventory\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-qnpdc\" (UID: \"054de259-96ce-4771-9e25-c1f170b93160\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-qnpdc" Dec 03 07:21:08 crc kubenswrapper[4475]: I1203 07:21:08.656350 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-compute-config-data-0\" (UniqueName: \"kubernetes.io/secret/054de259-96ce-4771-9e25-c1f170b93160-ceilometer-compute-config-data-0\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-qnpdc\" (UID: 
\"054de259-96ce-4771-9e25-c1f170b93160\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-qnpdc" Dec 03 07:21:08 crc kubenswrapper[4475]: I1203 07:21:08.656414 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/054de259-96ce-4771-9e25-c1f170b93160-telemetry-combined-ca-bundle\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-qnpdc\" (UID: \"054de259-96ce-4771-9e25-c1f170b93160\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-qnpdc" Dec 03 07:21:08 crc kubenswrapper[4475]: I1203 07:21:08.656430 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bz9wk\" (UniqueName: \"kubernetes.io/projected/054de259-96ce-4771-9e25-c1f170b93160-kube-api-access-bz9wk\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-qnpdc\" (UID: \"054de259-96ce-4771-9e25-c1f170b93160\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-qnpdc" Dec 03 07:21:08 crc kubenswrapper[4475]: I1203 07:21:08.656534 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/054de259-96ce-4771-9e25-c1f170b93160-ssh-key\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-qnpdc\" (UID: \"054de259-96ce-4771-9e25-c1f170b93160\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-qnpdc" Dec 03 07:21:08 crc kubenswrapper[4475]: I1203 07:21:08.656622 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-compute-config-data-1\" (UniqueName: \"kubernetes.io/secret/054de259-96ce-4771-9e25-c1f170b93160-ceilometer-compute-config-data-1\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-qnpdc\" (UID: \"054de259-96ce-4771-9e25-c1f170b93160\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-qnpdc" Dec 03 07:21:08 
crc kubenswrapper[4475]: I1203 07:21:08.656739 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-compute-config-data-2\" (UniqueName: \"kubernetes.io/secret/054de259-96ce-4771-9e25-c1f170b93160-ceilometer-compute-config-data-2\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-qnpdc\" (UID: \"054de259-96ce-4771-9e25-c1f170b93160\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-qnpdc" Dec 03 07:21:08 crc kubenswrapper[4475]: I1203 07:21:08.758722 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/054de259-96ce-4771-9e25-c1f170b93160-ssh-key\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-qnpdc\" (UID: \"054de259-96ce-4771-9e25-c1f170b93160\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-qnpdc" Dec 03 07:21:08 crc kubenswrapper[4475]: I1203 07:21:08.758778 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-compute-config-data-1\" (UniqueName: \"kubernetes.io/secret/054de259-96ce-4771-9e25-c1f170b93160-ceilometer-compute-config-data-1\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-qnpdc\" (UID: \"054de259-96ce-4771-9e25-c1f170b93160\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-qnpdc" Dec 03 07:21:08 crc kubenswrapper[4475]: I1203 07:21:08.758833 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-compute-config-data-2\" (UniqueName: \"kubernetes.io/secret/054de259-96ce-4771-9e25-c1f170b93160-ceilometer-compute-config-data-2\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-qnpdc\" (UID: \"054de259-96ce-4771-9e25-c1f170b93160\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-qnpdc" Dec 03 07:21:08 crc kubenswrapper[4475]: I1203 07:21:08.758873 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: 
\"kubernetes.io/secret/054de259-96ce-4771-9e25-c1f170b93160-inventory\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-qnpdc\" (UID: \"054de259-96ce-4771-9e25-c1f170b93160\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-qnpdc" Dec 03 07:21:08 crc kubenswrapper[4475]: I1203 07:21:08.758895 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-compute-config-data-0\" (UniqueName: \"kubernetes.io/secret/054de259-96ce-4771-9e25-c1f170b93160-ceilometer-compute-config-data-0\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-qnpdc\" (UID: \"054de259-96ce-4771-9e25-c1f170b93160\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-qnpdc" Dec 03 07:21:08 crc kubenswrapper[4475]: I1203 07:21:08.758935 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/054de259-96ce-4771-9e25-c1f170b93160-telemetry-combined-ca-bundle\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-qnpdc\" (UID: \"054de259-96ce-4771-9e25-c1f170b93160\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-qnpdc" Dec 03 07:21:08 crc kubenswrapper[4475]: I1203 07:21:08.758952 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bz9wk\" (UniqueName: \"kubernetes.io/projected/054de259-96ce-4771-9e25-c1f170b93160-kube-api-access-bz9wk\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-qnpdc\" (UID: \"054de259-96ce-4771-9e25-c1f170b93160\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-qnpdc" Dec 03 07:21:08 crc kubenswrapper[4475]: I1203 07:21:08.762167 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-compute-config-data-0\" (UniqueName: \"kubernetes.io/secret/054de259-96ce-4771-9e25-c1f170b93160-ceilometer-compute-config-data-0\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-qnpdc\" (UID: 
\"054de259-96ce-4771-9e25-c1f170b93160\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-qnpdc" Dec 03 07:21:08 crc kubenswrapper[4475]: I1203 07:21:08.762571 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-compute-config-data-1\" (UniqueName: \"kubernetes.io/secret/054de259-96ce-4771-9e25-c1f170b93160-ceilometer-compute-config-data-1\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-qnpdc\" (UID: \"054de259-96ce-4771-9e25-c1f170b93160\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-qnpdc" Dec 03 07:21:08 crc kubenswrapper[4475]: I1203 07:21:08.762665 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/054de259-96ce-4771-9e25-c1f170b93160-ssh-key\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-qnpdc\" (UID: \"054de259-96ce-4771-9e25-c1f170b93160\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-qnpdc" Dec 03 07:21:08 crc kubenswrapper[4475]: I1203 07:21:08.763108 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/054de259-96ce-4771-9e25-c1f170b93160-telemetry-combined-ca-bundle\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-qnpdc\" (UID: \"054de259-96ce-4771-9e25-c1f170b93160\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-qnpdc" Dec 03 07:21:08 crc kubenswrapper[4475]: I1203 07:21:08.763347 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/054de259-96ce-4771-9e25-c1f170b93160-inventory\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-qnpdc\" (UID: \"054de259-96ce-4771-9e25-c1f170b93160\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-qnpdc" Dec 03 07:21:08 crc kubenswrapper[4475]: I1203 07:21:08.763517 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"ceilometer-compute-config-data-2\" (UniqueName: \"kubernetes.io/secret/054de259-96ce-4771-9e25-c1f170b93160-ceilometer-compute-config-data-2\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-qnpdc\" (UID: \"054de259-96ce-4771-9e25-c1f170b93160\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-qnpdc" Dec 03 07:21:08 crc kubenswrapper[4475]: I1203 07:21:08.773751 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bz9wk\" (UniqueName: \"kubernetes.io/projected/054de259-96ce-4771-9e25-c1f170b93160-kube-api-access-bz9wk\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-qnpdc\" (UID: \"054de259-96ce-4771-9e25-c1f170b93160\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-qnpdc" Dec 03 07:21:08 crc kubenswrapper[4475]: I1203 07:21:08.904036 4475 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-qnpdc" Dec 03 07:21:09 crc kubenswrapper[4475]: I1203 07:21:09.354164 4475 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/telemetry-edpm-deployment-openstack-edpm-ipam-qnpdc"] Dec 03 07:21:09 crc kubenswrapper[4475]: I1203 07:21:09.362024 4475 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Dec 03 07:21:09 crc kubenswrapper[4475]: I1203 07:21:09.506636 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-qnpdc" event={"ID":"054de259-96ce-4771-9e25-c1f170b93160","Type":"ContainerStarted","Data":"8ab49dc1c39d274d6db113ef193fc8137c35235919763bc4f4946fe67d53a8c3"} Dec 03 07:21:10 crc kubenswrapper[4475]: I1203 07:21:10.514272 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-qnpdc" event={"ID":"054de259-96ce-4771-9e25-c1f170b93160","Type":"ContainerStarted","Data":"8c5c909c0f1efe8a7a3ee271ccbe3896d0bb08b125ce1b03046ef522c2b94f76"} 
Dec 03 07:21:11 crc kubenswrapper[4475]: I1203 07:21:11.762158 4475 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-qnpdc" podStartSLOduration=3.097184097 podStartE2EDuration="3.762143444s" podCreationTimestamp="2025-12-03 07:21:08 +0000 UTC" firstStartedPulling="2025-12-03 07:21:09.361835176 +0000 UTC m=+2154.166733510" lastFinishedPulling="2025-12-03 07:21:10.026794523 +0000 UTC m=+2154.831692857" observedRunningTime="2025-12-03 07:21:10.530124538 +0000 UTC m=+2155.335022872" watchObservedRunningTime="2025-12-03 07:21:11.762143444 +0000 UTC m=+2156.567041778" Dec 03 07:21:11 crc kubenswrapper[4475]: I1203 07:21:11.764543 4475 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-q82ks"] Dec 03 07:21:11 crc kubenswrapper[4475]: I1203 07:21:11.766248 4475 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-q82ks" Dec 03 07:21:11 crc kubenswrapper[4475]: I1203 07:21:11.777941 4475 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-q82ks"] Dec 03 07:21:11 crc kubenswrapper[4475]: I1203 07:21:11.908155 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/97ae352b-76e0-4b71-8bcd-f0d68b301648-catalog-content\") pod \"certified-operators-q82ks\" (UID: \"97ae352b-76e0-4b71-8bcd-f0d68b301648\") " pod="openshift-marketplace/certified-operators-q82ks" Dec 03 07:21:11 crc kubenswrapper[4475]: I1203 07:21:11.908244 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pnsr7\" (UniqueName: \"kubernetes.io/projected/97ae352b-76e0-4b71-8bcd-f0d68b301648-kube-api-access-pnsr7\") pod \"certified-operators-q82ks\" (UID: \"97ae352b-76e0-4b71-8bcd-f0d68b301648\") " 
pod="openshift-marketplace/certified-operators-q82ks" Dec 03 07:21:11 crc kubenswrapper[4475]: I1203 07:21:11.908290 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/97ae352b-76e0-4b71-8bcd-f0d68b301648-utilities\") pod \"certified-operators-q82ks\" (UID: \"97ae352b-76e0-4b71-8bcd-f0d68b301648\") " pod="openshift-marketplace/certified-operators-q82ks" Dec 03 07:21:12 crc kubenswrapper[4475]: I1203 07:21:12.010258 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pnsr7\" (UniqueName: \"kubernetes.io/projected/97ae352b-76e0-4b71-8bcd-f0d68b301648-kube-api-access-pnsr7\") pod \"certified-operators-q82ks\" (UID: \"97ae352b-76e0-4b71-8bcd-f0d68b301648\") " pod="openshift-marketplace/certified-operators-q82ks" Dec 03 07:21:12 crc kubenswrapper[4475]: I1203 07:21:12.010324 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/97ae352b-76e0-4b71-8bcd-f0d68b301648-utilities\") pod \"certified-operators-q82ks\" (UID: \"97ae352b-76e0-4b71-8bcd-f0d68b301648\") " pod="openshift-marketplace/certified-operators-q82ks" Dec 03 07:21:12 crc kubenswrapper[4475]: I1203 07:21:12.010409 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/97ae352b-76e0-4b71-8bcd-f0d68b301648-catalog-content\") pod \"certified-operators-q82ks\" (UID: \"97ae352b-76e0-4b71-8bcd-f0d68b301648\") " pod="openshift-marketplace/certified-operators-q82ks" Dec 03 07:21:12 crc kubenswrapper[4475]: I1203 07:21:12.010862 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/97ae352b-76e0-4b71-8bcd-f0d68b301648-utilities\") pod \"certified-operators-q82ks\" (UID: \"97ae352b-76e0-4b71-8bcd-f0d68b301648\") " 
pod="openshift-marketplace/certified-operators-q82ks" Dec 03 07:21:12 crc kubenswrapper[4475]: I1203 07:21:12.010907 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/97ae352b-76e0-4b71-8bcd-f0d68b301648-catalog-content\") pod \"certified-operators-q82ks\" (UID: \"97ae352b-76e0-4b71-8bcd-f0d68b301648\") " pod="openshift-marketplace/certified-operators-q82ks" Dec 03 07:21:12 crc kubenswrapper[4475]: I1203 07:21:12.040572 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pnsr7\" (UniqueName: \"kubernetes.io/projected/97ae352b-76e0-4b71-8bcd-f0d68b301648-kube-api-access-pnsr7\") pod \"certified-operators-q82ks\" (UID: \"97ae352b-76e0-4b71-8bcd-f0d68b301648\") " pod="openshift-marketplace/certified-operators-q82ks" Dec 03 07:21:12 crc kubenswrapper[4475]: I1203 07:21:12.080024 4475 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-q82ks" Dec 03 07:21:12 crc kubenswrapper[4475]: I1203 07:21:12.577652 4475 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-q82ks"] Dec 03 07:21:12 crc kubenswrapper[4475]: W1203 07:21:12.586271 4475 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod97ae352b_76e0_4b71_8bcd_f0d68b301648.slice/crio-5ca1d9ffef315ecb0db14f6e01c0c4672c7a9062bb06e44fa241ce6d7cbbd730 WatchSource:0}: Error finding container 5ca1d9ffef315ecb0db14f6e01c0c4672c7a9062bb06e44fa241ce6d7cbbd730: Status 404 returned error can't find the container with id 5ca1d9ffef315ecb0db14f6e01c0c4672c7a9062bb06e44fa241ce6d7cbbd730 Dec 03 07:21:13 crc kubenswrapper[4475]: I1203 07:21:13.535571 4475 generic.go:334] "Generic (PLEG): container finished" podID="97ae352b-76e0-4b71-8bcd-f0d68b301648" containerID="5ff931c6154ea697479a09b74e5f961af1e25d0479925178a7c2766da8acb4d1" 
exitCode=0 Dec 03 07:21:13 crc kubenswrapper[4475]: I1203 07:21:13.535609 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-q82ks" event={"ID":"97ae352b-76e0-4b71-8bcd-f0d68b301648","Type":"ContainerDied","Data":"5ff931c6154ea697479a09b74e5f961af1e25d0479925178a7c2766da8acb4d1"} Dec 03 07:21:13 crc kubenswrapper[4475]: I1203 07:21:13.536048 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-q82ks" event={"ID":"97ae352b-76e0-4b71-8bcd-f0d68b301648","Type":"ContainerStarted","Data":"5ca1d9ffef315ecb0db14f6e01c0c4672c7a9062bb06e44fa241ce6d7cbbd730"} Dec 03 07:21:14 crc kubenswrapper[4475]: I1203 07:21:14.543778 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-q82ks" event={"ID":"97ae352b-76e0-4b71-8bcd-f0d68b301648","Type":"ContainerStarted","Data":"697552274eff6cb34f564b213d9d3d4324670e8ca91188a9e2e262fe9296058f"} Dec 03 07:21:15 crc kubenswrapper[4475]: I1203 07:21:15.550920 4475 generic.go:334] "Generic (PLEG): container finished" podID="97ae352b-76e0-4b71-8bcd-f0d68b301648" containerID="697552274eff6cb34f564b213d9d3d4324670e8ca91188a9e2e262fe9296058f" exitCode=0 Dec 03 07:21:15 crc kubenswrapper[4475]: I1203 07:21:15.550998 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-q82ks" event={"ID":"97ae352b-76e0-4b71-8bcd-f0d68b301648","Type":"ContainerDied","Data":"697552274eff6cb34f564b213d9d3d4324670e8ca91188a9e2e262fe9296058f"} Dec 03 07:21:16 crc kubenswrapper[4475]: I1203 07:21:16.559050 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-q82ks" event={"ID":"97ae352b-76e0-4b71-8bcd-f0d68b301648","Type":"ContainerStarted","Data":"bf496a6641e7560e3f33913c9f1238adade2ca6161f9baf250c344e12481e4fd"} Dec 03 07:21:16 crc kubenswrapper[4475]: I1203 07:21:16.581773 4475 pod_startup_latency_tracker.go:104] "Observed pod 
startup duration" pod="openshift-marketplace/certified-operators-q82ks" podStartSLOduration=3.0889703490000002 podStartE2EDuration="5.581759012s" podCreationTimestamp="2025-12-03 07:21:11 +0000 UTC" firstStartedPulling="2025-12-03 07:21:13.537338123 +0000 UTC m=+2158.342236457" lastFinishedPulling="2025-12-03 07:21:16.030126786 +0000 UTC m=+2160.835025120" observedRunningTime="2025-12-03 07:21:16.576931222 +0000 UTC m=+2161.381829556" watchObservedRunningTime="2025-12-03 07:21:16.581759012 +0000 UTC m=+2161.386657345" Dec 03 07:21:19 crc kubenswrapper[4475]: I1203 07:21:19.423419 4475 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-76gx9"] Dec 03 07:21:19 crc kubenswrapper[4475]: I1203 07:21:19.425392 4475 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-76gx9" Dec 03 07:21:19 crc kubenswrapper[4475]: I1203 07:21:19.443007 4475 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-76gx9"] Dec 03 07:21:19 crc kubenswrapper[4475]: I1203 07:21:19.533030 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4wrrc\" (UniqueName: \"kubernetes.io/projected/2cfdfbde-f6bc-478b-9ea1-0478caebb8f5-kube-api-access-4wrrc\") pod \"community-operators-76gx9\" (UID: \"2cfdfbde-f6bc-478b-9ea1-0478caebb8f5\") " pod="openshift-marketplace/community-operators-76gx9" Dec 03 07:21:19 crc kubenswrapper[4475]: I1203 07:21:19.533240 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2cfdfbde-f6bc-478b-9ea1-0478caebb8f5-utilities\") pod \"community-operators-76gx9\" (UID: \"2cfdfbde-f6bc-478b-9ea1-0478caebb8f5\") " pod="openshift-marketplace/community-operators-76gx9" Dec 03 07:21:19 crc kubenswrapper[4475]: I1203 07:21:19.533276 4475 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2cfdfbde-f6bc-478b-9ea1-0478caebb8f5-catalog-content\") pod \"community-operators-76gx9\" (UID: \"2cfdfbde-f6bc-478b-9ea1-0478caebb8f5\") " pod="openshift-marketplace/community-operators-76gx9" Dec 03 07:21:19 crc kubenswrapper[4475]: I1203 07:21:19.635049 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4wrrc\" (UniqueName: \"kubernetes.io/projected/2cfdfbde-f6bc-478b-9ea1-0478caebb8f5-kube-api-access-4wrrc\") pod \"community-operators-76gx9\" (UID: \"2cfdfbde-f6bc-478b-9ea1-0478caebb8f5\") " pod="openshift-marketplace/community-operators-76gx9" Dec 03 07:21:19 crc kubenswrapper[4475]: I1203 07:21:19.635230 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2cfdfbde-f6bc-478b-9ea1-0478caebb8f5-utilities\") pod \"community-operators-76gx9\" (UID: \"2cfdfbde-f6bc-478b-9ea1-0478caebb8f5\") " pod="openshift-marketplace/community-operators-76gx9" Dec 03 07:21:19 crc kubenswrapper[4475]: I1203 07:21:19.635262 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2cfdfbde-f6bc-478b-9ea1-0478caebb8f5-catalog-content\") pod \"community-operators-76gx9\" (UID: \"2cfdfbde-f6bc-478b-9ea1-0478caebb8f5\") " pod="openshift-marketplace/community-operators-76gx9" Dec 03 07:21:19 crc kubenswrapper[4475]: I1203 07:21:19.636736 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2cfdfbde-f6bc-478b-9ea1-0478caebb8f5-catalog-content\") pod \"community-operators-76gx9\" (UID: \"2cfdfbde-f6bc-478b-9ea1-0478caebb8f5\") " pod="openshift-marketplace/community-operators-76gx9" Dec 03 07:21:19 crc kubenswrapper[4475]: I1203 07:21:19.636941 4475 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2cfdfbde-f6bc-478b-9ea1-0478caebb8f5-utilities\") pod \"community-operators-76gx9\" (UID: \"2cfdfbde-f6bc-478b-9ea1-0478caebb8f5\") " pod="openshift-marketplace/community-operators-76gx9" Dec 03 07:21:19 crc kubenswrapper[4475]: I1203 07:21:19.655045 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4wrrc\" (UniqueName: \"kubernetes.io/projected/2cfdfbde-f6bc-478b-9ea1-0478caebb8f5-kube-api-access-4wrrc\") pod \"community-operators-76gx9\" (UID: \"2cfdfbde-f6bc-478b-9ea1-0478caebb8f5\") " pod="openshift-marketplace/community-operators-76gx9" Dec 03 07:21:19 crc kubenswrapper[4475]: I1203 07:21:19.755296 4475 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-76gx9" Dec 03 07:21:20 crc kubenswrapper[4475]: W1203 07:21:20.211432 4475 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2cfdfbde_f6bc_478b_9ea1_0478caebb8f5.slice/crio-3676cfdc8f03dea3bd3725c11055f676e16fb3cc64b5f4beebf26050b4faa4da WatchSource:0}: Error finding container 3676cfdc8f03dea3bd3725c11055f676e16fb3cc64b5f4beebf26050b4faa4da: Status 404 returned error can't find the container with id 3676cfdc8f03dea3bd3725c11055f676e16fb3cc64b5f4beebf26050b4faa4da Dec 03 07:21:20 crc kubenswrapper[4475]: I1203 07:21:20.212996 4475 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-76gx9"] Dec 03 07:21:20 crc kubenswrapper[4475]: I1203 07:21:20.584870 4475 generic.go:334] "Generic (PLEG): container finished" podID="2cfdfbde-f6bc-478b-9ea1-0478caebb8f5" containerID="2b998c72628dc1b1c78ce12088a45bc3d08129a0afc168fde4761ebfce35bfe8" exitCode=0 Dec 03 07:21:20 crc kubenswrapper[4475]: I1203 07:21:20.584965 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/community-operators-76gx9" event={"ID":"2cfdfbde-f6bc-478b-9ea1-0478caebb8f5","Type":"ContainerDied","Data":"2b998c72628dc1b1c78ce12088a45bc3d08129a0afc168fde4761ebfce35bfe8"} Dec 03 07:21:20 crc kubenswrapper[4475]: I1203 07:21:20.585705 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-76gx9" event={"ID":"2cfdfbde-f6bc-478b-9ea1-0478caebb8f5","Type":"ContainerStarted","Data":"3676cfdc8f03dea3bd3725c11055f676e16fb3cc64b5f4beebf26050b4faa4da"} Dec 03 07:21:21 crc kubenswrapper[4475]: I1203 07:21:21.592770 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-76gx9" event={"ID":"2cfdfbde-f6bc-478b-9ea1-0478caebb8f5","Type":"ContainerStarted","Data":"a1c1478270cc19bc0d5ee3195c7c615453137dbe321df2e178a195f9b31ae8b5"} Dec 03 07:21:22 crc kubenswrapper[4475]: I1203 07:21:22.080184 4475 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-q82ks" Dec 03 07:21:22 crc kubenswrapper[4475]: I1203 07:21:22.080290 4475 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-q82ks" Dec 03 07:21:22 crc kubenswrapper[4475]: I1203 07:21:22.152262 4475 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-q82ks" Dec 03 07:21:22 crc kubenswrapper[4475]: I1203 07:21:22.611038 4475 generic.go:334] "Generic (PLEG): container finished" podID="2cfdfbde-f6bc-478b-9ea1-0478caebb8f5" containerID="a1c1478270cc19bc0d5ee3195c7c615453137dbe321df2e178a195f9b31ae8b5" exitCode=0 Dec 03 07:21:22 crc kubenswrapper[4475]: I1203 07:21:22.611142 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-76gx9" event={"ID":"2cfdfbde-f6bc-478b-9ea1-0478caebb8f5","Type":"ContainerDied","Data":"a1c1478270cc19bc0d5ee3195c7c615453137dbe321df2e178a195f9b31ae8b5"} 
Dec 03 07:21:22 crc kubenswrapper[4475]: I1203 07:21:22.648328 4475 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-q82ks" Dec 03 07:21:23 crc kubenswrapper[4475]: I1203 07:21:23.619018 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-76gx9" event={"ID":"2cfdfbde-f6bc-478b-9ea1-0478caebb8f5","Type":"ContainerStarted","Data":"552f1cc494ddf09b0b6c6149164c01194120f89638a3dbd3400d840b70b7b02d"} Dec 03 07:21:23 crc kubenswrapper[4475]: I1203 07:21:23.637226 4475 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-76gx9" podStartSLOduration=2.123214714 podStartE2EDuration="4.637212494s" podCreationTimestamp="2025-12-03 07:21:19 +0000 UTC" firstStartedPulling="2025-12-03 07:21:20.586179679 +0000 UTC m=+2165.391078014" lastFinishedPulling="2025-12-03 07:21:23.10017746 +0000 UTC m=+2167.905075794" observedRunningTime="2025-12-03 07:21:23.633187223 +0000 UTC m=+2168.438085557" watchObservedRunningTime="2025-12-03 07:21:23.637212494 +0000 UTC m=+2168.442110828" Dec 03 07:21:24 crc kubenswrapper[4475]: I1203 07:21:24.549845 4475 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-q82ks"] Dec 03 07:21:25 crc kubenswrapper[4475]: I1203 07:21:25.632435 4475 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-q82ks" podUID="97ae352b-76e0-4b71-8bcd-f0d68b301648" containerName="registry-server" containerID="cri-o://bf496a6641e7560e3f33913c9f1238adade2ca6161f9baf250c344e12481e4fd" gracePeriod=2 Dec 03 07:21:26 crc kubenswrapper[4475]: I1203 07:21:26.028791 4475 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-q82ks" Dec 03 07:21:26 crc kubenswrapper[4475]: I1203 07:21:26.165813 4475 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pnsr7\" (UniqueName: \"kubernetes.io/projected/97ae352b-76e0-4b71-8bcd-f0d68b301648-kube-api-access-pnsr7\") pod \"97ae352b-76e0-4b71-8bcd-f0d68b301648\" (UID: \"97ae352b-76e0-4b71-8bcd-f0d68b301648\") " Dec 03 07:21:26 crc kubenswrapper[4475]: I1203 07:21:26.165943 4475 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/97ae352b-76e0-4b71-8bcd-f0d68b301648-catalog-content\") pod \"97ae352b-76e0-4b71-8bcd-f0d68b301648\" (UID: \"97ae352b-76e0-4b71-8bcd-f0d68b301648\") " Dec 03 07:21:26 crc kubenswrapper[4475]: I1203 07:21:26.169519 4475 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/97ae352b-76e0-4b71-8bcd-f0d68b301648-utilities\") pod \"97ae352b-76e0-4b71-8bcd-f0d68b301648\" (UID: \"97ae352b-76e0-4b71-8bcd-f0d68b301648\") " Dec 03 07:21:26 crc kubenswrapper[4475]: I1203 07:21:26.176841 4475 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/97ae352b-76e0-4b71-8bcd-f0d68b301648-kube-api-access-pnsr7" (OuterVolumeSpecName: "kube-api-access-pnsr7") pod "97ae352b-76e0-4b71-8bcd-f0d68b301648" (UID: "97ae352b-76e0-4b71-8bcd-f0d68b301648"). InnerVolumeSpecName "kube-api-access-pnsr7". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 07:21:26 crc kubenswrapper[4475]: I1203 07:21:26.176945 4475 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/97ae352b-76e0-4b71-8bcd-f0d68b301648-utilities" (OuterVolumeSpecName: "utilities") pod "97ae352b-76e0-4b71-8bcd-f0d68b301648" (UID: "97ae352b-76e0-4b71-8bcd-f0d68b301648"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 07:21:26 crc kubenswrapper[4475]: I1203 07:21:26.201739 4475 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/97ae352b-76e0-4b71-8bcd-f0d68b301648-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "97ae352b-76e0-4b71-8bcd-f0d68b301648" (UID: "97ae352b-76e0-4b71-8bcd-f0d68b301648"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 07:21:26 crc kubenswrapper[4475]: I1203 07:21:26.274462 4475 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/97ae352b-76e0-4b71-8bcd-f0d68b301648-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 03 07:21:26 crc kubenswrapper[4475]: I1203 07:21:26.274723 4475 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/97ae352b-76e0-4b71-8bcd-f0d68b301648-utilities\") on node \"crc\" DevicePath \"\"" Dec 03 07:21:26 crc kubenswrapper[4475]: I1203 07:21:26.274781 4475 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pnsr7\" (UniqueName: \"kubernetes.io/projected/97ae352b-76e0-4b71-8bcd-f0d68b301648-kube-api-access-pnsr7\") on node \"crc\" DevicePath \"\"" Dec 03 07:21:26 crc kubenswrapper[4475]: I1203 07:21:26.638940 4475 generic.go:334] "Generic (PLEG): container finished" podID="97ae352b-76e0-4b71-8bcd-f0d68b301648" containerID="bf496a6641e7560e3f33913c9f1238adade2ca6161f9baf250c344e12481e4fd" exitCode=0 Dec 03 07:21:26 crc kubenswrapper[4475]: I1203 07:21:26.638974 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-q82ks" event={"ID":"97ae352b-76e0-4b71-8bcd-f0d68b301648","Type":"ContainerDied","Data":"bf496a6641e7560e3f33913c9f1238adade2ca6161f9baf250c344e12481e4fd"} Dec 03 07:21:26 crc kubenswrapper[4475]: I1203 07:21:26.638997 4475 kubelet.go:2453] "SyncLoop (PLEG): event 
for pod" pod="openshift-marketplace/certified-operators-q82ks" event={"ID":"97ae352b-76e0-4b71-8bcd-f0d68b301648","Type":"ContainerDied","Data":"5ca1d9ffef315ecb0db14f6e01c0c4672c7a9062bb06e44fa241ce6d7cbbd730"} Dec 03 07:21:26 crc kubenswrapper[4475]: I1203 07:21:26.639012 4475 scope.go:117] "RemoveContainer" containerID="bf496a6641e7560e3f33913c9f1238adade2ca6161f9baf250c344e12481e4fd" Dec 03 07:21:26 crc kubenswrapper[4475]: I1203 07:21:26.639014 4475 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-q82ks" Dec 03 07:21:26 crc kubenswrapper[4475]: I1203 07:21:26.654621 4475 scope.go:117] "RemoveContainer" containerID="697552274eff6cb34f564b213d9d3d4324670e8ca91188a9e2e262fe9296058f" Dec 03 07:21:26 crc kubenswrapper[4475]: I1203 07:21:26.676592 4475 scope.go:117] "RemoveContainer" containerID="5ff931c6154ea697479a09b74e5f961af1e25d0479925178a7c2766da8acb4d1" Dec 03 07:21:26 crc kubenswrapper[4475]: I1203 07:21:26.682551 4475 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-q82ks"] Dec 03 07:21:26 crc kubenswrapper[4475]: I1203 07:21:26.688960 4475 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-q82ks"] Dec 03 07:21:26 crc kubenswrapper[4475]: I1203 07:21:26.707579 4475 scope.go:117] "RemoveContainer" containerID="bf496a6641e7560e3f33913c9f1238adade2ca6161f9baf250c344e12481e4fd" Dec 03 07:21:26 crc kubenswrapper[4475]: E1203 07:21:26.707928 4475 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"bf496a6641e7560e3f33913c9f1238adade2ca6161f9baf250c344e12481e4fd\": container with ID starting with bf496a6641e7560e3f33913c9f1238adade2ca6161f9baf250c344e12481e4fd not found: ID does not exist" containerID="bf496a6641e7560e3f33913c9f1238adade2ca6161f9baf250c344e12481e4fd" Dec 03 07:21:26 crc kubenswrapper[4475]: I1203 
07:21:26.707969 4475 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bf496a6641e7560e3f33913c9f1238adade2ca6161f9baf250c344e12481e4fd"} err="failed to get container status \"bf496a6641e7560e3f33913c9f1238adade2ca6161f9baf250c344e12481e4fd\": rpc error: code = NotFound desc = could not find container \"bf496a6641e7560e3f33913c9f1238adade2ca6161f9baf250c344e12481e4fd\": container with ID starting with bf496a6641e7560e3f33913c9f1238adade2ca6161f9baf250c344e12481e4fd not found: ID does not exist" Dec 03 07:21:26 crc kubenswrapper[4475]: I1203 07:21:26.707993 4475 scope.go:117] "RemoveContainer" containerID="697552274eff6cb34f564b213d9d3d4324670e8ca91188a9e2e262fe9296058f" Dec 03 07:21:26 crc kubenswrapper[4475]: E1203 07:21:26.708286 4475 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"697552274eff6cb34f564b213d9d3d4324670e8ca91188a9e2e262fe9296058f\": container with ID starting with 697552274eff6cb34f564b213d9d3d4324670e8ca91188a9e2e262fe9296058f not found: ID does not exist" containerID="697552274eff6cb34f564b213d9d3d4324670e8ca91188a9e2e262fe9296058f" Dec 03 07:21:26 crc kubenswrapper[4475]: I1203 07:21:26.708306 4475 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"697552274eff6cb34f564b213d9d3d4324670e8ca91188a9e2e262fe9296058f"} err="failed to get container status \"697552274eff6cb34f564b213d9d3d4324670e8ca91188a9e2e262fe9296058f\": rpc error: code = NotFound desc = could not find container \"697552274eff6cb34f564b213d9d3d4324670e8ca91188a9e2e262fe9296058f\": container with ID starting with 697552274eff6cb34f564b213d9d3d4324670e8ca91188a9e2e262fe9296058f not found: ID does not exist" Dec 03 07:21:26 crc kubenswrapper[4475]: I1203 07:21:26.708319 4475 scope.go:117] "RemoveContainer" containerID="5ff931c6154ea697479a09b74e5f961af1e25d0479925178a7c2766da8acb4d1" Dec 03 07:21:26 crc 
kubenswrapper[4475]: E1203 07:21:26.708607 4475 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5ff931c6154ea697479a09b74e5f961af1e25d0479925178a7c2766da8acb4d1\": container with ID starting with 5ff931c6154ea697479a09b74e5f961af1e25d0479925178a7c2766da8acb4d1 not found: ID does not exist" containerID="5ff931c6154ea697479a09b74e5f961af1e25d0479925178a7c2766da8acb4d1" Dec 03 07:21:26 crc kubenswrapper[4475]: I1203 07:21:26.708625 4475 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5ff931c6154ea697479a09b74e5f961af1e25d0479925178a7c2766da8acb4d1"} err="failed to get container status \"5ff931c6154ea697479a09b74e5f961af1e25d0479925178a7c2766da8acb4d1\": rpc error: code = NotFound desc = could not find container \"5ff931c6154ea697479a09b74e5f961af1e25d0479925178a7c2766da8acb4d1\": container with ID starting with 5ff931c6154ea697479a09b74e5f961af1e25d0479925178a7c2766da8acb4d1 not found: ID does not exist" Dec 03 07:21:27 crc kubenswrapper[4475]: I1203 07:21:27.499597 4475 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="97ae352b-76e0-4b71-8bcd-f0d68b301648" path="/var/lib/kubelet/pods/97ae352b-76e0-4b71-8bcd-f0d68b301648/volumes" Dec 03 07:21:28 crc kubenswrapper[4475]: I1203 07:21:28.933095 4475 patch_prober.go:28] interesting pod/machine-config-daemon-tjbzg container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 03 07:21:28 crc kubenswrapper[4475]: I1203 07:21:28.933347 4475 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-tjbzg" podUID="91aee7be-4a52-4598-803f-2deebe0674de" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 
127.0.0.1:8798: connect: connection refused" Dec 03 07:21:29 crc kubenswrapper[4475]: I1203 07:21:29.756538 4475 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-76gx9" Dec 03 07:21:29 crc kubenswrapper[4475]: I1203 07:21:29.756797 4475 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-76gx9" Dec 03 07:21:29 crc kubenswrapper[4475]: I1203 07:21:29.789411 4475 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-76gx9" Dec 03 07:21:30 crc kubenswrapper[4475]: I1203 07:21:30.698889 4475 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-76gx9" Dec 03 07:21:30 crc kubenswrapper[4475]: I1203 07:21:30.736863 4475 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-76gx9"] Dec 03 07:21:32 crc kubenswrapper[4475]: I1203 07:21:32.677503 4475 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-76gx9" podUID="2cfdfbde-f6bc-478b-9ea1-0478caebb8f5" containerName="registry-server" containerID="cri-o://552f1cc494ddf09b0b6c6149164c01194120f89638a3dbd3400d840b70b7b02d" gracePeriod=2 Dec 03 07:21:33 crc kubenswrapper[4475]: I1203 07:21:33.025482 4475 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-76gx9" Dec 03 07:21:33 crc kubenswrapper[4475]: I1203 07:21:33.082550 4475 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2cfdfbde-f6bc-478b-9ea1-0478caebb8f5-utilities\") pod \"2cfdfbde-f6bc-478b-9ea1-0478caebb8f5\" (UID: \"2cfdfbde-f6bc-478b-9ea1-0478caebb8f5\") " Dec 03 07:21:33 crc kubenswrapper[4475]: I1203 07:21:33.082604 4475 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2cfdfbde-f6bc-478b-9ea1-0478caebb8f5-catalog-content\") pod \"2cfdfbde-f6bc-478b-9ea1-0478caebb8f5\" (UID: \"2cfdfbde-f6bc-478b-9ea1-0478caebb8f5\") " Dec 03 07:21:33 crc kubenswrapper[4475]: I1203 07:21:33.082652 4475 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4wrrc\" (UniqueName: \"kubernetes.io/projected/2cfdfbde-f6bc-478b-9ea1-0478caebb8f5-kube-api-access-4wrrc\") pod \"2cfdfbde-f6bc-478b-9ea1-0478caebb8f5\" (UID: \"2cfdfbde-f6bc-478b-9ea1-0478caebb8f5\") " Dec 03 07:21:33 crc kubenswrapper[4475]: I1203 07:21:33.084115 4475 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2cfdfbde-f6bc-478b-9ea1-0478caebb8f5-utilities" (OuterVolumeSpecName: "utilities") pod "2cfdfbde-f6bc-478b-9ea1-0478caebb8f5" (UID: "2cfdfbde-f6bc-478b-9ea1-0478caebb8f5"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 07:21:33 crc kubenswrapper[4475]: I1203 07:21:33.088396 4475 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2cfdfbde-f6bc-478b-9ea1-0478caebb8f5-kube-api-access-4wrrc" (OuterVolumeSpecName: "kube-api-access-4wrrc") pod "2cfdfbde-f6bc-478b-9ea1-0478caebb8f5" (UID: "2cfdfbde-f6bc-478b-9ea1-0478caebb8f5"). InnerVolumeSpecName "kube-api-access-4wrrc". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 07:21:33 crc kubenswrapper[4475]: I1203 07:21:33.123265 4475 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2cfdfbde-f6bc-478b-9ea1-0478caebb8f5-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "2cfdfbde-f6bc-478b-9ea1-0478caebb8f5" (UID: "2cfdfbde-f6bc-478b-9ea1-0478caebb8f5"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 07:21:33 crc kubenswrapper[4475]: I1203 07:21:33.184421 4475 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2cfdfbde-f6bc-478b-9ea1-0478caebb8f5-utilities\") on node \"crc\" DevicePath \"\"" Dec 03 07:21:33 crc kubenswrapper[4475]: I1203 07:21:33.184447 4475 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2cfdfbde-f6bc-478b-9ea1-0478caebb8f5-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 03 07:21:33 crc kubenswrapper[4475]: I1203 07:21:33.184473 4475 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4wrrc\" (UniqueName: \"kubernetes.io/projected/2cfdfbde-f6bc-478b-9ea1-0478caebb8f5-kube-api-access-4wrrc\") on node \"crc\" DevicePath \"\"" Dec 03 07:21:33 crc kubenswrapper[4475]: I1203 07:21:33.686880 4475 generic.go:334] "Generic (PLEG): container finished" podID="2cfdfbde-f6bc-478b-9ea1-0478caebb8f5" containerID="552f1cc494ddf09b0b6c6149164c01194120f89638a3dbd3400d840b70b7b02d" exitCode=0 Dec 03 07:21:33 crc kubenswrapper[4475]: I1203 07:21:33.686940 4475 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-76gx9" Dec 03 07:21:33 crc kubenswrapper[4475]: I1203 07:21:33.686973 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-76gx9" event={"ID":"2cfdfbde-f6bc-478b-9ea1-0478caebb8f5","Type":"ContainerDied","Data":"552f1cc494ddf09b0b6c6149164c01194120f89638a3dbd3400d840b70b7b02d"} Dec 03 07:21:33 crc kubenswrapper[4475]: I1203 07:21:33.687740 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-76gx9" event={"ID":"2cfdfbde-f6bc-478b-9ea1-0478caebb8f5","Type":"ContainerDied","Data":"3676cfdc8f03dea3bd3725c11055f676e16fb3cc64b5f4beebf26050b4faa4da"} Dec 03 07:21:33 crc kubenswrapper[4475]: I1203 07:21:33.687781 4475 scope.go:117] "RemoveContainer" containerID="552f1cc494ddf09b0b6c6149164c01194120f89638a3dbd3400d840b70b7b02d" Dec 03 07:21:33 crc kubenswrapper[4475]: I1203 07:21:33.704948 4475 scope.go:117] "RemoveContainer" containerID="a1c1478270cc19bc0d5ee3195c7c615453137dbe321df2e178a195f9b31ae8b5" Dec 03 07:21:33 crc kubenswrapper[4475]: I1203 07:21:33.705967 4475 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-76gx9"] Dec 03 07:21:33 crc kubenswrapper[4475]: I1203 07:21:33.718214 4475 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-76gx9"] Dec 03 07:21:33 crc kubenswrapper[4475]: I1203 07:21:33.727253 4475 scope.go:117] "RemoveContainer" containerID="2b998c72628dc1b1c78ce12088a45bc3d08129a0afc168fde4761ebfce35bfe8" Dec 03 07:21:33 crc kubenswrapper[4475]: I1203 07:21:33.751869 4475 scope.go:117] "RemoveContainer" containerID="552f1cc494ddf09b0b6c6149164c01194120f89638a3dbd3400d840b70b7b02d" Dec 03 07:21:33 crc kubenswrapper[4475]: E1203 07:21:33.752168 4475 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"552f1cc494ddf09b0b6c6149164c01194120f89638a3dbd3400d840b70b7b02d\": container with ID starting with 552f1cc494ddf09b0b6c6149164c01194120f89638a3dbd3400d840b70b7b02d not found: ID does not exist" containerID="552f1cc494ddf09b0b6c6149164c01194120f89638a3dbd3400d840b70b7b02d" Dec 03 07:21:33 crc kubenswrapper[4475]: I1203 07:21:33.752260 4475 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"552f1cc494ddf09b0b6c6149164c01194120f89638a3dbd3400d840b70b7b02d"} err="failed to get container status \"552f1cc494ddf09b0b6c6149164c01194120f89638a3dbd3400d840b70b7b02d\": rpc error: code = NotFound desc = could not find container \"552f1cc494ddf09b0b6c6149164c01194120f89638a3dbd3400d840b70b7b02d\": container with ID starting with 552f1cc494ddf09b0b6c6149164c01194120f89638a3dbd3400d840b70b7b02d not found: ID does not exist" Dec 03 07:21:33 crc kubenswrapper[4475]: I1203 07:21:33.752333 4475 scope.go:117] "RemoveContainer" containerID="a1c1478270cc19bc0d5ee3195c7c615453137dbe321df2e178a195f9b31ae8b5" Dec 03 07:21:33 crc kubenswrapper[4475]: E1203 07:21:33.752684 4475 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a1c1478270cc19bc0d5ee3195c7c615453137dbe321df2e178a195f9b31ae8b5\": container with ID starting with a1c1478270cc19bc0d5ee3195c7c615453137dbe321df2e178a195f9b31ae8b5 not found: ID does not exist" containerID="a1c1478270cc19bc0d5ee3195c7c615453137dbe321df2e178a195f9b31ae8b5" Dec 03 07:21:33 crc kubenswrapper[4475]: I1203 07:21:33.752709 4475 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a1c1478270cc19bc0d5ee3195c7c615453137dbe321df2e178a195f9b31ae8b5"} err="failed to get container status \"a1c1478270cc19bc0d5ee3195c7c615453137dbe321df2e178a195f9b31ae8b5\": rpc error: code = NotFound desc = could not find container \"a1c1478270cc19bc0d5ee3195c7c615453137dbe321df2e178a195f9b31ae8b5\": container with ID 
starting with a1c1478270cc19bc0d5ee3195c7c615453137dbe321df2e178a195f9b31ae8b5 not found: ID does not exist" Dec 03 07:21:33 crc kubenswrapper[4475]: I1203 07:21:33.752729 4475 scope.go:117] "RemoveContainer" containerID="2b998c72628dc1b1c78ce12088a45bc3d08129a0afc168fde4761ebfce35bfe8" Dec 03 07:21:33 crc kubenswrapper[4475]: E1203 07:21:33.753048 4475 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2b998c72628dc1b1c78ce12088a45bc3d08129a0afc168fde4761ebfce35bfe8\": container with ID starting with 2b998c72628dc1b1c78ce12088a45bc3d08129a0afc168fde4761ebfce35bfe8 not found: ID does not exist" containerID="2b998c72628dc1b1c78ce12088a45bc3d08129a0afc168fde4761ebfce35bfe8" Dec 03 07:21:33 crc kubenswrapper[4475]: I1203 07:21:33.753087 4475 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2b998c72628dc1b1c78ce12088a45bc3d08129a0afc168fde4761ebfce35bfe8"} err="failed to get container status \"2b998c72628dc1b1c78ce12088a45bc3d08129a0afc168fde4761ebfce35bfe8\": rpc error: code = NotFound desc = could not find container \"2b998c72628dc1b1c78ce12088a45bc3d08129a0afc168fde4761ebfce35bfe8\": container with ID starting with 2b998c72628dc1b1c78ce12088a45bc3d08129a0afc168fde4761ebfce35bfe8 not found: ID does not exist" Dec 03 07:21:35 crc kubenswrapper[4475]: I1203 07:21:35.499357 4475 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2cfdfbde-f6bc-478b-9ea1-0478caebb8f5" path="/var/lib/kubelet/pods/2cfdfbde-f6bc-478b-9ea1-0478caebb8f5/volumes" Dec 03 07:21:58 crc kubenswrapper[4475]: I1203 07:21:58.192242 4475 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-6qhll"] Dec 03 07:21:58 crc kubenswrapper[4475]: E1203 07:21:58.192934 4475 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="97ae352b-76e0-4b71-8bcd-f0d68b301648" containerName="extract-utilities" Dec 03 07:21:58 crc 
kubenswrapper[4475]: I1203 07:21:58.192947 4475 state_mem.go:107] "Deleted CPUSet assignment" podUID="97ae352b-76e0-4b71-8bcd-f0d68b301648" containerName="extract-utilities" Dec 03 07:21:58 crc kubenswrapper[4475]: E1203 07:21:58.192966 4475 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2cfdfbde-f6bc-478b-9ea1-0478caebb8f5" containerName="extract-content" Dec 03 07:21:58 crc kubenswrapper[4475]: I1203 07:21:58.192972 4475 state_mem.go:107] "Deleted CPUSet assignment" podUID="2cfdfbde-f6bc-478b-9ea1-0478caebb8f5" containerName="extract-content" Dec 03 07:21:58 crc kubenswrapper[4475]: E1203 07:21:58.192981 4475 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="97ae352b-76e0-4b71-8bcd-f0d68b301648" containerName="extract-content" Dec 03 07:21:58 crc kubenswrapper[4475]: I1203 07:21:58.192987 4475 state_mem.go:107] "Deleted CPUSet assignment" podUID="97ae352b-76e0-4b71-8bcd-f0d68b301648" containerName="extract-content" Dec 03 07:21:58 crc kubenswrapper[4475]: E1203 07:21:58.192998 4475 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2cfdfbde-f6bc-478b-9ea1-0478caebb8f5" containerName="extract-utilities" Dec 03 07:21:58 crc kubenswrapper[4475]: I1203 07:21:58.193003 4475 state_mem.go:107] "Deleted CPUSet assignment" podUID="2cfdfbde-f6bc-478b-9ea1-0478caebb8f5" containerName="extract-utilities" Dec 03 07:21:58 crc kubenswrapper[4475]: E1203 07:21:58.193017 4475 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2cfdfbde-f6bc-478b-9ea1-0478caebb8f5" containerName="registry-server" Dec 03 07:21:58 crc kubenswrapper[4475]: I1203 07:21:58.193022 4475 state_mem.go:107] "Deleted CPUSet assignment" podUID="2cfdfbde-f6bc-478b-9ea1-0478caebb8f5" containerName="registry-server" Dec 03 07:21:58 crc kubenswrapper[4475]: E1203 07:21:58.193032 4475 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="97ae352b-76e0-4b71-8bcd-f0d68b301648" containerName="registry-server" Dec 03 07:21:58 crc 
kubenswrapper[4475]: I1203 07:21:58.193037 4475 state_mem.go:107] "Deleted CPUSet assignment" podUID="97ae352b-76e0-4b71-8bcd-f0d68b301648" containerName="registry-server" Dec 03 07:21:58 crc kubenswrapper[4475]: I1203 07:21:58.193186 4475 memory_manager.go:354] "RemoveStaleState removing state" podUID="2cfdfbde-f6bc-478b-9ea1-0478caebb8f5" containerName="registry-server" Dec 03 07:21:58 crc kubenswrapper[4475]: I1203 07:21:58.193206 4475 memory_manager.go:354] "RemoveStaleState removing state" podUID="97ae352b-76e0-4b71-8bcd-f0d68b301648" containerName="registry-server" Dec 03 07:21:58 crc kubenswrapper[4475]: I1203 07:21:58.198136 4475 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-6qhll" Dec 03 07:21:58 crc kubenswrapper[4475]: I1203 07:21:58.200137 4475 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-6qhll"] Dec 03 07:21:58 crc kubenswrapper[4475]: I1203 07:21:58.319727 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t8cws\" (UniqueName: \"kubernetes.io/projected/f7a9d772-392f-4089-a080-f8591ef6e96b-kube-api-access-t8cws\") pod \"redhat-operators-6qhll\" (UID: \"f7a9d772-392f-4089-a080-f8591ef6e96b\") " pod="openshift-marketplace/redhat-operators-6qhll" Dec 03 07:21:58 crc kubenswrapper[4475]: I1203 07:21:58.320510 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f7a9d772-392f-4089-a080-f8591ef6e96b-catalog-content\") pod \"redhat-operators-6qhll\" (UID: \"f7a9d772-392f-4089-a080-f8591ef6e96b\") " pod="openshift-marketplace/redhat-operators-6qhll" Dec 03 07:21:58 crc kubenswrapper[4475]: I1203 07:21:58.320746 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: 
\"kubernetes.io/empty-dir/f7a9d772-392f-4089-a080-f8591ef6e96b-utilities\") pod \"redhat-operators-6qhll\" (UID: \"f7a9d772-392f-4089-a080-f8591ef6e96b\") " pod="openshift-marketplace/redhat-operators-6qhll" Dec 03 07:21:58 crc kubenswrapper[4475]: I1203 07:21:58.422543 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-t8cws\" (UniqueName: \"kubernetes.io/projected/f7a9d772-392f-4089-a080-f8591ef6e96b-kube-api-access-t8cws\") pod \"redhat-operators-6qhll\" (UID: \"f7a9d772-392f-4089-a080-f8591ef6e96b\") " pod="openshift-marketplace/redhat-operators-6qhll" Dec 03 07:21:58 crc kubenswrapper[4475]: I1203 07:21:58.422674 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f7a9d772-392f-4089-a080-f8591ef6e96b-catalog-content\") pod \"redhat-operators-6qhll\" (UID: \"f7a9d772-392f-4089-a080-f8591ef6e96b\") " pod="openshift-marketplace/redhat-operators-6qhll" Dec 03 07:21:58 crc kubenswrapper[4475]: I1203 07:21:58.422807 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f7a9d772-392f-4089-a080-f8591ef6e96b-utilities\") pod \"redhat-operators-6qhll\" (UID: \"f7a9d772-392f-4089-a080-f8591ef6e96b\") " pod="openshift-marketplace/redhat-operators-6qhll" Dec 03 07:21:58 crc kubenswrapper[4475]: I1203 07:21:58.423200 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f7a9d772-392f-4089-a080-f8591ef6e96b-utilities\") pod \"redhat-operators-6qhll\" (UID: \"f7a9d772-392f-4089-a080-f8591ef6e96b\") " pod="openshift-marketplace/redhat-operators-6qhll" Dec 03 07:21:58 crc kubenswrapper[4475]: I1203 07:21:58.423292 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: 
\"kubernetes.io/empty-dir/f7a9d772-392f-4089-a080-f8591ef6e96b-catalog-content\") pod \"redhat-operators-6qhll\" (UID: \"f7a9d772-392f-4089-a080-f8591ef6e96b\") " pod="openshift-marketplace/redhat-operators-6qhll" Dec 03 07:21:58 crc kubenswrapper[4475]: I1203 07:21:58.439135 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-t8cws\" (UniqueName: \"kubernetes.io/projected/f7a9d772-392f-4089-a080-f8591ef6e96b-kube-api-access-t8cws\") pod \"redhat-operators-6qhll\" (UID: \"f7a9d772-392f-4089-a080-f8591ef6e96b\") " pod="openshift-marketplace/redhat-operators-6qhll" Dec 03 07:21:58 crc kubenswrapper[4475]: I1203 07:21:58.513321 4475 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-6qhll" Dec 03 07:21:58 crc kubenswrapper[4475]: I1203 07:21:58.933103 4475 patch_prober.go:28] interesting pod/machine-config-daemon-tjbzg container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 03 07:21:58 crc kubenswrapper[4475]: I1203 07:21:58.933357 4475 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-tjbzg" podUID="91aee7be-4a52-4598-803f-2deebe0674de" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 03 07:21:58 crc kubenswrapper[4475]: I1203 07:21:58.933407 4475 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-tjbzg" Dec 03 07:21:58 crc kubenswrapper[4475]: I1203 07:21:58.933969 4475 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" 
containerStatusID={"Type":"cri-o","ID":"25465579b0091f19d7fac6131d69c8b26cb89f7f3ec3a2030f1c445fd208db5e"} pod="openshift-machine-config-operator/machine-config-daemon-tjbzg" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Dec 03 07:21:58 crc kubenswrapper[4475]: I1203 07:21:58.934025 4475 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-tjbzg" podUID="91aee7be-4a52-4598-803f-2deebe0674de" containerName="machine-config-daemon" containerID="cri-o://25465579b0091f19d7fac6131d69c8b26cb89f7f3ec3a2030f1c445fd208db5e" gracePeriod=600 Dec 03 07:21:58 crc kubenswrapper[4475]: I1203 07:21:58.956124 4475 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-6qhll"] Dec 03 07:21:59 crc kubenswrapper[4475]: I1203 07:21:59.863381 4475 generic.go:334] "Generic (PLEG): container finished" podID="91aee7be-4a52-4598-803f-2deebe0674de" containerID="25465579b0091f19d7fac6131d69c8b26cb89f7f3ec3a2030f1c445fd208db5e" exitCode=0 Dec 03 07:21:59 crc kubenswrapper[4475]: I1203 07:21:59.863480 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-tjbzg" event={"ID":"91aee7be-4a52-4598-803f-2deebe0674de","Type":"ContainerDied","Data":"25465579b0091f19d7fac6131d69c8b26cb89f7f3ec3a2030f1c445fd208db5e"} Dec 03 07:21:59 crc kubenswrapper[4475]: I1203 07:21:59.863765 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-tjbzg" event={"ID":"91aee7be-4a52-4598-803f-2deebe0674de","Type":"ContainerStarted","Data":"5cadad861f9f456c6d89da22b0f73a55e242fce31e15c0255ce2371d4734ca7f"} Dec 03 07:21:59 crc kubenswrapper[4475]: I1203 07:21:59.863783 4475 scope.go:117] "RemoveContainer" containerID="8c0d1f1df6ca180fa9eee37943bf61d5a3966b5f7a3ed0b6213a7a47c187d104" Dec 03 07:21:59 crc kubenswrapper[4475]: I1203 
07:21:59.865318 4475 generic.go:334] "Generic (PLEG): container finished" podID="f7a9d772-392f-4089-a080-f8591ef6e96b" containerID="6f832a615bc34d8d6c3fbf645e24f3a043c71a76bc3432afb2f0672c0643cfbe" exitCode=0 Dec 03 07:21:59 crc kubenswrapper[4475]: I1203 07:21:59.865340 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-6qhll" event={"ID":"f7a9d772-392f-4089-a080-f8591ef6e96b","Type":"ContainerDied","Data":"6f832a615bc34d8d6c3fbf645e24f3a043c71a76bc3432afb2f0672c0643cfbe"} Dec 03 07:21:59 crc kubenswrapper[4475]: I1203 07:21:59.865354 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-6qhll" event={"ID":"f7a9d772-392f-4089-a080-f8591ef6e96b","Type":"ContainerStarted","Data":"4de1e68497355eb8c5fbf07c2d3e6e6f6d8fbb305142fd773c9e9f0310fb4e91"} Dec 03 07:22:00 crc kubenswrapper[4475]: I1203 07:22:00.877247 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-6qhll" event={"ID":"f7a9d772-392f-4089-a080-f8591ef6e96b","Type":"ContainerStarted","Data":"f4b321e72dd0e6be2cd0db7fcc26e8ef784d23d1360a885df4580cdaa7519cad"} Dec 03 07:22:02 crc kubenswrapper[4475]: I1203 07:22:02.913792 4475 generic.go:334] "Generic (PLEG): container finished" podID="f7a9d772-392f-4089-a080-f8591ef6e96b" containerID="f4b321e72dd0e6be2cd0db7fcc26e8ef784d23d1360a885df4580cdaa7519cad" exitCode=0 Dec 03 07:22:02 crc kubenswrapper[4475]: I1203 07:22:02.913862 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-6qhll" event={"ID":"f7a9d772-392f-4089-a080-f8591ef6e96b","Type":"ContainerDied","Data":"f4b321e72dd0e6be2cd0db7fcc26e8ef784d23d1360a885df4580cdaa7519cad"} Dec 03 07:22:03 crc kubenswrapper[4475]: I1203 07:22:03.922048 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-6qhll" 
event={"ID":"f7a9d772-392f-4089-a080-f8591ef6e96b","Type":"ContainerStarted","Data":"f9ee4c347707e086e6944392d81df098ce77ee54c369db5e5c07b15b4c8efb0a"} Dec 03 07:22:03 crc kubenswrapper[4475]: I1203 07:22:03.939853 4475 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-6qhll" podStartSLOduration=2.3628603200000002 podStartE2EDuration="5.939840776s" podCreationTimestamp="2025-12-03 07:21:58 +0000 UTC" firstStartedPulling="2025-12-03 07:21:59.866869197 +0000 UTC m=+2204.671767531" lastFinishedPulling="2025-12-03 07:22:03.443849653 +0000 UTC m=+2208.248747987" observedRunningTime="2025-12-03 07:22:03.936110221 +0000 UTC m=+2208.741008555" watchObservedRunningTime="2025-12-03 07:22:03.939840776 +0000 UTC m=+2208.744739101" Dec 03 07:22:08 crc kubenswrapper[4475]: I1203 07:22:08.513930 4475 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-6qhll" Dec 03 07:22:08 crc kubenswrapper[4475]: I1203 07:22:08.514153 4475 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-6qhll" Dec 03 07:22:09 crc kubenswrapper[4475]: I1203 07:22:09.544526 4475 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-6qhll" podUID="f7a9d772-392f-4089-a080-f8591ef6e96b" containerName="registry-server" probeResult="failure" output=< Dec 03 07:22:09 crc kubenswrapper[4475]: timeout: failed to connect service ":50051" within 1s Dec 03 07:22:09 crc kubenswrapper[4475]: > Dec 03 07:22:18 crc kubenswrapper[4475]: I1203 07:22:18.546269 4475 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-6qhll" Dec 03 07:22:18 crc kubenswrapper[4475]: I1203 07:22:18.585649 4475 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-6qhll" Dec 03 07:22:18 crc kubenswrapper[4475]: 
I1203 07:22:18.774503 4475 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-6qhll"] Dec 03 07:22:20 crc kubenswrapper[4475]: I1203 07:22:20.023197 4475 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-6qhll" podUID="f7a9d772-392f-4089-a080-f8591ef6e96b" containerName="registry-server" containerID="cri-o://f9ee4c347707e086e6944392d81df098ce77ee54c369db5e5c07b15b4c8efb0a" gracePeriod=2 Dec 03 07:22:20 crc kubenswrapper[4475]: I1203 07:22:20.389954 4475 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-6qhll" Dec 03 07:22:20 crc kubenswrapper[4475]: I1203 07:22:20.404833 4475 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-t8cws\" (UniqueName: \"kubernetes.io/projected/f7a9d772-392f-4089-a080-f8591ef6e96b-kube-api-access-t8cws\") pod \"f7a9d772-392f-4089-a080-f8591ef6e96b\" (UID: \"f7a9d772-392f-4089-a080-f8591ef6e96b\") " Dec 03 07:22:20 crc kubenswrapper[4475]: I1203 07:22:20.404883 4475 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f7a9d772-392f-4089-a080-f8591ef6e96b-catalog-content\") pod \"f7a9d772-392f-4089-a080-f8591ef6e96b\" (UID: \"f7a9d772-392f-4089-a080-f8591ef6e96b\") " Dec 03 07:22:20 crc kubenswrapper[4475]: I1203 07:22:20.404963 4475 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f7a9d772-392f-4089-a080-f8591ef6e96b-utilities\") pod \"f7a9d772-392f-4089-a080-f8591ef6e96b\" (UID: \"f7a9d772-392f-4089-a080-f8591ef6e96b\") " Dec 03 07:22:20 crc kubenswrapper[4475]: I1203 07:22:20.411983 4475 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f7a9d772-392f-4089-a080-f8591ef6e96b-utilities" (OuterVolumeSpecName: 
"utilities") pod "f7a9d772-392f-4089-a080-f8591ef6e96b" (UID: "f7a9d772-392f-4089-a080-f8591ef6e96b"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 07:22:20 crc kubenswrapper[4475]: I1203 07:22:20.415187 4475 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f7a9d772-392f-4089-a080-f8591ef6e96b-kube-api-access-t8cws" (OuterVolumeSpecName: "kube-api-access-t8cws") pod "f7a9d772-392f-4089-a080-f8591ef6e96b" (UID: "f7a9d772-392f-4089-a080-f8591ef6e96b"). InnerVolumeSpecName "kube-api-access-t8cws". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 07:22:20 crc kubenswrapper[4475]: I1203 07:22:20.480987 4475 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f7a9d772-392f-4089-a080-f8591ef6e96b-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "f7a9d772-392f-4089-a080-f8591ef6e96b" (UID: "f7a9d772-392f-4089-a080-f8591ef6e96b"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 07:22:20 crc kubenswrapper[4475]: I1203 07:22:20.506338 4475 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-t8cws\" (UniqueName: \"kubernetes.io/projected/f7a9d772-392f-4089-a080-f8591ef6e96b-kube-api-access-t8cws\") on node \"crc\" DevicePath \"\"" Dec 03 07:22:20 crc kubenswrapper[4475]: I1203 07:22:20.506364 4475 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f7a9d772-392f-4089-a080-f8591ef6e96b-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 03 07:22:20 crc kubenswrapper[4475]: I1203 07:22:20.506374 4475 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f7a9d772-392f-4089-a080-f8591ef6e96b-utilities\") on node \"crc\" DevicePath \"\"" Dec 03 07:22:21 crc kubenswrapper[4475]: I1203 07:22:21.031483 4475 generic.go:334] "Generic (PLEG): container finished" podID="f7a9d772-392f-4089-a080-f8591ef6e96b" containerID="f9ee4c347707e086e6944392d81df098ce77ee54c369db5e5c07b15b4c8efb0a" exitCode=0 Dec 03 07:22:21 crc kubenswrapper[4475]: I1203 07:22:21.031540 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-6qhll" event={"ID":"f7a9d772-392f-4089-a080-f8591ef6e96b","Type":"ContainerDied","Data":"f9ee4c347707e086e6944392d81df098ce77ee54c369db5e5c07b15b4c8efb0a"} Dec 03 07:22:21 crc kubenswrapper[4475]: I1203 07:22:21.031718 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-6qhll" event={"ID":"f7a9d772-392f-4089-a080-f8591ef6e96b","Type":"ContainerDied","Data":"4de1e68497355eb8c5fbf07c2d3e6e6f6d8fbb305142fd773c9e9f0310fb4e91"} Dec 03 07:22:21 crc kubenswrapper[4475]: I1203 07:22:21.031738 4475 scope.go:117] "RemoveContainer" containerID="f9ee4c347707e086e6944392d81df098ce77ee54c369db5e5c07b15b4c8efb0a" Dec 03 07:22:21 crc kubenswrapper[4475]: I1203 07:22:21.031562 
4475 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-6qhll" Dec 03 07:22:21 crc kubenswrapper[4475]: I1203 07:22:21.063392 4475 scope.go:117] "RemoveContainer" containerID="f4b321e72dd0e6be2cd0db7fcc26e8ef784d23d1360a885df4580cdaa7519cad" Dec 03 07:22:21 crc kubenswrapper[4475]: I1203 07:22:21.064554 4475 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-6qhll"] Dec 03 07:22:21 crc kubenswrapper[4475]: I1203 07:22:21.071115 4475 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-6qhll"] Dec 03 07:22:21 crc kubenswrapper[4475]: I1203 07:22:21.079339 4475 scope.go:117] "RemoveContainer" containerID="6f832a615bc34d8d6c3fbf645e24f3a043c71a76bc3432afb2f0672c0643cfbe" Dec 03 07:22:21 crc kubenswrapper[4475]: I1203 07:22:21.112479 4475 scope.go:117] "RemoveContainer" containerID="f9ee4c347707e086e6944392d81df098ce77ee54c369db5e5c07b15b4c8efb0a" Dec 03 07:22:21 crc kubenswrapper[4475]: E1203 07:22:21.113731 4475 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f9ee4c347707e086e6944392d81df098ce77ee54c369db5e5c07b15b4c8efb0a\": container with ID starting with f9ee4c347707e086e6944392d81df098ce77ee54c369db5e5c07b15b4c8efb0a not found: ID does not exist" containerID="f9ee4c347707e086e6944392d81df098ce77ee54c369db5e5c07b15b4c8efb0a" Dec 03 07:22:21 crc kubenswrapper[4475]: I1203 07:22:21.113762 4475 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f9ee4c347707e086e6944392d81df098ce77ee54c369db5e5c07b15b4c8efb0a"} err="failed to get container status \"f9ee4c347707e086e6944392d81df098ce77ee54c369db5e5c07b15b4c8efb0a\": rpc error: code = NotFound desc = could not find container \"f9ee4c347707e086e6944392d81df098ce77ee54c369db5e5c07b15b4c8efb0a\": container with ID starting with 
f9ee4c347707e086e6944392d81df098ce77ee54c369db5e5c07b15b4c8efb0a not found: ID does not exist" Dec 03 07:22:21 crc kubenswrapper[4475]: I1203 07:22:21.113781 4475 scope.go:117] "RemoveContainer" containerID="f4b321e72dd0e6be2cd0db7fcc26e8ef784d23d1360a885df4580cdaa7519cad" Dec 03 07:22:21 crc kubenswrapper[4475]: E1203 07:22:21.114076 4475 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f4b321e72dd0e6be2cd0db7fcc26e8ef784d23d1360a885df4580cdaa7519cad\": container with ID starting with f4b321e72dd0e6be2cd0db7fcc26e8ef784d23d1360a885df4580cdaa7519cad not found: ID does not exist" containerID="f4b321e72dd0e6be2cd0db7fcc26e8ef784d23d1360a885df4580cdaa7519cad" Dec 03 07:22:21 crc kubenswrapper[4475]: I1203 07:22:21.114111 4475 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f4b321e72dd0e6be2cd0db7fcc26e8ef784d23d1360a885df4580cdaa7519cad"} err="failed to get container status \"f4b321e72dd0e6be2cd0db7fcc26e8ef784d23d1360a885df4580cdaa7519cad\": rpc error: code = NotFound desc = could not find container \"f4b321e72dd0e6be2cd0db7fcc26e8ef784d23d1360a885df4580cdaa7519cad\": container with ID starting with f4b321e72dd0e6be2cd0db7fcc26e8ef784d23d1360a885df4580cdaa7519cad not found: ID does not exist" Dec 03 07:22:21 crc kubenswrapper[4475]: I1203 07:22:21.114137 4475 scope.go:117] "RemoveContainer" containerID="6f832a615bc34d8d6c3fbf645e24f3a043c71a76bc3432afb2f0672c0643cfbe" Dec 03 07:22:21 crc kubenswrapper[4475]: E1203 07:22:21.114506 4475 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6f832a615bc34d8d6c3fbf645e24f3a043c71a76bc3432afb2f0672c0643cfbe\": container with ID starting with 6f832a615bc34d8d6c3fbf645e24f3a043c71a76bc3432afb2f0672c0643cfbe not found: ID does not exist" containerID="6f832a615bc34d8d6c3fbf645e24f3a043c71a76bc3432afb2f0672c0643cfbe" Dec 03 07:22:21 crc 
kubenswrapper[4475]: I1203 07:22:21.114529 4475 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6f832a615bc34d8d6c3fbf645e24f3a043c71a76bc3432afb2f0672c0643cfbe"} err="failed to get container status \"6f832a615bc34d8d6c3fbf645e24f3a043c71a76bc3432afb2f0672c0643cfbe\": rpc error: code = NotFound desc = could not find container \"6f832a615bc34d8d6c3fbf645e24f3a043c71a76bc3432afb2f0672c0643cfbe\": container with ID starting with 6f832a615bc34d8d6c3fbf645e24f3a043c71a76bc3432afb2f0672c0643cfbe not found: ID does not exist" Dec 03 07:22:21 crc kubenswrapper[4475]: I1203 07:22:21.498991 4475 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f7a9d772-392f-4089-a080-f8591ef6e96b" path="/var/lib/kubelet/pods/f7a9d772-392f-4089-a080-f8591ef6e96b/volumes" Dec 03 07:23:03 crc kubenswrapper[4475]: I1203 07:23:03.028173 4475 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-k46w9"] Dec 03 07:23:03 crc kubenswrapper[4475]: E1203 07:23:03.028842 4475 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f7a9d772-392f-4089-a080-f8591ef6e96b" containerName="registry-server" Dec 03 07:23:03 crc kubenswrapper[4475]: I1203 07:23:03.028854 4475 state_mem.go:107] "Deleted CPUSet assignment" podUID="f7a9d772-392f-4089-a080-f8591ef6e96b" containerName="registry-server" Dec 03 07:23:03 crc kubenswrapper[4475]: E1203 07:23:03.028880 4475 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f7a9d772-392f-4089-a080-f8591ef6e96b" containerName="extract-utilities" Dec 03 07:23:03 crc kubenswrapper[4475]: I1203 07:23:03.028885 4475 state_mem.go:107] "Deleted CPUSet assignment" podUID="f7a9d772-392f-4089-a080-f8591ef6e96b" containerName="extract-utilities" Dec 03 07:23:03 crc kubenswrapper[4475]: E1203 07:23:03.028894 4475 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f7a9d772-392f-4089-a080-f8591ef6e96b" containerName="extract-content" Dec 
03 07:23:03 crc kubenswrapper[4475]: I1203 07:23:03.028899 4475 state_mem.go:107] "Deleted CPUSet assignment" podUID="f7a9d772-392f-4089-a080-f8591ef6e96b" containerName="extract-content" Dec 03 07:23:03 crc kubenswrapper[4475]: I1203 07:23:03.029051 4475 memory_manager.go:354] "RemoveStaleState removing state" podUID="f7a9d772-392f-4089-a080-f8591ef6e96b" containerName="registry-server" Dec 03 07:23:03 crc kubenswrapper[4475]: I1203 07:23:03.030168 4475 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-k46w9" Dec 03 07:23:03 crc kubenswrapper[4475]: I1203 07:23:03.041861 4475 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-k46w9"] Dec 03 07:23:03 crc kubenswrapper[4475]: I1203 07:23:03.075668 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rr4f9\" (UniqueName: \"kubernetes.io/projected/cbb1e4bd-8821-42af-9c72-806a2f603ad5-kube-api-access-rr4f9\") pod \"redhat-marketplace-k46w9\" (UID: \"cbb1e4bd-8821-42af-9c72-806a2f603ad5\") " pod="openshift-marketplace/redhat-marketplace-k46w9" Dec 03 07:23:03 crc kubenswrapper[4475]: I1203 07:23:03.075740 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/cbb1e4bd-8821-42af-9c72-806a2f603ad5-catalog-content\") pod \"redhat-marketplace-k46w9\" (UID: \"cbb1e4bd-8821-42af-9c72-806a2f603ad5\") " pod="openshift-marketplace/redhat-marketplace-k46w9" Dec 03 07:23:03 crc kubenswrapper[4475]: I1203 07:23:03.075845 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/cbb1e4bd-8821-42af-9c72-806a2f603ad5-utilities\") pod \"redhat-marketplace-k46w9\" (UID: \"cbb1e4bd-8821-42af-9c72-806a2f603ad5\") " pod="openshift-marketplace/redhat-marketplace-k46w9" 
Dec 03 07:23:03 crc kubenswrapper[4475]: I1203 07:23:03.178694 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rr4f9\" (UniqueName: \"kubernetes.io/projected/cbb1e4bd-8821-42af-9c72-806a2f603ad5-kube-api-access-rr4f9\") pod \"redhat-marketplace-k46w9\" (UID: \"cbb1e4bd-8821-42af-9c72-806a2f603ad5\") " pod="openshift-marketplace/redhat-marketplace-k46w9" Dec 03 07:23:03 crc kubenswrapper[4475]: I1203 07:23:03.178809 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/cbb1e4bd-8821-42af-9c72-806a2f603ad5-catalog-content\") pod \"redhat-marketplace-k46w9\" (UID: \"cbb1e4bd-8821-42af-9c72-806a2f603ad5\") " pod="openshift-marketplace/redhat-marketplace-k46w9" Dec 03 07:23:03 crc kubenswrapper[4475]: I1203 07:23:03.178843 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/cbb1e4bd-8821-42af-9c72-806a2f603ad5-utilities\") pod \"redhat-marketplace-k46w9\" (UID: \"cbb1e4bd-8821-42af-9c72-806a2f603ad5\") " pod="openshift-marketplace/redhat-marketplace-k46w9" Dec 03 07:23:03 crc kubenswrapper[4475]: I1203 07:23:03.179494 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/cbb1e4bd-8821-42af-9c72-806a2f603ad5-utilities\") pod \"redhat-marketplace-k46w9\" (UID: \"cbb1e4bd-8821-42af-9c72-806a2f603ad5\") " pod="openshift-marketplace/redhat-marketplace-k46w9" Dec 03 07:23:03 crc kubenswrapper[4475]: I1203 07:23:03.179531 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/cbb1e4bd-8821-42af-9c72-806a2f603ad5-catalog-content\") pod \"redhat-marketplace-k46w9\" (UID: \"cbb1e4bd-8821-42af-9c72-806a2f603ad5\") " pod="openshift-marketplace/redhat-marketplace-k46w9" Dec 03 07:23:03 crc kubenswrapper[4475]: I1203 
07:23:03.195052 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rr4f9\" (UniqueName: \"kubernetes.io/projected/cbb1e4bd-8821-42af-9c72-806a2f603ad5-kube-api-access-rr4f9\") pod \"redhat-marketplace-k46w9\" (UID: \"cbb1e4bd-8821-42af-9c72-806a2f603ad5\") " pod="openshift-marketplace/redhat-marketplace-k46w9"
Dec 03 07:23:03 crc kubenswrapper[4475]: I1203 07:23:03.353236 4475 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-k46w9"
Dec 03 07:23:03 crc kubenswrapper[4475]: I1203 07:23:03.800524 4475 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-k46w9"]
Dec 03 07:23:04 crc kubenswrapper[4475]: I1203 07:23:04.303794 4475 generic.go:334] "Generic (PLEG): container finished" podID="cbb1e4bd-8821-42af-9c72-806a2f603ad5" containerID="b19e453eccdbf4d4945d3e329112f51a7af3d0cf913e67755bf3db3263e10c4e" exitCode=0
Dec 03 07:23:04 crc kubenswrapper[4475]: I1203 07:23:04.303833 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-k46w9" event={"ID":"cbb1e4bd-8821-42af-9c72-806a2f603ad5","Type":"ContainerDied","Data":"b19e453eccdbf4d4945d3e329112f51a7af3d0cf913e67755bf3db3263e10c4e"}
Dec 03 07:23:04 crc kubenswrapper[4475]: I1203 07:23:04.304754 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-k46w9" event={"ID":"cbb1e4bd-8821-42af-9c72-806a2f603ad5","Type":"ContainerStarted","Data":"526cb4fb681bf84b800397a94f79186686bc55d29945c0ba060ed92f8d46faf7"}
Dec 03 07:23:05 crc kubenswrapper[4475]: I1203 07:23:05.312484 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-k46w9" event={"ID":"cbb1e4bd-8821-42af-9c72-806a2f603ad5","Type":"ContainerStarted","Data":"acde0d3578aeb0226d2b6d2733a655dd16a5327f5de25c9784afb6f51a5fddb6"}
Dec 03 07:23:06 crc kubenswrapper[4475]: I1203 07:23:06.319632 4475 generic.go:334] "Generic (PLEG): container finished" podID="cbb1e4bd-8821-42af-9c72-806a2f603ad5" containerID="acde0d3578aeb0226d2b6d2733a655dd16a5327f5de25c9784afb6f51a5fddb6" exitCode=0
Dec 03 07:23:06 crc kubenswrapper[4475]: I1203 07:23:06.319727 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-k46w9" event={"ID":"cbb1e4bd-8821-42af-9c72-806a2f603ad5","Type":"ContainerDied","Data":"acde0d3578aeb0226d2b6d2733a655dd16a5327f5de25c9784afb6f51a5fddb6"}
Dec 03 07:23:07 crc kubenswrapper[4475]: I1203 07:23:07.327185 4475 generic.go:334] "Generic (PLEG): container finished" podID="054de259-96ce-4771-9e25-c1f170b93160" containerID="8c5c909c0f1efe8a7a3ee271ccbe3896d0bb08b125ce1b03046ef522c2b94f76" exitCode=0
Dec 03 07:23:07 crc kubenswrapper[4475]: I1203 07:23:07.327273 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-qnpdc" event={"ID":"054de259-96ce-4771-9e25-c1f170b93160","Type":"ContainerDied","Data":"8c5c909c0f1efe8a7a3ee271ccbe3896d0bb08b125ce1b03046ef522c2b94f76"}
Dec 03 07:23:07 crc kubenswrapper[4475]: I1203 07:23:07.330003 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-k46w9" event={"ID":"cbb1e4bd-8821-42af-9c72-806a2f603ad5","Type":"ContainerStarted","Data":"4b46707af5a7e3f0215fa3150c12e27364ebb5654ab77c42881200393f004d0e"}
Dec 03 07:23:07 crc kubenswrapper[4475]: I1203 07:23:07.356403 4475 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-k46w9" podStartSLOduration=1.67644682 podStartE2EDuration="4.356388508s" podCreationTimestamp="2025-12-03 07:23:03 +0000 UTC" firstStartedPulling="2025-12-03 07:23:04.304901155 +0000 UTC m=+2269.109799489" lastFinishedPulling="2025-12-03 07:23:06.984842844 +0000 UTC m=+2271.789741177" observedRunningTime="2025-12-03 07:23:07.35623024 +0000 UTC m=+2272.161128575" watchObservedRunningTime="2025-12-03 07:23:07.356388508 +0000 UTC m=+2272.161286841"
Dec 03 07:23:08 crc kubenswrapper[4475]: I1203 07:23:08.656101 4475 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-qnpdc"
Dec 03 07:23:08 crc kubenswrapper[4475]: I1203 07:23:08.681326 4475 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-compute-config-data-0\" (UniqueName: \"kubernetes.io/secret/054de259-96ce-4771-9e25-c1f170b93160-ceilometer-compute-config-data-0\") pod \"054de259-96ce-4771-9e25-c1f170b93160\" (UID: \"054de259-96ce-4771-9e25-c1f170b93160\") "
Dec 03 07:23:08 crc kubenswrapper[4475]: I1203 07:23:08.681437 4475 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/054de259-96ce-4771-9e25-c1f170b93160-telemetry-combined-ca-bundle\") pod \"054de259-96ce-4771-9e25-c1f170b93160\" (UID: \"054de259-96ce-4771-9e25-c1f170b93160\") "
Dec 03 07:23:08 crc kubenswrapper[4475]: I1203 07:23:08.681497 4475 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/054de259-96ce-4771-9e25-c1f170b93160-inventory\") pod \"054de259-96ce-4771-9e25-c1f170b93160\" (UID: \"054de259-96ce-4771-9e25-c1f170b93160\") "
Dec 03 07:23:08 crc kubenswrapper[4475]: I1203 07:23:08.681636 4475 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-compute-config-data-1\" (UniqueName: \"kubernetes.io/secret/054de259-96ce-4771-9e25-c1f170b93160-ceilometer-compute-config-data-1\") pod \"054de259-96ce-4771-9e25-c1f170b93160\" (UID: \"054de259-96ce-4771-9e25-c1f170b93160\") "
Dec 03 07:23:08 crc kubenswrapper[4475]: I1203 07:23:08.681668 4475 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-compute-config-data-2\" (UniqueName: \"kubernetes.io/secret/054de259-96ce-4771-9e25-c1f170b93160-ceilometer-compute-config-data-2\") pod \"054de259-96ce-4771-9e25-c1f170b93160\" (UID: \"054de259-96ce-4771-9e25-c1f170b93160\") "
Dec 03 07:23:08 crc kubenswrapper[4475]: I1203 07:23:08.681724 4475 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/054de259-96ce-4771-9e25-c1f170b93160-ssh-key\") pod \"054de259-96ce-4771-9e25-c1f170b93160\" (UID: \"054de259-96ce-4771-9e25-c1f170b93160\") "
Dec 03 07:23:08 crc kubenswrapper[4475]: I1203 07:23:08.681753 4475 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bz9wk\" (UniqueName: \"kubernetes.io/projected/054de259-96ce-4771-9e25-c1f170b93160-kube-api-access-bz9wk\") pod \"054de259-96ce-4771-9e25-c1f170b93160\" (UID: \"054de259-96ce-4771-9e25-c1f170b93160\") "
Dec 03 07:23:08 crc kubenswrapper[4475]: I1203 07:23:08.685963 4475 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/054de259-96ce-4771-9e25-c1f170b93160-telemetry-combined-ca-bundle" (OuterVolumeSpecName: "telemetry-combined-ca-bundle") pod "054de259-96ce-4771-9e25-c1f170b93160" (UID: "054de259-96ce-4771-9e25-c1f170b93160"). InnerVolumeSpecName "telemetry-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 03 07:23:08 crc kubenswrapper[4475]: I1203 07:23:08.686420 4475 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/054de259-96ce-4771-9e25-c1f170b93160-kube-api-access-bz9wk" (OuterVolumeSpecName: "kube-api-access-bz9wk") pod "054de259-96ce-4771-9e25-c1f170b93160" (UID: "054de259-96ce-4771-9e25-c1f170b93160"). InnerVolumeSpecName "kube-api-access-bz9wk". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 03 07:23:08 crc kubenswrapper[4475]: I1203 07:23:08.703767 4475 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/054de259-96ce-4771-9e25-c1f170b93160-inventory" (OuterVolumeSpecName: "inventory") pod "054de259-96ce-4771-9e25-c1f170b93160" (UID: "054de259-96ce-4771-9e25-c1f170b93160"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 03 07:23:08 crc kubenswrapper[4475]: I1203 07:23:08.704015 4475 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/054de259-96ce-4771-9e25-c1f170b93160-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "054de259-96ce-4771-9e25-c1f170b93160" (UID: "054de259-96ce-4771-9e25-c1f170b93160"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 03 07:23:08 crc kubenswrapper[4475]: I1203 07:23:08.705745 4475 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/054de259-96ce-4771-9e25-c1f170b93160-ceilometer-compute-config-data-1" (OuterVolumeSpecName: "ceilometer-compute-config-data-1") pod "054de259-96ce-4771-9e25-c1f170b93160" (UID: "054de259-96ce-4771-9e25-c1f170b93160"). InnerVolumeSpecName "ceilometer-compute-config-data-1". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 03 07:23:08 crc kubenswrapper[4475]: I1203 07:23:08.705882 4475 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/054de259-96ce-4771-9e25-c1f170b93160-ceilometer-compute-config-data-0" (OuterVolumeSpecName: "ceilometer-compute-config-data-0") pod "054de259-96ce-4771-9e25-c1f170b93160" (UID: "054de259-96ce-4771-9e25-c1f170b93160"). InnerVolumeSpecName "ceilometer-compute-config-data-0". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 03 07:23:08 crc kubenswrapper[4475]: I1203 07:23:08.711104 4475 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/054de259-96ce-4771-9e25-c1f170b93160-ceilometer-compute-config-data-2" (OuterVolumeSpecName: "ceilometer-compute-config-data-2") pod "054de259-96ce-4771-9e25-c1f170b93160" (UID: "054de259-96ce-4771-9e25-c1f170b93160"). InnerVolumeSpecName "ceilometer-compute-config-data-2". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 03 07:23:08 crc kubenswrapper[4475]: I1203 07:23:08.783102 4475 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/054de259-96ce-4771-9e25-c1f170b93160-ssh-key\") on node \"crc\" DevicePath \"\""
Dec 03 07:23:08 crc kubenswrapper[4475]: I1203 07:23:08.783130 4475 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bz9wk\" (UniqueName: \"kubernetes.io/projected/054de259-96ce-4771-9e25-c1f170b93160-kube-api-access-bz9wk\") on node \"crc\" DevicePath \"\""
Dec 03 07:23:08 crc kubenswrapper[4475]: I1203 07:23:08.783142 4475 reconciler_common.go:293] "Volume detached for volume \"ceilometer-compute-config-data-0\" (UniqueName: \"kubernetes.io/secret/054de259-96ce-4771-9e25-c1f170b93160-ceilometer-compute-config-data-0\") on node \"crc\" DevicePath \"\""
Dec 03 07:23:08 crc kubenswrapper[4475]: I1203 07:23:08.783151 4475 reconciler_common.go:293] "Volume detached for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/054de259-96ce-4771-9e25-c1f170b93160-telemetry-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Dec 03 07:23:08 crc kubenswrapper[4475]: I1203 07:23:08.783160 4475 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/054de259-96ce-4771-9e25-c1f170b93160-inventory\") on node \"crc\" DevicePath \"\""
Dec 03 07:23:08 crc kubenswrapper[4475]: I1203 07:23:08.783168 4475 reconciler_common.go:293] "Volume detached for volume \"ceilometer-compute-config-data-1\" (UniqueName: \"kubernetes.io/secret/054de259-96ce-4771-9e25-c1f170b93160-ceilometer-compute-config-data-1\") on node \"crc\" DevicePath \"\""
Dec 03 07:23:08 crc kubenswrapper[4475]: I1203 07:23:08.783177 4475 reconciler_common.go:293] "Volume detached for volume \"ceilometer-compute-config-data-2\" (UniqueName: \"kubernetes.io/secret/054de259-96ce-4771-9e25-c1f170b93160-ceilometer-compute-config-data-2\") on node \"crc\" DevicePath \"\""
Dec 03 07:23:09 crc kubenswrapper[4475]: I1203 07:23:09.344621 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-qnpdc" event={"ID":"054de259-96ce-4771-9e25-c1f170b93160","Type":"ContainerDied","Data":"8ab49dc1c39d274d6db113ef193fc8137c35235919763bc4f4946fe67d53a8c3"}
Dec 03 07:23:09 crc kubenswrapper[4475]: I1203 07:23:09.344661 4475 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="8ab49dc1c39d274d6db113ef193fc8137c35235919763bc4f4946fe67d53a8c3"
Dec 03 07:23:09 crc kubenswrapper[4475]: I1203 07:23:09.344702 4475 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-qnpdc"
Dec 03 07:23:13 crc kubenswrapper[4475]: I1203 07:23:13.354239 4475 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-k46w9"
Dec 03 07:23:13 crc kubenswrapper[4475]: I1203 07:23:13.354649 4475 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-k46w9"
Dec 03 07:23:13 crc kubenswrapper[4475]: I1203 07:23:13.388787 4475 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-k46w9"
Dec 03 07:23:13 crc kubenswrapper[4475]: I1203 07:23:13.427549 4475 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-k46w9"
Dec 03 07:23:13 crc kubenswrapper[4475]: I1203 07:23:13.632963 4475 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-k46w9"]
Dec 03 07:23:15 crc kubenswrapper[4475]: I1203 07:23:15.383405 4475 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-k46w9" podUID="cbb1e4bd-8821-42af-9c72-806a2f603ad5" containerName="registry-server" containerID="cri-o://4b46707af5a7e3f0215fa3150c12e27364ebb5654ab77c42881200393f004d0e" gracePeriod=2
Dec 03 07:23:15 crc kubenswrapper[4475]: I1203 07:23:15.741306 4475 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-k46w9"
Dec 03 07:23:15 crc kubenswrapper[4475]: I1203 07:23:15.784706 4475 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rr4f9\" (UniqueName: \"kubernetes.io/projected/cbb1e4bd-8821-42af-9c72-806a2f603ad5-kube-api-access-rr4f9\") pod \"cbb1e4bd-8821-42af-9c72-806a2f603ad5\" (UID: \"cbb1e4bd-8821-42af-9c72-806a2f603ad5\") "
Dec 03 07:23:15 crc kubenswrapper[4475]: I1203 07:23:15.785292 4475 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/cbb1e4bd-8821-42af-9c72-806a2f603ad5-utilities\") pod \"cbb1e4bd-8821-42af-9c72-806a2f603ad5\" (UID: \"cbb1e4bd-8821-42af-9c72-806a2f603ad5\") "
Dec 03 07:23:15 crc kubenswrapper[4475]: I1203 07:23:15.785778 4475 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/cbb1e4bd-8821-42af-9c72-806a2f603ad5-catalog-content\") pod \"cbb1e4bd-8821-42af-9c72-806a2f603ad5\" (UID: \"cbb1e4bd-8821-42af-9c72-806a2f603ad5\") "
Dec 03 07:23:15 crc kubenswrapper[4475]: I1203 07:23:15.788682 4475 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/cbb1e4bd-8821-42af-9c72-806a2f603ad5-utilities" (OuterVolumeSpecName: "utilities") pod "cbb1e4bd-8821-42af-9c72-806a2f603ad5" (UID: "cbb1e4bd-8821-42af-9c72-806a2f603ad5"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Dec 03 07:23:15 crc kubenswrapper[4475]: I1203 07:23:15.798581 4475 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cbb1e4bd-8821-42af-9c72-806a2f603ad5-kube-api-access-rr4f9" (OuterVolumeSpecName: "kube-api-access-rr4f9") pod "cbb1e4bd-8821-42af-9c72-806a2f603ad5" (UID: "cbb1e4bd-8821-42af-9c72-806a2f603ad5"). InnerVolumeSpecName "kube-api-access-rr4f9". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 03 07:23:15 crc kubenswrapper[4475]: I1203 07:23:15.855692 4475 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/cbb1e4bd-8821-42af-9c72-806a2f603ad5-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "cbb1e4bd-8821-42af-9c72-806a2f603ad5" (UID: "cbb1e4bd-8821-42af-9c72-806a2f603ad5"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Dec 03 07:23:15 crc kubenswrapper[4475]: I1203 07:23:15.889942 4475 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rr4f9\" (UniqueName: \"kubernetes.io/projected/cbb1e4bd-8821-42af-9c72-806a2f603ad5-kube-api-access-rr4f9\") on node \"crc\" DevicePath \"\""
Dec 03 07:23:15 crc kubenswrapper[4475]: I1203 07:23:15.889962 4475 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/cbb1e4bd-8821-42af-9c72-806a2f603ad5-utilities\") on node \"crc\" DevicePath \"\""
Dec 03 07:23:15 crc kubenswrapper[4475]: I1203 07:23:15.889970 4475 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/cbb1e4bd-8821-42af-9c72-806a2f603ad5-catalog-content\") on node \"crc\" DevicePath \"\""
Dec 03 07:23:16 crc kubenswrapper[4475]: I1203 07:23:16.392859 4475 generic.go:334] "Generic (PLEG): container finished" podID="cbb1e4bd-8821-42af-9c72-806a2f603ad5" containerID="4b46707af5a7e3f0215fa3150c12e27364ebb5654ab77c42881200393f004d0e" exitCode=0
Dec 03 07:23:16 crc kubenswrapper[4475]: I1203 07:23:16.392897 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-k46w9" event={"ID":"cbb1e4bd-8821-42af-9c72-806a2f603ad5","Type":"ContainerDied","Data":"4b46707af5a7e3f0215fa3150c12e27364ebb5654ab77c42881200393f004d0e"}
Dec 03 07:23:16 crc kubenswrapper[4475]: I1203 07:23:16.392908 4475 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-k46w9"
Dec 03 07:23:16 crc kubenswrapper[4475]: I1203 07:23:16.392921 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-k46w9" event={"ID":"cbb1e4bd-8821-42af-9c72-806a2f603ad5","Type":"ContainerDied","Data":"526cb4fb681bf84b800397a94f79186686bc55d29945c0ba060ed92f8d46faf7"}
Dec 03 07:23:16 crc kubenswrapper[4475]: I1203 07:23:16.392937 4475 scope.go:117] "RemoveContainer" containerID="4b46707af5a7e3f0215fa3150c12e27364ebb5654ab77c42881200393f004d0e"
Dec 03 07:23:16 crc kubenswrapper[4475]: I1203 07:23:16.412621 4475 scope.go:117] "RemoveContainer" containerID="acde0d3578aeb0226d2b6d2733a655dd16a5327f5de25c9784afb6f51a5fddb6"
Dec 03 07:23:16 crc kubenswrapper[4475]: I1203 07:23:16.420295 4475 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-k46w9"]
Dec 03 07:23:16 crc kubenswrapper[4475]: I1203 07:23:16.427611 4475 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-k46w9"]
Dec 03 07:23:16 crc kubenswrapper[4475]: I1203 07:23:16.431912 4475 scope.go:117] "RemoveContainer" containerID="b19e453eccdbf4d4945d3e329112f51a7af3d0cf913e67755bf3db3263e10c4e"
Dec 03 07:23:16 crc kubenswrapper[4475]: I1203 07:23:16.463897 4475 scope.go:117] "RemoveContainer" containerID="4b46707af5a7e3f0215fa3150c12e27364ebb5654ab77c42881200393f004d0e"
Dec 03 07:23:16 crc kubenswrapper[4475]: E1203 07:23:16.464420 4475 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4b46707af5a7e3f0215fa3150c12e27364ebb5654ab77c42881200393f004d0e\": container with ID starting with 4b46707af5a7e3f0215fa3150c12e27364ebb5654ab77c42881200393f004d0e not found: ID does not exist" containerID="4b46707af5a7e3f0215fa3150c12e27364ebb5654ab77c42881200393f004d0e"
Dec 03 07:23:16 crc kubenswrapper[4475]: I1203 07:23:16.464485 4475 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4b46707af5a7e3f0215fa3150c12e27364ebb5654ab77c42881200393f004d0e"} err="failed to get container status \"4b46707af5a7e3f0215fa3150c12e27364ebb5654ab77c42881200393f004d0e\": rpc error: code = NotFound desc = could not find container \"4b46707af5a7e3f0215fa3150c12e27364ebb5654ab77c42881200393f004d0e\": container with ID starting with 4b46707af5a7e3f0215fa3150c12e27364ebb5654ab77c42881200393f004d0e not found: ID does not exist"
Dec 03 07:23:16 crc kubenswrapper[4475]: I1203 07:23:16.464503 4475 scope.go:117] "RemoveContainer" containerID="acde0d3578aeb0226d2b6d2733a655dd16a5327f5de25c9784afb6f51a5fddb6"
Dec 03 07:23:16 crc kubenswrapper[4475]: E1203 07:23:16.464895 4475 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"acde0d3578aeb0226d2b6d2733a655dd16a5327f5de25c9784afb6f51a5fddb6\": container with ID starting with acde0d3578aeb0226d2b6d2733a655dd16a5327f5de25c9784afb6f51a5fddb6 not found: ID does not exist" containerID="acde0d3578aeb0226d2b6d2733a655dd16a5327f5de25c9784afb6f51a5fddb6"
Dec 03 07:23:16 crc kubenswrapper[4475]: I1203 07:23:16.464923 4475 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"acde0d3578aeb0226d2b6d2733a655dd16a5327f5de25c9784afb6f51a5fddb6"} err="failed to get container status \"acde0d3578aeb0226d2b6d2733a655dd16a5327f5de25c9784afb6f51a5fddb6\": rpc error: code = NotFound desc = could not find container \"acde0d3578aeb0226d2b6d2733a655dd16a5327f5de25c9784afb6f51a5fddb6\": container with ID starting with acde0d3578aeb0226d2b6d2733a655dd16a5327f5de25c9784afb6f51a5fddb6 not found: ID does not exist"
Dec 03 07:23:16 crc kubenswrapper[4475]: I1203 07:23:16.464943 4475 scope.go:117] "RemoveContainer" containerID="b19e453eccdbf4d4945d3e329112f51a7af3d0cf913e67755bf3db3263e10c4e"
Dec 03 07:23:16 crc kubenswrapper[4475]: E1203 07:23:16.465186 4475 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b19e453eccdbf4d4945d3e329112f51a7af3d0cf913e67755bf3db3263e10c4e\": container with ID starting with b19e453eccdbf4d4945d3e329112f51a7af3d0cf913e67755bf3db3263e10c4e not found: ID does not exist" containerID="b19e453eccdbf4d4945d3e329112f51a7af3d0cf913e67755bf3db3263e10c4e"
Dec 03 07:23:16 crc kubenswrapper[4475]: I1203 07:23:16.465211 4475 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b19e453eccdbf4d4945d3e329112f51a7af3d0cf913e67755bf3db3263e10c4e"} err="failed to get container status \"b19e453eccdbf4d4945d3e329112f51a7af3d0cf913e67755bf3db3263e10c4e\": rpc error: code = NotFound desc = could not find container \"b19e453eccdbf4d4945d3e329112f51a7af3d0cf913e67755bf3db3263e10c4e\": container with ID starting with b19e453eccdbf4d4945d3e329112f51a7af3d0cf913e67755bf3db3263e10c4e not found: ID does not exist"
Dec 03 07:23:17 crc kubenswrapper[4475]: I1203 07:23:17.498348 4475 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cbb1e4bd-8821-42af-9c72-806a2f603ad5" path="/var/lib/kubelet/pods/cbb1e4bd-8821-42af-9c72-806a2f603ad5/volumes"
Dec 03 07:23:58 crc kubenswrapper[4475]: I1203 07:23:58.401359 4475 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/tempest-tests-tempest-s00-multi-thread-testing"]
Dec 03 07:23:58 crc kubenswrapper[4475]: E1203 07:23:58.402294 4475 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cbb1e4bd-8821-42af-9c72-806a2f603ad5" containerName="registry-server"
Dec 03 07:23:58 crc kubenswrapper[4475]: I1203 07:23:58.402308 4475 state_mem.go:107] "Deleted CPUSet assignment" podUID="cbb1e4bd-8821-42af-9c72-806a2f603ad5" containerName="registry-server"
Dec 03 07:23:58 crc kubenswrapper[4475]: E1203 07:23:58.402348 4475 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cbb1e4bd-8821-42af-9c72-806a2f603ad5" containerName="extract-content"
Dec 03 07:23:58 crc kubenswrapper[4475]: I1203 07:23:58.402355 4475 state_mem.go:107] "Deleted CPUSet assignment" podUID="cbb1e4bd-8821-42af-9c72-806a2f603ad5" containerName="extract-content"
Dec 03 07:23:58 crc kubenswrapper[4475]: E1203 07:23:58.402373 4475 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cbb1e4bd-8821-42af-9c72-806a2f603ad5" containerName="extract-utilities"
Dec 03 07:23:58 crc kubenswrapper[4475]: I1203 07:23:58.402379 4475 state_mem.go:107] "Deleted CPUSet assignment" podUID="cbb1e4bd-8821-42af-9c72-806a2f603ad5" containerName="extract-utilities"
Dec 03 07:23:58 crc kubenswrapper[4475]: E1203 07:23:58.402409 4475 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="054de259-96ce-4771-9e25-c1f170b93160" containerName="telemetry-edpm-deployment-openstack-edpm-ipam"
Dec 03 07:23:58 crc kubenswrapper[4475]: I1203 07:23:58.402416 4475 state_mem.go:107] "Deleted CPUSet assignment" podUID="054de259-96ce-4771-9e25-c1f170b93160" containerName="telemetry-edpm-deployment-openstack-edpm-ipam"
Dec 03 07:23:58 crc kubenswrapper[4475]: I1203 07:23:58.404338 4475 memory_manager.go:354] "RemoveStaleState removing state" podUID="cbb1e4bd-8821-42af-9c72-806a2f603ad5" containerName="registry-server"
Dec 03 07:23:58 crc kubenswrapper[4475]: I1203 07:23:58.404395 4475 memory_manager.go:354] "RemoveStaleState removing state" podUID="054de259-96ce-4771-9e25-c1f170b93160" containerName="telemetry-edpm-deployment-openstack-edpm-ipam"
Dec 03 07:23:58 crc kubenswrapper[4475]: I1203 07:23:58.407091 4475 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/tempest-tests-tempest-s00-multi-thread-testing"
Dec 03 07:23:58 crc kubenswrapper[4475]: I1203 07:23:58.412661 4475 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"tempest-tests-tempest-custom-data-s0"
Dec 03 07:23:58 crc kubenswrapper[4475]: I1203 07:23:58.412691 4475 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"tempest-tests-tempest-env-vars-s0"
Dec 03 07:23:58 crc kubenswrapper[4475]: I1203 07:23:58.412781 4475 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"default-dockercfg-dclp7"
Dec 03 07:23:58 crc kubenswrapper[4475]: I1203 07:23:58.412807 4475 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"test-operator-controller-priv-key"
Dec 03 07:23:58 crc kubenswrapper[4475]: I1203 07:23:58.431021 4475 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/tempest-tests-tempest-s00-multi-thread-testing"]
Dec 03 07:23:58 crc kubenswrapper[4475]: I1203 07:23:58.525766 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/6ebcee18-a6ef-4674-aea6-1b33ed3c2224-openstack-config\") pod \"tempest-tests-tempest-s00-multi-thread-testing\" (UID: \"6ebcee18-a6ef-4674-aea6-1b33ed3c2224\") " pod="openstack/tempest-tests-tempest-s00-multi-thread-testing"
Dec 03 07:23:58 crc kubenswrapper[4475]: I1203 07:23:58.525803 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/secret/6ebcee18-a6ef-4674-aea6-1b33ed3c2224-ca-certs\") pod \"tempest-tests-tempest-s00-multi-thread-testing\" (UID: \"6ebcee18-a6ef-4674-aea6-1b33ed3c2224\") " pod="openstack/tempest-tests-tempest-s00-multi-thread-testing"
Dec 03 07:23:58 crc kubenswrapper[4475]: I1203 07:23:58.525837 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"test-operator-ephemeral-temporary\" (UniqueName: \"kubernetes.io/empty-dir/6ebcee18-a6ef-4674-aea6-1b33ed3c2224-test-operator-ephemeral-temporary\") pod \"tempest-tests-tempest-s00-multi-thread-testing\" (UID: \"6ebcee18-a6ef-4674-aea6-1b33ed3c2224\") " pod="openstack/tempest-tests-tempest-s00-multi-thread-testing"
Dec 03 07:23:58 crc kubenswrapper[4475]: I1203 07:23:58.525871 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"test-operator-ephemeral-workdir\" (UniqueName: \"kubernetes.io/empty-dir/6ebcee18-a6ef-4674-aea6-1b33ed3c2224-test-operator-ephemeral-workdir\") pod \"tempest-tests-tempest-s00-multi-thread-testing\" (UID: \"6ebcee18-a6ef-4674-aea6-1b33ed3c2224\") " pod="openstack/tempest-tests-tempest-s00-multi-thread-testing"
Dec 03 07:23:58 crc kubenswrapper[4475]: I1203 07:23:58.525897 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/6ebcee18-a6ef-4674-aea6-1b33ed3c2224-openstack-config-secret\") pod \"tempest-tests-tempest-s00-multi-thread-testing\" (UID: \"6ebcee18-a6ef-4674-aea6-1b33ed3c2224\") " pod="openstack/tempest-tests-tempest-s00-multi-thread-testing"
Dec 03 07:23:58 crc kubenswrapper[4475]: I1203 07:23:58.525927 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/6ebcee18-a6ef-4674-aea6-1b33ed3c2224-config-data\") pod \"tempest-tests-tempest-s00-multi-thread-testing\" (UID: \"6ebcee18-a6ef-4674-aea6-1b33ed3c2224\") " pod="openstack/tempest-tests-tempest-s00-multi-thread-testing"
Dec 03 07:23:58 crc kubenswrapper[4475]: I1203 07:23:58.525943 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-thdrl\" (UniqueName: \"kubernetes.io/projected/6ebcee18-a6ef-4674-aea6-1b33ed3c2224-kube-api-access-thdrl\") pod \"tempest-tests-tempest-s00-multi-thread-testing\" (UID: \"6ebcee18-a6ef-4674-aea6-1b33ed3c2224\") " pod="openstack/tempest-tests-tempest-s00-multi-thread-testing"
Dec 03 07:23:58 crc kubenswrapper[4475]: I1203 07:23:58.525969 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"tempest-tests-tempest-s00-multi-thread-testing\" (UID: \"6ebcee18-a6ef-4674-aea6-1b33ed3c2224\") " pod="openstack/tempest-tests-tempest-s00-multi-thread-testing"
Dec 03 07:23:58 crc kubenswrapper[4475]: I1203 07:23:58.525990 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/6ebcee18-a6ef-4674-aea6-1b33ed3c2224-ssh-key\") pod \"tempest-tests-tempest-s00-multi-thread-testing\" (UID: \"6ebcee18-a6ef-4674-aea6-1b33ed3c2224\") " pod="openstack/tempest-tests-tempest-s00-multi-thread-testing"
Dec 03 07:23:58 crc kubenswrapper[4475]: I1203 07:23:58.627907 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"test-operator-ephemeral-temporary\" (UniqueName: \"kubernetes.io/empty-dir/6ebcee18-a6ef-4674-aea6-1b33ed3c2224-test-operator-ephemeral-temporary\") pod \"tempest-tests-tempest-s00-multi-thread-testing\" (UID: \"6ebcee18-a6ef-4674-aea6-1b33ed3c2224\") " pod="openstack/tempest-tests-tempest-s00-multi-thread-testing"
Dec 03 07:23:58 crc kubenswrapper[4475]: I1203 07:23:58.628071 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"test-operator-ephemeral-workdir\" (UniqueName: \"kubernetes.io/empty-dir/6ebcee18-a6ef-4674-aea6-1b33ed3c2224-test-operator-ephemeral-workdir\") pod \"tempest-tests-tempest-s00-multi-thread-testing\" (UID: \"6ebcee18-a6ef-4674-aea6-1b33ed3c2224\") " pod="openstack/tempest-tests-tempest-s00-multi-thread-testing"
Dec 03 07:23:58 crc kubenswrapper[4475]: I1203 07:23:58.628175 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/6ebcee18-a6ef-4674-aea6-1b33ed3c2224-openstack-config-secret\") pod \"tempest-tests-tempest-s00-multi-thread-testing\" (UID: \"6ebcee18-a6ef-4674-aea6-1b33ed3c2224\") " pod="openstack/tempest-tests-tempest-s00-multi-thread-testing"
Dec 03 07:23:58 crc kubenswrapper[4475]: I1203 07:23:58.628277 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/6ebcee18-a6ef-4674-aea6-1b33ed3c2224-config-data\") pod \"tempest-tests-tempest-s00-multi-thread-testing\" (UID: \"6ebcee18-a6ef-4674-aea6-1b33ed3c2224\") " pod="openstack/tempest-tests-tempest-s00-multi-thread-testing"
Dec 03 07:23:58 crc kubenswrapper[4475]: I1203 07:23:58.628343 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"test-operator-ephemeral-temporary\" (UniqueName: \"kubernetes.io/empty-dir/6ebcee18-a6ef-4674-aea6-1b33ed3c2224-test-operator-ephemeral-temporary\") pod \"tempest-tests-tempest-s00-multi-thread-testing\" (UID: \"6ebcee18-a6ef-4674-aea6-1b33ed3c2224\") " pod="openstack/tempest-tests-tempest-s00-multi-thread-testing"
Dec 03 07:23:58 crc kubenswrapper[4475]: I1203 07:23:58.628351 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-thdrl\" (UniqueName: \"kubernetes.io/projected/6ebcee18-a6ef-4674-aea6-1b33ed3c2224-kube-api-access-thdrl\") pod \"tempest-tests-tempest-s00-multi-thread-testing\" (UID: \"6ebcee18-a6ef-4674-aea6-1b33ed3c2224\") " pod="openstack/tempest-tests-tempest-s00-multi-thread-testing"
Dec 03 07:23:58 crc kubenswrapper[4475]: I1203 07:23:58.628498 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"tempest-tests-tempest-s00-multi-thread-testing\" (UID: \"6ebcee18-a6ef-4674-aea6-1b33ed3c2224\") " pod="openstack/tempest-tests-tempest-s00-multi-thread-testing"
Dec 03 07:23:58 crc kubenswrapper[4475]: I1203 07:23:58.628513 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"test-operator-ephemeral-workdir\" (UniqueName: \"kubernetes.io/empty-dir/6ebcee18-a6ef-4674-aea6-1b33ed3c2224-test-operator-ephemeral-workdir\") pod \"tempest-tests-tempest-s00-multi-thread-testing\" (UID: \"6ebcee18-a6ef-4674-aea6-1b33ed3c2224\") " pod="openstack/tempest-tests-tempest-s00-multi-thread-testing"
Dec 03 07:23:58 crc kubenswrapper[4475]: I1203 07:23:58.628539 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/6ebcee18-a6ef-4674-aea6-1b33ed3c2224-ssh-key\") pod \"tempest-tests-tempest-s00-multi-thread-testing\" (UID: \"6ebcee18-a6ef-4674-aea6-1b33ed3c2224\") " pod="openstack/tempest-tests-tempest-s00-multi-thread-testing"
Dec 03 07:23:58 crc kubenswrapper[4475]: I1203 07:23:58.628678 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/6ebcee18-a6ef-4674-aea6-1b33ed3c2224-openstack-config\") pod \"tempest-tests-tempest-s00-multi-thread-testing\" (UID: \"6ebcee18-a6ef-4674-aea6-1b33ed3c2224\") " pod="openstack/tempest-tests-tempest-s00-multi-thread-testing"
Dec 03 07:23:58 crc kubenswrapper[4475]: I1203 07:23:58.628712 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/secret/6ebcee18-a6ef-4674-aea6-1b33ed3c2224-ca-certs\") pod \"tempest-tests-tempest-s00-multi-thread-testing\" (UID: \"6ebcee18-a6ef-4674-aea6-1b33ed3c2224\") " pod="openstack/tempest-tests-tempest-s00-multi-thread-testing"
Dec 03 07:23:58 crc kubenswrapper[4475]: I1203 07:23:58.628928 4475 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"tempest-tests-tempest-s00-multi-thread-testing\" (UID: \"6ebcee18-a6ef-4674-aea6-1b33ed3c2224\") device mount path \"/mnt/openstack/pv10\"" pod="openstack/tempest-tests-tempest-s00-multi-thread-testing"
Dec 03 07:23:58 crc kubenswrapper[4475]: I1203 07:23:58.629284 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/6ebcee18-a6ef-4674-aea6-1b33ed3c2224-config-data\") pod \"tempest-tests-tempest-s00-multi-thread-testing\" (UID: \"6ebcee18-a6ef-4674-aea6-1b33ed3c2224\") " pod="openstack/tempest-tests-tempest-s00-multi-thread-testing"
Dec 03 07:23:58 crc kubenswrapper[4475]: I1203 07:23:58.629393 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/6ebcee18-a6ef-4674-aea6-1b33ed3c2224-openstack-config\") pod \"tempest-tests-tempest-s00-multi-thread-testing\" (UID: \"6ebcee18-a6ef-4674-aea6-1b33ed3c2224\") " pod="openstack/tempest-tests-tempest-s00-multi-thread-testing"
Dec 03 07:23:58 crc kubenswrapper[4475]: I1203 07:23:58.632392 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ca-certs\" (UniqueName: \"kubernetes.io/secret/6ebcee18-a6ef-4674-aea6-1b33ed3c2224-ca-certs\") pod \"tempest-tests-tempest-s00-multi-thread-testing\" (UID: \"6ebcee18-a6ef-4674-aea6-1b33ed3c2224\") " pod="openstack/tempest-tests-tempest-s00-multi-thread-testing"
Dec 03 07:23:58 crc kubenswrapper[4475]: I1203 07:23:58.634527 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/6ebcee18-a6ef-4674-aea6-1b33ed3c2224-ssh-key\") pod \"tempest-tests-tempest-s00-multi-thread-testing\" (UID: \"6ebcee18-a6ef-4674-aea6-1b33ed3c2224\") " pod="openstack/tempest-tests-tempest-s00-multi-thread-testing"
Dec 03 07:23:58 crc kubenswrapper[4475]: I1203 07:23:58.634671 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/6ebcee18-a6ef-4674-aea6-1b33ed3c2224-openstack-config-secret\") pod \"tempest-tests-tempest-s00-multi-thread-testing\" (UID: \"6ebcee18-a6ef-4674-aea6-1b33ed3c2224\") " pod="openstack/tempest-tests-tempest-s00-multi-thread-testing"
Dec 03 07:23:58 crc kubenswrapper[4475]: I1203 07:23:58.643758 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-thdrl\" (UniqueName: \"kubernetes.io/projected/6ebcee18-a6ef-4674-aea6-1b33ed3c2224-kube-api-access-thdrl\") pod \"tempest-tests-tempest-s00-multi-thread-testing\" (UID: \"6ebcee18-a6ef-4674-aea6-1b33ed3c2224\") " pod="openstack/tempest-tests-tempest-s00-multi-thread-testing"
Dec 03 07:23:58 crc kubenswrapper[4475]: I1203 07:23:58.652795 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"tempest-tests-tempest-s00-multi-thread-testing\" (UID: \"6ebcee18-a6ef-4674-aea6-1b33ed3c2224\") " pod="openstack/tempest-tests-tempest-s00-multi-thread-testing"
Dec 03 07:23:58 crc kubenswrapper[4475]: I1203 07:23:58.731065 4475 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/tempest-tests-tempest-s00-multi-thread-testing"
Dec 03 07:23:59 crc kubenswrapper[4475]: I1203 07:23:59.160801 4475 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/tempest-tests-tempest-s00-multi-thread-testing"]
Dec 03 07:23:59 crc kubenswrapper[4475]: I1203 07:23:59.672267 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/tempest-tests-tempest-s00-multi-thread-testing" event={"ID":"6ebcee18-a6ef-4674-aea6-1b33ed3c2224","Type":"ContainerStarted","Data":"27bd8d4f9faeddab077847479debd5ad0b08e85acc45661623c3d44d1f7be12f"}
Dec 03 07:24:28 crc kubenswrapper[4475]: I1203 07:24:28.932893 4475 patch_prober.go:28] interesting pod/machine-config-daemon-tjbzg container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Dec 03 07:24:28 crc kubenswrapper[4475]: I1203 07:24:28.933203 4475 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-tjbzg" podUID="91aee7be-4a52-4598-803f-2deebe0674de" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Dec 03 07:24:38 crc kubenswrapper[4475]: E1203 07:24:38.804750 4475 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.rdoproject.org/podified-antelope-centos9/openstack-tempest-all:65066e8ca260a75886ae57f157049605"
Dec 03 07:24:38 crc kubenswrapper[4475]: E1203 07:24:38.806151 4475 kuberuntime_image.go:55] "Failed to pull image" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.rdoproject.org/podified-antelope-centos9/openstack-tempest-all:65066e8ca260a75886ae57f157049605"
Dec 03 07:24:38 crc kubenswrapper[4475]: E1203
07:24:38.808738 4475 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:tempest-tests-tempest-tests-runner,Image:quay.rdoproject.org/podified-antelope-centos9/openstack-tempest-all:65066e8ca260a75886ae57f157049605,Command:[],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config-data,ReadOnly:false,MountPath:/etc/test_operator,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:test-operator-ephemeral-workdir,ReadOnly:false,MountPath:/var/lib/tempest,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:test-operator-ephemeral-temporary,ReadOnly:false,MountPath:/tmp,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:test-operator-logs,ReadOnly:false,MountPath:/var/lib/tempest/external_files,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:openstack-config,ReadOnly:true,MountPath:/etc/openstack/clouds.yaml,SubPath:clouds.yaml,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:openstack-config,ReadOnly:true,MountPath:/var/lib/tempest/.config/openstack/clouds.yaml,SubPath:clouds.yaml,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:openstack-config-secret,ReadOnly:false,MountPath:/etc/openstack/secure.yaml,SubPath:secure.yaml,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:ca-certs,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:ssh-key,ReadOnly:false,MountPath:/var/lib/tempest/id_ecdsa,SubPath:ssh_key,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-thdrl,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/service
account,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*42480,RunAsNonRoot:*false,ReadOnlyRootFilesystem:*false,AllowPrivilegeEscalation:*true,RunAsGroup:*42480,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{EnvFromSource{Prefix:,ConfigMapRef:&ConfigMapEnvSource{LocalObjectReference:LocalObjectReference{Name:tempest-tests-tempest-custom-data-s0,},Optional:nil,},SecretRef:nil,},EnvFromSource{Prefix:,ConfigMapRef:&ConfigMapEnvSource{LocalObjectReference:LocalObjectReference{Name:tempest-tests-tempest-env-vars-s0,},Optional:nil,},SecretRef:nil,},},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod tempest-tests-tempest-s00-multi-thread-testing_openstack(6ebcee18-a6ef-4674-aea6-1b33ed3c2224): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Dec 03 07:24:38 crc kubenswrapper[4475]: E1203 07:24:38.809987 4475 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"tempest-tests-tempest-tests-runner\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/tempest-tests-tempest-s00-multi-thread-testing" podUID="6ebcee18-a6ef-4674-aea6-1b33ed3c2224" Dec 03 07:24:38 crc kubenswrapper[4475]: E1203 07:24:38.932891 4475 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"tempest-tests-tempest-tests-runner\" with ImagePullBackOff: \"Back-off pulling image 
\\\"quay.rdoproject.org/podified-antelope-centos9/openstack-tempest-all:65066e8ca260a75886ae57f157049605\\\"\"" pod="openstack/tempest-tests-tempest-s00-multi-thread-testing" podUID="6ebcee18-a6ef-4674-aea6-1b33ed3c2224" Dec 03 07:24:54 crc kubenswrapper[4475]: I1203 07:24:54.151133 4475 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"tempest-tests-tempest-env-vars-s0" Dec 03 07:24:55 crc kubenswrapper[4475]: I1203 07:24:55.030978 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/tempest-tests-tempest-s00-multi-thread-testing" event={"ID":"6ebcee18-a6ef-4674-aea6-1b33ed3c2224","Type":"ContainerStarted","Data":"07fd218421b2f2f7ba0f15a17f31ca51d8b329ef859497e8f152006feba299e2"} Dec 03 07:24:55 crc kubenswrapper[4475]: I1203 07:24:55.044664 4475 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/tempest-tests-tempest-s00-multi-thread-testing" podStartSLOduration=3.061109818 podStartE2EDuration="58.044649798s" podCreationTimestamp="2025-12-03 07:23:57 +0000 UTC" firstStartedPulling="2025-12-03 07:23:59.165751937 +0000 UTC m=+2323.970650271" lastFinishedPulling="2025-12-03 07:24:54.149291917 +0000 UTC m=+2378.954190251" observedRunningTime="2025-12-03 07:24:55.041928151 +0000 UTC m=+2379.846826485" watchObservedRunningTime="2025-12-03 07:24:55.044649798 +0000 UTC m=+2379.849548132" Dec 03 07:24:58 crc kubenswrapper[4475]: I1203 07:24:58.933170 4475 patch_prober.go:28] interesting pod/machine-config-daemon-tjbzg container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 03 07:24:58 crc kubenswrapper[4475]: I1203 07:24:58.934103 4475 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-tjbzg" podUID="91aee7be-4a52-4598-803f-2deebe0674de" 
containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 03 07:25:28 crc kubenswrapper[4475]: I1203 07:25:28.933523 4475 patch_prober.go:28] interesting pod/machine-config-daemon-tjbzg container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 03 07:25:28 crc kubenswrapper[4475]: I1203 07:25:28.933900 4475 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-tjbzg" podUID="91aee7be-4a52-4598-803f-2deebe0674de" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 03 07:25:28 crc kubenswrapper[4475]: I1203 07:25:28.933941 4475 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-tjbzg" Dec 03 07:25:28 crc kubenswrapper[4475]: I1203 07:25:28.934474 4475 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"5cadad861f9f456c6d89da22b0f73a55e242fce31e15c0255ce2371d4734ca7f"} pod="openshift-machine-config-operator/machine-config-daemon-tjbzg" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Dec 03 07:25:28 crc kubenswrapper[4475]: I1203 07:25:28.934520 4475 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-tjbzg" podUID="91aee7be-4a52-4598-803f-2deebe0674de" containerName="machine-config-daemon" containerID="cri-o://5cadad861f9f456c6d89da22b0f73a55e242fce31e15c0255ce2371d4734ca7f" gracePeriod=600 Dec 03 07:25:29 crc kubenswrapper[4475]: E1203 
07:25:29.059776 4475 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-tjbzg_openshift-machine-config-operator(91aee7be-4a52-4598-803f-2deebe0674de)\"" pod="openshift-machine-config-operator/machine-config-daemon-tjbzg" podUID="91aee7be-4a52-4598-803f-2deebe0674de" Dec 03 07:25:29 crc kubenswrapper[4475]: I1203 07:25:29.244860 4475 generic.go:334] "Generic (PLEG): container finished" podID="91aee7be-4a52-4598-803f-2deebe0674de" containerID="5cadad861f9f456c6d89da22b0f73a55e242fce31e15c0255ce2371d4734ca7f" exitCode=0 Dec 03 07:25:29 crc kubenswrapper[4475]: I1203 07:25:29.244902 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-tjbzg" event={"ID":"91aee7be-4a52-4598-803f-2deebe0674de","Type":"ContainerDied","Data":"5cadad861f9f456c6d89da22b0f73a55e242fce31e15c0255ce2371d4734ca7f"} Dec 03 07:25:29 crc kubenswrapper[4475]: I1203 07:25:29.244933 4475 scope.go:117] "RemoveContainer" containerID="25465579b0091f19d7fac6131d69c8b26cb89f7f3ec3a2030f1c445fd208db5e" Dec 03 07:25:29 crc kubenswrapper[4475]: I1203 07:25:29.245509 4475 scope.go:117] "RemoveContainer" containerID="5cadad861f9f456c6d89da22b0f73a55e242fce31e15c0255ce2371d4734ca7f" Dec 03 07:25:29 crc kubenswrapper[4475]: E1203 07:25:29.245796 4475 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-tjbzg_openshift-machine-config-operator(91aee7be-4a52-4598-803f-2deebe0674de)\"" pod="openshift-machine-config-operator/machine-config-daemon-tjbzg" podUID="91aee7be-4a52-4598-803f-2deebe0674de" Dec 03 07:25:43 crc kubenswrapper[4475]: I1203 07:25:43.492068 4475 scope.go:117] "RemoveContainer" 
containerID="5cadad861f9f456c6d89da22b0f73a55e242fce31e15c0255ce2371d4734ca7f" Dec 03 07:25:43 crc kubenswrapper[4475]: E1203 07:25:43.492885 4475 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-tjbzg_openshift-machine-config-operator(91aee7be-4a52-4598-803f-2deebe0674de)\"" pod="openshift-machine-config-operator/machine-config-daemon-tjbzg" podUID="91aee7be-4a52-4598-803f-2deebe0674de" Dec 03 07:25:58 crc kubenswrapper[4475]: I1203 07:25:58.491657 4475 scope.go:117] "RemoveContainer" containerID="5cadad861f9f456c6d89da22b0f73a55e242fce31e15c0255ce2371d4734ca7f" Dec 03 07:25:58 crc kubenswrapper[4475]: E1203 07:25:58.492168 4475 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-tjbzg_openshift-machine-config-operator(91aee7be-4a52-4598-803f-2deebe0674de)\"" pod="openshift-machine-config-operator/machine-config-daemon-tjbzg" podUID="91aee7be-4a52-4598-803f-2deebe0674de" Dec 03 07:26:11 crc kubenswrapper[4475]: I1203 07:26:11.491527 4475 scope.go:117] "RemoveContainer" containerID="5cadad861f9f456c6d89da22b0f73a55e242fce31e15c0255ce2371d4734ca7f" Dec 03 07:26:11 crc kubenswrapper[4475]: E1203 07:26:11.492079 4475 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-tjbzg_openshift-machine-config-operator(91aee7be-4a52-4598-803f-2deebe0674de)\"" pod="openshift-machine-config-operator/machine-config-daemon-tjbzg" podUID="91aee7be-4a52-4598-803f-2deebe0674de" Dec 03 07:26:25 crc kubenswrapper[4475]: I1203 07:26:25.496347 4475 scope.go:117] 
"RemoveContainer" containerID="5cadad861f9f456c6d89da22b0f73a55e242fce31e15c0255ce2371d4734ca7f" Dec 03 07:26:25 crc kubenswrapper[4475]: E1203 07:26:25.496919 4475 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-tjbzg_openshift-machine-config-operator(91aee7be-4a52-4598-803f-2deebe0674de)\"" pod="openshift-machine-config-operator/machine-config-daemon-tjbzg" podUID="91aee7be-4a52-4598-803f-2deebe0674de" Dec 03 07:26:39 crc kubenswrapper[4475]: I1203 07:26:39.491382 4475 scope.go:117] "RemoveContainer" containerID="5cadad861f9f456c6d89da22b0f73a55e242fce31e15c0255ce2371d4734ca7f" Dec 03 07:26:39 crc kubenswrapper[4475]: E1203 07:26:39.491963 4475 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-tjbzg_openshift-machine-config-operator(91aee7be-4a52-4598-803f-2deebe0674de)\"" pod="openshift-machine-config-operator/machine-config-daemon-tjbzg" podUID="91aee7be-4a52-4598-803f-2deebe0674de" Dec 03 07:26:54 crc kubenswrapper[4475]: I1203 07:26:54.493329 4475 scope.go:117] "RemoveContainer" containerID="5cadad861f9f456c6d89da22b0f73a55e242fce31e15c0255ce2371d4734ca7f" Dec 03 07:26:54 crc kubenswrapper[4475]: E1203 07:26:54.494135 4475 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-tjbzg_openshift-machine-config-operator(91aee7be-4a52-4598-803f-2deebe0674de)\"" pod="openshift-machine-config-operator/machine-config-daemon-tjbzg" podUID="91aee7be-4a52-4598-803f-2deebe0674de" Dec 03 07:27:07 crc kubenswrapper[4475]: I1203 07:27:07.491588 
4475 scope.go:117] "RemoveContainer" containerID="5cadad861f9f456c6d89da22b0f73a55e242fce31e15c0255ce2371d4734ca7f" Dec 03 07:27:07 crc kubenswrapper[4475]: E1203 07:27:07.492171 4475 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-tjbzg_openshift-machine-config-operator(91aee7be-4a52-4598-803f-2deebe0674de)\"" pod="openshift-machine-config-operator/machine-config-daemon-tjbzg" podUID="91aee7be-4a52-4598-803f-2deebe0674de" Dec 03 07:27:20 crc kubenswrapper[4475]: I1203 07:27:20.491527 4475 scope.go:117] "RemoveContainer" containerID="5cadad861f9f456c6d89da22b0f73a55e242fce31e15c0255ce2371d4734ca7f" Dec 03 07:27:20 crc kubenswrapper[4475]: E1203 07:27:20.492094 4475 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-tjbzg_openshift-machine-config-operator(91aee7be-4a52-4598-803f-2deebe0674de)\"" pod="openshift-machine-config-operator/machine-config-daemon-tjbzg" podUID="91aee7be-4a52-4598-803f-2deebe0674de" Dec 03 07:27:35 crc kubenswrapper[4475]: I1203 07:27:35.496065 4475 scope.go:117] "RemoveContainer" containerID="5cadad861f9f456c6d89da22b0f73a55e242fce31e15c0255ce2371d4734ca7f" Dec 03 07:27:35 crc kubenswrapper[4475]: E1203 07:27:35.496733 4475 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-tjbzg_openshift-machine-config-operator(91aee7be-4a52-4598-803f-2deebe0674de)\"" pod="openshift-machine-config-operator/machine-config-daemon-tjbzg" podUID="91aee7be-4a52-4598-803f-2deebe0674de" Dec 03 07:27:49 crc kubenswrapper[4475]: I1203 
07:27:49.490878 4475 scope.go:117] "RemoveContainer" containerID="5cadad861f9f456c6d89da22b0f73a55e242fce31e15c0255ce2371d4734ca7f" Dec 03 07:27:49 crc kubenswrapper[4475]: E1203 07:27:49.491404 4475 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-tjbzg_openshift-machine-config-operator(91aee7be-4a52-4598-803f-2deebe0674de)\"" pod="openshift-machine-config-operator/machine-config-daemon-tjbzg" podUID="91aee7be-4a52-4598-803f-2deebe0674de" Dec 03 07:28:03 crc kubenswrapper[4475]: I1203 07:28:03.491293 4475 scope.go:117] "RemoveContainer" containerID="5cadad861f9f456c6d89da22b0f73a55e242fce31e15c0255ce2371d4734ca7f" Dec 03 07:28:03 crc kubenswrapper[4475]: E1203 07:28:03.491892 4475 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-tjbzg_openshift-machine-config-operator(91aee7be-4a52-4598-803f-2deebe0674de)\"" pod="openshift-machine-config-operator/machine-config-daemon-tjbzg" podUID="91aee7be-4a52-4598-803f-2deebe0674de" Dec 03 07:28:17 crc kubenswrapper[4475]: I1203 07:28:17.491665 4475 scope.go:117] "RemoveContainer" containerID="5cadad861f9f456c6d89da22b0f73a55e242fce31e15c0255ce2371d4734ca7f" Dec 03 07:28:17 crc kubenswrapper[4475]: E1203 07:28:17.492254 4475 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-tjbzg_openshift-machine-config-operator(91aee7be-4a52-4598-803f-2deebe0674de)\"" pod="openshift-machine-config-operator/machine-config-daemon-tjbzg" podUID="91aee7be-4a52-4598-803f-2deebe0674de" Dec 03 07:28:28 crc 
kubenswrapper[4475]: I1203 07:28:28.490759 4475 scope.go:117] "RemoveContainer" containerID="5cadad861f9f456c6d89da22b0f73a55e242fce31e15c0255ce2371d4734ca7f" Dec 03 07:28:28 crc kubenswrapper[4475]: E1203 07:28:28.491392 4475 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-tjbzg_openshift-machine-config-operator(91aee7be-4a52-4598-803f-2deebe0674de)\"" pod="openshift-machine-config-operator/machine-config-daemon-tjbzg" podUID="91aee7be-4a52-4598-803f-2deebe0674de" Dec 03 07:28:39 crc kubenswrapper[4475]: I1203 07:28:39.491105 4475 scope.go:117] "RemoveContainer" containerID="5cadad861f9f456c6d89da22b0f73a55e242fce31e15c0255ce2371d4734ca7f" Dec 03 07:28:39 crc kubenswrapper[4475]: E1203 07:28:39.492435 4475 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-tjbzg_openshift-machine-config-operator(91aee7be-4a52-4598-803f-2deebe0674de)\"" pod="openshift-machine-config-operator/machine-config-daemon-tjbzg" podUID="91aee7be-4a52-4598-803f-2deebe0674de" Dec 03 07:28:54 crc kubenswrapper[4475]: I1203 07:28:54.491445 4475 scope.go:117] "RemoveContainer" containerID="5cadad861f9f456c6d89da22b0f73a55e242fce31e15c0255ce2371d4734ca7f" Dec 03 07:28:54 crc kubenswrapper[4475]: E1203 07:28:54.492784 4475 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-tjbzg_openshift-machine-config-operator(91aee7be-4a52-4598-803f-2deebe0674de)\"" pod="openshift-machine-config-operator/machine-config-daemon-tjbzg" podUID="91aee7be-4a52-4598-803f-2deebe0674de" Dec 
03 07:29:07 crc kubenswrapper[4475]: I1203 07:29:07.491212 4475 scope.go:117] "RemoveContainer" containerID="5cadad861f9f456c6d89da22b0f73a55e242fce31e15c0255ce2371d4734ca7f" Dec 03 07:29:07 crc kubenswrapper[4475]: E1203 07:29:07.492363 4475 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-tjbzg_openshift-machine-config-operator(91aee7be-4a52-4598-803f-2deebe0674de)\"" pod="openshift-machine-config-operator/machine-config-daemon-tjbzg" podUID="91aee7be-4a52-4598-803f-2deebe0674de" Dec 03 07:29:22 crc kubenswrapper[4475]: I1203 07:29:22.491652 4475 scope.go:117] "RemoveContainer" containerID="5cadad861f9f456c6d89da22b0f73a55e242fce31e15c0255ce2371d4734ca7f" Dec 03 07:29:22 crc kubenswrapper[4475]: E1203 07:29:22.492356 4475 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-tjbzg_openshift-machine-config-operator(91aee7be-4a52-4598-803f-2deebe0674de)\"" pod="openshift-machine-config-operator/machine-config-daemon-tjbzg" podUID="91aee7be-4a52-4598-803f-2deebe0674de" Dec 03 07:29:37 crc kubenswrapper[4475]: I1203 07:29:37.491259 4475 scope.go:117] "RemoveContainer" containerID="5cadad861f9f456c6d89da22b0f73a55e242fce31e15c0255ce2371d4734ca7f" Dec 03 07:29:37 crc kubenswrapper[4475]: E1203 07:29:37.491843 4475 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-tjbzg_openshift-machine-config-operator(91aee7be-4a52-4598-803f-2deebe0674de)\"" pod="openshift-machine-config-operator/machine-config-daemon-tjbzg" 
podUID="91aee7be-4a52-4598-803f-2deebe0674de" Dec 03 07:29:49 crc kubenswrapper[4475]: I1203 07:29:49.491729 4475 scope.go:117] "RemoveContainer" containerID="5cadad861f9f456c6d89da22b0f73a55e242fce31e15c0255ce2371d4734ca7f" Dec 03 07:29:49 crc kubenswrapper[4475]: E1203 07:29:49.492282 4475 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-tjbzg_openshift-machine-config-operator(91aee7be-4a52-4598-803f-2deebe0674de)\"" pod="openshift-machine-config-operator/machine-config-daemon-tjbzg" podUID="91aee7be-4a52-4598-803f-2deebe0674de" Dec 03 07:30:00 crc kubenswrapper[4475]: I1203 07:30:00.283914 4475 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29412450-bf2vk"] Dec 03 07:30:00 crc kubenswrapper[4475]: I1203 07:30:00.292118 4475 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29412450-bf2vk" Dec 03 07:30:00 crc kubenswrapper[4475]: I1203 07:30:00.299200 4475 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Dec 03 07:30:00 crc kubenswrapper[4475]: I1203 07:30:00.346548 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/3095202a-352a-453a-b3e1-b6ecc8c3d661-config-volume\") pod \"collect-profiles-29412450-bf2vk\" (UID: \"3095202a-352a-453a-b3e1-b6ecc8c3d661\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29412450-bf2vk" Dec 03 07:30:00 crc kubenswrapper[4475]: I1203 07:30:00.346729 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2gplm\" (UniqueName: \"kubernetes.io/projected/3095202a-352a-453a-b3e1-b6ecc8c3d661-kube-api-access-2gplm\") pod \"collect-profiles-29412450-bf2vk\" (UID: \"3095202a-352a-453a-b3e1-b6ecc8c3d661\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29412450-bf2vk" Dec 03 07:30:00 crc kubenswrapper[4475]: I1203 07:30:00.346859 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/3095202a-352a-453a-b3e1-b6ecc8c3d661-secret-volume\") pod \"collect-profiles-29412450-bf2vk\" (UID: \"3095202a-352a-453a-b3e1-b6ecc8c3d661\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29412450-bf2vk" Dec 03 07:30:00 crc kubenswrapper[4475]: I1203 07:30:00.349907 4475 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Dec 03 07:30:00 crc kubenswrapper[4475]: I1203 07:30:00.389583 4475 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openshift-operator-lifecycle-manager/collect-profiles-29412450-bf2vk"] Dec 03 07:30:00 crc kubenswrapper[4475]: I1203 07:30:00.448390 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/3095202a-352a-453a-b3e1-b6ecc8c3d661-config-volume\") pod \"collect-profiles-29412450-bf2vk\" (UID: \"3095202a-352a-453a-b3e1-b6ecc8c3d661\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29412450-bf2vk" Dec 03 07:30:00 crc kubenswrapper[4475]: I1203 07:30:00.448444 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2gplm\" (UniqueName: \"kubernetes.io/projected/3095202a-352a-453a-b3e1-b6ecc8c3d661-kube-api-access-2gplm\") pod \"collect-profiles-29412450-bf2vk\" (UID: \"3095202a-352a-453a-b3e1-b6ecc8c3d661\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29412450-bf2vk" Dec 03 07:30:00 crc kubenswrapper[4475]: I1203 07:30:00.448551 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/3095202a-352a-453a-b3e1-b6ecc8c3d661-secret-volume\") pod \"collect-profiles-29412450-bf2vk\" (UID: \"3095202a-352a-453a-b3e1-b6ecc8c3d661\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29412450-bf2vk" Dec 03 07:30:00 crc kubenswrapper[4475]: I1203 07:30:00.449922 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/3095202a-352a-453a-b3e1-b6ecc8c3d661-config-volume\") pod \"collect-profiles-29412450-bf2vk\" (UID: \"3095202a-352a-453a-b3e1-b6ecc8c3d661\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29412450-bf2vk" Dec 03 07:30:00 crc kubenswrapper[4475]: I1203 07:30:00.466054 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: 
\"kubernetes.io/secret/3095202a-352a-453a-b3e1-b6ecc8c3d661-secret-volume\") pod \"collect-profiles-29412450-bf2vk\" (UID: \"3095202a-352a-453a-b3e1-b6ecc8c3d661\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29412450-bf2vk" Dec 03 07:30:00 crc kubenswrapper[4475]: I1203 07:30:00.466940 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2gplm\" (UniqueName: \"kubernetes.io/projected/3095202a-352a-453a-b3e1-b6ecc8c3d661-kube-api-access-2gplm\") pod \"collect-profiles-29412450-bf2vk\" (UID: \"3095202a-352a-453a-b3e1-b6ecc8c3d661\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29412450-bf2vk" Dec 03 07:30:00 crc kubenswrapper[4475]: I1203 07:30:00.609838 4475 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29412450-bf2vk" Dec 03 07:30:01 crc kubenswrapper[4475]: I1203 07:30:01.282032 4475 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29412450-bf2vk"] Dec 03 07:30:02 crc kubenswrapper[4475]: I1203 07:30:02.022097 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29412450-bf2vk" event={"ID":"3095202a-352a-453a-b3e1-b6ecc8c3d661","Type":"ContainerDied","Data":"d9349ff08664ec4b7231ae1904ba88086190af5ba690c9565ea3aaba849993b8"} Dec 03 07:30:02 crc kubenswrapper[4475]: I1203 07:30:02.023562 4475 generic.go:334] "Generic (PLEG): container finished" podID="3095202a-352a-453a-b3e1-b6ecc8c3d661" containerID="d9349ff08664ec4b7231ae1904ba88086190af5ba690c9565ea3aaba849993b8" exitCode=0 Dec 03 07:30:02 crc kubenswrapper[4475]: I1203 07:30:02.023689 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29412450-bf2vk" 
event={"ID":"3095202a-352a-453a-b3e1-b6ecc8c3d661","Type":"ContainerStarted","Data":"eec7f486c39066d21d5e9545ee99ab851f2590e770ee1e1753c96297392530ef"} Dec 03 07:30:02 crc kubenswrapper[4475]: I1203 07:30:02.491384 4475 scope.go:117] "RemoveContainer" containerID="5cadad861f9f456c6d89da22b0f73a55e242fce31e15c0255ce2371d4734ca7f" Dec 03 07:30:02 crc kubenswrapper[4475]: E1203 07:30:02.491627 4475 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-tjbzg_openshift-machine-config-operator(91aee7be-4a52-4598-803f-2deebe0674de)\"" pod="openshift-machine-config-operator/machine-config-daemon-tjbzg" podUID="91aee7be-4a52-4598-803f-2deebe0674de" Dec 03 07:30:03 crc kubenswrapper[4475]: I1203 07:30:03.461008 4475 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29412450-bf2vk" Dec 03 07:30:03 crc kubenswrapper[4475]: I1203 07:30:03.502644 4475 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2gplm\" (UniqueName: \"kubernetes.io/projected/3095202a-352a-453a-b3e1-b6ecc8c3d661-kube-api-access-2gplm\") pod \"3095202a-352a-453a-b3e1-b6ecc8c3d661\" (UID: \"3095202a-352a-453a-b3e1-b6ecc8c3d661\") " Dec 03 07:30:03 crc kubenswrapper[4475]: I1203 07:30:03.502851 4475 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/3095202a-352a-453a-b3e1-b6ecc8c3d661-config-volume\") pod \"3095202a-352a-453a-b3e1-b6ecc8c3d661\" (UID: \"3095202a-352a-453a-b3e1-b6ecc8c3d661\") " Dec 03 07:30:03 crc kubenswrapper[4475]: I1203 07:30:03.502969 4475 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: 
\"kubernetes.io/secret/3095202a-352a-453a-b3e1-b6ecc8c3d661-secret-volume\") pod \"3095202a-352a-453a-b3e1-b6ecc8c3d661\" (UID: \"3095202a-352a-453a-b3e1-b6ecc8c3d661\") " Dec 03 07:30:03 crc kubenswrapper[4475]: I1203 07:30:03.504012 4475 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3095202a-352a-453a-b3e1-b6ecc8c3d661-config-volume" (OuterVolumeSpecName: "config-volume") pod "3095202a-352a-453a-b3e1-b6ecc8c3d661" (UID: "3095202a-352a-453a-b3e1-b6ecc8c3d661"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 07:30:03 crc kubenswrapper[4475]: I1203 07:30:03.515690 4475 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3095202a-352a-453a-b3e1-b6ecc8c3d661-kube-api-access-2gplm" (OuterVolumeSpecName: "kube-api-access-2gplm") pod "3095202a-352a-453a-b3e1-b6ecc8c3d661" (UID: "3095202a-352a-453a-b3e1-b6ecc8c3d661"). InnerVolumeSpecName "kube-api-access-2gplm". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 07:30:03 crc kubenswrapper[4475]: I1203 07:30:03.519607 4475 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3095202a-352a-453a-b3e1-b6ecc8c3d661-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "3095202a-352a-453a-b3e1-b6ecc8c3d661" (UID: "3095202a-352a-453a-b3e1-b6ecc8c3d661"). InnerVolumeSpecName "secret-volume". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 07:30:03 crc kubenswrapper[4475]: I1203 07:30:03.606246 4475 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/3095202a-352a-453a-b3e1-b6ecc8c3d661-config-volume\") on node \"crc\" DevicePath \"\"" Dec 03 07:30:03 crc kubenswrapper[4475]: I1203 07:30:03.606273 4475 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/3095202a-352a-453a-b3e1-b6ecc8c3d661-secret-volume\") on node \"crc\" DevicePath \"\"" Dec 03 07:30:03 crc kubenswrapper[4475]: I1203 07:30:03.606284 4475 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2gplm\" (UniqueName: \"kubernetes.io/projected/3095202a-352a-453a-b3e1-b6ecc8c3d661-kube-api-access-2gplm\") on node \"crc\" DevicePath \"\"" Dec 03 07:30:04 crc kubenswrapper[4475]: I1203 07:30:04.038338 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29412450-bf2vk" event={"ID":"3095202a-352a-453a-b3e1-b6ecc8c3d661","Type":"ContainerDied","Data":"eec7f486c39066d21d5e9545ee99ab851f2590e770ee1e1753c96297392530ef"} Dec 03 07:30:04 crc kubenswrapper[4475]: I1203 07:30:04.038697 4475 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29412450-bf2vk" Dec 03 07:30:04 crc kubenswrapper[4475]: I1203 07:30:04.038922 4475 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="eec7f486c39066d21d5e9545ee99ab851f2590e770ee1e1753c96297392530ef" Dec 03 07:30:04 crc kubenswrapper[4475]: I1203 07:30:04.534704 4475 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29412405-wwr7n"] Dec 03 07:30:04 crc kubenswrapper[4475]: I1203 07:30:04.542690 4475 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29412405-wwr7n"] Dec 03 07:30:05 crc kubenswrapper[4475]: I1203 07:30:05.499085 4475 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="83c4eef5-5508-470d-8b7a-b7da9d4706d4" path="/var/lib/kubelet/pods/83c4eef5-5508-470d-8b7a-b7da9d4706d4/volumes" Dec 03 07:30:16 crc kubenswrapper[4475]: I1203 07:30:16.491704 4475 scope.go:117] "RemoveContainer" containerID="5cadad861f9f456c6d89da22b0f73a55e242fce31e15c0255ce2371d4734ca7f" Dec 03 07:30:16 crc kubenswrapper[4475]: E1203 07:30:16.492380 4475 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-tjbzg_openshift-machine-config-operator(91aee7be-4a52-4598-803f-2deebe0674de)\"" pod="openshift-machine-config-operator/machine-config-daemon-tjbzg" podUID="91aee7be-4a52-4598-803f-2deebe0674de" Dec 03 07:30:29 crc kubenswrapper[4475]: I1203 07:30:29.006290 4475 scope.go:117] "RemoveContainer" containerID="64bbe628906ffd7c485ec8cc71ede08aea8875194cb357d96894ab844be9e9f5" Dec 03 07:30:30 crc kubenswrapper[4475]: I1203 07:30:30.491132 4475 scope.go:117] "RemoveContainer" containerID="5cadad861f9f456c6d89da22b0f73a55e242fce31e15c0255ce2371d4734ca7f" Dec 03 
07:30:31 crc kubenswrapper[4475]: I1203 07:30:31.226835 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-tjbzg" event={"ID":"91aee7be-4a52-4598-803f-2deebe0674de","Type":"ContainerStarted","Data":"38011db8441fa258228ddb37b16a7964318696935f6daa20e615955ba9070bde"} Dec 03 07:31:13 crc kubenswrapper[4475]: I1203 07:31:13.755863 4475 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-tggp2"] Dec 03 07:31:13 crc kubenswrapper[4475]: E1203 07:31:13.760251 4475 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3095202a-352a-453a-b3e1-b6ecc8c3d661" containerName="collect-profiles" Dec 03 07:31:13 crc kubenswrapper[4475]: I1203 07:31:13.760280 4475 state_mem.go:107] "Deleted CPUSet assignment" podUID="3095202a-352a-453a-b3e1-b6ecc8c3d661" containerName="collect-profiles" Dec 03 07:31:13 crc kubenswrapper[4475]: I1203 07:31:13.760933 4475 memory_manager.go:354] "RemoveStaleState removing state" podUID="3095202a-352a-453a-b3e1-b6ecc8c3d661" containerName="collect-profiles" Dec 03 07:31:13 crc kubenswrapper[4475]: I1203 07:31:13.762334 4475 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-tggp2" Dec 03 07:31:13 crc kubenswrapper[4475]: I1203 07:31:13.845084 4475 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-tggp2"] Dec 03 07:31:13 crc kubenswrapper[4475]: I1203 07:31:13.846658 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f7006506-f69d-40a4-b64f-701b8c419881-catalog-content\") pod \"certified-operators-tggp2\" (UID: \"f7006506-f69d-40a4-b64f-701b8c419881\") " pod="openshift-marketplace/certified-operators-tggp2" Dec 03 07:31:13 crc kubenswrapper[4475]: I1203 07:31:13.846699 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f7006506-f69d-40a4-b64f-701b8c419881-utilities\") pod \"certified-operators-tggp2\" (UID: \"f7006506-f69d-40a4-b64f-701b8c419881\") " pod="openshift-marketplace/certified-operators-tggp2" Dec 03 07:31:13 crc kubenswrapper[4475]: I1203 07:31:13.847549 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4lrst\" (UniqueName: \"kubernetes.io/projected/f7006506-f69d-40a4-b64f-701b8c419881-kube-api-access-4lrst\") pod \"certified-operators-tggp2\" (UID: \"f7006506-f69d-40a4-b64f-701b8c419881\") " pod="openshift-marketplace/certified-operators-tggp2" Dec 03 07:31:13 crc kubenswrapper[4475]: I1203 07:31:13.949281 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f7006506-f69d-40a4-b64f-701b8c419881-catalog-content\") pod \"certified-operators-tggp2\" (UID: \"f7006506-f69d-40a4-b64f-701b8c419881\") " pod="openshift-marketplace/certified-operators-tggp2" Dec 03 07:31:13 crc kubenswrapper[4475]: I1203 07:31:13.949329 4475 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f7006506-f69d-40a4-b64f-701b8c419881-utilities\") pod \"certified-operators-tggp2\" (UID: \"f7006506-f69d-40a4-b64f-701b8c419881\") " pod="openshift-marketplace/certified-operators-tggp2" Dec 03 07:31:13 crc kubenswrapper[4475]: I1203 07:31:13.949446 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4lrst\" (UniqueName: \"kubernetes.io/projected/f7006506-f69d-40a4-b64f-701b8c419881-kube-api-access-4lrst\") pod \"certified-operators-tggp2\" (UID: \"f7006506-f69d-40a4-b64f-701b8c419881\") " pod="openshift-marketplace/certified-operators-tggp2" Dec 03 07:31:13 crc kubenswrapper[4475]: I1203 07:31:13.953061 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f7006506-f69d-40a4-b64f-701b8c419881-utilities\") pod \"certified-operators-tggp2\" (UID: \"f7006506-f69d-40a4-b64f-701b8c419881\") " pod="openshift-marketplace/certified-operators-tggp2" Dec 03 07:31:13 crc kubenswrapper[4475]: I1203 07:31:13.953622 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f7006506-f69d-40a4-b64f-701b8c419881-catalog-content\") pod \"certified-operators-tggp2\" (UID: \"f7006506-f69d-40a4-b64f-701b8c419881\") " pod="openshift-marketplace/certified-operators-tggp2" Dec 03 07:31:13 crc kubenswrapper[4475]: I1203 07:31:13.981114 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4lrst\" (UniqueName: \"kubernetes.io/projected/f7006506-f69d-40a4-b64f-701b8c419881-kube-api-access-4lrst\") pod \"certified-operators-tggp2\" (UID: \"f7006506-f69d-40a4-b64f-701b8c419881\") " pod="openshift-marketplace/certified-operators-tggp2" Dec 03 07:31:14 crc kubenswrapper[4475]: I1203 07:31:14.083748 4475 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-tggp2" Dec 03 07:31:15 crc kubenswrapper[4475]: I1203 07:31:15.062859 4475 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-tggp2"] Dec 03 07:31:15 crc kubenswrapper[4475]: I1203 07:31:15.508961 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-tggp2" event={"ID":"f7006506-f69d-40a4-b64f-701b8c419881","Type":"ContainerDied","Data":"e97af5199d72f25648957743b40ee98b62a50618fa4e2e4f43abafb85decf6f8"} Dec 03 07:31:15 crc kubenswrapper[4475]: I1203 07:31:15.514305 4475 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Dec 03 07:31:15 crc kubenswrapper[4475]: I1203 07:31:15.515206 4475 generic.go:334] "Generic (PLEG): container finished" podID="f7006506-f69d-40a4-b64f-701b8c419881" containerID="e97af5199d72f25648957743b40ee98b62a50618fa4e2e4f43abafb85decf6f8" exitCode=0 Dec 03 07:31:15 crc kubenswrapper[4475]: I1203 07:31:15.515647 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-tggp2" event={"ID":"f7006506-f69d-40a4-b64f-701b8c419881","Type":"ContainerStarted","Data":"4e8c2df5a9eb5c128dc17bcfb7b7c3dbece94f6d2438d2f003fcdcb08a00bb8f"} Dec 03 07:31:17 crc kubenswrapper[4475]: I1203 07:31:17.528906 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-tggp2" event={"ID":"f7006506-f69d-40a4-b64f-701b8c419881","Type":"ContainerStarted","Data":"5f7d7bc98b9da771a3bc3845f4347f6e6989201517253c0b1d84b23382ba672e"} Dec 03 07:31:18 crc kubenswrapper[4475]: I1203 07:31:18.536399 4475 generic.go:334] "Generic (PLEG): container finished" podID="f7006506-f69d-40a4-b64f-701b8c419881" containerID="5f7d7bc98b9da771a3bc3845f4347f6e6989201517253c0b1d84b23382ba672e" exitCode=0 Dec 03 07:31:18 crc kubenswrapper[4475]: I1203 07:31:18.536435 4475 kubelet.go:2453] "SyncLoop (PLEG): 
event for pod" pod="openshift-marketplace/certified-operators-tggp2" event={"ID":"f7006506-f69d-40a4-b64f-701b8c419881","Type":"ContainerDied","Data":"5f7d7bc98b9da771a3bc3845f4347f6e6989201517253c0b1d84b23382ba672e"} Dec 03 07:31:19 crc kubenswrapper[4475]: I1203 07:31:19.545320 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-tggp2" event={"ID":"f7006506-f69d-40a4-b64f-701b8c419881","Type":"ContainerStarted","Data":"f472346ccaa7d539e38e353b780ca326218148c68089f05d18019294744e6d1a"} Dec 03 07:31:19 crc kubenswrapper[4475]: I1203 07:31:19.562202 4475 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-tggp2" podStartSLOduration=2.872619213 podStartE2EDuration="6.561464945s" podCreationTimestamp="2025-12-03 07:31:13 +0000 UTC" firstStartedPulling="2025-12-03 07:31:15.50910541 +0000 UTC m=+2760.314003744" lastFinishedPulling="2025-12-03 07:31:19.197951142 +0000 UTC m=+2764.002849476" observedRunningTime="2025-12-03 07:31:19.556643308 +0000 UTC m=+2764.361541642" watchObservedRunningTime="2025-12-03 07:31:19.561464945 +0000 UTC m=+2764.366363278" Dec 03 07:31:24 crc kubenswrapper[4475]: I1203 07:31:24.088693 4475 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-tggp2" Dec 03 07:31:24 crc kubenswrapper[4475]: I1203 07:31:24.089071 4475 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-tggp2" Dec 03 07:31:24 crc kubenswrapper[4475]: I1203 07:31:24.155031 4475 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-tggp2" Dec 03 07:31:24 crc kubenswrapper[4475]: I1203 07:31:24.610615 4475 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-tggp2" Dec 03 07:31:24 crc kubenswrapper[4475]: I1203 07:31:24.665163 
4475 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-tggp2"] Dec 03 07:31:26 crc kubenswrapper[4475]: I1203 07:31:26.590040 4475 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-tggp2" podUID="f7006506-f69d-40a4-b64f-701b8c419881" containerName="registry-server" containerID="cri-o://f472346ccaa7d539e38e353b780ca326218148c68089f05d18019294744e6d1a" gracePeriod=2 Dec 03 07:31:27 crc kubenswrapper[4475]: I1203 07:31:27.338810 4475 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-tggp2" Dec 03 07:31:27 crc kubenswrapper[4475]: I1203 07:31:27.368021 4475 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f7006506-f69d-40a4-b64f-701b8c419881-utilities\") pod \"f7006506-f69d-40a4-b64f-701b8c419881\" (UID: \"f7006506-f69d-40a4-b64f-701b8c419881\") " Dec 03 07:31:27 crc kubenswrapper[4475]: I1203 07:31:27.368146 4475 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4lrst\" (UniqueName: \"kubernetes.io/projected/f7006506-f69d-40a4-b64f-701b8c419881-kube-api-access-4lrst\") pod \"f7006506-f69d-40a4-b64f-701b8c419881\" (UID: \"f7006506-f69d-40a4-b64f-701b8c419881\") " Dec 03 07:31:27 crc kubenswrapper[4475]: I1203 07:31:27.368233 4475 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f7006506-f69d-40a4-b64f-701b8c419881-catalog-content\") pod \"f7006506-f69d-40a4-b64f-701b8c419881\" (UID: \"f7006506-f69d-40a4-b64f-701b8c419881\") " Dec 03 07:31:27 crc kubenswrapper[4475]: I1203 07:31:27.372576 4475 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f7006506-f69d-40a4-b64f-701b8c419881-utilities" (OuterVolumeSpecName: "utilities") pod 
"f7006506-f69d-40a4-b64f-701b8c419881" (UID: "f7006506-f69d-40a4-b64f-701b8c419881"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 07:31:27 crc kubenswrapper[4475]: I1203 07:31:27.403591 4475 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f7006506-f69d-40a4-b64f-701b8c419881-kube-api-access-4lrst" (OuterVolumeSpecName: "kube-api-access-4lrst") pod "f7006506-f69d-40a4-b64f-701b8c419881" (UID: "f7006506-f69d-40a4-b64f-701b8c419881"). InnerVolumeSpecName "kube-api-access-4lrst". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 07:31:27 crc kubenswrapper[4475]: I1203 07:31:27.466070 4475 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f7006506-f69d-40a4-b64f-701b8c419881-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "f7006506-f69d-40a4-b64f-701b8c419881" (UID: "f7006506-f69d-40a4-b64f-701b8c419881"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 07:31:27 crc kubenswrapper[4475]: I1203 07:31:27.470382 4475 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f7006506-f69d-40a4-b64f-701b8c419881-utilities\") on node \"crc\" DevicePath \"\"" Dec 03 07:31:27 crc kubenswrapper[4475]: I1203 07:31:27.470412 4475 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4lrst\" (UniqueName: \"kubernetes.io/projected/f7006506-f69d-40a4-b64f-701b8c419881-kube-api-access-4lrst\") on node \"crc\" DevicePath \"\"" Dec 03 07:31:27 crc kubenswrapper[4475]: I1203 07:31:27.470424 4475 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f7006506-f69d-40a4-b64f-701b8c419881-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 03 07:31:27 crc kubenswrapper[4475]: I1203 07:31:27.595888 4475 generic.go:334] "Generic (PLEG): container finished" podID="f7006506-f69d-40a4-b64f-701b8c419881" containerID="f472346ccaa7d539e38e353b780ca326218148c68089f05d18019294744e6d1a" exitCode=0 Dec 03 07:31:27 crc kubenswrapper[4475]: I1203 07:31:27.596065 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-tggp2" event={"ID":"f7006506-f69d-40a4-b64f-701b8c419881","Type":"ContainerDied","Data":"f472346ccaa7d539e38e353b780ca326218148c68089f05d18019294744e6d1a"} Dec 03 07:31:27 crc kubenswrapper[4475]: I1203 07:31:27.596087 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-tggp2" event={"ID":"f7006506-f69d-40a4-b64f-701b8c419881","Type":"ContainerDied","Data":"4e8c2df5a9eb5c128dc17bcfb7b7c3dbece94f6d2438d2f003fcdcb08a00bb8f"} Dec 03 07:31:27 crc kubenswrapper[4475]: I1203 07:31:27.596237 4475 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-tggp2" Dec 03 07:31:27 crc kubenswrapper[4475]: I1203 07:31:27.596962 4475 scope.go:117] "RemoveContainer" containerID="f472346ccaa7d539e38e353b780ca326218148c68089f05d18019294744e6d1a" Dec 03 07:31:27 crc kubenswrapper[4475]: I1203 07:31:27.618924 4475 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-tggp2"] Dec 03 07:31:27 crc kubenswrapper[4475]: I1203 07:31:27.625879 4475 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-tggp2"] Dec 03 07:31:27 crc kubenswrapper[4475]: I1203 07:31:27.626965 4475 scope.go:117] "RemoveContainer" containerID="5f7d7bc98b9da771a3bc3845f4347f6e6989201517253c0b1d84b23382ba672e" Dec 03 07:31:27 crc kubenswrapper[4475]: I1203 07:31:27.644357 4475 scope.go:117] "RemoveContainer" containerID="e97af5199d72f25648957743b40ee98b62a50618fa4e2e4f43abafb85decf6f8" Dec 03 07:31:27 crc kubenswrapper[4475]: I1203 07:31:27.679343 4475 scope.go:117] "RemoveContainer" containerID="f472346ccaa7d539e38e353b780ca326218148c68089f05d18019294744e6d1a" Dec 03 07:31:27 crc kubenswrapper[4475]: E1203 07:31:27.683057 4475 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f472346ccaa7d539e38e353b780ca326218148c68089f05d18019294744e6d1a\": container with ID starting with f472346ccaa7d539e38e353b780ca326218148c68089f05d18019294744e6d1a not found: ID does not exist" containerID="f472346ccaa7d539e38e353b780ca326218148c68089f05d18019294744e6d1a" Dec 03 07:31:27 crc kubenswrapper[4475]: I1203 07:31:27.683422 4475 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f472346ccaa7d539e38e353b780ca326218148c68089f05d18019294744e6d1a"} err="failed to get container status \"f472346ccaa7d539e38e353b780ca326218148c68089f05d18019294744e6d1a\": rpc error: code = NotFound desc = could not find 
container \"f472346ccaa7d539e38e353b780ca326218148c68089f05d18019294744e6d1a\": container with ID starting with f472346ccaa7d539e38e353b780ca326218148c68089f05d18019294744e6d1a not found: ID does not exist" Dec 03 07:31:27 crc kubenswrapper[4475]: I1203 07:31:27.683480 4475 scope.go:117] "RemoveContainer" containerID="5f7d7bc98b9da771a3bc3845f4347f6e6989201517253c0b1d84b23382ba672e" Dec 03 07:31:27 crc kubenswrapper[4475]: E1203 07:31:27.683842 4475 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5f7d7bc98b9da771a3bc3845f4347f6e6989201517253c0b1d84b23382ba672e\": container with ID starting with 5f7d7bc98b9da771a3bc3845f4347f6e6989201517253c0b1d84b23382ba672e not found: ID does not exist" containerID="5f7d7bc98b9da771a3bc3845f4347f6e6989201517253c0b1d84b23382ba672e" Dec 03 07:31:27 crc kubenswrapper[4475]: I1203 07:31:27.683860 4475 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5f7d7bc98b9da771a3bc3845f4347f6e6989201517253c0b1d84b23382ba672e"} err="failed to get container status \"5f7d7bc98b9da771a3bc3845f4347f6e6989201517253c0b1d84b23382ba672e\": rpc error: code = NotFound desc = could not find container \"5f7d7bc98b9da771a3bc3845f4347f6e6989201517253c0b1d84b23382ba672e\": container with ID starting with 5f7d7bc98b9da771a3bc3845f4347f6e6989201517253c0b1d84b23382ba672e not found: ID does not exist" Dec 03 07:31:27 crc kubenswrapper[4475]: I1203 07:31:27.683873 4475 scope.go:117] "RemoveContainer" containerID="e97af5199d72f25648957743b40ee98b62a50618fa4e2e4f43abafb85decf6f8" Dec 03 07:31:27 crc kubenswrapper[4475]: E1203 07:31:27.684192 4475 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e97af5199d72f25648957743b40ee98b62a50618fa4e2e4f43abafb85decf6f8\": container with ID starting with e97af5199d72f25648957743b40ee98b62a50618fa4e2e4f43abafb85decf6f8 not found: ID does 
not exist" containerID="e97af5199d72f25648957743b40ee98b62a50618fa4e2e4f43abafb85decf6f8" Dec 03 07:31:27 crc kubenswrapper[4475]: I1203 07:31:27.684207 4475 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e97af5199d72f25648957743b40ee98b62a50618fa4e2e4f43abafb85decf6f8"} err="failed to get container status \"e97af5199d72f25648957743b40ee98b62a50618fa4e2e4f43abafb85decf6f8\": rpc error: code = NotFound desc = could not find container \"e97af5199d72f25648957743b40ee98b62a50618fa4e2e4f43abafb85decf6f8\": container with ID starting with e97af5199d72f25648957743b40ee98b62a50618fa4e2e4f43abafb85decf6f8 not found: ID does not exist" Dec 03 07:31:29 crc kubenswrapper[4475]: I1203 07:31:29.501533 4475 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f7006506-f69d-40a4-b64f-701b8c419881" path="/var/lib/kubelet/pods/f7006506-f69d-40a4-b64f-701b8c419881/volumes" Dec 03 07:32:16 crc kubenswrapper[4475]: I1203 07:32:16.106879 4475 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-hlt7z"] Dec 03 07:32:16 crc kubenswrapper[4475]: E1203 07:32:16.110668 4475 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f7006506-f69d-40a4-b64f-701b8c419881" containerName="extract-utilities" Dec 03 07:32:16 crc kubenswrapper[4475]: I1203 07:32:16.110694 4475 state_mem.go:107] "Deleted CPUSet assignment" podUID="f7006506-f69d-40a4-b64f-701b8c419881" containerName="extract-utilities" Dec 03 07:32:16 crc kubenswrapper[4475]: E1203 07:32:16.110723 4475 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f7006506-f69d-40a4-b64f-701b8c419881" containerName="registry-server" Dec 03 07:32:16 crc kubenswrapper[4475]: I1203 07:32:16.110728 4475 state_mem.go:107] "Deleted CPUSet assignment" podUID="f7006506-f69d-40a4-b64f-701b8c419881" containerName="registry-server" Dec 03 07:32:16 crc kubenswrapper[4475]: E1203 07:32:16.110739 4475 cpu_manager.go:410] 
"RemoveStaleState: removing container" podUID="f7006506-f69d-40a4-b64f-701b8c419881" containerName="extract-content" Dec 03 07:32:16 crc kubenswrapper[4475]: I1203 07:32:16.110744 4475 state_mem.go:107] "Deleted CPUSet assignment" podUID="f7006506-f69d-40a4-b64f-701b8c419881" containerName="extract-content" Dec 03 07:32:16 crc kubenswrapper[4475]: I1203 07:32:16.111226 4475 memory_manager.go:354] "RemoveStaleState removing state" podUID="f7006506-f69d-40a4-b64f-701b8c419881" containerName="registry-server" Dec 03 07:32:16 crc kubenswrapper[4475]: I1203 07:32:16.116947 4475 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-hlt7z" Dec 03 07:32:16 crc kubenswrapper[4475]: I1203 07:32:16.261425 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b0ec076a-41a2-4c62-8453-8e3332888fee-catalog-content\") pod \"community-operators-hlt7z\" (UID: \"b0ec076a-41a2-4c62-8453-8e3332888fee\") " pod="openshift-marketplace/community-operators-hlt7z" Dec 03 07:32:16 crc kubenswrapper[4475]: I1203 07:32:16.261728 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b0ec076a-41a2-4c62-8453-8e3332888fee-utilities\") pod \"community-operators-hlt7z\" (UID: \"b0ec076a-41a2-4c62-8453-8e3332888fee\") " pod="openshift-marketplace/community-operators-hlt7z" Dec 03 07:32:16 crc kubenswrapper[4475]: I1203 07:32:16.261770 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pvx9q\" (UniqueName: \"kubernetes.io/projected/b0ec076a-41a2-4c62-8453-8e3332888fee-kube-api-access-pvx9q\") pod \"community-operators-hlt7z\" (UID: \"b0ec076a-41a2-4c62-8453-8e3332888fee\") " pod="openshift-marketplace/community-operators-hlt7z" Dec 03 07:32:16 crc kubenswrapper[4475]: I1203 
07:32:16.317304 4475 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-hlt7z"] Dec 03 07:32:16 crc kubenswrapper[4475]: I1203 07:32:16.364094 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b0ec076a-41a2-4c62-8453-8e3332888fee-catalog-content\") pod \"community-operators-hlt7z\" (UID: \"b0ec076a-41a2-4c62-8453-8e3332888fee\") " pod="openshift-marketplace/community-operators-hlt7z" Dec 03 07:32:16 crc kubenswrapper[4475]: I1203 07:32:16.364133 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b0ec076a-41a2-4c62-8453-8e3332888fee-utilities\") pod \"community-operators-hlt7z\" (UID: \"b0ec076a-41a2-4c62-8453-8e3332888fee\") " pod="openshift-marketplace/community-operators-hlt7z" Dec 03 07:32:16 crc kubenswrapper[4475]: I1203 07:32:16.364177 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pvx9q\" (UniqueName: \"kubernetes.io/projected/b0ec076a-41a2-4c62-8453-8e3332888fee-kube-api-access-pvx9q\") pod \"community-operators-hlt7z\" (UID: \"b0ec076a-41a2-4c62-8453-8e3332888fee\") " pod="openshift-marketplace/community-operators-hlt7z" Dec 03 07:32:16 crc kubenswrapper[4475]: I1203 07:32:16.368122 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b0ec076a-41a2-4c62-8453-8e3332888fee-catalog-content\") pod \"community-operators-hlt7z\" (UID: \"b0ec076a-41a2-4c62-8453-8e3332888fee\") " pod="openshift-marketplace/community-operators-hlt7z" Dec 03 07:32:16 crc kubenswrapper[4475]: I1203 07:32:16.369195 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b0ec076a-41a2-4c62-8453-8e3332888fee-utilities\") pod \"community-operators-hlt7z\" (UID: 
\"b0ec076a-41a2-4c62-8453-8e3332888fee\") " pod="openshift-marketplace/community-operators-hlt7z" Dec 03 07:32:16 crc kubenswrapper[4475]: I1203 07:32:16.397554 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pvx9q\" (UniqueName: \"kubernetes.io/projected/b0ec076a-41a2-4c62-8453-8e3332888fee-kube-api-access-pvx9q\") pod \"community-operators-hlt7z\" (UID: \"b0ec076a-41a2-4c62-8453-8e3332888fee\") " pod="openshift-marketplace/community-operators-hlt7z" Dec 03 07:32:16 crc kubenswrapper[4475]: I1203 07:32:16.438096 4475 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-hlt7z" Dec 03 07:32:17 crc kubenswrapper[4475]: I1203 07:32:17.254619 4475 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-hlt7z"] Dec 03 07:32:17 crc kubenswrapper[4475]: I1203 07:32:17.932159 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-hlt7z" event={"ID":"b0ec076a-41a2-4c62-8453-8e3332888fee","Type":"ContainerDied","Data":"00b2ab3fde8814815078251666969923e51941eadd33ad6e344b67a05fc5b413"} Dec 03 07:32:17 crc kubenswrapper[4475]: I1203 07:32:17.932716 4475 generic.go:334] "Generic (PLEG): container finished" podID="b0ec076a-41a2-4c62-8453-8e3332888fee" containerID="00b2ab3fde8814815078251666969923e51941eadd33ad6e344b67a05fc5b413" exitCode=0 Dec 03 07:32:17 crc kubenswrapper[4475]: I1203 07:32:17.932972 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-hlt7z" event={"ID":"b0ec076a-41a2-4c62-8453-8e3332888fee","Type":"ContainerStarted","Data":"983346612b6150c8ac59a2a6e415728c3e3ec0a129d201a6bf142a03a77f8e36"} Dec 03 07:32:22 crc kubenswrapper[4475]: I1203 07:32:22.979083 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-hlt7z" 
event={"ID":"b0ec076a-41a2-4c62-8453-8e3332888fee","Type":"ContainerStarted","Data":"2ef34bbe13cdd5557cf373808210f924431ca26069fa6601c5328722325d2ecd"} Dec 03 07:32:23 crc kubenswrapper[4475]: I1203 07:32:23.987500 4475 generic.go:334] "Generic (PLEG): container finished" podID="b0ec076a-41a2-4c62-8453-8e3332888fee" containerID="2ef34bbe13cdd5557cf373808210f924431ca26069fa6601c5328722325d2ecd" exitCode=0 Dec 03 07:32:23 crc kubenswrapper[4475]: I1203 07:32:23.987546 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-hlt7z" event={"ID":"b0ec076a-41a2-4c62-8453-8e3332888fee","Type":"ContainerDied","Data":"2ef34bbe13cdd5557cf373808210f924431ca26069fa6601c5328722325d2ecd"} Dec 03 07:32:24 crc kubenswrapper[4475]: I1203 07:32:24.995700 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-hlt7z" event={"ID":"b0ec076a-41a2-4c62-8453-8e3332888fee","Type":"ContainerStarted","Data":"de87bd9a32e7b10db6080ceff0ec3e52aff4bb3f5fd76ad59413983d3cb19eb1"} Dec 03 07:32:25 crc kubenswrapper[4475]: I1203 07:32:25.021746 4475 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-hlt7z" podStartSLOduration=3.344203194 podStartE2EDuration="10.020890623s" podCreationTimestamp="2025-12-03 07:32:15 +0000 UTC" firstStartedPulling="2025-12-03 07:32:17.933677627 +0000 UTC m=+2822.738575951" lastFinishedPulling="2025-12-03 07:32:24.610365046 +0000 UTC m=+2829.415263380" observedRunningTime="2025-12-03 07:32:25.015202919 +0000 UTC m=+2829.820101254" watchObservedRunningTime="2025-12-03 07:32:25.020890623 +0000 UTC m=+2829.825788958" Dec 03 07:32:26 crc kubenswrapper[4475]: I1203 07:32:26.439497 4475 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-hlt7z" Dec 03 07:32:26 crc kubenswrapper[4475]: I1203 07:32:26.439724 4475 kubelet.go:2542] "SyncLoop (probe)" 
probe="readiness" status="" pod="openshift-marketplace/community-operators-hlt7z" Dec 03 07:32:27 crc kubenswrapper[4475]: I1203 07:32:27.486651 4475 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/community-operators-hlt7z" podUID="b0ec076a-41a2-4c62-8453-8e3332888fee" containerName="registry-server" probeResult="failure" output=< Dec 03 07:32:27 crc kubenswrapper[4475]: timeout: failed to connect service ":50051" within 1s Dec 03 07:32:27 crc kubenswrapper[4475]: > Dec 03 07:32:36 crc kubenswrapper[4475]: I1203 07:32:36.489759 4475 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-hlt7z" Dec 03 07:32:36 crc kubenswrapper[4475]: I1203 07:32:36.531045 4475 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-hlt7z" Dec 03 07:32:36 crc kubenswrapper[4475]: I1203 07:32:36.631582 4475 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-hlt7z"] Dec 03 07:32:36 crc kubenswrapper[4475]: I1203 07:32:36.729270 4475 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-s8gdr"] Dec 03 07:32:36 crc kubenswrapper[4475]: I1203 07:32:36.730530 4475 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-s8gdr" podUID="495a50fc-19f4-49e9-a195-196e75ebf30f" containerName="registry-server" containerID="cri-o://f0a1cb37c3c99a18ef363d8ef0cf1de9f693eaa36913ec74adcf9d6633ab9ab5" gracePeriod=2 Dec 03 07:32:37 crc kubenswrapper[4475]: I1203 07:32:37.088581 4475 generic.go:334] "Generic (PLEG): container finished" podID="495a50fc-19f4-49e9-a195-196e75ebf30f" containerID="f0a1cb37c3c99a18ef363d8ef0cf1de9f693eaa36913ec74adcf9d6633ab9ab5" exitCode=0 Dec 03 07:32:37 crc kubenswrapper[4475]: I1203 07:32:37.088777 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/community-operators-s8gdr" event={"ID":"495a50fc-19f4-49e9-a195-196e75ebf30f","Type":"ContainerDied","Data":"f0a1cb37c3c99a18ef363d8ef0cf1de9f693eaa36913ec74adcf9d6633ab9ab5"} Dec 03 07:32:37 crc kubenswrapper[4475]: I1203 07:32:37.545801 4475 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-s8gdr" Dec 03 07:32:37 crc kubenswrapper[4475]: I1203 07:32:37.703792 4475 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/495a50fc-19f4-49e9-a195-196e75ebf30f-utilities\") pod \"495a50fc-19f4-49e9-a195-196e75ebf30f\" (UID: \"495a50fc-19f4-49e9-a195-196e75ebf30f\") " Dec 03 07:32:37 crc kubenswrapper[4475]: I1203 07:32:37.704123 4475 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/495a50fc-19f4-49e9-a195-196e75ebf30f-catalog-content\") pod \"495a50fc-19f4-49e9-a195-196e75ebf30f\" (UID: \"495a50fc-19f4-49e9-a195-196e75ebf30f\") " Dec 03 07:32:37 crc kubenswrapper[4475]: I1203 07:32:37.704310 4475 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hbm9x\" (UniqueName: \"kubernetes.io/projected/495a50fc-19f4-49e9-a195-196e75ebf30f-kube-api-access-hbm9x\") pod \"495a50fc-19f4-49e9-a195-196e75ebf30f\" (UID: \"495a50fc-19f4-49e9-a195-196e75ebf30f\") " Dec 03 07:32:37 crc kubenswrapper[4475]: I1203 07:32:37.707094 4475 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/495a50fc-19f4-49e9-a195-196e75ebf30f-utilities" (OuterVolumeSpecName: "utilities") pod "495a50fc-19f4-49e9-a195-196e75ebf30f" (UID: "495a50fc-19f4-49e9-a195-196e75ebf30f"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 07:32:37 crc kubenswrapper[4475]: I1203 07:32:37.726262 4475 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/495a50fc-19f4-49e9-a195-196e75ebf30f-kube-api-access-hbm9x" (OuterVolumeSpecName: "kube-api-access-hbm9x") pod "495a50fc-19f4-49e9-a195-196e75ebf30f" (UID: "495a50fc-19f4-49e9-a195-196e75ebf30f"). InnerVolumeSpecName "kube-api-access-hbm9x". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 07:32:37 crc kubenswrapper[4475]: I1203 07:32:37.772483 4475 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/495a50fc-19f4-49e9-a195-196e75ebf30f-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "495a50fc-19f4-49e9-a195-196e75ebf30f" (UID: "495a50fc-19f4-49e9-a195-196e75ebf30f"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 07:32:37 crc kubenswrapper[4475]: I1203 07:32:37.806137 4475 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hbm9x\" (UniqueName: \"kubernetes.io/projected/495a50fc-19f4-49e9-a195-196e75ebf30f-kube-api-access-hbm9x\") on node \"crc\" DevicePath \"\"" Dec 03 07:32:37 crc kubenswrapper[4475]: I1203 07:32:37.806164 4475 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/495a50fc-19f4-49e9-a195-196e75ebf30f-utilities\") on node \"crc\" DevicePath \"\"" Dec 03 07:32:37 crc kubenswrapper[4475]: I1203 07:32:37.806175 4475 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/495a50fc-19f4-49e9-a195-196e75ebf30f-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 03 07:32:38 crc kubenswrapper[4475]: I1203 07:32:38.097948 4475 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-s8gdr" Dec 03 07:32:38 crc kubenswrapper[4475]: I1203 07:32:38.097945 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-s8gdr" event={"ID":"495a50fc-19f4-49e9-a195-196e75ebf30f","Type":"ContainerDied","Data":"a9ec87a32bf6edb894551fb5db41ba1323b458c07b1f91d23a445254f6310291"} Dec 03 07:32:38 crc kubenswrapper[4475]: I1203 07:32:38.099169 4475 scope.go:117] "RemoveContainer" containerID="f0a1cb37c3c99a18ef363d8ef0cf1de9f693eaa36913ec74adcf9d6633ab9ab5" Dec 03 07:32:38 crc kubenswrapper[4475]: I1203 07:32:38.129882 4475 scope.go:117] "RemoveContainer" containerID="013de5e5f407ac12eca079d900b3f766064f6cc13fd70080b6c285f545d161d1" Dec 03 07:32:38 crc kubenswrapper[4475]: I1203 07:32:38.134544 4475 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-s8gdr"] Dec 03 07:32:38 crc kubenswrapper[4475]: I1203 07:32:38.142653 4475 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-s8gdr"] Dec 03 07:32:38 crc kubenswrapper[4475]: I1203 07:32:38.158779 4475 scope.go:117] "RemoveContainer" containerID="f98b4cc7559ab2a2c445d6d08920b8953c8aca7c315446bbb70bb55a34616a01" Dec 03 07:32:39 crc kubenswrapper[4475]: I1203 07:32:39.550801 4475 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="495a50fc-19f4-49e9-a195-196e75ebf30f" path="/var/lib/kubelet/pods/495a50fc-19f4-49e9-a195-196e75ebf30f/volumes" Dec 03 07:32:48 crc kubenswrapper[4475]: I1203 07:32:48.783573 4475 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-gtzll"] Dec 03 07:32:48 crc kubenswrapper[4475]: E1203 07:32:48.786545 4475 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="495a50fc-19f4-49e9-a195-196e75ebf30f" containerName="extract-content" Dec 03 07:32:48 crc kubenswrapper[4475]: I1203 07:32:48.786570 4475 state_mem.go:107] "Deleted 
CPUSet assignment" podUID="495a50fc-19f4-49e9-a195-196e75ebf30f" containerName="extract-content" Dec 03 07:32:48 crc kubenswrapper[4475]: E1203 07:32:48.786597 4475 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="495a50fc-19f4-49e9-a195-196e75ebf30f" containerName="extract-utilities" Dec 03 07:32:48 crc kubenswrapper[4475]: I1203 07:32:48.786603 4475 state_mem.go:107] "Deleted CPUSet assignment" podUID="495a50fc-19f4-49e9-a195-196e75ebf30f" containerName="extract-utilities" Dec 03 07:32:48 crc kubenswrapper[4475]: E1203 07:32:48.786622 4475 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="495a50fc-19f4-49e9-a195-196e75ebf30f" containerName="registry-server" Dec 03 07:32:48 crc kubenswrapper[4475]: I1203 07:32:48.786627 4475 state_mem.go:107] "Deleted CPUSet assignment" podUID="495a50fc-19f4-49e9-a195-196e75ebf30f" containerName="registry-server" Dec 03 07:32:48 crc kubenswrapper[4475]: I1203 07:32:48.788034 4475 memory_manager.go:354] "RemoveStaleState removing state" podUID="495a50fc-19f4-49e9-a195-196e75ebf30f" containerName="registry-server" Dec 03 07:32:48 crc kubenswrapper[4475]: I1203 07:32:48.791627 4475 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-gtzll" Dec 03 07:32:48 crc kubenswrapper[4475]: I1203 07:32:48.838188 4475 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-gtzll"] Dec 03 07:32:48 crc kubenswrapper[4475]: I1203 07:32:48.894064 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gjkbk\" (UniqueName: \"kubernetes.io/projected/cc094045-e34c-4c77-86f4-24f96c18fbfe-kube-api-access-gjkbk\") pod \"redhat-operators-gtzll\" (UID: \"cc094045-e34c-4c77-86f4-24f96c18fbfe\") " pod="openshift-marketplace/redhat-operators-gtzll" Dec 03 07:32:48 crc kubenswrapper[4475]: I1203 07:32:48.894330 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/cc094045-e34c-4c77-86f4-24f96c18fbfe-catalog-content\") pod \"redhat-operators-gtzll\" (UID: \"cc094045-e34c-4c77-86f4-24f96c18fbfe\") " pod="openshift-marketplace/redhat-operators-gtzll" Dec 03 07:32:48 crc kubenswrapper[4475]: I1203 07:32:48.894473 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/cc094045-e34c-4c77-86f4-24f96c18fbfe-utilities\") pod \"redhat-operators-gtzll\" (UID: \"cc094045-e34c-4c77-86f4-24f96c18fbfe\") " pod="openshift-marketplace/redhat-operators-gtzll" Dec 03 07:32:48 crc kubenswrapper[4475]: I1203 07:32:48.996059 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/cc094045-e34c-4c77-86f4-24f96c18fbfe-utilities\") pod \"redhat-operators-gtzll\" (UID: \"cc094045-e34c-4c77-86f4-24f96c18fbfe\") " pod="openshift-marketplace/redhat-operators-gtzll" Dec 03 07:32:48 crc kubenswrapper[4475]: I1203 07:32:48.996114 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access-gjkbk\" (UniqueName: \"kubernetes.io/projected/cc094045-e34c-4c77-86f4-24f96c18fbfe-kube-api-access-gjkbk\") pod \"redhat-operators-gtzll\" (UID: \"cc094045-e34c-4c77-86f4-24f96c18fbfe\") " pod="openshift-marketplace/redhat-operators-gtzll" Dec 03 07:32:48 crc kubenswrapper[4475]: I1203 07:32:48.996238 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/cc094045-e34c-4c77-86f4-24f96c18fbfe-catalog-content\") pod \"redhat-operators-gtzll\" (UID: \"cc094045-e34c-4c77-86f4-24f96c18fbfe\") " pod="openshift-marketplace/redhat-operators-gtzll" Dec 03 07:32:48 crc kubenswrapper[4475]: I1203 07:32:48.999218 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/cc094045-e34c-4c77-86f4-24f96c18fbfe-utilities\") pod \"redhat-operators-gtzll\" (UID: \"cc094045-e34c-4c77-86f4-24f96c18fbfe\") " pod="openshift-marketplace/redhat-operators-gtzll" Dec 03 07:32:48 crc kubenswrapper[4475]: I1203 07:32:48.999789 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/cc094045-e34c-4c77-86f4-24f96c18fbfe-catalog-content\") pod \"redhat-operators-gtzll\" (UID: \"cc094045-e34c-4c77-86f4-24f96c18fbfe\") " pod="openshift-marketplace/redhat-operators-gtzll" Dec 03 07:32:49 crc kubenswrapper[4475]: I1203 07:32:49.022860 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gjkbk\" (UniqueName: \"kubernetes.io/projected/cc094045-e34c-4c77-86f4-24f96c18fbfe-kube-api-access-gjkbk\") pod \"redhat-operators-gtzll\" (UID: \"cc094045-e34c-4c77-86f4-24f96c18fbfe\") " pod="openshift-marketplace/redhat-operators-gtzll" Dec 03 07:32:49 crc kubenswrapper[4475]: I1203 07:32:49.110498 4475 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-gtzll" Dec 03 07:32:49 crc kubenswrapper[4475]: I1203 07:32:49.856403 4475 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-gtzll"] Dec 03 07:32:50 crc kubenswrapper[4475]: I1203 07:32:50.171493 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-gtzll" event={"ID":"cc094045-e34c-4c77-86f4-24f96c18fbfe","Type":"ContainerDied","Data":"4c9c49736435149a5251ca2a401d3d471e383c2c7f6884f27b416507f0a6c28c"} Dec 03 07:32:50 crc kubenswrapper[4475]: I1203 07:32:50.172263 4475 generic.go:334] "Generic (PLEG): container finished" podID="cc094045-e34c-4c77-86f4-24f96c18fbfe" containerID="4c9c49736435149a5251ca2a401d3d471e383c2c7f6884f27b416507f0a6c28c" exitCode=0 Dec 03 07:32:50 crc kubenswrapper[4475]: I1203 07:32:50.172314 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-gtzll" event={"ID":"cc094045-e34c-4c77-86f4-24f96c18fbfe","Type":"ContainerStarted","Data":"23ebc22d7540a33d9e6d7961c6ab45f6e0d18565b2a4011e2483bc11b99f60d7"} Dec 03 07:32:51 crc kubenswrapper[4475]: I1203 07:32:51.179819 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-gtzll" event={"ID":"cc094045-e34c-4c77-86f4-24f96c18fbfe","Type":"ContainerStarted","Data":"df080dc409b7bb8e7add7ac1f52983d6a431b6eff682d39b2742aa057f397a6d"} Dec 03 07:32:53 crc kubenswrapper[4475]: E1203 07:32:53.672727 4475 upgradeaware.go:427] Error proxying data from client to backend: readfrom tcp 192.168.25.177:52502->192.168.25.177:40263: write tcp 192.168.25.177:52502->192.168.25.177:40263: write: connection reset by peer Dec 03 07:32:54 crc kubenswrapper[4475]: I1203 07:32:54.209697 4475 generic.go:334] "Generic (PLEG): container finished" podID="cc094045-e34c-4c77-86f4-24f96c18fbfe" containerID="df080dc409b7bb8e7add7ac1f52983d6a431b6eff682d39b2742aa057f397a6d" exitCode=0 Dec 
03 07:32:54 crc kubenswrapper[4475]: I1203 07:32:54.209965 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-gtzll" event={"ID":"cc094045-e34c-4c77-86f4-24f96c18fbfe","Type":"ContainerDied","Data":"df080dc409b7bb8e7add7ac1f52983d6a431b6eff682d39b2742aa057f397a6d"} Dec 03 07:32:55 crc kubenswrapper[4475]: I1203 07:32:55.219251 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-gtzll" event={"ID":"cc094045-e34c-4c77-86f4-24f96c18fbfe","Type":"ContainerStarted","Data":"cb3232f41c747ac211476180a4158a3f86a14119b3ec22d2e876662c2f833db8"} Dec 03 07:32:55 crc kubenswrapper[4475]: I1203 07:32:55.239487 4475 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-gtzll" podStartSLOduration=2.6677639810000002 podStartE2EDuration="7.239039484s" podCreationTimestamp="2025-12-03 07:32:48 +0000 UTC" firstStartedPulling="2025-12-03 07:32:50.174261253 +0000 UTC m=+2854.979159588" lastFinishedPulling="2025-12-03 07:32:54.745536757 +0000 UTC m=+2859.550435091" observedRunningTime="2025-12-03 07:32:55.234397184 +0000 UTC m=+2860.039295519" watchObservedRunningTime="2025-12-03 07:32:55.239039484 +0000 UTC m=+2860.043937817" Dec 03 07:32:58 crc kubenswrapper[4475]: I1203 07:32:58.934304 4475 patch_prober.go:28] interesting pod/machine-config-daemon-tjbzg container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 03 07:32:58 crc kubenswrapper[4475]: I1203 07:32:58.935114 4475 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-tjbzg" podUID="91aee7be-4a52-4598-803f-2deebe0674de" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: 
connect: connection refused" Dec 03 07:32:59 crc kubenswrapper[4475]: I1203 07:32:59.111865 4475 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-gtzll" Dec 03 07:32:59 crc kubenswrapper[4475]: I1203 07:32:59.111899 4475 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-gtzll" Dec 03 07:33:00 crc kubenswrapper[4475]: I1203 07:33:00.148583 4475 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-gtzll" podUID="cc094045-e34c-4c77-86f4-24f96c18fbfe" containerName="registry-server" probeResult="failure" output=< Dec 03 07:33:00 crc kubenswrapper[4475]: timeout: failed to connect service ":50051" within 1s Dec 03 07:33:00 crc kubenswrapper[4475]: > Dec 03 07:33:09 crc kubenswrapper[4475]: I1203 07:33:09.149170 4475 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-gtzll" Dec 03 07:33:09 crc kubenswrapper[4475]: I1203 07:33:09.182671 4475 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-gtzll" Dec 03 07:33:09 crc kubenswrapper[4475]: I1203 07:33:09.530560 4475 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-gtzll"] Dec 03 07:33:10 crc kubenswrapper[4475]: I1203 07:33:10.317112 4475 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-gtzll" podUID="cc094045-e34c-4c77-86f4-24f96c18fbfe" containerName="registry-server" containerID="cri-o://cb3232f41c747ac211476180a4158a3f86a14119b3ec22d2e876662c2f833db8" gracePeriod=2 Dec 03 07:33:11 crc kubenswrapper[4475]: I1203 07:33:11.248325 4475 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-gtzll" Dec 03 07:33:11 crc kubenswrapper[4475]: I1203 07:33:11.273289 4475 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gjkbk\" (UniqueName: \"kubernetes.io/projected/cc094045-e34c-4c77-86f4-24f96c18fbfe-kube-api-access-gjkbk\") pod \"cc094045-e34c-4c77-86f4-24f96c18fbfe\" (UID: \"cc094045-e34c-4c77-86f4-24f96c18fbfe\") " Dec 03 07:33:11 crc kubenswrapper[4475]: I1203 07:33:11.273511 4475 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/cc094045-e34c-4c77-86f4-24f96c18fbfe-catalog-content\") pod \"cc094045-e34c-4c77-86f4-24f96c18fbfe\" (UID: \"cc094045-e34c-4c77-86f4-24f96c18fbfe\") " Dec 03 07:33:11 crc kubenswrapper[4475]: I1203 07:33:11.273558 4475 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/cc094045-e34c-4c77-86f4-24f96c18fbfe-utilities\") pod \"cc094045-e34c-4c77-86f4-24f96c18fbfe\" (UID: \"cc094045-e34c-4c77-86f4-24f96c18fbfe\") " Dec 03 07:33:11 crc kubenswrapper[4475]: I1203 07:33:11.276969 4475 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/cc094045-e34c-4c77-86f4-24f96c18fbfe-utilities" (OuterVolumeSpecName: "utilities") pod "cc094045-e34c-4c77-86f4-24f96c18fbfe" (UID: "cc094045-e34c-4c77-86f4-24f96c18fbfe"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 07:33:11 crc kubenswrapper[4475]: I1203 07:33:11.293771 4475 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cc094045-e34c-4c77-86f4-24f96c18fbfe-kube-api-access-gjkbk" (OuterVolumeSpecName: "kube-api-access-gjkbk") pod "cc094045-e34c-4c77-86f4-24f96c18fbfe" (UID: "cc094045-e34c-4c77-86f4-24f96c18fbfe"). InnerVolumeSpecName "kube-api-access-gjkbk". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 07:33:11 crc kubenswrapper[4475]: I1203 07:33:11.328401 4475 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-gtzll" Dec 03 07:33:11 crc kubenswrapper[4475]: I1203 07:33:11.328428 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-gtzll" event={"ID":"cc094045-e34c-4c77-86f4-24f96c18fbfe","Type":"ContainerDied","Data":"cb3232f41c747ac211476180a4158a3f86a14119b3ec22d2e876662c2f833db8"} Dec 03 07:33:11 crc kubenswrapper[4475]: I1203 07:33:11.328313 4475 generic.go:334] "Generic (PLEG): container finished" podID="cc094045-e34c-4c77-86f4-24f96c18fbfe" containerID="cb3232f41c747ac211476180a4158a3f86a14119b3ec22d2e876662c2f833db8" exitCode=0 Dec 03 07:33:11 crc kubenswrapper[4475]: I1203 07:33:11.328733 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-gtzll" event={"ID":"cc094045-e34c-4c77-86f4-24f96c18fbfe","Type":"ContainerDied","Data":"23ebc22d7540a33d9e6d7961c6ab45f6e0d18565b2a4011e2483bc11b99f60d7"} Dec 03 07:33:11 crc kubenswrapper[4475]: I1203 07:33:11.329734 4475 scope.go:117] "RemoveContainer" containerID="cb3232f41c747ac211476180a4158a3f86a14119b3ec22d2e876662c2f833db8" Dec 03 07:33:11 crc kubenswrapper[4475]: I1203 07:33:11.369319 4475 scope.go:117] "RemoveContainer" containerID="df080dc409b7bb8e7add7ac1f52983d6a431b6eff682d39b2742aa057f397a6d" Dec 03 07:33:11 crc kubenswrapper[4475]: I1203 07:33:11.380741 4475 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/cc094045-e34c-4c77-86f4-24f96c18fbfe-utilities\") on node \"crc\" DevicePath \"\"" Dec 03 07:33:11 crc kubenswrapper[4475]: I1203 07:33:11.380782 4475 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gjkbk\" (UniqueName: 
\"kubernetes.io/projected/cc094045-e34c-4c77-86f4-24f96c18fbfe-kube-api-access-gjkbk\") on node \"crc\" DevicePath \"\"" Dec 03 07:33:11 crc kubenswrapper[4475]: I1203 07:33:11.409410 4475 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/cc094045-e34c-4c77-86f4-24f96c18fbfe-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "cc094045-e34c-4c77-86f4-24f96c18fbfe" (UID: "cc094045-e34c-4c77-86f4-24f96c18fbfe"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 07:33:11 crc kubenswrapper[4475]: I1203 07:33:11.409772 4475 scope.go:117] "RemoveContainer" containerID="4c9c49736435149a5251ca2a401d3d471e383c2c7f6884f27b416507f0a6c28c" Dec 03 07:33:11 crc kubenswrapper[4475]: I1203 07:33:11.437753 4475 scope.go:117] "RemoveContainer" containerID="cb3232f41c747ac211476180a4158a3f86a14119b3ec22d2e876662c2f833db8" Dec 03 07:33:11 crc kubenswrapper[4475]: E1203 07:33:11.440143 4475 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"cb3232f41c747ac211476180a4158a3f86a14119b3ec22d2e876662c2f833db8\": container with ID starting with cb3232f41c747ac211476180a4158a3f86a14119b3ec22d2e876662c2f833db8 not found: ID does not exist" containerID="cb3232f41c747ac211476180a4158a3f86a14119b3ec22d2e876662c2f833db8" Dec 03 07:33:11 crc kubenswrapper[4475]: I1203 07:33:11.440546 4475 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cb3232f41c747ac211476180a4158a3f86a14119b3ec22d2e876662c2f833db8"} err="failed to get container status \"cb3232f41c747ac211476180a4158a3f86a14119b3ec22d2e876662c2f833db8\": rpc error: code = NotFound desc = could not find container \"cb3232f41c747ac211476180a4158a3f86a14119b3ec22d2e876662c2f833db8\": container with ID starting with cb3232f41c747ac211476180a4158a3f86a14119b3ec22d2e876662c2f833db8 not found: ID does not exist" Dec 03 07:33:11 
crc kubenswrapper[4475]: I1203 07:33:11.440585 4475 scope.go:117] "RemoveContainer" containerID="df080dc409b7bb8e7add7ac1f52983d6a431b6eff682d39b2742aa057f397a6d" Dec 03 07:33:11 crc kubenswrapper[4475]: E1203 07:33:11.441645 4475 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"df080dc409b7bb8e7add7ac1f52983d6a431b6eff682d39b2742aa057f397a6d\": container with ID starting with df080dc409b7bb8e7add7ac1f52983d6a431b6eff682d39b2742aa057f397a6d not found: ID does not exist" containerID="df080dc409b7bb8e7add7ac1f52983d6a431b6eff682d39b2742aa057f397a6d" Dec 03 07:33:11 crc kubenswrapper[4475]: I1203 07:33:11.441681 4475 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"df080dc409b7bb8e7add7ac1f52983d6a431b6eff682d39b2742aa057f397a6d"} err="failed to get container status \"df080dc409b7bb8e7add7ac1f52983d6a431b6eff682d39b2742aa057f397a6d\": rpc error: code = NotFound desc = could not find container \"df080dc409b7bb8e7add7ac1f52983d6a431b6eff682d39b2742aa057f397a6d\": container with ID starting with df080dc409b7bb8e7add7ac1f52983d6a431b6eff682d39b2742aa057f397a6d not found: ID does not exist" Dec 03 07:33:11 crc kubenswrapper[4475]: I1203 07:33:11.441705 4475 scope.go:117] "RemoveContainer" containerID="4c9c49736435149a5251ca2a401d3d471e383c2c7f6884f27b416507f0a6c28c" Dec 03 07:33:11 crc kubenswrapper[4475]: E1203 07:33:11.442030 4475 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4c9c49736435149a5251ca2a401d3d471e383c2c7f6884f27b416507f0a6c28c\": container with ID starting with 4c9c49736435149a5251ca2a401d3d471e383c2c7f6884f27b416507f0a6c28c not found: ID does not exist" containerID="4c9c49736435149a5251ca2a401d3d471e383c2c7f6884f27b416507f0a6c28c" Dec 03 07:33:11 crc kubenswrapper[4475]: I1203 07:33:11.442049 4475 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"4c9c49736435149a5251ca2a401d3d471e383c2c7f6884f27b416507f0a6c28c"} err="failed to get container status \"4c9c49736435149a5251ca2a401d3d471e383c2c7f6884f27b416507f0a6c28c\": rpc error: code = NotFound desc = could not find container \"4c9c49736435149a5251ca2a401d3d471e383c2c7f6884f27b416507f0a6c28c\": container with ID starting with 4c9c49736435149a5251ca2a401d3d471e383c2c7f6884f27b416507f0a6c28c not found: ID does not exist" Dec 03 07:33:11 crc kubenswrapper[4475]: I1203 07:33:11.482737 4475 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/cc094045-e34c-4c77-86f4-24f96c18fbfe-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 03 07:33:11 crc kubenswrapper[4475]: I1203 07:33:11.646820 4475 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-gtzll"] Dec 03 07:33:11 crc kubenswrapper[4475]: I1203 07:33:11.654856 4475 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-gtzll"] Dec 03 07:33:13 crc kubenswrapper[4475]: I1203 07:33:13.499150 4475 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cc094045-e34c-4c77-86f4-24f96c18fbfe" path="/var/lib/kubelet/pods/cc094045-e34c-4c77-86f4-24f96c18fbfe/volumes" Dec 03 07:33:28 crc kubenswrapper[4475]: I1203 07:33:28.934336 4475 patch_prober.go:28] interesting pod/machine-config-daemon-tjbzg container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 03 07:33:28 crc kubenswrapper[4475]: I1203 07:33:28.935801 4475 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-tjbzg" podUID="91aee7be-4a52-4598-803f-2deebe0674de" containerName="machine-config-daemon" probeResult="failure" output="Get 
\"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 03 07:33:58 crc kubenswrapper[4475]: I1203 07:33:58.933960 4475 patch_prober.go:28] interesting pod/machine-config-daemon-tjbzg container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 03 07:33:58 crc kubenswrapper[4475]: I1203 07:33:58.934315 4475 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-tjbzg" podUID="91aee7be-4a52-4598-803f-2deebe0674de" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 03 07:33:58 crc kubenswrapper[4475]: I1203 07:33:58.935194 4475 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-tjbzg" Dec 03 07:33:58 crc kubenswrapper[4475]: I1203 07:33:58.936469 4475 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"38011db8441fa258228ddb37b16a7964318696935f6daa20e615955ba9070bde"} pod="openshift-machine-config-operator/machine-config-daemon-tjbzg" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Dec 03 07:33:58 crc kubenswrapper[4475]: I1203 07:33:58.936542 4475 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-tjbzg" podUID="91aee7be-4a52-4598-803f-2deebe0674de" containerName="machine-config-daemon" containerID="cri-o://38011db8441fa258228ddb37b16a7964318696935f6daa20e615955ba9070bde" gracePeriod=600 Dec 03 07:33:59 crc kubenswrapper[4475]: I1203 07:33:59.661947 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-machine-config-operator/machine-config-daemon-tjbzg" event={"ID":"91aee7be-4a52-4598-803f-2deebe0674de","Type":"ContainerDied","Data":"38011db8441fa258228ddb37b16a7964318696935f6daa20e615955ba9070bde"} Dec 03 07:33:59 crc kubenswrapper[4475]: I1203 07:33:59.662428 4475 generic.go:334] "Generic (PLEG): container finished" podID="91aee7be-4a52-4598-803f-2deebe0674de" containerID="38011db8441fa258228ddb37b16a7964318696935f6daa20e615955ba9070bde" exitCode=0 Dec 03 07:33:59 crc kubenswrapper[4475]: I1203 07:33:59.662487 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-tjbzg" event={"ID":"91aee7be-4a52-4598-803f-2deebe0674de","Type":"ContainerStarted","Data":"dffbe6b147869881f4e9260f18420918b1c970e08be8aa0d2d9b606d54067de1"} Dec 03 07:33:59 crc kubenswrapper[4475]: I1203 07:33:59.663562 4475 scope.go:117] "RemoveContainer" containerID="5cadad861f9f456c6d89da22b0f73a55e242fce31e15c0255ce2371d4734ca7f" Dec 03 07:34:00 crc kubenswrapper[4475]: I1203 07:34:00.139355 4475 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-x22m9"] Dec 03 07:34:00 crc kubenswrapper[4475]: E1203 07:34:00.140860 4475 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cc094045-e34c-4c77-86f4-24f96c18fbfe" containerName="registry-server" Dec 03 07:34:00 crc kubenswrapper[4475]: I1203 07:34:00.140878 4475 state_mem.go:107] "Deleted CPUSet assignment" podUID="cc094045-e34c-4c77-86f4-24f96c18fbfe" containerName="registry-server" Dec 03 07:34:00 crc kubenswrapper[4475]: E1203 07:34:00.141038 4475 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cc094045-e34c-4c77-86f4-24f96c18fbfe" containerName="extract-utilities" Dec 03 07:34:00 crc kubenswrapper[4475]: I1203 07:34:00.141047 4475 state_mem.go:107] "Deleted CPUSet assignment" podUID="cc094045-e34c-4c77-86f4-24f96c18fbfe" containerName="extract-utilities" Dec 03 07:34:00 crc kubenswrapper[4475]: 
E1203 07:34:00.141081 4475 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cc094045-e34c-4c77-86f4-24f96c18fbfe" containerName="extract-content" Dec 03 07:34:00 crc kubenswrapper[4475]: I1203 07:34:00.141087 4475 state_mem.go:107] "Deleted CPUSet assignment" podUID="cc094045-e34c-4c77-86f4-24f96c18fbfe" containerName="extract-content" Dec 03 07:34:00 crc kubenswrapper[4475]: I1203 07:34:00.141951 4475 memory_manager.go:354] "RemoveStaleState removing state" podUID="cc094045-e34c-4c77-86f4-24f96c18fbfe" containerName="registry-server" Dec 03 07:34:00 crc kubenswrapper[4475]: I1203 07:34:00.144916 4475 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-x22m9" Dec 03 07:34:00 crc kubenswrapper[4475]: I1203 07:34:00.236565 4475 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-x22m9"] Dec 03 07:34:00 crc kubenswrapper[4475]: I1203 07:34:00.290149 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lbxmt\" (UniqueName: \"kubernetes.io/projected/4fcf4871-1046-4079-b4d9-2429c8854c70-kube-api-access-lbxmt\") pod \"redhat-marketplace-x22m9\" (UID: \"4fcf4871-1046-4079-b4d9-2429c8854c70\") " pod="openshift-marketplace/redhat-marketplace-x22m9" Dec 03 07:34:00 crc kubenswrapper[4475]: I1203 07:34:00.290203 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4fcf4871-1046-4079-b4d9-2429c8854c70-catalog-content\") pod \"redhat-marketplace-x22m9\" (UID: \"4fcf4871-1046-4079-b4d9-2429c8854c70\") " pod="openshift-marketplace/redhat-marketplace-x22m9" Dec 03 07:34:00 crc kubenswrapper[4475]: I1203 07:34:00.290252 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: 
\"kubernetes.io/empty-dir/4fcf4871-1046-4079-b4d9-2429c8854c70-utilities\") pod \"redhat-marketplace-x22m9\" (UID: \"4fcf4871-1046-4079-b4d9-2429c8854c70\") " pod="openshift-marketplace/redhat-marketplace-x22m9" Dec 03 07:34:00 crc kubenswrapper[4475]: I1203 07:34:00.391690 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4fcf4871-1046-4079-b4d9-2429c8854c70-utilities\") pod \"redhat-marketplace-x22m9\" (UID: \"4fcf4871-1046-4079-b4d9-2429c8854c70\") " pod="openshift-marketplace/redhat-marketplace-x22m9" Dec 03 07:34:00 crc kubenswrapper[4475]: I1203 07:34:00.391831 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lbxmt\" (UniqueName: \"kubernetes.io/projected/4fcf4871-1046-4079-b4d9-2429c8854c70-kube-api-access-lbxmt\") pod \"redhat-marketplace-x22m9\" (UID: \"4fcf4871-1046-4079-b4d9-2429c8854c70\") " pod="openshift-marketplace/redhat-marketplace-x22m9" Dec 03 07:34:00 crc kubenswrapper[4475]: I1203 07:34:00.391861 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4fcf4871-1046-4079-b4d9-2429c8854c70-catalog-content\") pod \"redhat-marketplace-x22m9\" (UID: \"4fcf4871-1046-4079-b4d9-2429c8854c70\") " pod="openshift-marketplace/redhat-marketplace-x22m9" Dec 03 07:34:00 crc kubenswrapper[4475]: I1203 07:34:00.393267 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4fcf4871-1046-4079-b4d9-2429c8854c70-catalog-content\") pod \"redhat-marketplace-x22m9\" (UID: \"4fcf4871-1046-4079-b4d9-2429c8854c70\") " pod="openshift-marketplace/redhat-marketplace-x22m9" Dec 03 07:34:00 crc kubenswrapper[4475]: I1203 07:34:00.394997 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: 
\"kubernetes.io/empty-dir/4fcf4871-1046-4079-b4d9-2429c8854c70-utilities\") pod \"redhat-marketplace-x22m9\" (UID: \"4fcf4871-1046-4079-b4d9-2429c8854c70\") " pod="openshift-marketplace/redhat-marketplace-x22m9" Dec 03 07:34:00 crc kubenswrapper[4475]: I1203 07:34:00.418709 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lbxmt\" (UniqueName: \"kubernetes.io/projected/4fcf4871-1046-4079-b4d9-2429c8854c70-kube-api-access-lbxmt\") pod \"redhat-marketplace-x22m9\" (UID: \"4fcf4871-1046-4079-b4d9-2429c8854c70\") " pod="openshift-marketplace/redhat-marketplace-x22m9" Dec 03 07:34:00 crc kubenswrapper[4475]: I1203 07:34:00.471245 4475 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-x22m9" Dec 03 07:34:01 crc kubenswrapper[4475]: I1203 07:34:01.229263 4475 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-x22m9"] Dec 03 07:34:01 crc kubenswrapper[4475]: I1203 07:34:01.677479 4475 generic.go:334] "Generic (PLEG): container finished" podID="4fcf4871-1046-4079-b4d9-2429c8854c70" containerID="d8b75d93967bdb3f052b783a47093f9d1b59ac21f2012557eb1a740f4f425af0" exitCode=0 Dec 03 07:34:01 crc kubenswrapper[4475]: I1203 07:34:01.677512 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-x22m9" event={"ID":"4fcf4871-1046-4079-b4d9-2429c8854c70","Type":"ContainerDied","Data":"d8b75d93967bdb3f052b783a47093f9d1b59ac21f2012557eb1a740f4f425af0"} Dec 03 07:34:01 crc kubenswrapper[4475]: I1203 07:34:01.677546 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-x22m9" event={"ID":"4fcf4871-1046-4079-b4d9-2429c8854c70","Type":"ContainerStarted","Data":"715bd4e90b65aeb0c5d9225da29caffee9a0eea718a6f93cffe127697a4dfe33"} Dec 03 07:34:02 crc kubenswrapper[4475]: I1203 07:34:02.691402 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/redhat-marketplace-x22m9" event={"ID":"4fcf4871-1046-4079-b4d9-2429c8854c70","Type":"ContainerStarted","Data":"ef087b8959039c2541e32256a2df98b67cebce5397b3a55172a51c2e1f5b13aa"} Dec 03 07:34:03 crc kubenswrapper[4475]: I1203 07:34:03.699769 4475 generic.go:334] "Generic (PLEG): container finished" podID="4fcf4871-1046-4079-b4d9-2429c8854c70" containerID="ef087b8959039c2541e32256a2df98b67cebce5397b3a55172a51c2e1f5b13aa" exitCode=0 Dec 03 07:34:03 crc kubenswrapper[4475]: I1203 07:34:03.699954 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-x22m9" event={"ID":"4fcf4871-1046-4079-b4d9-2429c8854c70","Type":"ContainerDied","Data":"ef087b8959039c2541e32256a2df98b67cebce5397b3a55172a51c2e1f5b13aa"} Dec 03 07:34:04 crc kubenswrapper[4475]: I1203 07:34:04.708730 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-x22m9" event={"ID":"4fcf4871-1046-4079-b4d9-2429c8854c70","Type":"ContainerStarted","Data":"f49a5bfb8fcc03422bd3037a4b9b5c3a643ffcdcc43d5174f9c046b3b6d70b27"} Dec 03 07:34:04 crc kubenswrapper[4475]: I1203 07:34:04.724385 4475 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-x22m9" podStartSLOduration=2.225403824 podStartE2EDuration="4.723695447s" podCreationTimestamp="2025-12-03 07:34:00 +0000 UTC" firstStartedPulling="2025-12-03 07:34:01.679831372 +0000 UTC m=+2926.484729706" lastFinishedPulling="2025-12-03 07:34:04.178122995 +0000 UTC m=+2928.983021329" observedRunningTime="2025-12-03 07:34:04.722900742 +0000 UTC m=+2929.527799076" watchObservedRunningTime="2025-12-03 07:34:04.723695447 +0000 UTC m=+2929.528593780" Dec 03 07:34:10 crc kubenswrapper[4475]: I1203 07:34:10.472138 4475 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-x22m9" Dec 03 07:34:10 crc kubenswrapper[4475]: I1203 07:34:10.472546 4475 
kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-x22m9" Dec 03 07:34:10 crc kubenswrapper[4475]: I1203 07:34:10.504644 4475 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-x22m9" Dec 03 07:34:10 crc kubenswrapper[4475]: I1203 07:34:10.777765 4475 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-x22m9" Dec 03 07:34:10 crc kubenswrapper[4475]: I1203 07:34:10.826294 4475 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-x22m9"] Dec 03 07:34:12 crc kubenswrapper[4475]: I1203 07:34:12.759821 4475 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-x22m9" podUID="4fcf4871-1046-4079-b4d9-2429c8854c70" containerName="registry-server" containerID="cri-o://f49a5bfb8fcc03422bd3037a4b9b5c3a643ffcdcc43d5174f9c046b3b6d70b27" gracePeriod=2 Dec 03 07:34:13 crc kubenswrapper[4475]: I1203 07:34:13.289615 4475 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-x22m9" Dec 03 07:34:13 crc kubenswrapper[4475]: I1203 07:34:13.401929 4475 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4fcf4871-1046-4079-b4d9-2429c8854c70-utilities\") pod \"4fcf4871-1046-4079-b4d9-2429c8854c70\" (UID: \"4fcf4871-1046-4079-b4d9-2429c8854c70\") " Dec 03 07:34:13 crc kubenswrapper[4475]: I1203 07:34:13.402047 4475 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4fcf4871-1046-4079-b4d9-2429c8854c70-catalog-content\") pod \"4fcf4871-1046-4079-b4d9-2429c8854c70\" (UID: \"4fcf4871-1046-4079-b4d9-2429c8854c70\") " Dec 03 07:34:13 crc kubenswrapper[4475]: I1203 07:34:13.402066 4475 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lbxmt\" (UniqueName: \"kubernetes.io/projected/4fcf4871-1046-4079-b4d9-2429c8854c70-kube-api-access-lbxmt\") pod \"4fcf4871-1046-4079-b4d9-2429c8854c70\" (UID: \"4fcf4871-1046-4079-b4d9-2429c8854c70\") " Dec 03 07:34:13 crc kubenswrapper[4475]: I1203 07:34:13.405890 4475 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4fcf4871-1046-4079-b4d9-2429c8854c70-utilities" (OuterVolumeSpecName: "utilities") pod "4fcf4871-1046-4079-b4d9-2429c8854c70" (UID: "4fcf4871-1046-4079-b4d9-2429c8854c70"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 07:34:13 crc kubenswrapper[4475]: I1203 07:34:13.418403 4475 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4fcf4871-1046-4079-b4d9-2429c8854c70-kube-api-access-lbxmt" (OuterVolumeSpecName: "kube-api-access-lbxmt") pod "4fcf4871-1046-4079-b4d9-2429c8854c70" (UID: "4fcf4871-1046-4079-b4d9-2429c8854c70"). InnerVolumeSpecName "kube-api-access-lbxmt". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 07:34:13 crc kubenswrapper[4475]: I1203 07:34:13.429383 4475 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4fcf4871-1046-4079-b4d9-2429c8854c70-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "4fcf4871-1046-4079-b4d9-2429c8854c70" (UID: "4fcf4871-1046-4079-b4d9-2429c8854c70"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 07:34:13 crc kubenswrapper[4475]: I1203 07:34:13.504178 4475 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4fcf4871-1046-4079-b4d9-2429c8854c70-utilities\") on node \"crc\" DevicePath \"\"" Dec 03 07:34:13 crc kubenswrapper[4475]: I1203 07:34:13.504205 4475 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4fcf4871-1046-4079-b4d9-2429c8854c70-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 03 07:34:13 crc kubenswrapper[4475]: I1203 07:34:13.504216 4475 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lbxmt\" (UniqueName: \"kubernetes.io/projected/4fcf4871-1046-4079-b4d9-2429c8854c70-kube-api-access-lbxmt\") on node \"crc\" DevicePath \"\"" Dec 03 07:34:13 crc kubenswrapper[4475]: I1203 07:34:13.766871 4475 generic.go:334] "Generic (PLEG): container finished" podID="4fcf4871-1046-4079-b4d9-2429c8854c70" containerID="f49a5bfb8fcc03422bd3037a4b9b5c3a643ffcdcc43d5174f9c046b3b6d70b27" exitCode=0 Dec 03 07:34:13 crc kubenswrapper[4475]: I1203 07:34:13.766906 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-x22m9" event={"ID":"4fcf4871-1046-4079-b4d9-2429c8854c70","Type":"ContainerDied","Data":"f49a5bfb8fcc03422bd3037a4b9b5c3a643ffcdcc43d5174f9c046b3b6d70b27"} Dec 03 07:34:13 crc kubenswrapper[4475]: I1203 07:34:13.766930 4475 kubelet.go:2453] "SyncLoop (PLEG): event for 
pod" pod="openshift-marketplace/redhat-marketplace-x22m9" event={"ID":"4fcf4871-1046-4079-b4d9-2429c8854c70","Type":"ContainerDied","Data":"715bd4e90b65aeb0c5d9225da29caffee9a0eea718a6f93cffe127697a4dfe33"} Dec 03 07:34:13 crc kubenswrapper[4475]: I1203 07:34:13.766948 4475 scope.go:117] "RemoveContainer" containerID="f49a5bfb8fcc03422bd3037a4b9b5c3a643ffcdcc43d5174f9c046b3b6d70b27" Dec 03 07:34:13 crc kubenswrapper[4475]: I1203 07:34:13.767099 4475 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-x22m9" Dec 03 07:34:13 crc kubenswrapper[4475]: I1203 07:34:13.789581 4475 scope.go:117] "RemoveContainer" containerID="ef087b8959039c2541e32256a2df98b67cebce5397b3a55172a51c2e1f5b13aa" Dec 03 07:34:13 crc kubenswrapper[4475]: I1203 07:34:13.790702 4475 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-x22m9"] Dec 03 07:34:13 crc kubenswrapper[4475]: I1203 07:34:13.801141 4475 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-x22m9"] Dec 03 07:34:13 crc kubenswrapper[4475]: I1203 07:34:13.808379 4475 scope.go:117] "RemoveContainer" containerID="d8b75d93967bdb3f052b783a47093f9d1b59ac21f2012557eb1a740f4f425af0" Dec 03 07:34:13 crc kubenswrapper[4475]: I1203 07:34:13.841608 4475 scope.go:117] "RemoveContainer" containerID="f49a5bfb8fcc03422bd3037a4b9b5c3a643ffcdcc43d5174f9c046b3b6d70b27" Dec 03 07:34:13 crc kubenswrapper[4475]: E1203 07:34:13.844768 4475 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f49a5bfb8fcc03422bd3037a4b9b5c3a643ffcdcc43d5174f9c046b3b6d70b27\": container with ID starting with f49a5bfb8fcc03422bd3037a4b9b5c3a643ffcdcc43d5174f9c046b3b6d70b27 not found: ID does not exist" containerID="f49a5bfb8fcc03422bd3037a4b9b5c3a643ffcdcc43d5174f9c046b3b6d70b27" Dec 03 07:34:13 crc kubenswrapper[4475]: I1203 07:34:13.844808 4475 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f49a5bfb8fcc03422bd3037a4b9b5c3a643ffcdcc43d5174f9c046b3b6d70b27"} err="failed to get container status \"f49a5bfb8fcc03422bd3037a4b9b5c3a643ffcdcc43d5174f9c046b3b6d70b27\": rpc error: code = NotFound desc = could not find container \"f49a5bfb8fcc03422bd3037a4b9b5c3a643ffcdcc43d5174f9c046b3b6d70b27\": container with ID starting with f49a5bfb8fcc03422bd3037a4b9b5c3a643ffcdcc43d5174f9c046b3b6d70b27 not found: ID does not exist" Dec 03 07:34:13 crc kubenswrapper[4475]: I1203 07:34:13.844832 4475 scope.go:117] "RemoveContainer" containerID="ef087b8959039c2541e32256a2df98b67cebce5397b3a55172a51c2e1f5b13aa" Dec 03 07:34:13 crc kubenswrapper[4475]: E1203 07:34:13.845104 4475 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ef087b8959039c2541e32256a2df98b67cebce5397b3a55172a51c2e1f5b13aa\": container with ID starting with ef087b8959039c2541e32256a2df98b67cebce5397b3a55172a51c2e1f5b13aa not found: ID does not exist" containerID="ef087b8959039c2541e32256a2df98b67cebce5397b3a55172a51c2e1f5b13aa" Dec 03 07:34:13 crc kubenswrapper[4475]: I1203 07:34:13.845124 4475 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ef087b8959039c2541e32256a2df98b67cebce5397b3a55172a51c2e1f5b13aa"} err="failed to get container status \"ef087b8959039c2541e32256a2df98b67cebce5397b3a55172a51c2e1f5b13aa\": rpc error: code = NotFound desc = could not find container \"ef087b8959039c2541e32256a2df98b67cebce5397b3a55172a51c2e1f5b13aa\": container with ID starting with ef087b8959039c2541e32256a2df98b67cebce5397b3a55172a51c2e1f5b13aa not found: ID does not exist" Dec 03 07:34:13 crc kubenswrapper[4475]: I1203 07:34:13.845136 4475 scope.go:117] "RemoveContainer" containerID="d8b75d93967bdb3f052b783a47093f9d1b59ac21f2012557eb1a740f4f425af0" Dec 03 07:34:13 crc kubenswrapper[4475]: E1203 
07:34:13.845387 4475 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d8b75d93967bdb3f052b783a47093f9d1b59ac21f2012557eb1a740f4f425af0\": container with ID starting with d8b75d93967bdb3f052b783a47093f9d1b59ac21f2012557eb1a740f4f425af0 not found: ID does not exist" containerID="d8b75d93967bdb3f052b783a47093f9d1b59ac21f2012557eb1a740f4f425af0" Dec 03 07:34:13 crc kubenswrapper[4475]: I1203 07:34:13.845417 4475 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d8b75d93967bdb3f052b783a47093f9d1b59ac21f2012557eb1a740f4f425af0"} err="failed to get container status \"d8b75d93967bdb3f052b783a47093f9d1b59ac21f2012557eb1a740f4f425af0\": rpc error: code = NotFound desc = could not find container \"d8b75d93967bdb3f052b783a47093f9d1b59ac21f2012557eb1a740f4f425af0\": container with ID starting with d8b75d93967bdb3f052b783a47093f9d1b59ac21f2012557eb1a740f4f425af0 not found: ID does not exist" Dec 03 07:34:15 crc kubenswrapper[4475]: I1203 07:34:15.499684 4475 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4fcf4871-1046-4079-b4d9-2429c8854c70" path="/var/lib/kubelet/pods/4fcf4871-1046-4079-b4d9-2429c8854c70/volumes" Dec 03 07:36:28 crc kubenswrapper[4475]: I1203 07:36:28.934759 4475 patch_prober.go:28] interesting pod/machine-config-daemon-tjbzg container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 03 07:36:28 crc kubenswrapper[4475]: I1203 07:36:28.935339 4475 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-tjbzg" podUID="91aee7be-4a52-4598-803f-2deebe0674de" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection 
refused" Dec 03 07:36:58 crc kubenswrapper[4475]: I1203 07:36:58.933327 4475 patch_prober.go:28] interesting pod/machine-config-daemon-tjbzg container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 03 07:36:58 crc kubenswrapper[4475]: I1203 07:36:58.934339 4475 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-tjbzg" podUID="91aee7be-4a52-4598-803f-2deebe0674de" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 03 07:37:28 crc kubenswrapper[4475]: I1203 07:37:28.934031 4475 patch_prober.go:28] interesting pod/machine-config-daemon-tjbzg container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 03 07:37:28 crc kubenswrapper[4475]: I1203 07:37:28.934471 4475 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-tjbzg" podUID="91aee7be-4a52-4598-803f-2deebe0674de" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 03 07:37:28 crc kubenswrapper[4475]: I1203 07:37:28.934730 4475 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-tjbzg" Dec 03 07:37:28 crc kubenswrapper[4475]: I1203 07:37:28.936071 4475 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"dffbe6b147869881f4e9260f18420918b1c970e08be8aa0d2d9b606d54067de1"} 
pod="openshift-machine-config-operator/machine-config-daemon-tjbzg" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Dec 03 07:37:28 crc kubenswrapper[4475]: I1203 07:37:28.936516 4475 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-tjbzg" podUID="91aee7be-4a52-4598-803f-2deebe0674de" containerName="machine-config-daemon" containerID="cri-o://dffbe6b147869881f4e9260f18420918b1c970e08be8aa0d2d9b606d54067de1" gracePeriod=600 Dec 03 07:37:29 crc kubenswrapper[4475]: E1203 07:37:29.080967 4475 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-tjbzg_openshift-machine-config-operator(91aee7be-4a52-4598-803f-2deebe0674de)\"" pod="openshift-machine-config-operator/machine-config-daemon-tjbzg" podUID="91aee7be-4a52-4598-803f-2deebe0674de" Dec 03 07:37:29 crc kubenswrapper[4475]: I1203 07:37:29.990152 4475 generic.go:334] "Generic (PLEG): container finished" podID="91aee7be-4a52-4598-803f-2deebe0674de" containerID="dffbe6b147869881f4e9260f18420918b1c970e08be8aa0d2d9b606d54067de1" exitCode=0 Dec 03 07:37:29 crc kubenswrapper[4475]: I1203 07:37:29.990189 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-tjbzg" event={"ID":"91aee7be-4a52-4598-803f-2deebe0674de","Type":"ContainerDied","Data":"dffbe6b147869881f4e9260f18420918b1c970e08be8aa0d2d9b606d54067de1"} Dec 03 07:37:29 crc kubenswrapper[4475]: I1203 07:37:29.991624 4475 scope.go:117] "RemoveContainer" containerID="38011db8441fa258228ddb37b16a7964318696935f6daa20e615955ba9070bde" Dec 03 07:37:29 crc kubenswrapper[4475]: I1203 07:37:29.991702 4475 scope.go:117] "RemoveContainer" containerID="dffbe6b147869881f4e9260f18420918b1c970e08be8aa0d2d9b606d54067de1" Dec 
03 07:37:29 crc kubenswrapper[4475]: E1203 07:37:29.991935 4475 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-tjbzg_openshift-machine-config-operator(91aee7be-4a52-4598-803f-2deebe0674de)\"" pod="openshift-machine-config-operator/machine-config-daemon-tjbzg" podUID="91aee7be-4a52-4598-803f-2deebe0674de" Dec 03 07:37:43 crc kubenswrapper[4475]: I1203 07:37:43.491099 4475 scope.go:117] "RemoveContainer" containerID="dffbe6b147869881f4e9260f18420918b1c970e08be8aa0d2d9b606d54067de1" Dec 03 07:37:43 crc kubenswrapper[4475]: E1203 07:37:43.491809 4475 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-tjbzg_openshift-machine-config-operator(91aee7be-4a52-4598-803f-2deebe0674de)\"" pod="openshift-machine-config-operator/machine-config-daemon-tjbzg" podUID="91aee7be-4a52-4598-803f-2deebe0674de" Dec 03 07:37:54 crc kubenswrapper[4475]: I1203 07:37:54.491413 4475 scope.go:117] "RemoveContainer" containerID="dffbe6b147869881f4e9260f18420918b1c970e08be8aa0d2d9b606d54067de1" Dec 03 07:37:54 crc kubenswrapper[4475]: E1203 07:37:54.491942 4475 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-tjbzg_openshift-machine-config-operator(91aee7be-4a52-4598-803f-2deebe0674de)\"" pod="openshift-machine-config-operator/machine-config-daemon-tjbzg" podUID="91aee7be-4a52-4598-803f-2deebe0674de" Dec 03 07:38:07 crc kubenswrapper[4475]: I1203 07:38:07.490852 4475 scope.go:117] "RemoveContainer" 
containerID="dffbe6b147869881f4e9260f18420918b1c970e08be8aa0d2d9b606d54067de1" Dec 03 07:38:07 crc kubenswrapper[4475]: E1203 07:38:07.491332 4475 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-tjbzg_openshift-machine-config-operator(91aee7be-4a52-4598-803f-2deebe0674de)\"" pod="openshift-machine-config-operator/machine-config-daemon-tjbzg" podUID="91aee7be-4a52-4598-803f-2deebe0674de" Dec 03 07:38:21 crc kubenswrapper[4475]: I1203 07:38:21.491657 4475 scope.go:117] "RemoveContainer" containerID="dffbe6b147869881f4e9260f18420918b1c970e08be8aa0d2d9b606d54067de1" Dec 03 07:38:21 crc kubenswrapper[4475]: E1203 07:38:21.492386 4475 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-tjbzg_openshift-machine-config-operator(91aee7be-4a52-4598-803f-2deebe0674de)\"" pod="openshift-machine-config-operator/machine-config-daemon-tjbzg" podUID="91aee7be-4a52-4598-803f-2deebe0674de" Dec 03 07:38:34 crc kubenswrapper[4475]: I1203 07:38:34.491318 4475 scope.go:117] "RemoveContainer" containerID="dffbe6b147869881f4e9260f18420918b1c970e08be8aa0d2d9b606d54067de1" Dec 03 07:38:34 crc kubenswrapper[4475]: E1203 07:38:34.491823 4475 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-tjbzg_openshift-machine-config-operator(91aee7be-4a52-4598-803f-2deebe0674de)\"" pod="openshift-machine-config-operator/machine-config-daemon-tjbzg" podUID="91aee7be-4a52-4598-803f-2deebe0674de" Dec 03 07:38:45 crc kubenswrapper[4475]: I1203 07:38:45.496553 4475 scope.go:117] 
"RemoveContainer" containerID="dffbe6b147869881f4e9260f18420918b1c970e08be8aa0d2d9b606d54067de1" Dec 03 07:38:45 crc kubenswrapper[4475]: E1203 07:38:45.497949 4475 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-tjbzg_openshift-machine-config-operator(91aee7be-4a52-4598-803f-2deebe0674de)\"" pod="openshift-machine-config-operator/machine-config-daemon-tjbzg" podUID="91aee7be-4a52-4598-803f-2deebe0674de" Dec 03 07:38:58 crc kubenswrapper[4475]: I1203 07:38:58.491626 4475 scope.go:117] "RemoveContainer" containerID="dffbe6b147869881f4e9260f18420918b1c970e08be8aa0d2d9b606d54067de1" Dec 03 07:38:58 crc kubenswrapper[4475]: E1203 07:38:58.492901 4475 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-tjbzg_openshift-machine-config-operator(91aee7be-4a52-4598-803f-2deebe0674de)\"" pod="openshift-machine-config-operator/machine-config-daemon-tjbzg" podUID="91aee7be-4a52-4598-803f-2deebe0674de" Dec 03 07:39:12 crc kubenswrapper[4475]: I1203 07:39:12.491676 4475 scope.go:117] "RemoveContainer" containerID="dffbe6b147869881f4e9260f18420918b1c970e08be8aa0d2d9b606d54067de1" Dec 03 07:39:12 crc kubenswrapper[4475]: E1203 07:39:12.492174 4475 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-tjbzg_openshift-machine-config-operator(91aee7be-4a52-4598-803f-2deebe0674de)\"" pod="openshift-machine-config-operator/machine-config-daemon-tjbzg" podUID="91aee7be-4a52-4598-803f-2deebe0674de" Dec 03 07:39:26 crc kubenswrapper[4475]: I1203 07:39:26.491731 
4475 scope.go:117] "RemoveContainer" containerID="dffbe6b147869881f4e9260f18420918b1c970e08be8aa0d2d9b606d54067de1" Dec 03 07:39:26 crc kubenswrapper[4475]: E1203 07:39:26.492937 4475 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-tjbzg_openshift-machine-config-operator(91aee7be-4a52-4598-803f-2deebe0674de)\"" pod="openshift-machine-config-operator/machine-config-daemon-tjbzg" podUID="91aee7be-4a52-4598-803f-2deebe0674de" Dec 03 07:39:41 crc kubenswrapper[4475]: I1203 07:39:41.490935 4475 scope.go:117] "RemoveContainer" containerID="dffbe6b147869881f4e9260f18420918b1c970e08be8aa0d2d9b606d54067de1" Dec 03 07:39:41 crc kubenswrapper[4475]: E1203 07:39:41.491795 4475 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-tjbzg_openshift-machine-config-operator(91aee7be-4a52-4598-803f-2deebe0674de)\"" pod="openshift-machine-config-operator/machine-config-daemon-tjbzg" podUID="91aee7be-4a52-4598-803f-2deebe0674de" Dec 03 07:39:52 crc kubenswrapper[4475]: I1203 07:39:52.490681 4475 scope.go:117] "RemoveContainer" containerID="dffbe6b147869881f4e9260f18420918b1c970e08be8aa0d2d9b606d54067de1" Dec 03 07:39:52 crc kubenswrapper[4475]: E1203 07:39:52.491979 4475 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-tjbzg_openshift-machine-config-operator(91aee7be-4a52-4598-803f-2deebe0674de)\"" pod="openshift-machine-config-operator/machine-config-daemon-tjbzg" podUID="91aee7be-4a52-4598-803f-2deebe0674de" Dec 03 07:40:07 crc kubenswrapper[4475]: I1203 
07:40:07.491300 4475 scope.go:117] "RemoveContainer" containerID="dffbe6b147869881f4e9260f18420918b1c970e08be8aa0d2d9b606d54067de1" Dec 03 07:40:07 crc kubenswrapper[4475]: E1203 07:40:07.491898 4475 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-tjbzg_openshift-machine-config-operator(91aee7be-4a52-4598-803f-2deebe0674de)\"" pod="openshift-machine-config-operator/machine-config-daemon-tjbzg" podUID="91aee7be-4a52-4598-803f-2deebe0674de" Dec 03 07:40:18 crc kubenswrapper[4475]: I1203 07:40:18.491913 4475 scope.go:117] "RemoveContainer" containerID="dffbe6b147869881f4e9260f18420918b1c970e08be8aa0d2d9b606d54067de1" Dec 03 07:40:18 crc kubenswrapper[4475]: E1203 07:40:18.492513 4475 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-tjbzg_openshift-machine-config-operator(91aee7be-4a52-4598-803f-2deebe0674de)\"" pod="openshift-machine-config-operator/machine-config-daemon-tjbzg" podUID="91aee7be-4a52-4598-803f-2deebe0674de" Dec 03 07:40:29 crc kubenswrapper[4475]: I1203 07:40:29.491895 4475 scope.go:117] "RemoveContainer" containerID="dffbe6b147869881f4e9260f18420918b1c970e08be8aa0d2d9b606d54067de1" Dec 03 07:40:29 crc kubenswrapper[4475]: E1203 07:40:29.492460 4475 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-tjbzg_openshift-machine-config-operator(91aee7be-4a52-4598-803f-2deebe0674de)\"" pod="openshift-machine-config-operator/machine-config-daemon-tjbzg" podUID="91aee7be-4a52-4598-803f-2deebe0674de" Dec 03 07:40:41 crc 
kubenswrapper[4475]: I1203 07:40:41.491477 4475 scope.go:117] "RemoveContainer" containerID="dffbe6b147869881f4e9260f18420918b1c970e08be8aa0d2d9b606d54067de1" Dec 03 07:40:41 crc kubenswrapper[4475]: E1203 07:40:41.491995 4475 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-tjbzg_openshift-machine-config-operator(91aee7be-4a52-4598-803f-2deebe0674de)\"" pod="openshift-machine-config-operator/machine-config-daemon-tjbzg" podUID="91aee7be-4a52-4598-803f-2deebe0674de" Dec 03 07:40:55 crc kubenswrapper[4475]: I1203 07:40:55.496462 4475 scope.go:117] "RemoveContainer" containerID="dffbe6b147869881f4e9260f18420918b1c970e08be8aa0d2d9b606d54067de1" Dec 03 07:40:55 crc kubenswrapper[4475]: E1203 07:40:55.496990 4475 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-tjbzg_openshift-machine-config-operator(91aee7be-4a52-4598-803f-2deebe0674de)\"" pod="openshift-machine-config-operator/machine-config-daemon-tjbzg" podUID="91aee7be-4a52-4598-803f-2deebe0674de" Dec 03 07:41:10 crc kubenswrapper[4475]: I1203 07:41:10.491291 4475 scope.go:117] "RemoveContainer" containerID="dffbe6b147869881f4e9260f18420918b1c970e08be8aa0d2d9b606d54067de1" Dec 03 07:41:10 crc kubenswrapper[4475]: E1203 07:41:10.492079 4475 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-tjbzg_openshift-machine-config-operator(91aee7be-4a52-4598-803f-2deebe0674de)\"" pod="openshift-machine-config-operator/machine-config-daemon-tjbzg" podUID="91aee7be-4a52-4598-803f-2deebe0674de" Dec 
03 07:41:23 crc kubenswrapper[4475]: I1203 07:41:23.491501 4475 scope.go:117] "RemoveContainer" containerID="dffbe6b147869881f4e9260f18420918b1c970e08be8aa0d2d9b606d54067de1" Dec 03 07:41:23 crc kubenswrapper[4475]: E1203 07:41:23.492040 4475 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-tjbzg_openshift-machine-config-operator(91aee7be-4a52-4598-803f-2deebe0674de)\"" pod="openshift-machine-config-operator/machine-config-daemon-tjbzg" podUID="91aee7be-4a52-4598-803f-2deebe0674de" Dec 03 07:41:38 crc kubenswrapper[4475]: I1203 07:41:38.491038 4475 scope.go:117] "RemoveContainer" containerID="dffbe6b147869881f4e9260f18420918b1c970e08be8aa0d2d9b606d54067de1" Dec 03 07:41:38 crc kubenswrapper[4475]: E1203 07:41:38.491569 4475 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-tjbzg_openshift-machine-config-operator(91aee7be-4a52-4598-803f-2deebe0674de)\"" pod="openshift-machine-config-operator/machine-config-daemon-tjbzg" podUID="91aee7be-4a52-4598-803f-2deebe0674de" Dec 03 07:41:51 crc kubenswrapper[4475]: I1203 07:41:51.491297 4475 scope.go:117] "RemoveContainer" containerID="dffbe6b147869881f4e9260f18420918b1c970e08be8aa0d2d9b606d54067de1" Dec 03 07:41:51 crc kubenswrapper[4475]: E1203 07:41:51.492046 4475 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-tjbzg_openshift-machine-config-operator(91aee7be-4a52-4598-803f-2deebe0674de)\"" pod="openshift-machine-config-operator/machine-config-daemon-tjbzg" 
podUID="91aee7be-4a52-4598-803f-2deebe0674de" Dec 03 07:42:00 crc kubenswrapper[4475]: I1203 07:41:59.999782 4475 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-8gl2p"] Dec 03 07:42:00 crc kubenswrapper[4475]: E1203 07:42:00.006593 4475 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4fcf4871-1046-4079-b4d9-2429c8854c70" containerName="extract-utilities" Dec 03 07:42:00 crc kubenswrapper[4475]: I1203 07:42:00.006616 4475 state_mem.go:107] "Deleted CPUSet assignment" podUID="4fcf4871-1046-4079-b4d9-2429c8854c70" containerName="extract-utilities" Dec 03 07:42:00 crc kubenswrapper[4475]: E1203 07:42:00.006796 4475 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4fcf4871-1046-4079-b4d9-2429c8854c70" containerName="extract-content" Dec 03 07:42:00 crc kubenswrapper[4475]: I1203 07:42:00.006811 4475 state_mem.go:107] "Deleted CPUSet assignment" podUID="4fcf4871-1046-4079-b4d9-2429c8854c70" containerName="extract-content" Dec 03 07:42:00 crc kubenswrapper[4475]: E1203 07:42:00.006829 4475 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4fcf4871-1046-4079-b4d9-2429c8854c70" containerName="registry-server" Dec 03 07:42:00 crc kubenswrapper[4475]: I1203 07:42:00.006835 4475 state_mem.go:107] "Deleted CPUSet assignment" podUID="4fcf4871-1046-4079-b4d9-2429c8854c70" containerName="registry-server" Dec 03 07:42:00 crc kubenswrapper[4475]: I1203 07:42:00.007812 4475 memory_manager.go:354] "RemoveStaleState removing state" podUID="4fcf4871-1046-4079-b4d9-2429c8854c70" containerName="registry-server" Dec 03 07:42:00 crc kubenswrapper[4475]: I1203 07:42:00.010920 4475 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-8gl2p" Dec 03 07:42:00 crc kubenswrapper[4475]: I1203 07:42:00.092697 4475 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-8gl2p"] Dec 03 07:42:00 crc kubenswrapper[4475]: I1203 07:42:00.180488 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b9baa15a-ea3e-47ec-b3a1-11a83fd8b38c-utilities\") pod \"certified-operators-8gl2p\" (UID: \"b9baa15a-ea3e-47ec-b3a1-11a83fd8b38c\") " pod="openshift-marketplace/certified-operators-8gl2p" Dec 03 07:42:00 crc kubenswrapper[4475]: I1203 07:42:00.180687 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6fsf9\" (UniqueName: \"kubernetes.io/projected/b9baa15a-ea3e-47ec-b3a1-11a83fd8b38c-kube-api-access-6fsf9\") pod \"certified-operators-8gl2p\" (UID: \"b9baa15a-ea3e-47ec-b3a1-11a83fd8b38c\") " pod="openshift-marketplace/certified-operators-8gl2p" Dec 03 07:42:00 crc kubenswrapper[4475]: I1203 07:42:00.180833 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b9baa15a-ea3e-47ec-b3a1-11a83fd8b38c-catalog-content\") pod \"certified-operators-8gl2p\" (UID: \"b9baa15a-ea3e-47ec-b3a1-11a83fd8b38c\") " pod="openshift-marketplace/certified-operators-8gl2p" Dec 03 07:42:00 crc kubenswrapper[4475]: I1203 07:42:00.282087 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b9baa15a-ea3e-47ec-b3a1-11a83fd8b38c-catalog-content\") pod \"certified-operators-8gl2p\" (UID: \"b9baa15a-ea3e-47ec-b3a1-11a83fd8b38c\") " pod="openshift-marketplace/certified-operators-8gl2p" Dec 03 07:42:00 crc kubenswrapper[4475]: I1203 07:42:00.282204 4475 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b9baa15a-ea3e-47ec-b3a1-11a83fd8b38c-utilities\") pod \"certified-operators-8gl2p\" (UID: \"b9baa15a-ea3e-47ec-b3a1-11a83fd8b38c\") " pod="openshift-marketplace/certified-operators-8gl2p" Dec 03 07:42:00 crc kubenswrapper[4475]: I1203 07:42:00.282226 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6fsf9\" (UniqueName: \"kubernetes.io/projected/b9baa15a-ea3e-47ec-b3a1-11a83fd8b38c-kube-api-access-6fsf9\") pod \"certified-operators-8gl2p\" (UID: \"b9baa15a-ea3e-47ec-b3a1-11a83fd8b38c\") " pod="openshift-marketplace/certified-operators-8gl2p" Dec 03 07:42:00 crc kubenswrapper[4475]: I1203 07:42:00.285723 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b9baa15a-ea3e-47ec-b3a1-11a83fd8b38c-utilities\") pod \"certified-operators-8gl2p\" (UID: \"b9baa15a-ea3e-47ec-b3a1-11a83fd8b38c\") " pod="openshift-marketplace/certified-operators-8gl2p" Dec 03 07:42:00 crc kubenswrapper[4475]: I1203 07:42:00.286056 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b9baa15a-ea3e-47ec-b3a1-11a83fd8b38c-catalog-content\") pod \"certified-operators-8gl2p\" (UID: \"b9baa15a-ea3e-47ec-b3a1-11a83fd8b38c\") " pod="openshift-marketplace/certified-operators-8gl2p" Dec 03 07:42:00 crc kubenswrapper[4475]: I1203 07:42:00.308170 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6fsf9\" (UniqueName: \"kubernetes.io/projected/b9baa15a-ea3e-47ec-b3a1-11a83fd8b38c-kube-api-access-6fsf9\") pod \"certified-operators-8gl2p\" (UID: \"b9baa15a-ea3e-47ec-b3a1-11a83fd8b38c\") " pod="openshift-marketplace/certified-operators-8gl2p" Dec 03 07:42:00 crc kubenswrapper[4475]: I1203 07:42:00.334194 4475 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-8gl2p" Dec 03 07:42:01 crc kubenswrapper[4475]: I1203 07:42:01.068788 4475 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-8gl2p"] Dec 03 07:42:01 crc kubenswrapper[4475]: I1203 07:42:01.707712 4475 generic.go:334] "Generic (PLEG): container finished" podID="b9baa15a-ea3e-47ec-b3a1-11a83fd8b38c" containerID="6c6b4c929304d179ee3fd1e1f8bbc68aec3af98adad957dca2496cd27f49457a" exitCode=0 Dec 03 07:42:01 crc kubenswrapper[4475]: I1203 07:42:01.707795 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-8gl2p" event={"ID":"b9baa15a-ea3e-47ec-b3a1-11a83fd8b38c","Type":"ContainerDied","Data":"6c6b4c929304d179ee3fd1e1f8bbc68aec3af98adad957dca2496cd27f49457a"} Dec 03 07:42:01 crc kubenswrapper[4475]: I1203 07:42:01.707929 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-8gl2p" event={"ID":"b9baa15a-ea3e-47ec-b3a1-11a83fd8b38c","Type":"ContainerStarted","Data":"1199c2a7eb910a6c37e815d83bc50164d4c2c443ceca519b85e09e183c792754"} Dec 03 07:42:01 crc kubenswrapper[4475]: I1203 07:42:01.712428 4475 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Dec 03 07:42:02 crc kubenswrapper[4475]: I1203 07:42:02.715868 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-8gl2p" event={"ID":"b9baa15a-ea3e-47ec-b3a1-11a83fd8b38c","Type":"ContainerStarted","Data":"78bd89d8597ec96ce93233b5dae30a7451118b0202bea9300e035a2a7f807e17"} Dec 03 07:42:03 crc kubenswrapper[4475]: I1203 07:42:03.724886 4475 generic.go:334] "Generic (PLEG): container finished" podID="b9baa15a-ea3e-47ec-b3a1-11a83fd8b38c" containerID="78bd89d8597ec96ce93233b5dae30a7451118b0202bea9300e035a2a7f807e17" exitCode=0 Dec 03 07:42:03 crc kubenswrapper[4475]: I1203 07:42:03.724969 4475 kubelet.go:2453] "SyncLoop (PLEG): 
event for pod" pod="openshift-marketplace/certified-operators-8gl2p" event={"ID":"b9baa15a-ea3e-47ec-b3a1-11a83fd8b38c","Type":"ContainerDied","Data":"78bd89d8597ec96ce93233b5dae30a7451118b0202bea9300e035a2a7f807e17"} Dec 03 07:42:04 crc kubenswrapper[4475]: I1203 07:42:04.734416 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-8gl2p" event={"ID":"b9baa15a-ea3e-47ec-b3a1-11a83fd8b38c","Type":"ContainerStarted","Data":"c950dcc4598916959b60d69d4745227766c7bf916312692cbd1ccdfe8bbdf819"} Dec 03 07:42:04 crc kubenswrapper[4475]: I1203 07:42:04.752804 4475 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-8gl2p" podStartSLOduration=3.289608718 podStartE2EDuration="5.752361588s" podCreationTimestamp="2025-12-03 07:41:59 +0000 UTC" firstStartedPulling="2025-12-03 07:42:01.709200427 +0000 UTC m=+3406.514098951" lastFinishedPulling="2025-12-03 07:42:04.171953487 +0000 UTC m=+3408.976851821" observedRunningTime="2025-12-03 07:42:04.749411662 +0000 UTC m=+3409.554310016" watchObservedRunningTime="2025-12-03 07:42:04.752361588 +0000 UTC m=+3409.557259922" Dec 03 07:42:05 crc kubenswrapper[4475]: I1203 07:42:05.501102 4475 scope.go:117] "RemoveContainer" containerID="dffbe6b147869881f4e9260f18420918b1c970e08be8aa0d2d9b606d54067de1" Dec 03 07:42:05 crc kubenswrapper[4475]: E1203 07:42:05.501563 4475 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-tjbzg_openshift-machine-config-operator(91aee7be-4a52-4598-803f-2deebe0674de)\"" pod="openshift-machine-config-operator/machine-config-daemon-tjbzg" podUID="91aee7be-4a52-4598-803f-2deebe0674de" Dec 03 07:42:10 crc kubenswrapper[4475]: I1203 07:42:10.334656 4475 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" 
pod="openshift-marketplace/certified-operators-8gl2p" Dec 03 07:42:10 crc kubenswrapper[4475]: I1203 07:42:10.335651 4475 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-8gl2p" Dec 03 07:42:10 crc kubenswrapper[4475]: I1203 07:42:10.371713 4475 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-8gl2p" Dec 03 07:42:10 crc kubenswrapper[4475]: I1203 07:42:10.803296 4475 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-8gl2p" Dec 03 07:42:10 crc kubenswrapper[4475]: I1203 07:42:10.837033 4475 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-8gl2p"] Dec 03 07:42:12 crc kubenswrapper[4475]: I1203 07:42:12.783887 4475 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-8gl2p" podUID="b9baa15a-ea3e-47ec-b3a1-11a83fd8b38c" containerName="registry-server" containerID="cri-o://c950dcc4598916959b60d69d4745227766c7bf916312692cbd1ccdfe8bbdf819" gracePeriod=2 Dec 03 07:42:13 crc kubenswrapper[4475]: I1203 07:42:13.221108 4475 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-8gl2p" Dec 03 07:42:13 crc kubenswrapper[4475]: I1203 07:42:13.300910 4475 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b9baa15a-ea3e-47ec-b3a1-11a83fd8b38c-catalog-content\") pod \"b9baa15a-ea3e-47ec-b3a1-11a83fd8b38c\" (UID: \"b9baa15a-ea3e-47ec-b3a1-11a83fd8b38c\") " Dec 03 07:42:13 crc kubenswrapper[4475]: I1203 07:42:13.301092 4475 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6fsf9\" (UniqueName: \"kubernetes.io/projected/b9baa15a-ea3e-47ec-b3a1-11a83fd8b38c-kube-api-access-6fsf9\") pod \"b9baa15a-ea3e-47ec-b3a1-11a83fd8b38c\" (UID: \"b9baa15a-ea3e-47ec-b3a1-11a83fd8b38c\") " Dec 03 07:42:13 crc kubenswrapper[4475]: I1203 07:42:13.301228 4475 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b9baa15a-ea3e-47ec-b3a1-11a83fd8b38c-utilities\") pod \"b9baa15a-ea3e-47ec-b3a1-11a83fd8b38c\" (UID: \"b9baa15a-ea3e-47ec-b3a1-11a83fd8b38c\") " Dec 03 07:42:13 crc kubenswrapper[4475]: I1203 07:42:13.302580 4475 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b9baa15a-ea3e-47ec-b3a1-11a83fd8b38c-utilities" (OuterVolumeSpecName: "utilities") pod "b9baa15a-ea3e-47ec-b3a1-11a83fd8b38c" (UID: "b9baa15a-ea3e-47ec-b3a1-11a83fd8b38c"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 07:42:13 crc kubenswrapper[4475]: I1203 07:42:13.308874 4475 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b9baa15a-ea3e-47ec-b3a1-11a83fd8b38c-kube-api-access-6fsf9" (OuterVolumeSpecName: "kube-api-access-6fsf9") pod "b9baa15a-ea3e-47ec-b3a1-11a83fd8b38c" (UID: "b9baa15a-ea3e-47ec-b3a1-11a83fd8b38c"). InnerVolumeSpecName "kube-api-access-6fsf9". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 07:42:13 crc kubenswrapper[4475]: I1203 07:42:13.340783 4475 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b9baa15a-ea3e-47ec-b3a1-11a83fd8b38c-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "b9baa15a-ea3e-47ec-b3a1-11a83fd8b38c" (UID: "b9baa15a-ea3e-47ec-b3a1-11a83fd8b38c"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 07:42:13 crc kubenswrapper[4475]: I1203 07:42:13.404231 4475 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6fsf9\" (UniqueName: \"kubernetes.io/projected/b9baa15a-ea3e-47ec-b3a1-11a83fd8b38c-kube-api-access-6fsf9\") on node \"crc\" DevicePath \"\"" Dec 03 07:42:13 crc kubenswrapper[4475]: I1203 07:42:13.404261 4475 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b9baa15a-ea3e-47ec-b3a1-11a83fd8b38c-utilities\") on node \"crc\" DevicePath \"\"" Dec 03 07:42:13 crc kubenswrapper[4475]: I1203 07:42:13.404272 4475 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b9baa15a-ea3e-47ec-b3a1-11a83fd8b38c-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 03 07:42:13 crc kubenswrapper[4475]: I1203 07:42:13.791344 4475 generic.go:334] "Generic (PLEG): container finished" podID="b9baa15a-ea3e-47ec-b3a1-11a83fd8b38c" containerID="c950dcc4598916959b60d69d4745227766c7bf916312692cbd1ccdfe8bbdf819" exitCode=0 Dec 03 07:42:13 crc kubenswrapper[4475]: I1203 07:42:13.791385 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-8gl2p" event={"ID":"b9baa15a-ea3e-47ec-b3a1-11a83fd8b38c","Type":"ContainerDied","Data":"c950dcc4598916959b60d69d4745227766c7bf916312692cbd1ccdfe8bbdf819"} Dec 03 07:42:13 crc kubenswrapper[4475]: I1203 07:42:13.791410 4475 kubelet.go:2453] "SyncLoop (PLEG): event 
for pod" pod="openshift-marketplace/certified-operators-8gl2p" event={"ID":"b9baa15a-ea3e-47ec-b3a1-11a83fd8b38c","Type":"ContainerDied","Data":"1199c2a7eb910a6c37e815d83bc50164d4c2c443ceca519b85e09e183c792754"} Dec 03 07:42:13 crc kubenswrapper[4475]: I1203 07:42:13.791428 4475 scope.go:117] "RemoveContainer" containerID="c950dcc4598916959b60d69d4745227766c7bf916312692cbd1ccdfe8bbdf819" Dec 03 07:42:13 crc kubenswrapper[4475]: I1203 07:42:13.791425 4475 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-8gl2p" Dec 03 07:42:13 crc kubenswrapper[4475]: I1203 07:42:13.808570 4475 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-8gl2p"] Dec 03 07:42:13 crc kubenswrapper[4475]: I1203 07:42:13.811235 4475 scope.go:117] "RemoveContainer" containerID="78bd89d8597ec96ce93233b5dae30a7451118b0202bea9300e035a2a7f807e17" Dec 03 07:42:13 crc kubenswrapper[4475]: I1203 07:42:13.818569 4475 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-8gl2p"] Dec 03 07:42:13 crc kubenswrapper[4475]: I1203 07:42:13.831968 4475 scope.go:117] "RemoveContainer" containerID="6c6b4c929304d179ee3fd1e1f8bbc68aec3af98adad957dca2496cd27f49457a" Dec 03 07:42:13 crc kubenswrapper[4475]: I1203 07:42:13.871528 4475 scope.go:117] "RemoveContainer" containerID="c950dcc4598916959b60d69d4745227766c7bf916312692cbd1ccdfe8bbdf819" Dec 03 07:42:13 crc kubenswrapper[4475]: E1203 07:42:13.872945 4475 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c950dcc4598916959b60d69d4745227766c7bf916312692cbd1ccdfe8bbdf819\": container with ID starting with c950dcc4598916959b60d69d4745227766c7bf916312692cbd1ccdfe8bbdf819 not found: ID does not exist" containerID="c950dcc4598916959b60d69d4745227766c7bf916312692cbd1ccdfe8bbdf819" Dec 03 07:42:13 crc kubenswrapper[4475]: I1203 
07:42:13.873295 4475 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c950dcc4598916959b60d69d4745227766c7bf916312692cbd1ccdfe8bbdf819"} err="failed to get container status \"c950dcc4598916959b60d69d4745227766c7bf916312692cbd1ccdfe8bbdf819\": rpc error: code = NotFound desc = could not find container \"c950dcc4598916959b60d69d4745227766c7bf916312692cbd1ccdfe8bbdf819\": container with ID starting with c950dcc4598916959b60d69d4745227766c7bf916312692cbd1ccdfe8bbdf819 not found: ID does not exist" Dec 03 07:42:13 crc kubenswrapper[4475]: I1203 07:42:13.873322 4475 scope.go:117] "RemoveContainer" containerID="78bd89d8597ec96ce93233b5dae30a7451118b0202bea9300e035a2a7f807e17" Dec 03 07:42:13 crc kubenswrapper[4475]: E1203 07:42:13.873743 4475 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"78bd89d8597ec96ce93233b5dae30a7451118b0202bea9300e035a2a7f807e17\": container with ID starting with 78bd89d8597ec96ce93233b5dae30a7451118b0202bea9300e035a2a7f807e17 not found: ID does not exist" containerID="78bd89d8597ec96ce93233b5dae30a7451118b0202bea9300e035a2a7f807e17" Dec 03 07:42:13 crc kubenswrapper[4475]: I1203 07:42:13.873773 4475 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"78bd89d8597ec96ce93233b5dae30a7451118b0202bea9300e035a2a7f807e17"} err="failed to get container status \"78bd89d8597ec96ce93233b5dae30a7451118b0202bea9300e035a2a7f807e17\": rpc error: code = NotFound desc = could not find container \"78bd89d8597ec96ce93233b5dae30a7451118b0202bea9300e035a2a7f807e17\": container with ID starting with 78bd89d8597ec96ce93233b5dae30a7451118b0202bea9300e035a2a7f807e17 not found: ID does not exist" Dec 03 07:42:13 crc kubenswrapper[4475]: I1203 07:42:13.873796 4475 scope.go:117] "RemoveContainer" containerID="6c6b4c929304d179ee3fd1e1f8bbc68aec3af98adad957dca2496cd27f49457a" Dec 03 07:42:13 crc 
kubenswrapper[4475]: E1203 07:42:13.874148 4475 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6c6b4c929304d179ee3fd1e1f8bbc68aec3af98adad957dca2496cd27f49457a\": container with ID starting with 6c6b4c929304d179ee3fd1e1f8bbc68aec3af98adad957dca2496cd27f49457a not found: ID does not exist" containerID="6c6b4c929304d179ee3fd1e1f8bbc68aec3af98adad957dca2496cd27f49457a" Dec 03 07:42:13 crc kubenswrapper[4475]: I1203 07:42:13.874224 4475 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6c6b4c929304d179ee3fd1e1f8bbc68aec3af98adad957dca2496cd27f49457a"} err="failed to get container status \"6c6b4c929304d179ee3fd1e1f8bbc68aec3af98adad957dca2496cd27f49457a\": rpc error: code = NotFound desc = could not find container \"6c6b4c929304d179ee3fd1e1f8bbc68aec3af98adad957dca2496cd27f49457a\": container with ID starting with 6c6b4c929304d179ee3fd1e1f8bbc68aec3af98adad957dca2496cd27f49457a not found: ID does not exist" Dec 03 07:42:15 crc kubenswrapper[4475]: I1203 07:42:15.498830 4475 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b9baa15a-ea3e-47ec-b3a1-11a83fd8b38c" path="/var/lib/kubelet/pods/b9baa15a-ea3e-47ec-b3a1-11a83fd8b38c/volumes" Dec 03 07:42:19 crc kubenswrapper[4475]: I1203 07:42:19.491584 4475 scope.go:117] "RemoveContainer" containerID="dffbe6b147869881f4e9260f18420918b1c970e08be8aa0d2d9b606d54067de1" Dec 03 07:42:19 crc kubenswrapper[4475]: E1203 07:42:19.492097 4475 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-tjbzg_openshift-machine-config-operator(91aee7be-4a52-4598-803f-2deebe0674de)\"" pod="openshift-machine-config-operator/machine-config-daemon-tjbzg" podUID="91aee7be-4a52-4598-803f-2deebe0674de" Dec 03 07:42:33 crc 
kubenswrapper[4475]: I1203 07:42:33.491414 4475 scope.go:117] "RemoveContainer" containerID="dffbe6b147869881f4e9260f18420918b1c970e08be8aa0d2d9b606d54067de1" Dec 03 07:42:33 crc kubenswrapper[4475]: I1203 07:42:33.755945 4475 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-m2882"] Dec 03 07:42:33 crc kubenswrapper[4475]: E1203 07:42:33.758363 4475 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b9baa15a-ea3e-47ec-b3a1-11a83fd8b38c" containerName="extract-content" Dec 03 07:42:33 crc kubenswrapper[4475]: I1203 07:42:33.758380 4475 state_mem.go:107] "Deleted CPUSet assignment" podUID="b9baa15a-ea3e-47ec-b3a1-11a83fd8b38c" containerName="extract-content" Dec 03 07:42:33 crc kubenswrapper[4475]: E1203 07:42:33.758392 4475 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b9baa15a-ea3e-47ec-b3a1-11a83fd8b38c" containerName="extract-utilities" Dec 03 07:42:33 crc kubenswrapper[4475]: I1203 07:42:33.758398 4475 state_mem.go:107] "Deleted CPUSet assignment" podUID="b9baa15a-ea3e-47ec-b3a1-11a83fd8b38c" containerName="extract-utilities" Dec 03 07:42:33 crc kubenswrapper[4475]: E1203 07:42:33.758416 4475 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b9baa15a-ea3e-47ec-b3a1-11a83fd8b38c" containerName="registry-server" Dec 03 07:42:33 crc kubenswrapper[4475]: I1203 07:42:33.758421 4475 state_mem.go:107] "Deleted CPUSet assignment" podUID="b9baa15a-ea3e-47ec-b3a1-11a83fd8b38c" containerName="registry-server" Dec 03 07:42:33 crc kubenswrapper[4475]: I1203 07:42:33.758618 4475 memory_manager.go:354] "RemoveStaleState removing state" podUID="b9baa15a-ea3e-47ec-b3a1-11a83fd8b38c" containerName="registry-server" Dec 03 07:42:33 crc kubenswrapper[4475]: I1203 07:42:33.761102 4475 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-m2882" Dec 03 07:42:33 crc kubenswrapper[4475]: I1203 07:42:33.775176 4475 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-m2882"] Dec 03 07:42:33 crc kubenswrapper[4475]: I1203 07:42:33.907478 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bg98n\" (UniqueName: \"kubernetes.io/projected/6eb08d92-2424-4829-9c94-b9ae344322c0-kube-api-access-bg98n\") pod \"community-operators-m2882\" (UID: \"6eb08d92-2424-4829-9c94-b9ae344322c0\") " pod="openshift-marketplace/community-operators-m2882" Dec 03 07:42:33 crc kubenswrapper[4475]: I1203 07:42:33.907692 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6eb08d92-2424-4829-9c94-b9ae344322c0-catalog-content\") pod \"community-operators-m2882\" (UID: \"6eb08d92-2424-4829-9c94-b9ae344322c0\") " pod="openshift-marketplace/community-operators-m2882" Dec 03 07:42:33 crc kubenswrapper[4475]: I1203 07:42:33.907877 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6eb08d92-2424-4829-9c94-b9ae344322c0-utilities\") pod \"community-operators-m2882\" (UID: \"6eb08d92-2424-4829-9c94-b9ae344322c0\") " pod="openshift-marketplace/community-operators-m2882" Dec 03 07:42:33 crc kubenswrapper[4475]: I1203 07:42:33.928995 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-tjbzg" event={"ID":"91aee7be-4a52-4598-803f-2deebe0674de","Type":"ContainerStarted","Data":"3695e88ada63d499e9a8b61f83dfed5c133f2615e66634fbdd03d323806353a7"} Dec 03 07:42:34 crc kubenswrapper[4475]: I1203 07:42:34.009038 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6eb08d92-2424-4829-9c94-b9ae344322c0-catalog-content\") pod \"community-operators-m2882\" (UID: \"6eb08d92-2424-4829-9c94-b9ae344322c0\") " pod="openshift-marketplace/community-operators-m2882" Dec 03 07:42:34 crc kubenswrapper[4475]: I1203 07:42:34.009269 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6eb08d92-2424-4829-9c94-b9ae344322c0-utilities\") pod \"community-operators-m2882\" (UID: \"6eb08d92-2424-4829-9c94-b9ae344322c0\") " pod="openshift-marketplace/community-operators-m2882" Dec 03 07:42:34 crc kubenswrapper[4475]: I1203 07:42:34.009345 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bg98n\" (UniqueName: \"kubernetes.io/projected/6eb08d92-2424-4829-9c94-b9ae344322c0-kube-api-access-bg98n\") pod \"community-operators-m2882\" (UID: \"6eb08d92-2424-4829-9c94-b9ae344322c0\") " pod="openshift-marketplace/community-operators-m2882" Dec 03 07:42:34 crc kubenswrapper[4475]: I1203 07:42:34.009972 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6eb08d92-2424-4829-9c94-b9ae344322c0-catalog-content\") pod \"community-operators-m2882\" (UID: \"6eb08d92-2424-4829-9c94-b9ae344322c0\") " pod="openshift-marketplace/community-operators-m2882" Dec 03 07:42:34 crc kubenswrapper[4475]: I1203 07:42:34.010166 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6eb08d92-2424-4829-9c94-b9ae344322c0-utilities\") pod \"community-operators-m2882\" (UID: \"6eb08d92-2424-4829-9c94-b9ae344322c0\") " pod="openshift-marketplace/community-operators-m2882" Dec 03 07:42:34 crc kubenswrapper[4475]: I1203 07:42:34.032194 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bg98n\" (UniqueName: 
\"kubernetes.io/projected/6eb08d92-2424-4829-9c94-b9ae344322c0-kube-api-access-bg98n\") pod \"community-operators-m2882\" (UID: \"6eb08d92-2424-4829-9c94-b9ae344322c0\") " pod="openshift-marketplace/community-operators-m2882" Dec 03 07:42:34 crc kubenswrapper[4475]: I1203 07:42:34.077481 4475 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-m2882" Dec 03 07:42:34 crc kubenswrapper[4475]: I1203 07:42:34.485670 4475 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-m2882"] Dec 03 07:42:34 crc kubenswrapper[4475]: I1203 07:42:34.936421 4475 generic.go:334] "Generic (PLEG): container finished" podID="6eb08d92-2424-4829-9c94-b9ae344322c0" containerID="2a89d86542ddc31f5556ba9c571def10e6f6e5e2bef60effa2e0fe75e9564dc8" exitCode=0 Dec 03 07:42:34 crc kubenswrapper[4475]: I1203 07:42:34.936513 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-m2882" event={"ID":"6eb08d92-2424-4829-9c94-b9ae344322c0","Type":"ContainerDied","Data":"2a89d86542ddc31f5556ba9c571def10e6f6e5e2bef60effa2e0fe75e9564dc8"} Dec 03 07:42:34 crc kubenswrapper[4475]: I1203 07:42:34.936639 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-m2882" event={"ID":"6eb08d92-2424-4829-9c94-b9ae344322c0","Type":"ContainerStarted","Data":"0685c1cb71a4d19152284b3b7b843a76cfcfcd793903ef6516cec855a03e6803"} Dec 03 07:42:35 crc kubenswrapper[4475]: I1203 07:42:35.946305 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-m2882" event={"ID":"6eb08d92-2424-4829-9c94-b9ae344322c0","Type":"ContainerStarted","Data":"4a33be16da1b36291d8efb0849e7f5f8c1a8a52156c50f2e540b376edd80875c"} Dec 03 07:42:36 crc kubenswrapper[4475]: I1203 07:42:36.955081 4475 generic.go:334] "Generic (PLEG): container finished" podID="6eb08d92-2424-4829-9c94-b9ae344322c0" 
containerID="4a33be16da1b36291d8efb0849e7f5f8c1a8a52156c50f2e540b376edd80875c" exitCode=0 Dec 03 07:42:36 crc kubenswrapper[4475]: I1203 07:42:36.955120 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-m2882" event={"ID":"6eb08d92-2424-4829-9c94-b9ae344322c0","Type":"ContainerDied","Data":"4a33be16da1b36291d8efb0849e7f5f8c1a8a52156c50f2e540b376edd80875c"} Dec 03 07:42:37 crc kubenswrapper[4475]: I1203 07:42:37.964155 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-m2882" event={"ID":"6eb08d92-2424-4829-9c94-b9ae344322c0","Type":"ContainerStarted","Data":"505d81eaf8d39bdd450ca30c5530e4f1f574178bbe5365b3762756ab33bb4810"} Dec 03 07:42:37 crc kubenswrapper[4475]: I1203 07:42:37.981468 4475 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-m2882" podStartSLOduration=2.245405919 podStartE2EDuration="4.981432147s" podCreationTimestamp="2025-12-03 07:42:33 +0000 UTC" firstStartedPulling="2025-12-03 07:42:34.93882053 +0000 UTC m=+3439.743718863" lastFinishedPulling="2025-12-03 07:42:37.674846767 +0000 UTC m=+3442.479745091" observedRunningTime="2025-12-03 07:42:37.977391301 +0000 UTC m=+3442.782289635" watchObservedRunningTime="2025-12-03 07:42:37.981432147 +0000 UTC m=+3442.786330482" Dec 03 07:42:44 crc kubenswrapper[4475]: I1203 07:42:44.078079 4475 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-m2882" Dec 03 07:42:44 crc kubenswrapper[4475]: I1203 07:42:44.078411 4475 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-m2882" Dec 03 07:42:44 crc kubenswrapper[4475]: I1203 07:42:44.112342 4475 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-m2882" Dec 03 07:42:45 crc kubenswrapper[4475]: I1203 
07:42:45.053643 4475 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-m2882" Dec 03 07:42:45 crc kubenswrapper[4475]: I1203 07:42:45.101663 4475 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-m2882"] Dec 03 07:42:47 crc kubenswrapper[4475]: I1203 07:42:47.031911 4475 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-m2882" podUID="6eb08d92-2424-4829-9c94-b9ae344322c0" containerName="registry-server" containerID="cri-o://505d81eaf8d39bdd450ca30c5530e4f1f574178bbe5365b3762756ab33bb4810" gracePeriod=2 Dec 03 07:42:47 crc kubenswrapper[4475]: I1203 07:42:47.446573 4475 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-m2882" Dec 03 07:42:47 crc kubenswrapper[4475]: I1203 07:42:47.520231 4475 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6eb08d92-2424-4829-9c94-b9ae344322c0-utilities\") pod \"6eb08d92-2424-4829-9c94-b9ae344322c0\" (UID: \"6eb08d92-2424-4829-9c94-b9ae344322c0\") " Dec 03 07:42:47 crc kubenswrapper[4475]: I1203 07:42:47.520426 4475 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bg98n\" (UniqueName: \"kubernetes.io/projected/6eb08d92-2424-4829-9c94-b9ae344322c0-kube-api-access-bg98n\") pod \"6eb08d92-2424-4829-9c94-b9ae344322c0\" (UID: \"6eb08d92-2424-4829-9c94-b9ae344322c0\") " Dec 03 07:42:47 crc kubenswrapper[4475]: I1203 07:42:47.520474 4475 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6eb08d92-2424-4829-9c94-b9ae344322c0-catalog-content\") pod \"6eb08d92-2424-4829-9c94-b9ae344322c0\" (UID: \"6eb08d92-2424-4829-9c94-b9ae344322c0\") " Dec 03 07:42:47 crc kubenswrapper[4475]: 
I1203 07:42:47.520794 4475 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6eb08d92-2424-4829-9c94-b9ae344322c0-utilities" (OuterVolumeSpecName: "utilities") pod "6eb08d92-2424-4829-9c94-b9ae344322c0" (UID: "6eb08d92-2424-4829-9c94-b9ae344322c0"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 07:42:47 crc kubenswrapper[4475]: I1203 07:42:47.521428 4475 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6eb08d92-2424-4829-9c94-b9ae344322c0-utilities\") on node \"crc\" DevicePath \"\"" Dec 03 07:42:47 crc kubenswrapper[4475]: I1203 07:42:47.529284 4475 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6eb08d92-2424-4829-9c94-b9ae344322c0-kube-api-access-bg98n" (OuterVolumeSpecName: "kube-api-access-bg98n") pod "6eb08d92-2424-4829-9c94-b9ae344322c0" (UID: "6eb08d92-2424-4829-9c94-b9ae344322c0"). InnerVolumeSpecName "kube-api-access-bg98n". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 07:42:47 crc kubenswrapper[4475]: I1203 07:42:47.562744 4475 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6eb08d92-2424-4829-9c94-b9ae344322c0-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "6eb08d92-2424-4829-9c94-b9ae344322c0" (UID: "6eb08d92-2424-4829-9c94-b9ae344322c0"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 07:42:47 crc kubenswrapper[4475]: I1203 07:42:47.622797 4475 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bg98n\" (UniqueName: \"kubernetes.io/projected/6eb08d92-2424-4829-9c94-b9ae344322c0-kube-api-access-bg98n\") on node \"crc\" DevicePath \"\"" Dec 03 07:42:47 crc kubenswrapper[4475]: I1203 07:42:47.622824 4475 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6eb08d92-2424-4829-9c94-b9ae344322c0-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 03 07:42:48 crc kubenswrapper[4475]: I1203 07:42:48.039063 4475 generic.go:334] "Generic (PLEG): container finished" podID="6eb08d92-2424-4829-9c94-b9ae344322c0" containerID="505d81eaf8d39bdd450ca30c5530e4f1f574178bbe5365b3762756ab33bb4810" exitCode=0 Dec 03 07:42:48 crc kubenswrapper[4475]: I1203 07:42:48.039142 4475 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-m2882" Dec 03 07:42:48 crc kubenswrapper[4475]: I1203 07:42:48.039145 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-m2882" event={"ID":"6eb08d92-2424-4829-9c94-b9ae344322c0","Type":"ContainerDied","Data":"505d81eaf8d39bdd450ca30c5530e4f1f574178bbe5365b3762756ab33bb4810"} Dec 03 07:42:48 crc kubenswrapper[4475]: I1203 07:42:48.039410 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-m2882" event={"ID":"6eb08d92-2424-4829-9c94-b9ae344322c0","Type":"ContainerDied","Data":"0685c1cb71a4d19152284b3b7b843a76cfcfcd793903ef6516cec855a03e6803"} Dec 03 07:42:48 crc kubenswrapper[4475]: I1203 07:42:48.039448 4475 scope.go:117] "RemoveContainer" containerID="505d81eaf8d39bdd450ca30c5530e4f1f574178bbe5365b3762756ab33bb4810" Dec 03 07:42:48 crc kubenswrapper[4475]: I1203 07:42:48.057149 4475 scope.go:117] "RemoveContainer" 
containerID="4a33be16da1b36291d8efb0849e7f5f8c1a8a52156c50f2e540b376edd80875c" Dec 03 07:42:48 crc kubenswrapper[4475]: I1203 07:42:48.068664 4475 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-m2882"] Dec 03 07:42:48 crc kubenswrapper[4475]: I1203 07:42:48.073501 4475 scope.go:117] "RemoveContainer" containerID="2a89d86542ddc31f5556ba9c571def10e6f6e5e2bef60effa2e0fe75e9564dc8" Dec 03 07:42:48 crc kubenswrapper[4475]: I1203 07:42:48.076380 4475 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-m2882"] Dec 03 07:42:48 crc kubenswrapper[4475]: I1203 07:42:48.104076 4475 scope.go:117] "RemoveContainer" containerID="505d81eaf8d39bdd450ca30c5530e4f1f574178bbe5365b3762756ab33bb4810" Dec 03 07:42:48 crc kubenswrapper[4475]: E1203 07:42:48.104402 4475 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"505d81eaf8d39bdd450ca30c5530e4f1f574178bbe5365b3762756ab33bb4810\": container with ID starting with 505d81eaf8d39bdd450ca30c5530e4f1f574178bbe5365b3762756ab33bb4810 not found: ID does not exist" containerID="505d81eaf8d39bdd450ca30c5530e4f1f574178bbe5365b3762756ab33bb4810" Dec 03 07:42:48 crc kubenswrapper[4475]: I1203 07:42:48.104464 4475 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"505d81eaf8d39bdd450ca30c5530e4f1f574178bbe5365b3762756ab33bb4810"} err="failed to get container status \"505d81eaf8d39bdd450ca30c5530e4f1f574178bbe5365b3762756ab33bb4810\": rpc error: code = NotFound desc = could not find container \"505d81eaf8d39bdd450ca30c5530e4f1f574178bbe5365b3762756ab33bb4810\": container with ID starting with 505d81eaf8d39bdd450ca30c5530e4f1f574178bbe5365b3762756ab33bb4810 not found: ID does not exist" Dec 03 07:42:48 crc kubenswrapper[4475]: I1203 07:42:48.104491 4475 scope.go:117] "RemoveContainer" 
containerID="4a33be16da1b36291d8efb0849e7f5f8c1a8a52156c50f2e540b376edd80875c" Dec 03 07:42:48 crc kubenswrapper[4475]: E1203 07:42:48.104762 4475 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4a33be16da1b36291d8efb0849e7f5f8c1a8a52156c50f2e540b376edd80875c\": container with ID starting with 4a33be16da1b36291d8efb0849e7f5f8c1a8a52156c50f2e540b376edd80875c not found: ID does not exist" containerID="4a33be16da1b36291d8efb0849e7f5f8c1a8a52156c50f2e540b376edd80875c" Dec 03 07:42:48 crc kubenswrapper[4475]: I1203 07:42:48.104791 4475 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4a33be16da1b36291d8efb0849e7f5f8c1a8a52156c50f2e540b376edd80875c"} err="failed to get container status \"4a33be16da1b36291d8efb0849e7f5f8c1a8a52156c50f2e540b376edd80875c\": rpc error: code = NotFound desc = could not find container \"4a33be16da1b36291d8efb0849e7f5f8c1a8a52156c50f2e540b376edd80875c\": container with ID starting with 4a33be16da1b36291d8efb0849e7f5f8c1a8a52156c50f2e540b376edd80875c not found: ID does not exist" Dec 03 07:42:48 crc kubenswrapper[4475]: I1203 07:42:48.104812 4475 scope.go:117] "RemoveContainer" containerID="2a89d86542ddc31f5556ba9c571def10e6f6e5e2bef60effa2e0fe75e9564dc8" Dec 03 07:42:48 crc kubenswrapper[4475]: E1203 07:42:48.105103 4475 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2a89d86542ddc31f5556ba9c571def10e6f6e5e2bef60effa2e0fe75e9564dc8\": container with ID starting with 2a89d86542ddc31f5556ba9c571def10e6f6e5e2bef60effa2e0fe75e9564dc8 not found: ID does not exist" containerID="2a89d86542ddc31f5556ba9c571def10e6f6e5e2bef60effa2e0fe75e9564dc8" Dec 03 07:42:48 crc kubenswrapper[4475]: I1203 07:42:48.105188 4475 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"2a89d86542ddc31f5556ba9c571def10e6f6e5e2bef60effa2e0fe75e9564dc8"} err="failed to get container status \"2a89d86542ddc31f5556ba9c571def10e6f6e5e2bef60effa2e0fe75e9564dc8\": rpc error: code = NotFound desc = could not find container \"2a89d86542ddc31f5556ba9c571def10e6f6e5e2bef60effa2e0fe75e9564dc8\": container with ID starting with 2a89d86542ddc31f5556ba9c571def10e6f6e5e2bef60effa2e0fe75e9564dc8 not found: ID does not exist" Dec 03 07:42:49 crc kubenswrapper[4475]: I1203 07:42:49.503415 4475 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6eb08d92-2424-4829-9c94-b9ae344322c0" path="/var/lib/kubelet/pods/6eb08d92-2424-4829-9c94-b9ae344322c0/volumes" Dec 03 07:43:22 crc kubenswrapper[4475]: I1203 07:43:22.591978 4475 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-cd759"] Dec 03 07:43:22 crc kubenswrapper[4475]: E1203 07:43:22.592624 4475 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6eb08d92-2424-4829-9c94-b9ae344322c0" containerName="extract-utilities" Dec 03 07:43:22 crc kubenswrapper[4475]: I1203 07:43:22.592636 4475 state_mem.go:107] "Deleted CPUSet assignment" podUID="6eb08d92-2424-4829-9c94-b9ae344322c0" containerName="extract-utilities" Dec 03 07:43:22 crc kubenswrapper[4475]: E1203 07:43:22.592654 4475 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6eb08d92-2424-4829-9c94-b9ae344322c0" containerName="registry-server" Dec 03 07:43:22 crc kubenswrapper[4475]: I1203 07:43:22.592659 4475 state_mem.go:107] "Deleted CPUSet assignment" podUID="6eb08d92-2424-4829-9c94-b9ae344322c0" containerName="registry-server" Dec 03 07:43:22 crc kubenswrapper[4475]: E1203 07:43:22.592672 4475 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6eb08d92-2424-4829-9c94-b9ae344322c0" containerName="extract-content" Dec 03 07:43:22 crc kubenswrapper[4475]: I1203 07:43:22.592678 4475 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="6eb08d92-2424-4829-9c94-b9ae344322c0" containerName="extract-content" Dec 03 07:43:22 crc kubenswrapper[4475]: I1203 07:43:22.592842 4475 memory_manager.go:354] "RemoveStaleState removing state" podUID="6eb08d92-2424-4829-9c94-b9ae344322c0" containerName="registry-server" Dec 03 07:43:22 crc kubenswrapper[4475]: I1203 07:43:22.593897 4475 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-cd759" Dec 03 07:43:22 crc kubenswrapper[4475]: I1203 07:43:22.601825 4475 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-cd759"] Dec 03 07:43:22 crc kubenswrapper[4475]: I1203 07:43:22.605878 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/83d12558-e78c-43b9-8c3a-886da7dd3a4f-utilities\") pod \"redhat-operators-cd759\" (UID: \"83d12558-e78c-43b9-8c3a-886da7dd3a4f\") " pod="openshift-marketplace/redhat-operators-cd759" Dec 03 07:43:22 crc kubenswrapper[4475]: I1203 07:43:22.606163 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kp6sr\" (UniqueName: \"kubernetes.io/projected/83d12558-e78c-43b9-8c3a-886da7dd3a4f-kube-api-access-kp6sr\") pod \"redhat-operators-cd759\" (UID: \"83d12558-e78c-43b9-8c3a-886da7dd3a4f\") " pod="openshift-marketplace/redhat-operators-cd759" Dec 03 07:43:22 crc kubenswrapper[4475]: I1203 07:43:22.606201 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/83d12558-e78c-43b9-8c3a-886da7dd3a4f-catalog-content\") pod \"redhat-operators-cd759\" (UID: \"83d12558-e78c-43b9-8c3a-886da7dd3a4f\") " pod="openshift-marketplace/redhat-operators-cd759" Dec 03 07:43:22 crc kubenswrapper[4475]: I1203 07:43:22.707891 4475 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"kube-api-access-kp6sr\" (UniqueName: \"kubernetes.io/projected/83d12558-e78c-43b9-8c3a-886da7dd3a4f-kube-api-access-kp6sr\") pod \"redhat-operators-cd759\" (UID: \"83d12558-e78c-43b9-8c3a-886da7dd3a4f\") " pod="openshift-marketplace/redhat-operators-cd759" Dec 03 07:43:22 crc kubenswrapper[4475]: I1203 07:43:22.707940 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/83d12558-e78c-43b9-8c3a-886da7dd3a4f-catalog-content\") pod \"redhat-operators-cd759\" (UID: \"83d12558-e78c-43b9-8c3a-886da7dd3a4f\") " pod="openshift-marketplace/redhat-operators-cd759" Dec 03 07:43:22 crc kubenswrapper[4475]: I1203 07:43:22.708250 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/83d12558-e78c-43b9-8c3a-886da7dd3a4f-utilities\") pod \"redhat-operators-cd759\" (UID: \"83d12558-e78c-43b9-8c3a-886da7dd3a4f\") " pod="openshift-marketplace/redhat-operators-cd759" Dec 03 07:43:22 crc kubenswrapper[4475]: I1203 07:43:22.708666 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/83d12558-e78c-43b9-8c3a-886da7dd3a4f-catalog-content\") pod \"redhat-operators-cd759\" (UID: \"83d12558-e78c-43b9-8c3a-886da7dd3a4f\") " pod="openshift-marketplace/redhat-operators-cd759" Dec 03 07:43:22 crc kubenswrapper[4475]: I1203 07:43:22.708706 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/83d12558-e78c-43b9-8c3a-886da7dd3a4f-utilities\") pod \"redhat-operators-cd759\" (UID: \"83d12558-e78c-43b9-8c3a-886da7dd3a4f\") " pod="openshift-marketplace/redhat-operators-cd759" Dec 03 07:43:22 crc kubenswrapper[4475]: I1203 07:43:22.723216 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kp6sr\" (UniqueName: 
\"kubernetes.io/projected/83d12558-e78c-43b9-8c3a-886da7dd3a4f-kube-api-access-kp6sr\") pod \"redhat-operators-cd759\" (UID: \"83d12558-e78c-43b9-8c3a-886da7dd3a4f\") " pod="openshift-marketplace/redhat-operators-cd759" Dec 03 07:43:22 crc kubenswrapper[4475]: I1203 07:43:22.907036 4475 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-cd759" Dec 03 07:43:23 crc kubenswrapper[4475]: I1203 07:43:23.323651 4475 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-cd759"] Dec 03 07:43:24 crc kubenswrapper[4475]: I1203 07:43:24.270119 4475 generic.go:334] "Generic (PLEG): container finished" podID="83d12558-e78c-43b9-8c3a-886da7dd3a4f" containerID="2e8cf2cf12eb90ba28011d01ff2f5b1a45609eb963390c9a28b1b5f77fa4ee74" exitCode=0 Dec 03 07:43:24 crc kubenswrapper[4475]: I1203 07:43:24.270343 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-cd759" event={"ID":"83d12558-e78c-43b9-8c3a-886da7dd3a4f","Type":"ContainerDied","Data":"2e8cf2cf12eb90ba28011d01ff2f5b1a45609eb963390c9a28b1b5f77fa4ee74"} Dec 03 07:43:24 crc kubenswrapper[4475]: I1203 07:43:24.271103 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-cd759" event={"ID":"83d12558-e78c-43b9-8c3a-886da7dd3a4f","Type":"ContainerStarted","Data":"3df983dbde85af91a9534eda7a9cf792dfb8b144f24dd7d96560ca971dda3ae7"} Dec 03 07:43:25 crc kubenswrapper[4475]: I1203 07:43:25.294782 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-cd759" event={"ID":"83d12558-e78c-43b9-8c3a-886da7dd3a4f","Type":"ContainerStarted","Data":"2d829d46b0e7a382a791af0e893ea634a6935096ba33fbab34cf33da29d4293a"} Dec 03 07:43:28 crc kubenswrapper[4475]: I1203 07:43:28.315257 4475 generic.go:334] "Generic (PLEG): container finished" podID="83d12558-e78c-43b9-8c3a-886da7dd3a4f" 
containerID="2d829d46b0e7a382a791af0e893ea634a6935096ba33fbab34cf33da29d4293a" exitCode=0 Dec 03 07:43:28 crc kubenswrapper[4475]: I1203 07:43:28.315333 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-cd759" event={"ID":"83d12558-e78c-43b9-8c3a-886da7dd3a4f","Type":"ContainerDied","Data":"2d829d46b0e7a382a791af0e893ea634a6935096ba33fbab34cf33da29d4293a"} Dec 03 07:43:29 crc kubenswrapper[4475]: I1203 07:43:29.323888 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-cd759" event={"ID":"83d12558-e78c-43b9-8c3a-886da7dd3a4f","Type":"ContainerStarted","Data":"b416ae3aa82c40f00affa3a76cd3598c19f33dd4ea63a767510e4772aa1f3d5a"} Dec 03 07:43:29 crc kubenswrapper[4475]: I1203 07:43:29.341841 4475 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-cd759" podStartSLOduration=2.829776513 podStartE2EDuration="7.341829435s" podCreationTimestamp="2025-12-03 07:43:22 +0000 UTC" firstStartedPulling="2025-12-03 07:43:24.271855654 +0000 UTC m=+3489.076753988" lastFinishedPulling="2025-12-03 07:43:28.783908577 +0000 UTC m=+3493.588806910" observedRunningTime="2025-12-03 07:43:29.335208618 +0000 UTC m=+3494.140106952" watchObservedRunningTime="2025-12-03 07:43:29.341829435 +0000 UTC m=+3494.146727768" Dec 03 07:43:32 crc kubenswrapper[4475]: I1203 07:43:32.908056 4475 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-cd759" Dec 03 07:43:32 crc kubenswrapper[4475]: I1203 07:43:32.908438 4475 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-cd759" Dec 03 07:43:33 crc kubenswrapper[4475]: I1203 07:43:33.946903 4475 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-cd759" podUID="83d12558-e78c-43b9-8c3a-886da7dd3a4f" containerName="registry-server" 
probeResult="failure" output=< Dec 03 07:43:33 crc kubenswrapper[4475]: timeout: failed to connect service ":50051" within 1s Dec 03 07:43:33 crc kubenswrapper[4475]: > Dec 03 07:43:42 crc kubenswrapper[4475]: I1203 07:43:42.943653 4475 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-cd759" Dec 03 07:43:42 crc kubenswrapper[4475]: I1203 07:43:42.979828 4475 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-cd759" Dec 03 07:43:43 crc kubenswrapper[4475]: I1203 07:43:43.175928 4475 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-cd759"] Dec 03 07:43:44 crc kubenswrapper[4475]: I1203 07:43:44.414993 4475 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-cd759" podUID="83d12558-e78c-43b9-8c3a-886da7dd3a4f" containerName="registry-server" containerID="cri-o://b416ae3aa82c40f00affa3a76cd3598c19f33dd4ea63a767510e4772aa1f3d5a" gracePeriod=2 Dec 03 07:43:45 crc kubenswrapper[4475]: I1203 07:43:45.037023 4475 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-cd759" Dec 03 07:43:45 crc kubenswrapper[4475]: I1203 07:43:45.070173 4475 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/83d12558-e78c-43b9-8c3a-886da7dd3a4f-utilities\") pod \"83d12558-e78c-43b9-8c3a-886da7dd3a4f\" (UID: \"83d12558-e78c-43b9-8c3a-886da7dd3a4f\") " Dec 03 07:43:45 crc kubenswrapper[4475]: I1203 07:43:45.070207 4475 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/83d12558-e78c-43b9-8c3a-886da7dd3a4f-catalog-content\") pod \"83d12558-e78c-43b9-8c3a-886da7dd3a4f\" (UID: \"83d12558-e78c-43b9-8c3a-886da7dd3a4f\") " Dec 03 07:43:45 crc kubenswrapper[4475]: I1203 07:43:45.070331 4475 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kp6sr\" (UniqueName: \"kubernetes.io/projected/83d12558-e78c-43b9-8c3a-886da7dd3a4f-kube-api-access-kp6sr\") pod \"83d12558-e78c-43b9-8c3a-886da7dd3a4f\" (UID: \"83d12558-e78c-43b9-8c3a-886da7dd3a4f\") " Dec 03 07:43:45 crc kubenswrapper[4475]: I1203 07:43:45.070865 4475 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/83d12558-e78c-43b9-8c3a-886da7dd3a4f-utilities" (OuterVolumeSpecName: "utilities") pod "83d12558-e78c-43b9-8c3a-886da7dd3a4f" (UID: "83d12558-e78c-43b9-8c3a-886da7dd3a4f"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 07:43:45 crc kubenswrapper[4475]: I1203 07:43:45.079046 4475 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/83d12558-e78c-43b9-8c3a-886da7dd3a4f-kube-api-access-kp6sr" (OuterVolumeSpecName: "kube-api-access-kp6sr") pod "83d12558-e78c-43b9-8c3a-886da7dd3a4f" (UID: "83d12558-e78c-43b9-8c3a-886da7dd3a4f"). InnerVolumeSpecName "kube-api-access-kp6sr". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 07:43:45 crc kubenswrapper[4475]: I1203 07:43:45.143126 4475 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/83d12558-e78c-43b9-8c3a-886da7dd3a4f-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "83d12558-e78c-43b9-8c3a-886da7dd3a4f" (UID: "83d12558-e78c-43b9-8c3a-886da7dd3a4f"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 07:43:45 crc kubenswrapper[4475]: I1203 07:43:45.172092 4475 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/83d12558-e78c-43b9-8c3a-886da7dd3a4f-utilities\") on node \"crc\" DevicePath \"\"" Dec 03 07:43:45 crc kubenswrapper[4475]: I1203 07:43:45.172124 4475 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/83d12558-e78c-43b9-8c3a-886da7dd3a4f-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 03 07:43:45 crc kubenswrapper[4475]: I1203 07:43:45.172135 4475 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kp6sr\" (UniqueName: \"kubernetes.io/projected/83d12558-e78c-43b9-8c3a-886da7dd3a4f-kube-api-access-kp6sr\") on node \"crc\" DevicePath \"\"" Dec 03 07:43:45 crc kubenswrapper[4475]: I1203 07:43:45.423032 4475 generic.go:334] "Generic (PLEG): container finished" podID="83d12558-e78c-43b9-8c3a-886da7dd3a4f" containerID="b416ae3aa82c40f00affa3a76cd3598c19f33dd4ea63a767510e4772aa1f3d5a" exitCode=0 Dec 03 07:43:45 crc kubenswrapper[4475]: I1203 07:43:45.423068 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-cd759" event={"ID":"83d12558-e78c-43b9-8c3a-886da7dd3a4f","Type":"ContainerDied","Data":"b416ae3aa82c40f00affa3a76cd3598c19f33dd4ea63a767510e4772aa1f3d5a"} Dec 03 07:43:45 crc kubenswrapper[4475]: I1203 07:43:45.423112 4475 kubelet.go:2453] "SyncLoop (PLEG): event for 
pod" pod="openshift-marketplace/redhat-operators-cd759" event={"ID":"83d12558-e78c-43b9-8c3a-886da7dd3a4f","Type":"ContainerDied","Data":"3df983dbde85af91a9534eda7a9cf792dfb8b144f24dd7d96560ca971dda3ae7"} Dec 03 07:43:45 crc kubenswrapper[4475]: I1203 07:43:45.423130 4475 scope.go:117] "RemoveContainer" containerID="b416ae3aa82c40f00affa3a76cd3598c19f33dd4ea63a767510e4772aa1f3d5a" Dec 03 07:43:45 crc kubenswrapper[4475]: I1203 07:43:45.423277 4475 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-cd759" Dec 03 07:43:45 crc kubenswrapper[4475]: I1203 07:43:45.439315 4475 scope.go:117] "RemoveContainer" containerID="2d829d46b0e7a382a791af0e893ea634a6935096ba33fbab34cf33da29d4293a" Dec 03 07:43:45 crc kubenswrapper[4475]: I1203 07:43:45.453672 4475 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-cd759"] Dec 03 07:43:45 crc kubenswrapper[4475]: I1203 07:43:45.460316 4475 scope.go:117] "RemoveContainer" containerID="2e8cf2cf12eb90ba28011d01ff2f5b1a45609eb963390c9a28b1b5f77fa4ee74" Dec 03 07:43:45 crc kubenswrapper[4475]: I1203 07:43:45.461084 4475 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-cd759"] Dec 03 07:43:45 crc kubenswrapper[4475]: I1203 07:43:45.491685 4475 scope.go:117] "RemoveContainer" containerID="b416ae3aa82c40f00affa3a76cd3598c19f33dd4ea63a767510e4772aa1f3d5a" Dec 03 07:43:45 crc kubenswrapper[4475]: E1203 07:43:45.492082 4475 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b416ae3aa82c40f00affa3a76cd3598c19f33dd4ea63a767510e4772aa1f3d5a\": container with ID starting with b416ae3aa82c40f00affa3a76cd3598c19f33dd4ea63a767510e4772aa1f3d5a not found: ID does not exist" containerID="b416ae3aa82c40f00affa3a76cd3598c19f33dd4ea63a767510e4772aa1f3d5a" Dec 03 07:43:45 crc kubenswrapper[4475]: I1203 07:43:45.492111 4475 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b416ae3aa82c40f00affa3a76cd3598c19f33dd4ea63a767510e4772aa1f3d5a"} err="failed to get container status \"b416ae3aa82c40f00affa3a76cd3598c19f33dd4ea63a767510e4772aa1f3d5a\": rpc error: code = NotFound desc = could not find container \"b416ae3aa82c40f00affa3a76cd3598c19f33dd4ea63a767510e4772aa1f3d5a\": container with ID starting with b416ae3aa82c40f00affa3a76cd3598c19f33dd4ea63a767510e4772aa1f3d5a not found: ID does not exist" Dec 03 07:43:45 crc kubenswrapper[4475]: I1203 07:43:45.492127 4475 scope.go:117] "RemoveContainer" containerID="2d829d46b0e7a382a791af0e893ea634a6935096ba33fbab34cf33da29d4293a" Dec 03 07:43:45 crc kubenswrapper[4475]: E1203 07:43:45.492348 4475 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2d829d46b0e7a382a791af0e893ea634a6935096ba33fbab34cf33da29d4293a\": container with ID starting with 2d829d46b0e7a382a791af0e893ea634a6935096ba33fbab34cf33da29d4293a not found: ID does not exist" containerID="2d829d46b0e7a382a791af0e893ea634a6935096ba33fbab34cf33da29d4293a" Dec 03 07:43:45 crc kubenswrapper[4475]: I1203 07:43:45.492370 4475 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2d829d46b0e7a382a791af0e893ea634a6935096ba33fbab34cf33da29d4293a"} err="failed to get container status \"2d829d46b0e7a382a791af0e893ea634a6935096ba33fbab34cf33da29d4293a\": rpc error: code = NotFound desc = could not find container \"2d829d46b0e7a382a791af0e893ea634a6935096ba33fbab34cf33da29d4293a\": container with ID starting with 2d829d46b0e7a382a791af0e893ea634a6935096ba33fbab34cf33da29d4293a not found: ID does not exist" Dec 03 07:43:45 crc kubenswrapper[4475]: I1203 07:43:45.492382 4475 scope.go:117] "RemoveContainer" containerID="2e8cf2cf12eb90ba28011d01ff2f5b1a45609eb963390c9a28b1b5f77fa4ee74" Dec 03 07:43:45 crc kubenswrapper[4475]: E1203 
07:43:45.492676 4475 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2e8cf2cf12eb90ba28011d01ff2f5b1a45609eb963390c9a28b1b5f77fa4ee74\": container with ID starting with 2e8cf2cf12eb90ba28011d01ff2f5b1a45609eb963390c9a28b1b5f77fa4ee74 not found: ID does not exist" containerID="2e8cf2cf12eb90ba28011d01ff2f5b1a45609eb963390c9a28b1b5f77fa4ee74" Dec 03 07:43:45 crc kubenswrapper[4475]: I1203 07:43:45.492698 4475 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2e8cf2cf12eb90ba28011d01ff2f5b1a45609eb963390c9a28b1b5f77fa4ee74"} err="failed to get container status \"2e8cf2cf12eb90ba28011d01ff2f5b1a45609eb963390c9a28b1b5f77fa4ee74\": rpc error: code = NotFound desc = could not find container \"2e8cf2cf12eb90ba28011d01ff2f5b1a45609eb963390c9a28b1b5f77fa4ee74\": container with ID starting with 2e8cf2cf12eb90ba28011d01ff2f5b1a45609eb963390c9a28b1b5f77fa4ee74 not found: ID does not exist" Dec 03 07:43:45 crc kubenswrapper[4475]: I1203 07:43:45.499743 4475 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="83d12558-e78c-43b9-8c3a-886da7dd3a4f" path="/var/lib/kubelet/pods/83d12558-e78c-43b9-8c3a-886da7dd3a4f/volumes" Dec 03 07:44:58 crc kubenswrapper[4475]: I1203 07:44:58.933711 4475 patch_prober.go:28] interesting pod/machine-config-daemon-tjbzg container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 03 07:44:58 crc kubenswrapper[4475]: I1203 07:44:58.935076 4475 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-tjbzg" podUID="91aee7be-4a52-4598-803f-2deebe0674de" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection 
refused" Dec 03 07:45:00 crc kubenswrapper[4475]: I1203 07:45:00.191595 4475 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29412465-q87w9"] Dec 03 07:45:00 crc kubenswrapper[4475]: E1203 07:45:00.192114 4475 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="83d12558-e78c-43b9-8c3a-886da7dd3a4f" containerName="extract-utilities" Dec 03 07:45:00 crc kubenswrapper[4475]: I1203 07:45:00.192126 4475 state_mem.go:107] "Deleted CPUSet assignment" podUID="83d12558-e78c-43b9-8c3a-886da7dd3a4f" containerName="extract-utilities" Dec 03 07:45:00 crc kubenswrapper[4475]: E1203 07:45:00.192138 4475 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="83d12558-e78c-43b9-8c3a-886da7dd3a4f" containerName="registry-server" Dec 03 07:45:00 crc kubenswrapper[4475]: I1203 07:45:00.192144 4475 state_mem.go:107] "Deleted CPUSet assignment" podUID="83d12558-e78c-43b9-8c3a-886da7dd3a4f" containerName="registry-server" Dec 03 07:45:00 crc kubenswrapper[4475]: E1203 07:45:00.192152 4475 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="83d12558-e78c-43b9-8c3a-886da7dd3a4f" containerName="extract-content" Dec 03 07:45:00 crc kubenswrapper[4475]: I1203 07:45:00.192158 4475 state_mem.go:107] "Deleted CPUSet assignment" podUID="83d12558-e78c-43b9-8c3a-886da7dd3a4f" containerName="extract-content" Dec 03 07:45:00 crc kubenswrapper[4475]: I1203 07:45:00.192296 4475 memory_manager.go:354] "RemoveStaleState removing state" podUID="83d12558-e78c-43b9-8c3a-886da7dd3a4f" containerName="registry-server" Dec 03 07:45:00 crc kubenswrapper[4475]: I1203 07:45:00.192851 4475 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29412465-q87w9" Dec 03 07:45:00 crc kubenswrapper[4475]: I1203 07:45:00.201868 4475 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Dec 03 07:45:00 crc kubenswrapper[4475]: I1203 07:45:00.201878 4475 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Dec 03 07:45:00 crc kubenswrapper[4475]: I1203 07:45:00.203774 4475 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29412465-q87w9"] Dec 03 07:45:00 crc kubenswrapper[4475]: I1203 07:45:00.270531 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/0de99f57-7c36-4c39-8f80-20bd559c9757-secret-volume\") pod \"collect-profiles-29412465-q87w9\" (UID: \"0de99f57-7c36-4c39-8f80-20bd559c9757\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29412465-q87w9" Dec 03 07:45:00 crc kubenswrapper[4475]: I1203 07:45:00.270578 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-98s5w\" (UniqueName: \"kubernetes.io/projected/0de99f57-7c36-4c39-8f80-20bd559c9757-kube-api-access-98s5w\") pod \"collect-profiles-29412465-q87w9\" (UID: \"0de99f57-7c36-4c39-8f80-20bd559c9757\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29412465-q87w9" Dec 03 07:45:00 crc kubenswrapper[4475]: I1203 07:45:00.270671 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/0de99f57-7c36-4c39-8f80-20bd559c9757-config-volume\") pod \"collect-profiles-29412465-q87w9\" (UID: \"0de99f57-7c36-4c39-8f80-20bd559c9757\") " 
pod="openshift-operator-lifecycle-manager/collect-profiles-29412465-q87w9" Dec 03 07:45:00 crc kubenswrapper[4475]: I1203 07:45:00.372431 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/0de99f57-7c36-4c39-8f80-20bd559c9757-secret-volume\") pod \"collect-profiles-29412465-q87w9\" (UID: \"0de99f57-7c36-4c39-8f80-20bd559c9757\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29412465-q87w9" Dec 03 07:45:00 crc kubenswrapper[4475]: I1203 07:45:00.372526 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-98s5w\" (UniqueName: \"kubernetes.io/projected/0de99f57-7c36-4c39-8f80-20bd559c9757-kube-api-access-98s5w\") pod \"collect-profiles-29412465-q87w9\" (UID: \"0de99f57-7c36-4c39-8f80-20bd559c9757\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29412465-q87w9" Dec 03 07:45:00 crc kubenswrapper[4475]: I1203 07:45:00.372613 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/0de99f57-7c36-4c39-8f80-20bd559c9757-config-volume\") pod \"collect-profiles-29412465-q87w9\" (UID: \"0de99f57-7c36-4c39-8f80-20bd559c9757\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29412465-q87w9" Dec 03 07:45:00 crc kubenswrapper[4475]: I1203 07:45:00.373311 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/0de99f57-7c36-4c39-8f80-20bd559c9757-config-volume\") pod \"collect-profiles-29412465-q87w9\" (UID: \"0de99f57-7c36-4c39-8f80-20bd559c9757\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29412465-q87w9" Dec 03 07:45:00 crc kubenswrapper[4475]: I1203 07:45:00.377930 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: 
\"kubernetes.io/secret/0de99f57-7c36-4c39-8f80-20bd559c9757-secret-volume\") pod \"collect-profiles-29412465-q87w9\" (UID: \"0de99f57-7c36-4c39-8f80-20bd559c9757\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29412465-q87w9" Dec 03 07:45:00 crc kubenswrapper[4475]: I1203 07:45:00.385954 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-98s5w\" (UniqueName: \"kubernetes.io/projected/0de99f57-7c36-4c39-8f80-20bd559c9757-kube-api-access-98s5w\") pod \"collect-profiles-29412465-q87w9\" (UID: \"0de99f57-7c36-4c39-8f80-20bd559c9757\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29412465-q87w9" Dec 03 07:45:00 crc kubenswrapper[4475]: I1203 07:45:00.507906 4475 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29412465-q87w9" Dec 03 07:45:00 crc kubenswrapper[4475]: I1203 07:45:00.906544 4475 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29412465-q87w9"] Dec 03 07:45:01 crc kubenswrapper[4475]: I1203 07:45:01.881187 4475 generic.go:334] "Generic (PLEG): container finished" podID="0de99f57-7c36-4c39-8f80-20bd559c9757" containerID="536148c65f7868734a70346ed943fb935f828740cd0ddff963a8d1b3fae66624" exitCode=0 Dec 03 07:45:01 crc kubenswrapper[4475]: I1203 07:45:01.881303 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29412465-q87w9" event={"ID":"0de99f57-7c36-4c39-8f80-20bd559c9757","Type":"ContainerDied","Data":"536148c65f7868734a70346ed943fb935f828740cd0ddff963a8d1b3fae66624"} Dec 03 07:45:01 crc kubenswrapper[4475]: I1203 07:45:01.881543 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29412465-q87w9" 
event={"ID":"0de99f57-7c36-4c39-8f80-20bd559c9757","Type":"ContainerStarted","Data":"23f7b2f04f92566efa0d9a1771ae8da51a55ad028efb29838b15c20f64a14176"} Dec 03 07:45:03 crc kubenswrapper[4475]: I1203 07:45:03.209199 4475 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29412465-q87w9" Dec 03 07:45:03 crc kubenswrapper[4475]: I1203 07:45:03.324886 4475 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-98s5w\" (UniqueName: \"kubernetes.io/projected/0de99f57-7c36-4c39-8f80-20bd559c9757-kube-api-access-98s5w\") pod \"0de99f57-7c36-4c39-8f80-20bd559c9757\" (UID: \"0de99f57-7c36-4c39-8f80-20bd559c9757\") " Dec 03 07:45:03 crc kubenswrapper[4475]: I1203 07:45:03.325027 4475 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/0de99f57-7c36-4c39-8f80-20bd559c9757-secret-volume\") pod \"0de99f57-7c36-4c39-8f80-20bd559c9757\" (UID: \"0de99f57-7c36-4c39-8f80-20bd559c9757\") " Dec 03 07:45:03 crc kubenswrapper[4475]: I1203 07:45:03.325072 4475 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/0de99f57-7c36-4c39-8f80-20bd559c9757-config-volume\") pod \"0de99f57-7c36-4c39-8f80-20bd559c9757\" (UID: \"0de99f57-7c36-4c39-8f80-20bd559c9757\") " Dec 03 07:45:03 crc kubenswrapper[4475]: I1203 07:45:03.325642 4475 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0de99f57-7c36-4c39-8f80-20bd559c9757-config-volume" (OuterVolumeSpecName: "config-volume") pod "0de99f57-7c36-4c39-8f80-20bd559c9757" (UID: "0de99f57-7c36-4c39-8f80-20bd559c9757"). InnerVolumeSpecName "config-volume". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 07:45:03 crc kubenswrapper[4475]: I1203 07:45:03.330228 4475 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0de99f57-7c36-4c39-8f80-20bd559c9757-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "0de99f57-7c36-4c39-8f80-20bd559c9757" (UID: "0de99f57-7c36-4c39-8f80-20bd559c9757"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 07:45:03 crc kubenswrapper[4475]: I1203 07:45:03.330267 4475 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0de99f57-7c36-4c39-8f80-20bd559c9757-kube-api-access-98s5w" (OuterVolumeSpecName: "kube-api-access-98s5w") pod "0de99f57-7c36-4c39-8f80-20bd559c9757" (UID: "0de99f57-7c36-4c39-8f80-20bd559c9757"). InnerVolumeSpecName "kube-api-access-98s5w". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 07:45:03 crc kubenswrapper[4475]: I1203 07:45:03.427484 4475 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-98s5w\" (UniqueName: \"kubernetes.io/projected/0de99f57-7c36-4c39-8f80-20bd559c9757-kube-api-access-98s5w\") on node \"crc\" DevicePath \"\"" Dec 03 07:45:03 crc kubenswrapper[4475]: I1203 07:45:03.427509 4475 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/0de99f57-7c36-4c39-8f80-20bd559c9757-secret-volume\") on node \"crc\" DevicePath \"\"" Dec 03 07:45:03 crc kubenswrapper[4475]: I1203 07:45:03.427520 4475 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/0de99f57-7c36-4c39-8f80-20bd559c9757-config-volume\") on node \"crc\" DevicePath \"\"" Dec 03 07:45:03 crc kubenswrapper[4475]: I1203 07:45:03.894696 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29412465-q87w9" 
event={"ID":"0de99f57-7c36-4c39-8f80-20bd559c9757","Type":"ContainerDied","Data":"23f7b2f04f92566efa0d9a1771ae8da51a55ad028efb29838b15c20f64a14176"} Dec 03 07:45:03 crc kubenswrapper[4475]: I1203 07:45:03.894733 4475 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="23f7b2f04f92566efa0d9a1771ae8da51a55ad028efb29838b15c20f64a14176" Dec 03 07:45:03 crc kubenswrapper[4475]: I1203 07:45:03.894746 4475 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29412465-q87w9" Dec 03 07:45:04 crc kubenswrapper[4475]: I1203 07:45:04.268581 4475 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29412420-2tfcl"] Dec 03 07:45:04 crc kubenswrapper[4475]: I1203 07:45:04.274310 4475 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29412420-2tfcl"] Dec 03 07:45:05 crc kubenswrapper[4475]: E1203 07:45:05.044818 4475 upgradeaware.go:427] Error proxying data from client to backend: readfrom tcp 192.168.25.177:46578->192.168.25.177:40263: write tcp 192.168.25.177:46578->192.168.25.177:40263: write: broken pipe Dec 03 07:45:05 crc kubenswrapper[4475]: I1203 07:45:05.499776 4475 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9de8472e-1891-4e1c-8db7-fb458c212969" path="/var/lib/kubelet/pods/9de8472e-1891-4e1c-8db7-fb458c212969/volumes" Dec 03 07:45:28 crc kubenswrapper[4475]: I1203 07:45:28.933842 4475 patch_prober.go:28] interesting pod/machine-config-daemon-tjbzg container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 03 07:45:28 crc kubenswrapper[4475]: I1203 07:45:28.934832 4475 prober.go:107] "Probe failed" probeType="Liveness" 
pod="openshift-machine-config-operator/machine-config-daemon-tjbzg" podUID="91aee7be-4a52-4598-803f-2deebe0674de" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 03 07:45:29 crc kubenswrapper[4475]: I1203 07:45:29.629644 4475 scope.go:117] "RemoveContainer" containerID="0c66f8f84c01b8f5a3385cd650dba391c2dde27470dadf4f5c132b51254c7071" Dec 03 07:45:58 crc kubenswrapper[4475]: I1203 07:45:58.933723 4475 patch_prober.go:28] interesting pod/machine-config-daemon-tjbzg container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 03 07:45:58 crc kubenswrapper[4475]: I1203 07:45:58.934061 4475 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-tjbzg" podUID="91aee7be-4a52-4598-803f-2deebe0674de" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 03 07:45:58 crc kubenswrapper[4475]: I1203 07:45:58.934103 4475 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-tjbzg" Dec 03 07:45:58 crc kubenswrapper[4475]: I1203 07:45:58.934741 4475 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"3695e88ada63d499e9a8b61f83dfed5c133f2615e66634fbdd03d323806353a7"} pod="openshift-machine-config-operator/machine-config-daemon-tjbzg" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Dec 03 07:45:58 crc kubenswrapper[4475]: I1203 07:45:58.934791 4475 kuberuntime_container.go:808] "Killing container with a grace period" 
pod="openshift-machine-config-operator/machine-config-daemon-tjbzg" podUID="91aee7be-4a52-4598-803f-2deebe0674de" containerName="machine-config-daemon" containerID="cri-o://3695e88ada63d499e9a8b61f83dfed5c133f2615e66634fbdd03d323806353a7" gracePeriod=600 Dec 03 07:45:59 crc kubenswrapper[4475]: I1203 07:45:59.257734 4475 generic.go:334] "Generic (PLEG): container finished" podID="91aee7be-4a52-4598-803f-2deebe0674de" containerID="3695e88ada63d499e9a8b61f83dfed5c133f2615e66634fbdd03d323806353a7" exitCode=0 Dec 03 07:45:59 crc kubenswrapper[4475]: I1203 07:45:59.257810 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-tjbzg" event={"ID":"91aee7be-4a52-4598-803f-2deebe0674de","Type":"ContainerDied","Data":"3695e88ada63d499e9a8b61f83dfed5c133f2615e66634fbdd03d323806353a7"} Dec 03 07:45:59 crc kubenswrapper[4475]: I1203 07:45:59.257904 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-tjbzg" event={"ID":"91aee7be-4a52-4598-803f-2deebe0674de","Type":"ContainerStarted","Data":"7b0a38f9c544d8a2f59b2ca402561211c4a545eafe71b51ee524168e165ff3e4"} Dec 03 07:45:59 crc kubenswrapper[4475]: I1203 07:45:59.257923 4475 scope.go:117] "RemoveContainer" containerID="dffbe6b147869881f4e9260f18420918b1c970e08be8aa0d2d9b606d54067de1" Dec 03 07:48:28 crc kubenswrapper[4475]: I1203 07:48:28.933775 4475 patch_prober.go:28] interesting pod/machine-config-daemon-tjbzg container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 03 07:48:28 crc kubenswrapper[4475]: I1203 07:48:28.934138 4475 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-tjbzg" podUID="91aee7be-4a52-4598-803f-2deebe0674de" containerName="machine-config-daemon" 
probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 03 07:48:58 crc kubenswrapper[4475]: I1203 07:48:58.933628 4475 patch_prober.go:28] interesting pod/machine-config-daemon-tjbzg container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 03 07:48:58 crc kubenswrapper[4475]: I1203 07:48:58.933989 4475 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-tjbzg" podUID="91aee7be-4a52-4598-803f-2deebe0674de" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 03 07:49:28 crc kubenswrapper[4475]: I1203 07:49:28.933835 4475 patch_prober.go:28] interesting pod/machine-config-daemon-tjbzg container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 03 07:49:28 crc kubenswrapper[4475]: I1203 07:49:28.934320 4475 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-tjbzg" podUID="91aee7be-4a52-4598-803f-2deebe0674de" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 03 07:49:28 crc kubenswrapper[4475]: I1203 07:49:28.934356 4475 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-tjbzg" Dec 03 07:49:28 crc kubenswrapper[4475]: I1203 07:49:28.934868 4475 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" 
containerStatusID={"Type":"cri-o","ID":"7b0a38f9c544d8a2f59b2ca402561211c4a545eafe71b51ee524168e165ff3e4"} pod="openshift-machine-config-operator/machine-config-daemon-tjbzg" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Dec 03 07:49:28 crc kubenswrapper[4475]: I1203 07:49:28.934914 4475 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-tjbzg" podUID="91aee7be-4a52-4598-803f-2deebe0674de" containerName="machine-config-daemon" containerID="cri-o://7b0a38f9c544d8a2f59b2ca402561211c4a545eafe71b51ee524168e165ff3e4" gracePeriod=600 Dec 03 07:49:29 crc kubenswrapper[4475]: E1203 07:49:29.080765 4475 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-tjbzg_openshift-machine-config-operator(91aee7be-4a52-4598-803f-2deebe0674de)\"" pod="openshift-machine-config-operator/machine-config-daemon-tjbzg" podUID="91aee7be-4a52-4598-803f-2deebe0674de" Dec 03 07:49:29 crc kubenswrapper[4475]: I1203 07:49:29.596473 4475 generic.go:334] "Generic (PLEG): container finished" podID="91aee7be-4a52-4598-803f-2deebe0674de" containerID="7b0a38f9c544d8a2f59b2ca402561211c4a545eafe71b51ee524168e165ff3e4" exitCode=0 Dec 03 07:49:29 crc kubenswrapper[4475]: I1203 07:49:29.596521 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-tjbzg" event={"ID":"91aee7be-4a52-4598-803f-2deebe0674de","Type":"ContainerDied","Data":"7b0a38f9c544d8a2f59b2ca402561211c4a545eafe71b51ee524168e165ff3e4"} Dec 03 07:49:29 crc kubenswrapper[4475]: I1203 07:49:29.596766 4475 scope.go:117] "RemoveContainer" containerID="3695e88ada63d499e9a8b61f83dfed5c133f2615e66634fbdd03d323806353a7" Dec 03 07:49:29 crc kubenswrapper[4475]: I1203 07:49:29.597712 4475 
scope.go:117] "RemoveContainer" containerID="7b0a38f9c544d8a2f59b2ca402561211c4a545eafe71b51ee524168e165ff3e4" Dec 03 07:49:29 crc kubenswrapper[4475]: E1203 07:49:29.598018 4475 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-tjbzg_openshift-machine-config-operator(91aee7be-4a52-4598-803f-2deebe0674de)\"" pod="openshift-machine-config-operator/machine-config-daemon-tjbzg" podUID="91aee7be-4a52-4598-803f-2deebe0674de" Dec 03 07:49:41 crc kubenswrapper[4475]: I1203 07:49:41.491088 4475 scope.go:117] "RemoveContainer" containerID="7b0a38f9c544d8a2f59b2ca402561211c4a545eafe71b51ee524168e165ff3e4" Dec 03 07:49:41 crc kubenswrapper[4475]: E1203 07:49:41.491615 4475 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-tjbzg_openshift-machine-config-operator(91aee7be-4a52-4598-803f-2deebe0674de)\"" pod="openshift-machine-config-operator/machine-config-daemon-tjbzg" podUID="91aee7be-4a52-4598-803f-2deebe0674de" Dec 03 07:49:54 crc kubenswrapper[4475]: I1203 07:49:54.491396 4475 scope.go:117] "RemoveContainer" containerID="7b0a38f9c544d8a2f59b2ca402561211c4a545eafe71b51ee524168e165ff3e4" Dec 03 07:49:54 crc kubenswrapper[4475]: E1203 07:49:54.491915 4475 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-tjbzg_openshift-machine-config-operator(91aee7be-4a52-4598-803f-2deebe0674de)\"" pod="openshift-machine-config-operator/machine-config-daemon-tjbzg" podUID="91aee7be-4a52-4598-803f-2deebe0674de" Dec 03 07:50:06 crc kubenswrapper[4475]: I1203 
07:50:06.490756 4475 scope.go:117] "RemoveContainer" containerID="7b0a38f9c544d8a2f59b2ca402561211c4a545eafe71b51ee524168e165ff3e4" Dec 03 07:50:06 crc kubenswrapper[4475]: E1203 07:50:06.491248 4475 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-tjbzg_openshift-machine-config-operator(91aee7be-4a52-4598-803f-2deebe0674de)\"" pod="openshift-machine-config-operator/machine-config-daemon-tjbzg" podUID="91aee7be-4a52-4598-803f-2deebe0674de" Dec 03 07:50:18 crc kubenswrapper[4475]: I1203 07:50:18.492147 4475 scope.go:117] "RemoveContainer" containerID="7b0a38f9c544d8a2f59b2ca402561211c4a545eafe71b51ee524168e165ff3e4" Dec 03 07:50:18 crc kubenswrapper[4475]: E1203 07:50:18.492706 4475 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-tjbzg_openshift-machine-config-operator(91aee7be-4a52-4598-803f-2deebe0674de)\"" pod="openshift-machine-config-operator/machine-config-daemon-tjbzg" podUID="91aee7be-4a52-4598-803f-2deebe0674de" Dec 03 07:50:21 crc kubenswrapper[4475]: I1203 07:50:21.185676 4475 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-v7n6c"] Dec 03 07:50:21 crc kubenswrapper[4475]: E1203 07:50:21.186339 4475 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0de99f57-7c36-4c39-8f80-20bd559c9757" containerName="collect-profiles" Dec 03 07:50:21 crc kubenswrapper[4475]: I1203 07:50:21.186351 4475 state_mem.go:107] "Deleted CPUSet assignment" podUID="0de99f57-7c36-4c39-8f80-20bd559c9757" containerName="collect-profiles" Dec 03 07:50:21 crc kubenswrapper[4475]: I1203 07:50:21.186619 4475 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="0de99f57-7c36-4c39-8f80-20bd559c9757" containerName="collect-profiles" Dec 03 07:50:21 crc kubenswrapper[4475]: I1203 07:50:21.189688 4475 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-v7n6c" Dec 03 07:50:21 crc kubenswrapper[4475]: I1203 07:50:21.193734 4475 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-v7n6c"] Dec 03 07:50:21 crc kubenswrapper[4475]: I1203 07:50:21.261767 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e548e2af-4eba-448a-86dc-38b4efab9921-utilities\") pod \"redhat-marketplace-v7n6c\" (UID: \"e548e2af-4eba-448a-86dc-38b4efab9921\") " pod="openshift-marketplace/redhat-marketplace-v7n6c" Dec 03 07:50:21 crc kubenswrapper[4475]: I1203 07:50:21.261919 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rd9dl\" (UniqueName: \"kubernetes.io/projected/e548e2af-4eba-448a-86dc-38b4efab9921-kube-api-access-rd9dl\") pod \"redhat-marketplace-v7n6c\" (UID: \"e548e2af-4eba-448a-86dc-38b4efab9921\") " pod="openshift-marketplace/redhat-marketplace-v7n6c" Dec 03 07:50:21 crc kubenswrapper[4475]: I1203 07:50:21.261941 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e548e2af-4eba-448a-86dc-38b4efab9921-catalog-content\") pod \"redhat-marketplace-v7n6c\" (UID: \"e548e2af-4eba-448a-86dc-38b4efab9921\") " pod="openshift-marketplace/redhat-marketplace-v7n6c" Dec 03 07:50:21 crc kubenswrapper[4475]: I1203 07:50:21.363234 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rd9dl\" (UniqueName: \"kubernetes.io/projected/e548e2af-4eba-448a-86dc-38b4efab9921-kube-api-access-rd9dl\") pod \"redhat-marketplace-v7n6c\" (UID: 
\"e548e2af-4eba-448a-86dc-38b4efab9921\") " pod="openshift-marketplace/redhat-marketplace-v7n6c" Dec 03 07:50:21 crc kubenswrapper[4475]: I1203 07:50:21.363272 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e548e2af-4eba-448a-86dc-38b4efab9921-catalog-content\") pod \"redhat-marketplace-v7n6c\" (UID: \"e548e2af-4eba-448a-86dc-38b4efab9921\") " pod="openshift-marketplace/redhat-marketplace-v7n6c" Dec 03 07:50:21 crc kubenswrapper[4475]: I1203 07:50:21.363340 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e548e2af-4eba-448a-86dc-38b4efab9921-utilities\") pod \"redhat-marketplace-v7n6c\" (UID: \"e548e2af-4eba-448a-86dc-38b4efab9921\") " pod="openshift-marketplace/redhat-marketplace-v7n6c" Dec 03 07:50:21 crc kubenswrapper[4475]: I1203 07:50:21.364201 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e548e2af-4eba-448a-86dc-38b4efab9921-catalog-content\") pod \"redhat-marketplace-v7n6c\" (UID: \"e548e2af-4eba-448a-86dc-38b4efab9921\") " pod="openshift-marketplace/redhat-marketplace-v7n6c" Dec 03 07:50:21 crc kubenswrapper[4475]: I1203 07:50:21.364434 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e548e2af-4eba-448a-86dc-38b4efab9921-utilities\") pod \"redhat-marketplace-v7n6c\" (UID: \"e548e2af-4eba-448a-86dc-38b4efab9921\") " pod="openshift-marketplace/redhat-marketplace-v7n6c" Dec 03 07:50:21 crc kubenswrapper[4475]: I1203 07:50:21.380551 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rd9dl\" (UniqueName: \"kubernetes.io/projected/e548e2af-4eba-448a-86dc-38b4efab9921-kube-api-access-rd9dl\") pod \"redhat-marketplace-v7n6c\" (UID: \"e548e2af-4eba-448a-86dc-38b4efab9921\") " 
pod="openshift-marketplace/redhat-marketplace-v7n6c" Dec 03 07:50:21 crc kubenswrapper[4475]: I1203 07:50:21.507383 4475 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-v7n6c" Dec 03 07:50:21 crc kubenswrapper[4475]: I1203 07:50:21.947524 4475 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-v7n6c"] Dec 03 07:50:22 crc kubenswrapper[4475]: I1203 07:50:22.911796 4475 generic.go:334] "Generic (PLEG): container finished" podID="e548e2af-4eba-448a-86dc-38b4efab9921" containerID="9f3ca28162c9b232ed8d21d95df3fd9507b366095f10a822eb9af43dd15670e8" exitCode=0 Dec 03 07:50:22 crc kubenswrapper[4475]: I1203 07:50:22.911897 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-v7n6c" event={"ID":"e548e2af-4eba-448a-86dc-38b4efab9921","Type":"ContainerDied","Data":"9f3ca28162c9b232ed8d21d95df3fd9507b366095f10a822eb9af43dd15670e8"} Dec 03 07:50:22 crc kubenswrapper[4475]: I1203 07:50:22.912917 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-v7n6c" event={"ID":"e548e2af-4eba-448a-86dc-38b4efab9921","Type":"ContainerStarted","Data":"6ccebf5dacd24b820db4ba5553c15a00a96d47f76a2125aec4ef2a53a894108a"} Dec 03 07:50:22 crc kubenswrapper[4475]: I1203 07:50:22.914171 4475 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Dec 03 07:50:24 crc kubenswrapper[4475]: I1203 07:50:24.934485 4475 generic.go:334] "Generic (PLEG): container finished" podID="e548e2af-4eba-448a-86dc-38b4efab9921" containerID="30e86da29a28fad924d0182b2f4257fc95ea24615b1c193c7576a473b9ef4c52" exitCode=0 Dec 03 07:50:24 crc kubenswrapper[4475]: I1203 07:50:24.934537 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-v7n6c" 
event={"ID":"e548e2af-4eba-448a-86dc-38b4efab9921","Type":"ContainerDied","Data":"30e86da29a28fad924d0182b2f4257fc95ea24615b1c193c7576a473b9ef4c52"} Dec 03 07:50:25 crc kubenswrapper[4475]: I1203 07:50:25.942585 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-v7n6c" event={"ID":"e548e2af-4eba-448a-86dc-38b4efab9921","Type":"ContainerStarted","Data":"854adce373bc945bd963d712fc1080223c39abf90b883325f362abe45000aac0"} Dec 03 07:50:25 crc kubenswrapper[4475]: I1203 07:50:25.959112 4475 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-v7n6c" podStartSLOduration=2.4851507760000002 podStartE2EDuration="4.959099818s" podCreationTimestamp="2025-12-03 07:50:21 +0000 UTC" firstStartedPulling="2025-12-03 07:50:22.91395132 +0000 UTC m=+3907.718849653" lastFinishedPulling="2025-12-03 07:50:25.387900361 +0000 UTC m=+3910.192798695" observedRunningTime="2025-12-03 07:50:25.955314621 +0000 UTC m=+3910.760212955" watchObservedRunningTime="2025-12-03 07:50:25.959099818 +0000 UTC m=+3910.763998153" Dec 03 07:50:31 crc kubenswrapper[4475]: I1203 07:50:31.508067 4475 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-v7n6c" Dec 03 07:50:31 crc kubenswrapper[4475]: I1203 07:50:31.508424 4475 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-v7n6c" Dec 03 07:50:31 crc kubenswrapper[4475]: I1203 07:50:31.542842 4475 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-v7n6c" Dec 03 07:50:32 crc kubenswrapper[4475]: I1203 07:50:32.121376 4475 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-v7n6c" Dec 03 07:50:32 crc kubenswrapper[4475]: I1203 07:50:32.157259 4475 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openshift-marketplace/redhat-marketplace-v7n6c"] Dec 03 07:50:32 crc kubenswrapper[4475]: I1203 07:50:32.490980 4475 scope.go:117] "RemoveContainer" containerID="7b0a38f9c544d8a2f59b2ca402561211c4a545eafe71b51ee524168e165ff3e4" Dec 03 07:50:32 crc kubenswrapper[4475]: E1203 07:50:32.491314 4475 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-tjbzg_openshift-machine-config-operator(91aee7be-4a52-4598-803f-2deebe0674de)\"" pod="openshift-machine-config-operator/machine-config-daemon-tjbzg" podUID="91aee7be-4a52-4598-803f-2deebe0674de" Dec 03 07:50:33 crc kubenswrapper[4475]: I1203 07:50:33.993947 4475 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-v7n6c" podUID="e548e2af-4eba-448a-86dc-38b4efab9921" containerName="registry-server" containerID="cri-o://854adce373bc945bd963d712fc1080223c39abf90b883325f362abe45000aac0" gracePeriod=2 Dec 03 07:50:34 crc kubenswrapper[4475]: I1203 07:50:34.569278 4475 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-v7n6c" Dec 03 07:50:34 crc kubenswrapper[4475]: I1203 07:50:34.689858 4475 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e548e2af-4eba-448a-86dc-38b4efab9921-catalog-content\") pod \"e548e2af-4eba-448a-86dc-38b4efab9921\" (UID: \"e548e2af-4eba-448a-86dc-38b4efab9921\") " Dec 03 07:50:34 crc kubenswrapper[4475]: I1203 07:50:34.690149 4475 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e548e2af-4eba-448a-86dc-38b4efab9921-utilities\") pod \"e548e2af-4eba-448a-86dc-38b4efab9921\" (UID: \"e548e2af-4eba-448a-86dc-38b4efab9921\") " Dec 03 07:50:34 crc kubenswrapper[4475]: I1203 07:50:34.690302 4475 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rd9dl\" (UniqueName: \"kubernetes.io/projected/e548e2af-4eba-448a-86dc-38b4efab9921-kube-api-access-rd9dl\") pod \"e548e2af-4eba-448a-86dc-38b4efab9921\" (UID: \"e548e2af-4eba-448a-86dc-38b4efab9921\") " Dec 03 07:50:34 crc kubenswrapper[4475]: I1203 07:50:34.693660 4475 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e548e2af-4eba-448a-86dc-38b4efab9921-utilities" (OuterVolumeSpecName: "utilities") pod "e548e2af-4eba-448a-86dc-38b4efab9921" (UID: "e548e2af-4eba-448a-86dc-38b4efab9921"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 07:50:34 crc kubenswrapper[4475]: I1203 07:50:34.710883 4475 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e548e2af-4eba-448a-86dc-38b4efab9921-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "e548e2af-4eba-448a-86dc-38b4efab9921" (UID: "e548e2af-4eba-448a-86dc-38b4efab9921"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 07:50:34 crc kubenswrapper[4475]: I1203 07:50:34.718070 4475 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e548e2af-4eba-448a-86dc-38b4efab9921-kube-api-access-rd9dl" (OuterVolumeSpecName: "kube-api-access-rd9dl") pod "e548e2af-4eba-448a-86dc-38b4efab9921" (UID: "e548e2af-4eba-448a-86dc-38b4efab9921"). InnerVolumeSpecName "kube-api-access-rd9dl". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 07:50:34 crc kubenswrapper[4475]: I1203 07:50:34.792355 4475 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e548e2af-4eba-448a-86dc-38b4efab9921-utilities\") on node \"crc\" DevicePath \"\"" Dec 03 07:50:34 crc kubenswrapper[4475]: I1203 07:50:34.792386 4475 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rd9dl\" (UniqueName: \"kubernetes.io/projected/e548e2af-4eba-448a-86dc-38b4efab9921-kube-api-access-rd9dl\") on node \"crc\" DevicePath \"\"" Dec 03 07:50:34 crc kubenswrapper[4475]: I1203 07:50:34.792396 4475 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e548e2af-4eba-448a-86dc-38b4efab9921-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 03 07:50:35 crc kubenswrapper[4475]: I1203 07:50:35.002356 4475 generic.go:334] "Generic (PLEG): container finished" podID="e548e2af-4eba-448a-86dc-38b4efab9921" containerID="854adce373bc945bd963d712fc1080223c39abf90b883325f362abe45000aac0" exitCode=0 Dec 03 07:50:35 crc kubenswrapper[4475]: I1203 07:50:35.002395 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-v7n6c" event={"ID":"e548e2af-4eba-448a-86dc-38b4efab9921","Type":"ContainerDied","Data":"854adce373bc945bd963d712fc1080223c39abf90b883325f362abe45000aac0"} Dec 03 07:50:35 crc kubenswrapper[4475]: I1203 07:50:35.002414 4475 util.go:48] "No ready 
sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-v7n6c" Dec 03 07:50:35 crc kubenswrapper[4475]: I1203 07:50:35.002435 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-v7n6c" event={"ID":"e548e2af-4eba-448a-86dc-38b4efab9921","Type":"ContainerDied","Data":"6ccebf5dacd24b820db4ba5553c15a00a96d47f76a2125aec4ef2a53a894108a"} Dec 03 07:50:35 crc kubenswrapper[4475]: I1203 07:50:35.002475 4475 scope.go:117] "RemoveContainer" containerID="854adce373bc945bd963d712fc1080223c39abf90b883325f362abe45000aac0" Dec 03 07:50:35 crc kubenswrapper[4475]: I1203 07:50:35.037722 4475 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-v7n6c"] Dec 03 07:50:35 crc kubenswrapper[4475]: I1203 07:50:35.054135 4475 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-v7n6c"] Dec 03 07:50:35 crc kubenswrapper[4475]: I1203 07:50:35.054361 4475 scope.go:117] "RemoveContainer" containerID="30e86da29a28fad924d0182b2f4257fc95ea24615b1c193c7576a473b9ef4c52" Dec 03 07:50:35 crc kubenswrapper[4475]: I1203 07:50:35.083171 4475 scope.go:117] "RemoveContainer" containerID="9f3ca28162c9b232ed8d21d95df3fd9507b366095f10a822eb9af43dd15670e8" Dec 03 07:50:35 crc kubenswrapper[4475]: I1203 07:50:35.118619 4475 scope.go:117] "RemoveContainer" containerID="854adce373bc945bd963d712fc1080223c39abf90b883325f362abe45000aac0" Dec 03 07:50:35 crc kubenswrapper[4475]: E1203 07:50:35.119025 4475 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"854adce373bc945bd963d712fc1080223c39abf90b883325f362abe45000aac0\": container with ID starting with 854adce373bc945bd963d712fc1080223c39abf90b883325f362abe45000aac0 not found: ID does not exist" containerID="854adce373bc945bd963d712fc1080223c39abf90b883325f362abe45000aac0" Dec 03 07:50:35 crc kubenswrapper[4475]: I1203 
07:50:35.119058 4475 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"854adce373bc945bd963d712fc1080223c39abf90b883325f362abe45000aac0"} err="failed to get container status \"854adce373bc945bd963d712fc1080223c39abf90b883325f362abe45000aac0\": rpc error: code = NotFound desc = could not find container \"854adce373bc945bd963d712fc1080223c39abf90b883325f362abe45000aac0\": container with ID starting with 854adce373bc945bd963d712fc1080223c39abf90b883325f362abe45000aac0 not found: ID does not exist" Dec 03 07:50:35 crc kubenswrapper[4475]: I1203 07:50:35.119080 4475 scope.go:117] "RemoveContainer" containerID="30e86da29a28fad924d0182b2f4257fc95ea24615b1c193c7576a473b9ef4c52" Dec 03 07:50:35 crc kubenswrapper[4475]: E1203 07:50:35.120087 4475 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"30e86da29a28fad924d0182b2f4257fc95ea24615b1c193c7576a473b9ef4c52\": container with ID starting with 30e86da29a28fad924d0182b2f4257fc95ea24615b1c193c7576a473b9ef4c52 not found: ID does not exist" containerID="30e86da29a28fad924d0182b2f4257fc95ea24615b1c193c7576a473b9ef4c52" Dec 03 07:50:35 crc kubenswrapper[4475]: I1203 07:50:35.120115 4475 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"30e86da29a28fad924d0182b2f4257fc95ea24615b1c193c7576a473b9ef4c52"} err="failed to get container status \"30e86da29a28fad924d0182b2f4257fc95ea24615b1c193c7576a473b9ef4c52\": rpc error: code = NotFound desc = could not find container \"30e86da29a28fad924d0182b2f4257fc95ea24615b1c193c7576a473b9ef4c52\": container with ID starting with 30e86da29a28fad924d0182b2f4257fc95ea24615b1c193c7576a473b9ef4c52 not found: ID does not exist" Dec 03 07:50:35 crc kubenswrapper[4475]: I1203 07:50:35.120129 4475 scope.go:117] "RemoveContainer" containerID="9f3ca28162c9b232ed8d21d95df3fd9507b366095f10a822eb9af43dd15670e8" Dec 03 07:50:35 crc 
kubenswrapper[4475]: E1203 07:50:35.120414 4475 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9f3ca28162c9b232ed8d21d95df3fd9507b366095f10a822eb9af43dd15670e8\": container with ID starting with 9f3ca28162c9b232ed8d21d95df3fd9507b366095f10a822eb9af43dd15670e8 not found: ID does not exist" containerID="9f3ca28162c9b232ed8d21d95df3fd9507b366095f10a822eb9af43dd15670e8" Dec 03 07:50:35 crc kubenswrapper[4475]: I1203 07:50:35.120482 4475 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9f3ca28162c9b232ed8d21d95df3fd9507b366095f10a822eb9af43dd15670e8"} err="failed to get container status \"9f3ca28162c9b232ed8d21d95df3fd9507b366095f10a822eb9af43dd15670e8\": rpc error: code = NotFound desc = could not find container \"9f3ca28162c9b232ed8d21d95df3fd9507b366095f10a822eb9af43dd15670e8\": container with ID starting with 9f3ca28162c9b232ed8d21d95df3fd9507b366095f10a822eb9af43dd15670e8 not found: ID does not exist" Dec 03 07:50:35 crc kubenswrapper[4475]: I1203 07:50:35.500174 4475 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e548e2af-4eba-448a-86dc-38b4efab9921" path="/var/lib/kubelet/pods/e548e2af-4eba-448a-86dc-38b4efab9921/volumes" Dec 03 07:50:47 crc kubenswrapper[4475]: I1203 07:50:47.491805 4475 scope.go:117] "RemoveContainer" containerID="7b0a38f9c544d8a2f59b2ca402561211c4a545eafe71b51ee524168e165ff3e4" Dec 03 07:50:47 crc kubenswrapper[4475]: E1203 07:50:47.492358 4475 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-tjbzg_openshift-machine-config-operator(91aee7be-4a52-4598-803f-2deebe0674de)\"" pod="openshift-machine-config-operator/machine-config-daemon-tjbzg" podUID="91aee7be-4a52-4598-803f-2deebe0674de" Dec 03 07:51:02 crc 
kubenswrapper[4475]: I1203 07:51:02.491793 4475 scope.go:117] "RemoveContainer" containerID="7b0a38f9c544d8a2f59b2ca402561211c4a545eafe71b51ee524168e165ff3e4" Dec 03 07:51:02 crc kubenswrapper[4475]: E1203 07:51:02.493350 4475 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-tjbzg_openshift-machine-config-operator(91aee7be-4a52-4598-803f-2deebe0674de)\"" pod="openshift-machine-config-operator/machine-config-daemon-tjbzg" podUID="91aee7be-4a52-4598-803f-2deebe0674de" Dec 03 07:51:15 crc kubenswrapper[4475]: I1203 07:51:15.496376 4475 scope.go:117] "RemoveContainer" containerID="7b0a38f9c544d8a2f59b2ca402561211c4a545eafe71b51ee524168e165ff3e4" Dec 03 07:51:15 crc kubenswrapper[4475]: E1203 07:51:15.497116 4475 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-tjbzg_openshift-machine-config-operator(91aee7be-4a52-4598-803f-2deebe0674de)\"" pod="openshift-machine-config-operator/machine-config-daemon-tjbzg" podUID="91aee7be-4a52-4598-803f-2deebe0674de" Dec 03 07:51:27 crc kubenswrapper[4475]: I1203 07:51:27.491617 4475 scope.go:117] "RemoveContainer" containerID="7b0a38f9c544d8a2f59b2ca402561211c4a545eafe71b51ee524168e165ff3e4" Dec 03 07:51:27 crc kubenswrapper[4475]: E1203 07:51:27.492199 4475 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-tjbzg_openshift-machine-config-operator(91aee7be-4a52-4598-803f-2deebe0674de)\"" pod="openshift-machine-config-operator/machine-config-daemon-tjbzg" podUID="91aee7be-4a52-4598-803f-2deebe0674de" Dec 
03 07:51:40 crc kubenswrapper[4475]: I1203 07:51:40.491555 4475 scope.go:117] "RemoveContainer" containerID="7b0a38f9c544d8a2f59b2ca402561211c4a545eafe71b51ee524168e165ff3e4" Dec 03 07:51:40 crc kubenswrapper[4475]: E1203 07:51:40.492147 4475 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-tjbzg_openshift-machine-config-operator(91aee7be-4a52-4598-803f-2deebe0674de)\"" pod="openshift-machine-config-operator/machine-config-daemon-tjbzg" podUID="91aee7be-4a52-4598-803f-2deebe0674de" Dec 03 07:51:51 crc kubenswrapper[4475]: I1203 07:51:51.491780 4475 scope.go:117] "RemoveContainer" containerID="7b0a38f9c544d8a2f59b2ca402561211c4a545eafe71b51ee524168e165ff3e4" Dec 03 07:51:51 crc kubenswrapper[4475]: E1203 07:51:51.492339 4475 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-tjbzg_openshift-machine-config-operator(91aee7be-4a52-4598-803f-2deebe0674de)\"" pod="openshift-machine-config-operator/machine-config-daemon-tjbzg" podUID="91aee7be-4a52-4598-803f-2deebe0674de" Dec 03 07:52:03 crc kubenswrapper[4475]: I1203 07:52:03.490706 4475 scope.go:117] "RemoveContainer" containerID="7b0a38f9c544d8a2f59b2ca402561211c4a545eafe71b51ee524168e165ff3e4" Dec 03 07:52:03 crc kubenswrapper[4475]: E1203 07:52:03.491445 4475 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-tjbzg_openshift-machine-config-operator(91aee7be-4a52-4598-803f-2deebe0674de)\"" pod="openshift-machine-config-operator/machine-config-daemon-tjbzg" 
podUID="91aee7be-4a52-4598-803f-2deebe0674de" Dec 03 07:52:16 crc kubenswrapper[4475]: I1203 07:52:16.491477 4475 scope.go:117] "RemoveContainer" containerID="7b0a38f9c544d8a2f59b2ca402561211c4a545eafe71b51ee524168e165ff3e4" Dec 03 07:52:16 crc kubenswrapper[4475]: E1203 07:52:16.491969 4475 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-tjbzg_openshift-machine-config-operator(91aee7be-4a52-4598-803f-2deebe0674de)\"" pod="openshift-machine-config-operator/machine-config-daemon-tjbzg" podUID="91aee7be-4a52-4598-803f-2deebe0674de" Dec 03 07:52:29 crc kubenswrapper[4475]: I1203 07:52:29.491496 4475 scope.go:117] "RemoveContainer" containerID="7b0a38f9c544d8a2f59b2ca402561211c4a545eafe71b51ee524168e165ff3e4" Dec 03 07:52:29 crc kubenswrapper[4475]: E1203 07:52:29.492229 4475 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-tjbzg_openshift-machine-config-operator(91aee7be-4a52-4598-803f-2deebe0674de)\"" pod="openshift-machine-config-operator/machine-config-daemon-tjbzg" podUID="91aee7be-4a52-4598-803f-2deebe0674de" Dec 03 07:52:41 crc kubenswrapper[4475]: I1203 07:52:41.495044 4475 scope.go:117] "RemoveContainer" containerID="7b0a38f9c544d8a2f59b2ca402561211c4a545eafe71b51ee524168e165ff3e4" Dec 03 07:52:41 crc kubenswrapper[4475]: E1203 07:52:41.495637 4475 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-tjbzg_openshift-machine-config-operator(91aee7be-4a52-4598-803f-2deebe0674de)\"" 
pod="openshift-machine-config-operator/machine-config-daemon-tjbzg" podUID="91aee7be-4a52-4598-803f-2deebe0674de" Dec 03 07:52:53 crc kubenswrapper[4475]: I1203 07:52:53.491682 4475 scope.go:117] "RemoveContainer" containerID="7b0a38f9c544d8a2f59b2ca402561211c4a545eafe71b51ee524168e165ff3e4" Dec 03 07:52:53 crc kubenswrapper[4475]: E1203 07:52:53.492258 4475 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-tjbzg_openshift-machine-config-operator(91aee7be-4a52-4598-803f-2deebe0674de)\"" pod="openshift-machine-config-operator/machine-config-daemon-tjbzg" podUID="91aee7be-4a52-4598-803f-2deebe0674de" Dec 03 07:53:06 crc kubenswrapper[4475]: I1203 07:53:06.491492 4475 scope.go:117] "RemoveContainer" containerID="7b0a38f9c544d8a2f59b2ca402561211c4a545eafe71b51ee524168e165ff3e4" Dec 03 07:53:06 crc kubenswrapper[4475]: E1203 07:53:06.491989 4475 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-tjbzg_openshift-machine-config-operator(91aee7be-4a52-4598-803f-2deebe0674de)\"" pod="openshift-machine-config-operator/machine-config-daemon-tjbzg" podUID="91aee7be-4a52-4598-803f-2deebe0674de" Dec 03 07:53:20 crc kubenswrapper[4475]: I1203 07:53:20.491752 4475 scope.go:117] "RemoveContainer" containerID="7b0a38f9c544d8a2f59b2ca402561211c4a545eafe71b51ee524168e165ff3e4" Dec 03 07:53:20 crc kubenswrapper[4475]: E1203 07:53:20.492672 4475 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-tjbzg_openshift-machine-config-operator(91aee7be-4a52-4598-803f-2deebe0674de)\"" pod="openshift-machine-config-operator/machine-config-daemon-tjbzg" podUID="91aee7be-4a52-4598-803f-2deebe0674de" Dec 03 07:53:35 crc kubenswrapper[4475]: I1203 07:53:35.495660 4475 scope.go:117] "RemoveContainer" containerID="7b0a38f9c544d8a2f59b2ca402561211c4a545eafe71b51ee524168e165ff3e4" Dec 03 07:53:35 crc kubenswrapper[4475]: E1203 07:53:35.496268 4475 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-tjbzg_openshift-machine-config-operator(91aee7be-4a52-4598-803f-2deebe0674de)\"" pod="openshift-machine-config-operator/machine-config-daemon-tjbzg" podUID="91aee7be-4a52-4598-803f-2deebe0674de" Dec 03 07:53:48 crc kubenswrapper[4475]: I1203 07:53:48.491085 4475 scope.go:117] "RemoveContainer" containerID="7b0a38f9c544d8a2f59b2ca402561211c4a545eafe71b51ee524168e165ff3e4" Dec 03 07:53:48 crc kubenswrapper[4475]: E1203 07:53:48.491819 4475 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-tjbzg_openshift-machine-config-operator(91aee7be-4a52-4598-803f-2deebe0674de)\"" pod="openshift-machine-config-operator/machine-config-daemon-tjbzg" podUID="91aee7be-4a52-4598-803f-2deebe0674de" Dec 03 07:54:00 crc kubenswrapper[4475]: I1203 07:54:00.491235 4475 scope.go:117] "RemoveContainer" containerID="7b0a38f9c544d8a2f59b2ca402561211c4a545eafe71b51ee524168e165ff3e4" Dec 03 07:54:00 crc kubenswrapper[4475]: E1203 07:54:00.491765 4475 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed 
container=machine-config-daemon pod=machine-config-daemon-tjbzg_openshift-machine-config-operator(91aee7be-4a52-4598-803f-2deebe0674de)\"" pod="openshift-machine-config-operator/machine-config-daemon-tjbzg" podUID="91aee7be-4a52-4598-803f-2deebe0674de" Dec 03 07:54:12 crc kubenswrapper[4475]: I1203 07:54:12.491367 4475 scope.go:117] "RemoveContainer" containerID="7b0a38f9c544d8a2f59b2ca402561211c4a545eafe71b51ee524168e165ff3e4" Dec 03 07:54:12 crc kubenswrapper[4475]: E1203 07:54:12.492099 4475 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-tjbzg_openshift-machine-config-operator(91aee7be-4a52-4598-803f-2deebe0674de)\"" pod="openshift-machine-config-operator/machine-config-daemon-tjbzg" podUID="91aee7be-4a52-4598-803f-2deebe0674de" Dec 03 07:54:26 crc kubenswrapper[4475]: I1203 07:54:26.491632 4475 scope.go:117] "RemoveContainer" containerID="7b0a38f9c544d8a2f59b2ca402561211c4a545eafe71b51ee524168e165ff3e4" Dec 03 07:54:26 crc kubenswrapper[4475]: E1203 07:54:26.492112 4475 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-tjbzg_openshift-machine-config-operator(91aee7be-4a52-4598-803f-2deebe0674de)\"" pod="openshift-machine-config-operator/machine-config-daemon-tjbzg" podUID="91aee7be-4a52-4598-803f-2deebe0674de" Dec 03 07:54:37 crc kubenswrapper[4475]: I1203 07:54:37.491830 4475 scope.go:117] "RemoveContainer" containerID="7b0a38f9c544d8a2f59b2ca402561211c4a545eafe71b51ee524168e165ff3e4" Dec 03 07:54:38 crc kubenswrapper[4475]: I1203 07:54:38.540944 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-tjbzg" 
event={"ID":"91aee7be-4a52-4598-803f-2deebe0674de","Type":"ContainerStarted","Data":"5fd6239910e877dca8b0b4140c098fd399692464452ceed3964f6b7732bcf773"} Dec 03 07:54:57 crc kubenswrapper[4475]: I1203 07:54:57.012178 4475 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-v6pfm"] Dec 03 07:54:57 crc kubenswrapper[4475]: E1203 07:54:57.040678 4475 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e548e2af-4eba-448a-86dc-38b4efab9921" containerName="registry-server" Dec 03 07:54:57 crc kubenswrapper[4475]: I1203 07:54:57.040714 4475 state_mem.go:107] "Deleted CPUSet assignment" podUID="e548e2af-4eba-448a-86dc-38b4efab9921" containerName="registry-server" Dec 03 07:54:57 crc kubenswrapper[4475]: E1203 07:54:57.040734 4475 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e548e2af-4eba-448a-86dc-38b4efab9921" containerName="extract-utilities" Dec 03 07:54:57 crc kubenswrapper[4475]: I1203 07:54:57.040740 4475 state_mem.go:107] "Deleted CPUSet assignment" podUID="e548e2af-4eba-448a-86dc-38b4efab9921" containerName="extract-utilities" Dec 03 07:54:57 crc kubenswrapper[4475]: E1203 07:54:57.040761 4475 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e548e2af-4eba-448a-86dc-38b4efab9921" containerName="extract-content" Dec 03 07:54:57 crc kubenswrapper[4475]: I1203 07:54:57.040767 4475 state_mem.go:107] "Deleted CPUSet assignment" podUID="e548e2af-4eba-448a-86dc-38b4efab9921" containerName="extract-content" Dec 03 07:54:57 crc kubenswrapper[4475]: I1203 07:54:57.041016 4475 memory_manager.go:354] "RemoveStaleState removing state" podUID="e548e2af-4eba-448a-86dc-38b4efab9921" containerName="registry-server" Dec 03 07:54:57 crc kubenswrapper[4475]: I1203 07:54:57.050429 4475 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-v6pfm"] Dec 03 07:54:57 crc kubenswrapper[4475]: I1203 07:54:57.050546 4475 util.go:30] "No sandbox for pod can be 
found. Need to start a new one" pod="openshift-marketplace/redhat-operators-v6pfm" Dec 03 07:54:57 crc kubenswrapper[4475]: I1203 07:54:57.127553 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z8grm\" (UniqueName: \"kubernetes.io/projected/809cde9f-03e6-4db5-93ac-3f275a88fa61-kube-api-access-z8grm\") pod \"redhat-operators-v6pfm\" (UID: \"809cde9f-03e6-4db5-93ac-3f275a88fa61\") " pod="openshift-marketplace/redhat-operators-v6pfm" Dec 03 07:54:57 crc kubenswrapper[4475]: I1203 07:54:57.127723 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/809cde9f-03e6-4db5-93ac-3f275a88fa61-catalog-content\") pod \"redhat-operators-v6pfm\" (UID: \"809cde9f-03e6-4db5-93ac-3f275a88fa61\") " pod="openshift-marketplace/redhat-operators-v6pfm" Dec 03 07:54:57 crc kubenswrapper[4475]: I1203 07:54:57.127825 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/809cde9f-03e6-4db5-93ac-3f275a88fa61-utilities\") pod \"redhat-operators-v6pfm\" (UID: \"809cde9f-03e6-4db5-93ac-3f275a88fa61\") " pod="openshift-marketplace/redhat-operators-v6pfm" Dec 03 07:54:57 crc kubenswrapper[4475]: I1203 07:54:57.229021 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/809cde9f-03e6-4db5-93ac-3f275a88fa61-catalog-content\") pod \"redhat-operators-v6pfm\" (UID: \"809cde9f-03e6-4db5-93ac-3f275a88fa61\") " pod="openshift-marketplace/redhat-operators-v6pfm" Dec 03 07:54:57 crc kubenswrapper[4475]: I1203 07:54:57.229105 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/809cde9f-03e6-4db5-93ac-3f275a88fa61-utilities\") pod \"redhat-operators-v6pfm\" (UID: 
\"809cde9f-03e6-4db5-93ac-3f275a88fa61\") " pod="openshift-marketplace/redhat-operators-v6pfm" Dec 03 07:54:57 crc kubenswrapper[4475]: I1203 07:54:57.229195 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-z8grm\" (UniqueName: \"kubernetes.io/projected/809cde9f-03e6-4db5-93ac-3f275a88fa61-kube-api-access-z8grm\") pod \"redhat-operators-v6pfm\" (UID: \"809cde9f-03e6-4db5-93ac-3f275a88fa61\") " pod="openshift-marketplace/redhat-operators-v6pfm" Dec 03 07:54:57 crc kubenswrapper[4475]: I1203 07:54:57.231293 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/809cde9f-03e6-4db5-93ac-3f275a88fa61-catalog-content\") pod \"redhat-operators-v6pfm\" (UID: \"809cde9f-03e6-4db5-93ac-3f275a88fa61\") " pod="openshift-marketplace/redhat-operators-v6pfm" Dec 03 07:54:57 crc kubenswrapper[4475]: I1203 07:54:57.231318 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/809cde9f-03e6-4db5-93ac-3f275a88fa61-utilities\") pod \"redhat-operators-v6pfm\" (UID: \"809cde9f-03e6-4db5-93ac-3f275a88fa61\") " pod="openshift-marketplace/redhat-operators-v6pfm" Dec 03 07:54:57 crc kubenswrapper[4475]: I1203 07:54:57.257476 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-z8grm\" (UniqueName: \"kubernetes.io/projected/809cde9f-03e6-4db5-93ac-3f275a88fa61-kube-api-access-z8grm\") pod \"redhat-operators-v6pfm\" (UID: \"809cde9f-03e6-4db5-93ac-3f275a88fa61\") " pod="openshift-marketplace/redhat-operators-v6pfm" Dec 03 07:54:57 crc kubenswrapper[4475]: I1203 07:54:57.370364 4475 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-v6pfm" Dec 03 07:54:58 crc kubenswrapper[4475]: I1203 07:54:58.133782 4475 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-v6pfm"] Dec 03 07:54:58 crc kubenswrapper[4475]: W1203 07:54:58.143697 4475 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod809cde9f_03e6_4db5_93ac_3f275a88fa61.slice/crio-dabea60eb6052d8a603ba00d61d8574d5b7be66933623d8d0f66cad36b41f684 WatchSource:0}: Error finding container dabea60eb6052d8a603ba00d61d8574d5b7be66933623d8d0f66cad36b41f684: Status 404 returned error can't find the container with id dabea60eb6052d8a603ba00d61d8574d5b7be66933623d8d0f66cad36b41f684 Dec 03 07:54:58 crc kubenswrapper[4475]: I1203 07:54:58.699442 4475 generic.go:334] "Generic (PLEG): container finished" podID="809cde9f-03e6-4db5-93ac-3f275a88fa61" containerID="e14265f585739d746d80c8d083abc585012f1162e4ce2b07bd3deadaa4105433" exitCode=0 Dec 03 07:54:58 crc kubenswrapper[4475]: I1203 07:54:58.699896 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-v6pfm" event={"ID":"809cde9f-03e6-4db5-93ac-3f275a88fa61","Type":"ContainerDied","Data":"e14265f585739d746d80c8d083abc585012f1162e4ce2b07bd3deadaa4105433"} Dec 03 07:54:58 crc kubenswrapper[4475]: I1203 07:54:58.699922 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-v6pfm" event={"ID":"809cde9f-03e6-4db5-93ac-3f275a88fa61","Type":"ContainerStarted","Data":"dabea60eb6052d8a603ba00d61d8574d5b7be66933623d8d0f66cad36b41f684"} Dec 03 07:55:00 crc kubenswrapper[4475]: I1203 07:55:00.713862 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-v6pfm" 
event={"ID":"809cde9f-03e6-4db5-93ac-3f275a88fa61","Type":"ContainerStarted","Data":"c1c97407811078c0e266fdd1e0f736aac57aa981787730ff33a05e43cb25f791"} Dec 03 07:55:02 crc kubenswrapper[4475]: I1203 07:55:02.728164 4475 generic.go:334] "Generic (PLEG): container finished" podID="809cde9f-03e6-4db5-93ac-3f275a88fa61" containerID="c1c97407811078c0e266fdd1e0f736aac57aa981787730ff33a05e43cb25f791" exitCode=0 Dec 03 07:55:02 crc kubenswrapper[4475]: I1203 07:55:02.728242 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-v6pfm" event={"ID":"809cde9f-03e6-4db5-93ac-3f275a88fa61","Type":"ContainerDied","Data":"c1c97407811078c0e266fdd1e0f736aac57aa981787730ff33a05e43cb25f791"} Dec 03 07:55:03 crc kubenswrapper[4475]: I1203 07:55:03.736198 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-v6pfm" event={"ID":"809cde9f-03e6-4db5-93ac-3f275a88fa61","Type":"ContainerStarted","Data":"8ee6c4eac472f4f2ba40b5e66c12b18bb117e01c7d4c645ff8993119037956c2"} Dec 03 07:55:03 crc kubenswrapper[4475]: I1203 07:55:03.755711 4475 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-v6pfm" podStartSLOduration=3.233162785 podStartE2EDuration="7.754778068s" podCreationTimestamp="2025-12-03 07:54:56 +0000 UTC" firstStartedPulling="2025-12-03 07:54:58.702869458 +0000 UTC m=+4183.507767792" lastFinishedPulling="2025-12-03 07:55:03.224484741 +0000 UTC m=+4188.029383075" observedRunningTime="2025-12-03 07:55:03.749111274 +0000 UTC m=+4188.554009598" watchObservedRunningTime="2025-12-03 07:55:03.754778068 +0000 UTC m=+4188.559676401" Dec 03 07:55:07 crc kubenswrapper[4475]: I1203 07:55:07.370959 4475 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-v6pfm" Dec 03 07:55:07 crc kubenswrapper[4475]: I1203 07:55:07.371163 4475 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" 
status="" pod="openshift-marketplace/redhat-operators-v6pfm" Dec 03 07:55:08 crc kubenswrapper[4475]: I1203 07:55:08.407013 4475 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-v6pfm" podUID="809cde9f-03e6-4db5-93ac-3f275a88fa61" containerName="registry-server" probeResult="failure" output=< Dec 03 07:55:08 crc kubenswrapper[4475]: timeout: failed to connect service ":50051" within 1s Dec 03 07:55:08 crc kubenswrapper[4475]: > Dec 03 07:55:17 crc kubenswrapper[4475]: I1203 07:55:17.405921 4475 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-v6pfm" Dec 03 07:55:17 crc kubenswrapper[4475]: I1203 07:55:17.449434 4475 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-v6pfm" Dec 03 07:55:17 crc kubenswrapper[4475]: I1203 07:55:17.649434 4475 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-v6pfm"] Dec 03 07:55:18 crc kubenswrapper[4475]: I1203 07:55:18.833102 4475 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-v6pfm" podUID="809cde9f-03e6-4db5-93ac-3f275a88fa61" containerName="registry-server" containerID="cri-o://8ee6c4eac472f4f2ba40b5e66c12b18bb117e01c7d4c645ff8993119037956c2" gracePeriod=2 Dec 03 07:55:19 crc kubenswrapper[4475]: I1203 07:55:19.380904 4475 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-v6pfm" Dec 03 07:55:19 crc kubenswrapper[4475]: I1203 07:55:19.403594 4475 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/809cde9f-03e6-4db5-93ac-3f275a88fa61-catalog-content\") pod \"809cde9f-03e6-4db5-93ac-3f275a88fa61\" (UID: \"809cde9f-03e6-4db5-93ac-3f275a88fa61\") " Dec 03 07:55:19 crc kubenswrapper[4475]: I1203 07:55:19.403695 4475 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-z8grm\" (UniqueName: \"kubernetes.io/projected/809cde9f-03e6-4db5-93ac-3f275a88fa61-kube-api-access-z8grm\") pod \"809cde9f-03e6-4db5-93ac-3f275a88fa61\" (UID: \"809cde9f-03e6-4db5-93ac-3f275a88fa61\") " Dec 03 07:55:19 crc kubenswrapper[4475]: I1203 07:55:19.403797 4475 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/809cde9f-03e6-4db5-93ac-3f275a88fa61-utilities\") pod \"809cde9f-03e6-4db5-93ac-3f275a88fa61\" (UID: \"809cde9f-03e6-4db5-93ac-3f275a88fa61\") " Dec 03 07:55:19 crc kubenswrapper[4475]: I1203 07:55:19.407021 4475 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/809cde9f-03e6-4db5-93ac-3f275a88fa61-utilities" (OuterVolumeSpecName: "utilities") pod "809cde9f-03e6-4db5-93ac-3f275a88fa61" (UID: "809cde9f-03e6-4db5-93ac-3f275a88fa61"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 07:55:19 crc kubenswrapper[4475]: I1203 07:55:19.419987 4475 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/809cde9f-03e6-4db5-93ac-3f275a88fa61-kube-api-access-z8grm" (OuterVolumeSpecName: "kube-api-access-z8grm") pod "809cde9f-03e6-4db5-93ac-3f275a88fa61" (UID: "809cde9f-03e6-4db5-93ac-3f275a88fa61"). InnerVolumeSpecName "kube-api-access-z8grm". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 07:55:19 crc kubenswrapper[4475]: I1203 07:55:19.503689 4475 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/809cde9f-03e6-4db5-93ac-3f275a88fa61-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "809cde9f-03e6-4db5-93ac-3f275a88fa61" (UID: "809cde9f-03e6-4db5-93ac-3f275a88fa61"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 07:55:19 crc kubenswrapper[4475]: I1203 07:55:19.506070 4475 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/809cde9f-03e6-4db5-93ac-3f275a88fa61-utilities\") on node \"crc\" DevicePath \"\"" Dec 03 07:55:19 crc kubenswrapper[4475]: I1203 07:55:19.506092 4475 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/809cde9f-03e6-4db5-93ac-3f275a88fa61-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 03 07:55:19 crc kubenswrapper[4475]: I1203 07:55:19.506103 4475 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-z8grm\" (UniqueName: \"kubernetes.io/projected/809cde9f-03e6-4db5-93ac-3f275a88fa61-kube-api-access-z8grm\") on node \"crc\" DevicePath \"\"" Dec 03 07:55:19 crc kubenswrapper[4475]: I1203 07:55:19.839391 4475 generic.go:334] "Generic (PLEG): container finished" podID="809cde9f-03e6-4db5-93ac-3f275a88fa61" containerID="8ee6c4eac472f4f2ba40b5e66c12b18bb117e01c7d4c645ff8993119037956c2" exitCode=0 Dec 03 07:55:19 crc kubenswrapper[4475]: I1203 07:55:19.839474 4475 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-v6pfm" Dec 03 07:55:19 crc kubenswrapper[4475]: I1203 07:55:19.839493 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-v6pfm" event={"ID":"809cde9f-03e6-4db5-93ac-3f275a88fa61","Type":"ContainerDied","Data":"8ee6c4eac472f4f2ba40b5e66c12b18bb117e01c7d4c645ff8993119037956c2"} Dec 03 07:55:19 crc kubenswrapper[4475]: I1203 07:55:19.840154 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-v6pfm" event={"ID":"809cde9f-03e6-4db5-93ac-3f275a88fa61","Type":"ContainerDied","Data":"dabea60eb6052d8a603ba00d61d8574d5b7be66933623d8d0f66cad36b41f684"} Dec 03 07:55:19 crc kubenswrapper[4475]: I1203 07:55:19.840777 4475 scope.go:117] "RemoveContainer" containerID="8ee6c4eac472f4f2ba40b5e66c12b18bb117e01c7d4c645ff8993119037956c2" Dec 03 07:55:19 crc kubenswrapper[4475]: I1203 07:55:19.873133 4475 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-v6pfm"] Dec 03 07:55:19 crc kubenswrapper[4475]: I1203 07:55:19.874367 4475 scope.go:117] "RemoveContainer" containerID="c1c97407811078c0e266fdd1e0f736aac57aa981787730ff33a05e43cb25f791" Dec 03 07:55:19 crc kubenswrapper[4475]: I1203 07:55:19.879928 4475 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-v6pfm"] Dec 03 07:55:19 crc kubenswrapper[4475]: I1203 07:55:19.893969 4475 scope.go:117] "RemoveContainer" containerID="e14265f585739d746d80c8d083abc585012f1162e4ce2b07bd3deadaa4105433" Dec 03 07:55:19 crc kubenswrapper[4475]: I1203 07:55:19.925517 4475 scope.go:117] "RemoveContainer" containerID="8ee6c4eac472f4f2ba40b5e66c12b18bb117e01c7d4c645ff8993119037956c2" Dec 03 07:55:19 crc kubenswrapper[4475]: E1203 07:55:19.927412 4475 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"8ee6c4eac472f4f2ba40b5e66c12b18bb117e01c7d4c645ff8993119037956c2\": container with ID starting with 8ee6c4eac472f4f2ba40b5e66c12b18bb117e01c7d4c645ff8993119037956c2 not found: ID does not exist" containerID="8ee6c4eac472f4f2ba40b5e66c12b18bb117e01c7d4c645ff8993119037956c2" Dec 03 07:55:19 crc kubenswrapper[4475]: I1203 07:55:19.927856 4475 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8ee6c4eac472f4f2ba40b5e66c12b18bb117e01c7d4c645ff8993119037956c2"} err="failed to get container status \"8ee6c4eac472f4f2ba40b5e66c12b18bb117e01c7d4c645ff8993119037956c2\": rpc error: code = NotFound desc = could not find container \"8ee6c4eac472f4f2ba40b5e66c12b18bb117e01c7d4c645ff8993119037956c2\": container with ID starting with 8ee6c4eac472f4f2ba40b5e66c12b18bb117e01c7d4c645ff8993119037956c2 not found: ID does not exist" Dec 03 07:55:19 crc kubenswrapper[4475]: I1203 07:55:19.927893 4475 scope.go:117] "RemoveContainer" containerID="c1c97407811078c0e266fdd1e0f736aac57aa981787730ff33a05e43cb25f791" Dec 03 07:55:19 crc kubenswrapper[4475]: E1203 07:55:19.928390 4475 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c1c97407811078c0e266fdd1e0f736aac57aa981787730ff33a05e43cb25f791\": container with ID starting with c1c97407811078c0e266fdd1e0f736aac57aa981787730ff33a05e43cb25f791 not found: ID does not exist" containerID="c1c97407811078c0e266fdd1e0f736aac57aa981787730ff33a05e43cb25f791" Dec 03 07:55:19 crc kubenswrapper[4475]: I1203 07:55:19.928427 4475 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c1c97407811078c0e266fdd1e0f736aac57aa981787730ff33a05e43cb25f791"} err="failed to get container status \"c1c97407811078c0e266fdd1e0f736aac57aa981787730ff33a05e43cb25f791\": rpc error: code = NotFound desc = could not find container \"c1c97407811078c0e266fdd1e0f736aac57aa981787730ff33a05e43cb25f791\": container with ID 
starting with c1c97407811078c0e266fdd1e0f736aac57aa981787730ff33a05e43cb25f791 not found: ID does not exist" Dec 03 07:55:19 crc kubenswrapper[4475]: I1203 07:55:19.928442 4475 scope.go:117] "RemoveContainer" containerID="e14265f585739d746d80c8d083abc585012f1162e4ce2b07bd3deadaa4105433" Dec 03 07:55:19 crc kubenswrapper[4475]: E1203 07:55:19.928856 4475 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e14265f585739d746d80c8d083abc585012f1162e4ce2b07bd3deadaa4105433\": container with ID starting with e14265f585739d746d80c8d083abc585012f1162e4ce2b07bd3deadaa4105433 not found: ID does not exist" containerID="e14265f585739d746d80c8d083abc585012f1162e4ce2b07bd3deadaa4105433" Dec 03 07:55:19 crc kubenswrapper[4475]: I1203 07:55:19.928877 4475 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e14265f585739d746d80c8d083abc585012f1162e4ce2b07bd3deadaa4105433"} err="failed to get container status \"e14265f585739d746d80c8d083abc585012f1162e4ce2b07bd3deadaa4105433\": rpc error: code = NotFound desc = could not find container \"e14265f585739d746d80c8d083abc585012f1162e4ce2b07bd3deadaa4105433\": container with ID starting with e14265f585739d746d80c8d083abc585012f1162e4ce2b07bd3deadaa4105433 not found: ID does not exist" Dec 03 07:55:21 crc kubenswrapper[4475]: I1203 07:55:21.498928 4475 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="809cde9f-03e6-4db5-93ac-3f275a88fa61" path="/var/lib/kubelet/pods/809cde9f-03e6-4db5-93ac-3f275a88fa61/volumes" Dec 03 07:55:50 crc kubenswrapper[4475]: I1203 07:55:50.055398 4475 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-72d6l"] Dec 03 07:55:50 crc kubenswrapper[4475]: E1203 07:55:50.057218 4475 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="809cde9f-03e6-4db5-93ac-3f275a88fa61" containerName="extract-utilities" Dec 03 07:55:50 crc 
kubenswrapper[4475]: I1203 07:55:50.057331 4475 state_mem.go:107] "Deleted CPUSet assignment" podUID="809cde9f-03e6-4db5-93ac-3f275a88fa61" containerName="extract-utilities" Dec 03 07:55:50 crc kubenswrapper[4475]: E1203 07:55:50.057393 4475 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="809cde9f-03e6-4db5-93ac-3f275a88fa61" containerName="extract-content" Dec 03 07:55:50 crc kubenswrapper[4475]: I1203 07:55:50.057524 4475 state_mem.go:107] "Deleted CPUSet assignment" podUID="809cde9f-03e6-4db5-93ac-3f275a88fa61" containerName="extract-content" Dec 03 07:55:50 crc kubenswrapper[4475]: E1203 07:55:50.057620 4475 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="809cde9f-03e6-4db5-93ac-3f275a88fa61" containerName="registry-server" Dec 03 07:55:50 crc kubenswrapper[4475]: I1203 07:55:50.057670 4475 state_mem.go:107] "Deleted CPUSet assignment" podUID="809cde9f-03e6-4db5-93ac-3f275a88fa61" containerName="registry-server" Dec 03 07:55:50 crc kubenswrapper[4475]: I1203 07:55:50.058416 4475 memory_manager.go:354] "RemoveStaleState removing state" podUID="809cde9f-03e6-4db5-93ac-3f275a88fa61" containerName="registry-server" Dec 03 07:55:50 crc kubenswrapper[4475]: I1203 07:55:50.060665 4475 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-72d6l" Dec 03 07:55:50 crc kubenswrapper[4475]: I1203 07:55:50.062861 4475 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-72d6l"] Dec 03 07:55:50 crc kubenswrapper[4475]: I1203 07:55:50.187271 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1c8236de-45ea-46d1-9813-2c73458866a1-catalog-content\") pod \"community-operators-72d6l\" (UID: \"1c8236de-45ea-46d1-9813-2c73458866a1\") " pod="openshift-marketplace/community-operators-72d6l" Dec 03 07:55:50 crc kubenswrapper[4475]: I1203 07:55:50.187471 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l2rbz\" (UniqueName: \"kubernetes.io/projected/1c8236de-45ea-46d1-9813-2c73458866a1-kube-api-access-l2rbz\") pod \"community-operators-72d6l\" (UID: \"1c8236de-45ea-46d1-9813-2c73458866a1\") " pod="openshift-marketplace/community-operators-72d6l" Dec 03 07:55:50 crc kubenswrapper[4475]: I1203 07:55:50.187743 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1c8236de-45ea-46d1-9813-2c73458866a1-utilities\") pod \"community-operators-72d6l\" (UID: \"1c8236de-45ea-46d1-9813-2c73458866a1\") " pod="openshift-marketplace/community-operators-72d6l" Dec 03 07:55:50 crc kubenswrapper[4475]: I1203 07:55:50.288744 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1c8236de-45ea-46d1-9813-2c73458866a1-catalog-content\") pod \"community-operators-72d6l\" (UID: \"1c8236de-45ea-46d1-9813-2c73458866a1\") " pod="openshift-marketplace/community-operators-72d6l" Dec 03 07:55:50 crc kubenswrapper[4475]: I1203 07:55:50.288816 4475 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-l2rbz\" (UniqueName: \"kubernetes.io/projected/1c8236de-45ea-46d1-9813-2c73458866a1-kube-api-access-l2rbz\") pod \"community-operators-72d6l\" (UID: \"1c8236de-45ea-46d1-9813-2c73458866a1\") " pod="openshift-marketplace/community-operators-72d6l" Dec 03 07:55:50 crc kubenswrapper[4475]: I1203 07:55:50.288909 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1c8236de-45ea-46d1-9813-2c73458866a1-utilities\") pod \"community-operators-72d6l\" (UID: \"1c8236de-45ea-46d1-9813-2c73458866a1\") " pod="openshift-marketplace/community-operators-72d6l" Dec 03 07:55:50 crc kubenswrapper[4475]: I1203 07:55:50.289118 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1c8236de-45ea-46d1-9813-2c73458866a1-catalog-content\") pod \"community-operators-72d6l\" (UID: \"1c8236de-45ea-46d1-9813-2c73458866a1\") " pod="openshift-marketplace/community-operators-72d6l" Dec 03 07:55:50 crc kubenswrapper[4475]: I1203 07:55:50.289224 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1c8236de-45ea-46d1-9813-2c73458866a1-utilities\") pod \"community-operators-72d6l\" (UID: \"1c8236de-45ea-46d1-9813-2c73458866a1\") " pod="openshift-marketplace/community-operators-72d6l" Dec 03 07:55:50 crc kubenswrapper[4475]: I1203 07:55:50.307174 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-l2rbz\" (UniqueName: \"kubernetes.io/projected/1c8236de-45ea-46d1-9813-2c73458866a1-kube-api-access-l2rbz\") pod \"community-operators-72d6l\" (UID: \"1c8236de-45ea-46d1-9813-2c73458866a1\") " pod="openshift-marketplace/community-operators-72d6l" Dec 03 07:55:50 crc kubenswrapper[4475]: I1203 07:55:50.387663 4475 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-72d6l" Dec 03 07:55:50 crc kubenswrapper[4475]: I1203 07:55:50.763037 4475 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-72d6l"] Dec 03 07:55:50 crc kubenswrapper[4475]: W1203 07:55:50.767727 4475 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1c8236de_45ea_46d1_9813_2c73458866a1.slice/crio-84bb18349b31b72d70133047c831eb0244e1e0ec61e8ecd7e35ca5d0e77c4494 WatchSource:0}: Error finding container 84bb18349b31b72d70133047c831eb0244e1e0ec61e8ecd7e35ca5d0e77c4494: Status 404 returned error can't find the container with id 84bb18349b31b72d70133047c831eb0244e1e0ec61e8ecd7e35ca5d0e77c4494 Dec 03 07:55:51 crc kubenswrapper[4475]: I1203 07:55:51.046257 4475 generic.go:334] "Generic (PLEG): container finished" podID="1c8236de-45ea-46d1-9813-2c73458866a1" containerID="230711e06da86dff57ba320619dfa5914f03d88c2c3dd20bc2eb47d1fb5877df" exitCode=0 Dec 03 07:55:51 crc kubenswrapper[4475]: I1203 07:55:51.046500 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-72d6l" event={"ID":"1c8236de-45ea-46d1-9813-2c73458866a1","Type":"ContainerDied","Data":"230711e06da86dff57ba320619dfa5914f03d88c2c3dd20bc2eb47d1fb5877df"} Dec 03 07:55:51 crc kubenswrapper[4475]: I1203 07:55:51.046543 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-72d6l" event={"ID":"1c8236de-45ea-46d1-9813-2c73458866a1","Type":"ContainerStarted","Data":"84bb18349b31b72d70133047c831eb0244e1e0ec61e8ecd7e35ca5d0e77c4494"} Dec 03 07:55:51 crc kubenswrapper[4475]: I1203 07:55:51.048298 4475 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Dec 03 07:55:52 crc kubenswrapper[4475]: I1203 07:55:52.057401 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/community-operators-72d6l" event={"ID":"1c8236de-45ea-46d1-9813-2c73458866a1","Type":"ContainerStarted","Data":"0a63fbc83d0901ee77b99f310d323d694830e96b04001987160c20818e295854"} Dec 03 07:55:53 crc kubenswrapper[4475]: I1203 07:55:53.043813 4475 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-cw4bf"] Dec 03 07:55:53 crc kubenswrapper[4475]: I1203 07:55:53.047147 4475 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-cw4bf" Dec 03 07:55:53 crc kubenswrapper[4475]: I1203 07:55:53.053252 4475 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-cw4bf"] Dec 03 07:55:53 crc kubenswrapper[4475]: I1203 07:55:53.065153 4475 generic.go:334] "Generic (PLEG): container finished" podID="1c8236de-45ea-46d1-9813-2c73458866a1" containerID="0a63fbc83d0901ee77b99f310d323d694830e96b04001987160c20818e295854" exitCode=0 Dec 03 07:55:53 crc kubenswrapper[4475]: I1203 07:55:53.065187 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-72d6l" event={"ID":"1c8236de-45ea-46d1-9813-2c73458866a1","Type":"ContainerDied","Data":"0a63fbc83d0901ee77b99f310d323d694830e96b04001987160c20818e295854"} Dec 03 07:55:53 crc kubenswrapper[4475]: I1203 07:55:53.146560 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4226701f-94ab-4d80-a3ea-d9d7a7f29fab-catalog-content\") pod \"certified-operators-cw4bf\" (UID: \"4226701f-94ab-4d80-a3ea-d9d7a7f29fab\") " pod="openshift-marketplace/certified-operators-cw4bf" Dec 03 07:55:53 crc kubenswrapper[4475]: I1203 07:55:53.146759 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: 
\"kubernetes.io/empty-dir/4226701f-94ab-4d80-a3ea-d9d7a7f29fab-utilities\") pod \"certified-operators-cw4bf\" (UID: \"4226701f-94ab-4d80-a3ea-d9d7a7f29fab\") " pod="openshift-marketplace/certified-operators-cw4bf" Dec 03 07:55:53 crc kubenswrapper[4475]: I1203 07:55:53.146930 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kpn8p\" (UniqueName: \"kubernetes.io/projected/4226701f-94ab-4d80-a3ea-d9d7a7f29fab-kube-api-access-kpn8p\") pod \"certified-operators-cw4bf\" (UID: \"4226701f-94ab-4d80-a3ea-d9d7a7f29fab\") " pod="openshift-marketplace/certified-operators-cw4bf" Dec 03 07:55:53 crc kubenswrapper[4475]: I1203 07:55:53.248736 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4226701f-94ab-4d80-a3ea-d9d7a7f29fab-utilities\") pod \"certified-operators-cw4bf\" (UID: \"4226701f-94ab-4d80-a3ea-d9d7a7f29fab\") " pod="openshift-marketplace/certified-operators-cw4bf" Dec 03 07:55:53 crc kubenswrapper[4475]: I1203 07:55:53.248949 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kpn8p\" (UniqueName: \"kubernetes.io/projected/4226701f-94ab-4d80-a3ea-d9d7a7f29fab-kube-api-access-kpn8p\") pod \"certified-operators-cw4bf\" (UID: \"4226701f-94ab-4d80-a3ea-d9d7a7f29fab\") " pod="openshift-marketplace/certified-operators-cw4bf" Dec 03 07:55:53 crc kubenswrapper[4475]: I1203 07:55:53.249068 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4226701f-94ab-4d80-a3ea-d9d7a7f29fab-catalog-content\") pod \"certified-operators-cw4bf\" (UID: \"4226701f-94ab-4d80-a3ea-d9d7a7f29fab\") " pod="openshift-marketplace/certified-operators-cw4bf" Dec 03 07:55:53 crc kubenswrapper[4475]: I1203 07:55:53.249161 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: 
\"kubernetes.io/empty-dir/4226701f-94ab-4d80-a3ea-d9d7a7f29fab-utilities\") pod \"certified-operators-cw4bf\" (UID: \"4226701f-94ab-4d80-a3ea-d9d7a7f29fab\") " pod="openshift-marketplace/certified-operators-cw4bf" Dec 03 07:55:53 crc kubenswrapper[4475]: I1203 07:55:53.249318 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4226701f-94ab-4d80-a3ea-d9d7a7f29fab-catalog-content\") pod \"certified-operators-cw4bf\" (UID: \"4226701f-94ab-4d80-a3ea-d9d7a7f29fab\") " pod="openshift-marketplace/certified-operators-cw4bf" Dec 03 07:55:53 crc kubenswrapper[4475]: I1203 07:55:53.264851 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kpn8p\" (UniqueName: \"kubernetes.io/projected/4226701f-94ab-4d80-a3ea-d9d7a7f29fab-kube-api-access-kpn8p\") pod \"certified-operators-cw4bf\" (UID: \"4226701f-94ab-4d80-a3ea-d9d7a7f29fab\") " pod="openshift-marketplace/certified-operators-cw4bf" Dec 03 07:55:53 crc kubenswrapper[4475]: I1203 07:55:53.360924 4475 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-cw4bf" Dec 03 07:55:53 crc kubenswrapper[4475]: I1203 07:55:53.762714 4475 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-cw4bf"] Dec 03 07:55:54 crc kubenswrapper[4475]: I1203 07:55:54.075569 4475 generic.go:334] "Generic (PLEG): container finished" podID="4226701f-94ab-4d80-a3ea-d9d7a7f29fab" containerID="8176bafa7bad1a8f5a6d4da78272f8ebf7c6104656f73d670b444bbc03bb2392" exitCode=0 Dec 03 07:55:54 crc kubenswrapper[4475]: I1203 07:55:54.075617 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-cw4bf" event={"ID":"4226701f-94ab-4d80-a3ea-d9d7a7f29fab","Type":"ContainerDied","Data":"8176bafa7bad1a8f5a6d4da78272f8ebf7c6104656f73d670b444bbc03bb2392"} Dec 03 07:55:54 crc kubenswrapper[4475]: I1203 07:55:54.075639 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-cw4bf" event={"ID":"4226701f-94ab-4d80-a3ea-d9d7a7f29fab","Type":"ContainerStarted","Data":"41837cbce5f9d89283cf9750efdbad70200f21d3a00525dd707c42c7d34468d9"} Dec 03 07:55:54 crc kubenswrapper[4475]: I1203 07:55:54.080086 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-72d6l" event={"ID":"1c8236de-45ea-46d1-9813-2c73458866a1","Type":"ContainerStarted","Data":"c4b17c1e266daaa0557652fdc37ebae5157559d6fbaf3c653dbe7d00b638b80d"} Dec 03 07:55:54 crc kubenswrapper[4475]: I1203 07:55:54.104213 4475 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-72d6l" podStartSLOduration=1.6471618559999999 podStartE2EDuration="4.104201026s" podCreationTimestamp="2025-12-03 07:55:50 +0000 UTC" firstStartedPulling="2025-12-03 07:55:51.047738961 +0000 UTC m=+4235.852637295" lastFinishedPulling="2025-12-03 07:55:53.504778131 +0000 UTC m=+4238.309676465" observedRunningTime="2025-12-03 
07:55:54.102539522 +0000 UTC m=+4238.907437856" watchObservedRunningTime="2025-12-03 07:55:54.104201026 +0000 UTC m=+4238.909099360" Dec 03 07:56:00 crc kubenswrapper[4475]: I1203 07:56:00.126923 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-cw4bf" event={"ID":"4226701f-94ab-4d80-a3ea-d9d7a7f29fab","Type":"ContainerStarted","Data":"a968260d7f080204dcd6a8302e2edb40c282ae8ab6d3f563db60c0efcf96b7fb"} Dec 03 07:56:00 crc kubenswrapper[4475]: I1203 07:56:00.388395 4475 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-72d6l" Dec 03 07:56:00 crc kubenswrapper[4475]: I1203 07:56:00.388530 4475 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-72d6l" Dec 03 07:56:00 crc kubenswrapper[4475]: I1203 07:56:00.427368 4475 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-72d6l" Dec 03 07:56:01 crc kubenswrapper[4475]: I1203 07:56:01.135365 4475 generic.go:334] "Generic (PLEG): container finished" podID="4226701f-94ab-4d80-a3ea-d9d7a7f29fab" containerID="a968260d7f080204dcd6a8302e2edb40c282ae8ab6d3f563db60c0efcf96b7fb" exitCode=0 Dec 03 07:56:01 crc kubenswrapper[4475]: I1203 07:56:01.135486 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-cw4bf" event={"ID":"4226701f-94ab-4d80-a3ea-d9d7a7f29fab","Type":"ContainerDied","Data":"a968260d7f080204dcd6a8302e2edb40c282ae8ab6d3f563db60c0efcf96b7fb"} Dec 03 07:56:01 crc kubenswrapper[4475]: I1203 07:56:01.175671 4475 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-72d6l" Dec 03 07:56:02 crc kubenswrapper[4475]: I1203 07:56:02.147822 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-cw4bf" 
event={"ID":"4226701f-94ab-4d80-a3ea-d9d7a7f29fab","Type":"ContainerStarted","Data":"de6100c707aaaeebaeec2ae9ff9e5d6f12f8d4a42c422889063e3a0ccdf81049"} Dec 03 07:56:02 crc kubenswrapper[4475]: I1203 07:56:02.163790 4475 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-cw4bf" podStartSLOduration=1.6391679510000001 podStartE2EDuration="9.163776835s" podCreationTimestamp="2025-12-03 07:55:53 +0000 UTC" firstStartedPulling="2025-12-03 07:55:54.077249446 +0000 UTC m=+4238.882147779" lastFinishedPulling="2025-12-03 07:56:01.601858329 +0000 UTC m=+4246.406756663" observedRunningTime="2025-12-03 07:56:02.159521004 +0000 UTC m=+4246.964419338" watchObservedRunningTime="2025-12-03 07:56:02.163776835 +0000 UTC m=+4246.968675169" Dec 03 07:56:02 crc kubenswrapper[4475]: I1203 07:56:02.355933 4475 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-72d6l"] Dec 03 07:56:03 crc kubenswrapper[4475]: I1203 07:56:03.361789 4475 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-cw4bf" Dec 03 07:56:03 crc kubenswrapper[4475]: I1203 07:56:03.362434 4475 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-cw4bf" Dec 03 07:56:04 crc kubenswrapper[4475]: I1203 07:56:04.159966 4475 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-72d6l" podUID="1c8236de-45ea-46d1-9813-2c73458866a1" containerName="registry-server" containerID="cri-o://c4b17c1e266daaa0557652fdc37ebae5157559d6fbaf3c653dbe7d00b638b80d" gracePeriod=2 Dec 03 07:56:04 crc kubenswrapper[4475]: I1203 07:56:04.396596 4475 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/certified-operators-cw4bf" podUID="4226701f-94ab-4d80-a3ea-d9d7a7f29fab" containerName="registry-server" probeResult="failure" output=< 
Dec 03 07:56:04 crc kubenswrapper[4475]: timeout: failed to connect service ":50051" within 1s Dec 03 07:56:04 crc kubenswrapper[4475]: > Dec 03 07:56:04 crc kubenswrapper[4475]: I1203 07:56:04.583495 4475 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-72d6l" Dec 03 07:56:04 crc kubenswrapper[4475]: I1203 07:56:04.655215 4475 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1c8236de-45ea-46d1-9813-2c73458866a1-catalog-content\") pod \"1c8236de-45ea-46d1-9813-2c73458866a1\" (UID: \"1c8236de-45ea-46d1-9813-2c73458866a1\") " Dec 03 07:56:04 crc kubenswrapper[4475]: I1203 07:56:04.655371 4475 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-l2rbz\" (UniqueName: \"kubernetes.io/projected/1c8236de-45ea-46d1-9813-2c73458866a1-kube-api-access-l2rbz\") pod \"1c8236de-45ea-46d1-9813-2c73458866a1\" (UID: \"1c8236de-45ea-46d1-9813-2c73458866a1\") " Dec 03 07:56:04 crc kubenswrapper[4475]: I1203 07:56:04.655400 4475 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1c8236de-45ea-46d1-9813-2c73458866a1-utilities\") pod \"1c8236de-45ea-46d1-9813-2c73458866a1\" (UID: \"1c8236de-45ea-46d1-9813-2c73458866a1\") " Dec 03 07:56:04 crc kubenswrapper[4475]: I1203 07:56:04.655881 4475 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1c8236de-45ea-46d1-9813-2c73458866a1-utilities" (OuterVolumeSpecName: "utilities") pod "1c8236de-45ea-46d1-9813-2c73458866a1" (UID: "1c8236de-45ea-46d1-9813-2c73458866a1"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 07:56:04 crc kubenswrapper[4475]: I1203 07:56:04.659832 4475 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1c8236de-45ea-46d1-9813-2c73458866a1-kube-api-access-l2rbz" (OuterVolumeSpecName: "kube-api-access-l2rbz") pod "1c8236de-45ea-46d1-9813-2c73458866a1" (UID: "1c8236de-45ea-46d1-9813-2c73458866a1"). InnerVolumeSpecName "kube-api-access-l2rbz". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 07:56:04 crc kubenswrapper[4475]: I1203 07:56:04.692543 4475 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1c8236de-45ea-46d1-9813-2c73458866a1-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "1c8236de-45ea-46d1-9813-2c73458866a1" (UID: "1c8236de-45ea-46d1-9813-2c73458866a1"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 07:56:04 crc kubenswrapper[4475]: I1203 07:56:04.757358 4475 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-l2rbz\" (UniqueName: \"kubernetes.io/projected/1c8236de-45ea-46d1-9813-2c73458866a1-kube-api-access-l2rbz\") on node \"crc\" DevicePath \"\"" Dec 03 07:56:04 crc kubenswrapper[4475]: I1203 07:56:04.757385 4475 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1c8236de-45ea-46d1-9813-2c73458866a1-utilities\") on node \"crc\" DevicePath \"\"" Dec 03 07:56:04 crc kubenswrapper[4475]: I1203 07:56:04.757396 4475 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1c8236de-45ea-46d1-9813-2c73458866a1-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 03 07:56:05 crc kubenswrapper[4475]: I1203 07:56:05.167190 4475 generic.go:334] "Generic (PLEG): container finished" podID="1c8236de-45ea-46d1-9813-2c73458866a1" 
containerID="c4b17c1e266daaa0557652fdc37ebae5157559d6fbaf3c653dbe7d00b638b80d" exitCode=0 Dec 03 07:56:05 crc kubenswrapper[4475]: I1203 07:56:05.167224 4475 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-72d6l" Dec 03 07:56:05 crc kubenswrapper[4475]: I1203 07:56:05.167239 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-72d6l" event={"ID":"1c8236de-45ea-46d1-9813-2c73458866a1","Type":"ContainerDied","Data":"c4b17c1e266daaa0557652fdc37ebae5157559d6fbaf3c653dbe7d00b638b80d"} Dec 03 07:56:05 crc kubenswrapper[4475]: I1203 07:56:05.167500 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-72d6l" event={"ID":"1c8236de-45ea-46d1-9813-2c73458866a1","Type":"ContainerDied","Data":"84bb18349b31b72d70133047c831eb0244e1e0ec61e8ecd7e35ca5d0e77c4494"} Dec 03 07:56:05 crc kubenswrapper[4475]: I1203 07:56:05.167518 4475 scope.go:117] "RemoveContainer" containerID="c4b17c1e266daaa0557652fdc37ebae5157559d6fbaf3c653dbe7d00b638b80d" Dec 03 07:56:05 crc kubenswrapper[4475]: I1203 07:56:05.185656 4475 scope.go:117] "RemoveContainer" containerID="0a63fbc83d0901ee77b99f310d323d694830e96b04001987160c20818e295854" Dec 03 07:56:05 crc kubenswrapper[4475]: I1203 07:56:05.190560 4475 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-72d6l"] Dec 03 07:56:05 crc kubenswrapper[4475]: I1203 07:56:05.198542 4475 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-72d6l"] Dec 03 07:56:05 crc kubenswrapper[4475]: I1203 07:56:05.206346 4475 scope.go:117] "RemoveContainer" containerID="230711e06da86dff57ba320619dfa5914f03d88c2c3dd20bc2eb47d1fb5877df" Dec 03 07:56:05 crc kubenswrapper[4475]: I1203 07:56:05.237226 4475 scope.go:117] "RemoveContainer" containerID="c4b17c1e266daaa0557652fdc37ebae5157559d6fbaf3c653dbe7d00b638b80d" Dec 03 
07:56:05 crc kubenswrapper[4475]: E1203 07:56:05.237592 4475 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c4b17c1e266daaa0557652fdc37ebae5157559d6fbaf3c653dbe7d00b638b80d\": container with ID starting with c4b17c1e266daaa0557652fdc37ebae5157559d6fbaf3c653dbe7d00b638b80d not found: ID does not exist" containerID="c4b17c1e266daaa0557652fdc37ebae5157559d6fbaf3c653dbe7d00b638b80d" Dec 03 07:56:05 crc kubenswrapper[4475]: I1203 07:56:05.237622 4475 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c4b17c1e266daaa0557652fdc37ebae5157559d6fbaf3c653dbe7d00b638b80d"} err="failed to get container status \"c4b17c1e266daaa0557652fdc37ebae5157559d6fbaf3c653dbe7d00b638b80d\": rpc error: code = NotFound desc = could not find container \"c4b17c1e266daaa0557652fdc37ebae5157559d6fbaf3c653dbe7d00b638b80d\": container with ID starting with c4b17c1e266daaa0557652fdc37ebae5157559d6fbaf3c653dbe7d00b638b80d not found: ID does not exist" Dec 03 07:56:05 crc kubenswrapper[4475]: I1203 07:56:05.237643 4475 scope.go:117] "RemoveContainer" containerID="0a63fbc83d0901ee77b99f310d323d694830e96b04001987160c20818e295854" Dec 03 07:56:05 crc kubenswrapper[4475]: E1203 07:56:05.237991 4475 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0a63fbc83d0901ee77b99f310d323d694830e96b04001987160c20818e295854\": container with ID starting with 0a63fbc83d0901ee77b99f310d323d694830e96b04001987160c20818e295854 not found: ID does not exist" containerID="0a63fbc83d0901ee77b99f310d323d694830e96b04001987160c20818e295854" Dec 03 07:56:05 crc kubenswrapper[4475]: I1203 07:56:05.238018 4475 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0a63fbc83d0901ee77b99f310d323d694830e96b04001987160c20818e295854"} err="failed to get container status 
\"0a63fbc83d0901ee77b99f310d323d694830e96b04001987160c20818e295854\": rpc error: code = NotFound desc = could not find container \"0a63fbc83d0901ee77b99f310d323d694830e96b04001987160c20818e295854\": container with ID starting with 0a63fbc83d0901ee77b99f310d323d694830e96b04001987160c20818e295854 not found: ID does not exist" Dec 03 07:56:05 crc kubenswrapper[4475]: I1203 07:56:05.238040 4475 scope.go:117] "RemoveContainer" containerID="230711e06da86dff57ba320619dfa5914f03d88c2c3dd20bc2eb47d1fb5877df" Dec 03 07:56:05 crc kubenswrapper[4475]: E1203 07:56:05.238328 4475 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"230711e06da86dff57ba320619dfa5914f03d88c2c3dd20bc2eb47d1fb5877df\": container with ID starting with 230711e06da86dff57ba320619dfa5914f03d88c2c3dd20bc2eb47d1fb5877df not found: ID does not exist" containerID="230711e06da86dff57ba320619dfa5914f03d88c2c3dd20bc2eb47d1fb5877df" Dec 03 07:56:05 crc kubenswrapper[4475]: I1203 07:56:05.238348 4475 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"230711e06da86dff57ba320619dfa5914f03d88c2c3dd20bc2eb47d1fb5877df"} err="failed to get container status \"230711e06da86dff57ba320619dfa5914f03d88c2c3dd20bc2eb47d1fb5877df\": rpc error: code = NotFound desc = could not find container \"230711e06da86dff57ba320619dfa5914f03d88c2c3dd20bc2eb47d1fb5877df\": container with ID starting with 230711e06da86dff57ba320619dfa5914f03d88c2c3dd20bc2eb47d1fb5877df not found: ID does not exist" Dec 03 07:56:05 crc kubenswrapper[4475]: I1203 07:56:05.500312 4475 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1c8236de-45ea-46d1-9813-2c73458866a1" path="/var/lib/kubelet/pods/1c8236de-45ea-46d1-9813-2c73458866a1/volumes" Dec 03 07:56:13 crc kubenswrapper[4475]: I1203 07:56:13.396506 4475 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" 
pod="openshift-marketplace/certified-operators-cw4bf" Dec 03 07:56:13 crc kubenswrapper[4475]: I1203 07:56:13.436899 4475 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-cw4bf" Dec 03 07:56:13 crc kubenswrapper[4475]: I1203 07:56:13.502063 4475 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-cw4bf"] Dec 03 07:56:13 crc kubenswrapper[4475]: I1203 07:56:13.624131 4475 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-gdfwt"] Dec 03 07:56:13 crc kubenswrapper[4475]: I1203 07:56:13.624324 4475 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-gdfwt" podUID="883b2dc3-6da2-4b66-b7e7-d8f7da0f6057" containerName="registry-server" containerID="cri-o://6d7a30030a316444407a9c792ae2b81316fc2d6f2c38f2f26fc4f915e135e6a4" gracePeriod=2 Dec 03 07:56:14 crc kubenswrapper[4475]: I1203 07:56:14.224745 4475 generic.go:334] "Generic (PLEG): container finished" podID="883b2dc3-6da2-4b66-b7e7-d8f7da0f6057" containerID="6d7a30030a316444407a9c792ae2b81316fc2d6f2c38f2f26fc4f915e135e6a4" exitCode=0 Dec 03 07:56:14 crc kubenswrapper[4475]: I1203 07:56:14.225331 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-gdfwt" event={"ID":"883b2dc3-6da2-4b66-b7e7-d8f7da0f6057","Type":"ContainerDied","Data":"6d7a30030a316444407a9c792ae2b81316fc2d6f2c38f2f26fc4f915e135e6a4"} Dec 03 07:56:14 crc kubenswrapper[4475]: I1203 07:56:14.513908 4475 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-gdfwt" Dec 03 07:56:14 crc kubenswrapper[4475]: I1203 07:56:14.609797 4475 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/883b2dc3-6da2-4b66-b7e7-d8f7da0f6057-utilities\") pod \"883b2dc3-6da2-4b66-b7e7-d8f7da0f6057\" (UID: \"883b2dc3-6da2-4b66-b7e7-d8f7da0f6057\") " Dec 03 07:56:14 crc kubenswrapper[4475]: I1203 07:56:14.609830 4475 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/883b2dc3-6da2-4b66-b7e7-d8f7da0f6057-catalog-content\") pod \"883b2dc3-6da2-4b66-b7e7-d8f7da0f6057\" (UID: \"883b2dc3-6da2-4b66-b7e7-d8f7da0f6057\") " Dec 03 07:56:14 crc kubenswrapper[4475]: I1203 07:56:14.609894 4475 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lb6wh\" (UniqueName: \"kubernetes.io/projected/883b2dc3-6da2-4b66-b7e7-d8f7da0f6057-kube-api-access-lb6wh\") pod \"883b2dc3-6da2-4b66-b7e7-d8f7da0f6057\" (UID: \"883b2dc3-6da2-4b66-b7e7-d8f7da0f6057\") " Dec 03 07:56:14 crc kubenswrapper[4475]: I1203 07:56:14.610653 4475 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/883b2dc3-6da2-4b66-b7e7-d8f7da0f6057-utilities" (OuterVolumeSpecName: "utilities") pod "883b2dc3-6da2-4b66-b7e7-d8f7da0f6057" (UID: "883b2dc3-6da2-4b66-b7e7-d8f7da0f6057"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 07:56:14 crc kubenswrapper[4475]: I1203 07:56:14.611799 4475 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/883b2dc3-6da2-4b66-b7e7-d8f7da0f6057-utilities\") on node \"crc\" DevicePath \"\"" Dec 03 07:56:14 crc kubenswrapper[4475]: I1203 07:56:14.648419 4475 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/883b2dc3-6da2-4b66-b7e7-d8f7da0f6057-kube-api-access-lb6wh" (OuterVolumeSpecName: "kube-api-access-lb6wh") pod "883b2dc3-6da2-4b66-b7e7-d8f7da0f6057" (UID: "883b2dc3-6da2-4b66-b7e7-d8f7da0f6057"). InnerVolumeSpecName "kube-api-access-lb6wh". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 07:56:14 crc kubenswrapper[4475]: I1203 07:56:14.690660 4475 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/883b2dc3-6da2-4b66-b7e7-d8f7da0f6057-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "883b2dc3-6da2-4b66-b7e7-d8f7da0f6057" (UID: "883b2dc3-6da2-4b66-b7e7-d8f7da0f6057"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 07:56:14 crc kubenswrapper[4475]: I1203 07:56:14.713904 4475 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/883b2dc3-6da2-4b66-b7e7-d8f7da0f6057-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 03 07:56:14 crc kubenswrapper[4475]: I1203 07:56:14.713931 4475 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lb6wh\" (UniqueName: \"kubernetes.io/projected/883b2dc3-6da2-4b66-b7e7-d8f7da0f6057-kube-api-access-lb6wh\") on node \"crc\" DevicePath \"\"" Dec 03 07:56:15 crc kubenswrapper[4475]: I1203 07:56:15.233295 4475 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-gdfwt" Dec 03 07:56:15 crc kubenswrapper[4475]: I1203 07:56:15.233929 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-gdfwt" event={"ID":"883b2dc3-6da2-4b66-b7e7-d8f7da0f6057","Type":"ContainerDied","Data":"1809e8178c41f89736e0c5e1fdd5d22819ca8417375a1b0db967423ded4c63fe"} Dec 03 07:56:15 crc kubenswrapper[4475]: I1203 07:56:15.233990 4475 scope.go:117] "RemoveContainer" containerID="6d7a30030a316444407a9c792ae2b81316fc2d6f2c38f2f26fc4f915e135e6a4" Dec 03 07:56:15 crc kubenswrapper[4475]: I1203 07:56:15.253235 4475 scope.go:117] "RemoveContainer" containerID="bbce2bda2f220b0d5016adce6fbfe8a2362648a0dca7c52d02760f6db39096eb" Dec 03 07:56:15 crc kubenswrapper[4475]: I1203 07:56:15.263597 4475 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-gdfwt"] Dec 03 07:56:15 crc kubenswrapper[4475]: I1203 07:56:15.268342 4475 scope.go:117] "RemoveContainer" containerID="76398f13e878594524d44423553f9c1cdc5cd421daf181f5b4f467b4f440350e" Dec 03 07:56:15 crc kubenswrapper[4475]: I1203 07:56:15.268961 4475 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-gdfwt"] Dec 03 07:56:15 crc kubenswrapper[4475]: I1203 07:56:15.499749 4475 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="883b2dc3-6da2-4b66-b7e7-d8f7da0f6057" path="/var/lib/kubelet/pods/883b2dc3-6da2-4b66-b7e7-d8f7da0f6057/volumes" Dec 03 07:56:58 crc kubenswrapper[4475]: I1203 07:56:58.933727 4475 patch_prober.go:28] interesting pod/machine-config-daemon-tjbzg container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 03 07:56:58 crc kubenswrapper[4475]: I1203 07:56:58.934155 4475 prober.go:107] "Probe failed" 
probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-tjbzg" podUID="91aee7be-4a52-4598-803f-2deebe0674de" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 03 07:57:28 crc kubenswrapper[4475]: I1203 07:57:28.933797 4475 patch_prober.go:28] interesting pod/machine-config-daemon-tjbzg container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 03 07:57:28 crc kubenswrapper[4475]: I1203 07:57:28.934179 4475 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-tjbzg" podUID="91aee7be-4a52-4598-803f-2deebe0674de" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 03 07:57:58 crc kubenswrapper[4475]: I1203 07:57:58.933246 4475 patch_prober.go:28] interesting pod/machine-config-daemon-tjbzg container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 03 07:57:58 crc kubenswrapper[4475]: I1203 07:57:58.933642 4475 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-tjbzg" podUID="91aee7be-4a52-4598-803f-2deebe0674de" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 03 07:57:58 crc kubenswrapper[4475]: I1203 07:57:58.933675 4475 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-tjbzg" 
Dec 03 07:57:58 crc kubenswrapper[4475]: I1203 07:57:58.934297 4475 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"5fd6239910e877dca8b0b4140c098fd399692464452ceed3964f6b7732bcf773"} pod="openshift-machine-config-operator/machine-config-daemon-tjbzg" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Dec 03 07:57:58 crc kubenswrapper[4475]: I1203 07:57:58.934347 4475 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-tjbzg" podUID="91aee7be-4a52-4598-803f-2deebe0674de" containerName="machine-config-daemon" containerID="cri-o://5fd6239910e877dca8b0b4140c098fd399692464452ceed3964f6b7732bcf773" gracePeriod=600 Dec 03 07:57:59 crc kubenswrapper[4475]: I1203 07:57:59.893657 4475 generic.go:334] "Generic (PLEG): container finished" podID="91aee7be-4a52-4598-803f-2deebe0674de" containerID="5fd6239910e877dca8b0b4140c098fd399692464452ceed3964f6b7732bcf773" exitCode=0 Dec 03 07:57:59 crc kubenswrapper[4475]: I1203 07:57:59.893688 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-tjbzg" event={"ID":"91aee7be-4a52-4598-803f-2deebe0674de","Type":"ContainerDied","Data":"5fd6239910e877dca8b0b4140c098fd399692464452ceed3964f6b7732bcf773"} Dec 03 07:57:59 crc kubenswrapper[4475]: I1203 07:57:59.894021 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-tjbzg" event={"ID":"91aee7be-4a52-4598-803f-2deebe0674de","Type":"ContainerStarted","Data":"5673dd08c62af6854c20f3bcbbfe224728fe60de8b224e4d69ca056cc25a013e"} Dec 03 07:57:59 crc kubenswrapper[4475]: I1203 07:57:59.894053 4475 scope.go:117] "RemoveContainer" containerID="7b0a38f9c544d8a2f59b2ca402561211c4a545eafe71b51ee524168e165ff3e4" Dec 03 08:00:00 crc kubenswrapper[4475]: I1203 08:00:00.157119 
4475 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29412480-5ctgl"] Dec 03 08:00:00 crc kubenswrapper[4475]: E1203 08:00:00.157846 4475 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1c8236de-45ea-46d1-9813-2c73458866a1" containerName="extract-content" Dec 03 08:00:00 crc kubenswrapper[4475]: I1203 08:00:00.157858 4475 state_mem.go:107] "Deleted CPUSet assignment" podUID="1c8236de-45ea-46d1-9813-2c73458866a1" containerName="extract-content" Dec 03 08:00:00 crc kubenswrapper[4475]: E1203 08:00:00.157870 4475 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="883b2dc3-6da2-4b66-b7e7-d8f7da0f6057" containerName="extract-utilities" Dec 03 08:00:00 crc kubenswrapper[4475]: I1203 08:00:00.157876 4475 state_mem.go:107] "Deleted CPUSet assignment" podUID="883b2dc3-6da2-4b66-b7e7-d8f7da0f6057" containerName="extract-utilities" Dec 03 08:00:00 crc kubenswrapper[4475]: E1203 08:00:00.157885 4475 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="883b2dc3-6da2-4b66-b7e7-d8f7da0f6057" containerName="extract-content" Dec 03 08:00:00 crc kubenswrapper[4475]: I1203 08:00:00.157891 4475 state_mem.go:107] "Deleted CPUSet assignment" podUID="883b2dc3-6da2-4b66-b7e7-d8f7da0f6057" containerName="extract-content" Dec 03 08:00:00 crc kubenswrapper[4475]: E1203 08:00:00.157901 4475 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1c8236de-45ea-46d1-9813-2c73458866a1" containerName="extract-utilities" Dec 03 08:00:00 crc kubenswrapper[4475]: I1203 08:00:00.157906 4475 state_mem.go:107] "Deleted CPUSet assignment" podUID="1c8236de-45ea-46d1-9813-2c73458866a1" containerName="extract-utilities" Dec 03 08:00:00 crc kubenswrapper[4475]: E1203 08:00:00.157921 4475 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1c8236de-45ea-46d1-9813-2c73458866a1" containerName="registry-server" Dec 03 08:00:00 crc kubenswrapper[4475]: I1203 08:00:00.157926 4475 
state_mem.go:107] "Deleted CPUSet assignment" podUID="1c8236de-45ea-46d1-9813-2c73458866a1" containerName="registry-server" Dec 03 08:00:00 crc kubenswrapper[4475]: E1203 08:00:00.157953 4475 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="883b2dc3-6da2-4b66-b7e7-d8f7da0f6057" containerName="registry-server" Dec 03 08:00:00 crc kubenswrapper[4475]: I1203 08:00:00.157958 4475 state_mem.go:107] "Deleted CPUSet assignment" podUID="883b2dc3-6da2-4b66-b7e7-d8f7da0f6057" containerName="registry-server" Dec 03 08:00:00 crc kubenswrapper[4475]: I1203 08:00:00.158122 4475 memory_manager.go:354] "RemoveStaleState removing state" podUID="883b2dc3-6da2-4b66-b7e7-d8f7da0f6057" containerName="registry-server" Dec 03 08:00:00 crc kubenswrapper[4475]: I1203 08:00:00.158135 4475 memory_manager.go:354] "RemoveStaleState removing state" podUID="1c8236de-45ea-46d1-9813-2c73458866a1" containerName="registry-server" Dec 03 08:00:00 crc kubenswrapper[4475]: I1203 08:00:00.158646 4475 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29412480-5ctgl" Dec 03 08:00:00 crc kubenswrapper[4475]: I1203 08:00:00.164950 4475 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Dec 03 08:00:00 crc kubenswrapper[4475]: I1203 08:00:00.164963 4475 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Dec 03 08:00:00 crc kubenswrapper[4475]: I1203 08:00:00.168723 4475 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29412480-5ctgl"] Dec 03 08:00:00 crc kubenswrapper[4475]: I1203 08:00:00.178738 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/c2ec5724-5d40-47a7-b078-cba9149cd04d-secret-volume\") pod \"collect-profiles-29412480-5ctgl\" (UID: \"c2ec5724-5d40-47a7-b078-cba9149cd04d\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29412480-5ctgl" Dec 03 08:00:00 crc kubenswrapper[4475]: I1203 08:00:00.178805 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-59m4r\" (UniqueName: \"kubernetes.io/projected/c2ec5724-5d40-47a7-b078-cba9149cd04d-kube-api-access-59m4r\") pod \"collect-profiles-29412480-5ctgl\" (UID: \"c2ec5724-5d40-47a7-b078-cba9149cd04d\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29412480-5ctgl" Dec 03 08:00:00 crc kubenswrapper[4475]: I1203 08:00:00.178910 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/c2ec5724-5d40-47a7-b078-cba9149cd04d-config-volume\") pod \"collect-profiles-29412480-5ctgl\" (UID: \"c2ec5724-5d40-47a7-b078-cba9149cd04d\") " 
pod="openshift-operator-lifecycle-manager/collect-profiles-29412480-5ctgl" Dec 03 08:00:00 crc kubenswrapper[4475]: I1203 08:00:00.280589 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/c2ec5724-5d40-47a7-b078-cba9149cd04d-secret-volume\") pod \"collect-profiles-29412480-5ctgl\" (UID: \"c2ec5724-5d40-47a7-b078-cba9149cd04d\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29412480-5ctgl" Dec 03 08:00:00 crc kubenswrapper[4475]: I1203 08:00:00.280647 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-59m4r\" (UniqueName: \"kubernetes.io/projected/c2ec5724-5d40-47a7-b078-cba9149cd04d-kube-api-access-59m4r\") pod \"collect-profiles-29412480-5ctgl\" (UID: \"c2ec5724-5d40-47a7-b078-cba9149cd04d\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29412480-5ctgl" Dec 03 08:00:00 crc kubenswrapper[4475]: I1203 08:00:00.280723 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/c2ec5724-5d40-47a7-b078-cba9149cd04d-config-volume\") pod \"collect-profiles-29412480-5ctgl\" (UID: \"c2ec5724-5d40-47a7-b078-cba9149cd04d\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29412480-5ctgl" Dec 03 08:00:00 crc kubenswrapper[4475]: I1203 08:00:00.281551 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/c2ec5724-5d40-47a7-b078-cba9149cd04d-config-volume\") pod \"collect-profiles-29412480-5ctgl\" (UID: \"c2ec5724-5d40-47a7-b078-cba9149cd04d\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29412480-5ctgl" Dec 03 08:00:00 crc kubenswrapper[4475]: I1203 08:00:00.288047 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: 
\"kubernetes.io/secret/c2ec5724-5d40-47a7-b078-cba9149cd04d-secret-volume\") pod \"collect-profiles-29412480-5ctgl\" (UID: \"c2ec5724-5d40-47a7-b078-cba9149cd04d\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29412480-5ctgl" Dec 03 08:00:00 crc kubenswrapper[4475]: I1203 08:00:00.295988 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-59m4r\" (UniqueName: \"kubernetes.io/projected/c2ec5724-5d40-47a7-b078-cba9149cd04d-kube-api-access-59m4r\") pod \"collect-profiles-29412480-5ctgl\" (UID: \"c2ec5724-5d40-47a7-b078-cba9149cd04d\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29412480-5ctgl" Dec 03 08:00:00 crc kubenswrapper[4475]: I1203 08:00:00.476312 4475 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29412480-5ctgl" Dec 03 08:00:00 crc kubenswrapper[4475]: I1203 08:00:00.917578 4475 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29412480-5ctgl"] Dec 03 08:00:01 crc kubenswrapper[4475]: I1203 08:00:01.702078 4475 generic.go:334] "Generic (PLEG): container finished" podID="c2ec5724-5d40-47a7-b078-cba9149cd04d" containerID="6e59452431e50843d5c96a8044c8f897bc8c3f8f85a1bd135150d1d548e6d7e2" exitCode=0 Dec 03 08:00:01 crc kubenswrapper[4475]: I1203 08:00:01.702174 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29412480-5ctgl" event={"ID":"c2ec5724-5d40-47a7-b078-cba9149cd04d","Type":"ContainerDied","Data":"6e59452431e50843d5c96a8044c8f897bc8c3f8f85a1bd135150d1d548e6d7e2"} Dec 03 08:00:01 crc kubenswrapper[4475]: I1203 08:00:01.702397 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29412480-5ctgl" 
event={"ID":"c2ec5724-5d40-47a7-b078-cba9149cd04d","Type":"ContainerStarted","Data":"d4f3c7ea8d7c7b9421f5ff023d8b0b81e33da54e6fb0db2db8bec76e7a52d66b"} Dec 03 08:00:02 crc kubenswrapper[4475]: I1203 08:00:02.996230 4475 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29412480-5ctgl" Dec 03 08:00:03 crc kubenswrapper[4475]: I1203 08:00:03.138710 4475 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/c2ec5724-5d40-47a7-b078-cba9149cd04d-secret-volume\") pod \"c2ec5724-5d40-47a7-b078-cba9149cd04d\" (UID: \"c2ec5724-5d40-47a7-b078-cba9149cd04d\") " Dec 03 08:00:03 crc kubenswrapper[4475]: I1203 08:00:03.138970 4475 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-59m4r\" (UniqueName: \"kubernetes.io/projected/c2ec5724-5d40-47a7-b078-cba9149cd04d-kube-api-access-59m4r\") pod \"c2ec5724-5d40-47a7-b078-cba9149cd04d\" (UID: \"c2ec5724-5d40-47a7-b078-cba9149cd04d\") " Dec 03 08:00:03 crc kubenswrapper[4475]: I1203 08:00:03.139070 4475 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/c2ec5724-5d40-47a7-b078-cba9149cd04d-config-volume\") pod \"c2ec5724-5d40-47a7-b078-cba9149cd04d\" (UID: \"c2ec5724-5d40-47a7-b078-cba9149cd04d\") " Dec 03 08:00:03 crc kubenswrapper[4475]: I1203 08:00:03.139947 4475 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c2ec5724-5d40-47a7-b078-cba9149cd04d-config-volume" (OuterVolumeSpecName: "config-volume") pod "c2ec5724-5d40-47a7-b078-cba9149cd04d" (UID: "c2ec5724-5d40-47a7-b078-cba9149cd04d"). InnerVolumeSpecName "config-volume". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 08:00:03 crc kubenswrapper[4475]: I1203 08:00:03.144721 4475 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c2ec5724-5d40-47a7-b078-cba9149cd04d-kube-api-access-59m4r" (OuterVolumeSpecName: "kube-api-access-59m4r") pod "c2ec5724-5d40-47a7-b078-cba9149cd04d" (UID: "c2ec5724-5d40-47a7-b078-cba9149cd04d"). InnerVolumeSpecName "kube-api-access-59m4r". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 08:00:03 crc kubenswrapper[4475]: I1203 08:00:03.146224 4475 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c2ec5724-5d40-47a7-b078-cba9149cd04d-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "c2ec5724-5d40-47a7-b078-cba9149cd04d" (UID: "c2ec5724-5d40-47a7-b078-cba9149cd04d"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 08:00:03 crc kubenswrapper[4475]: I1203 08:00:03.242077 4475 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-59m4r\" (UniqueName: \"kubernetes.io/projected/c2ec5724-5d40-47a7-b078-cba9149cd04d-kube-api-access-59m4r\") on node \"crc\" DevicePath \"\"" Dec 03 08:00:03 crc kubenswrapper[4475]: I1203 08:00:03.242109 4475 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/c2ec5724-5d40-47a7-b078-cba9149cd04d-config-volume\") on node \"crc\" DevicePath \"\"" Dec 03 08:00:03 crc kubenswrapper[4475]: I1203 08:00:03.242119 4475 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/c2ec5724-5d40-47a7-b078-cba9149cd04d-secret-volume\") on node \"crc\" DevicePath \"\"" Dec 03 08:00:03 crc kubenswrapper[4475]: I1203 08:00:03.716318 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29412480-5ctgl" 
event={"ID":"c2ec5724-5d40-47a7-b078-cba9149cd04d","Type":"ContainerDied","Data":"d4f3c7ea8d7c7b9421f5ff023d8b0b81e33da54e6fb0db2db8bec76e7a52d66b"} Dec 03 08:00:03 crc kubenswrapper[4475]: I1203 08:00:03.716361 4475 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d4f3c7ea8d7c7b9421f5ff023d8b0b81e33da54e6fb0db2db8bec76e7a52d66b" Dec 03 08:00:03 crc kubenswrapper[4475]: I1203 08:00:03.716422 4475 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29412480-5ctgl" Dec 03 08:00:04 crc kubenswrapper[4475]: I1203 08:00:04.063256 4475 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29412435-rzbxs"] Dec 03 08:00:04 crc kubenswrapper[4475]: I1203 08:00:04.067931 4475 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29412435-rzbxs"] Dec 03 08:00:05 crc kubenswrapper[4475]: I1203 08:00:05.501525 4475 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fa7323af-2ea5-42bf-8a4b-2ddebc8d768a" path="/var/lib/kubelet/pods/fa7323af-2ea5-42bf-8a4b-2ddebc8d768a/volumes" Dec 03 08:00:22 crc kubenswrapper[4475]: I1203 08:00:22.961841 4475 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-jfgp6"] Dec 03 08:00:22 crc kubenswrapper[4475]: E1203 08:00:22.962602 4475 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c2ec5724-5d40-47a7-b078-cba9149cd04d" containerName="collect-profiles" Dec 03 08:00:22 crc kubenswrapper[4475]: I1203 08:00:22.962615 4475 state_mem.go:107] "Deleted CPUSet assignment" podUID="c2ec5724-5d40-47a7-b078-cba9149cd04d" containerName="collect-profiles" Dec 03 08:00:22 crc kubenswrapper[4475]: I1203 08:00:22.962818 4475 memory_manager.go:354] "RemoveStaleState removing state" podUID="c2ec5724-5d40-47a7-b078-cba9149cd04d" containerName="collect-profiles" 
Dec 03 08:00:22 crc kubenswrapper[4475]: I1203 08:00:22.964075 4475 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-jfgp6" Dec 03 08:00:22 crc kubenswrapper[4475]: I1203 08:00:22.970939 4475 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-jfgp6"] Dec 03 08:00:23 crc kubenswrapper[4475]: I1203 08:00:23.020969 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f37b6ffc-7161-4404-942e-eeec3fda948a-catalog-content\") pod \"redhat-marketplace-jfgp6\" (UID: \"f37b6ffc-7161-4404-942e-eeec3fda948a\") " pod="openshift-marketplace/redhat-marketplace-jfgp6" Dec 03 08:00:23 crc kubenswrapper[4475]: I1203 08:00:23.021435 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f37b6ffc-7161-4404-942e-eeec3fda948a-utilities\") pod \"redhat-marketplace-jfgp6\" (UID: \"f37b6ffc-7161-4404-942e-eeec3fda948a\") " pod="openshift-marketplace/redhat-marketplace-jfgp6" Dec 03 08:00:23 crc kubenswrapper[4475]: I1203 08:00:23.021554 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dtv4h\" (UniqueName: \"kubernetes.io/projected/f37b6ffc-7161-4404-942e-eeec3fda948a-kube-api-access-dtv4h\") pod \"redhat-marketplace-jfgp6\" (UID: \"f37b6ffc-7161-4404-942e-eeec3fda948a\") " pod="openshift-marketplace/redhat-marketplace-jfgp6" Dec 03 08:00:23 crc kubenswrapper[4475]: I1203 08:00:23.123011 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f37b6ffc-7161-4404-942e-eeec3fda948a-utilities\") pod \"redhat-marketplace-jfgp6\" (UID: \"f37b6ffc-7161-4404-942e-eeec3fda948a\") " pod="openshift-marketplace/redhat-marketplace-jfgp6" Dec 03 
08:00:23 crc kubenswrapper[4475]: I1203 08:00:23.123066 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dtv4h\" (UniqueName: \"kubernetes.io/projected/f37b6ffc-7161-4404-942e-eeec3fda948a-kube-api-access-dtv4h\") pod \"redhat-marketplace-jfgp6\" (UID: \"f37b6ffc-7161-4404-942e-eeec3fda948a\") " pod="openshift-marketplace/redhat-marketplace-jfgp6" Dec 03 08:00:23 crc kubenswrapper[4475]: I1203 08:00:23.123099 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f37b6ffc-7161-4404-942e-eeec3fda948a-catalog-content\") pod \"redhat-marketplace-jfgp6\" (UID: \"f37b6ffc-7161-4404-942e-eeec3fda948a\") " pod="openshift-marketplace/redhat-marketplace-jfgp6" Dec 03 08:00:23 crc kubenswrapper[4475]: I1203 08:00:23.123438 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f37b6ffc-7161-4404-942e-eeec3fda948a-utilities\") pod \"redhat-marketplace-jfgp6\" (UID: \"f37b6ffc-7161-4404-942e-eeec3fda948a\") " pod="openshift-marketplace/redhat-marketplace-jfgp6" Dec 03 08:00:23 crc kubenswrapper[4475]: I1203 08:00:23.123533 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f37b6ffc-7161-4404-942e-eeec3fda948a-catalog-content\") pod \"redhat-marketplace-jfgp6\" (UID: \"f37b6ffc-7161-4404-942e-eeec3fda948a\") " pod="openshift-marketplace/redhat-marketplace-jfgp6" Dec 03 08:00:23 crc kubenswrapper[4475]: I1203 08:00:23.141029 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dtv4h\" (UniqueName: \"kubernetes.io/projected/f37b6ffc-7161-4404-942e-eeec3fda948a-kube-api-access-dtv4h\") pod \"redhat-marketplace-jfgp6\" (UID: \"f37b6ffc-7161-4404-942e-eeec3fda948a\") " pod="openshift-marketplace/redhat-marketplace-jfgp6" Dec 03 08:00:23 crc kubenswrapper[4475]: 
I1203 08:00:23.277368 4475 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-jfgp6" Dec 03 08:00:24 crc kubenswrapper[4475]: I1203 08:00:24.140997 4475 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-jfgp6"] Dec 03 08:00:24 crc kubenswrapper[4475]: W1203 08:00:24.150248 4475 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf37b6ffc_7161_4404_942e_eeec3fda948a.slice/crio-16e3ab00fa5a5265ae2e6a8673226567dc60e97a0c2f72bf09a4d8b01c1fcfa3 WatchSource:0}: Error finding container 16e3ab00fa5a5265ae2e6a8673226567dc60e97a0c2f72bf09a4d8b01c1fcfa3: Status 404 returned error can't find the container with id 16e3ab00fa5a5265ae2e6a8673226567dc60e97a0c2f72bf09a4d8b01c1fcfa3 Dec 03 08:00:24 crc kubenswrapper[4475]: I1203 08:00:24.850606 4475 generic.go:334] "Generic (PLEG): container finished" podID="f37b6ffc-7161-4404-942e-eeec3fda948a" containerID="0e8eebf1b35aad03c011208763797bed726dabe12cc6b400e3adfba30412b9cd" exitCode=0 Dec 03 08:00:24 crc kubenswrapper[4475]: I1203 08:00:24.850847 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-jfgp6" event={"ID":"f37b6ffc-7161-4404-942e-eeec3fda948a","Type":"ContainerDied","Data":"0e8eebf1b35aad03c011208763797bed726dabe12cc6b400e3adfba30412b9cd"} Dec 03 08:00:24 crc kubenswrapper[4475]: I1203 08:00:24.850873 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-jfgp6" event={"ID":"f37b6ffc-7161-4404-942e-eeec3fda948a","Type":"ContainerStarted","Data":"16e3ab00fa5a5265ae2e6a8673226567dc60e97a0c2f72bf09a4d8b01c1fcfa3"} Dec 03 08:00:25 crc kubenswrapper[4475]: I1203 08:00:25.859183 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-jfgp6" 
event={"ID":"f37b6ffc-7161-4404-942e-eeec3fda948a","Type":"ContainerStarted","Data":"fc1da3d68d903a763485c00d1ea90a27404d582d7d6adb84db5e07266330d3c4"} Dec 03 08:00:26 crc kubenswrapper[4475]: I1203 08:00:26.866988 4475 generic.go:334] "Generic (PLEG): container finished" podID="f37b6ffc-7161-4404-942e-eeec3fda948a" containerID="fc1da3d68d903a763485c00d1ea90a27404d582d7d6adb84db5e07266330d3c4" exitCode=0 Dec 03 08:00:26 crc kubenswrapper[4475]: I1203 08:00:26.867214 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-jfgp6" event={"ID":"f37b6ffc-7161-4404-942e-eeec3fda948a","Type":"ContainerDied","Data":"fc1da3d68d903a763485c00d1ea90a27404d582d7d6adb84db5e07266330d3c4"} Dec 03 08:00:28 crc kubenswrapper[4475]: I1203 08:00:28.883414 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-jfgp6" event={"ID":"f37b6ffc-7161-4404-942e-eeec3fda948a","Type":"ContainerStarted","Data":"4445d0bb1d2a6246dd48da0bd06d8825c60203d6cf3da9403fb775e9afc20714"} Dec 03 08:00:28 crc kubenswrapper[4475]: I1203 08:00:28.901112 4475 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-jfgp6" podStartSLOduration=4.03413164 podStartE2EDuration="6.901096809s" podCreationTimestamp="2025-12-03 08:00:22 +0000 UTC" firstStartedPulling="2025-12-03 08:00:24.852621681 +0000 UTC m=+4509.657520015" lastFinishedPulling="2025-12-03 08:00:27.71958685 +0000 UTC m=+4512.524485184" observedRunningTime="2025-12-03 08:00:28.895845706 +0000 UTC m=+4513.700744040" watchObservedRunningTime="2025-12-03 08:00:28.901096809 +0000 UTC m=+4513.705995143" Dec 03 08:00:28 crc kubenswrapper[4475]: I1203 08:00:28.933762 4475 patch_prober.go:28] interesting pod/machine-config-daemon-tjbzg container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: 
connect: connection refused" start-of-body= Dec 03 08:00:28 crc kubenswrapper[4475]: I1203 08:00:28.933807 4475 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-tjbzg" podUID="91aee7be-4a52-4598-803f-2deebe0674de" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 03 08:00:29 crc kubenswrapper[4475]: I1203 08:00:29.918277 4475 scope.go:117] "RemoveContainer" containerID="87e3d0ebb402d584bab645096ae1ecfa5c68480559f38d9f1b94649a4623759f" Dec 03 08:00:33 crc kubenswrapper[4475]: I1203 08:00:33.278261 4475 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-jfgp6" Dec 03 08:00:33 crc kubenswrapper[4475]: I1203 08:00:33.278695 4475 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-jfgp6" Dec 03 08:00:33 crc kubenswrapper[4475]: I1203 08:00:33.316592 4475 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-jfgp6" Dec 03 08:00:33 crc kubenswrapper[4475]: I1203 08:00:33.957995 4475 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-jfgp6" Dec 03 08:00:33 crc kubenswrapper[4475]: I1203 08:00:33.998164 4475 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-jfgp6"] Dec 03 08:00:35 crc kubenswrapper[4475]: I1203 08:00:35.935076 4475 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-jfgp6" podUID="f37b6ffc-7161-4404-942e-eeec3fda948a" containerName="registry-server" containerID="cri-o://4445d0bb1d2a6246dd48da0bd06d8825c60203d6cf3da9403fb775e9afc20714" gracePeriod=2 Dec 03 08:00:36 crc kubenswrapper[4475]: I1203 08:00:36.384066 4475 util.go:48] "No 
ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-jfgp6" Dec 03 08:00:36 crc kubenswrapper[4475]: I1203 08:00:36.438258 4475 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f37b6ffc-7161-4404-942e-eeec3fda948a-utilities\") pod \"f37b6ffc-7161-4404-942e-eeec3fda948a\" (UID: \"f37b6ffc-7161-4404-942e-eeec3fda948a\") " Dec 03 08:00:36 crc kubenswrapper[4475]: I1203 08:00:36.438334 4475 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dtv4h\" (UniqueName: \"kubernetes.io/projected/f37b6ffc-7161-4404-942e-eeec3fda948a-kube-api-access-dtv4h\") pod \"f37b6ffc-7161-4404-942e-eeec3fda948a\" (UID: \"f37b6ffc-7161-4404-942e-eeec3fda948a\") " Dec 03 08:00:36 crc kubenswrapper[4475]: I1203 08:00:36.438400 4475 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f37b6ffc-7161-4404-942e-eeec3fda948a-catalog-content\") pod \"f37b6ffc-7161-4404-942e-eeec3fda948a\" (UID: \"f37b6ffc-7161-4404-942e-eeec3fda948a\") " Dec 03 08:00:36 crc kubenswrapper[4475]: I1203 08:00:36.439232 4475 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f37b6ffc-7161-4404-942e-eeec3fda948a-utilities" (OuterVolumeSpecName: "utilities") pod "f37b6ffc-7161-4404-942e-eeec3fda948a" (UID: "f37b6ffc-7161-4404-942e-eeec3fda948a"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 08:00:36 crc kubenswrapper[4475]: I1203 08:00:36.444947 4475 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f37b6ffc-7161-4404-942e-eeec3fda948a-kube-api-access-dtv4h" (OuterVolumeSpecName: "kube-api-access-dtv4h") pod "f37b6ffc-7161-4404-942e-eeec3fda948a" (UID: "f37b6ffc-7161-4404-942e-eeec3fda948a"). 
InnerVolumeSpecName "kube-api-access-dtv4h". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 08:00:36 crc kubenswrapper[4475]: I1203 08:00:36.453224 4475 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f37b6ffc-7161-4404-942e-eeec3fda948a-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "f37b6ffc-7161-4404-942e-eeec3fda948a" (UID: "f37b6ffc-7161-4404-942e-eeec3fda948a"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 08:00:36 crc kubenswrapper[4475]: I1203 08:00:36.540147 4475 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dtv4h\" (UniqueName: \"kubernetes.io/projected/f37b6ffc-7161-4404-942e-eeec3fda948a-kube-api-access-dtv4h\") on node \"crc\" DevicePath \"\"" Dec 03 08:00:36 crc kubenswrapper[4475]: I1203 08:00:36.540175 4475 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f37b6ffc-7161-4404-942e-eeec3fda948a-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 03 08:00:36 crc kubenswrapper[4475]: I1203 08:00:36.540185 4475 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f37b6ffc-7161-4404-942e-eeec3fda948a-utilities\") on node \"crc\" DevicePath \"\"" Dec 03 08:00:36 crc kubenswrapper[4475]: I1203 08:00:36.943895 4475 generic.go:334] "Generic (PLEG): container finished" podID="f37b6ffc-7161-4404-942e-eeec3fda948a" containerID="4445d0bb1d2a6246dd48da0bd06d8825c60203d6cf3da9403fb775e9afc20714" exitCode=0 Dec 03 08:00:36 crc kubenswrapper[4475]: I1203 08:00:36.943964 4475 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-jfgp6" Dec 03 08:00:36 crc kubenswrapper[4475]: I1203 08:00:36.943988 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-jfgp6" event={"ID":"f37b6ffc-7161-4404-942e-eeec3fda948a","Type":"ContainerDied","Data":"4445d0bb1d2a6246dd48da0bd06d8825c60203d6cf3da9403fb775e9afc20714"} Dec 03 08:00:36 crc kubenswrapper[4475]: I1203 08:00:36.944821 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-jfgp6" event={"ID":"f37b6ffc-7161-4404-942e-eeec3fda948a","Type":"ContainerDied","Data":"16e3ab00fa5a5265ae2e6a8673226567dc60e97a0c2f72bf09a4d8b01c1fcfa3"} Dec 03 08:00:36 crc kubenswrapper[4475]: I1203 08:00:36.944874 4475 scope.go:117] "RemoveContainer" containerID="4445d0bb1d2a6246dd48da0bd06d8825c60203d6cf3da9403fb775e9afc20714" Dec 03 08:00:36 crc kubenswrapper[4475]: I1203 08:00:36.972294 4475 scope.go:117] "RemoveContainer" containerID="fc1da3d68d903a763485c00d1ea90a27404d582d7d6adb84db5e07266330d3c4" Dec 03 08:00:36 crc kubenswrapper[4475]: I1203 08:00:36.973054 4475 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-jfgp6"] Dec 03 08:00:36 crc kubenswrapper[4475]: I1203 08:00:36.980311 4475 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-jfgp6"] Dec 03 08:00:37 crc kubenswrapper[4475]: I1203 08:00:37.010331 4475 scope.go:117] "RemoveContainer" containerID="0e8eebf1b35aad03c011208763797bed726dabe12cc6b400e3adfba30412b9cd" Dec 03 08:00:37 crc kubenswrapper[4475]: I1203 08:00:37.033521 4475 scope.go:117] "RemoveContainer" containerID="4445d0bb1d2a6246dd48da0bd06d8825c60203d6cf3da9403fb775e9afc20714" Dec 03 08:00:37 crc kubenswrapper[4475]: E1203 08:00:37.033842 4475 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"4445d0bb1d2a6246dd48da0bd06d8825c60203d6cf3da9403fb775e9afc20714\": container with ID starting with 4445d0bb1d2a6246dd48da0bd06d8825c60203d6cf3da9403fb775e9afc20714 not found: ID does not exist" containerID="4445d0bb1d2a6246dd48da0bd06d8825c60203d6cf3da9403fb775e9afc20714" Dec 03 08:00:37 crc kubenswrapper[4475]: I1203 08:00:37.033870 4475 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4445d0bb1d2a6246dd48da0bd06d8825c60203d6cf3da9403fb775e9afc20714"} err="failed to get container status \"4445d0bb1d2a6246dd48da0bd06d8825c60203d6cf3da9403fb775e9afc20714\": rpc error: code = NotFound desc = could not find container \"4445d0bb1d2a6246dd48da0bd06d8825c60203d6cf3da9403fb775e9afc20714\": container with ID starting with 4445d0bb1d2a6246dd48da0bd06d8825c60203d6cf3da9403fb775e9afc20714 not found: ID does not exist" Dec 03 08:00:37 crc kubenswrapper[4475]: I1203 08:00:37.033889 4475 scope.go:117] "RemoveContainer" containerID="fc1da3d68d903a763485c00d1ea90a27404d582d7d6adb84db5e07266330d3c4" Dec 03 08:00:37 crc kubenswrapper[4475]: E1203 08:00:37.034344 4475 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"fc1da3d68d903a763485c00d1ea90a27404d582d7d6adb84db5e07266330d3c4\": container with ID starting with fc1da3d68d903a763485c00d1ea90a27404d582d7d6adb84db5e07266330d3c4 not found: ID does not exist" containerID="fc1da3d68d903a763485c00d1ea90a27404d582d7d6adb84db5e07266330d3c4" Dec 03 08:00:37 crc kubenswrapper[4475]: I1203 08:00:37.034400 4475 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fc1da3d68d903a763485c00d1ea90a27404d582d7d6adb84db5e07266330d3c4"} err="failed to get container status \"fc1da3d68d903a763485c00d1ea90a27404d582d7d6adb84db5e07266330d3c4\": rpc error: code = NotFound desc = could not find container \"fc1da3d68d903a763485c00d1ea90a27404d582d7d6adb84db5e07266330d3c4\": container with ID 
starting with fc1da3d68d903a763485c00d1ea90a27404d582d7d6adb84db5e07266330d3c4 not found: ID does not exist" Dec 03 08:00:37 crc kubenswrapper[4475]: I1203 08:00:37.034425 4475 scope.go:117] "RemoveContainer" containerID="0e8eebf1b35aad03c011208763797bed726dabe12cc6b400e3adfba30412b9cd" Dec 03 08:00:37 crc kubenswrapper[4475]: E1203 08:00:37.034936 4475 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0e8eebf1b35aad03c011208763797bed726dabe12cc6b400e3adfba30412b9cd\": container with ID starting with 0e8eebf1b35aad03c011208763797bed726dabe12cc6b400e3adfba30412b9cd not found: ID does not exist" containerID="0e8eebf1b35aad03c011208763797bed726dabe12cc6b400e3adfba30412b9cd" Dec 03 08:00:37 crc kubenswrapper[4475]: I1203 08:00:37.034967 4475 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0e8eebf1b35aad03c011208763797bed726dabe12cc6b400e3adfba30412b9cd"} err="failed to get container status \"0e8eebf1b35aad03c011208763797bed726dabe12cc6b400e3adfba30412b9cd\": rpc error: code = NotFound desc = could not find container \"0e8eebf1b35aad03c011208763797bed726dabe12cc6b400e3adfba30412b9cd\": container with ID starting with 0e8eebf1b35aad03c011208763797bed726dabe12cc6b400e3adfba30412b9cd not found: ID does not exist" Dec 03 08:00:37 crc kubenswrapper[4475]: I1203 08:00:37.498795 4475 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f37b6ffc-7161-4404-942e-eeec3fda948a" path="/var/lib/kubelet/pods/f37b6ffc-7161-4404-942e-eeec3fda948a/volumes" Dec 03 08:00:58 crc kubenswrapper[4475]: I1203 08:00:58.933067 4475 patch_prober.go:28] interesting pod/machine-config-daemon-tjbzg container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 03 08:00:58 crc kubenswrapper[4475]: I1203 
08:00:58.933398 4475 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-tjbzg" podUID="91aee7be-4a52-4598-803f-2deebe0674de" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 03 08:01:00 crc kubenswrapper[4475]: I1203 08:01:00.132620 4475 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-cron-29412481-4mvmw"] Dec 03 08:01:00 crc kubenswrapper[4475]: E1203 08:01:00.133159 4475 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f37b6ffc-7161-4404-942e-eeec3fda948a" containerName="extract-utilities" Dec 03 08:01:00 crc kubenswrapper[4475]: I1203 08:01:00.133173 4475 state_mem.go:107] "Deleted CPUSet assignment" podUID="f37b6ffc-7161-4404-942e-eeec3fda948a" containerName="extract-utilities" Dec 03 08:01:00 crc kubenswrapper[4475]: E1203 08:01:00.133185 4475 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f37b6ffc-7161-4404-942e-eeec3fda948a" containerName="registry-server" Dec 03 08:01:00 crc kubenswrapper[4475]: I1203 08:01:00.133191 4475 state_mem.go:107] "Deleted CPUSet assignment" podUID="f37b6ffc-7161-4404-942e-eeec3fda948a" containerName="registry-server" Dec 03 08:01:00 crc kubenswrapper[4475]: E1203 08:01:00.133202 4475 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f37b6ffc-7161-4404-942e-eeec3fda948a" containerName="extract-content" Dec 03 08:01:00 crc kubenswrapper[4475]: I1203 08:01:00.133207 4475 state_mem.go:107] "Deleted CPUSet assignment" podUID="f37b6ffc-7161-4404-942e-eeec3fda948a" containerName="extract-content" Dec 03 08:01:00 crc kubenswrapper[4475]: I1203 08:01:00.133403 4475 memory_manager.go:354] "RemoveStaleState removing state" podUID="f37b6ffc-7161-4404-942e-eeec3fda948a" containerName="registry-server" Dec 03 08:01:00 crc kubenswrapper[4475]: I1203 08:01:00.133948 4475 util.go:30] "No sandbox 
for pod can be found. Need to start a new one" pod="openstack/keystone-cron-29412481-4mvmw" Dec 03 08:01:00 crc kubenswrapper[4475]: I1203 08:01:00.141060 4475 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-cron-29412481-4mvmw"] Dec 03 08:01:00 crc kubenswrapper[4475]: I1203 08:01:00.238261 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/079d8ec0-fbd9-4ed6-98f1-c6791b055bef-fernet-keys\") pod \"keystone-cron-29412481-4mvmw\" (UID: \"079d8ec0-fbd9-4ed6-98f1-c6791b055bef\") " pod="openstack/keystone-cron-29412481-4mvmw" Dec 03 08:01:00 crc kubenswrapper[4475]: I1203 08:01:00.238321 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/079d8ec0-fbd9-4ed6-98f1-c6791b055bef-combined-ca-bundle\") pod \"keystone-cron-29412481-4mvmw\" (UID: \"079d8ec0-fbd9-4ed6-98f1-c6791b055bef\") " pod="openstack/keystone-cron-29412481-4mvmw" Dec 03 08:01:00 crc kubenswrapper[4475]: I1203 08:01:00.238377 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vhqtl\" (UniqueName: \"kubernetes.io/projected/079d8ec0-fbd9-4ed6-98f1-c6791b055bef-kube-api-access-vhqtl\") pod \"keystone-cron-29412481-4mvmw\" (UID: \"079d8ec0-fbd9-4ed6-98f1-c6791b055bef\") " pod="openstack/keystone-cron-29412481-4mvmw" Dec 03 08:01:00 crc kubenswrapper[4475]: I1203 08:01:00.238403 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/079d8ec0-fbd9-4ed6-98f1-c6791b055bef-config-data\") pod \"keystone-cron-29412481-4mvmw\" (UID: \"079d8ec0-fbd9-4ed6-98f1-c6791b055bef\") " pod="openstack/keystone-cron-29412481-4mvmw" Dec 03 08:01:00 crc kubenswrapper[4475]: I1203 08:01:00.340190 4475 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/079d8ec0-fbd9-4ed6-98f1-c6791b055bef-combined-ca-bundle\") pod \"keystone-cron-29412481-4mvmw\" (UID: \"079d8ec0-fbd9-4ed6-98f1-c6791b055bef\") " pod="openstack/keystone-cron-29412481-4mvmw" Dec 03 08:01:00 crc kubenswrapper[4475]: I1203 08:01:00.340497 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vhqtl\" (UniqueName: \"kubernetes.io/projected/079d8ec0-fbd9-4ed6-98f1-c6791b055bef-kube-api-access-vhqtl\") pod \"keystone-cron-29412481-4mvmw\" (UID: \"079d8ec0-fbd9-4ed6-98f1-c6791b055bef\") " pod="openstack/keystone-cron-29412481-4mvmw" Dec 03 08:01:00 crc kubenswrapper[4475]: I1203 08:01:00.340623 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/079d8ec0-fbd9-4ed6-98f1-c6791b055bef-config-data\") pod \"keystone-cron-29412481-4mvmw\" (UID: \"079d8ec0-fbd9-4ed6-98f1-c6791b055bef\") " pod="openstack/keystone-cron-29412481-4mvmw" Dec 03 08:01:00 crc kubenswrapper[4475]: I1203 08:01:00.340811 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/079d8ec0-fbd9-4ed6-98f1-c6791b055bef-fernet-keys\") pod \"keystone-cron-29412481-4mvmw\" (UID: \"079d8ec0-fbd9-4ed6-98f1-c6791b055bef\") " pod="openstack/keystone-cron-29412481-4mvmw" Dec 03 08:01:00 crc kubenswrapper[4475]: I1203 08:01:00.345342 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/079d8ec0-fbd9-4ed6-98f1-c6791b055bef-config-data\") pod \"keystone-cron-29412481-4mvmw\" (UID: \"079d8ec0-fbd9-4ed6-98f1-c6791b055bef\") " pod="openstack/keystone-cron-29412481-4mvmw" Dec 03 08:01:00 crc kubenswrapper[4475]: I1203 08:01:00.347260 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: 
\"kubernetes.io/secret/079d8ec0-fbd9-4ed6-98f1-c6791b055bef-fernet-keys\") pod \"keystone-cron-29412481-4mvmw\" (UID: \"079d8ec0-fbd9-4ed6-98f1-c6791b055bef\") " pod="openstack/keystone-cron-29412481-4mvmw" Dec 03 08:01:00 crc kubenswrapper[4475]: I1203 08:01:00.347626 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/079d8ec0-fbd9-4ed6-98f1-c6791b055bef-combined-ca-bundle\") pod \"keystone-cron-29412481-4mvmw\" (UID: \"079d8ec0-fbd9-4ed6-98f1-c6791b055bef\") " pod="openstack/keystone-cron-29412481-4mvmw" Dec 03 08:01:00 crc kubenswrapper[4475]: I1203 08:01:00.366803 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vhqtl\" (UniqueName: \"kubernetes.io/projected/079d8ec0-fbd9-4ed6-98f1-c6791b055bef-kube-api-access-vhqtl\") pod \"keystone-cron-29412481-4mvmw\" (UID: \"079d8ec0-fbd9-4ed6-98f1-c6791b055bef\") " pod="openstack/keystone-cron-29412481-4mvmw" Dec 03 08:01:00 crc kubenswrapper[4475]: I1203 08:01:00.447251 4475 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-cron-29412481-4mvmw" Dec 03 08:01:00 crc kubenswrapper[4475]: I1203 08:01:00.842738 4475 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-cron-29412481-4mvmw"] Dec 03 08:01:01 crc kubenswrapper[4475]: I1203 08:01:01.087815 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29412481-4mvmw" event={"ID":"079d8ec0-fbd9-4ed6-98f1-c6791b055bef","Type":"ContainerStarted","Data":"6952aaee523c375eec0ebcfd6f4e0479ce8e2cbc1ca2267f7a01ba4a4ea3cfe9"} Dec 03 08:01:01 crc kubenswrapper[4475]: I1203 08:01:01.087851 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29412481-4mvmw" event={"ID":"079d8ec0-fbd9-4ed6-98f1-c6791b055bef","Type":"ContainerStarted","Data":"caa842b18edaa51538f9ec1c926d3e564872749c3585c840845f7856211e0f04"} Dec 03 08:01:01 crc kubenswrapper[4475]: I1203 08:01:01.099798 4475 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-cron-29412481-4mvmw" podStartSLOduration=1.099787766 podStartE2EDuration="1.099787766s" podCreationTimestamp="2025-12-03 08:01:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 08:01:01.09914559 +0000 UTC m=+4545.904043925" watchObservedRunningTime="2025-12-03 08:01:01.099787766 +0000 UTC m=+4545.904686101" Dec 03 08:01:04 crc kubenswrapper[4475]: I1203 08:01:04.109583 4475 generic.go:334] "Generic (PLEG): container finished" podID="079d8ec0-fbd9-4ed6-98f1-c6791b055bef" containerID="6952aaee523c375eec0ebcfd6f4e0479ce8e2cbc1ca2267f7a01ba4a4ea3cfe9" exitCode=0 Dec 03 08:01:04 crc kubenswrapper[4475]: I1203 08:01:04.109906 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29412481-4mvmw" 
event={"ID":"079d8ec0-fbd9-4ed6-98f1-c6791b055bef","Type":"ContainerDied","Data":"6952aaee523c375eec0ebcfd6f4e0479ce8e2cbc1ca2267f7a01ba4a4ea3cfe9"} Dec 03 08:01:05 crc kubenswrapper[4475]: I1203 08:01:05.464885 4475 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-cron-29412481-4mvmw" Dec 03 08:01:05 crc kubenswrapper[4475]: I1203 08:01:05.633713 4475 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vhqtl\" (UniqueName: \"kubernetes.io/projected/079d8ec0-fbd9-4ed6-98f1-c6791b055bef-kube-api-access-vhqtl\") pod \"079d8ec0-fbd9-4ed6-98f1-c6791b055bef\" (UID: \"079d8ec0-fbd9-4ed6-98f1-c6791b055bef\") " Dec 03 08:01:05 crc kubenswrapper[4475]: I1203 08:01:05.633943 4475 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/079d8ec0-fbd9-4ed6-98f1-c6791b055bef-combined-ca-bundle\") pod \"079d8ec0-fbd9-4ed6-98f1-c6791b055bef\" (UID: \"079d8ec0-fbd9-4ed6-98f1-c6791b055bef\") " Dec 03 08:01:05 crc kubenswrapper[4475]: I1203 08:01:05.634013 4475 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/079d8ec0-fbd9-4ed6-98f1-c6791b055bef-config-data\") pod \"079d8ec0-fbd9-4ed6-98f1-c6791b055bef\" (UID: \"079d8ec0-fbd9-4ed6-98f1-c6791b055bef\") " Dec 03 08:01:05 crc kubenswrapper[4475]: I1203 08:01:05.634042 4475 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/079d8ec0-fbd9-4ed6-98f1-c6791b055bef-fernet-keys\") pod \"079d8ec0-fbd9-4ed6-98f1-c6791b055bef\" (UID: \"079d8ec0-fbd9-4ed6-98f1-c6791b055bef\") " Dec 03 08:01:05 crc kubenswrapper[4475]: I1203 08:01:05.638088 4475 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/079d8ec0-fbd9-4ed6-98f1-c6791b055bef-kube-api-access-vhqtl" 
(OuterVolumeSpecName: "kube-api-access-vhqtl") pod "079d8ec0-fbd9-4ed6-98f1-c6791b055bef" (UID: "079d8ec0-fbd9-4ed6-98f1-c6791b055bef"). InnerVolumeSpecName "kube-api-access-vhqtl". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 08:01:05 crc kubenswrapper[4475]: I1203 08:01:05.639547 4475 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/079d8ec0-fbd9-4ed6-98f1-c6791b055bef-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "079d8ec0-fbd9-4ed6-98f1-c6791b055bef" (UID: "079d8ec0-fbd9-4ed6-98f1-c6791b055bef"). InnerVolumeSpecName "fernet-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 08:01:05 crc kubenswrapper[4475]: I1203 08:01:05.657617 4475 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/079d8ec0-fbd9-4ed6-98f1-c6791b055bef-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "079d8ec0-fbd9-4ed6-98f1-c6791b055bef" (UID: "079d8ec0-fbd9-4ed6-98f1-c6791b055bef"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 08:01:05 crc kubenswrapper[4475]: I1203 08:01:05.675738 4475 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/079d8ec0-fbd9-4ed6-98f1-c6791b055bef-config-data" (OuterVolumeSpecName: "config-data") pod "079d8ec0-fbd9-4ed6-98f1-c6791b055bef" (UID: "079d8ec0-fbd9-4ed6-98f1-c6791b055bef"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 08:01:05 crc kubenswrapper[4475]: I1203 08:01:05.736598 4475 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vhqtl\" (UniqueName: \"kubernetes.io/projected/079d8ec0-fbd9-4ed6-98f1-c6791b055bef-kube-api-access-vhqtl\") on node \"crc\" DevicePath \"\"" Dec 03 08:01:05 crc kubenswrapper[4475]: I1203 08:01:05.736627 4475 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/079d8ec0-fbd9-4ed6-98f1-c6791b055bef-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 03 08:01:05 crc kubenswrapper[4475]: I1203 08:01:05.736637 4475 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/079d8ec0-fbd9-4ed6-98f1-c6791b055bef-config-data\") on node \"crc\" DevicePath \"\"" Dec 03 08:01:05 crc kubenswrapper[4475]: I1203 08:01:05.736646 4475 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/079d8ec0-fbd9-4ed6-98f1-c6791b055bef-fernet-keys\") on node \"crc\" DevicePath \"\"" Dec 03 08:01:06 crc kubenswrapper[4475]: I1203 08:01:06.123894 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29412481-4mvmw" event={"ID":"079d8ec0-fbd9-4ed6-98f1-c6791b055bef","Type":"ContainerDied","Data":"caa842b18edaa51538f9ec1c926d3e564872749c3585c840845f7856211e0f04"} Dec 03 08:01:06 crc kubenswrapper[4475]: I1203 08:01:06.123932 4475 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="caa842b18edaa51538f9ec1c926d3e564872749c3585c840845f7856211e0f04" Dec 03 08:01:06 crc kubenswrapper[4475]: I1203 08:01:06.123956 4475 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-cron-29412481-4mvmw" Dec 03 08:01:28 crc kubenswrapper[4475]: I1203 08:01:28.933212 4475 patch_prober.go:28] interesting pod/machine-config-daemon-tjbzg container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 03 08:01:28 crc kubenswrapper[4475]: I1203 08:01:28.933606 4475 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-tjbzg" podUID="91aee7be-4a52-4598-803f-2deebe0674de" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 03 08:01:28 crc kubenswrapper[4475]: I1203 08:01:28.933644 4475 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-tjbzg" Dec 03 08:01:28 crc kubenswrapper[4475]: I1203 08:01:28.934289 4475 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"5673dd08c62af6854c20f3bcbbfe224728fe60de8b224e4d69ca056cc25a013e"} pod="openshift-machine-config-operator/machine-config-daemon-tjbzg" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Dec 03 08:01:28 crc kubenswrapper[4475]: I1203 08:01:28.934345 4475 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-tjbzg" podUID="91aee7be-4a52-4598-803f-2deebe0674de" containerName="machine-config-daemon" containerID="cri-o://5673dd08c62af6854c20f3bcbbfe224728fe60de8b224e4d69ca056cc25a013e" gracePeriod=600 Dec 03 08:01:29 crc kubenswrapper[4475]: E1203 08:01:29.067134 4475 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to 
\"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-tjbzg_openshift-machine-config-operator(91aee7be-4a52-4598-803f-2deebe0674de)\"" pod="openshift-machine-config-operator/machine-config-daemon-tjbzg" podUID="91aee7be-4a52-4598-803f-2deebe0674de" Dec 03 08:01:29 crc kubenswrapper[4475]: I1203 08:01:29.273171 4475 generic.go:334] "Generic (PLEG): container finished" podID="91aee7be-4a52-4598-803f-2deebe0674de" containerID="5673dd08c62af6854c20f3bcbbfe224728fe60de8b224e4d69ca056cc25a013e" exitCode=0 Dec 03 08:01:29 crc kubenswrapper[4475]: I1203 08:01:29.273234 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-tjbzg" event={"ID":"91aee7be-4a52-4598-803f-2deebe0674de","Type":"ContainerDied","Data":"5673dd08c62af6854c20f3bcbbfe224728fe60de8b224e4d69ca056cc25a013e"} Dec 03 08:01:29 crc kubenswrapper[4475]: I1203 08:01:29.273306 4475 scope.go:117] "RemoveContainer" containerID="5fd6239910e877dca8b0b4140c098fd399692464452ceed3964f6b7732bcf773" Dec 03 08:01:29 crc kubenswrapper[4475]: I1203 08:01:29.273831 4475 scope.go:117] "RemoveContainer" containerID="5673dd08c62af6854c20f3bcbbfe224728fe60de8b224e4d69ca056cc25a013e" Dec 03 08:01:29 crc kubenswrapper[4475]: E1203 08:01:29.274066 4475 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-tjbzg_openshift-machine-config-operator(91aee7be-4a52-4598-803f-2deebe0674de)\"" pod="openshift-machine-config-operator/machine-config-daemon-tjbzg" podUID="91aee7be-4a52-4598-803f-2deebe0674de" Dec 03 08:01:40 crc kubenswrapper[4475]: I1203 08:01:40.491727 4475 scope.go:117] "RemoveContainer" containerID="5673dd08c62af6854c20f3bcbbfe224728fe60de8b224e4d69ca056cc25a013e" Dec 03 08:01:40 crc 
kubenswrapper[4475]: E1203 08:01:40.493051 4475 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-tjbzg_openshift-machine-config-operator(91aee7be-4a52-4598-803f-2deebe0674de)\"" pod="openshift-machine-config-operator/machine-config-daemon-tjbzg" podUID="91aee7be-4a52-4598-803f-2deebe0674de" Dec 03 08:01:55 crc kubenswrapper[4475]: I1203 08:01:55.495912 4475 scope.go:117] "RemoveContainer" containerID="5673dd08c62af6854c20f3bcbbfe224728fe60de8b224e4d69ca056cc25a013e" Dec 03 08:01:55 crc kubenswrapper[4475]: E1203 08:01:55.496714 4475 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-tjbzg_openshift-machine-config-operator(91aee7be-4a52-4598-803f-2deebe0674de)\"" pod="openshift-machine-config-operator/machine-config-daemon-tjbzg" podUID="91aee7be-4a52-4598-803f-2deebe0674de" Dec 03 08:02:07 crc kubenswrapper[4475]: I1203 08:02:07.490773 4475 scope.go:117] "RemoveContainer" containerID="5673dd08c62af6854c20f3bcbbfe224728fe60de8b224e4d69ca056cc25a013e" Dec 03 08:02:07 crc kubenswrapper[4475]: E1203 08:02:07.491311 4475 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-tjbzg_openshift-machine-config-operator(91aee7be-4a52-4598-803f-2deebe0674de)\"" pod="openshift-machine-config-operator/machine-config-daemon-tjbzg" podUID="91aee7be-4a52-4598-803f-2deebe0674de" Dec 03 08:02:20 crc kubenswrapper[4475]: I1203 08:02:20.491301 4475 scope.go:117] "RemoveContainer" containerID="5673dd08c62af6854c20f3bcbbfe224728fe60de8b224e4d69ca056cc25a013e" Dec 
03 08:02:20 crc kubenswrapper[4475]: E1203 08:02:20.491893 4475 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-tjbzg_openshift-machine-config-operator(91aee7be-4a52-4598-803f-2deebe0674de)\"" pod="openshift-machine-config-operator/machine-config-daemon-tjbzg" podUID="91aee7be-4a52-4598-803f-2deebe0674de" Dec 03 08:02:35 crc kubenswrapper[4475]: I1203 08:02:35.495675 4475 scope.go:117] "RemoveContainer" containerID="5673dd08c62af6854c20f3bcbbfe224728fe60de8b224e4d69ca056cc25a013e" Dec 03 08:02:35 crc kubenswrapper[4475]: E1203 08:02:35.496228 4475 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-tjbzg_openshift-machine-config-operator(91aee7be-4a52-4598-803f-2deebe0674de)\"" pod="openshift-machine-config-operator/machine-config-daemon-tjbzg" podUID="91aee7be-4a52-4598-803f-2deebe0674de" Dec 03 08:02:46 crc kubenswrapper[4475]: I1203 08:02:46.491181 4475 scope.go:117] "RemoveContainer" containerID="5673dd08c62af6854c20f3bcbbfe224728fe60de8b224e4d69ca056cc25a013e" Dec 03 08:02:46 crc kubenswrapper[4475]: E1203 08:02:46.491765 4475 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-tjbzg_openshift-machine-config-operator(91aee7be-4a52-4598-803f-2deebe0674de)\"" pod="openshift-machine-config-operator/machine-config-daemon-tjbzg" podUID="91aee7be-4a52-4598-803f-2deebe0674de" Dec 03 08:03:01 crc kubenswrapper[4475]: I1203 08:03:01.491911 4475 scope.go:117] "RemoveContainer" 
containerID="5673dd08c62af6854c20f3bcbbfe224728fe60de8b224e4d69ca056cc25a013e" Dec 03 08:03:01 crc kubenswrapper[4475]: E1203 08:03:01.492527 4475 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-tjbzg_openshift-machine-config-operator(91aee7be-4a52-4598-803f-2deebe0674de)\"" pod="openshift-machine-config-operator/machine-config-daemon-tjbzg" podUID="91aee7be-4a52-4598-803f-2deebe0674de" Dec 03 08:03:13 crc kubenswrapper[4475]: I1203 08:03:13.491409 4475 scope.go:117] "RemoveContainer" containerID="5673dd08c62af6854c20f3bcbbfe224728fe60de8b224e4d69ca056cc25a013e" Dec 03 08:03:13 crc kubenswrapper[4475]: E1203 08:03:13.491951 4475 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-tjbzg_openshift-machine-config-operator(91aee7be-4a52-4598-803f-2deebe0674de)\"" pod="openshift-machine-config-operator/machine-config-daemon-tjbzg" podUID="91aee7be-4a52-4598-803f-2deebe0674de" Dec 03 08:03:26 crc kubenswrapper[4475]: I1203 08:03:26.491330 4475 scope.go:117] "RemoveContainer" containerID="5673dd08c62af6854c20f3bcbbfe224728fe60de8b224e4d69ca056cc25a013e" Dec 03 08:03:26 crc kubenswrapper[4475]: E1203 08:03:26.491929 4475 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-tjbzg_openshift-machine-config-operator(91aee7be-4a52-4598-803f-2deebe0674de)\"" pod="openshift-machine-config-operator/machine-config-daemon-tjbzg" podUID="91aee7be-4a52-4598-803f-2deebe0674de" Dec 03 08:03:38 crc kubenswrapper[4475]: I1203 08:03:38.491762 4475 scope.go:117] 
"RemoveContainer" containerID="5673dd08c62af6854c20f3bcbbfe224728fe60de8b224e4d69ca056cc25a013e" Dec 03 08:03:38 crc kubenswrapper[4475]: E1203 08:03:38.492281 4475 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-tjbzg_openshift-machine-config-operator(91aee7be-4a52-4598-803f-2deebe0674de)\"" pod="openshift-machine-config-operator/machine-config-daemon-tjbzg" podUID="91aee7be-4a52-4598-803f-2deebe0674de" Dec 03 08:03:51 crc kubenswrapper[4475]: I1203 08:03:51.493041 4475 scope.go:117] "RemoveContainer" containerID="5673dd08c62af6854c20f3bcbbfe224728fe60de8b224e4d69ca056cc25a013e" Dec 03 08:03:51 crc kubenswrapper[4475]: E1203 08:03:51.493541 4475 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-tjbzg_openshift-machine-config-operator(91aee7be-4a52-4598-803f-2deebe0674de)\"" pod="openshift-machine-config-operator/machine-config-daemon-tjbzg" podUID="91aee7be-4a52-4598-803f-2deebe0674de" Dec 03 08:04:02 crc kubenswrapper[4475]: I1203 08:04:02.266754 4475 generic.go:334] "Generic (PLEG): container finished" podID="6ebcee18-a6ef-4674-aea6-1b33ed3c2224" containerID="07fd218421b2f2f7ba0f15a17f31ca51d8b329ef859497e8f152006feba299e2" exitCode=0 Dec 03 08:04:02 crc kubenswrapper[4475]: I1203 08:04:02.266844 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/tempest-tests-tempest-s00-multi-thread-testing" event={"ID":"6ebcee18-a6ef-4674-aea6-1b33ed3c2224","Type":"ContainerDied","Data":"07fd218421b2f2f7ba0f15a17f31ca51d8b329ef859497e8f152006feba299e2"} Dec 03 08:04:02 crc kubenswrapper[4475]: I1203 08:04:02.493161 4475 scope.go:117] "RemoveContainer" 
containerID="5673dd08c62af6854c20f3bcbbfe224728fe60de8b224e4d69ca056cc25a013e" Dec 03 08:04:02 crc kubenswrapper[4475]: E1203 08:04:02.493543 4475 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-tjbzg_openshift-machine-config-operator(91aee7be-4a52-4598-803f-2deebe0674de)\"" pod="openshift-machine-config-operator/machine-config-daemon-tjbzg" podUID="91aee7be-4a52-4598-803f-2deebe0674de" Dec 03 08:04:03 crc kubenswrapper[4475]: I1203 08:04:03.840700 4475 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/tempest-tests-tempest-s00-multi-thread-testing" Dec 03 08:04:03 crc kubenswrapper[4475]: I1203 08:04:03.929890 4475 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/tempest-tests-tempest-s01-single-thread-testing"] Dec 03 08:04:03 crc kubenswrapper[4475]: E1203 08:04:03.930191 4475 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6ebcee18-a6ef-4674-aea6-1b33ed3c2224" containerName="tempest-tests-tempest-tests-runner" Dec 03 08:04:03 crc kubenswrapper[4475]: I1203 08:04:03.930204 4475 state_mem.go:107] "Deleted CPUSet assignment" podUID="6ebcee18-a6ef-4674-aea6-1b33ed3c2224" containerName="tempest-tests-tempest-tests-runner" Dec 03 08:04:03 crc kubenswrapper[4475]: E1203 08:04:03.930237 4475 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="079d8ec0-fbd9-4ed6-98f1-c6791b055bef" containerName="keystone-cron" Dec 03 08:04:03 crc kubenswrapper[4475]: I1203 08:04:03.930243 4475 state_mem.go:107] "Deleted CPUSet assignment" podUID="079d8ec0-fbd9-4ed6-98f1-c6791b055bef" containerName="keystone-cron" Dec 03 08:04:03 crc kubenswrapper[4475]: I1203 08:04:03.930396 4475 memory_manager.go:354] "RemoveStaleState removing state" podUID="6ebcee18-a6ef-4674-aea6-1b33ed3c2224" 
containerName="tempest-tests-tempest-tests-runner" Dec 03 08:04:03 crc kubenswrapper[4475]: I1203 08:04:03.930418 4475 memory_manager.go:354] "RemoveStaleState removing state" podUID="079d8ec0-fbd9-4ed6-98f1-c6791b055bef" containerName="keystone-cron" Dec 03 08:04:03 crc kubenswrapper[4475]: I1203 08:04:03.930934 4475 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/tempest-tests-tempest-s01-single-thread-testing" Dec 03 08:04:03 crc kubenswrapper[4475]: I1203 08:04:03.933792 4475 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"tempest-tests-tempest-custom-data-s1" Dec 03 08:04:03 crc kubenswrapper[4475]: I1203 08:04:03.933827 4475 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"tempest-tests-tempest-env-vars-s1" Dec 03 08:04:03 crc kubenswrapper[4475]: I1203 08:04:03.951661 4475 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/6ebcee18-a6ef-4674-aea6-1b33ed3c2224-config-data\") pod \"6ebcee18-a6ef-4674-aea6-1b33ed3c2224\" (UID: \"6ebcee18-a6ef-4674-aea6-1b33ed3c2224\") " Dec 03 08:04:03 crc kubenswrapper[4475]: I1203 08:04:03.951960 4475 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-thdrl\" (UniqueName: \"kubernetes.io/projected/6ebcee18-a6ef-4674-aea6-1b33ed3c2224-kube-api-access-thdrl\") pod \"6ebcee18-a6ef-4674-aea6-1b33ed3c2224\" (UID: \"6ebcee18-a6ef-4674-aea6-1b33ed3c2224\") " Dec 03 08:04:03 crc kubenswrapper[4475]: I1203 08:04:03.951998 4475 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"test-operator-ephemeral-temporary\" (UniqueName: \"kubernetes.io/empty-dir/6ebcee18-a6ef-4674-aea6-1b33ed3c2224-test-operator-ephemeral-temporary\") pod \"6ebcee18-a6ef-4674-aea6-1b33ed3c2224\" (UID: \"6ebcee18-a6ef-4674-aea6-1b33ed3c2224\") " Dec 03 08:04:03 crc kubenswrapper[4475]: I1203 08:04:03.952027 4475 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/6ebcee18-a6ef-4674-aea6-1b33ed3c2224-openstack-config-secret\") pod \"6ebcee18-a6ef-4674-aea6-1b33ed3c2224\" (UID: \"6ebcee18-a6ef-4674-aea6-1b33ed3c2224\") " Dec 03 08:04:03 crc kubenswrapper[4475]: I1203 08:04:03.952101 4475 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/secret/6ebcee18-a6ef-4674-aea6-1b33ed3c2224-ca-certs\") pod \"6ebcee18-a6ef-4674-aea6-1b33ed3c2224\" (UID: \"6ebcee18-a6ef-4674-aea6-1b33ed3c2224\") " Dec 03 08:04:03 crc kubenswrapper[4475]: I1203 08:04:03.952128 4475 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/6ebcee18-a6ef-4674-aea6-1b33ed3c2224-ssh-key\") pod \"6ebcee18-a6ef-4674-aea6-1b33ed3c2224\" (UID: \"6ebcee18-a6ef-4674-aea6-1b33ed3c2224\") " Dec 03 08:04:03 crc kubenswrapper[4475]: I1203 08:04:03.952144 4475 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"test-operator-ephemeral-workdir\" (UniqueName: \"kubernetes.io/empty-dir/6ebcee18-a6ef-4674-aea6-1b33ed3c2224-test-operator-ephemeral-workdir\") pod \"6ebcee18-a6ef-4674-aea6-1b33ed3c2224\" (UID: \"6ebcee18-a6ef-4674-aea6-1b33ed3c2224\") " Dec 03 08:04:03 crc kubenswrapper[4475]: I1203 08:04:03.952168 4475 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"test-operator-logs\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"6ebcee18-a6ef-4674-aea6-1b33ed3c2224\" (UID: \"6ebcee18-a6ef-4674-aea6-1b33ed3c2224\") " Dec 03 08:04:03 crc kubenswrapper[4475]: I1203 08:04:03.952225 4475 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/6ebcee18-a6ef-4674-aea6-1b33ed3c2224-openstack-config\") pod 
\"6ebcee18-a6ef-4674-aea6-1b33ed3c2224\" (UID: \"6ebcee18-a6ef-4674-aea6-1b33ed3c2224\") " Dec 03 08:04:03 crc kubenswrapper[4475]: I1203 08:04:03.952298 4475 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ebcee18-a6ef-4674-aea6-1b33ed3c2224-config-data" (OuterVolumeSpecName: "config-data") pod "6ebcee18-a6ef-4674-aea6-1b33ed3c2224" (UID: "6ebcee18-a6ef-4674-aea6-1b33ed3c2224"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 08:04:03 crc kubenswrapper[4475]: I1203 08:04:03.952532 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/8c152b07-dbe2-44d8-a333-ceff4d6e7f9a-openstack-config\") pod \"tempest-tests-tempest-s01-single-thread-testing\" (UID: \"8c152b07-dbe2-44d8-a333-ceff4d6e7f9a\") " pod="openstack/tempest-tests-tempest-s01-single-thread-testing" Dec 03 08:04:03 crc kubenswrapper[4475]: I1203 08:04:03.952661 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/8c152b07-dbe2-44d8-a333-ceff4d6e7f9a-openstack-config-secret\") pod \"tempest-tests-tempest-s01-single-thread-testing\" (UID: \"8c152b07-dbe2-44d8-a333-ceff4d6e7f9a\") " pod="openstack/tempest-tests-tempest-s01-single-thread-testing" Dec 03 08:04:03 crc kubenswrapper[4475]: I1203 08:04:03.953204 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/8c152b07-dbe2-44d8-a333-ceff4d6e7f9a-config-data\") pod \"tempest-tests-tempest-s01-single-thread-testing\" (UID: \"8c152b07-dbe2-44d8-a333-ceff4d6e7f9a\") " pod="openstack/tempest-tests-tempest-s01-single-thread-testing" Dec 03 08:04:03 crc kubenswrapper[4475]: I1203 08:04:03.953248 4475 operation_generator.go:803] UnmountVolume.TearDown 
succeeded for volume "kubernetes.io/empty-dir/6ebcee18-a6ef-4674-aea6-1b33ed3c2224-test-operator-ephemeral-temporary" (OuterVolumeSpecName: "test-operator-ephemeral-temporary") pod "6ebcee18-a6ef-4674-aea6-1b33ed3c2224" (UID: "6ebcee18-a6ef-4674-aea6-1b33ed3c2224"). InnerVolumeSpecName "test-operator-ephemeral-temporary". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 08:04:03 crc kubenswrapper[4475]: I1203 08:04:03.953346 4475 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/6ebcee18-a6ef-4674-aea6-1b33ed3c2224-config-data\") on node \"crc\" DevicePath \"\"" Dec 03 08:04:03 crc kubenswrapper[4475]: I1203 08:04:03.958794 4475 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6ebcee18-a6ef-4674-aea6-1b33ed3c2224-kube-api-access-thdrl" (OuterVolumeSpecName: "kube-api-access-thdrl") pod "6ebcee18-a6ef-4674-aea6-1b33ed3c2224" (UID: "6ebcee18-a6ef-4674-aea6-1b33ed3c2224"). InnerVolumeSpecName "kube-api-access-thdrl". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 08:04:03 crc kubenswrapper[4475]: I1203 08:04:03.964266 4475 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage10-crc" (OuterVolumeSpecName: "test-operator-logs") pod "6ebcee18-a6ef-4674-aea6-1b33ed3c2224" (UID: "6ebcee18-a6ef-4674-aea6-1b33ed3c2224"). InnerVolumeSpecName "local-storage10-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Dec 03 08:04:03 crc kubenswrapper[4475]: I1203 08:04:03.976556 4475 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6ebcee18-a6ef-4674-aea6-1b33ed3c2224-openstack-config-secret" (OuterVolumeSpecName: "openstack-config-secret") pod "6ebcee18-a6ef-4674-aea6-1b33ed3c2224" (UID: "6ebcee18-a6ef-4674-aea6-1b33ed3c2224"). InnerVolumeSpecName "openstack-config-secret". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 08:04:03 crc kubenswrapper[4475]: I1203 08:04:03.982654 4475 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6ebcee18-a6ef-4674-aea6-1b33ed3c2224-ca-certs" (OuterVolumeSpecName: "ca-certs") pod "6ebcee18-a6ef-4674-aea6-1b33ed3c2224" (UID: "6ebcee18-a6ef-4674-aea6-1b33ed3c2224"). InnerVolumeSpecName "ca-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 08:04:03 crc kubenswrapper[4475]: I1203 08:04:03.990781 4475 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6ebcee18-a6ef-4674-aea6-1b33ed3c2224-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "6ebcee18-a6ef-4674-aea6-1b33ed3c2224" (UID: "6ebcee18-a6ef-4674-aea6-1b33ed3c2224"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 08:04:03 crc kubenswrapper[4475]: I1203 08:04:03.993506 4475 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/tempest-tests-tempest-s01-single-thread-testing"] Dec 03 08:04:04 crc kubenswrapper[4475]: I1203 08:04:04.002196 4475 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ebcee18-a6ef-4674-aea6-1b33ed3c2224-openstack-config" (OuterVolumeSpecName: "openstack-config") pod "6ebcee18-a6ef-4674-aea6-1b33ed3c2224" (UID: "6ebcee18-a6ef-4674-aea6-1b33ed3c2224"). InnerVolumeSpecName "openstack-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 08:04:04 crc kubenswrapper[4475]: I1203 08:04:04.012266 4475 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6ebcee18-a6ef-4674-aea6-1b33ed3c2224-test-operator-ephemeral-workdir" (OuterVolumeSpecName: "test-operator-ephemeral-workdir") pod "6ebcee18-a6ef-4674-aea6-1b33ed3c2224" (UID: "6ebcee18-a6ef-4674-aea6-1b33ed3c2224"). InnerVolumeSpecName "test-operator-ephemeral-workdir". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 08:04:04 crc kubenswrapper[4475]: I1203 08:04:04.058330 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7qs8h\" (UniqueName: \"kubernetes.io/projected/8c152b07-dbe2-44d8-a333-ceff4d6e7f9a-kube-api-access-7qs8h\") pod \"tempest-tests-tempest-s01-single-thread-testing\" (UID: \"8c152b07-dbe2-44d8-a333-ceff4d6e7f9a\") " pod="openstack/tempest-tests-tempest-s01-single-thread-testing" Dec 03 08:04:04 crc kubenswrapper[4475]: I1203 08:04:04.058541 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/8c152b07-dbe2-44d8-a333-ceff4d6e7f9a-config-data\") pod \"tempest-tests-tempest-s01-single-thread-testing\" (UID: \"8c152b07-dbe2-44d8-a333-ceff4d6e7f9a\") " pod="openstack/tempest-tests-tempest-s01-single-thread-testing" Dec 03 08:04:04 crc kubenswrapper[4475]: I1203 08:04:04.058656 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/8c152b07-dbe2-44d8-a333-ceff4d6e7f9a-openstack-config\") pod \"tempest-tests-tempest-s01-single-thread-testing\" (UID: \"8c152b07-dbe2-44d8-a333-ceff4d6e7f9a\") " pod="openstack/tempest-tests-tempest-s01-single-thread-testing" Dec 03 08:04:04 crc kubenswrapper[4475]: I1203 08:04:04.058772 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"tempest-tests-tempest-s01-single-thread-testing\" (UID: \"8c152b07-dbe2-44d8-a333-ceff4d6e7f9a\") " pod="openstack/tempest-tests-tempest-s01-single-thread-testing" Dec 03 08:04:04 crc kubenswrapper[4475]: I1203 08:04:04.058842 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config-secret\" (UniqueName: 
\"kubernetes.io/secret/8c152b07-dbe2-44d8-a333-ceff4d6e7f9a-openstack-config-secret\") pod \"tempest-tests-tempest-s01-single-thread-testing\" (UID: \"8c152b07-dbe2-44d8-a333-ceff4d6e7f9a\") " pod="openstack/tempest-tests-tempest-s01-single-thread-testing" Dec 03 08:04:04 crc kubenswrapper[4475]: I1203 08:04:04.058936 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/secret/8c152b07-dbe2-44d8-a333-ceff4d6e7f9a-ca-certs\") pod \"tempest-tests-tempest-s01-single-thread-testing\" (UID: \"8c152b07-dbe2-44d8-a333-ceff4d6e7f9a\") " pod="openstack/tempest-tests-tempest-s01-single-thread-testing" Dec 03 08:04:04 crc kubenswrapper[4475]: I1203 08:04:04.059023 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"test-operator-ephemeral-workdir\" (UniqueName: \"kubernetes.io/empty-dir/8c152b07-dbe2-44d8-a333-ceff4d6e7f9a-test-operator-ephemeral-workdir\") pod \"tempest-tests-tempest-s01-single-thread-testing\" (UID: \"8c152b07-dbe2-44d8-a333-ceff4d6e7f9a\") " pod="openstack/tempest-tests-tempest-s01-single-thread-testing" Dec 03 08:04:04 crc kubenswrapper[4475]: I1203 08:04:04.059095 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"test-operator-ephemeral-temporary\" (UniqueName: \"kubernetes.io/empty-dir/8c152b07-dbe2-44d8-a333-ceff4d6e7f9a-test-operator-ephemeral-temporary\") pod \"tempest-tests-tempest-s01-single-thread-testing\" (UID: \"8c152b07-dbe2-44d8-a333-ceff4d6e7f9a\") " pod="openstack/tempest-tests-tempest-s01-single-thread-testing" Dec 03 08:04:04 crc kubenswrapper[4475]: I1203 08:04:04.059159 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/8c152b07-dbe2-44d8-a333-ceff4d6e7f9a-ssh-key\") pod \"tempest-tests-tempest-s01-single-thread-testing\" (UID: 
\"8c152b07-dbe2-44d8-a333-ceff4d6e7f9a\") " pod="openstack/tempest-tests-tempest-s01-single-thread-testing" Dec 03 08:04:04 crc kubenswrapper[4475]: I1203 08:04:04.059352 4475 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-thdrl\" (UniqueName: \"kubernetes.io/projected/6ebcee18-a6ef-4674-aea6-1b33ed3c2224-kube-api-access-thdrl\") on node \"crc\" DevicePath \"\"" Dec 03 08:04:04 crc kubenswrapper[4475]: I1203 08:04:04.059410 4475 reconciler_common.go:293] "Volume detached for volume \"test-operator-ephemeral-temporary\" (UniqueName: \"kubernetes.io/empty-dir/6ebcee18-a6ef-4674-aea6-1b33ed3c2224-test-operator-ephemeral-temporary\") on node \"crc\" DevicePath \"\"" Dec 03 08:04:04 crc kubenswrapper[4475]: I1203 08:04:04.059495 4475 reconciler_common.go:293] "Volume detached for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/6ebcee18-a6ef-4674-aea6-1b33ed3c2224-openstack-config-secret\") on node \"crc\" DevicePath \"\"" Dec 03 08:04:04 crc kubenswrapper[4475]: I1203 08:04:04.060959 4475 reconciler_common.go:293] "Volume detached for volume \"ca-certs\" (UniqueName: \"kubernetes.io/secret/6ebcee18-a6ef-4674-aea6-1b33ed3c2224-ca-certs\") on node \"crc\" DevicePath \"\"" Dec 03 08:04:04 crc kubenswrapper[4475]: I1203 08:04:04.059559 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/8c152b07-dbe2-44d8-a333-ceff4d6e7f9a-config-data\") pod \"tempest-tests-tempest-s01-single-thread-testing\" (UID: \"8c152b07-dbe2-44d8-a333-ceff4d6e7f9a\") " pod="openstack/tempest-tests-tempest-s01-single-thread-testing" Dec 03 08:04:04 crc kubenswrapper[4475]: I1203 08:04:04.060192 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/8c152b07-dbe2-44d8-a333-ceff4d6e7f9a-openstack-config\") pod \"tempest-tests-tempest-s01-single-thread-testing\" (UID: 
\"8c152b07-dbe2-44d8-a333-ceff4d6e7f9a\") " pod="openstack/tempest-tests-tempest-s01-single-thread-testing" Dec 03 08:04:04 crc kubenswrapper[4475]: I1203 08:04:04.061171 4475 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/6ebcee18-a6ef-4674-aea6-1b33ed3c2224-ssh-key\") on node \"crc\" DevicePath \"\"" Dec 03 08:04:04 crc kubenswrapper[4475]: I1203 08:04:04.061230 4475 reconciler_common.go:293] "Volume detached for volume \"test-operator-ephemeral-workdir\" (UniqueName: \"kubernetes.io/empty-dir/6ebcee18-a6ef-4674-aea6-1b33ed3c2224-test-operator-ephemeral-workdir\") on node \"crc\" DevicePath \"\"" Dec 03 08:04:04 crc kubenswrapper[4475]: I1203 08:04:04.061285 4475 reconciler_common.go:293] "Volume detached for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/6ebcee18-a6ef-4674-aea6-1b33ed3c2224-openstack-config\") on node \"crc\" DevicePath \"\"" Dec 03 08:04:04 crc kubenswrapper[4475]: I1203 08:04:04.063733 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/8c152b07-dbe2-44d8-a333-ceff4d6e7f9a-openstack-config-secret\") pod \"tempest-tests-tempest-s01-single-thread-testing\" (UID: \"8c152b07-dbe2-44d8-a333-ceff4d6e7f9a\") " pod="openstack/tempest-tests-tempest-s01-single-thread-testing" Dec 03 08:04:04 crc kubenswrapper[4475]: I1203 08:04:04.083789 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"tempest-tests-tempest-s01-single-thread-testing\" (UID: \"8c152b07-dbe2-44d8-a333-ceff4d6e7f9a\") " pod="openstack/tempest-tests-tempest-s01-single-thread-testing" Dec 03 08:04:04 crc kubenswrapper[4475]: I1203 08:04:04.162569 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7qs8h\" (UniqueName: 
\"kubernetes.io/projected/8c152b07-dbe2-44d8-a333-ceff4d6e7f9a-kube-api-access-7qs8h\") pod \"tempest-tests-tempest-s01-single-thread-testing\" (UID: \"8c152b07-dbe2-44d8-a333-ceff4d6e7f9a\") " pod="openstack/tempest-tests-tempest-s01-single-thread-testing" Dec 03 08:04:04 crc kubenswrapper[4475]: I1203 08:04:04.162870 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/secret/8c152b07-dbe2-44d8-a333-ceff4d6e7f9a-ca-certs\") pod \"tempest-tests-tempest-s01-single-thread-testing\" (UID: \"8c152b07-dbe2-44d8-a333-ceff4d6e7f9a\") " pod="openstack/tempest-tests-tempest-s01-single-thread-testing" Dec 03 08:04:04 crc kubenswrapper[4475]: I1203 08:04:04.162893 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"test-operator-ephemeral-workdir\" (UniqueName: \"kubernetes.io/empty-dir/8c152b07-dbe2-44d8-a333-ceff4d6e7f9a-test-operator-ephemeral-workdir\") pod \"tempest-tests-tempest-s01-single-thread-testing\" (UID: \"8c152b07-dbe2-44d8-a333-ceff4d6e7f9a\") " pod="openstack/tempest-tests-tempest-s01-single-thread-testing" Dec 03 08:04:04 crc kubenswrapper[4475]: I1203 08:04:04.162911 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"test-operator-ephemeral-temporary\" (UniqueName: \"kubernetes.io/empty-dir/8c152b07-dbe2-44d8-a333-ceff4d6e7f9a-test-operator-ephemeral-temporary\") pod \"tempest-tests-tempest-s01-single-thread-testing\" (UID: \"8c152b07-dbe2-44d8-a333-ceff4d6e7f9a\") " pod="openstack/tempest-tests-tempest-s01-single-thread-testing" Dec 03 08:04:04 crc kubenswrapper[4475]: I1203 08:04:04.162926 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/8c152b07-dbe2-44d8-a333-ceff4d6e7f9a-ssh-key\") pod \"tempest-tests-tempest-s01-single-thread-testing\" (UID: \"8c152b07-dbe2-44d8-a333-ceff4d6e7f9a\") " 
pod="openstack/tempest-tests-tempest-s01-single-thread-testing" Dec 03 08:04:04 crc kubenswrapper[4475]: I1203 08:04:04.163236 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"test-operator-ephemeral-workdir\" (UniqueName: \"kubernetes.io/empty-dir/8c152b07-dbe2-44d8-a333-ceff4d6e7f9a-test-operator-ephemeral-workdir\") pod \"tempest-tests-tempest-s01-single-thread-testing\" (UID: \"8c152b07-dbe2-44d8-a333-ceff4d6e7f9a\") " pod="openstack/tempest-tests-tempest-s01-single-thread-testing" Dec 03 08:04:04 crc kubenswrapper[4475]: I1203 08:04:04.163348 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"test-operator-ephemeral-temporary\" (UniqueName: \"kubernetes.io/empty-dir/8c152b07-dbe2-44d8-a333-ceff4d6e7f9a-test-operator-ephemeral-temporary\") pod \"tempest-tests-tempest-s01-single-thread-testing\" (UID: \"8c152b07-dbe2-44d8-a333-ceff4d6e7f9a\") " pod="openstack/tempest-tests-tempest-s01-single-thread-testing" Dec 03 08:04:04 crc kubenswrapper[4475]: I1203 08:04:04.165665 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/8c152b07-dbe2-44d8-a333-ceff4d6e7f9a-ssh-key\") pod \"tempest-tests-tempest-s01-single-thread-testing\" (UID: \"8c152b07-dbe2-44d8-a333-ceff4d6e7f9a\") " pod="openstack/tempest-tests-tempest-s01-single-thread-testing" Dec 03 08:04:04 crc kubenswrapper[4475]: I1203 08:04:04.166001 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ca-certs\" (UniqueName: \"kubernetes.io/secret/8c152b07-dbe2-44d8-a333-ceff4d6e7f9a-ca-certs\") pod \"tempest-tests-tempest-s01-single-thread-testing\" (UID: \"8c152b07-dbe2-44d8-a333-ceff4d6e7f9a\") " pod="openstack/tempest-tests-tempest-s01-single-thread-testing" Dec 03 08:04:04 crc kubenswrapper[4475]: I1203 08:04:04.175523 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7qs8h\" (UniqueName: 
\"kubernetes.io/projected/8c152b07-dbe2-44d8-a333-ceff4d6e7f9a-kube-api-access-7qs8h\") pod \"tempest-tests-tempest-s01-single-thread-testing\" (UID: \"8c152b07-dbe2-44d8-a333-ceff4d6e7f9a\") " pod="openstack/tempest-tests-tempest-s01-single-thread-testing" Dec 03 08:04:04 crc kubenswrapper[4475]: I1203 08:04:04.246719 4475 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/tempest-tests-tempest-s01-single-thread-testing" Dec 03 08:04:04 crc kubenswrapper[4475]: I1203 08:04:04.281250 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/tempest-tests-tempest-s00-multi-thread-testing" event={"ID":"6ebcee18-a6ef-4674-aea6-1b33ed3c2224","Type":"ContainerDied","Data":"27bd8d4f9faeddab077847479debd5ad0b08e85acc45661623c3d44d1f7be12f"} Dec 03 08:04:04 crc kubenswrapper[4475]: I1203 08:04:04.281285 4475 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="27bd8d4f9faeddab077847479debd5ad0b08e85acc45661623c3d44d1f7be12f" Dec 03 08:04:04 crc kubenswrapper[4475]: I1203 08:04:04.281333 4475 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/tempest-tests-tempest-s00-multi-thread-testing" Dec 03 08:04:04 crc kubenswrapper[4475]: I1203 08:04:04.721196 4475 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/tempest-tests-tempest-s01-single-thread-testing"] Dec 03 08:04:04 crc kubenswrapper[4475]: W1203 08:04:04.726931 4475 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod8c152b07_dbe2_44d8_a333_ceff4d6e7f9a.slice/crio-d8699d2f7a543defdc669dd6390b09abe40b457bf887363c0f50fc6094f28eb2 WatchSource:0}: Error finding container d8699d2f7a543defdc669dd6390b09abe40b457bf887363c0f50fc6094f28eb2: Status 404 returned error can't find the container with id d8699d2f7a543defdc669dd6390b09abe40b457bf887363c0f50fc6094f28eb2 Dec 03 08:04:05 crc kubenswrapper[4475]: I1203 08:04:05.290324 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/tempest-tests-tempest-s01-single-thread-testing" event={"ID":"8c152b07-dbe2-44d8-a333-ceff4d6e7f9a","Type":"ContainerStarted","Data":"d8699d2f7a543defdc669dd6390b09abe40b457bf887363c0f50fc6094f28eb2"} Dec 03 08:04:06 crc kubenswrapper[4475]: I1203 08:04:06.298728 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/tempest-tests-tempest-s01-single-thread-testing" event={"ID":"8c152b07-dbe2-44d8-a333-ceff4d6e7f9a","Type":"ContainerStarted","Data":"e1b55e8417cc886ea0d3ab3d515efa2894df64e4a2ff1f75649d997f40d2e875"} Dec 03 08:04:06 crc kubenswrapper[4475]: I1203 08:04:06.315562 4475 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/tempest-tests-tempest-s01-single-thread-testing" podStartSLOduration=3.315538371 podStartE2EDuration="3.315538371s" podCreationTimestamp="2025-12-03 08:04:03 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 08:04:06.311185237 +0000 UTC m=+4731.116083571" 
watchObservedRunningTime="2025-12-03 08:04:06.315538371 +0000 UTC m=+4731.120436706" Dec 03 08:04:15 crc kubenswrapper[4475]: I1203 08:04:15.495800 4475 scope.go:117] "RemoveContainer" containerID="5673dd08c62af6854c20f3bcbbfe224728fe60de8b224e4d69ca056cc25a013e" Dec 03 08:04:15 crc kubenswrapper[4475]: E1203 08:04:15.496483 4475 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-tjbzg_openshift-machine-config-operator(91aee7be-4a52-4598-803f-2deebe0674de)\"" pod="openshift-machine-config-operator/machine-config-daemon-tjbzg" podUID="91aee7be-4a52-4598-803f-2deebe0674de" Dec 03 08:04:29 crc kubenswrapper[4475]: I1203 08:04:29.490786 4475 scope.go:117] "RemoveContainer" containerID="5673dd08c62af6854c20f3bcbbfe224728fe60de8b224e4d69ca056cc25a013e" Dec 03 08:04:29 crc kubenswrapper[4475]: E1203 08:04:29.491540 4475 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-tjbzg_openshift-machine-config-operator(91aee7be-4a52-4598-803f-2deebe0674de)\"" pod="openshift-machine-config-operator/machine-config-daemon-tjbzg" podUID="91aee7be-4a52-4598-803f-2deebe0674de" Dec 03 08:04:43 crc kubenswrapper[4475]: I1203 08:04:43.492716 4475 scope.go:117] "RemoveContainer" containerID="5673dd08c62af6854c20f3bcbbfe224728fe60de8b224e4d69ca056cc25a013e" Dec 03 08:04:43 crc kubenswrapper[4475]: E1203 08:04:43.493333 4475 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-tjbzg_openshift-machine-config-operator(91aee7be-4a52-4598-803f-2deebe0674de)\"" 
pod="openshift-machine-config-operator/machine-config-daemon-tjbzg" podUID="91aee7be-4a52-4598-803f-2deebe0674de" Dec 03 08:04:47 crc kubenswrapper[4475]: I1203 08:04:47.324267 4475 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-6764df74d9-l26gg"] Dec 03 08:04:47 crc kubenswrapper[4475]: I1203 08:04:47.326030 4475 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-6764df74d9-l26gg" Dec 03 08:04:47 crc kubenswrapper[4475]: I1203 08:04:47.337043 4475 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-6764df74d9-l26gg"] Dec 03 08:04:47 crc kubenswrapper[4475]: I1203 08:04:47.419157 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/356706b5-1a23-42f9-bbad-78dcc26dbddd-public-tls-certs\") pod \"neutron-6764df74d9-l26gg\" (UID: \"356706b5-1a23-42f9-bbad-78dcc26dbddd\") " pod="openstack/neutron-6764df74d9-l26gg" Dec 03 08:04:47 crc kubenswrapper[4475]: I1203 08:04:47.419218 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/356706b5-1a23-42f9-bbad-78dcc26dbddd-internal-tls-certs\") pod \"neutron-6764df74d9-l26gg\" (UID: \"356706b5-1a23-42f9-bbad-78dcc26dbddd\") " pod="openstack/neutron-6764df74d9-l26gg" Dec 03 08:04:47 crc kubenswrapper[4475]: I1203 08:04:47.419250 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/356706b5-1a23-42f9-bbad-78dcc26dbddd-ovndb-tls-certs\") pod \"neutron-6764df74d9-l26gg\" (UID: \"356706b5-1a23-42f9-bbad-78dcc26dbddd\") " pod="openstack/neutron-6764df74d9-l26gg" Dec 03 08:04:47 crc kubenswrapper[4475]: I1203 08:04:47.419331 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"kube-api-access-bzfnc\" (UniqueName: \"kubernetes.io/projected/356706b5-1a23-42f9-bbad-78dcc26dbddd-kube-api-access-bzfnc\") pod \"neutron-6764df74d9-l26gg\" (UID: \"356706b5-1a23-42f9-bbad-78dcc26dbddd\") " pod="openstack/neutron-6764df74d9-l26gg" Dec 03 08:04:47 crc kubenswrapper[4475]: I1203 08:04:47.419374 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/356706b5-1a23-42f9-bbad-78dcc26dbddd-config\") pod \"neutron-6764df74d9-l26gg\" (UID: \"356706b5-1a23-42f9-bbad-78dcc26dbddd\") " pod="openstack/neutron-6764df74d9-l26gg" Dec 03 08:04:47 crc kubenswrapper[4475]: I1203 08:04:47.419420 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/356706b5-1a23-42f9-bbad-78dcc26dbddd-combined-ca-bundle\") pod \"neutron-6764df74d9-l26gg\" (UID: \"356706b5-1a23-42f9-bbad-78dcc26dbddd\") " pod="openstack/neutron-6764df74d9-l26gg" Dec 03 08:04:47 crc kubenswrapper[4475]: I1203 08:04:47.419475 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/356706b5-1a23-42f9-bbad-78dcc26dbddd-httpd-config\") pod \"neutron-6764df74d9-l26gg\" (UID: \"356706b5-1a23-42f9-bbad-78dcc26dbddd\") " pod="openstack/neutron-6764df74d9-l26gg" Dec 03 08:04:47 crc kubenswrapper[4475]: I1203 08:04:47.520899 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bzfnc\" (UniqueName: \"kubernetes.io/projected/356706b5-1a23-42f9-bbad-78dcc26dbddd-kube-api-access-bzfnc\") pod \"neutron-6764df74d9-l26gg\" (UID: \"356706b5-1a23-42f9-bbad-78dcc26dbddd\") " pod="openstack/neutron-6764df74d9-l26gg" Dec 03 08:04:47 crc kubenswrapper[4475]: I1203 08:04:47.520954 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/secret/356706b5-1a23-42f9-bbad-78dcc26dbddd-config\") pod \"neutron-6764df74d9-l26gg\" (UID: \"356706b5-1a23-42f9-bbad-78dcc26dbddd\") " pod="openstack/neutron-6764df74d9-l26gg" Dec 03 08:04:47 crc kubenswrapper[4475]: I1203 08:04:47.520988 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/356706b5-1a23-42f9-bbad-78dcc26dbddd-combined-ca-bundle\") pod \"neutron-6764df74d9-l26gg\" (UID: \"356706b5-1a23-42f9-bbad-78dcc26dbddd\") " pod="openstack/neutron-6764df74d9-l26gg" Dec 03 08:04:47 crc kubenswrapper[4475]: I1203 08:04:47.521894 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/356706b5-1a23-42f9-bbad-78dcc26dbddd-httpd-config\") pod \"neutron-6764df74d9-l26gg\" (UID: \"356706b5-1a23-42f9-bbad-78dcc26dbddd\") " pod="openstack/neutron-6764df74d9-l26gg" Dec 03 08:04:47 crc kubenswrapper[4475]: I1203 08:04:47.521937 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/356706b5-1a23-42f9-bbad-78dcc26dbddd-public-tls-certs\") pod \"neutron-6764df74d9-l26gg\" (UID: \"356706b5-1a23-42f9-bbad-78dcc26dbddd\") " pod="openstack/neutron-6764df74d9-l26gg" Dec 03 08:04:47 crc kubenswrapper[4475]: I1203 08:04:47.522065 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/356706b5-1a23-42f9-bbad-78dcc26dbddd-internal-tls-certs\") pod \"neutron-6764df74d9-l26gg\" (UID: \"356706b5-1a23-42f9-bbad-78dcc26dbddd\") " pod="openstack/neutron-6764df74d9-l26gg" Dec 03 08:04:47 crc kubenswrapper[4475]: I1203 08:04:47.522122 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/356706b5-1a23-42f9-bbad-78dcc26dbddd-ovndb-tls-certs\") pod 
\"neutron-6764df74d9-l26gg\" (UID: \"356706b5-1a23-42f9-bbad-78dcc26dbddd\") " pod="openstack/neutron-6764df74d9-l26gg" Dec 03 08:04:47 crc kubenswrapper[4475]: I1203 08:04:47.526474 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/356706b5-1a23-42f9-bbad-78dcc26dbddd-config\") pod \"neutron-6764df74d9-l26gg\" (UID: \"356706b5-1a23-42f9-bbad-78dcc26dbddd\") " pod="openstack/neutron-6764df74d9-l26gg" Dec 03 08:04:47 crc kubenswrapper[4475]: I1203 08:04:47.526489 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/356706b5-1a23-42f9-bbad-78dcc26dbddd-combined-ca-bundle\") pod \"neutron-6764df74d9-l26gg\" (UID: \"356706b5-1a23-42f9-bbad-78dcc26dbddd\") " pod="openstack/neutron-6764df74d9-l26gg" Dec 03 08:04:47 crc kubenswrapper[4475]: I1203 08:04:47.527302 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/356706b5-1a23-42f9-bbad-78dcc26dbddd-ovndb-tls-certs\") pod \"neutron-6764df74d9-l26gg\" (UID: \"356706b5-1a23-42f9-bbad-78dcc26dbddd\") " pod="openstack/neutron-6764df74d9-l26gg" Dec 03 08:04:47 crc kubenswrapper[4475]: I1203 08:04:47.527784 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/356706b5-1a23-42f9-bbad-78dcc26dbddd-internal-tls-certs\") pod \"neutron-6764df74d9-l26gg\" (UID: \"356706b5-1a23-42f9-bbad-78dcc26dbddd\") " pod="openstack/neutron-6764df74d9-l26gg" Dec 03 08:04:47 crc kubenswrapper[4475]: I1203 08:04:47.528113 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/356706b5-1a23-42f9-bbad-78dcc26dbddd-httpd-config\") pod \"neutron-6764df74d9-l26gg\" (UID: \"356706b5-1a23-42f9-bbad-78dcc26dbddd\") " pod="openstack/neutron-6764df74d9-l26gg" Dec 03 08:04:47 crc 
kubenswrapper[4475]: I1203 08:04:47.539777 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/356706b5-1a23-42f9-bbad-78dcc26dbddd-public-tls-certs\") pod \"neutron-6764df74d9-l26gg\" (UID: \"356706b5-1a23-42f9-bbad-78dcc26dbddd\") " pod="openstack/neutron-6764df74d9-l26gg" Dec 03 08:04:47 crc kubenswrapper[4475]: I1203 08:04:47.541632 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bzfnc\" (UniqueName: \"kubernetes.io/projected/356706b5-1a23-42f9-bbad-78dcc26dbddd-kube-api-access-bzfnc\") pod \"neutron-6764df74d9-l26gg\" (UID: \"356706b5-1a23-42f9-bbad-78dcc26dbddd\") " pod="openstack/neutron-6764df74d9-l26gg" Dec 03 08:04:47 crc kubenswrapper[4475]: I1203 08:04:47.639766 4475 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-6764df74d9-l26gg" Dec 03 08:04:48 crc kubenswrapper[4475]: I1203 08:04:48.110677 4475 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-6764df74d9-l26gg"] Dec 03 08:04:48 crc kubenswrapper[4475]: I1203 08:04:48.569381 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-6764df74d9-l26gg" event={"ID":"356706b5-1a23-42f9-bbad-78dcc26dbddd","Type":"ContainerStarted","Data":"76f0165daae81d1d200744fdd06739e5b0a60a8e39e871564e858064ad07d503"} Dec 03 08:04:48 crc kubenswrapper[4475]: I1203 08:04:48.569625 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-6764df74d9-l26gg" event={"ID":"356706b5-1a23-42f9-bbad-78dcc26dbddd","Type":"ContainerStarted","Data":"3b4a8d0acbe337d3cc99d3dcd0afa6d9c1a8719f74cff3ee84b5aef4f73aff1a"} Dec 03 08:04:48 crc kubenswrapper[4475]: I1203 08:04:48.569638 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-6764df74d9-l26gg" 
event={"ID":"356706b5-1a23-42f9-bbad-78dcc26dbddd","Type":"ContainerStarted","Data":"25216b2f8bb18f50773fada67aa203dce5bb55e3be4b70440318054660945fa0"} Dec 03 08:04:48 crc kubenswrapper[4475]: I1203 08:04:48.569842 4475 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/neutron-6764df74d9-l26gg" Dec 03 08:04:48 crc kubenswrapper[4475]: I1203 08:04:48.589350 4475 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-6764df74d9-l26gg" podStartSLOduration=1.5893406620000001 podStartE2EDuration="1.589340662s" podCreationTimestamp="2025-12-03 08:04:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 08:04:48.583380496 +0000 UTC m=+4773.388278830" watchObservedRunningTime="2025-12-03 08:04:48.589340662 +0000 UTC m=+4773.394238996" Dec 03 08:04:58 crc kubenswrapper[4475]: I1203 08:04:58.491304 4475 scope.go:117] "RemoveContainer" containerID="5673dd08c62af6854c20f3bcbbfe224728fe60de8b224e4d69ca056cc25a013e" Dec 03 08:04:58 crc kubenswrapper[4475]: E1203 08:04:58.492479 4475 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-tjbzg_openshift-machine-config-operator(91aee7be-4a52-4598-803f-2deebe0674de)\"" pod="openshift-machine-config-operator/machine-config-daemon-tjbzg" podUID="91aee7be-4a52-4598-803f-2deebe0674de" Dec 03 08:05:08 crc kubenswrapper[4475]: I1203 08:05:08.377514 4475 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-pkg75"] Dec 03 08:05:08 crc kubenswrapper[4475]: I1203 08:05:08.380323 4475 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-pkg75" Dec 03 08:05:08 crc kubenswrapper[4475]: I1203 08:05:08.388176 4475 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-pkg75"] Dec 03 08:05:08 crc kubenswrapper[4475]: I1203 08:05:08.556958 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b3dac047-08c4-4d56-8df2-18073a1378c5-catalog-content\") pod \"redhat-operators-pkg75\" (UID: \"b3dac047-08c4-4d56-8df2-18073a1378c5\") " pod="openshift-marketplace/redhat-operators-pkg75" Dec 03 08:05:08 crc kubenswrapper[4475]: I1203 08:05:08.557040 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b3dac047-08c4-4d56-8df2-18073a1378c5-utilities\") pod \"redhat-operators-pkg75\" (UID: \"b3dac047-08c4-4d56-8df2-18073a1378c5\") " pod="openshift-marketplace/redhat-operators-pkg75" Dec 03 08:05:08 crc kubenswrapper[4475]: I1203 08:05:08.557063 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-75hc6\" (UniqueName: \"kubernetes.io/projected/b3dac047-08c4-4d56-8df2-18073a1378c5-kube-api-access-75hc6\") pod \"redhat-operators-pkg75\" (UID: \"b3dac047-08c4-4d56-8df2-18073a1378c5\") " pod="openshift-marketplace/redhat-operators-pkg75" Dec 03 08:05:08 crc kubenswrapper[4475]: I1203 08:05:08.658881 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b3dac047-08c4-4d56-8df2-18073a1378c5-catalog-content\") pod \"redhat-operators-pkg75\" (UID: \"b3dac047-08c4-4d56-8df2-18073a1378c5\") " pod="openshift-marketplace/redhat-operators-pkg75" Dec 03 08:05:08 crc kubenswrapper[4475]: I1203 08:05:08.658965 4475 reconciler_common.go:218] "operationExecutor.MountVolume started 
for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b3dac047-08c4-4d56-8df2-18073a1378c5-utilities\") pod \"redhat-operators-pkg75\" (UID: \"b3dac047-08c4-4d56-8df2-18073a1378c5\") " pod="openshift-marketplace/redhat-operators-pkg75" Dec 03 08:05:08 crc kubenswrapper[4475]: I1203 08:05:08.658985 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-75hc6\" (UniqueName: \"kubernetes.io/projected/b3dac047-08c4-4d56-8df2-18073a1378c5-kube-api-access-75hc6\") pod \"redhat-operators-pkg75\" (UID: \"b3dac047-08c4-4d56-8df2-18073a1378c5\") " pod="openshift-marketplace/redhat-operators-pkg75" Dec 03 08:05:08 crc kubenswrapper[4475]: I1203 08:05:08.659627 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b3dac047-08c4-4d56-8df2-18073a1378c5-utilities\") pod \"redhat-operators-pkg75\" (UID: \"b3dac047-08c4-4d56-8df2-18073a1378c5\") " pod="openshift-marketplace/redhat-operators-pkg75" Dec 03 08:05:08 crc kubenswrapper[4475]: I1203 08:05:08.659874 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b3dac047-08c4-4d56-8df2-18073a1378c5-catalog-content\") pod \"redhat-operators-pkg75\" (UID: \"b3dac047-08c4-4d56-8df2-18073a1378c5\") " pod="openshift-marketplace/redhat-operators-pkg75" Dec 03 08:05:08 crc kubenswrapper[4475]: I1203 08:05:08.676058 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-75hc6\" (UniqueName: \"kubernetes.io/projected/b3dac047-08c4-4d56-8df2-18073a1378c5-kube-api-access-75hc6\") pod \"redhat-operators-pkg75\" (UID: \"b3dac047-08c4-4d56-8df2-18073a1378c5\") " pod="openshift-marketplace/redhat-operators-pkg75" Dec 03 08:05:08 crc kubenswrapper[4475]: I1203 08:05:08.697928 4475 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-pkg75" Dec 03 08:05:09 crc kubenswrapper[4475]: I1203 08:05:09.167225 4475 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-pkg75"] Dec 03 08:05:09 crc kubenswrapper[4475]: I1203 08:05:09.709712 4475 generic.go:334] "Generic (PLEG): container finished" podID="b3dac047-08c4-4d56-8df2-18073a1378c5" containerID="5731f0597818afbe4d3069bb45600ef675f2301cb3061f0845a15b5997fb3129" exitCode=0 Dec 03 08:05:09 crc kubenswrapper[4475]: I1203 08:05:09.709787 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-pkg75" event={"ID":"b3dac047-08c4-4d56-8df2-18073a1378c5","Type":"ContainerDied","Data":"5731f0597818afbe4d3069bb45600ef675f2301cb3061f0845a15b5997fb3129"} Dec 03 08:05:09 crc kubenswrapper[4475]: I1203 08:05:09.709941 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-pkg75" event={"ID":"b3dac047-08c4-4d56-8df2-18073a1378c5","Type":"ContainerStarted","Data":"8b116a45a94a71b26e5095d6df86c6b9b4df9090e3b080717a0b7148e0b857b9"} Dec 03 08:05:09 crc kubenswrapper[4475]: I1203 08:05:09.712293 4475 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Dec 03 08:05:11 crc kubenswrapper[4475]: I1203 08:05:11.742521 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-pkg75" event={"ID":"b3dac047-08c4-4d56-8df2-18073a1378c5","Type":"ContainerStarted","Data":"bd5c8487a28e1ed99b1bdb819a9c041c51d15ede4a2a46f4cd0c4409aa2333ea"} Dec 03 08:05:12 crc kubenswrapper[4475]: I1203 08:05:12.490949 4475 scope.go:117] "RemoveContainer" containerID="5673dd08c62af6854c20f3bcbbfe224728fe60de8b224e4d69ca056cc25a013e" Dec 03 08:05:12 crc kubenswrapper[4475]: E1203 08:05:12.491509 4475 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with 
CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-tjbzg_openshift-machine-config-operator(91aee7be-4a52-4598-803f-2deebe0674de)\"" pod="openshift-machine-config-operator/machine-config-daemon-tjbzg" podUID="91aee7be-4a52-4598-803f-2deebe0674de" Dec 03 08:05:13 crc kubenswrapper[4475]: I1203 08:05:13.756770 4475 generic.go:334] "Generic (PLEG): container finished" podID="b3dac047-08c4-4d56-8df2-18073a1378c5" containerID="bd5c8487a28e1ed99b1bdb819a9c041c51d15ede4a2a46f4cd0c4409aa2333ea" exitCode=0 Dec 03 08:05:13 crc kubenswrapper[4475]: I1203 08:05:13.756965 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-pkg75" event={"ID":"b3dac047-08c4-4d56-8df2-18073a1378c5","Type":"ContainerDied","Data":"bd5c8487a28e1ed99b1bdb819a9c041c51d15ede4a2a46f4cd0c4409aa2333ea"} Dec 03 08:05:14 crc kubenswrapper[4475]: I1203 08:05:14.766101 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-pkg75" event={"ID":"b3dac047-08c4-4d56-8df2-18073a1378c5","Type":"ContainerStarted","Data":"b5f2710266254e381a01f96ff39202a9cd451083df240719ba3ee8042403485a"} Dec 03 08:05:14 crc kubenswrapper[4475]: I1203 08:05:14.783776 4475 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-pkg75" podStartSLOduration=2.115284083 podStartE2EDuration="6.783762759s" podCreationTimestamp="2025-12-03 08:05:08 +0000 UTC" firstStartedPulling="2025-12-03 08:05:09.71094029 +0000 UTC m=+4794.515838624" lastFinishedPulling="2025-12-03 08:05:14.379418966 +0000 UTC m=+4799.184317300" observedRunningTime="2025-12-03 08:05:14.778166828 +0000 UTC m=+4799.583065163" watchObservedRunningTime="2025-12-03 08:05:14.783762759 +0000 UTC m=+4799.588661094" Dec 03 08:05:17 crc kubenswrapper[4475]: I1203 08:05:17.657211 4475 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openstack/neutron-6764df74d9-l26gg" Dec 03 08:05:17 crc kubenswrapper[4475]: I1203 08:05:17.779216 4475 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-8798f5df-qtg6w"] Dec 03 08:05:17 crc kubenswrapper[4475]: I1203 08:05:17.780238 4475 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/neutron-8798f5df-qtg6w" podUID="cf14cca6-0927-4a49-9c3b-70dc49f21c47" containerName="neutron-httpd" containerID="cri-o://3ae21905fddfd4a80f665d37ed6f4dd956f5c267d985a75828da67f6384d111e" gracePeriod=30 Dec 03 08:05:17 crc kubenswrapper[4475]: I1203 08:05:17.781564 4475 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/neutron-8798f5df-qtg6w" podUID="cf14cca6-0927-4a49-9c3b-70dc49f21c47" containerName="neutron-api" containerID="cri-o://440dc37fef8814f783568a1d202a9a26c3ea3934e1de321caac29f8d17bc4f8a" gracePeriod=30 Dec 03 08:05:18 crc kubenswrapper[4475]: I1203 08:05:18.698497 4475 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-pkg75" Dec 03 08:05:18 crc kubenswrapper[4475]: I1203 08:05:18.699631 4475 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-pkg75" Dec 03 08:05:18 crc kubenswrapper[4475]: I1203 08:05:18.796049 4475 generic.go:334] "Generic (PLEG): container finished" podID="cf14cca6-0927-4a49-9c3b-70dc49f21c47" containerID="3ae21905fddfd4a80f665d37ed6f4dd956f5c267d985a75828da67f6384d111e" exitCode=0 Dec 03 08:05:18 crc kubenswrapper[4475]: I1203 08:05:18.796127 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-8798f5df-qtg6w" event={"ID":"cf14cca6-0927-4a49-9c3b-70dc49f21c47","Type":"ContainerDied","Data":"3ae21905fddfd4a80f665d37ed6f4dd956f5c267d985a75828da67f6384d111e"} Dec 03 08:05:19 crc kubenswrapper[4475]: I1203 08:05:19.740616 4475 prober.go:107] "Probe failed" probeType="Startup" 
pod="openshift-marketplace/redhat-operators-pkg75" podUID="b3dac047-08c4-4d56-8df2-18073a1378c5" containerName="registry-server" probeResult="failure" output=< Dec 03 08:05:19 crc kubenswrapper[4475]: timeout: failed to connect service ":50051" within 1s Dec 03 08:05:19 crc kubenswrapper[4475]: > Dec 03 08:05:27 crc kubenswrapper[4475]: I1203 08:05:27.494253 4475 scope.go:117] "RemoveContainer" containerID="5673dd08c62af6854c20f3bcbbfe224728fe60de8b224e4d69ca056cc25a013e" Dec 03 08:05:27 crc kubenswrapper[4475]: E1203 08:05:27.494843 4475 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-tjbzg_openshift-machine-config-operator(91aee7be-4a52-4598-803f-2deebe0674de)\"" pod="openshift-machine-config-operator/machine-config-daemon-tjbzg" podUID="91aee7be-4a52-4598-803f-2deebe0674de" Dec 03 08:05:28 crc kubenswrapper[4475]: I1203 08:05:28.735355 4475 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-pkg75" Dec 03 08:05:28 crc kubenswrapper[4475]: I1203 08:05:28.778895 4475 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-pkg75" Dec 03 08:05:28 crc kubenswrapper[4475]: I1203 08:05:28.963531 4475 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-pkg75"] Dec 03 08:05:29 crc kubenswrapper[4475]: I1203 08:05:29.861124 4475 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-pkg75" podUID="b3dac047-08c4-4d56-8df2-18073a1378c5" containerName="registry-server" containerID="cri-o://b5f2710266254e381a01f96ff39202a9cd451083df240719ba3ee8042403485a" gracePeriod=2 Dec 03 08:05:30 crc kubenswrapper[4475]: I1203 08:05:30.310263 4475 util.go:48] "No ready sandbox for pod can be 
found. Need to start a new one" pod="openshift-marketplace/redhat-operators-pkg75" Dec 03 08:05:30 crc kubenswrapper[4475]: I1203 08:05:30.428098 4475 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b3dac047-08c4-4d56-8df2-18073a1378c5-catalog-content\") pod \"b3dac047-08c4-4d56-8df2-18073a1378c5\" (UID: \"b3dac047-08c4-4d56-8df2-18073a1378c5\") " Dec 03 08:05:30 crc kubenswrapper[4475]: I1203 08:05:30.428318 4475 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-75hc6\" (UniqueName: \"kubernetes.io/projected/b3dac047-08c4-4d56-8df2-18073a1378c5-kube-api-access-75hc6\") pod \"b3dac047-08c4-4d56-8df2-18073a1378c5\" (UID: \"b3dac047-08c4-4d56-8df2-18073a1378c5\") " Dec 03 08:05:30 crc kubenswrapper[4475]: I1203 08:05:30.428355 4475 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b3dac047-08c4-4d56-8df2-18073a1378c5-utilities\") pod \"b3dac047-08c4-4d56-8df2-18073a1378c5\" (UID: \"b3dac047-08c4-4d56-8df2-18073a1378c5\") " Dec 03 08:05:30 crc kubenswrapper[4475]: I1203 08:05:30.435395 4475 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b3dac047-08c4-4d56-8df2-18073a1378c5-utilities" (OuterVolumeSpecName: "utilities") pod "b3dac047-08c4-4d56-8df2-18073a1378c5" (UID: "b3dac047-08c4-4d56-8df2-18073a1378c5"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 08:05:30 crc kubenswrapper[4475]: I1203 08:05:30.444673 4475 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b3dac047-08c4-4d56-8df2-18073a1378c5-kube-api-access-75hc6" (OuterVolumeSpecName: "kube-api-access-75hc6") pod "b3dac047-08c4-4d56-8df2-18073a1378c5" (UID: "b3dac047-08c4-4d56-8df2-18073a1378c5"). InnerVolumeSpecName "kube-api-access-75hc6". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 08:05:30 crc kubenswrapper[4475]: I1203 08:05:30.523671 4475 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b3dac047-08c4-4d56-8df2-18073a1378c5-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "b3dac047-08c4-4d56-8df2-18073a1378c5" (UID: "b3dac047-08c4-4d56-8df2-18073a1378c5"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 08:05:30 crc kubenswrapper[4475]: I1203 08:05:30.531063 4475 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b3dac047-08c4-4d56-8df2-18073a1378c5-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 03 08:05:30 crc kubenswrapper[4475]: I1203 08:05:30.531093 4475 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-75hc6\" (UniqueName: \"kubernetes.io/projected/b3dac047-08c4-4d56-8df2-18073a1378c5-kube-api-access-75hc6\") on node \"crc\" DevicePath \"\"" Dec 03 08:05:30 crc kubenswrapper[4475]: I1203 08:05:30.531104 4475 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b3dac047-08c4-4d56-8df2-18073a1378c5-utilities\") on node \"crc\" DevicePath \"\"" Dec 03 08:05:30 crc kubenswrapper[4475]: I1203 08:05:30.869143 4475 generic.go:334] "Generic (PLEG): container finished" podID="b3dac047-08c4-4d56-8df2-18073a1378c5" containerID="b5f2710266254e381a01f96ff39202a9cd451083df240719ba3ee8042403485a" exitCode=0 Dec 03 08:05:30 crc kubenswrapper[4475]: I1203 08:05:30.869183 4475 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-pkg75" Dec 03 08:05:30 crc kubenswrapper[4475]: I1203 08:05:30.869209 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-pkg75" event={"ID":"b3dac047-08c4-4d56-8df2-18073a1378c5","Type":"ContainerDied","Data":"b5f2710266254e381a01f96ff39202a9cd451083df240719ba3ee8042403485a"} Dec 03 08:05:30 crc kubenswrapper[4475]: I1203 08:05:30.875566 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-pkg75" event={"ID":"b3dac047-08c4-4d56-8df2-18073a1378c5","Type":"ContainerDied","Data":"8b116a45a94a71b26e5095d6df86c6b9b4df9090e3b080717a0b7148e0b857b9"} Dec 03 08:05:30 crc kubenswrapper[4475]: I1203 08:05:30.875603 4475 scope.go:117] "RemoveContainer" containerID="b5f2710266254e381a01f96ff39202a9cd451083df240719ba3ee8042403485a" Dec 03 08:05:30 crc kubenswrapper[4475]: I1203 08:05:30.894073 4475 scope.go:117] "RemoveContainer" containerID="bd5c8487a28e1ed99b1bdb819a9c041c51d15ede4a2a46f4cd0c4409aa2333ea" Dec 03 08:05:30 crc kubenswrapper[4475]: I1203 08:05:30.900048 4475 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-pkg75"] Dec 03 08:05:30 crc kubenswrapper[4475]: I1203 08:05:30.906319 4475 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-pkg75"] Dec 03 08:05:30 crc kubenswrapper[4475]: I1203 08:05:30.910031 4475 scope.go:117] "RemoveContainer" containerID="5731f0597818afbe4d3069bb45600ef675f2301cb3061f0845a15b5997fb3129" Dec 03 08:05:30 crc kubenswrapper[4475]: I1203 08:05:30.942579 4475 scope.go:117] "RemoveContainer" containerID="b5f2710266254e381a01f96ff39202a9cd451083df240719ba3ee8042403485a" Dec 03 08:05:30 crc kubenswrapper[4475]: E1203 08:05:30.954759 4475 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"b5f2710266254e381a01f96ff39202a9cd451083df240719ba3ee8042403485a\": container with ID starting with b5f2710266254e381a01f96ff39202a9cd451083df240719ba3ee8042403485a not found: ID does not exist" containerID="b5f2710266254e381a01f96ff39202a9cd451083df240719ba3ee8042403485a" Dec 03 08:05:30 crc kubenswrapper[4475]: I1203 08:05:30.954910 4475 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b5f2710266254e381a01f96ff39202a9cd451083df240719ba3ee8042403485a"} err="failed to get container status \"b5f2710266254e381a01f96ff39202a9cd451083df240719ba3ee8042403485a\": rpc error: code = NotFound desc = could not find container \"b5f2710266254e381a01f96ff39202a9cd451083df240719ba3ee8042403485a\": container with ID starting with b5f2710266254e381a01f96ff39202a9cd451083df240719ba3ee8042403485a not found: ID does not exist" Dec 03 08:05:30 crc kubenswrapper[4475]: I1203 08:05:30.954940 4475 scope.go:117] "RemoveContainer" containerID="bd5c8487a28e1ed99b1bdb819a9c041c51d15ede4a2a46f4cd0c4409aa2333ea" Dec 03 08:05:30 crc kubenswrapper[4475]: E1203 08:05:30.962570 4475 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"bd5c8487a28e1ed99b1bdb819a9c041c51d15ede4a2a46f4cd0c4409aa2333ea\": container with ID starting with bd5c8487a28e1ed99b1bdb819a9c041c51d15ede4a2a46f4cd0c4409aa2333ea not found: ID does not exist" containerID="bd5c8487a28e1ed99b1bdb819a9c041c51d15ede4a2a46f4cd0c4409aa2333ea" Dec 03 08:05:30 crc kubenswrapper[4475]: I1203 08:05:30.962597 4475 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bd5c8487a28e1ed99b1bdb819a9c041c51d15ede4a2a46f4cd0c4409aa2333ea"} err="failed to get container status \"bd5c8487a28e1ed99b1bdb819a9c041c51d15ede4a2a46f4cd0c4409aa2333ea\": rpc error: code = NotFound desc = could not find container \"bd5c8487a28e1ed99b1bdb819a9c041c51d15ede4a2a46f4cd0c4409aa2333ea\": container with ID 
starting with bd5c8487a28e1ed99b1bdb819a9c041c51d15ede4a2a46f4cd0c4409aa2333ea not found: ID does not exist" Dec 03 08:05:30 crc kubenswrapper[4475]: I1203 08:05:30.962617 4475 scope.go:117] "RemoveContainer" containerID="5731f0597818afbe4d3069bb45600ef675f2301cb3061f0845a15b5997fb3129" Dec 03 08:05:30 crc kubenswrapper[4475]: E1203 08:05:30.962868 4475 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5731f0597818afbe4d3069bb45600ef675f2301cb3061f0845a15b5997fb3129\": container with ID starting with 5731f0597818afbe4d3069bb45600ef675f2301cb3061f0845a15b5997fb3129 not found: ID does not exist" containerID="5731f0597818afbe4d3069bb45600ef675f2301cb3061f0845a15b5997fb3129" Dec 03 08:05:30 crc kubenswrapper[4475]: I1203 08:05:30.962908 4475 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5731f0597818afbe4d3069bb45600ef675f2301cb3061f0845a15b5997fb3129"} err="failed to get container status \"5731f0597818afbe4d3069bb45600ef675f2301cb3061f0845a15b5997fb3129\": rpc error: code = NotFound desc = could not find container \"5731f0597818afbe4d3069bb45600ef675f2301cb3061f0845a15b5997fb3129\": container with ID starting with 5731f0597818afbe4d3069bb45600ef675f2301cb3061f0845a15b5997fb3129 not found: ID does not exist" Dec 03 08:05:31 crc kubenswrapper[4475]: I1203 08:05:31.498911 4475 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b3dac047-08c4-4d56-8df2-18073a1378c5" path="/var/lib/kubelet/pods/b3dac047-08c4-4d56-8df2-18073a1378c5/volumes" Dec 03 08:05:32 crc kubenswrapper[4475]: I1203 08:05:32.425136 4475 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-8798f5df-qtg6w" Dec 03 08:05:32 crc kubenswrapper[4475]: I1203 08:05:32.564772 4475 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/cf14cca6-0927-4a49-9c3b-70dc49f21c47-internal-tls-certs\") pod \"cf14cca6-0927-4a49-9c3b-70dc49f21c47\" (UID: \"cf14cca6-0927-4a49-9c3b-70dc49f21c47\") " Dec 03 08:05:32 crc kubenswrapper[4475]: I1203 08:05:32.564829 4475 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/cf14cca6-0927-4a49-9c3b-70dc49f21c47-config\") pod \"cf14cca6-0927-4a49-9c3b-70dc49f21c47\" (UID: \"cf14cca6-0927-4a49-9c3b-70dc49f21c47\") " Dec 03 08:05:32 crc kubenswrapper[4475]: I1203 08:05:32.564937 4475 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8hckp\" (UniqueName: \"kubernetes.io/projected/cf14cca6-0927-4a49-9c3b-70dc49f21c47-kube-api-access-8hckp\") pod \"cf14cca6-0927-4a49-9c3b-70dc49f21c47\" (UID: \"cf14cca6-0927-4a49-9c3b-70dc49f21c47\") " Dec 03 08:05:32 crc kubenswrapper[4475]: I1203 08:05:32.564998 4475 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/cf14cca6-0927-4a49-9c3b-70dc49f21c47-public-tls-certs\") pod \"cf14cca6-0927-4a49-9c3b-70dc49f21c47\" (UID: \"cf14cca6-0927-4a49-9c3b-70dc49f21c47\") " Dec 03 08:05:32 crc kubenswrapper[4475]: I1203 08:05:32.565069 4475 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cf14cca6-0927-4a49-9c3b-70dc49f21c47-combined-ca-bundle\") pod \"cf14cca6-0927-4a49-9c3b-70dc49f21c47\" (UID: \"cf14cca6-0927-4a49-9c3b-70dc49f21c47\") " Dec 03 08:05:32 crc kubenswrapper[4475]: I1203 08:05:32.565096 4475 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"httpd-config\" (UniqueName: \"kubernetes.io/secret/cf14cca6-0927-4a49-9c3b-70dc49f21c47-httpd-config\") pod \"cf14cca6-0927-4a49-9c3b-70dc49f21c47\" (UID: \"cf14cca6-0927-4a49-9c3b-70dc49f21c47\") " Dec 03 08:05:32 crc kubenswrapper[4475]: I1203 08:05:32.565235 4475 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/cf14cca6-0927-4a49-9c3b-70dc49f21c47-ovndb-tls-certs\") pod \"cf14cca6-0927-4a49-9c3b-70dc49f21c47\" (UID: \"cf14cca6-0927-4a49-9c3b-70dc49f21c47\") " Dec 03 08:05:32 crc kubenswrapper[4475]: I1203 08:05:32.576056 4475 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cf14cca6-0927-4a49-9c3b-70dc49f21c47-httpd-config" (OuterVolumeSpecName: "httpd-config") pod "cf14cca6-0927-4a49-9c3b-70dc49f21c47" (UID: "cf14cca6-0927-4a49-9c3b-70dc49f21c47"). InnerVolumeSpecName "httpd-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 08:05:32 crc kubenswrapper[4475]: I1203 08:05:32.576070 4475 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cf14cca6-0927-4a49-9c3b-70dc49f21c47-kube-api-access-8hckp" (OuterVolumeSpecName: "kube-api-access-8hckp") pod "cf14cca6-0927-4a49-9c3b-70dc49f21c47" (UID: "cf14cca6-0927-4a49-9c3b-70dc49f21c47"). InnerVolumeSpecName "kube-api-access-8hckp". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 08:05:32 crc kubenswrapper[4475]: I1203 08:05:32.602236 4475 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cf14cca6-0927-4a49-9c3b-70dc49f21c47-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "cf14cca6-0927-4a49-9c3b-70dc49f21c47" (UID: "cf14cca6-0927-4a49-9c3b-70dc49f21c47"). InnerVolumeSpecName "internal-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 08:05:32 crc kubenswrapper[4475]: I1203 08:05:32.609320 4475 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cf14cca6-0927-4a49-9c3b-70dc49f21c47-config" (OuterVolumeSpecName: "config") pod "cf14cca6-0927-4a49-9c3b-70dc49f21c47" (UID: "cf14cca6-0927-4a49-9c3b-70dc49f21c47"). InnerVolumeSpecName "config". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 08:05:32 crc kubenswrapper[4475]: I1203 08:05:32.609865 4475 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cf14cca6-0927-4a49-9c3b-70dc49f21c47-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "cf14cca6-0927-4a49-9c3b-70dc49f21c47" (UID: "cf14cca6-0927-4a49-9c3b-70dc49f21c47"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 08:05:32 crc kubenswrapper[4475]: I1203 08:05:32.623209 4475 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cf14cca6-0927-4a49-9c3b-70dc49f21c47-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "cf14cca6-0927-4a49-9c3b-70dc49f21c47" (UID: "cf14cca6-0927-4a49-9c3b-70dc49f21c47"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 08:05:32 crc kubenswrapper[4475]: I1203 08:05:32.629374 4475 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cf14cca6-0927-4a49-9c3b-70dc49f21c47-ovndb-tls-certs" (OuterVolumeSpecName: "ovndb-tls-certs") pod "cf14cca6-0927-4a49-9c3b-70dc49f21c47" (UID: "cf14cca6-0927-4a49-9c3b-70dc49f21c47"). InnerVolumeSpecName "ovndb-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 08:05:32 crc kubenswrapper[4475]: I1203 08:05:32.667775 4475 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/cf14cca6-0927-4a49-9c3b-70dc49f21c47-public-tls-certs\") on node \"crc\" DevicePath \"\"" Dec 03 08:05:32 crc kubenswrapper[4475]: I1203 08:05:32.667802 4475 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cf14cca6-0927-4a49-9c3b-70dc49f21c47-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 03 08:05:32 crc kubenswrapper[4475]: I1203 08:05:32.667813 4475 reconciler_common.go:293] "Volume detached for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/cf14cca6-0927-4a49-9c3b-70dc49f21c47-httpd-config\") on node \"crc\" DevicePath \"\"" Dec 03 08:05:32 crc kubenswrapper[4475]: I1203 08:05:32.667821 4475 reconciler_common.go:293] "Volume detached for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/cf14cca6-0927-4a49-9c3b-70dc49f21c47-ovndb-tls-certs\") on node \"crc\" DevicePath \"\"" Dec 03 08:05:32 crc kubenswrapper[4475]: I1203 08:05:32.667829 4475 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/cf14cca6-0927-4a49-9c3b-70dc49f21c47-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Dec 03 08:05:32 crc kubenswrapper[4475]: I1203 08:05:32.667837 4475 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/cf14cca6-0927-4a49-9c3b-70dc49f21c47-config\") on node \"crc\" DevicePath \"\"" Dec 03 08:05:32 crc kubenswrapper[4475]: I1203 08:05:32.667845 4475 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8hckp\" (UniqueName: \"kubernetes.io/projected/cf14cca6-0927-4a49-9c3b-70dc49f21c47-kube-api-access-8hckp\") on node \"crc\" DevicePath \"\"" Dec 03 08:05:32 crc kubenswrapper[4475]: I1203 08:05:32.894665 4475 
generic.go:334] "Generic (PLEG): container finished" podID="cf14cca6-0927-4a49-9c3b-70dc49f21c47" containerID="440dc37fef8814f783568a1d202a9a26c3ea3934e1de321caac29f8d17bc4f8a" exitCode=0 Dec 03 08:05:32 crc kubenswrapper[4475]: I1203 08:05:32.894739 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-8798f5df-qtg6w" event={"ID":"cf14cca6-0927-4a49-9c3b-70dc49f21c47","Type":"ContainerDied","Data":"440dc37fef8814f783568a1d202a9a26c3ea3934e1de321caac29f8d17bc4f8a"} Dec 03 08:05:32 crc kubenswrapper[4475]: I1203 08:05:32.894899 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-8798f5df-qtg6w" event={"ID":"cf14cca6-0927-4a49-9c3b-70dc49f21c47","Type":"ContainerDied","Data":"1c08b1cd2617a128945439f98e79d34b60a3d1a3c4d829363e8f3d30574b9b79"} Dec 03 08:05:32 crc kubenswrapper[4475]: I1203 08:05:32.894919 4475 scope.go:117] "RemoveContainer" containerID="3ae21905fddfd4a80f665d37ed6f4dd956f5c267d985a75828da67f6384d111e" Dec 03 08:05:32 crc kubenswrapper[4475]: I1203 08:05:32.894763 4475 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-8798f5df-qtg6w" Dec 03 08:05:32 crc kubenswrapper[4475]: I1203 08:05:32.915772 4475 scope.go:117] "RemoveContainer" containerID="440dc37fef8814f783568a1d202a9a26c3ea3934e1de321caac29f8d17bc4f8a" Dec 03 08:05:32 crc kubenswrapper[4475]: I1203 08:05:32.922944 4475 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-8798f5df-qtg6w"] Dec 03 08:05:32 crc kubenswrapper[4475]: I1203 08:05:32.928813 4475 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-8798f5df-qtg6w"] Dec 03 08:05:32 crc kubenswrapper[4475]: I1203 08:05:32.934998 4475 scope.go:117] "RemoveContainer" containerID="3ae21905fddfd4a80f665d37ed6f4dd956f5c267d985a75828da67f6384d111e" Dec 03 08:05:32 crc kubenswrapper[4475]: E1203 08:05:32.935313 4475 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3ae21905fddfd4a80f665d37ed6f4dd956f5c267d985a75828da67f6384d111e\": container with ID starting with 3ae21905fddfd4a80f665d37ed6f4dd956f5c267d985a75828da67f6384d111e not found: ID does not exist" containerID="3ae21905fddfd4a80f665d37ed6f4dd956f5c267d985a75828da67f6384d111e" Dec 03 08:05:32 crc kubenswrapper[4475]: I1203 08:05:32.935346 4475 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3ae21905fddfd4a80f665d37ed6f4dd956f5c267d985a75828da67f6384d111e"} err="failed to get container status \"3ae21905fddfd4a80f665d37ed6f4dd956f5c267d985a75828da67f6384d111e\": rpc error: code = NotFound desc = could not find container \"3ae21905fddfd4a80f665d37ed6f4dd956f5c267d985a75828da67f6384d111e\": container with ID starting with 3ae21905fddfd4a80f665d37ed6f4dd956f5c267d985a75828da67f6384d111e not found: ID does not exist" Dec 03 08:05:32 crc kubenswrapper[4475]: I1203 08:05:32.935368 4475 scope.go:117] "RemoveContainer" containerID="440dc37fef8814f783568a1d202a9a26c3ea3934e1de321caac29f8d17bc4f8a" Dec 03 08:05:32 crc 
kubenswrapper[4475]: E1203 08:05:32.935657 4475 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"440dc37fef8814f783568a1d202a9a26c3ea3934e1de321caac29f8d17bc4f8a\": container with ID starting with 440dc37fef8814f783568a1d202a9a26c3ea3934e1de321caac29f8d17bc4f8a not found: ID does not exist" containerID="440dc37fef8814f783568a1d202a9a26c3ea3934e1de321caac29f8d17bc4f8a" Dec 03 08:05:32 crc kubenswrapper[4475]: I1203 08:05:32.935684 4475 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"440dc37fef8814f783568a1d202a9a26c3ea3934e1de321caac29f8d17bc4f8a"} err="failed to get container status \"440dc37fef8814f783568a1d202a9a26c3ea3934e1de321caac29f8d17bc4f8a\": rpc error: code = NotFound desc = could not find container \"440dc37fef8814f783568a1d202a9a26c3ea3934e1de321caac29f8d17bc4f8a\": container with ID starting with 440dc37fef8814f783568a1d202a9a26c3ea3934e1de321caac29f8d17bc4f8a not found: ID does not exist" Dec 03 08:05:33 crc kubenswrapper[4475]: I1203 08:05:33.500577 4475 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cf14cca6-0927-4a49-9c3b-70dc49f21c47" path="/var/lib/kubelet/pods/cf14cca6-0927-4a49-9c3b-70dc49f21c47/volumes" Dec 03 08:05:40 crc kubenswrapper[4475]: I1203 08:05:40.490755 4475 scope.go:117] "RemoveContainer" containerID="5673dd08c62af6854c20f3bcbbfe224728fe60de8b224e4d69ca056cc25a013e" Dec 03 08:05:40 crc kubenswrapper[4475]: E1203 08:05:40.492978 4475 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-tjbzg_openshift-machine-config-operator(91aee7be-4a52-4598-803f-2deebe0674de)\"" pod="openshift-machine-config-operator/machine-config-daemon-tjbzg" podUID="91aee7be-4a52-4598-803f-2deebe0674de" Dec 03 08:05:54 crc 
kubenswrapper[4475]: I1203 08:05:54.491310 4475 scope.go:117] "RemoveContainer" containerID="5673dd08c62af6854c20f3bcbbfe224728fe60de8b224e4d69ca056cc25a013e" Dec 03 08:05:54 crc kubenswrapper[4475]: E1203 08:05:54.491842 4475 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-tjbzg_openshift-machine-config-operator(91aee7be-4a52-4598-803f-2deebe0674de)\"" pod="openshift-machine-config-operator/machine-config-daemon-tjbzg" podUID="91aee7be-4a52-4598-803f-2deebe0674de" Dec 03 08:06:06 crc kubenswrapper[4475]: I1203 08:06:06.491128 4475 scope.go:117] "RemoveContainer" containerID="5673dd08c62af6854c20f3bcbbfe224728fe60de8b224e4d69ca056cc25a013e" Dec 03 08:06:06 crc kubenswrapper[4475]: E1203 08:06:06.492274 4475 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-tjbzg_openshift-machine-config-operator(91aee7be-4a52-4598-803f-2deebe0674de)\"" pod="openshift-machine-config-operator/machine-config-daemon-tjbzg" podUID="91aee7be-4a52-4598-803f-2deebe0674de" Dec 03 08:06:06 crc kubenswrapper[4475]: I1203 08:06:06.580656 4475 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-h5qh4"] Dec 03 08:06:06 crc kubenswrapper[4475]: E1203 08:06:06.580968 4475 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b3dac047-08c4-4d56-8df2-18073a1378c5" containerName="extract-content" Dec 03 08:06:06 crc kubenswrapper[4475]: I1203 08:06:06.580985 4475 state_mem.go:107] "Deleted CPUSet assignment" podUID="b3dac047-08c4-4d56-8df2-18073a1378c5" containerName="extract-content" Dec 03 08:06:06 crc kubenswrapper[4475]: E1203 08:06:06.580998 4475 cpu_manager.go:410] "RemoveStaleState: 
removing container" podUID="b3dac047-08c4-4d56-8df2-18073a1378c5" containerName="registry-server" Dec 03 08:06:06 crc kubenswrapper[4475]: I1203 08:06:06.581004 4475 state_mem.go:107] "Deleted CPUSet assignment" podUID="b3dac047-08c4-4d56-8df2-18073a1378c5" containerName="registry-server" Dec 03 08:06:06 crc kubenswrapper[4475]: E1203 08:06:06.581022 4475 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cf14cca6-0927-4a49-9c3b-70dc49f21c47" containerName="neutron-api" Dec 03 08:06:06 crc kubenswrapper[4475]: I1203 08:06:06.581028 4475 state_mem.go:107] "Deleted CPUSet assignment" podUID="cf14cca6-0927-4a49-9c3b-70dc49f21c47" containerName="neutron-api" Dec 03 08:06:06 crc kubenswrapper[4475]: E1203 08:06:06.581046 4475 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b3dac047-08c4-4d56-8df2-18073a1378c5" containerName="extract-utilities" Dec 03 08:06:06 crc kubenswrapper[4475]: I1203 08:06:06.581051 4475 state_mem.go:107] "Deleted CPUSet assignment" podUID="b3dac047-08c4-4d56-8df2-18073a1378c5" containerName="extract-utilities" Dec 03 08:06:06 crc kubenswrapper[4475]: E1203 08:06:06.581071 4475 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cf14cca6-0927-4a49-9c3b-70dc49f21c47" containerName="neutron-httpd" Dec 03 08:06:06 crc kubenswrapper[4475]: I1203 08:06:06.581076 4475 state_mem.go:107] "Deleted CPUSet assignment" podUID="cf14cca6-0927-4a49-9c3b-70dc49f21c47" containerName="neutron-httpd" Dec 03 08:06:06 crc kubenswrapper[4475]: I1203 08:06:06.581228 4475 memory_manager.go:354] "RemoveStaleState removing state" podUID="cf14cca6-0927-4a49-9c3b-70dc49f21c47" containerName="neutron-httpd" Dec 03 08:06:06 crc kubenswrapper[4475]: I1203 08:06:06.581242 4475 memory_manager.go:354] "RemoveStaleState removing state" podUID="cf14cca6-0927-4a49-9c3b-70dc49f21c47" containerName="neutron-api" Dec 03 08:06:06 crc kubenswrapper[4475]: I1203 08:06:06.581258 4475 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="b3dac047-08c4-4d56-8df2-18073a1378c5" containerName="registry-server" Dec 03 08:06:06 crc kubenswrapper[4475]: I1203 08:06:06.585185 4475 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-h5qh4" Dec 03 08:06:06 crc kubenswrapper[4475]: I1203 08:06:06.590753 4475 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-h5qh4"] Dec 03 08:06:06 crc kubenswrapper[4475]: I1203 08:06:06.623978 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/35d44777-e78c-4101-bd31-46e460b2f334-utilities\") pod \"community-operators-h5qh4\" (UID: \"35d44777-e78c-4101-bd31-46e460b2f334\") " pod="openshift-marketplace/community-operators-h5qh4" Dec 03 08:06:06 crc kubenswrapper[4475]: I1203 08:06:06.624017 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k58qn\" (UniqueName: \"kubernetes.io/projected/35d44777-e78c-4101-bd31-46e460b2f334-kube-api-access-k58qn\") pod \"community-operators-h5qh4\" (UID: \"35d44777-e78c-4101-bd31-46e460b2f334\") " pod="openshift-marketplace/community-operators-h5qh4" Dec 03 08:06:06 crc kubenswrapper[4475]: I1203 08:06:06.624113 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/35d44777-e78c-4101-bd31-46e460b2f334-catalog-content\") pod \"community-operators-h5qh4\" (UID: \"35d44777-e78c-4101-bd31-46e460b2f334\") " pod="openshift-marketplace/community-operators-h5qh4" Dec 03 08:06:06 crc kubenswrapper[4475]: I1203 08:06:06.725393 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/35d44777-e78c-4101-bd31-46e460b2f334-catalog-content\") pod \"community-operators-h5qh4\" (UID: 
\"35d44777-e78c-4101-bd31-46e460b2f334\") " pod="openshift-marketplace/community-operators-h5qh4" Dec 03 08:06:06 crc kubenswrapper[4475]: I1203 08:06:06.725577 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/35d44777-e78c-4101-bd31-46e460b2f334-utilities\") pod \"community-operators-h5qh4\" (UID: \"35d44777-e78c-4101-bd31-46e460b2f334\") " pod="openshift-marketplace/community-operators-h5qh4" Dec 03 08:06:06 crc kubenswrapper[4475]: I1203 08:06:06.725613 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-k58qn\" (UniqueName: \"kubernetes.io/projected/35d44777-e78c-4101-bd31-46e460b2f334-kube-api-access-k58qn\") pod \"community-operators-h5qh4\" (UID: \"35d44777-e78c-4101-bd31-46e460b2f334\") " pod="openshift-marketplace/community-operators-h5qh4" Dec 03 08:06:06 crc kubenswrapper[4475]: I1203 08:06:06.725818 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/35d44777-e78c-4101-bd31-46e460b2f334-catalog-content\") pod \"community-operators-h5qh4\" (UID: \"35d44777-e78c-4101-bd31-46e460b2f334\") " pod="openshift-marketplace/community-operators-h5qh4" Dec 03 08:06:06 crc kubenswrapper[4475]: I1203 08:06:06.725878 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/35d44777-e78c-4101-bd31-46e460b2f334-utilities\") pod \"community-operators-h5qh4\" (UID: \"35d44777-e78c-4101-bd31-46e460b2f334\") " pod="openshift-marketplace/community-operators-h5qh4" Dec 03 08:06:06 crc kubenswrapper[4475]: I1203 08:06:06.741968 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-k58qn\" (UniqueName: \"kubernetes.io/projected/35d44777-e78c-4101-bd31-46e460b2f334-kube-api-access-k58qn\") pod \"community-operators-h5qh4\" (UID: 
\"35d44777-e78c-4101-bd31-46e460b2f334\") " pod="openshift-marketplace/community-operators-h5qh4" Dec 03 08:06:06 crc kubenswrapper[4475]: I1203 08:06:06.898835 4475 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-h5qh4" Dec 03 08:06:07 crc kubenswrapper[4475]: I1203 08:06:07.311232 4475 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-h5qh4"] Dec 03 08:06:07 crc kubenswrapper[4475]: W1203 08:06:07.317381 4475 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod35d44777_e78c_4101_bd31_46e460b2f334.slice/crio-a3e89ac407c89c6670cbf6e1854973d330928823cd6abad8f0e5c5dad13947ba WatchSource:0}: Error finding container a3e89ac407c89c6670cbf6e1854973d330928823cd6abad8f0e5c5dad13947ba: Status 404 returned error can't find the container with id a3e89ac407c89c6670cbf6e1854973d330928823cd6abad8f0e5c5dad13947ba Dec 03 08:06:08 crc kubenswrapper[4475]: I1203 08:06:08.110986 4475 generic.go:334] "Generic (PLEG): container finished" podID="35d44777-e78c-4101-bd31-46e460b2f334" containerID="5aa5005bc80d977c3e726510dd6a1f652cd049d7795eb26eaa4af5183dd4939c" exitCode=0 Dec 03 08:06:08 crc kubenswrapper[4475]: I1203 08:06:08.111021 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-h5qh4" event={"ID":"35d44777-e78c-4101-bd31-46e460b2f334","Type":"ContainerDied","Data":"5aa5005bc80d977c3e726510dd6a1f652cd049d7795eb26eaa4af5183dd4939c"} Dec 03 08:06:08 crc kubenswrapper[4475]: I1203 08:06:08.111057 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-h5qh4" event={"ID":"35d44777-e78c-4101-bd31-46e460b2f334","Type":"ContainerStarted","Data":"a3e89ac407c89c6670cbf6e1854973d330928823cd6abad8f0e5c5dad13947ba"} Dec 03 08:06:09 crc kubenswrapper[4475]: I1203 08:06:09.118514 4475 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="openshift-marketplace/community-operators-h5qh4" event={"ID":"35d44777-e78c-4101-bd31-46e460b2f334","Type":"ContainerStarted","Data":"4cd2d1854e222862b09377ec19d1f06310bc37e4bff39d9f1409cec9bf50e01a"} Dec 03 08:06:10 crc kubenswrapper[4475]: I1203 08:06:10.125718 4475 generic.go:334] "Generic (PLEG): container finished" podID="35d44777-e78c-4101-bd31-46e460b2f334" containerID="4cd2d1854e222862b09377ec19d1f06310bc37e4bff39d9f1409cec9bf50e01a" exitCode=0 Dec 03 08:06:10 crc kubenswrapper[4475]: I1203 08:06:10.125785 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-h5qh4" event={"ID":"35d44777-e78c-4101-bd31-46e460b2f334","Type":"ContainerDied","Data":"4cd2d1854e222862b09377ec19d1f06310bc37e4bff39d9f1409cec9bf50e01a"} Dec 03 08:06:11 crc kubenswrapper[4475]: I1203 08:06:11.134350 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-h5qh4" event={"ID":"35d44777-e78c-4101-bd31-46e460b2f334","Type":"ContainerStarted","Data":"915dc39a394ed2e26f74f30659bea604c7713f98558f33b298ffb64a4c499c4e"} Dec 03 08:06:11 crc kubenswrapper[4475]: I1203 08:06:11.153220 4475 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-h5qh4" podStartSLOduration=2.569389981 podStartE2EDuration="5.153205359s" podCreationTimestamp="2025-12-03 08:06:06 +0000 UTC" firstStartedPulling="2025-12-03 08:06:08.112884939 +0000 UTC m=+4852.917783274" lastFinishedPulling="2025-12-03 08:06:10.696700317 +0000 UTC m=+4855.501598652" observedRunningTime="2025-12-03 08:06:11.151148381 +0000 UTC m=+4855.956046715" watchObservedRunningTime="2025-12-03 08:06:11.153205359 +0000 UTC m=+4855.958103692" Dec 03 08:06:16 crc kubenswrapper[4475]: I1203 08:06:16.899488 4475 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-h5qh4" Dec 03 08:06:16 crc kubenswrapper[4475]: 
I1203 08:06:16.900625 4475 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-h5qh4" Dec 03 08:06:16 crc kubenswrapper[4475]: I1203 08:06:16.933353 4475 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-h5qh4" Dec 03 08:06:17 crc kubenswrapper[4475]: I1203 08:06:17.206583 4475 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-h5qh4" Dec 03 08:06:17 crc kubenswrapper[4475]: I1203 08:06:17.253115 4475 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-h5qh4"] Dec 03 08:06:19 crc kubenswrapper[4475]: I1203 08:06:19.188288 4475 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-h5qh4" podUID="35d44777-e78c-4101-bd31-46e460b2f334" containerName="registry-server" containerID="cri-o://915dc39a394ed2e26f74f30659bea604c7713f98558f33b298ffb64a4c499c4e" gracePeriod=2 Dec 03 08:06:19 crc kubenswrapper[4475]: I1203 08:06:19.491154 4475 scope.go:117] "RemoveContainer" containerID="5673dd08c62af6854c20f3bcbbfe224728fe60de8b224e4d69ca056cc25a013e" Dec 03 08:06:19 crc kubenswrapper[4475]: E1203 08:06:19.491616 4475 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-tjbzg_openshift-machine-config-operator(91aee7be-4a52-4598-803f-2deebe0674de)\"" pod="openshift-machine-config-operator/machine-config-daemon-tjbzg" podUID="91aee7be-4a52-4598-803f-2deebe0674de" Dec 03 08:06:19 crc kubenswrapper[4475]: I1203 08:06:19.563547 4475 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-h5qh4" Dec 03 08:06:19 crc kubenswrapper[4475]: I1203 08:06:19.734574 4475 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/35d44777-e78c-4101-bd31-46e460b2f334-catalog-content\") pod \"35d44777-e78c-4101-bd31-46e460b2f334\" (UID: \"35d44777-e78c-4101-bd31-46e460b2f334\") " Dec 03 08:06:19 crc kubenswrapper[4475]: I1203 08:06:19.734644 4475 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-k58qn\" (UniqueName: \"kubernetes.io/projected/35d44777-e78c-4101-bd31-46e460b2f334-kube-api-access-k58qn\") pod \"35d44777-e78c-4101-bd31-46e460b2f334\" (UID: \"35d44777-e78c-4101-bd31-46e460b2f334\") " Dec 03 08:06:19 crc kubenswrapper[4475]: I1203 08:06:19.734939 4475 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/35d44777-e78c-4101-bd31-46e460b2f334-utilities\") pod \"35d44777-e78c-4101-bd31-46e460b2f334\" (UID: \"35d44777-e78c-4101-bd31-46e460b2f334\") " Dec 03 08:06:19 crc kubenswrapper[4475]: I1203 08:06:19.735922 4475 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/35d44777-e78c-4101-bd31-46e460b2f334-utilities" (OuterVolumeSpecName: "utilities") pod "35d44777-e78c-4101-bd31-46e460b2f334" (UID: "35d44777-e78c-4101-bd31-46e460b2f334"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 08:06:19 crc kubenswrapper[4475]: I1203 08:06:19.740610 4475 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/35d44777-e78c-4101-bd31-46e460b2f334-kube-api-access-k58qn" (OuterVolumeSpecName: "kube-api-access-k58qn") pod "35d44777-e78c-4101-bd31-46e460b2f334" (UID: "35d44777-e78c-4101-bd31-46e460b2f334"). InnerVolumeSpecName "kube-api-access-k58qn". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 08:06:19 crc kubenswrapper[4475]: I1203 08:06:19.771181 4475 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/35d44777-e78c-4101-bd31-46e460b2f334-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "35d44777-e78c-4101-bd31-46e460b2f334" (UID: "35d44777-e78c-4101-bd31-46e460b2f334"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 08:06:19 crc kubenswrapper[4475]: I1203 08:06:19.836799 4475 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/35d44777-e78c-4101-bd31-46e460b2f334-utilities\") on node \"crc\" DevicePath \"\"" Dec 03 08:06:19 crc kubenswrapper[4475]: I1203 08:06:19.836834 4475 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/35d44777-e78c-4101-bd31-46e460b2f334-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 03 08:06:19 crc kubenswrapper[4475]: I1203 08:06:19.836855 4475 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-k58qn\" (UniqueName: \"kubernetes.io/projected/35d44777-e78c-4101-bd31-46e460b2f334-kube-api-access-k58qn\") on node \"crc\" DevicePath \"\"" Dec 03 08:06:20 crc kubenswrapper[4475]: I1203 08:06:20.196210 4475 generic.go:334] "Generic (PLEG): container finished" podID="35d44777-e78c-4101-bd31-46e460b2f334" containerID="915dc39a394ed2e26f74f30659bea604c7713f98558f33b298ffb64a4c499c4e" exitCode=0 Dec 03 08:06:20 crc kubenswrapper[4475]: I1203 08:06:20.196279 4475 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-h5qh4" Dec 03 08:06:20 crc kubenswrapper[4475]: I1203 08:06:20.196276 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-h5qh4" event={"ID":"35d44777-e78c-4101-bd31-46e460b2f334","Type":"ContainerDied","Data":"915dc39a394ed2e26f74f30659bea604c7713f98558f33b298ffb64a4c499c4e"} Dec 03 08:06:20 crc kubenswrapper[4475]: I1203 08:06:20.197187 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-h5qh4" event={"ID":"35d44777-e78c-4101-bd31-46e460b2f334","Type":"ContainerDied","Data":"a3e89ac407c89c6670cbf6e1854973d330928823cd6abad8f0e5c5dad13947ba"} Dec 03 08:06:20 crc kubenswrapper[4475]: I1203 08:06:20.197208 4475 scope.go:117] "RemoveContainer" containerID="915dc39a394ed2e26f74f30659bea604c7713f98558f33b298ffb64a4c499c4e" Dec 03 08:06:20 crc kubenswrapper[4475]: I1203 08:06:20.212125 4475 scope.go:117] "RemoveContainer" containerID="4cd2d1854e222862b09377ec19d1f06310bc37e4bff39d9f1409cec9bf50e01a" Dec 03 08:06:20 crc kubenswrapper[4475]: I1203 08:06:20.233700 4475 scope.go:117] "RemoveContainer" containerID="5aa5005bc80d977c3e726510dd6a1f652cd049d7795eb26eaa4af5183dd4939c" Dec 03 08:06:20 crc kubenswrapper[4475]: I1203 08:06:20.235996 4475 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-h5qh4"] Dec 03 08:06:20 crc kubenswrapper[4475]: I1203 08:06:20.246056 4475 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-h5qh4"] Dec 03 08:06:20 crc kubenswrapper[4475]: I1203 08:06:20.261979 4475 scope.go:117] "RemoveContainer" containerID="915dc39a394ed2e26f74f30659bea604c7713f98558f33b298ffb64a4c499c4e" Dec 03 08:06:20 crc kubenswrapper[4475]: E1203 08:06:20.262306 4475 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"915dc39a394ed2e26f74f30659bea604c7713f98558f33b298ffb64a4c499c4e\": container with ID starting with 915dc39a394ed2e26f74f30659bea604c7713f98558f33b298ffb64a4c499c4e not found: ID does not exist" containerID="915dc39a394ed2e26f74f30659bea604c7713f98558f33b298ffb64a4c499c4e" Dec 03 08:06:20 crc kubenswrapper[4475]: I1203 08:06:20.262334 4475 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"915dc39a394ed2e26f74f30659bea604c7713f98558f33b298ffb64a4c499c4e"} err="failed to get container status \"915dc39a394ed2e26f74f30659bea604c7713f98558f33b298ffb64a4c499c4e\": rpc error: code = NotFound desc = could not find container \"915dc39a394ed2e26f74f30659bea604c7713f98558f33b298ffb64a4c499c4e\": container with ID starting with 915dc39a394ed2e26f74f30659bea604c7713f98558f33b298ffb64a4c499c4e not found: ID does not exist" Dec 03 08:06:20 crc kubenswrapper[4475]: I1203 08:06:20.262353 4475 scope.go:117] "RemoveContainer" containerID="4cd2d1854e222862b09377ec19d1f06310bc37e4bff39d9f1409cec9bf50e01a" Dec 03 08:06:20 crc kubenswrapper[4475]: E1203 08:06:20.262720 4475 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4cd2d1854e222862b09377ec19d1f06310bc37e4bff39d9f1409cec9bf50e01a\": container with ID starting with 4cd2d1854e222862b09377ec19d1f06310bc37e4bff39d9f1409cec9bf50e01a not found: ID does not exist" containerID="4cd2d1854e222862b09377ec19d1f06310bc37e4bff39d9f1409cec9bf50e01a" Dec 03 08:06:20 crc kubenswrapper[4475]: I1203 08:06:20.262759 4475 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4cd2d1854e222862b09377ec19d1f06310bc37e4bff39d9f1409cec9bf50e01a"} err="failed to get container status \"4cd2d1854e222862b09377ec19d1f06310bc37e4bff39d9f1409cec9bf50e01a\": rpc error: code = NotFound desc = could not find container \"4cd2d1854e222862b09377ec19d1f06310bc37e4bff39d9f1409cec9bf50e01a\": container with ID 
starting with 4cd2d1854e222862b09377ec19d1f06310bc37e4bff39d9f1409cec9bf50e01a not found: ID does not exist" Dec 03 08:06:20 crc kubenswrapper[4475]: I1203 08:06:20.262773 4475 scope.go:117] "RemoveContainer" containerID="5aa5005bc80d977c3e726510dd6a1f652cd049d7795eb26eaa4af5183dd4939c" Dec 03 08:06:20 crc kubenswrapper[4475]: E1203 08:06:20.263048 4475 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5aa5005bc80d977c3e726510dd6a1f652cd049d7795eb26eaa4af5183dd4939c\": container with ID starting with 5aa5005bc80d977c3e726510dd6a1f652cd049d7795eb26eaa4af5183dd4939c not found: ID does not exist" containerID="5aa5005bc80d977c3e726510dd6a1f652cd049d7795eb26eaa4af5183dd4939c" Dec 03 08:06:20 crc kubenswrapper[4475]: I1203 08:06:20.263072 4475 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5aa5005bc80d977c3e726510dd6a1f652cd049d7795eb26eaa4af5183dd4939c"} err="failed to get container status \"5aa5005bc80d977c3e726510dd6a1f652cd049d7795eb26eaa4af5183dd4939c\": rpc error: code = NotFound desc = could not find container \"5aa5005bc80d977c3e726510dd6a1f652cd049d7795eb26eaa4af5183dd4939c\": container with ID starting with 5aa5005bc80d977c3e726510dd6a1f652cd049d7795eb26eaa4af5183dd4939c not found: ID does not exist" Dec 03 08:06:21 crc kubenswrapper[4475]: I1203 08:06:21.498851 4475 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="35d44777-e78c-4101-bd31-46e460b2f334" path="/var/lib/kubelet/pods/35d44777-e78c-4101-bd31-46e460b2f334/volumes" Dec 03 08:06:33 crc kubenswrapper[4475]: I1203 08:06:33.491751 4475 scope.go:117] "RemoveContainer" containerID="5673dd08c62af6854c20f3bcbbfe224728fe60de8b224e4d69ca056cc25a013e" Dec 03 08:06:34 crc kubenswrapper[4475]: I1203 08:06:34.295846 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-tjbzg" 
event={"ID":"91aee7be-4a52-4598-803f-2deebe0674de","Type":"ContainerStarted","Data":"231211e29e49928b9fe859fe2cab1239bf3f55652cc1b763bc66b1b5957606a5"} Dec 03 08:08:58 crc kubenswrapper[4475]: I1203 08:08:58.933385 4475 patch_prober.go:28] interesting pod/machine-config-daemon-tjbzg container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 03 08:08:58 crc kubenswrapper[4475]: I1203 08:08:58.933972 4475 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-tjbzg" podUID="91aee7be-4a52-4598-803f-2deebe0674de" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 03 08:09:28 crc kubenswrapper[4475]: I1203 08:09:28.933174 4475 patch_prober.go:28] interesting pod/machine-config-daemon-tjbzg container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 03 08:09:28 crc kubenswrapper[4475]: I1203 08:09:28.933672 4475 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-tjbzg" podUID="91aee7be-4a52-4598-803f-2deebe0674de" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 03 08:09:49 crc kubenswrapper[4475]: I1203 08:09:49.611740 4475 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/swift-proxy-88869455c-74p7r" podUID="16bbbd37-150f-4b54-8fc1-eb7708ecca88" containerName="proxy-server" probeResult="failure" output="HTTP probe failed with statuscode: 502" Dec 03 08:09:58 crc kubenswrapper[4475]: 
I1203 08:09:58.933907 4475 patch_prober.go:28] interesting pod/machine-config-daemon-tjbzg container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 03 08:09:58 crc kubenswrapper[4475]: I1203 08:09:58.935053 4475 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-tjbzg" podUID="91aee7be-4a52-4598-803f-2deebe0674de" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 03 08:09:58 crc kubenswrapper[4475]: I1203 08:09:58.935136 4475 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-tjbzg" Dec 03 08:09:58 crc kubenswrapper[4475]: I1203 08:09:58.935959 4475 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"231211e29e49928b9fe859fe2cab1239bf3f55652cc1b763bc66b1b5957606a5"} pod="openshift-machine-config-operator/machine-config-daemon-tjbzg" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Dec 03 08:09:58 crc kubenswrapper[4475]: I1203 08:09:58.936047 4475 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-tjbzg" podUID="91aee7be-4a52-4598-803f-2deebe0674de" containerName="machine-config-daemon" containerID="cri-o://231211e29e49928b9fe859fe2cab1239bf3f55652cc1b763bc66b1b5957606a5" gracePeriod=600 Dec 03 08:09:59 crc kubenswrapper[4475]: I1203 08:09:59.705328 4475 generic.go:334] "Generic (PLEG): container finished" podID="91aee7be-4a52-4598-803f-2deebe0674de" containerID="231211e29e49928b9fe859fe2cab1239bf3f55652cc1b763bc66b1b5957606a5" exitCode=0 Dec 
03 08:09:59 crc kubenswrapper[4475]: I1203 08:09:59.705412 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-tjbzg" event={"ID":"91aee7be-4a52-4598-803f-2deebe0674de","Type":"ContainerDied","Data":"231211e29e49928b9fe859fe2cab1239bf3f55652cc1b763bc66b1b5957606a5"} Dec 03 08:09:59 crc kubenswrapper[4475]: I1203 08:09:59.705945 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-tjbzg" event={"ID":"91aee7be-4a52-4598-803f-2deebe0674de","Type":"ContainerStarted","Data":"292f72da4b193a86cc630250466734e7dc7d208705ab2998ede991d595170052"} Dec 03 08:09:59 crc kubenswrapper[4475]: I1203 08:09:59.705970 4475 scope.go:117] "RemoveContainer" containerID="5673dd08c62af6854c20f3bcbbfe224728fe60de8b224e4d69ca056cc25a013e" Dec 03 08:12:14 crc kubenswrapper[4475]: I1203 08:12:14.271515 4475 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-tjbtl"] Dec 03 08:12:14 crc kubenswrapper[4475]: E1203 08:12:14.276260 4475 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="35d44777-e78c-4101-bd31-46e460b2f334" containerName="registry-server" Dec 03 08:12:14 crc kubenswrapper[4475]: I1203 08:12:14.276291 4475 state_mem.go:107] "Deleted CPUSet assignment" podUID="35d44777-e78c-4101-bd31-46e460b2f334" containerName="registry-server" Dec 03 08:12:14 crc kubenswrapper[4475]: E1203 08:12:14.276314 4475 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="35d44777-e78c-4101-bd31-46e460b2f334" containerName="extract-utilities" Dec 03 08:12:14 crc kubenswrapper[4475]: I1203 08:12:14.276325 4475 state_mem.go:107] "Deleted CPUSet assignment" podUID="35d44777-e78c-4101-bd31-46e460b2f334" containerName="extract-utilities" Dec 03 08:12:14 crc kubenswrapper[4475]: E1203 08:12:14.276334 4475 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="35d44777-e78c-4101-bd31-46e460b2f334" 
containerName="extract-content" Dec 03 08:12:14 crc kubenswrapper[4475]: I1203 08:12:14.276342 4475 state_mem.go:107] "Deleted CPUSet assignment" podUID="35d44777-e78c-4101-bd31-46e460b2f334" containerName="extract-content" Dec 03 08:12:14 crc kubenswrapper[4475]: I1203 08:12:14.276642 4475 memory_manager.go:354] "RemoveStaleState removing state" podUID="35d44777-e78c-4101-bd31-46e460b2f334" containerName="registry-server" Dec 03 08:12:14 crc kubenswrapper[4475]: I1203 08:12:14.278804 4475 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-tjbtl" Dec 03 08:12:14 crc kubenswrapper[4475]: I1203 08:12:14.288364 4475 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-tjbtl"] Dec 03 08:12:14 crc kubenswrapper[4475]: I1203 08:12:14.348757 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/48dd1bfb-8bef-414a-ab0e-233f3a1ca3ed-utilities\") pod \"certified-operators-tjbtl\" (UID: \"48dd1bfb-8bef-414a-ab0e-233f3a1ca3ed\") " pod="openshift-marketplace/certified-operators-tjbtl" Dec 03 08:12:14 crc kubenswrapper[4475]: I1203 08:12:14.349051 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r5btb\" (UniqueName: \"kubernetes.io/projected/48dd1bfb-8bef-414a-ab0e-233f3a1ca3ed-kube-api-access-r5btb\") pod \"certified-operators-tjbtl\" (UID: \"48dd1bfb-8bef-414a-ab0e-233f3a1ca3ed\") " pod="openshift-marketplace/certified-operators-tjbtl" Dec 03 08:12:14 crc kubenswrapper[4475]: I1203 08:12:14.349098 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/48dd1bfb-8bef-414a-ab0e-233f3a1ca3ed-catalog-content\") pod \"certified-operators-tjbtl\" (UID: \"48dd1bfb-8bef-414a-ab0e-233f3a1ca3ed\") " 
pod="openshift-marketplace/certified-operators-tjbtl" Dec 03 08:12:14 crc kubenswrapper[4475]: I1203 08:12:14.452364 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/48dd1bfb-8bef-414a-ab0e-233f3a1ca3ed-utilities\") pod \"certified-operators-tjbtl\" (UID: \"48dd1bfb-8bef-414a-ab0e-233f3a1ca3ed\") " pod="openshift-marketplace/certified-operators-tjbtl" Dec 03 08:12:14 crc kubenswrapper[4475]: I1203 08:12:14.452717 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-r5btb\" (UniqueName: \"kubernetes.io/projected/48dd1bfb-8bef-414a-ab0e-233f3a1ca3ed-kube-api-access-r5btb\") pod \"certified-operators-tjbtl\" (UID: \"48dd1bfb-8bef-414a-ab0e-233f3a1ca3ed\") " pod="openshift-marketplace/certified-operators-tjbtl" Dec 03 08:12:14 crc kubenswrapper[4475]: I1203 08:12:14.452772 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/48dd1bfb-8bef-414a-ab0e-233f3a1ca3ed-catalog-content\") pod \"certified-operators-tjbtl\" (UID: \"48dd1bfb-8bef-414a-ab0e-233f3a1ca3ed\") " pod="openshift-marketplace/certified-operators-tjbtl" Dec 03 08:12:14 crc kubenswrapper[4475]: I1203 08:12:14.453526 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/48dd1bfb-8bef-414a-ab0e-233f3a1ca3ed-utilities\") pod \"certified-operators-tjbtl\" (UID: \"48dd1bfb-8bef-414a-ab0e-233f3a1ca3ed\") " pod="openshift-marketplace/certified-operators-tjbtl" Dec 03 08:12:14 crc kubenswrapper[4475]: I1203 08:12:14.453587 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/48dd1bfb-8bef-414a-ab0e-233f3a1ca3ed-catalog-content\") pod \"certified-operators-tjbtl\" (UID: \"48dd1bfb-8bef-414a-ab0e-233f3a1ca3ed\") " 
pod="openshift-marketplace/certified-operators-tjbtl" Dec 03 08:12:14 crc kubenswrapper[4475]: I1203 08:12:14.483133 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-r5btb\" (UniqueName: \"kubernetes.io/projected/48dd1bfb-8bef-414a-ab0e-233f3a1ca3ed-kube-api-access-r5btb\") pod \"certified-operators-tjbtl\" (UID: \"48dd1bfb-8bef-414a-ab0e-233f3a1ca3ed\") " pod="openshift-marketplace/certified-operators-tjbtl" Dec 03 08:12:14 crc kubenswrapper[4475]: I1203 08:12:14.601036 4475 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-tjbtl" Dec 03 08:12:15 crc kubenswrapper[4475]: I1203 08:12:15.368180 4475 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-tjbtl"] Dec 03 08:12:15 crc kubenswrapper[4475]: I1203 08:12:15.939440 4475 generic.go:334] "Generic (PLEG): container finished" podID="48dd1bfb-8bef-414a-ab0e-233f3a1ca3ed" containerID="c9cc6bf705697ddc344b10e3493290799bc5c6059633e12be7b9b821f8d92b0f" exitCode=0 Dec 03 08:12:15 crc kubenswrapper[4475]: I1203 08:12:15.939547 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-tjbtl" event={"ID":"48dd1bfb-8bef-414a-ab0e-233f3a1ca3ed","Type":"ContainerDied","Data":"c9cc6bf705697ddc344b10e3493290799bc5c6059633e12be7b9b821f8d92b0f"} Dec 03 08:12:15 crc kubenswrapper[4475]: I1203 08:12:15.939812 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-tjbtl" event={"ID":"48dd1bfb-8bef-414a-ab0e-233f3a1ca3ed","Type":"ContainerStarted","Data":"b122efa4ba163c519b6368b95df74aff5b6779204ff657ea043f4d69c9d95a78"} Dec 03 08:12:15 crc kubenswrapper[4475]: I1203 08:12:15.942090 4475 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Dec 03 08:12:16 crc kubenswrapper[4475]: I1203 08:12:16.063728 4475 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openshift-marketplace/redhat-marketplace-j249d"] Dec 03 08:12:16 crc kubenswrapper[4475]: I1203 08:12:16.065852 4475 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-j249d" Dec 03 08:12:16 crc kubenswrapper[4475]: I1203 08:12:16.086824 4475 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-j249d"] Dec 03 08:12:16 crc kubenswrapper[4475]: I1203 08:12:16.107742 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e92612e9-991d-44e2-9670-d8a7d97064e5-utilities\") pod \"redhat-marketplace-j249d\" (UID: \"e92612e9-991d-44e2-9670-d8a7d97064e5\") " pod="openshift-marketplace/redhat-marketplace-j249d" Dec 03 08:12:16 crc kubenswrapper[4475]: I1203 08:12:16.108221 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e92612e9-991d-44e2-9670-d8a7d97064e5-catalog-content\") pod \"redhat-marketplace-j249d\" (UID: \"e92612e9-991d-44e2-9670-d8a7d97064e5\") " pod="openshift-marketplace/redhat-marketplace-j249d" Dec 03 08:12:16 crc kubenswrapper[4475]: I1203 08:12:16.108255 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ff9dz\" (UniqueName: \"kubernetes.io/projected/e92612e9-991d-44e2-9670-d8a7d97064e5-kube-api-access-ff9dz\") pod \"redhat-marketplace-j249d\" (UID: \"e92612e9-991d-44e2-9670-d8a7d97064e5\") " pod="openshift-marketplace/redhat-marketplace-j249d" Dec 03 08:12:16 crc kubenswrapper[4475]: I1203 08:12:16.209918 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e92612e9-991d-44e2-9670-d8a7d97064e5-utilities\") pod \"redhat-marketplace-j249d\" (UID: \"e92612e9-991d-44e2-9670-d8a7d97064e5\") " 
pod="openshift-marketplace/redhat-marketplace-j249d" Dec 03 08:12:16 crc kubenswrapper[4475]: I1203 08:12:16.210296 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e92612e9-991d-44e2-9670-d8a7d97064e5-catalog-content\") pod \"redhat-marketplace-j249d\" (UID: \"e92612e9-991d-44e2-9670-d8a7d97064e5\") " pod="openshift-marketplace/redhat-marketplace-j249d" Dec 03 08:12:16 crc kubenswrapper[4475]: I1203 08:12:16.210389 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ff9dz\" (UniqueName: \"kubernetes.io/projected/e92612e9-991d-44e2-9670-d8a7d97064e5-kube-api-access-ff9dz\") pod \"redhat-marketplace-j249d\" (UID: \"e92612e9-991d-44e2-9670-d8a7d97064e5\") " pod="openshift-marketplace/redhat-marketplace-j249d" Dec 03 08:12:16 crc kubenswrapper[4475]: I1203 08:12:16.210778 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e92612e9-991d-44e2-9670-d8a7d97064e5-utilities\") pod \"redhat-marketplace-j249d\" (UID: \"e92612e9-991d-44e2-9670-d8a7d97064e5\") " pod="openshift-marketplace/redhat-marketplace-j249d" Dec 03 08:12:16 crc kubenswrapper[4475]: I1203 08:12:16.210850 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e92612e9-991d-44e2-9670-d8a7d97064e5-catalog-content\") pod \"redhat-marketplace-j249d\" (UID: \"e92612e9-991d-44e2-9670-d8a7d97064e5\") " pod="openshift-marketplace/redhat-marketplace-j249d" Dec 03 08:12:16 crc kubenswrapper[4475]: I1203 08:12:16.243586 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ff9dz\" (UniqueName: \"kubernetes.io/projected/e92612e9-991d-44e2-9670-d8a7d97064e5-kube-api-access-ff9dz\") pod \"redhat-marketplace-j249d\" (UID: \"e92612e9-991d-44e2-9670-d8a7d97064e5\") " 
pod="openshift-marketplace/redhat-marketplace-j249d" Dec 03 08:12:16 crc kubenswrapper[4475]: I1203 08:12:16.396172 4475 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-j249d" Dec 03 08:12:16 crc kubenswrapper[4475]: I1203 08:12:16.871705 4475 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-j249d"] Dec 03 08:12:16 crc kubenswrapper[4475]: I1203 08:12:16.949893 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-j249d" event={"ID":"e92612e9-991d-44e2-9670-d8a7d97064e5","Type":"ContainerStarted","Data":"de96d0e82f3c2a078fd6895b85cddd08ab5aa56e4302d53ab25d86a7adb19b7f"} Dec 03 08:12:16 crc kubenswrapper[4475]: I1203 08:12:16.951719 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-tjbtl" event={"ID":"48dd1bfb-8bef-414a-ab0e-233f3a1ca3ed","Type":"ContainerStarted","Data":"94deedb057a805fae383e567489874e0bdab6e367353cd91ce6ebba6e8bff775"} Dec 03 08:12:17 crc kubenswrapper[4475]: I1203 08:12:17.962025 4475 generic.go:334] "Generic (PLEG): container finished" podID="48dd1bfb-8bef-414a-ab0e-233f3a1ca3ed" containerID="94deedb057a805fae383e567489874e0bdab6e367353cd91ce6ebba6e8bff775" exitCode=0 Dec 03 08:12:17 crc kubenswrapper[4475]: I1203 08:12:17.962132 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-tjbtl" event={"ID":"48dd1bfb-8bef-414a-ab0e-233f3a1ca3ed","Type":"ContainerDied","Data":"94deedb057a805fae383e567489874e0bdab6e367353cd91ce6ebba6e8bff775"} Dec 03 08:12:17 crc kubenswrapper[4475]: I1203 08:12:17.964179 4475 generic.go:334] "Generic (PLEG): container finished" podID="e92612e9-991d-44e2-9670-d8a7d97064e5" containerID="21f7bd435e57e970067831fd580a9d2a331378eea7084d5b5905818686bbef1d" exitCode=0 Dec 03 08:12:17 crc kubenswrapper[4475]: I1203 08:12:17.964222 4475 kubelet.go:2453] "SyncLoop (PLEG): 
event for pod" pod="openshift-marketplace/redhat-marketplace-j249d" event={"ID":"e92612e9-991d-44e2-9670-d8a7d97064e5","Type":"ContainerDied","Data":"21f7bd435e57e970067831fd580a9d2a331378eea7084d5b5905818686bbef1d"} Dec 03 08:12:18 crc kubenswrapper[4475]: I1203 08:12:18.972975 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-j249d" event={"ID":"e92612e9-991d-44e2-9670-d8a7d97064e5","Type":"ContainerStarted","Data":"01280767b0743f7181c32c8d7a3e2400b41fe7177b8eddededf3bd04f44f2c88"} Dec 03 08:12:18 crc kubenswrapper[4475]: I1203 08:12:18.976549 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-tjbtl" event={"ID":"48dd1bfb-8bef-414a-ab0e-233f3a1ca3ed","Type":"ContainerStarted","Data":"d31d2e026989b44517cffdc3aa3cc15b70323566fee5197fcc4a124a940e2035"} Dec 03 08:12:19 crc kubenswrapper[4475]: I1203 08:12:19.016789 4475 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-tjbtl" podStartSLOduration=2.403728994 podStartE2EDuration="5.016769416s" podCreationTimestamp="2025-12-03 08:12:14 +0000 UTC" firstStartedPulling="2025-12-03 08:12:15.941844873 +0000 UTC m=+5220.746743208" lastFinishedPulling="2025-12-03 08:12:18.554885296 +0000 UTC m=+5223.359783630" observedRunningTime="2025-12-03 08:12:19.010169849 +0000 UTC m=+5223.815068183" watchObservedRunningTime="2025-12-03 08:12:19.016769416 +0000 UTC m=+5223.821667751" Dec 03 08:12:19 crc kubenswrapper[4475]: I1203 08:12:19.986729 4475 generic.go:334] "Generic (PLEG): container finished" podID="e92612e9-991d-44e2-9670-d8a7d97064e5" containerID="01280767b0743f7181c32c8d7a3e2400b41fe7177b8eddededf3bd04f44f2c88" exitCode=0 Dec 03 08:12:19 crc kubenswrapper[4475]: I1203 08:12:19.986799 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-j249d" 
event={"ID":"e92612e9-991d-44e2-9670-d8a7d97064e5","Type":"ContainerDied","Data":"01280767b0743f7181c32c8d7a3e2400b41fe7177b8eddededf3bd04f44f2c88"} Dec 03 08:12:20 crc kubenswrapper[4475]: I1203 08:12:20.998370 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-j249d" event={"ID":"e92612e9-991d-44e2-9670-d8a7d97064e5","Type":"ContainerStarted","Data":"4d69c40ec0919d55f484fd5e22f91794f3aec921558412626f2edef265c250eb"} Dec 03 08:12:24 crc kubenswrapper[4475]: I1203 08:12:24.601558 4475 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-tjbtl" Dec 03 08:12:24 crc kubenswrapper[4475]: I1203 08:12:24.602066 4475 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-tjbtl" Dec 03 08:12:24 crc kubenswrapper[4475]: I1203 08:12:24.641965 4475 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-tjbtl" Dec 03 08:12:24 crc kubenswrapper[4475]: I1203 08:12:24.663854 4475 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-j249d" podStartSLOduration=5.866059048 podStartE2EDuration="8.663838707s" podCreationTimestamp="2025-12-03 08:12:16 +0000 UTC" firstStartedPulling="2025-12-03 08:12:17.966263433 +0000 UTC m=+5222.771161767" lastFinishedPulling="2025-12-03 08:12:20.764043093 +0000 UTC m=+5225.568941426" observedRunningTime="2025-12-03 08:12:21.018029598 +0000 UTC m=+5225.822927931" watchObservedRunningTime="2025-12-03 08:12:24.663838707 +0000 UTC m=+5229.468737040" Dec 03 08:12:25 crc kubenswrapper[4475]: I1203 08:12:25.068578 4475 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-tjbtl" Dec 03 08:12:25 crc kubenswrapper[4475]: I1203 08:12:25.452093 4475 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openshift-marketplace/certified-operators-tjbtl"] Dec 03 08:12:26 crc kubenswrapper[4475]: I1203 08:12:26.396738 4475 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-j249d" Dec 03 08:12:26 crc kubenswrapper[4475]: I1203 08:12:26.396779 4475 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-j249d" Dec 03 08:12:26 crc kubenswrapper[4475]: I1203 08:12:26.433978 4475 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-j249d" Dec 03 08:12:27 crc kubenswrapper[4475]: I1203 08:12:27.050372 4475 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-tjbtl" podUID="48dd1bfb-8bef-414a-ab0e-233f3a1ca3ed" containerName="registry-server" containerID="cri-o://d31d2e026989b44517cffdc3aa3cc15b70323566fee5197fcc4a124a940e2035" gracePeriod=2 Dec 03 08:12:27 crc kubenswrapper[4475]: I1203 08:12:27.084003 4475 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-j249d" Dec 03 08:12:27 crc kubenswrapper[4475]: I1203 08:12:27.606018 4475 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-tjbtl" Dec 03 08:12:27 crc kubenswrapper[4475]: I1203 08:12:27.773712 4475 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-r5btb\" (UniqueName: \"kubernetes.io/projected/48dd1bfb-8bef-414a-ab0e-233f3a1ca3ed-kube-api-access-r5btb\") pod \"48dd1bfb-8bef-414a-ab0e-233f3a1ca3ed\" (UID: \"48dd1bfb-8bef-414a-ab0e-233f3a1ca3ed\") " Dec 03 08:12:27 crc kubenswrapper[4475]: I1203 08:12:27.774070 4475 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/48dd1bfb-8bef-414a-ab0e-233f3a1ca3ed-catalog-content\") pod \"48dd1bfb-8bef-414a-ab0e-233f3a1ca3ed\" (UID: \"48dd1bfb-8bef-414a-ab0e-233f3a1ca3ed\") " Dec 03 08:12:27 crc kubenswrapper[4475]: I1203 08:12:27.774107 4475 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/48dd1bfb-8bef-414a-ab0e-233f3a1ca3ed-utilities\") pod \"48dd1bfb-8bef-414a-ab0e-233f3a1ca3ed\" (UID: \"48dd1bfb-8bef-414a-ab0e-233f3a1ca3ed\") " Dec 03 08:12:27 crc kubenswrapper[4475]: I1203 08:12:27.775434 4475 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/48dd1bfb-8bef-414a-ab0e-233f3a1ca3ed-utilities" (OuterVolumeSpecName: "utilities") pod "48dd1bfb-8bef-414a-ab0e-233f3a1ca3ed" (UID: "48dd1bfb-8bef-414a-ab0e-233f3a1ca3ed"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 08:12:27 crc kubenswrapper[4475]: I1203 08:12:27.783941 4475 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/48dd1bfb-8bef-414a-ab0e-233f3a1ca3ed-kube-api-access-r5btb" (OuterVolumeSpecName: "kube-api-access-r5btb") pod "48dd1bfb-8bef-414a-ab0e-233f3a1ca3ed" (UID: "48dd1bfb-8bef-414a-ab0e-233f3a1ca3ed"). InnerVolumeSpecName "kube-api-access-r5btb". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 08:12:27 crc kubenswrapper[4475]: I1203 08:12:27.818086 4475 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/48dd1bfb-8bef-414a-ab0e-233f3a1ca3ed-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "48dd1bfb-8bef-414a-ab0e-233f3a1ca3ed" (UID: "48dd1bfb-8bef-414a-ab0e-233f3a1ca3ed"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 08:12:27 crc kubenswrapper[4475]: I1203 08:12:27.877107 4475 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-r5btb\" (UniqueName: \"kubernetes.io/projected/48dd1bfb-8bef-414a-ab0e-233f3a1ca3ed-kube-api-access-r5btb\") on node \"crc\" DevicePath \"\"" Dec 03 08:12:27 crc kubenswrapper[4475]: I1203 08:12:27.877158 4475 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/48dd1bfb-8bef-414a-ab0e-233f3a1ca3ed-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 03 08:12:27 crc kubenswrapper[4475]: I1203 08:12:27.877168 4475 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/48dd1bfb-8bef-414a-ab0e-233f3a1ca3ed-utilities\") on node \"crc\" DevicePath \"\"" Dec 03 08:12:28 crc kubenswrapper[4475]: I1203 08:12:28.051602 4475 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-j249d"] Dec 03 08:12:28 crc kubenswrapper[4475]: I1203 08:12:28.059878 4475 generic.go:334] "Generic (PLEG): container finished" podID="48dd1bfb-8bef-414a-ab0e-233f3a1ca3ed" containerID="d31d2e026989b44517cffdc3aa3cc15b70323566fee5197fcc4a124a940e2035" exitCode=0 Dec 03 08:12:28 crc kubenswrapper[4475]: I1203 08:12:28.059942 4475 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-tjbtl" Dec 03 08:12:28 crc kubenswrapper[4475]: I1203 08:12:28.059952 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-tjbtl" event={"ID":"48dd1bfb-8bef-414a-ab0e-233f3a1ca3ed","Type":"ContainerDied","Data":"d31d2e026989b44517cffdc3aa3cc15b70323566fee5197fcc4a124a940e2035"} Dec 03 08:12:28 crc kubenswrapper[4475]: I1203 08:12:28.059988 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-tjbtl" event={"ID":"48dd1bfb-8bef-414a-ab0e-233f3a1ca3ed","Type":"ContainerDied","Data":"b122efa4ba163c519b6368b95df74aff5b6779204ff657ea043f4d69c9d95a78"} Dec 03 08:12:28 crc kubenswrapper[4475]: I1203 08:12:28.060009 4475 scope.go:117] "RemoveContainer" containerID="d31d2e026989b44517cffdc3aa3cc15b70323566fee5197fcc4a124a940e2035" Dec 03 08:12:28 crc kubenswrapper[4475]: I1203 08:12:28.089205 4475 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-tjbtl"] Dec 03 08:12:28 crc kubenswrapper[4475]: I1203 08:12:28.090077 4475 scope.go:117] "RemoveContainer" containerID="94deedb057a805fae383e567489874e0bdab6e367353cd91ce6ebba6e8bff775" Dec 03 08:12:28 crc kubenswrapper[4475]: I1203 08:12:28.098321 4475 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-tjbtl"] Dec 03 08:12:28 crc kubenswrapper[4475]: I1203 08:12:28.111516 4475 scope.go:117] "RemoveContainer" containerID="c9cc6bf705697ddc344b10e3493290799bc5c6059633e12be7b9b821f8d92b0f" Dec 03 08:12:28 crc kubenswrapper[4475]: I1203 08:12:28.145104 4475 scope.go:117] "RemoveContainer" containerID="d31d2e026989b44517cffdc3aa3cc15b70323566fee5197fcc4a124a940e2035" Dec 03 08:12:28 crc kubenswrapper[4475]: E1203 08:12:28.148599 4475 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"d31d2e026989b44517cffdc3aa3cc15b70323566fee5197fcc4a124a940e2035\": container with ID starting with d31d2e026989b44517cffdc3aa3cc15b70323566fee5197fcc4a124a940e2035 not found: ID does not exist" containerID="d31d2e026989b44517cffdc3aa3cc15b70323566fee5197fcc4a124a940e2035" Dec 03 08:12:28 crc kubenswrapper[4475]: I1203 08:12:28.149141 4475 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d31d2e026989b44517cffdc3aa3cc15b70323566fee5197fcc4a124a940e2035"} err="failed to get container status \"d31d2e026989b44517cffdc3aa3cc15b70323566fee5197fcc4a124a940e2035\": rpc error: code = NotFound desc = could not find container \"d31d2e026989b44517cffdc3aa3cc15b70323566fee5197fcc4a124a940e2035\": container with ID starting with d31d2e026989b44517cffdc3aa3cc15b70323566fee5197fcc4a124a940e2035 not found: ID does not exist" Dec 03 08:12:28 crc kubenswrapper[4475]: I1203 08:12:28.149172 4475 scope.go:117] "RemoveContainer" containerID="94deedb057a805fae383e567489874e0bdab6e367353cd91ce6ebba6e8bff775" Dec 03 08:12:28 crc kubenswrapper[4475]: E1203 08:12:28.149526 4475 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"94deedb057a805fae383e567489874e0bdab6e367353cd91ce6ebba6e8bff775\": container with ID starting with 94deedb057a805fae383e567489874e0bdab6e367353cd91ce6ebba6e8bff775 not found: ID does not exist" containerID="94deedb057a805fae383e567489874e0bdab6e367353cd91ce6ebba6e8bff775" Dec 03 08:12:28 crc kubenswrapper[4475]: I1203 08:12:28.149548 4475 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"94deedb057a805fae383e567489874e0bdab6e367353cd91ce6ebba6e8bff775"} err="failed to get container status \"94deedb057a805fae383e567489874e0bdab6e367353cd91ce6ebba6e8bff775\": rpc error: code = NotFound desc = could not find container \"94deedb057a805fae383e567489874e0bdab6e367353cd91ce6ebba6e8bff775\": container with ID 
starting with 94deedb057a805fae383e567489874e0bdab6e367353cd91ce6ebba6e8bff775 not found: ID does not exist" Dec 03 08:12:28 crc kubenswrapper[4475]: I1203 08:12:28.149561 4475 scope.go:117] "RemoveContainer" containerID="c9cc6bf705697ddc344b10e3493290799bc5c6059633e12be7b9b821f8d92b0f" Dec 03 08:12:28 crc kubenswrapper[4475]: E1203 08:12:28.149860 4475 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c9cc6bf705697ddc344b10e3493290799bc5c6059633e12be7b9b821f8d92b0f\": container with ID starting with c9cc6bf705697ddc344b10e3493290799bc5c6059633e12be7b9b821f8d92b0f not found: ID does not exist" containerID="c9cc6bf705697ddc344b10e3493290799bc5c6059633e12be7b9b821f8d92b0f" Dec 03 08:12:28 crc kubenswrapper[4475]: I1203 08:12:28.149881 4475 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c9cc6bf705697ddc344b10e3493290799bc5c6059633e12be7b9b821f8d92b0f"} err="failed to get container status \"c9cc6bf705697ddc344b10e3493290799bc5c6059633e12be7b9b821f8d92b0f\": rpc error: code = NotFound desc = could not find container \"c9cc6bf705697ddc344b10e3493290799bc5c6059633e12be7b9b821f8d92b0f\": container with ID starting with c9cc6bf705697ddc344b10e3493290799bc5c6059633e12be7b9b821f8d92b0f not found: ID does not exist" Dec 03 08:12:28 crc kubenswrapper[4475]: I1203 08:12:28.933983 4475 patch_prober.go:28] interesting pod/machine-config-daemon-tjbzg container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 03 08:12:28 crc kubenswrapper[4475]: I1203 08:12:28.934814 4475 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-tjbzg" podUID="91aee7be-4a52-4598-803f-2deebe0674de" containerName="machine-config-daemon" probeResult="failure" 
output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 03 08:12:29 crc kubenswrapper[4475]: I1203 08:12:29.071819 4475 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-j249d" podUID="e92612e9-991d-44e2-9670-d8a7d97064e5" containerName="registry-server" containerID="cri-o://4d69c40ec0919d55f484fd5e22f91794f3aec921558412626f2edef265c250eb" gracePeriod=2 Dec 03 08:12:29 crc kubenswrapper[4475]: I1203 08:12:29.491705 4475 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-j249d" Dec 03 08:12:29 crc kubenswrapper[4475]: I1203 08:12:29.500814 4475 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="48dd1bfb-8bef-414a-ab0e-233f3a1ca3ed" path="/var/lib/kubelet/pods/48dd1bfb-8bef-414a-ab0e-233f3a1ca3ed/volumes" Dec 03 08:12:29 crc kubenswrapper[4475]: I1203 08:12:29.506029 4475 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e92612e9-991d-44e2-9670-d8a7d97064e5-catalog-content\") pod \"e92612e9-991d-44e2-9670-d8a7d97064e5\" (UID: \"e92612e9-991d-44e2-9670-d8a7d97064e5\") " Dec 03 08:12:29 crc kubenswrapper[4475]: I1203 08:12:29.506140 4475 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e92612e9-991d-44e2-9670-d8a7d97064e5-utilities\") pod \"e92612e9-991d-44e2-9670-d8a7d97064e5\" (UID: \"e92612e9-991d-44e2-9670-d8a7d97064e5\") " Dec 03 08:12:29 crc kubenswrapper[4475]: I1203 08:12:29.506293 4475 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ff9dz\" (UniqueName: \"kubernetes.io/projected/e92612e9-991d-44e2-9670-d8a7d97064e5-kube-api-access-ff9dz\") pod \"e92612e9-991d-44e2-9670-d8a7d97064e5\" (UID: \"e92612e9-991d-44e2-9670-d8a7d97064e5\") " Dec 03 
08:12:29 crc kubenswrapper[4475]: I1203 08:12:29.508009 4475 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e92612e9-991d-44e2-9670-d8a7d97064e5-utilities" (OuterVolumeSpecName: "utilities") pod "e92612e9-991d-44e2-9670-d8a7d97064e5" (UID: "e92612e9-991d-44e2-9670-d8a7d97064e5"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 08:12:29 crc kubenswrapper[4475]: I1203 08:12:29.517070 4475 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e92612e9-991d-44e2-9670-d8a7d97064e5-kube-api-access-ff9dz" (OuterVolumeSpecName: "kube-api-access-ff9dz") pod "e92612e9-991d-44e2-9670-d8a7d97064e5" (UID: "e92612e9-991d-44e2-9670-d8a7d97064e5"). InnerVolumeSpecName "kube-api-access-ff9dz". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 08:12:29 crc kubenswrapper[4475]: I1203 08:12:29.522048 4475 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e92612e9-991d-44e2-9670-d8a7d97064e5-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "e92612e9-991d-44e2-9670-d8a7d97064e5" (UID: "e92612e9-991d-44e2-9670-d8a7d97064e5"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 08:12:29 crc kubenswrapper[4475]: I1203 08:12:29.608109 4475 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e92612e9-991d-44e2-9670-d8a7d97064e5-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 03 08:12:29 crc kubenswrapper[4475]: I1203 08:12:29.608220 4475 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e92612e9-991d-44e2-9670-d8a7d97064e5-utilities\") on node \"crc\" DevicePath \"\"" Dec 03 08:12:29 crc kubenswrapper[4475]: I1203 08:12:29.608275 4475 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ff9dz\" (UniqueName: \"kubernetes.io/projected/e92612e9-991d-44e2-9670-d8a7d97064e5-kube-api-access-ff9dz\") on node \"crc\" DevicePath \"\"" Dec 03 08:12:30 crc kubenswrapper[4475]: I1203 08:12:30.082904 4475 generic.go:334] "Generic (PLEG): container finished" podID="e92612e9-991d-44e2-9670-d8a7d97064e5" containerID="4d69c40ec0919d55f484fd5e22f91794f3aec921558412626f2edef265c250eb" exitCode=0 Dec 03 08:12:30 crc kubenswrapper[4475]: I1203 08:12:30.082950 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-j249d" event={"ID":"e92612e9-991d-44e2-9670-d8a7d97064e5","Type":"ContainerDied","Data":"4d69c40ec0919d55f484fd5e22f91794f3aec921558412626f2edef265c250eb"} Dec 03 08:12:30 crc kubenswrapper[4475]: I1203 08:12:30.082986 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-j249d" event={"ID":"e92612e9-991d-44e2-9670-d8a7d97064e5","Type":"ContainerDied","Data":"de96d0e82f3c2a078fd6895b85cddd08ab5aa56e4302d53ab25d86a7adb19b7f"} Dec 03 08:12:30 crc kubenswrapper[4475]: I1203 08:12:30.083005 4475 scope.go:117] "RemoveContainer" containerID="4d69c40ec0919d55f484fd5e22f91794f3aec921558412626f2edef265c250eb" Dec 03 08:12:30 crc kubenswrapper[4475]: I1203 
08:12:30.084099 4475 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-j249d" Dec 03 08:12:30 crc kubenswrapper[4475]: I1203 08:12:30.101905 4475 scope.go:117] "RemoveContainer" containerID="01280767b0743f7181c32c8d7a3e2400b41fe7177b8eddededf3bd04f44f2c88" Dec 03 08:12:30 crc kubenswrapper[4475]: I1203 08:12:30.122202 4475 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-j249d"] Dec 03 08:12:30 crc kubenswrapper[4475]: I1203 08:12:30.127024 4475 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-j249d"] Dec 03 08:12:30 crc kubenswrapper[4475]: I1203 08:12:30.130398 4475 scope.go:117] "RemoveContainer" containerID="21f7bd435e57e970067831fd580a9d2a331378eea7084d5b5905818686bbef1d" Dec 03 08:12:30 crc kubenswrapper[4475]: I1203 08:12:30.160173 4475 scope.go:117] "RemoveContainer" containerID="4d69c40ec0919d55f484fd5e22f91794f3aec921558412626f2edef265c250eb" Dec 03 08:12:30 crc kubenswrapper[4475]: E1203 08:12:30.163936 4475 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4d69c40ec0919d55f484fd5e22f91794f3aec921558412626f2edef265c250eb\": container with ID starting with 4d69c40ec0919d55f484fd5e22f91794f3aec921558412626f2edef265c250eb not found: ID does not exist" containerID="4d69c40ec0919d55f484fd5e22f91794f3aec921558412626f2edef265c250eb" Dec 03 08:12:30 crc kubenswrapper[4475]: I1203 08:12:30.164009 4475 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4d69c40ec0919d55f484fd5e22f91794f3aec921558412626f2edef265c250eb"} err="failed to get container status \"4d69c40ec0919d55f484fd5e22f91794f3aec921558412626f2edef265c250eb\": rpc error: code = NotFound desc = could not find container \"4d69c40ec0919d55f484fd5e22f91794f3aec921558412626f2edef265c250eb\": container with ID starting with 
4d69c40ec0919d55f484fd5e22f91794f3aec921558412626f2edef265c250eb not found: ID does not exist" Dec 03 08:12:30 crc kubenswrapper[4475]: I1203 08:12:30.164040 4475 scope.go:117] "RemoveContainer" containerID="01280767b0743f7181c32c8d7a3e2400b41fe7177b8eddededf3bd04f44f2c88" Dec 03 08:12:30 crc kubenswrapper[4475]: E1203 08:12:30.164389 4475 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"01280767b0743f7181c32c8d7a3e2400b41fe7177b8eddededf3bd04f44f2c88\": container with ID starting with 01280767b0743f7181c32c8d7a3e2400b41fe7177b8eddededf3bd04f44f2c88 not found: ID does not exist" containerID="01280767b0743f7181c32c8d7a3e2400b41fe7177b8eddededf3bd04f44f2c88" Dec 03 08:12:30 crc kubenswrapper[4475]: I1203 08:12:30.164480 4475 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"01280767b0743f7181c32c8d7a3e2400b41fe7177b8eddededf3bd04f44f2c88"} err="failed to get container status \"01280767b0743f7181c32c8d7a3e2400b41fe7177b8eddededf3bd04f44f2c88\": rpc error: code = NotFound desc = could not find container \"01280767b0743f7181c32c8d7a3e2400b41fe7177b8eddededf3bd04f44f2c88\": container with ID starting with 01280767b0743f7181c32c8d7a3e2400b41fe7177b8eddededf3bd04f44f2c88 not found: ID does not exist" Dec 03 08:12:30 crc kubenswrapper[4475]: I1203 08:12:30.164514 4475 scope.go:117] "RemoveContainer" containerID="21f7bd435e57e970067831fd580a9d2a331378eea7084d5b5905818686bbef1d" Dec 03 08:12:30 crc kubenswrapper[4475]: E1203 08:12:30.164806 4475 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"21f7bd435e57e970067831fd580a9d2a331378eea7084d5b5905818686bbef1d\": container with ID starting with 21f7bd435e57e970067831fd580a9d2a331378eea7084d5b5905818686bbef1d not found: ID does not exist" containerID="21f7bd435e57e970067831fd580a9d2a331378eea7084d5b5905818686bbef1d" Dec 03 08:12:30 crc 
kubenswrapper[4475]: I1203 08:12:30.164846 4475 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"21f7bd435e57e970067831fd580a9d2a331378eea7084d5b5905818686bbef1d"} err="failed to get container status \"21f7bd435e57e970067831fd580a9d2a331378eea7084d5b5905818686bbef1d\": rpc error: code = NotFound desc = could not find container \"21f7bd435e57e970067831fd580a9d2a331378eea7084d5b5905818686bbef1d\": container with ID starting with 21f7bd435e57e970067831fd580a9d2a331378eea7084d5b5905818686bbef1d not found: ID does not exist" Dec 03 08:12:31 crc kubenswrapper[4475]: I1203 08:12:31.501941 4475 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e92612e9-991d-44e2-9670-d8a7d97064e5" path="/var/lib/kubelet/pods/e92612e9-991d-44e2-9670-d8a7d97064e5/volumes" Dec 03 08:12:58 crc kubenswrapper[4475]: I1203 08:12:58.934158 4475 patch_prober.go:28] interesting pod/machine-config-daemon-tjbzg container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 03 08:12:58 crc kubenswrapper[4475]: I1203 08:12:58.934990 4475 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-tjbzg" podUID="91aee7be-4a52-4598-803f-2deebe0674de" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 03 08:13:28 crc kubenswrapper[4475]: I1203 08:13:28.933292 4475 patch_prober.go:28] interesting pod/machine-config-daemon-tjbzg container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 03 08:13:28 crc kubenswrapper[4475]: I1203 08:13:28.933876 4475 
prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-tjbzg" podUID="91aee7be-4a52-4598-803f-2deebe0674de" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 03 08:13:28 crc kubenswrapper[4475]: I1203 08:13:28.933919 4475 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-tjbzg" Dec 03 08:13:28 crc kubenswrapper[4475]: I1203 08:13:28.934575 4475 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"292f72da4b193a86cc630250466734e7dc7d208705ab2998ede991d595170052"} pod="openshift-machine-config-operator/machine-config-daemon-tjbzg" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Dec 03 08:13:28 crc kubenswrapper[4475]: I1203 08:13:28.934631 4475 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-tjbzg" podUID="91aee7be-4a52-4598-803f-2deebe0674de" containerName="machine-config-daemon" containerID="cri-o://292f72da4b193a86cc630250466734e7dc7d208705ab2998ede991d595170052" gracePeriod=600 Dec 03 08:13:29 crc kubenswrapper[4475]: E1203 08:13:29.063462 4475 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-tjbzg_openshift-machine-config-operator(91aee7be-4a52-4598-803f-2deebe0674de)\"" pod="openshift-machine-config-operator/machine-config-daemon-tjbzg" podUID="91aee7be-4a52-4598-803f-2deebe0674de" Dec 03 08:13:29 crc kubenswrapper[4475]: I1203 08:13:29.655068 4475 generic.go:334] "Generic (PLEG): container finished" 
podID="91aee7be-4a52-4598-803f-2deebe0674de" containerID="292f72da4b193a86cc630250466734e7dc7d208705ab2998ede991d595170052" exitCode=0 Dec 03 08:13:29 crc kubenswrapper[4475]: I1203 08:13:29.655154 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-tjbzg" event={"ID":"91aee7be-4a52-4598-803f-2deebe0674de","Type":"ContainerDied","Data":"292f72da4b193a86cc630250466734e7dc7d208705ab2998ede991d595170052"} Dec 03 08:13:29 crc kubenswrapper[4475]: I1203 08:13:29.655217 4475 scope.go:117] "RemoveContainer" containerID="231211e29e49928b9fe859fe2cab1239bf3f55652cc1b763bc66b1b5957606a5" Dec 03 08:13:29 crc kubenswrapper[4475]: I1203 08:13:29.657504 4475 scope.go:117] "RemoveContainer" containerID="292f72da4b193a86cc630250466734e7dc7d208705ab2998ede991d595170052" Dec 03 08:13:29 crc kubenswrapper[4475]: E1203 08:13:29.658286 4475 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-tjbzg_openshift-machine-config-operator(91aee7be-4a52-4598-803f-2deebe0674de)\"" pod="openshift-machine-config-operator/machine-config-daemon-tjbzg" podUID="91aee7be-4a52-4598-803f-2deebe0674de" Dec 03 08:13:43 crc kubenswrapper[4475]: I1203 08:13:43.493813 4475 scope.go:117] "RemoveContainer" containerID="292f72da4b193a86cc630250466734e7dc7d208705ab2998ede991d595170052" Dec 03 08:13:43 crc kubenswrapper[4475]: E1203 08:13:43.494858 4475 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-tjbzg_openshift-machine-config-operator(91aee7be-4a52-4598-803f-2deebe0674de)\"" pod="openshift-machine-config-operator/machine-config-daemon-tjbzg" podUID="91aee7be-4a52-4598-803f-2deebe0674de" Dec 03 
08:13:57 crc kubenswrapper[4475]: I1203 08:13:57.492097 4475 scope.go:117] "RemoveContainer" containerID="292f72da4b193a86cc630250466734e7dc7d208705ab2998ede991d595170052" Dec 03 08:13:57 crc kubenswrapper[4475]: E1203 08:13:57.493673 4475 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-tjbzg_openshift-machine-config-operator(91aee7be-4a52-4598-803f-2deebe0674de)\"" pod="openshift-machine-config-operator/machine-config-daemon-tjbzg" podUID="91aee7be-4a52-4598-803f-2deebe0674de" Dec 03 08:14:08 crc kubenswrapper[4475]: I1203 08:14:08.491529 4475 scope.go:117] "RemoveContainer" containerID="292f72da4b193a86cc630250466734e7dc7d208705ab2998ede991d595170052" Dec 03 08:14:08 crc kubenswrapper[4475]: E1203 08:14:08.492413 4475 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-tjbzg_openshift-machine-config-operator(91aee7be-4a52-4598-803f-2deebe0674de)\"" pod="openshift-machine-config-operator/machine-config-daemon-tjbzg" podUID="91aee7be-4a52-4598-803f-2deebe0674de" Dec 03 08:14:21 crc kubenswrapper[4475]: I1203 08:14:21.492111 4475 scope.go:117] "RemoveContainer" containerID="292f72da4b193a86cc630250466734e7dc7d208705ab2998ede991d595170052" Dec 03 08:14:21 crc kubenswrapper[4475]: E1203 08:14:21.493434 4475 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-tjbzg_openshift-machine-config-operator(91aee7be-4a52-4598-803f-2deebe0674de)\"" pod="openshift-machine-config-operator/machine-config-daemon-tjbzg" 
podUID="91aee7be-4a52-4598-803f-2deebe0674de" Dec 03 08:14:32 crc kubenswrapper[4475]: I1203 08:14:32.491610 4475 scope.go:117] "RemoveContainer" containerID="292f72da4b193a86cc630250466734e7dc7d208705ab2998ede991d595170052" Dec 03 08:14:32 crc kubenswrapper[4475]: E1203 08:14:32.492379 4475 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-tjbzg_openshift-machine-config-operator(91aee7be-4a52-4598-803f-2deebe0674de)\"" pod="openshift-machine-config-operator/machine-config-daemon-tjbzg" podUID="91aee7be-4a52-4598-803f-2deebe0674de" Dec 03 08:14:45 crc kubenswrapper[4475]: I1203 08:14:45.492008 4475 scope.go:117] "RemoveContainer" containerID="292f72da4b193a86cc630250466734e7dc7d208705ab2998ede991d595170052" Dec 03 08:14:45 crc kubenswrapper[4475]: E1203 08:14:45.492900 4475 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-tjbzg_openshift-machine-config-operator(91aee7be-4a52-4598-803f-2deebe0674de)\"" pod="openshift-machine-config-operator/machine-config-daemon-tjbzg" podUID="91aee7be-4a52-4598-803f-2deebe0674de" Dec 03 08:14:59 crc kubenswrapper[4475]: I1203 08:14:59.492053 4475 scope.go:117] "RemoveContainer" containerID="292f72da4b193a86cc630250466734e7dc7d208705ab2998ede991d595170052" Dec 03 08:14:59 crc kubenswrapper[4475]: E1203 08:14:59.493365 4475 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-tjbzg_openshift-machine-config-operator(91aee7be-4a52-4598-803f-2deebe0674de)\"" 
pod="openshift-machine-config-operator/machine-config-daemon-tjbzg" podUID="91aee7be-4a52-4598-803f-2deebe0674de" Dec 03 08:15:00 crc kubenswrapper[4475]: I1203 08:15:00.202363 4475 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29412495-vfswv"] Dec 03 08:15:00 crc kubenswrapper[4475]: E1203 08:15:00.203708 4475 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="48dd1bfb-8bef-414a-ab0e-233f3a1ca3ed" containerName="extract-utilities" Dec 03 08:15:00 crc kubenswrapper[4475]: I1203 08:15:00.203838 4475 state_mem.go:107] "Deleted CPUSet assignment" podUID="48dd1bfb-8bef-414a-ab0e-233f3a1ca3ed" containerName="extract-utilities" Dec 03 08:15:00 crc kubenswrapper[4475]: E1203 08:15:00.203920 4475 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="48dd1bfb-8bef-414a-ab0e-233f3a1ca3ed" containerName="registry-server" Dec 03 08:15:00 crc kubenswrapper[4475]: I1203 08:15:00.203975 4475 state_mem.go:107] "Deleted CPUSet assignment" podUID="48dd1bfb-8bef-414a-ab0e-233f3a1ca3ed" containerName="registry-server" Dec 03 08:15:00 crc kubenswrapper[4475]: E1203 08:15:00.204061 4475 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e92612e9-991d-44e2-9670-d8a7d97064e5" containerName="extract-content" Dec 03 08:15:00 crc kubenswrapper[4475]: I1203 08:15:00.204111 4475 state_mem.go:107] "Deleted CPUSet assignment" podUID="e92612e9-991d-44e2-9670-d8a7d97064e5" containerName="extract-content" Dec 03 08:15:00 crc kubenswrapper[4475]: E1203 08:15:00.204179 4475 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e92612e9-991d-44e2-9670-d8a7d97064e5" containerName="extract-utilities" Dec 03 08:15:00 crc kubenswrapper[4475]: I1203 08:15:00.204247 4475 state_mem.go:107] "Deleted CPUSet assignment" podUID="e92612e9-991d-44e2-9670-d8a7d97064e5" containerName="extract-utilities" Dec 03 08:15:00 crc kubenswrapper[4475]: E1203 08:15:00.204310 4475 cpu_manager.go:410] 
"RemoveStaleState: removing container" podUID="e92612e9-991d-44e2-9670-d8a7d97064e5" containerName="registry-server" Dec 03 08:15:00 crc kubenswrapper[4475]: I1203 08:15:00.204367 4475 state_mem.go:107] "Deleted CPUSet assignment" podUID="e92612e9-991d-44e2-9670-d8a7d97064e5" containerName="registry-server" Dec 03 08:15:00 crc kubenswrapper[4475]: E1203 08:15:00.204508 4475 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="48dd1bfb-8bef-414a-ab0e-233f3a1ca3ed" containerName="extract-content" Dec 03 08:15:00 crc kubenswrapper[4475]: I1203 08:15:00.204518 4475 state_mem.go:107] "Deleted CPUSet assignment" podUID="48dd1bfb-8bef-414a-ab0e-233f3a1ca3ed" containerName="extract-content" Dec 03 08:15:00 crc kubenswrapper[4475]: I1203 08:15:00.204995 4475 memory_manager.go:354] "RemoveStaleState removing state" podUID="e92612e9-991d-44e2-9670-d8a7d97064e5" containerName="registry-server" Dec 03 08:15:00 crc kubenswrapper[4475]: I1203 08:15:00.205081 4475 memory_manager.go:354] "RemoveStaleState removing state" podUID="48dd1bfb-8bef-414a-ab0e-233f3a1ca3ed" containerName="registry-server" Dec 03 08:15:00 crc kubenswrapper[4475]: I1203 08:15:00.206072 4475 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29412495-vfswv" Dec 03 08:15:00 crc kubenswrapper[4475]: I1203 08:15:00.216421 4475 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Dec 03 08:15:00 crc kubenswrapper[4475]: I1203 08:15:00.216428 4475 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Dec 03 08:15:00 crc kubenswrapper[4475]: I1203 08:15:00.224308 4475 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29412495-vfswv"] Dec 03 08:15:00 crc kubenswrapper[4475]: I1203 08:15:00.245110 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/7eaed4ee-f926-4fce-a220-dd941a464f6c-config-volume\") pod \"collect-profiles-29412495-vfswv\" (UID: \"7eaed4ee-f926-4fce-a220-dd941a464f6c\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29412495-vfswv" Dec 03 08:15:00 crc kubenswrapper[4475]: I1203 08:15:00.245255 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dhdj2\" (UniqueName: \"kubernetes.io/projected/7eaed4ee-f926-4fce-a220-dd941a464f6c-kube-api-access-dhdj2\") pod \"collect-profiles-29412495-vfswv\" (UID: \"7eaed4ee-f926-4fce-a220-dd941a464f6c\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29412495-vfswv" Dec 03 08:15:00 crc kubenswrapper[4475]: I1203 08:15:00.245574 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/7eaed4ee-f926-4fce-a220-dd941a464f6c-secret-volume\") pod \"collect-profiles-29412495-vfswv\" (UID: \"7eaed4ee-f926-4fce-a220-dd941a464f6c\") " 
pod="openshift-operator-lifecycle-manager/collect-profiles-29412495-vfswv" Dec 03 08:15:00 crc kubenswrapper[4475]: I1203 08:15:00.348299 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/7eaed4ee-f926-4fce-a220-dd941a464f6c-secret-volume\") pod \"collect-profiles-29412495-vfswv\" (UID: \"7eaed4ee-f926-4fce-a220-dd941a464f6c\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29412495-vfswv" Dec 03 08:15:00 crc kubenswrapper[4475]: I1203 08:15:00.349012 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/7eaed4ee-f926-4fce-a220-dd941a464f6c-config-volume\") pod \"collect-profiles-29412495-vfswv\" (UID: \"7eaed4ee-f926-4fce-a220-dd941a464f6c\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29412495-vfswv" Dec 03 08:15:00 crc kubenswrapper[4475]: I1203 08:15:00.349319 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dhdj2\" (UniqueName: \"kubernetes.io/projected/7eaed4ee-f926-4fce-a220-dd941a464f6c-kube-api-access-dhdj2\") pod \"collect-profiles-29412495-vfswv\" (UID: \"7eaed4ee-f926-4fce-a220-dd941a464f6c\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29412495-vfswv" Dec 03 08:15:00 crc kubenswrapper[4475]: I1203 08:15:00.349848 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/7eaed4ee-f926-4fce-a220-dd941a464f6c-config-volume\") pod \"collect-profiles-29412495-vfswv\" (UID: \"7eaed4ee-f926-4fce-a220-dd941a464f6c\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29412495-vfswv" Dec 03 08:15:00 crc kubenswrapper[4475]: I1203 08:15:00.358238 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: 
\"kubernetes.io/secret/7eaed4ee-f926-4fce-a220-dd941a464f6c-secret-volume\") pod \"collect-profiles-29412495-vfswv\" (UID: \"7eaed4ee-f926-4fce-a220-dd941a464f6c\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29412495-vfswv" Dec 03 08:15:00 crc kubenswrapper[4475]: I1203 08:15:00.369703 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dhdj2\" (UniqueName: \"kubernetes.io/projected/7eaed4ee-f926-4fce-a220-dd941a464f6c-kube-api-access-dhdj2\") pod \"collect-profiles-29412495-vfswv\" (UID: \"7eaed4ee-f926-4fce-a220-dd941a464f6c\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29412495-vfswv" Dec 03 08:15:00 crc kubenswrapper[4475]: I1203 08:15:00.527386 4475 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29412495-vfswv" Dec 03 08:15:00 crc kubenswrapper[4475]: I1203 08:15:00.980824 4475 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29412495-vfswv"] Dec 03 08:15:01 crc kubenswrapper[4475]: I1203 08:15:01.479477 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29412495-vfswv" event={"ID":"7eaed4ee-f926-4fce-a220-dd941a464f6c","Type":"ContainerStarted","Data":"8936dc26750955769a4c771137e77ad6db46e9b8fc4f0b236b839242ae8d2a4f"} Dec 03 08:15:01 crc kubenswrapper[4475]: I1203 08:15:01.479809 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29412495-vfswv" event={"ID":"7eaed4ee-f926-4fce-a220-dd941a464f6c","Type":"ContainerStarted","Data":"189c173f65ca54404937c467c1cdf0069ef8abc67932a2805eca437428ea0b82"} Dec 03 08:15:01 crc kubenswrapper[4475]: I1203 08:15:01.496397 4475 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/collect-profiles-29412495-vfswv" 
podStartSLOduration=1.496378713 podStartE2EDuration="1.496378713s" podCreationTimestamp="2025-12-03 08:15:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 08:15:01.492103706 +0000 UTC m=+5386.297002030" watchObservedRunningTime="2025-12-03 08:15:01.496378713 +0000 UTC m=+5386.301277047" Dec 03 08:15:02 crc kubenswrapper[4475]: I1203 08:15:02.492115 4475 generic.go:334] "Generic (PLEG): container finished" podID="7eaed4ee-f926-4fce-a220-dd941a464f6c" containerID="8936dc26750955769a4c771137e77ad6db46e9b8fc4f0b236b839242ae8d2a4f" exitCode=0 Dec 03 08:15:02 crc kubenswrapper[4475]: I1203 08:15:02.492510 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29412495-vfswv" event={"ID":"7eaed4ee-f926-4fce-a220-dd941a464f6c","Type":"ContainerDied","Data":"8936dc26750955769a4c771137e77ad6db46e9b8fc4f0b236b839242ae8d2a4f"} Dec 03 08:15:03 crc kubenswrapper[4475]: I1203 08:15:03.833570 4475 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29412495-vfswv" Dec 03 08:15:03 crc kubenswrapper[4475]: I1203 08:15:03.951261 4475 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/7eaed4ee-f926-4fce-a220-dd941a464f6c-secret-volume\") pod \"7eaed4ee-f926-4fce-a220-dd941a464f6c\" (UID: \"7eaed4ee-f926-4fce-a220-dd941a464f6c\") " Dec 03 08:15:03 crc kubenswrapper[4475]: I1203 08:15:03.951328 4475 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dhdj2\" (UniqueName: \"kubernetes.io/projected/7eaed4ee-f926-4fce-a220-dd941a464f6c-kube-api-access-dhdj2\") pod \"7eaed4ee-f926-4fce-a220-dd941a464f6c\" (UID: \"7eaed4ee-f926-4fce-a220-dd941a464f6c\") " Dec 03 08:15:03 crc kubenswrapper[4475]: I1203 08:15:03.951363 4475 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/7eaed4ee-f926-4fce-a220-dd941a464f6c-config-volume\") pod \"7eaed4ee-f926-4fce-a220-dd941a464f6c\" (UID: \"7eaed4ee-f926-4fce-a220-dd941a464f6c\") " Dec 03 08:15:03 crc kubenswrapper[4475]: I1203 08:15:03.952751 4475 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7eaed4ee-f926-4fce-a220-dd941a464f6c-config-volume" (OuterVolumeSpecName: "config-volume") pod "7eaed4ee-f926-4fce-a220-dd941a464f6c" (UID: "7eaed4ee-f926-4fce-a220-dd941a464f6c"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 08:15:03 crc kubenswrapper[4475]: I1203 08:15:03.958504 4475 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7eaed4ee-f926-4fce-a220-dd941a464f6c-kube-api-access-dhdj2" (OuterVolumeSpecName: "kube-api-access-dhdj2") pod "7eaed4ee-f926-4fce-a220-dd941a464f6c" (UID: "7eaed4ee-f926-4fce-a220-dd941a464f6c"). 
InnerVolumeSpecName "kube-api-access-dhdj2". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 08:15:03 crc kubenswrapper[4475]: I1203 08:15:03.958560 4475 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7eaed4ee-f926-4fce-a220-dd941a464f6c-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "7eaed4ee-f926-4fce-a220-dd941a464f6c" (UID: "7eaed4ee-f926-4fce-a220-dd941a464f6c"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 08:15:04 crc kubenswrapper[4475]: I1203 08:15:04.055061 4475 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/7eaed4ee-f926-4fce-a220-dd941a464f6c-secret-volume\") on node \"crc\" DevicePath \"\"" Dec 03 08:15:04 crc kubenswrapper[4475]: I1203 08:15:04.055106 4475 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dhdj2\" (UniqueName: \"kubernetes.io/projected/7eaed4ee-f926-4fce-a220-dd941a464f6c-kube-api-access-dhdj2\") on node \"crc\" DevicePath \"\"" Dec 03 08:15:04 crc kubenswrapper[4475]: I1203 08:15:04.055118 4475 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/7eaed4ee-f926-4fce-a220-dd941a464f6c-config-volume\") on node \"crc\" DevicePath \"\"" Dec 03 08:15:04 crc kubenswrapper[4475]: I1203 08:15:04.513419 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29412495-vfswv" event={"ID":"7eaed4ee-f926-4fce-a220-dd941a464f6c","Type":"ContainerDied","Data":"189c173f65ca54404937c467c1cdf0069ef8abc67932a2805eca437428ea0b82"} Dec 03 08:15:04 crc kubenswrapper[4475]: I1203 08:15:04.513811 4475 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="189c173f65ca54404937c467c1cdf0069ef8abc67932a2805eca437428ea0b82" Dec 03 08:15:04 crc kubenswrapper[4475]: I1203 08:15:04.513511 4475 util.go:48] "No ready 
sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29412495-vfswv" Dec 03 08:15:04 crc kubenswrapper[4475]: I1203 08:15:04.576570 4475 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29412450-bf2vk"] Dec 03 08:15:04 crc kubenswrapper[4475]: I1203 08:15:04.583200 4475 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29412450-bf2vk"] Dec 03 08:15:05 crc kubenswrapper[4475]: I1203 08:15:05.501821 4475 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3095202a-352a-453a-b3e1-b6ecc8c3d661" path="/var/lib/kubelet/pods/3095202a-352a-453a-b3e1-b6ecc8c3d661/volumes" Dec 03 08:15:10 crc kubenswrapper[4475]: I1203 08:15:10.491332 4475 scope.go:117] "RemoveContainer" containerID="292f72da4b193a86cc630250466734e7dc7d208705ab2998ede991d595170052" Dec 03 08:15:10 crc kubenswrapper[4475]: E1203 08:15:10.491901 4475 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-tjbzg_openshift-machine-config-operator(91aee7be-4a52-4598-803f-2deebe0674de)\"" pod="openshift-machine-config-operator/machine-config-daemon-tjbzg" podUID="91aee7be-4a52-4598-803f-2deebe0674de" Dec 03 08:15:21 crc kubenswrapper[4475]: I1203 08:15:21.491102 4475 scope.go:117] "RemoveContainer" containerID="292f72da4b193a86cc630250466734e7dc7d208705ab2998ede991d595170052" Dec 03 08:15:21 crc kubenswrapper[4475]: E1203 08:15:21.491999 4475 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-tjbzg_openshift-machine-config-operator(91aee7be-4a52-4598-803f-2deebe0674de)\"" 
pod="openshift-machine-config-operator/machine-config-daemon-tjbzg" podUID="91aee7be-4a52-4598-803f-2deebe0674de" Dec 03 08:15:30 crc kubenswrapper[4475]: I1203 08:15:30.307001 4475 scope.go:117] "RemoveContainer" containerID="d9349ff08664ec4b7231ae1904ba88086190af5ba690c9565ea3aaba849993b8" Dec 03 08:15:33 crc kubenswrapper[4475]: I1203 08:15:33.491362 4475 scope.go:117] "RemoveContainer" containerID="292f72da4b193a86cc630250466734e7dc7d208705ab2998ede991d595170052" Dec 03 08:15:33 crc kubenswrapper[4475]: E1203 08:15:33.492338 4475 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-tjbzg_openshift-machine-config-operator(91aee7be-4a52-4598-803f-2deebe0674de)\"" pod="openshift-machine-config-operator/machine-config-daemon-tjbzg" podUID="91aee7be-4a52-4598-803f-2deebe0674de" Dec 03 08:15:45 crc kubenswrapper[4475]: I1203 08:15:45.499510 4475 scope.go:117] "RemoveContainer" containerID="292f72da4b193a86cc630250466734e7dc7d208705ab2998ede991d595170052" Dec 03 08:15:45 crc kubenswrapper[4475]: E1203 08:15:45.500853 4475 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-tjbzg_openshift-machine-config-operator(91aee7be-4a52-4598-803f-2deebe0674de)\"" pod="openshift-machine-config-operator/machine-config-daemon-tjbzg" podUID="91aee7be-4a52-4598-803f-2deebe0674de" Dec 03 08:15:56 crc kubenswrapper[4475]: I1203 08:15:56.492017 4475 scope.go:117] "RemoveContainer" containerID="292f72da4b193a86cc630250466734e7dc7d208705ab2998ede991d595170052" Dec 03 08:15:56 crc kubenswrapper[4475]: E1203 08:15:56.493659 4475 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" 
with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-tjbzg_openshift-machine-config-operator(91aee7be-4a52-4598-803f-2deebe0674de)\"" pod="openshift-machine-config-operator/machine-config-daemon-tjbzg" podUID="91aee7be-4a52-4598-803f-2deebe0674de" Dec 03 08:16:00 crc kubenswrapper[4475]: I1203 08:16:00.067267 4475 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-vjvv7"] Dec 03 08:16:00 crc kubenswrapper[4475]: E1203 08:16:00.068128 4475 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7eaed4ee-f926-4fce-a220-dd941a464f6c" containerName="collect-profiles" Dec 03 08:16:00 crc kubenswrapper[4475]: I1203 08:16:00.068141 4475 state_mem.go:107] "Deleted CPUSet assignment" podUID="7eaed4ee-f926-4fce-a220-dd941a464f6c" containerName="collect-profiles" Dec 03 08:16:00 crc kubenswrapper[4475]: I1203 08:16:00.068337 4475 memory_manager.go:354] "RemoveStaleState removing state" podUID="7eaed4ee-f926-4fce-a220-dd941a464f6c" containerName="collect-profiles" Dec 03 08:16:00 crc kubenswrapper[4475]: I1203 08:16:00.069544 4475 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-vjvv7" Dec 03 08:16:00 crc kubenswrapper[4475]: I1203 08:16:00.076692 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9r7zx\" (UniqueName: \"kubernetes.io/projected/e3e91348-acb2-46f4-8e29-b770e1e4bd24-kube-api-access-9r7zx\") pod \"redhat-operators-vjvv7\" (UID: \"e3e91348-acb2-46f4-8e29-b770e1e4bd24\") " pod="openshift-marketplace/redhat-operators-vjvv7" Dec 03 08:16:00 crc kubenswrapper[4475]: I1203 08:16:00.076747 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e3e91348-acb2-46f4-8e29-b770e1e4bd24-utilities\") pod \"redhat-operators-vjvv7\" (UID: \"e3e91348-acb2-46f4-8e29-b770e1e4bd24\") " pod="openshift-marketplace/redhat-operators-vjvv7" Dec 03 08:16:00 crc kubenswrapper[4475]: I1203 08:16:00.076778 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e3e91348-acb2-46f4-8e29-b770e1e4bd24-catalog-content\") pod \"redhat-operators-vjvv7\" (UID: \"e3e91348-acb2-46f4-8e29-b770e1e4bd24\") " pod="openshift-marketplace/redhat-operators-vjvv7" Dec 03 08:16:00 crc kubenswrapper[4475]: I1203 08:16:00.083738 4475 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-vjvv7"] Dec 03 08:16:00 crc kubenswrapper[4475]: I1203 08:16:00.178339 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9r7zx\" (UniqueName: \"kubernetes.io/projected/e3e91348-acb2-46f4-8e29-b770e1e4bd24-kube-api-access-9r7zx\") pod \"redhat-operators-vjvv7\" (UID: \"e3e91348-acb2-46f4-8e29-b770e1e4bd24\") " pod="openshift-marketplace/redhat-operators-vjvv7" Dec 03 08:16:00 crc kubenswrapper[4475]: I1203 08:16:00.178617 4475 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e3e91348-acb2-46f4-8e29-b770e1e4bd24-utilities\") pod \"redhat-operators-vjvv7\" (UID: \"e3e91348-acb2-46f4-8e29-b770e1e4bd24\") " pod="openshift-marketplace/redhat-operators-vjvv7" Dec 03 08:16:00 crc kubenswrapper[4475]: I1203 08:16:00.178772 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e3e91348-acb2-46f4-8e29-b770e1e4bd24-catalog-content\") pod \"redhat-operators-vjvv7\" (UID: \"e3e91348-acb2-46f4-8e29-b770e1e4bd24\") " pod="openshift-marketplace/redhat-operators-vjvv7" Dec 03 08:16:00 crc kubenswrapper[4475]: I1203 08:16:00.179259 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e3e91348-acb2-46f4-8e29-b770e1e4bd24-catalog-content\") pod \"redhat-operators-vjvv7\" (UID: \"e3e91348-acb2-46f4-8e29-b770e1e4bd24\") " pod="openshift-marketplace/redhat-operators-vjvv7" Dec 03 08:16:00 crc kubenswrapper[4475]: I1203 08:16:00.179330 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e3e91348-acb2-46f4-8e29-b770e1e4bd24-utilities\") pod \"redhat-operators-vjvv7\" (UID: \"e3e91348-acb2-46f4-8e29-b770e1e4bd24\") " pod="openshift-marketplace/redhat-operators-vjvv7" Dec 03 08:16:00 crc kubenswrapper[4475]: I1203 08:16:00.247140 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9r7zx\" (UniqueName: \"kubernetes.io/projected/e3e91348-acb2-46f4-8e29-b770e1e4bd24-kube-api-access-9r7zx\") pod \"redhat-operators-vjvv7\" (UID: \"e3e91348-acb2-46f4-8e29-b770e1e4bd24\") " pod="openshift-marketplace/redhat-operators-vjvv7" Dec 03 08:16:00 crc kubenswrapper[4475]: I1203 08:16:00.385392 4475 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-vjvv7" Dec 03 08:16:00 crc kubenswrapper[4475]: I1203 08:16:00.831009 4475 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-vjvv7"] Dec 03 08:16:01 crc kubenswrapper[4475]: I1203 08:16:01.044670 4475 generic.go:334] "Generic (PLEG): container finished" podID="e3e91348-acb2-46f4-8e29-b770e1e4bd24" containerID="27087f224c6174f982ad584de3f370589f04d5db5477f5c5409b6b55502d784b" exitCode=0 Dec 03 08:16:01 crc kubenswrapper[4475]: I1203 08:16:01.044871 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-vjvv7" event={"ID":"e3e91348-acb2-46f4-8e29-b770e1e4bd24","Type":"ContainerDied","Data":"27087f224c6174f982ad584de3f370589f04d5db5477f5c5409b6b55502d784b"} Dec 03 08:16:01 crc kubenswrapper[4475]: I1203 08:16:01.044932 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-vjvv7" event={"ID":"e3e91348-acb2-46f4-8e29-b770e1e4bd24","Type":"ContainerStarted","Data":"48fcae15c0dd3230841547a3306766e8c7123aa829a0208f901083082d7b89cd"} Dec 03 08:16:03 crc kubenswrapper[4475]: I1203 08:16:03.067270 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-vjvv7" event={"ID":"e3e91348-acb2-46f4-8e29-b770e1e4bd24","Type":"ContainerStarted","Data":"238608a96cd6daf7a5f2d1979b90bcb1fcd5cd0561082e66a4a439d9bc163d6a"} Dec 03 08:16:05 crc kubenswrapper[4475]: I1203 08:16:05.086119 4475 generic.go:334] "Generic (PLEG): container finished" podID="e3e91348-acb2-46f4-8e29-b770e1e4bd24" containerID="238608a96cd6daf7a5f2d1979b90bcb1fcd5cd0561082e66a4a439d9bc163d6a" exitCode=0 Dec 03 08:16:05 crc kubenswrapper[4475]: I1203 08:16:05.086345 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-vjvv7" 
event={"ID":"e3e91348-acb2-46f4-8e29-b770e1e4bd24","Type":"ContainerDied","Data":"238608a96cd6daf7a5f2d1979b90bcb1fcd5cd0561082e66a4a439d9bc163d6a"} Dec 03 08:16:06 crc kubenswrapper[4475]: I1203 08:16:06.098320 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-vjvv7" event={"ID":"e3e91348-acb2-46f4-8e29-b770e1e4bd24","Type":"ContainerStarted","Data":"91558571fdc19e371d9f3bc5d2c2d2954c3d20588aea9ce5b8f08351e556807b"} Dec 03 08:16:06 crc kubenswrapper[4475]: I1203 08:16:06.119922 4475 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-vjvv7" podStartSLOduration=1.597975055 podStartE2EDuration="6.119901621s" podCreationTimestamp="2025-12-03 08:16:00 +0000 UTC" firstStartedPulling="2025-12-03 08:16:01.046198364 +0000 UTC m=+5445.851096698" lastFinishedPulling="2025-12-03 08:16:05.568124929 +0000 UTC m=+5450.373023264" observedRunningTime="2025-12-03 08:16:06.11484773 +0000 UTC m=+5450.919746064" watchObservedRunningTime="2025-12-03 08:16:06.119901621 +0000 UTC m=+5450.924799955" Dec 03 08:16:09 crc kubenswrapper[4475]: I1203 08:16:09.491672 4475 scope.go:117] "RemoveContainer" containerID="292f72da4b193a86cc630250466734e7dc7d208705ab2998ede991d595170052" Dec 03 08:16:09 crc kubenswrapper[4475]: E1203 08:16:09.493301 4475 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-tjbzg_openshift-machine-config-operator(91aee7be-4a52-4598-803f-2deebe0674de)\"" pod="openshift-machine-config-operator/machine-config-daemon-tjbzg" podUID="91aee7be-4a52-4598-803f-2deebe0674de" Dec 03 08:16:10 crc kubenswrapper[4475]: I1203 08:16:10.386515 4475 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-vjvv7" Dec 03 08:16:10 crc kubenswrapper[4475]: 
I1203 08:16:10.386579 4475 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-vjvv7" Dec 03 08:16:11 crc kubenswrapper[4475]: I1203 08:16:11.451432 4475 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-vjvv7" podUID="e3e91348-acb2-46f4-8e29-b770e1e4bd24" containerName="registry-server" probeResult="failure" output=< Dec 03 08:16:11 crc kubenswrapper[4475]: timeout: failed to connect service ":50051" within 1s Dec 03 08:16:11 crc kubenswrapper[4475]: > Dec 03 08:16:20 crc kubenswrapper[4475]: I1203 08:16:20.434312 4475 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-vjvv7" Dec 03 08:16:20 crc kubenswrapper[4475]: I1203 08:16:20.474201 4475 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-vjvv7" Dec 03 08:16:20 crc kubenswrapper[4475]: I1203 08:16:20.676027 4475 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-vjvv7"] Dec 03 08:16:22 crc kubenswrapper[4475]: I1203 08:16:22.243144 4475 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-vjvv7" podUID="e3e91348-acb2-46f4-8e29-b770e1e4bd24" containerName="registry-server" containerID="cri-o://91558571fdc19e371d9f3bc5d2c2d2954c3d20588aea9ce5b8f08351e556807b" gracePeriod=2 Dec 03 08:16:22 crc kubenswrapper[4475]: I1203 08:16:22.491366 4475 scope.go:117] "RemoveContainer" containerID="292f72da4b193a86cc630250466734e7dc7d208705ab2998ede991d595170052" Dec 03 08:16:22 crc kubenswrapper[4475]: E1203 08:16:22.491662 4475 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-tjbzg_openshift-machine-config-operator(91aee7be-4a52-4598-803f-2deebe0674de)\"" pod="openshift-machine-config-operator/machine-config-daemon-tjbzg" podUID="91aee7be-4a52-4598-803f-2deebe0674de" Dec 03 08:16:22 crc kubenswrapper[4475]: I1203 08:16:22.739424 4475 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-vjvv7" Dec 03 08:16:22 crc kubenswrapper[4475]: I1203 08:16:22.831562 4475 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e3e91348-acb2-46f4-8e29-b770e1e4bd24-utilities\") pod \"e3e91348-acb2-46f4-8e29-b770e1e4bd24\" (UID: \"e3e91348-acb2-46f4-8e29-b770e1e4bd24\") " Dec 03 08:16:22 crc kubenswrapper[4475]: I1203 08:16:22.831621 4475 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e3e91348-acb2-46f4-8e29-b770e1e4bd24-catalog-content\") pod \"e3e91348-acb2-46f4-8e29-b770e1e4bd24\" (UID: \"e3e91348-acb2-46f4-8e29-b770e1e4bd24\") " Dec 03 08:16:22 crc kubenswrapper[4475]: I1203 08:16:22.831660 4475 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9r7zx\" (UniqueName: \"kubernetes.io/projected/e3e91348-acb2-46f4-8e29-b770e1e4bd24-kube-api-access-9r7zx\") pod \"e3e91348-acb2-46f4-8e29-b770e1e4bd24\" (UID: \"e3e91348-acb2-46f4-8e29-b770e1e4bd24\") " Dec 03 08:16:22 crc kubenswrapper[4475]: I1203 08:16:22.832110 4475 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e3e91348-acb2-46f4-8e29-b770e1e4bd24-utilities" (OuterVolumeSpecName: "utilities") pod "e3e91348-acb2-46f4-8e29-b770e1e4bd24" (UID: "e3e91348-acb2-46f4-8e29-b770e1e4bd24"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 08:16:22 crc kubenswrapper[4475]: I1203 08:16:22.833485 4475 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e3e91348-acb2-46f4-8e29-b770e1e4bd24-utilities\") on node \"crc\" DevicePath \"\"" Dec 03 08:16:22 crc kubenswrapper[4475]: I1203 08:16:22.838048 4475 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e3e91348-acb2-46f4-8e29-b770e1e4bd24-kube-api-access-9r7zx" (OuterVolumeSpecName: "kube-api-access-9r7zx") pod "e3e91348-acb2-46f4-8e29-b770e1e4bd24" (UID: "e3e91348-acb2-46f4-8e29-b770e1e4bd24"). InnerVolumeSpecName "kube-api-access-9r7zx". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 08:16:22 crc kubenswrapper[4475]: I1203 08:16:22.906862 4475 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e3e91348-acb2-46f4-8e29-b770e1e4bd24-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "e3e91348-acb2-46f4-8e29-b770e1e4bd24" (UID: "e3e91348-acb2-46f4-8e29-b770e1e4bd24"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 08:16:22 crc kubenswrapper[4475]: I1203 08:16:22.935233 4475 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e3e91348-acb2-46f4-8e29-b770e1e4bd24-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 03 08:16:22 crc kubenswrapper[4475]: I1203 08:16:22.935350 4475 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9r7zx\" (UniqueName: \"kubernetes.io/projected/e3e91348-acb2-46f4-8e29-b770e1e4bd24-kube-api-access-9r7zx\") on node \"crc\" DevicePath \"\"" Dec 03 08:16:23 crc kubenswrapper[4475]: I1203 08:16:23.253727 4475 generic.go:334] "Generic (PLEG): container finished" podID="e3e91348-acb2-46f4-8e29-b770e1e4bd24" containerID="91558571fdc19e371d9f3bc5d2c2d2954c3d20588aea9ce5b8f08351e556807b" exitCode=0 Dec 03 08:16:23 crc kubenswrapper[4475]: I1203 08:16:23.253788 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-vjvv7" event={"ID":"e3e91348-acb2-46f4-8e29-b770e1e4bd24","Type":"ContainerDied","Data":"91558571fdc19e371d9f3bc5d2c2d2954c3d20588aea9ce5b8f08351e556807b"} Dec 03 08:16:23 crc kubenswrapper[4475]: I1203 08:16:23.253846 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-vjvv7" event={"ID":"e3e91348-acb2-46f4-8e29-b770e1e4bd24","Type":"ContainerDied","Data":"48fcae15c0dd3230841547a3306766e8c7123aa829a0208f901083082d7b89cd"} Dec 03 08:16:23 crc kubenswrapper[4475]: I1203 08:16:23.253872 4475 scope.go:117] "RemoveContainer" containerID="91558571fdc19e371d9f3bc5d2c2d2954c3d20588aea9ce5b8f08351e556807b" Dec 03 08:16:23 crc kubenswrapper[4475]: I1203 08:16:23.255584 4475 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-vjvv7" Dec 03 08:16:23 crc kubenswrapper[4475]: I1203 08:16:23.274344 4475 scope.go:117] "RemoveContainer" containerID="238608a96cd6daf7a5f2d1979b90bcb1fcd5cd0561082e66a4a439d9bc163d6a" Dec 03 08:16:23 crc kubenswrapper[4475]: I1203 08:16:23.291844 4475 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-vjvv7"] Dec 03 08:16:23 crc kubenswrapper[4475]: I1203 08:16:23.301714 4475 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-vjvv7"] Dec 03 08:16:23 crc kubenswrapper[4475]: I1203 08:16:23.309317 4475 scope.go:117] "RemoveContainer" containerID="27087f224c6174f982ad584de3f370589f04d5db5477f5c5409b6b55502d784b" Dec 03 08:16:23 crc kubenswrapper[4475]: I1203 08:16:23.347286 4475 scope.go:117] "RemoveContainer" containerID="91558571fdc19e371d9f3bc5d2c2d2954c3d20588aea9ce5b8f08351e556807b" Dec 03 08:16:23 crc kubenswrapper[4475]: E1203 08:16:23.347804 4475 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"91558571fdc19e371d9f3bc5d2c2d2954c3d20588aea9ce5b8f08351e556807b\": container with ID starting with 91558571fdc19e371d9f3bc5d2c2d2954c3d20588aea9ce5b8f08351e556807b not found: ID does not exist" containerID="91558571fdc19e371d9f3bc5d2c2d2954c3d20588aea9ce5b8f08351e556807b" Dec 03 08:16:23 crc kubenswrapper[4475]: I1203 08:16:23.347834 4475 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"91558571fdc19e371d9f3bc5d2c2d2954c3d20588aea9ce5b8f08351e556807b"} err="failed to get container status \"91558571fdc19e371d9f3bc5d2c2d2954c3d20588aea9ce5b8f08351e556807b\": rpc error: code = NotFound desc = could not find container \"91558571fdc19e371d9f3bc5d2c2d2954c3d20588aea9ce5b8f08351e556807b\": container with ID starting with 91558571fdc19e371d9f3bc5d2c2d2954c3d20588aea9ce5b8f08351e556807b not found: ID does 
not exist" Dec 03 08:16:23 crc kubenswrapper[4475]: I1203 08:16:23.347855 4475 scope.go:117] "RemoveContainer" containerID="238608a96cd6daf7a5f2d1979b90bcb1fcd5cd0561082e66a4a439d9bc163d6a" Dec 03 08:16:23 crc kubenswrapper[4475]: E1203 08:16:23.348216 4475 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"238608a96cd6daf7a5f2d1979b90bcb1fcd5cd0561082e66a4a439d9bc163d6a\": container with ID starting with 238608a96cd6daf7a5f2d1979b90bcb1fcd5cd0561082e66a4a439d9bc163d6a not found: ID does not exist" containerID="238608a96cd6daf7a5f2d1979b90bcb1fcd5cd0561082e66a4a439d9bc163d6a" Dec 03 08:16:23 crc kubenswrapper[4475]: I1203 08:16:23.348236 4475 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"238608a96cd6daf7a5f2d1979b90bcb1fcd5cd0561082e66a4a439d9bc163d6a"} err="failed to get container status \"238608a96cd6daf7a5f2d1979b90bcb1fcd5cd0561082e66a4a439d9bc163d6a\": rpc error: code = NotFound desc = could not find container \"238608a96cd6daf7a5f2d1979b90bcb1fcd5cd0561082e66a4a439d9bc163d6a\": container with ID starting with 238608a96cd6daf7a5f2d1979b90bcb1fcd5cd0561082e66a4a439d9bc163d6a not found: ID does not exist" Dec 03 08:16:23 crc kubenswrapper[4475]: I1203 08:16:23.348250 4475 scope.go:117] "RemoveContainer" containerID="27087f224c6174f982ad584de3f370589f04d5db5477f5c5409b6b55502d784b" Dec 03 08:16:23 crc kubenswrapper[4475]: E1203 08:16:23.348739 4475 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"27087f224c6174f982ad584de3f370589f04d5db5477f5c5409b6b55502d784b\": container with ID starting with 27087f224c6174f982ad584de3f370589f04d5db5477f5c5409b6b55502d784b not found: ID does not exist" containerID="27087f224c6174f982ad584de3f370589f04d5db5477f5c5409b6b55502d784b" Dec 03 08:16:23 crc kubenswrapper[4475]: I1203 08:16:23.348790 4475 pod_container_deletor.go:53] 
"DeleteContainer returned error" containerID={"Type":"cri-o","ID":"27087f224c6174f982ad584de3f370589f04d5db5477f5c5409b6b55502d784b"} err="failed to get container status \"27087f224c6174f982ad584de3f370589f04d5db5477f5c5409b6b55502d784b\": rpc error: code = NotFound desc = could not find container \"27087f224c6174f982ad584de3f370589f04d5db5477f5c5409b6b55502d784b\": container with ID starting with 27087f224c6174f982ad584de3f370589f04d5db5477f5c5409b6b55502d784b not found: ID does not exist" Dec 03 08:16:23 crc kubenswrapper[4475]: I1203 08:16:23.502143 4475 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e3e91348-acb2-46f4-8e29-b770e1e4bd24" path="/var/lib/kubelet/pods/e3e91348-acb2-46f4-8e29-b770e1e4bd24/volumes" Dec 03 08:16:36 crc kubenswrapper[4475]: I1203 08:16:36.491506 4475 scope.go:117] "RemoveContainer" containerID="292f72da4b193a86cc630250466734e7dc7d208705ab2998ede991d595170052" Dec 03 08:16:36 crc kubenswrapper[4475]: E1203 08:16:36.492509 4475 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-tjbzg_openshift-machine-config-operator(91aee7be-4a52-4598-803f-2deebe0674de)\"" pod="openshift-machine-config-operator/machine-config-daemon-tjbzg" podUID="91aee7be-4a52-4598-803f-2deebe0674de" Dec 03 08:16:44 crc kubenswrapper[4475]: I1203 08:16:44.857727 4475 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-f7txz"] Dec 03 08:16:44 crc kubenswrapper[4475]: E1203 08:16:44.859138 4475 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e3e91348-acb2-46f4-8e29-b770e1e4bd24" containerName="extract-content" Dec 03 08:16:44 crc kubenswrapper[4475]: I1203 08:16:44.859156 4475 state_mem.go:107] "Deleted CPUSet assignment" podUID="e3e91348-acb2-46f4-8e29-b770e1e4bd24" containerName="extract-content" Dec 03 
08:16:44 crc kubenswrapper[4475]: E1203 08:16:44.859226 4475 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e3e91348-acb2-46f4-8e29-b770e1e4bd24" containerName="registry-server" Dec 03 08:16:44 crc kubenswrapper[4475]: I1203 08:16:44.859233 4475 state_mem.go:107] "Deleted CPUSet assignment" podUID="e3e91348-acb2-46f4-8e29-b770e1e4bd24" containerName="registry-server" Dec 03 08:16:44 crc kubenswrapper[4475]: E1203 08:16:44.859276 4475 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e3e91348-acb2-46f4-8e29-b770e1e4bd24" containerName="extract-utilities" Dec 03 08:16:44 crc kubenswrapper[4475]: I1203 08:16:44.859284 4475 state_mem.go:107] "Deleted CPUSet assignment" podUID="e3e91348-acb2-46f4-8e29-b770e1e4bd24" containerName="extract-utilities" Dec 03 08:16:44 crc kubenswrapper[4475]: I1203 08:16:44.859619 4475 memory_manager.go:354] "RemoveStaleState removing state" podUID="e3e91348-acb2-46f4-8e29-b770e1e4bd24" containerName="registry-server" Dec 03 08:16:44 crc kubenswrapper[4475]: I1203 08:16:44.862661 4475 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-f7txz" Dec 03 08:16:44 crc kubenswrapper[4475]: I1203 08:16:44.881278 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/03713bcf-11ab-47ca-819e-d6693b9e4655-catalog-content\") pod \"community-operators-f7txz\" (UID: \"03713bcf-11ab-47ca-819e-d6693b9e4655\") " pod="openshift-marketplace/community-operators-f7txz" Dec 03 08:16:44 crc kubenswrapper[4475]: I1203 08:16:44.881637 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/03713bcf-11ab-47ca-819e-d6693b9e4655-utilities\") pod \"community-operators-f7txz\" (UID: \"03713bcf-11ab-47ca-819e-d6693b9e4655\") " pod="openshift-marketplace/community-operators-f7txz" Dec 03 08:16:44 crc kubenswrapper[4475]: I1203 08:16:44.881782 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-94fr8\" (UniqueName: \"kubernetes.io/projected/03713bcf-11ab-47ca-819e-d6693b9e4655-kube-api-access-94fr8\") pod \"community-operators-f7txz\" (UID: \"03713bcf-11ab-47ca-819e-d6693b9e4655\") " pod="openshift-marketplace/community-operators-f7txz" Dec 03 08:16:44 crc kubenswrapper[4475]: I1203 08:16:44.887959 4475 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-f7txz"] Dec 03 08:16:44 crc kubenswrapper[4475]: I1203 08:16:44.983184 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/03713bcf-11ab-47ca-819e-d6693b9e4655-catalog-content\") pod \"community-operators-f7txz\" (UID: \"03713bcf-11ab-47ca-819e-d6693b9e4655\") " pod="openshift-marketplace/community-operators-f7txz" Dec 03 08:16:44 crc kubenswrapper[4475]: I1203 08:16:44.983256 4475 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/03713bcf-11ab-47ca-819e-d6693b9e4655-utilities\") pod \"community-operators-f7txz\" (UID: \"03713bcf-11ab-47ca-819e-d6693b9e4655\") " pod="openshift-marketplace/community-operators-f7txz" Dec 03 08:16:44 crc kubenswrapper[4475]: I1203 08:16:44.983295 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-94fr8\" (UniqueName: \"kubernetes.io/projected/03713bcf-11ab-47ca-819e-d6693b9e4655-kube-api-access-94fr8\") pod \"community-operators-f7txz\" (UID: \"03713bcf-11ab-47ca-819e-d6693b9e4655\") " pod="openshift-marketplace/community-operators-f7txz" Dec 03 08:16:44 crc kubenswrapper[4475]: I1203 08:16:44.983833 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/03713bcf-11ab-47ca-819e-d6693b9e4655-utilities\") pod \"community-operators-f7txz\" (UID: \"03713bcf-11ab-47ca-819e-d6693b9e4655\") " pod="openshift-marketplace/community-operators-f7txz" Dec 03 08:16:44 crc kubenswrapper[4475]: I1203 08:16:44.983914 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/03713bcf-11ab-47ca-819e-d6693b9e4655-catalog-content\") pod \"community-operators-f7txz\" (UID: \"03713bcf-11ab-47ca-819e-d6693b9e4655\") " pod="openshift-marketplace/community-operators-f7txz" Dec 03 08:16:45 crc kubenswrapper[4475]: I1203 08:16:45.003113 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-94fr8\" (UniqueName: \"kubernetes.io/projected/03713bcf-11ab-47ca-819e-d6693b9e4655-kube-api-access-94fr8\") pod \"community-operators-f7txz\" (UID: \"03713bcf-11ab-47ca-819e-d6693b9e4655\") " pod="openshift-marketplace/community-operators-f7txz" Dec 03 08:16:45 crc kubenswrapper[4475]: I1203 08:16:45.183891 4475 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-f7txz" Dec 03 08:16:45 crc kubenswrapper[4475]: I1203 08:16:45.642644 4475 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-f7txz"] Dec 03 08:16:46 crc kubenswrapper[4475]: I1203 08:16:46.486322 4475 generic.go:334] "Generic (PLEG): container finished" podID="03713bcf-11ab-47ca-819e-d6693b9e4655" containerID="da9b6c3bb968aedfa0e5e96dbcdabdbecc7aab27cee95bd4345ec1542d0535b6" exitCode=0 Dec 03 08:16:46 crc kubenswrapper[4475]: I1203 08:16:46.486624 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-f7txz" event={"ID":"03713bcf-11ab-47ca-819e-d6693b9e4655","Type":"ContainerDied","Data":"da9b6c3bb968aedfa0e5e96dbcdabdbecc7aab27cee95bd4345ec1542d0535b6"} Dec 03 08:16:46 crc kubenswrapper[4475]: I1203 08:16:46.486702 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-f7txz" event={"ID":"03713bcf-11ab-47ca-819e-d6693b9e4655","Type":"ContainerStarted","Data":"f27c734e33392f0ddbb09ea6ec132161971690213825f843a4baf5e2b327fa89"} Dec 03 08:16:47 crc kubenswrapper[4475]: I1203 08:16:47.502413 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-f7txz" event={"ID":"03713bcf-11ab-47ca-819e-d6693b9e4655","Type":"ContainerStarted","Data":"72f7010a5cd3cfbd72451ce86d234ebf63846f9b5c87da59f45fa1e726c52829"} Dec 03 08:16:48 crc kubenswrapper[4475]: I1203 08:16:48.491901 4475 scope.go:117] "RemoveContainer" containerID="292f72da4b193a86cc630250466734e7dc7d208705ab2998ede991d595170052" Dec 03 08:16:48 crc kubenswrapper[4475]: E1203 08:16:48.492494 4475 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-tjbzg_openshift-machine-config-operator(91aee7be-4a52-4598-803f-2deebe0674de)\"" pod="openshift-machine-config-operator/machine-config-daemon-tjbzg" podUID="91aee7be-4a52-4598-803f-2deebe0674de" Dec 03 08:16:49 crc kubenswrapper[4475]: I1203 08:16:49.516441 4475 generic.go:334] "Generic (PLEG): container finished" podID="03713bcf-11ab-47ca-819e-d6693b9e4655" containerID="72f7010a5cd3cfbd72451ce86d234ebf63846f9b5c87da59f45fa1e726c52829" exitCode=0 Dec 03 08:16:49 crc kubenswrapper[4475]: I1203 08:16:49.516489 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-f7txz" event={"ID":"03713bcf-11ab-47ca-819e-d6693b9e4655","Type":"ContainerDied","Data":"72f7010a5cd3cfbd72451ce86d234ebf63846f9b5c87da59f45fa1e726c52829"} Dec 03 08:16:50 crc kubenswrapper[4475]: I1203 08:16:50.528297 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-f7txz" event={"ID":"03713bcf-11ab-47ca-819e-d6693b9e4655","Type":"ContainerStarted","Data":"f600fa54f556454f08355fb6a1e6525b363adf48e3e8c64413eb3f105b3456d1"} Dec 03 08:16:50 crc kubenswrapper[4475]: I1203 08:16:50.546998 4475 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-f7txz" podStartSLOduration=3.000921171 podStartE2EDuration="6.546977612s" podCreationTimestamp="2025-12-03 08:16:44 +0000 UTC" firstStartedPulling="2025-12-03 08:16:46.491006291 +0000 UTC m=+5491.295904625" lastFinishedPulling="2025-12-03 08:16:50.037062732 +0000 UTC m=+5494.841961066" observedRunningTime="2025-12-03 08:16:50.545685252 +0000 UTC m=+5495.350583587" watchObservedRunningTime="2025-12-03 08:16:50.546977612 +0000 UTC m=+5495.351875946" Dec 03 08:16:55 crc kubenswrapper[4475]: I1203 08:16:55.186617 4475 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-f7txz" Dec 03 08:16:55 crc kubenswrapper[4475]: I1203 
08:16:55.188321 4475 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-f7txz" Dec 03 08:16:55 crc kubenswrapper[4475]: I1203 08:16:55.332481 4475 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-f7txz" Dec 03 08:16:55 crc kubenswrapper[4475]: I1203 08:16:55.618819 4475 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-f7txz" Dec 03 08:16:55 crc kubenswrapper[4475]: I1203 08:16:55.665757 4475 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-f7txz"] Dec 03 08:16:57 crc kubenswrapper[4475]: I1203 08:16:57.589735 4475 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-f7txz" podUID="03713bcf-11ab-47ca-819e-d6693b9e4655" containerName="registry-server" containerID="cri-o://f600fa54f556454f08355fb6a1e6525b363adf48e3e8c64413eb3f105b3456d1" gracePeriod=2 Dec 03 08:16:57 crc kubenswrapper[4475]: E1203 08:16:57.902781 4475 kubelet_node_status.go:756] "Failed to set some node status fields" err="failed to validate nodeIP: route ip+net: no such network interface" node="crc" Dec 03 08:16:58 crc kubenswrapper[4475]: I1203 08:16:58.087313 4475 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-f7txz" Dec 03 08:16:58 crc kubenswrapper[4475]: I1203 08:16:58.218703 4475 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-94fr8\" (UniqueName: \"kubernetes.io/projected/03713bcf-11ab-47ca-819e-d6693b9e4655-kube-api-access-94fr8\") pod \"03713bcf-11ab-47ca-819e-d6693b9e4655\" (UID: \"03713bcf-11ab-47ca-819e-d6693b9e4655\") " Dec 03 08:16:58 crc kubenswrapper[4475]: I1203 08:16:58.218788 4475 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/03713bcf-11ab-47ca-819e-d6693b9e4655-catalog-content\") pod \"03713bcf-11ab-47ca-819e-d6693b9e4655\" (UID: \"03713bcf-11ab-47ca-819e-d6693b9e4655\") " Dec 03 08:16:58 crc kubenswrapper[4475]: I1203 08:16:58.218838 4475 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/03713bcf-11ab-47ca-819e-d6693b9e4655-utilities\") pod \"03713bcf-11ab-47ca-819e-d6693b9e4655\" (UID: \"03713bcf-11ab-47ca-819e-d6693b9e4655\") " Dec 03 08:16:58 crc kubenswrapper[4475]: I1203 08:16:58.219893 4475 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/03713bcf-11ab-47ca-819e-d6693b9e4655-utilities" (OuterVolumeSpecName: "utilities") pod "03713bcf-11ab-47ca-819e-d6693b9e4655" (UID: "03713bcf-11ab-47ca-819e-d6693b9e4655"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 08:16:58 crc kubenswrapper[4475]: I1203 08:16:58.226973 4475 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/03713bcf-11ab-47ca-819e-d6693b9e4655-kube-api-access-94fr8" (OuterVolumeSpecName: "kube-api-access-94fr8") pod "03713bcf-11ab-47ca-819e-d6693b9e4655" (UID: "03713bcf-11ab-47ca-819e-d6693b9e4655"). InnerVolumeSpecName "kube-api-access-94fr8". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 08:16:58 crc kubenswrapper[4475]: I1203 08:16:58.262513 4475 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/03713bcf-11ab-47ca-819e-d6693b9e4655-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "03713bcf-11ab-47ca-819e-d6693b9e4655" (UID: "03713bcf-11ab-47ca-819e-d6693b9e4655"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 08:16:58 crc kubenswrapper[4475]: I1203 08:16:58.323106 4475 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-94fr8\" (UniqueName: \"kubernetes.io/projected/03713bcf-11ab-47ca-819e-d6693b9e4655-kube-api-access-94fr8\") on node \"crc\" DevicePath \"\"" Dec 03 08:16:58 crc kubenswrapper[4475]: I1203 08:16:58.323143 4475 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/03713bcf-11ab-47ca-819e-d6693b9e4655-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 03 08:16:58 crc kubenswrapper[4475]: I1203 08:16:58.323170 4475 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/03713bcf-11ab-47ca-819e-d6693b9e4655-utilities\") on node \"crc\" DevicePath \"\"" Dec 03 08:16:58 crc kubenswrapper[4475]: I1203 08:16:58.603925 4475 generic.go:334] "Generic (PLEG): container finished" podID="03713bcf-11ab-47ca-819e-d6693b9e4655" containerID="f600fa54f556454f08355fb6a1e6525b363adf48e3e8c64413eb3f105b3456d1" exitCode=0 Dec 03 08:16:58 crc kubenswrapper[4475]: I1203 08:16:58.603988 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-f7txz" event={"ID":"03713bcf-11ab-47ca-819e-d6693b9e4655","Type":"ContainerDied","Data":"f600fa54f556454f08355fb6a1e6525b363adf48e3e8c64413eb3f105b3456d1"} Dec 03 08:16:58 crc kubenswrapper[4475]: I1203 08:16:58.604026 4475 util.go:48] "No ready sandbox for pod can 
be found. Need to start a new one" pod="openshift-marketplace/community-operators-f7txz" Dec 03 08:16:58 crc kubenswrapper[4475]: I1203 08:16:58.604051 4475 scope.go:117] "RemoveContainer" containerID="f600fa54f556454f08355fb6a1e6525b363adf48e3e8c64413eb3f105b3456d1" Dec 03 08:16:58 crc kubenswrapper[4475]: I1203 08:16:58.604035 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-f7txz" event={"ID":"03713bcf-11ab-47ca-819e-d6693b9e4655","Type":"ContainerDied","Data":"f27c734e33392f0ddbb09ea6ec132161971690213825f843a4baf5e2b327fa89"} Dec 03 08:16:58 crc kubenswrapper[4475]: I1203 08:16:58.634228 4475 scope.go:117] "RemoveContainer" containerID="72f7010a5cd3cfbd72451ce86d234ebf63846f9b5c87da59f45fa1e726c52829" Dec 03 08:16:58 crc kubenswrapper[4475]: I1203 08:16:58.650127 4475 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-f7txz"] Dec 03 08:16:58 crc kubenswrapper[4475]: I1203 08:16:58.658409 4475 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-f7txz"] Dec 03 08:16:58 crc kubenswrapper[4475]: I1203 08:16:58.659991 4475 scope.go:117] "RemoveContainer" containerID="da9b6c3bb968aedfa0e5e96dbcdabdbecc7aab27cee95bd4345ec1542d0535b6" Dec 03 08:16:58 crc kubenswrapper[4475]: I1203 08:16:58.699738 4475 scope.go:117] "RemoveContainer" containerID="f600fa54f556454f08355fb6a1e6525b363adf48e3e8c64413eb3f105b3456d1" Dec 03 08:16:58 crc kubenswrapper[4475]: E1203 08:16:58.700477 4475 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f600fa54f556454f08355fb6a1e6525b363adf48e3e8c64413eb3f105b3456d1\": container with ID starting with f600fa54f556454f08355fb6a1e6525b363adf48e3e8c64413eb3f105b3456d1 not found: ID does not exist" containerID="f600fa54f556454f08355fb6a1e6525b363adf48e3e8c64413eb3f105b3456d1" Dec 03 08:16:58 crc kubenswrapper[4475]: I1203 08:16:58.700531 
4475 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f600fa54f556454f08355fb6a1e6525b363adf48e3e8c64413eb3f105b3456d1"} err="failed to get container status \"f600fa54f556454f08355fb6a1e6525b363adf48e3e8c64413eb3f105b3456d1\": rpc error: code = NotFound desc = could not find container \"f600fa54f556454f08355fb6a1e6525b363adf48e3e8c64413eb3f105b3456d1\": container with ID starting with f600fa54f556454f08355fb6a1e6525b363adf48e3e8c64413eb3f105b3456d1 not found: ID does not exist" Dec 03 08:16:58 crc kubenswrapper[4475]: I1203 08:16:58.700565 4475 scope.go:117] "RemoveContainer" containerID="72f7010a5cd3cfbd72451ce86d234ebf63846f9b5c87da59f45fa1e726c52829" Dec 03 08:16:58 crc kubenswrapper[4475]: E1203 08:16:58.700943 4475 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"72f7010a5cd3cfbd72451ce86d234ebf63846f9b5c87da59f45fa1e726c52829\": container with ID starting with 72f7010a5cd3cfbd72451ce86d234ebf63846f9b5c87da59f45fa1e726c52829 not found: ID does not exist" containerID="72f7010a5cd3cfbd72451ce86d234ebf63846f9b5c87da59f45fa1e726c52829" Dec 03 08:16:58 crc kubenswrapper[4475]: I1203 08:16:58.700968 4475 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"72f7010a5cd3cfbd72451ce86d234ebf63846f9b5c87da59f45fa1e726c52829"} err="failed to get container status \"72f7010a5cd3cfbd72451ce86d234ebf63846f9b5c87da59f45fa1e726c52829\": rpc error: code = NotFound desc = could not find container \"72f7010a5cd3cfbd72451ce86d234ebf63846f9b5c87da59f45fa1e726c52829\": container with ID starting with 72f7010a5cd3cfbd72451ce86d234ebf63846f9b5c87da59f45fa1e726c52829 not found: ID does not exist" Dec 03 08:16:58 crc kubenswrapper[4475]: I1203 08:16:58.700985 4475 scope.go:117] "RemoveContainer" containerID="da9b6c3bb968aedfa0e5e96dbcdabdbecc7aab27cee95bd4345ec1542d0535b6" Dec 03 08:16:58 crc kubenswrapper[4475]: E1203 
08:16:58.701437 4475 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"da9b6c3bb968aedfa0e5e96dbcdabdbecc7aab27cee95bd4345ec1542d0535b6\": container with ID starting with da9b6c3bb968aedfa0e5e96dbcdabdbecc7aab27cee95bd4345ec1542d0535b6 not found: ID does not exist" containerID="da9b6c3bb968aedfa0e5e96dbcdabdbecc7aab27cee95bd4345ec1542d0535b6" Dec 03 08:16:58 crc kubenswrapper[4475]: I1203 08:16:58.701493 4475 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"da9b6c3bb968aedfa0e5e96dbcdabdbecc7aab27cee95bd4345ec1542d0535b6"} err="failed to get container status \"da9b6c3bb968aedfa0e5e96dbcdabdbecc7aab27cee95bd4345ec1542d0535b6\": rpc error: code = NotFound desc = could not find container \"da9b6c3bb968aedfa0e5e96dbcdabdbecc7aab27cee95bd4345ec1542d0535b6\": container with ID starting with da9b6c3bb968aedfa0e5e96dbcdabdbecc7aab27cee95bd4345ec1542d0535b6 not found: ID does not exist" Dec 03 08:16:59 crc kubenswrapper[4475]: I1203 08:16:59.513881 4475 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="03713bcf-11ab-47ca-819e-d6693b9e4655" path="/var/lib/kubelet/pods/03713bcf-11ab-47ca-819e-d6693b9e4655/volumes" Dec 03 08:17:03 crc kubenswrapper[4475]: I1203 08:17:03.492659 4475 scope.go:117] "RemoveContainer" containerID="292f72da4b193a86cc630250466734e7dc7d208705ab2998ede991d595170052" Dec 03 08:17:03 crc kubenswrapper[4475]: E1203 08:17:03.493623 4475 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-tjbzg_openshift-machine-config-operator(91aee7be-4a52-4598-803f-2deebe0674de)\"" pod="openshift-machine-config-operator/machine-config-daemon-tjbzg" podUID="91aee7be-4a52-4598-803f-2deebe0674de" Dec 03 08:17:08 crc kubenswrapper[4475]: E1203 08:17:08.457748 
4475 upgradeaware.go:441] Error proxying data from backend to client: writeto tcp 192.168.25.177:59112->192.168.25.177:40263: read tcp 192.168.25.177:59112->192.168.25.177:40263: read: connection reset by peer Dec 03 08:17:14 crc kubenswrapper[4475]: I1203 08:17:14.491790 4475 scope.go:117] "RemoveContainer" containerID="292f72da4b193a86cc630250466734e7dc7d208705ab2998ede991d595170052" Dec 03 08:17:14 crc kubenswrapper[4475]: E1203 08:17:14.492596 4475 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-tjbzg_openshift-machine-config-operator(91aee7be-4a52-4598-803f-2deebe0674de)\"" pod="openshift-machine-config-operator/machine-config-daemon-tjbzg" podUID="91aee7be-4a52-4598-803f-2deebe0674de" Dec 03 08:17:29 crc kubenswrapper[4475]: I1203 08:17:29.492365 4475 scope.go:117] "RemoveContainer" containerID="292f72da4b193a86cc630250466734e7dc7d208705ab2998ede991d595170052" Dec 03 08:17:29 crc kubenswrapper[4475]: E1203 08:17:29.493195 4475 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-tjbzg_openshift-machine-config-operator(91aee7be-4a52-4598-803f-2deebe0674de)\"" pod="openshift-machine-config-operator/machine-config-daemon-tjbzg" podUID="91aee7be-4a52-4598-803f-2deebe0674de" Dec 03 08:17:44 crc kubenswrapper[4475]: I1203 08:17:44.491448 4475 scope.go:117] "RemoveContainer" containerID="292f72da4b193a86cc630250466734e7dc7d208705ab2998ede991d595170052" Dec 03 08:17:44 crc kubenswrapper[4475]: E1203 08:17:44.493712 4475 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed 
container=machine-config-daemon pod=machine-config-daemon-tjbzg_openshift-machine-config-operator(91aee7be-4a52-4598-803f-2deebe0674de)\"" pod="openshift-machine-config-operator/machine-config-daemon-tjbzg" podUID="91aee7be-4a52-4598-803f-2deebe0674de" Dec 03 08:17:57 crc kubenswrapper[4475]: I1203 08:17:57.491818 4475 scope.go:117] "RemoveContainer" containerID="292f72da4b193a86cc630250466734e7dc7d208705ab2998ede991d595170052" Dec 03 08:17:57 crc kubenswrapper[4475]: E1203 08:17:57.493643 4475 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-tjbzg_openshift-machine-config-operator(91aee7be-4a52-4598-803f-2deebe0674de)\"" pod="openshift-machine-config-operator/machine-config-daemon-tjbzg" podUID="91aee7be-4a52-4598-803f-2deebe0674de" Dec 03 08:18:09 crc kubenswrapper[4475]: I1203 08:18:09.492105 4475 scope.go:117] "RemoveContainer" containerID="292f72da4b193a86cc630250466734e7dc7d208705ab2998ede991d595170052" Dec 03 08:18:09 crc kubenswrapper[4475]: E1203 08:18:09.493183 4475 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-tjbzg_openshift-machine-config-operator(91aee7be-4a52-4598-803f-2deebe0674de)\"" pod="openshift-machine-config-operator/machine-config-daemon-tjbzg" podUID="91aee7be-4a52-4598-803f-2deebe0674de" Dec 03 08:18:20 crc kubenswrapper[4475]: I1203 08:18:20.491812 4475 scope.go:117] "RemoveContainer" containerID="292f72da4b193a86cc630250466734e7dc7d208705ab2998ede991d595170052" Dec 03 08:18:20 crc kubenswrapper[4475]: E1203 08:18:20.492991 4475 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s 
restarting failed container=machine-config-daemon pod=machine-config-daemon-tjbzg_openshift-machine-config-operator(91aee7be-4a52-4598-803f-2deebe0674de)\"" pod="openshift-machine-config-operator/machine-config-daemon-tjbzg" podUID="91aee7be-4a52-4598-803f-2deebe0674de" Dec 03 08:18:32 crc kubenswrapper[4475]: I1203 08:18:32.492166 4475 scope.go:117] "RemoveContainer" containerID="292f72da4b193a86cc630250466734e7dc7d208705ab2998ede991d595170052" Dec 03 08:18:33 crc kubenswrapper[4475]: I1203 08:18:33.533801 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-tjbzg" event={"ID":"91aee7be-4a52-4598-803f-2deebe0674de","Type":"ContainerStarted","Data":"5c0490ae400c9513b0649abb7be111a32bf5b73b716e1c806a1c2404339d3253"} Dec 03 08:20:58 crc kubenswrapper[4475]: I1203 08:20:58.934005 4475 patch_prober.go:28] interesting pod/machine-config-daemon-tjbzg container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 03 08:20:58 crc kubenswrapper[4475]: I1203 08:20:58.935497 4475 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-tjbzg" podUID="91aee7be-4a52-4598-803f-2deebe0674de" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 03 08:21:28 crc kubenswrapper[4475]: I1203 08:21:28.933797 4475 patch_prober.go:28] interesting pod/machine-config-daemon-tjbzg container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 03 08:21:28 crc kubenswrapper[4475]: I1203 08:21:28.934433 4475 prober.go:107] "Probe failed" 
probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-tjbzg" podUID="91aee7be-4a52-4598-803f-2deebe0674de" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 03 08:21:58 crc kubenswrapper[4475]: I1203 08:21:58.933744 4475 patch_prober.go:28] interesting pod/machine-config-daemon-tjbzg container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 03 08:21:58 crc kubenswrapper[4475]: I1203 08:21:58.934203 4475 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-tjbzg" podUID="91aee7be-4a52-4598-803f-2deebe0674de" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 03 08:21:58 crc kubenswrapper[4475]: I1203 08:21:58.934247 4475 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-tjbzg" Dec 03 08:21:58 crc kubenswrapper[4475]: I1203 08:21:58.935107 4475 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"5c0490ae400c9513b0649abb7be111a32bf5b73b716e1c806a1c2404339d3253"} pod="openshift-machine-config-operator/machine-config-daemon-tjbzg" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Dec 03 08:21:58 crc kubenswrapper[4475]: I1203 08:21:58.935166 4475 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-tjbzg" podUID="91aee7be-4a52-4598-803f-2deebe0674de" containerName="machine-config-daemon" 
containerID="cri-o://5c0490ae400c9513b0649abb7be111a32bf5b73b716e1c806a1c2404339d3253" gracePeriod=600 Dec 03 08:21:59 crc kubenswrapper[4475]: I1203 08:21:59.426758 4475 generic.go:334] "Generic (PLEG): container finished" podID="91aee7be-4a52-4598-803f-2deebe0674de" containerID="5c0490ae400c9513b0649abb7be111a32bf5b73b716e1c806a1c2404339d3253" exitCode=0 Dec 03 08:21:59 crc kubenswrapper[4475]: I1203 08:21:59.426835 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-tjbzg" event={"ID":"91aee7be-4a52-4598-803f-2deebe0674de","Type":"ContainerDied","Data":"5c0490ae400c9513b0649abb7be111a32bf5b73b716e1c806a1c2404339d3253"} Dec 03 08:21:59 crc kubenswrapper[4475]: I1203 08:21:59.426993 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-tjbzg" event={"ID":"91aee7be-4a52-4598-803f-2deebe0674de","Type":"ContainerStarted","Data":"ead2b8ca071f1b77f440239ac4c143069a67fc366c29018f3071d24754d8703b"} Dec 03 08:21:59 crc kubenswrapper[4475]: I1203 08:21:59.427017 4475 scope.go:117] "RemoveContainer" containerID="292f72da4b193a86cc630250466734e7dc7d208705ab2998ede991d595170052" Dec 03 08:23:20 crc kubenswrapper[4475]: I1203 08:23:20.282971 4475 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-jqwk8"] Dec 03 08:23:20 crc kubenswrapper[4475]: E1203 08:23:20.283769 4475 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="03713bcf-11ab-47ca-819e-d6693b9e4655" containerName="extract-utilities" Dec 03 08:23:20 crc kubenswrapper[4475]: I1203 08:23:20.283783 4475 state_mem.go:107] "Deleted CPUSet assignment" podUID="03713bcf-11ab-47ca-819e-d6693b9e4655" containerName="extract-utilities" Dec 03 08:23:20 crc kubenswrapper[4475]: E1203 08:23:20.283828 4475 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="03713bcf-11ab-47ca-819e-d6693b9e4655" containerName="registry-server" Dec 03 
08:23:20 crc kubenswrapper[4475]: I1203 08:23:20.283834 4475 state_mem.go:107] "Deleted CPUSet assignment" podUID="03713bcf-11ab-47ca-819e-d6693b9e4655" containerName="registry-server" Dec 03 08:23:20 crc kubenswrapper[4475]: E1203 08:23:20.283847 4475 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="03713bcf-11ab-47ca-819e-d6693b9e4655" containerName="extract-content" Dec 03 08:23:20 crc kubenswrapper[4475]: I1203 08:23:20.283853 4475 state_mem.go:107] "Deleted CPUSet assignment" podUID="03713bcf-11ab-47ca-819e-d6693b9e4655" containerName="extract-content" Dec 03 08:23:20 crc kubenswrapper[4475]: I1203 08:23:20.284048 4475 memory_manager.go:354] "RemoveStaleState removing state" podUID="03713bcf-11ab-47ca-819e-d6693b9e4655" containerName="registry-server" Dec 03 08:23:20 crc kubenswrapper[4475]: I1203 08:23:20.285472 4475 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-jqwk8" Dec 03 08:23:20 crc kubenswrapper[4475]: I1203 08:23:20.298174 4475 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-jqwk8"] Dec 03 08:23:20 crc kubenswrapper[4475]: I1203 08:23:20.330339 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f7d7s\" (UniqueName: \"kubernetes.io/projected/7e00e7eb-72e6-4a52-93cc-8f76160a4f3b-kube-api-access-f7d7s\") pod \"redhat-marketplace-jqwk8\" (UID: \"7e00e7eb-72e6-4a52-93cc-8f76160a4f3b\") " pod="openshift-marketplace/redhat-marketplace-jqwk8" Dec 03 08:23:20 crc kubenswrapper[4475]: I1203 08:23:20.330518 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7e00e7eb-72e6-4a52-93cc-8f76160a4f3b-catalog-content\") pod \"redhat-marketplace-jqwk8\" (UID: \"7e00e7eb-72e6-4a52-93cc-8f76160a4f3b\") " pod="openshift-marketplace/redhat-marketplace-jqwk8" Dec 03 
08:23:20 crc kubenswrapper[4475]: I1203 08:23:20.330651 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7e00e7eb-72e6-4a52-93cc-8f76160a4f3b-utilities\") pod \"redhat-marketplace-jqwk8\" (UID: \"7e00e7eb-72e6-4a52-93cc-8f76160a4f3b\") " pod="openshift-marketplace/redhat-marketplace-jqwk8" Dec 03 08:23:20 crc kubenswrapper[4475]: I1203 08:23:20.433269 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7e00e7eb-72e6-4a52-93cc-8f76160a4f3b-utilities\") pod \"redhat-marketplace-jqwk8\" (UID: \"7e00e7eb-72e6-4a52-93cc-8f76160a4f3b\") " pod="openshift-marketplace/redhat-marketplace-jqwk8" Dec 03 08:23:20 crc kubenswrapper[4475]: I1203 08:23:20.433468 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-f7d7s\" (UniqueName: \"kubernetes.io/projected/7e00e7eb-72e6-4a52-93cc-8f76160a4f3b-kube-api-access-f7d7s\") pod \"redhat-marketplace-jqwk8\" (UID: \"7e00e7eb-72e6-4a52-93cc-8f76160a4f3b\") " pod="openshift-marketplace/redhat-marketplace-jqwk8" Dec 03 08:23:20 crc kubenswrapper[4475]: I1203 08:23:20.433607 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7e00e7eb-72e6-4a52-93cc-8f76160a4f3b-catalog-content\") pod \"redhat-marketplace-jqwk8\" (UID: \"7e00e7eb-72e6-4a52-93cc-8f76160a4f3b\") " pod="openshift-marketplace/redhat-marketplace-jqwk8" Dec 03 08:23:20 crc kubenswrapper[4475]: I1203 08:23:20.434560 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7e00e7eb-72e6-4a52-93cc-8f76160a4f3b-utilities\") pod \"redhat-marketplace-jqwk8\" (UID: \"7e00e7eb-72e6-4a52-93cc-8f76160a4f3b\") " pod="openshift-marketplace/redhat-marketplace-jqwk8" Dec 03 08:23:20 crc kubenswrapper[4475]: 
I1203 08:23:20.434655 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7e00e7eb-72e6-4a52-93cc-8f76160a4f3b-catalog-content\") pod \"redhat-marketplace-jqwk8\" (UID: \"7e00e7eb-72e6-4a52-93cc-8f76160a4f3b\") " pod="openshift-marketplace/redhat-marketplace-jqwk8" Dec 03 08:23:20 crc kubenswrapper[4475]: I1203 08:23:20.452736 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-f7d7s\" (UniqueName: \"kubernetes.io/projected/7e00e7eb-72e6-4a52-93cc-8f76160a4f3b-kube-api-access-f7d7s\") pod \"redhat-marketplace-jqwk8\" (UID: \"7e00e7eb-72e6-4a52-93cc-8f76160a4f3b\") " pod="openshift-marketplace/redhat-marketplace-jqwk8" Dec 03 08:23:20 crc kubenswrapper[4475]: I1203 08:23:20.605159 4475 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-jqwk8" Dec 03 08:23:21 crc kubenswrapper[4475]: I1203 08:23:21.235829 4475 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-jqwk8"] Dec 03 08:23:22 crc kubenswrapper[4475]: I1203 08:23:22.151734 4475 generic.go:334] "Generic (PLEG): container finished" podID="7e00e7eb-72e6-4a52-93cc-8f76160a4f3b" containerID="222934c5d72abadae834470ca58fa99b1bc7fda8c64442d959b549213dab5747" exitCode=0 Dec 03 08:23:22 crc kubenswrapper[4475]: I1203 08:23:22.151964 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-jqwk8" event={"ID":"7e00e7eb-72e6-4a52-93cc-8f76160a4f3b","Type":"ContainerDied","Data":"222934c5d72abadae834470ca58fa99b1bc7fda8c64442d959b549213dab5747"} Dec 03 08:23:22 crc kubenswrapper[4475]: I1203 08:23:22.152020 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-jqwk8" 
event={"ID":"7e00e7eb-72e6-4a52-93cc-8f76160a4f3b","Type":"ContainerStarted","Data":"357fcd9ffe860bfe64d2231f81726fc51b7259c212605f1d24b4e9c7e191963a"} Dec 03 08:23:22 crc kubenswrapper[4475]: I1203 08:23:22.154360 4475 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Dec 03 08:23:24 crc kubenswrapper[4475]: I1203 08:23:24.179184 4475 generic.go:334] "Generic (PLEG): container finished" podID="7e00e7eb-72e6-4a52-93cc-8f76160a4f3b" containerID="55ca9e027ab3f9c6fab4a029c271b3a6e797a160affdfd2df48ea70b69c26141" exitCode=0 Dec 03 08:23:24 crc kubenswrapper[4475]: I1203 08:23:24.179261 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-jqwk8" event={"ID":"7e00e7eb-72e6-4a52-93cc-8f76160a4f3b","Type":"ContainerDied","Data":"55ca9e027ab3f9c6fab4a029c271b3a6e797a160affdfd2df48ea70b69c26141"} Dec 03 08:23:24 crc kubenswrapper[4475]: E1203 08:23:24.481815 4475 upgradeaware.go:427] Error proxying data from client to backend: readfrom tcp 192.168.25.177:38334->192.168.25.177:40263: write tcp 192.168.25.177:38334->192.168.25.177:40263: write: connection reset by peer Dec 03 08:23:25 crc kubenswrapper[4475]: I1203 08:23:25.194164 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-jqwk8" event={"ID":"7e00e7eb-72e6-4a52-93cc-8f76160a4f3b","Type":"ContainerStarted","Data":"f5871afa30b4591d19dbd15bf2df8bd03f615f9e8d641fe14173c99bb1f342d2"} Dec 03 08:23:25 crc kubenswrapper[4475]: I1203 08:23:25.218478 4475 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-jqwk8" podStartSLOduration=2.685663589 podStartE2EDuration="5.218445852s" podCreationTimestamp="2025-12-03 08:23:20 +0000 UTC" firstStartedPulling="2025-12-03 08:23:22.153325291 +0000 UTC m=+5886.958223625" lastFinishedPulling="2025-12-03 08:23:24.686107564 +0000 UTC m=+5889.491005888" observedRunningTime="2025-12-03 
08:23:25.21051406 +0000 UTC m=+5890.015412404" watchObservedRunningTime="2025-12-03 08:23:25.218445852 +0000 UTC m=+5890.023344186" Dec 03 08:23:30 crc kubenswrapper[4475]: I1203 08:23:30.606568 4475 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-jqwk8" Dec 03 08:23:30 crc kubenswrapper[4475]: I1203 08:23:30.606937 4475 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-jqwk8" Dec 03 08:23:30 crc kubenswrapper[4475]: I1203 08:23:30.660440 4475 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-jqwk8" Dec 03 08:23:31 crc kubenswrapper[4475]: I1203 08:23:31.279356 4475 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-jqwk8" Dec 03 08:23:31 crc kubenswrapper[4475]: I1203 08:23:31.332756 4475 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-jqwk8"] Dec 03 08:23:33 crc kubenswrapper[4475]: I1203 08:23:33.261205 4475 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-jqwk8" podUID="7e00e7eb-72e6-4a52-93cc-8f76160a4f3b" containerName="registry-server" containerID="cri-o://f5871afa30b4591d19dbd15bf2df8bd03f615f9e8d641fe14173c99bb1f342d2" gracePeriod=2 Dec 03 08:23:33 crc kubenswrapper[4475]: I1203 08:23:33.796189 4475 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-jqwk8" Dec 03 08:23:33 crc kubenswrapper[4475]: I1203 08:23:33.921717 4475 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7e00e7eb-72e6-4a52-93cc-8f76160a4f3b-utilities\") pod \"7e00e7eb-72e6-4a52-93cc-8f76160a4f3b\" (UID: \"7e00e7eb-72e6-4a52-93cc-8f76160a4f3b\") " Dec 03 08:23:33 crc kubenswrapper[4475]: I1203 08:23:33.921886 4475 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-f7d7s\" (UniqueName: \"kubernetes.io/projected/7e00e7eb-72e6-4a52-93cc-8f76160a4f3b-kube-api-access-f7d7s\") pod \"7e00e7eb-72e6-4a52-93cc-8f76160a4f3b\" (UID: \"7e00e7eb-72e6-4a52-93cc-8f76160a4f3b\") " Dec 03 08:23:33 crc kubenswrapper[4475]: I1203 08:23:33.921947 4475 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7e00e7eb-72e6-4a52-93cc-8f76160a4f3b-catalog-content\") pod \"7e00e7eb-72e6-4a52-93cc-8f76160a4f3b\" (UID: \"7e00e7eb-72e6-4a52-93cc-8f76160a4f3b\") " Dec 03 08:23:33 crc kubenswrapper[4475]: I1203 08:23:33.922444 4475 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7e00e7eb-72e6-4a52-93cc-8f76160a4f3b-utilities" (OuterVolumeSpecName: "utilities") pod "7e00e7eb-72e6-4a52-93cc-8f76160a4f3b" (UID: "7e00e7eb-72e6-4a52-93cc-8f76160a4f3b"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 08:23:33 crc kubenswrapper[4475]: I1203 08:23:33.939746 4475 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7e00e7eb-72e6-4a52-93cc-8f76160a4f3b-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "7e00e7eb-72e6-4a52-93cc-8f76160a4f3b" (UID: "7e00e7eb-72e6-4a52-93cc-8f76160a4f3b"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 08:23:33 crc kubenswrapper[4475]: I1203 08:23:33.940393 4475 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7e00e7eb-72e6-4a52-93cc-8f76160a4f3b-kube-api-access-f7d7s" (OuterVolumeSpecName: "kube-api-access-f7d7s") pod "7e00e7eb-72e6-4a52-93cc-8f76160a4f3b" (UID: "7e00e7eb-72e6-4a52-93cc-8f76160a4f3b"). InnerVolumeSpecName "kube-api-access-f7d7s". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 08:23:34 crc kubenswrapper[4475]: I1203 08:23:34.025300 4475 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-f7d7s\" (UniqueName: \"kubernetes.io/projected/7e00e7eb-72e6-4a52-93cc-8f76160a4f3b-kube-api-access-f7d7s\") on node \"crc\" DevicePath \"\"" Dec 03 08:23:34 crc kubenswrapper[4475]: I1203 08:23:34.025328 4475 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7e00e7eb-72e6-4a52-93cc-8f76160a4f3b-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 03 08:23:34 crc kubenswrapper[4475]: I1203 08:23:34.025340 4475 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7e00e7eb-72e6-4a52-93cc-8f76160a4f3b-utilities\") on node \"crc\" DevicePath \"\"" Dec 03 08:23:34 crc kubenswrapper[4475]: I1203 08:23:34.276283 4475 generic.go:334] "Generic (PLEG): container finished" podID="7e00e7eb-72e6-4a52-93cc-8f76160a4f3b" containerID="f5871afa30b4591d19dbd15bf2df8bd03f615f9e8d641fe14173c99bb1f342d2" exitCode=0 Dec 03 08:23:34 crc kubenswrapper[4475]: I1203 08:23:34.276558 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-jqwk8" event={"ID":"7e00e7eb-72e6-4a52-93cc-8f76160a4f3b","Type":"ContainerDied","Data":"f5871afa30b4591d19dbd15bf2df8bd03f615f9e8d641fe14173c99bb1f342d2"} Dec 03 08:23:34 crc kubenswrapper[4475]: I1203 08:23:34.276594 4475 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-jqwk8" event={"ID":"7e00e7eb-72e6-4a52-93cc-8f76160a4f3b","Type":"ContainerDied","Data":"357fcd9ffe860bfe64d2231f81726fc51b7259c212605f1d24b4e9c7e191963a"} Dec 03 08:23:34 crc kubenswrapper[4475]: I1203 08:23:34.276615 4475 scope.go:117] "RemoveContainer" containerID="f5871afa30b4591d19dbd15bf2df8bd03f615f9e8d641fe14173c99bb1f342d2" Dec 03 08:23:34 crc kubenswrapper[4475]: I1203 08:23:34.276795 4475 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-jqwk8" Dec 03 08:23:34 crc kubenswrapper[4475]: I1203 08:23:34.309346 4475 scope.go:117] "RemoveContainer" containerID="55ca9e027ab3f9c6fab4a029c271b3a6e797a160affdfd2df48ea70b69c26141" Dec 03 08:23:34 crc kubenswrapper[4475]: I1203 08:23:34.318207 4475 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-jqwk8"] Dec 03 08:23:34 crc kubenswrapper[4475]: I1203 08:23:34.326564 4475 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-jqwk8"] Dec 03 08:23:34 crc kubenswrapper[4475]: I1203 08:23:34.338854 4475 scope.go:117] "RemoveContainer" containerID="222934c5d72abadae834470ca58fa99b1bc7fda8c64442d959b549213dab5747" Dec 03 08:23:34 crc kubenswrapper[4475]: I1203 08:23:34.373601 4475 scope.go:117] "RemoveContainer" containerID="f5871afa30b4591d19dbd15bf2df8bd03f615f9e8d641fe14173c99bb1f342d2" Dec 03 08:23:34 crc kubenswrapper[4475]: E1203 08:23:34.375510 4475 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f5871afa30b4591d19dbd15bf2df8bd03f615f9e8d641fe14173c99bb1f342d2\": container with ID starting with f5871afa30b4591d19dbd15bf2df8bd03f615f9e8d641fe14173c99bb1f342d2 not found: ID does not exist" containerID="f5871afa30b4591d19dbd15bf2df8bd03f615f9e8d641fe14173c99bb1f342d2" Dec 03 08:23:34 crc kubenswrapper[4475]: I1203 
08:23:34.375548 4475 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f5871afa30b4591d19dbd15bf2df8bd03f615f9e8d641fe14173c99bb1f342d2"} err="failed to get container status \"f5871afa30b4591d19dbd15bf2df8bd03f615f9e8d641fe14173c99bb1f342d2\": rpc error: code = NotFound desc = could not find container \"f5871afa30b4591d19dbd15bf2df8bd03f615f9e8d641fe14173c99bb1f342d2\": container with ID starting with f5871afa30b4591d19dbd15bf2df8bd03f615f9e8d641fe14173c99bb1f342d2 not found: ID does not exist" Dec 03 08:23:34 crc kubenswrapper[4475]: I1203 08:23:34.375574 4475 scope.go:117] "RemoveContainer" containerID="55ca9e027ab3f9c6fab4a029c271b3a6e797a160affdfd2df48ea70b69c26141" Dec 03 08:23:34 crc kubenswrapper[4475]: E1203 08:23:34.376104 4475 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"55ca9e027ab3f9c6fab4a029c271b3a6e797a160affdfd2df48ea70b69c26141\": container with ID starting with 55ca9e027ab3f9c6fab4a029c271b3a6e797a160affdfd2df48ea70b69c26141 not found: ID does not exist" containerID="55ca9e027ab3f9c6fab4a029c271b3a6e797a160affdfd2df48ea70b69c26141" Dec 03 08:23:34 crc kubenswrapper[4475]: I1203 08:23:34.376179 4475 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"55ca9e027ab3f9c6fab4a029c271b3a6e797a160affdfd2df48ea70b69c26141"} err="failed to get container status \"55ca9e027ab3f9c6fab4a029c271b3a6e797a160affdfd2df48ea70b69c26141\": rpc error: code = NotFound desc = could not find container \"55ca9e027ab3f9c6fab4a029c271b3a6e797a160affdfd2df48ea70b69c26141\": container with ID starting with 55ca9e027ab3f9c6fab4a029c271b3a6e797a160affdfd2df48ea70b69c26141 not found: ID does not exist" Dec 03 08:23:34 crc kubenswrapper[4475]: I1203 08:23:34.376216 4475 scope.go:117] "RemoveContainer" containerID="222934c5d72abadae834470ca58fa99b1bc7fda8c64442d959b549213dab5747" Dec 03 08:23:34 crc 
kubenswrapper[4475]: E1203 08:23:34.376623 4475 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"222934c5d72abadae834470ca58fa99b1bc7fda8c64442d959b549213dab5747\": container with ID starting with 222934c5d72abadae834470ca58fa99b1bc7fda8c64442d959b549213dab5747 not found: ID does not exist" containerID="222934c5d72abadae834470ca58fa99b1bc7fda8c64442d959b549213dab5747" Dec 03 08:23:34 crc kubenswrapper[4475]: I1203 08:23:34.376652 4475 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"222934c5d72abadae834470ca58fa99b1bc7fda8c64442d959b549213dab5747"} err="failed to get container status \"222934c5d72abadae834470ca58fa99b1bc7fda8c64442d959b549213dab5747\": rpc error: code = NotFound desc = could not find container \"222934c5d72abadae834470ca58fa99b1bc7fda8c64442d959b549213dab5747\": container with ID starting with 222934c5d72abadae834470ca58fa99b1bc7fda8c64442d959b549213dab5747 not found: ID does not exist" Dec 03 08:23:34 crc kubenswrapper[4475]: E1203 08:23:34.497575 4475 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7e00e7eb_72e6_4a52_93cc_8f76160a4f3b.slice\": RecentStats: unable to find data in memory cache]" Dec 03 08:23:35 crc kubenswrapper[4475]: I1203 08:23:35.502271 4475 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7e00e7eb-72e6-4a52-93cc-8f76160a4f3b" path="/var/lib/kubelet/pods/7e00e7eb-72e6-4a52-93cc-8f76160a4f3b/volumes" Dec 03 08:24:28 crc kubenswrapper[4475]: I1203 08:24:28.933690 4475 patch_prober.go:28] interesting pod/machine-config-daemon-tjbzg container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 03 08:24:28 crc 
kubenswrapper[4475]: I1203 08:24:28.934373 4475 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-tjbzg" podUID="91aee7be-4a52-4598-803f-2deebe0674de" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 03 08:24:58 crc kubenswrapper[4475]: I1203 08:24:58.933250 4475 patch_prober.go:28] interesting pod/machine-config-daemon-tjbzg container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 03 08:24:58 crc kubenswrapper[4475]: I1203 08:24:58.933818 4475 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-tjbzg" podUID="91aee7be-4a52-4598-803f-2deebe0674de" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 03 08:25:28 crc kubenswrapper[4475]: I1203 08:25:28.933800 4475 patch_prober.go:28] interesting pod/machine-config-daemon-tjbzg container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 03 08:25:28 crc kubenswrapper[4475]: I1203 08:25:28.934263 4475 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-tjbzg" podUID="91aee7be-4a52-4598-803f-2deebe0674de" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 03 08:25:28 crc kubenswrapper[4475]: I1203 08:25:28.934308 4475 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" 
status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-tjbzg" Dec 03 08:25:28 crc kubenswrapper[4475]: I1203 08:25:28.935213 4475 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"ead2b8ca071f1b77f440239ac4c143069a67fc366c29018f3071d24754d8703b"} pod="openshift-machine-config-operator/machine-config-daemon-tjbzg" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Dec 03 08:25:28 crc kubenswrapper[4475]: I1203 08:25:28.935274 4475 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-tjbzg" podUID="91aee7be-4a52-4598-803f-2deebe0674de" containerName="machine-config-daemon" containerID="cri-o://ead2b8ca071f1b77f440239ac4c143069a67fc366c29018f3071d24754d8703b" gracePeriod=600 Dec 03 08:25:29 crc kubenswrapper[4475]: E1203 08:25:29.053411 4475 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-tjbzg_openshift-machine-config-operator(91aee7be-4a52-4598-803f-2deebe0674de)\"" pod="openshift-machine-config-operator/machine-config-daemon-tjbzg" podUID="91aee7be-4a52-4598-803f-2deebe0674de" Dec 03 08:25:29 crc kubenswrapper[4475]: I1203 08:25:29.340110 4475 generic.go:334] "Generic (PLEG): container finished" podID="91aee7be-4a52-4598-803f-2deebe0674de" containerID="ead2b8ca071f1b77f440239ac4c143069a67fc366c29018f3071d24754d8703b" exitCode=0 Dec 03 08:25:29 crc kubenswrapper[4475]: I1203 08:25:29.340167 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-tjbzg" event={"ID":"91aee7be-4a52-4598-803f-2deebe0674de","Type":"ContainerDied","Data":"ead2b8ca071f1b77f440239ac4c143069a67fc366c29018f3071d24754d8703b"} 
Dec 03 08:25:29 crc kubenswrapper[4475]: I1203 08:25:29.340221 4475 scope.go:117] "RemoveContainer" containerID="5c0490ae400c9513b0649abb7be111a32bf5b73b716e1c806a1c2404339d3253" Dec 03 08:25:29 crc kubenswrapper[4475]: I1203 08:25:29.341041 4475 scope.go:117] "RemoveContainer" containerID="ead2b8ca071f1b77f440239ac4c143069a67fc366c29018f3071d24754d8703b" Dec 03 08:25:29 crc kubenswrapper[4475]: E1203 08:25:29.341660 4475 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-tjbzg_openshift-machine-config-operator(91aee7be-4a52-4598-803f-2deebe0674de)\"" pod="openshift-machine-config-operator/machine-config-daemon-tjbzg" podUID="91aee7be-4a52-4598-803f-2deebe0674de" Dec 03 08:25:42 crc kubenswrapper[4475]: I1203 08:25:42.492054 4475 scope.go:117] "RemoveContainer" containerID="ead2b8ca071f1b77f440239ac4c143069a67fc366c29018f3071d24754d8703b" Dec 03 08:25:42 crc kubenswrapper[4475]: E1203 08:25:42.492898 4475 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-tjbzg_openshift-machine-config-operator(91aee7be-4a52-4598-803f-2deebe0674de)\"" pod="openshift-machine-config-operator/machine-config-daemon-tjbzg" podUID="91aee7be-4a52-4598-803f-2deebe0674de" Dec 03 08:25:54 crc kubenswrapper[4475]: I1203 08:25:54.491501 4475 scope.go:117] "RemoveContainer" containerID="ead2b8ca071f1b77f440239ac4c143069a67fc366c29018f3071d24754d8703b" Dec 03 08:25:54 crc kubenswrapper[4475]: E1203 08:25:54.492096 4475 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-tjbzg_openshift-machine-config-operator(91aee7be-4a52-4598-803f-2deebe0674de)\"" pod="openshift-machine-config-operator/machine-config-daemon-tjbzg" podUID="91aee7be-4a52-4598-803f-2deebe0674de" Dec 03 08:26:09 crc kubenswrapper[4475]: I1203 08:26:09.490995 4475 scope.go:117] "RemoveContainer" containerID="ead2b8ca071f1b77f440239ac4c143069a67fc366c29018f3071d24754d8703b" Dec 03 08:26:09 crc kubenswrapper[4475]: E1203 08:26:09.491677 4475 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-tjbzg_openshift-machine-config-operator(91aee7be-4a52-4598-803f-2deebe0674de)\"" pod="openshift-machine-config-operator/machine-config-daemon-tjbzg" podUID="91aee7be-4a52-4598-803f-2deebe0674de" Dec 03 08:26:22 crc kubenswrapper[4475]: I1203 08:26:22.491088 4475 scope.go:117] "RemoveContainer" containerID="ead2b8ca071f1b77f440239ac4c143069a67fc366c29018f3071d24754d8703b" Dec 03 08:26:22 crc kubenswrapper[4475]: E1203 08:26:22.492590 4475 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-tjbzg_openshift-machine-config-operator(91aee7be-4a52-4598-803f-2deebe0674de)\"" pod="openshift-machine-config-operator/machine-config-daemon-tjbzg" podUID="91aee7be-4a52-4598-803f-2deebe0674de" Dec 03 08:26:25 crc kubenswrapper[4475]: I1203 08:26:25.600314 4475 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-b6926"] Dec 03 08:26:25 crc kubenswrapper[4475]: E1203 08:26:25.600951 4475 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7e00e7eb-72e6-4a52-93cc-8f76160a4f3b" containerName="extract-utilities" Dec 03 08:26:25 crc kubenswrapper[4475]: I1203 
08:26:25.601168 4475 state_mem.go:107] "Deleted CPUSet assignment" podUID="7e00e7eb-72e6-4a52-93cc-8f76160a4f3b" containerName="extract-utilities" Dec 03 08:26:25 crc kubenswrapper[4475]: E1203 08:26:25.601189 4475 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7e00e7eb-72e6-4a52-93cc-8f76160a4f3b" containerName="registry-server" Dec 03 08:26:25 crc kubenswrapper[4475]: I1203 08:26:25.601194 4475 state_mem.go:107] "Deleted CPUSet assignment" podUID="7e00e7eb-72e6-4a52-93cc-8f76160a4f3b" containerName="registry-server" Dec 03 08:26:25 crc kubenswrapper[4475]: E1203 08:26:25.601208 4475 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7e00e7eb-72e6-4a52-93cc-8f76160a4f3b" containerName="extract-content" Dec 03 08:26:25 crc kubenswrapper[4475]: I1203 08:26:25.601213 4475 state_mem.go:107] "Deleted CPUSet assignment" podUID="7e00e7eb-72e6-4a52-93cc-8f76160a4f3b" containerName="extract-content" Dec 03 08:26:25 crc kubenswrapper[4475]: I1203 08:26:25.601473 4475 memory_manager.go:354] "RemoveStaleState removing state" podUID="7e00e7eb-72e6-4a52-93cc-8f76160a4f3b" containerName="registry-server" Dec 03 08:26:25 crc kubenswrapper[4475]: I1203 08:26:25.602712 4475 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-b6926" Dec 03 08:26:25 crc kubenswrapper[4475]: I1203 08:26:25.618059 4475 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-b6926"] Dec 03 08:26:25 crc kubenswrapper[4475]: I1203 08:26:25.719762 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wft9m\" (UniqueName: \"kubernetes.io/projected/ad7e6cbb-5c4f-4b23-8649-5b240b7b54fe-kube-api-access-wft9m\") pod \"certified-operators-b6926\" (UID: \"ad7e6cbb-5c4f-4b23-8649-5b240b7b54fe\") " pod="openshift-marketplace/certified-operators-b6926" Dec 03 08:26:25 crc kubenswrapper[4475]: I1203 08:26:25.719871 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ad7e6cbb-5c4f-4b23-8649-5b240b7b54fe-utilities\") pod \"certified-operators-b6926\" (UID: \"ad7e6cbb-5c4f-4b23-8649-5b240b7b54fe\") " pod="openshift-marketplace/certified-operators-b6926" Dec 03 08:26:25 crc kubenswrapper[4475]: I1203 08:26:25.719905 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ad7e6cbb-5c4f-4b23-8649-5b240b7b54fe-catalog-content\") pod \"certified-operators-b6926\" (UID: \"ad7e6cbb-5c4f-4b23-8649-5b240b7b54fe\") " pod="openshift-marketplace/certified-operators-b6926" Dec 03 08:26:25 crc kubenswrapper[4475]: I1203 08:26:25.822645 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wft9m\" (UniqueName: \"kubernetes.io/projected/ad7e6cbb-5c4f-4b23-8649-5b240b7b54fe-kube-api-access-wft9m\") pod \"certified-operators-b6926\" (UID: \"ad7e6cbb-5c4f-4b23-8649-5b240b7b54fe\") " pod="openshift-marketplace/certified-operators-b6926" Dec 03 08:26:25 crc kubenswrapper[4475]: I1203 08:26:25.823140 4475 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ad7e6cbb-5c4f-4b23-8649-5b240b7b54fe-utilities\") pod \"certified-operators-b6926\" (UID: \"ad7e6cbb-5c4f-4b23-8649-5b240b7b54fe\") " pod="openshift-marketplace/certified-operators-b6926" Dec 03 08:26:25 crc kubenswrapper[4475]: I1203 08:26:25.823320 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ad7e6cbb-5c4f-4b23-8649-5b240b7b54fe-catalog-content\") pod \"certified-operators-b6926\" (UID: \"ad7e6cbb-5c4f-4b23-8649-5b240b7b54fe\") " pod="openshift-marketplace/certified-operators-b6926" Dec 03 08:26:25 crc kubenswrapper[4475]: I1203 08:26:25.823786 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ad7e6cbb-5c4f-4b23-8649-5b240b7b54fe-utilities\") pod \"certified-operators-b6926\" (UID: \"ad7e6cbb-5c4f-4b23-8649-5b240b7b54fe\") " pod="openshift-marketplace/certified-operators-b6926" Dec 03 08:26:25 crc kubenswrapper[4475]: I1203 08:26:25.823828 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ad7e6cbb-5c4f-4b23-8649-5b240b7b54fe-catalog-content\") pod \"certified-operators-b6926\" (UID: \"ad7e6cbb-5c4f-4b23-8649-5b240b7b54fe\") " pod="openshift-marketplace/certified-operators-b6926" Dec 03 08:26:25 crc kubenswrapper[4475]: I1203 08:26:25.843241 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wft9m\" (UniqueName: \"kubernetes.io/projected/ad7e6cbb-5c4f-4b23-8649-5b240b7b54fe-kube-api-access-wft9m\") pod \"certified-operators-b6926\" (UID: \"ad7e6cbb-5c4f-4b23-8649-5b240b7b54fe\") " pod="openshift-marketplace/certified-operators-b6926" Dec 03 08:26:25 crc kubenswrapper[4475]: I1203 08:26:25.939626 4475 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-b6926" Dec 03 08:26:26 crc kubenswrapper[4475]: I1203 08:26:26.425066 4475 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-b6926"] Dec 03 08:26:26 crc kubenswrapper[4475]: I1203 08:26:26.765335 4475 generic.go:334] "Generic (PLEG): container finished" podID="ad7e6cbb-5c4f-4b23-8649-5b240b7b54fe" containerID="7572f88b0f9682923a4b976dfc8959b766f4981d33f75df8126945e272e1f098" exitCode=0 Dec 03 08:26:26 crc kubenswrapper[4475]: I1203 08:26:26.765526 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-b6926" event={"ID":"ad7e6cbb-5c4f-4b23-8649-5b240b7b54fe","Type":"ContainerDied","Data":"7572f88b0f9682923a4b976dfc8959b766f4981d33f75df8126945e272e1f098"} Dec 03 08:26:26 crc kubenswrapper[4475]: I1203 08:26:26.765776 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-b6926" event={"ID":"ad7e6cbb-5c4f-4b23-8649-5b240b7b54fe","Type":"ContainerStarted","Data":"176d2eaab23e27381bbcf3a12f390f15564c1e3148f0ca4b1b81d60f2575f31b"} Dec 03 08:26:28 crc kubenswrapper[4475]: I1203 08:26:28.431833 4475 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-wgq5f"] Dec 03 08:26:28 crc kubenswrapper[4475]: I1203 08:26:28.435268 4475 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-wgq5f" Dec 03 08:26:28 crc kubenswrapper[4475]: I1203 08:26:28.462683 4475 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-wgq5f"] Dec 03 08:26:28 crc kubenswrapper[4475]: I1203 08:26:28.489537 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m56nw\" (UniqueName: \"kubernetes.io/projected/044b0d1b-a43f-477d-b7c9-a229a91712ac-kube-api-access-m56nw\") pod \"redhat-operators-wgq5f\" (UID: \"044b0d1b-a43f-477d-b7c9-a229a91712ac\") " pod="openshift-marketplace/redhat-operators-wgq5f" Dec 03 08:26:28 crc kubenswrapper[4475]: I1203 08:26:28.489700 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/044b0d1b-a43f-477d-b7c9-a229a91712ac-catalog-content\") pod \"redhat-operators-wgq5f\" (UID: \"044b0d1b-a43f-477d-b7c9-a229a91712ac\") " pod="openshift-marketplace/redhat-operators-wgq5f" Dec 03 08:26:28 crc kubenswrapper[4475]: I1203 08:26:28.489860 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/044b0d1b-a43f-477d-b7c9-a229a91712ac-utilities\") pod \"redhat-operators-wgq5f\" (UID: \"044b0d1b-a43f-477d-b7c9-a229a91712ac\") " pod="openshift-marketplace/redhat-operators-wgq5f" Dec 03 08:26:28 crc kubenswrapper[4475]: I1203 08:26:28.591865 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-m56nw\" (UniqueName: \"kubernetes.io/projected/044b0d1b-a43f-477d-b7c9-a229a91712ac-kube-api-access-m56nw\") pod \"redhat-operators-wgq5f\" (UID: \"044b0d1b-a43f-477d-b7c9-a229a91712ac\") " pod="openshift-marketplace/redhat-operators-wgq5f" Dec 03 08:26:28 crc kubenswrapper[4475]: I1203 08:26:28.591965 4475 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/044b0d1b-a43f-477d-b7c9-a229a91712ac-catalog-content\") pod \"redhat-operators-wgq5f\" (UID: \"044b0d1b-a43f-477d-b7c9-a229a91712ac\") " pod="openshift-marketplace/redhat-operators-wgq5f" Dec 03 08:26:28 crc kubenswrapper[4475]: I1203 08:26:28.592082 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/044b0d1b-a43f-477d-b7c9-a229a91712ac-utilities\") pod \"redhat-operators-wgq5f\" (UID: \"044b0d1b-a43f-477d-b7c9-a229a91712ac\") " pod="openshift-marketplace/redhat-operators-wgq5f" Dec 03 08:26:28 crc kubenswrapper[4475]: I1203 08:26:28.593309 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/044b0d1b-a43f-477d-b7c9-a229a91712ac-catalog-content\") pod \"redhat-operators-wgq5f\" (UID: \"044b0d1b-a43f-477d-b7c9-a229a91712ac\") " pod="openshift-marketplace/redhat-operators-wgq5f" Dec 03 08:26:28 crc kubenswrapper[4475]: I1203 08:26:28.594321 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/044b0d1b-a43f-477d-b7c9-a229a91712ac-utilities\") pod \"redhat-operators-wgq5f\" (UID: \"044b0d1b-a43f-477d-b7c9-a229a91712ac\") " pod="openshift-marketplace/redhat-operators-wgq5f" Dec 03 08:26:28 crc kubenswrapper[4475]: I1203 08:26:28.612733 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-m56nw\" (UniqueName: \"kubernetes.io/projected/044b0d1b-a43f-477d-b7c9-a229a91712ac-kube-api-access-m56nw\") pod \"redhat-operators-wgq5f\" (UID: \"044b0d1b-a43f-477d-b7c9-a229a91712ac\") " pod="openshift-marketplace/redhat-operators-wgq5f" Dec 03 08:26:28 crc kubenswrapper[4475]: I1203 08:26:28.762870 4475 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-wgq5f" Dec 03 08:26:28 crc kubenswrapper[4475]: I1203 08:26:28.799213 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-b6926" event={"ID":"ad7e6cbb-5c4f-4b23-8649-5b240b7b54fe","Type":"ContainerStarted","Data":"51addcfe215d88eb4044b8017a09c3695adda555fabbee1410f79893c942a84c"} Dec 03 08:26:29 crc kubenswrapper[4475]: I1203 08:26:29.319895 4475 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-wgq5f"] Dec 03 08:26:29 crc kubenswrapper[4475]: W1203 08:26:29.329521 4475 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod044b0d1b_a43f_477d_b7c9_a229a91712ac.slice/crio-58002f14c2ef7ed28c1db19f179585a54d0d203dd81997a870dab8a8e1709426 WatchSource:0}: Error finding container 58002f14c2ef7ed28c1db19f179585a54d0d203dd81997a870dab8a8e1709426: Status 404 returned error can't find the container with id 58002f14c2ef7ed28c1db19f179585a54d0d203dd81997a870dab8a8e1709426 Dec 03 08:26:29 crc kubenswrapper[4475]: I1203 08:26:29.811273 4475 generic.go:334] "Generic (PLEG): container finished" podID="044b0d1b-a43f-477d-b7c9-a229a91712ac" containerID="dbaeebfd2fcd25185b57bcb498e0103f1c1f6e5ae78bde3ab5a9f0bc1ae9aac9" exitCode=0 Dec 03 08:26:29 crc kubenswrapper[4475]: I1203 08:26:29.811324 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-wgq5f" event={"ID":"044b0d1b-a43f-477d-b7c9-a229a91712ac","Type":"ContainerDied","Data":"dbaeebfd2fcd25185b57bcb498e0103f1c1f6e5ae78bde3ab5a9f0bc1ae9aac9"} Dec 03 08:26:29 crc kubenswrapper[4475]: I1203 08:26:29.811395 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-wgq5f" 
event={"ID":"044b0d1b-a43f-477d-b7c9-a229a91712ac","Type":"ContainerStarted","Data":"58002f14c2ef7ed28c1db19f179585a54d0d203dd81997a870dab8a8e1709426"} Dec 03 08:26:29 crc kubenswrapper[4475]: I1203 08:26:29.813815 4475 generic.go:334] "Generic (PLEG): container finished" podID="ad7e6cbb-5c4f-4b23-8649-5b240b7b54fe" containerID="51addcfe215d88eb4044b8017a09c3695adda555fabbee1410f79893c942a84c" exitCode=0 Dec 03 08:26:29 crc kubenswrapper[4475]: I1203 08:26:29.813865 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-b6926" event={"ID":"ad7e6cbb-5c4f-4b23-8649-5b240b7b54fe","Type":"ContainerDied","Data":"51addcfe215d88eb4044b8017a09c3695adda555fabbee1410f79893c942a84c"} Dec 03 08:26:30 crc kubenswrapper[4475]: I1203 08:26:30.846878 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-b6926" event={"ID":"ad7e6cbb-5c4f-4b23-8649-5b240b7b54fe","Type":"ContainerStarted","Data":"934c3d7e878cdf4e0b64667f994fd0080ed48c5a7d8bad5d617bb2ab92ea34da"} Dec 03 08:26:30 crc kubenswrapper[4475]: I1203 08:26:30.849959 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-wgq5f" event={"ID":"044b0d1b-a43f-477d-b7c9-a229a91712ac","Type":"ContainerStarted","Data":"510faa6da5efc0c1d6a522cc1828ae2e805e6cb679b80b1948656ce2baf9328f"} Dec 03 08:26:30 crc kubenswrapper[4475]: I1203 08:26:30.872748 4475 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-b6926" podStartSLOduration=2.189129561 podStartE2EDuration="5.87273011s" podCreationTimestamp="2025-12-03 08:26:25 +0000 UTC" firstStartedPulling="2025-12-03 08:26:26.769804233 +0000 UTC m=+6071.574702567" lastFinishedPulling="2025-12-03 08:26:30.453404782 +0000 UTC m=+6075.258303116" observedRunningTime="2025-12-03 08:26:30.869844616 +0000 UTC m=+6075.674742949" watchObservedRunningTime="2025-12-03 08:26:30.87273011 +0000 UTC 
m=+6075.677628444" Dec 03 08:26:33 crc kubenswrapper[4475]: I1203 08:26:33.877540 4475 generic.go:334] "Generic (PLEG): container finished" podID="044b0d1b-a43f-477d-b7c9-a229a91712ac" containerID="510faa6da5efc0c1d6a522cc1828ae2e805e6cb679b80b1948656ce2baf9328f" exitCode=0 Dec 03 08:26:33 crc kubenswrapper[4475]: I1203 08:26:33.877704 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-wgq5f" event={"ID":"044b0d1b-a43f-477d-b7c9-a229a91712ac","Type":"ContainerDied","Data":"510faa6da5efc0c1d6a522cc1828ae2e805e6cb679b80b1948656ce2baf9328f"} Dec 03 08:26:34 crc kubenswrapper[4475]: I1203 08:26:34.892151 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-wgq5f" event={"ID":"044b0d1b-a43f-477d-b7c9-a229a91712ac","Type":"ContainerStarted","Data":"8b13e025852ea1f3d4562fee0817f8707c2ff8c56da79c24e28823b7f419174d"} Dec 03 08:26:34 crc kubenswrapper[4475]: I1203 08:26:34.908589 4475 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-wgq5f" podStartSLOduration=2.326411528 podStartE2EDuration="6.908571335s" podCreationTimestamp="2025-12-03 08:26:28 +0000 UTC" firstStartedPulling="2025-12-03 08:26:29.814034228 +0000 UTC m=+6074.618932562" lastFinishedPulling="2025-12-03 08:26:34.396194035 +0000 UTC m=+6079.201092369" observedRunningTime="2025-12-03 08:26:34.907572347 +0000 UTC m=+6079.712470681" watchObservedRunningTime="2025-12-03 08:26:34.908571335 +0000 UTC m=+6079.713469670" Dec 03 08:26:35 crc kubenswrapper[4475]: I1203 08:26:35.940364 4475 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-b6926" Dec 03 08:26:35 crc kubenswrapper[4475]: I1203 08:26:35.940600 4475 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-b6926" Dec 03 08:26:36 crc kubenswrapper[4475]: I1203 08:26:36.980155 4475 
prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/certified-operators-b6926" podUID="ad7e6cbb-5c4f-4b23-8649-5b240b7b54fe" containerName="registry-server" probeResult="failure" output=< Dec 03 08:26:36 crc kubenswrapper[4475]: timeout: failed to connect service ":50051" within 1s Dec 03 08:26:36 crc kubenswrapper[4475]: > Dec 03 08:26:37 crc kubenswrapper[4475]: I1203 08:26:37.492389 4475 scope.go:117] "RemoveContainer" containerID="ead2b8ca071f1b77f440239ac4c143069a67fc366c29018f3071d24754d8703b" Dec 03 08:26:37 crc kubenswrapper[4475]: E1203 08:26:37.492849 4475 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-tjbzg_openshift-machine-config-operator(91aee7be-4a52-4598-803f-2deebe0674de)\"" pod="openshift-machine-config-operator/machine-config-daemon-tjbzg" podUID="91aee7be-4a52-4598-803f-2deebe0674de" Dec 03 08:26:38 crc kubenswrapper[4475]: I1203 08:26:38.763051 4475 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-wgq5f" Dec 03 08:26:38 crc kubenswrapper[4475]: I1203 08:26:38.763368 4475 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-wgq5f" Dec 03 08:26:39 crc kubenswrapper[4475]: I1203 08:26:39.799707 4475 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-wgq5f" podUID="044b0d1b-a43f-477d-b7c9-a229a91712ac" containerName="registry-server" probeResult="failure" output=< Dec 03 08:26:39 crc kubenswrapper[4475]: timeout: failed to connect service ":50051" within 1s Dec 03 08:26:39 crc kubenswrapper[4475]: > Dec 03 08:26:45 crc kubenswrapper[4475]: I1203 08:26:45.005188 4475 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-v6ts9"] Dec 03 08:26:45 
crc kubenswrapper[4475]: I1203 08:26:45.018518 4475 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-v6ts9" Dec 03 08:26:45 crc kubenswrapper[4475]: I1203 08:26:45.050690 4475 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-v6ts9"] Dec 03 08:26:45 crc kubenswrapper[4475]: I1203 08:26:45.155848 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c42070f7-24a1-4368-86a5-a2916f36fad7-utilities\") pod \"community-operators-v6ts9\" (UID: \"c42070f7-24a1-4368-86a5-a2916f36fad7\") " pod="openshift-marketplace/community-operators-v6ts9" Dec 03 08:26:45 crc kubenswrapper[4475]: I1203 08:26:45.156186 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c42070f7-24a1-4368-86a5-a2916f36fad7-catalog-content\") pod \"community-operators-v6ts9\" (UID: \"c42070f7-24a1-4368-86a5-a2916f36fad7\") " pod="openshift-marketplace/community-operators-v6ts9" Dec 03 08:26:45 crc kubenswrapper[4475]: I1203 08:26:45.156476 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lbcgv\" (UniqueName: \"kubernetes.io/projected/c42070f7-24a1-4368-86a5-a2916f36fad7-kube-api-access-lbcgv\") pod \"community-operators-v6ts9\" (UID: \"c42070f7-24a1-4368-86a5-a2916f36fad7\") " pod="openshift-marketplace/community-operators-v6ts9" Dec 03 08:26:45 crc kubenswrapper[4475]: I1203 08:26:45.258339 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c42070f7-24a1-4368-86a5-a2916f36fad7-catalog-content\") pod \"community-operators-v6ts9\" (UID: \"c42070f7-24a1-4368-86a5-a2916f36fad7\") " pod="openshift-marketplace/community-operators-v6ts9" Dec 
03 08:26:45 crc kubenswrapper[4475]: I1203 08:26:45.258440 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lbcgv\" (UniqueName: \"kubernetes.io/projected/c42070f7-24a1-4368-86a5-a2916f36fad7-kube-api-access-lbcgv\") pod \"community-operators-v6ts9\" (UID: \"c42070f7-24a1-4368-86a5-a2916f36fad7\") " pod="openshift-marketplace/community-operators-v6ts9" Dec 03 08:26:45 crc kubenswrapper[4475]: I1203 08:26:45.258562 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c42070f7-24a1-4368-86a5-a2916f36fad7-utilities\") pod \"community-operators-v6ts9\" (UID: \"c42070f7-24a1-4368-86a5-a2916f36fad7\") " pod="openshift-marketplace/community-operators-v6ts9" Dec 03 08:26:45 crc kubenswrapper[4475]: I1203 08:26:45.260193 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c42070f7-24a1-4368-86a5-a2916f36fad7-catalog-content\") pod \"community-operators-v6ts9\" (UID: \"c42070f7-24a1-4368-86a5-a2916f36fad7\") " pod="openshift-marketplace/community-operators-v6ts9" Dec 03 08:26:45 crc kubenswrapper[4475]: I1203 08:26:45.260488 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c42070f7-24a1-4368-86a5-a2916f36fad7-utilities\") pod \"community-operators-v6ts9\" (UID: \"c42070f7-24a1-4368-86a5-a2916f36fad7\") " pod="openshift-marketplace/community-operators-v6ts9" Dec 03 08:26:45 crc kubenswrapper[4475]: I1203 08:26:45.291689 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lbcgv\" (UniqueName: \"kubernetes.io/projected/c42070f7-24a1-4368-86a5-a2916f36fad7-kube-api-access-lbcgv\") pod \"community-operators-v6ts9\" (UID: \"c42070f7-24a1-4368-86a5-a2916f36fad7\") " pod="openshift-marketplace/community-operators-v6ts9" Dec 03 08:26:45 crc kubenswrapper[4475]: 
I1203 08:26:45.351777 4475 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-v6ts9" Dec 03 08:26:46 crc kubenswrapper[4475]: I1203 08:26:46.075904 4475 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-b6926" Dec 03 08:26:46 crc kubenswrapper[4475]: I1203 08:26:46.131374 4475 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-b6926" Dec 03 08:26:46 crc kubenswrapper[4475]: I1203 08:26:46.333096 4475 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-v6ts9"] Dec 03 08:26:46 crc kubenswrapper[4475]: W1203 08:26:46.350720 4475 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc42070f7_24a1_4368_86a5_a2916f36fad7.slice/crio-445360d90d05f7968d063e07cc25661d21f4864a39574214c486aa6f6eda61cf WatchSource:0}: Error finding container 445360d90d05f7968d063e07cc25661d21f4864a39574214c486aa6f6eda61cf: Status 404 returned error can't find the container with id 445360d90d05f7968d063e07cc25661d21f4864a39574214c486aa6f6eda61cf Dec 03 08:26:47 crc kubenswrapper[4475]: I1203 08:26:47.008150 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-v6ts9" event={"ID":"c42070f7-24a1-4368-86a5-a2916f36fad7","Type":"ContainerDied","Data":"1c37d92430c112d3c0cbe6c9c911c5e2822056ea38d36949e452f74b38521676"} Dec 03 08:26:47 crc kubenswrapper[4475]: I1203 08:26:47.008717 4475 generic.go:334] "Generic (PLEG): container finished" podID="c42070f7-24a1-4368-86a5-a2916f36fad7" containerID="1c37d92430c112d3c0cbe6c9c911c5e2822056ea38d36949e452f74b38521676" exitCode=0 Dec 03 08:26:47 crc kubenswrapper[4475]: I1203 08:26:47.008830 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-v6ts9" 
event={"ID":"c42070f7-24a1-4368-86a5-a2916f36fad7","Type":"ContainerStarted","Data":"445360d90d05f7968d063e07cc25661d21f4864a39574214c486aa6f6eda61cf"} Dec 03 08:26:47 crc kubenswrapper[4475]: I1203 08:26:47.384307 4475 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-b6926"] Dec 03 08:26:48 crc kubenswrapper[4475]: I1203 08:26:48.023885 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-v6ts9" event={"ID":"c42070f7-24a1-4368-86a5-a2916f36fad7","Type":"ContainerStarted","Data":"0d8fcc8699723537068665eac5a0fe54fbb180d2665dc4923b0363d112bbb1c7"} Dec 03 08:26:48 crc kubenswrapper[4475]: I1203 08:26:48.024851 4475 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-b6926" podUID="ad7e6cbb-5c4f-4b23-8649-5b240b7b54fe" containerName="registry-server" containerID="cri-o://934c3d7e878cdf4e0b64667f994fd0080ed48c5a7d8bad5d617bb2ab92ea34da" gracePeriod=2 Dec 03 08:26:48 crc kubenswrapper[4475]: I1203 08:26:48.492849 4475 scope.go:117] "RemoveContainer" containerID="ead2b8ca071f1b77f440239ac4c143069a67fc366c29018f3071d24754d8703b" Dec 03 08:26:48 crc kubenswrapper[4475]: E1203 08:26:48.495799 4475 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-tjbzg_openshift-machine-config-operator(91aee7be-4a52-4598-803f-2deebe0674de)\"" pod="openshift-machine-config-operator/machine-config-daemon-tjbzg" podUID="91aee7be-4a52-4598-803f-2deebe0674de" Dec 03 08:26:48 crc kubenswrapper[4475]: I1203 08:26:48.617117 4475 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-b6926" Dec 03 08:26:48 crc kubenswrapper[4475]: I1203 08:26:48.651252 4475 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ad7e6cbb-5c4f-4b23-8649-5b240b7b54fe-utilities\") pod \"ad7e6cbb-5c4f-4b23-8649-5b240b7b54fe\" (UID: \"ad7e6cbb-5c4f-4b23-8649-5b240b7b54fe\") " Dec 03 08:26:48 crc kubenswrapper[4475]: I1203 08:26:48.651481 4475 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wft9m\" (UniqueName: \"kubernetes.io/projected/ad7e6cbb-5c4f-4b23-8649-5b240b7b54fe-kube-api-access-wft9m\") pod \"ad7e6cbb-5c4f-4b23-8649-5b240b7b54fe\" (UID: \"ad7e6cbb-5c4f-4b23-8649-5b240b7b54fe\") " Dec 03 08:26:48 crc kubenswrapper[4475]: I1203 08:26:48.651529 4475 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ad7e6cbb-5c4f-4b23-8649-5b240b7b54fe-catalog-content\") pod \"ad7e6cbb-5c4f-4b23-8649-5b240b7b54fe\" (UID: \"ad7e6cbb-5c4f-4b23-8649-5b240b7b54fe\") " Dec 03 08:26:48 crc kubenswrapper[4475]: I1203 08:26:48.654089 4475 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ad7e6cbb-5c4f-4b23-8649-5b240b7b54fe-utilities" (OuterVolumeSpecName: "utilities") pod "ad7e6cbb-5c4f-4b23-8649-5b240b7b54fe" (UID: "ad7e6cbb-5c4f-4b23-8649-5b240b7b54fe"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 08:26:48 crc kubenswrapper[4475]: I1203 08:26:48.660165 4475 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ad7e6cbb-5c4f-4b23-8649-5b240b7b54fe-kube-api-access-wft9m" (OuterVolumeSpecName: "kube-api-access-wft9m") pod "ad7e6cbb-5c4f-4b23-8649-5b240b7b54fe" (UID: "ad7e6cbb-5c4f-4b23-8649-5b240b7b54fe"). InnerVolumeSpecName "kube-api-access-wft9m". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 08:26:48 crc kubenswrapper[4475]: I1203 08:26:48.707984 4475 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ad7e6cbb-5c4f-4b23-8649-5b240b7b54fe-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "ad7e6cbb-5c4f-4b23-8649-5b240b7b54fe" (UID: "ad7e6cbb-5c4f-4b23-8649-5b240b7b54fe"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 08:26:48 crc kubenswrapper[4475]: I1203 08:26:48.754610 4475 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ad7e6cbb-5c4f-4b23-8649-5b240b7b54fe-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 03 08:26:48 crc kubenswrapper[4475]: I1203 08:26:48.754647 4475 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ad7e6cbb-5c4f-4b23-8649-5b240b7b54fe-utilities\") on node \"crc\" DevicePath \"\"" Dec 03 08:26:48 crc kubenswrapper[4475]: I1203 08:26:48.754658 4475 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wft9m\" (UniqueName: \"kubernetes.io/projected/ad7e6cbb-5c4f-4b23-8649-5b240b7b54fe-kube-api-access-wft9m\") on node \"crc\" DevicePath \"\"" Dec 03 08:26:49 crc kubenswrapper[4475]: I1203 08:26:49.063204 4475 generic.go:334] "Generic (PLEG): container finished" podID="ad7e6cbb-5c4f-4b23-8649-5b240b7b54fe" containerID="934c3d7e878cdf4e0b64667f994fd0080ed48c5a7d8bad5d617bb2ab92ea34da" exitCode=0 Dec 03 08:26:49 crc kubenswrapper[4475]: I1203 08:26:49.063288 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-b6926" event={"ID":"ad7e6cbb-5c4f-4b23-8649-5b240b7b54fe","Type":"ContainerDied","Data":"934c3d7e878cdf4e0b64667f994fd0080ed48c5a7d8bad5d617bb2ab92ea34da"} Dec 03 08:26:49 crc kubenswrapper[4475]: I1203 08:26:49.063326 4475 kubelet.go:2453] "SyncLoop (PLEG): event 
for pod" pod="openshift-marketplace/certified-operators-b6926" event={"ID":"ad7e6cbb-5c4f-4b23-8649-5b240b7b54fe","Type":"ContainerDied","Data":"176d2eaab23e27381bbcf3a12f390f15564c1e3148f0ca4b1b81d60f2575f31b"} Dec 03 08:26:49 crc kubenswrapper[4475]: I1203 08:26:49.063345 4475 scope.go:117] "RemoveContainer" containerID="934c3d7e878cdf4e0b64667f994fd0080ed48c5a7d8bad5d617bb2ab92ea34da" Dec 03 08:26:49 crc kubenswrapper[4475]: I1203 08:26:49.063532 4475 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-b6926" Dec 03 08:26:49 crc kubenswrapper[4475]: I1203 08:26:49.092110 4475 generic.go:334] "Generic (PLEG): container finished" podID="c42070f7-24a1-4368-86a5-a2916f36fad7" containerID="0d8fcc8699723537068665eac5a0fe54fbb180d2665dc4923b0363d112bbb1c7" exitCode=0 Dec 03 08:26:49 crc kubenswrapper[4475]: I1203 08:26:49.092324 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-v6ts9" event={"ID":"c42070f7-24a1-4368-86a5-a2916f36fad7","Type":"ContainerDied","Data":"0d8fcc8699723537068665eac5a0fe54fbb180d2665dc4923b0363d112bbb1c7"} Dec 03 08:26:49 crc kubenswrapper[4475]: I1203 08:26:49.119104 4475 scope.go:117] "RemoveContainer" containerID="51addcfe215d88eb4044b8017a09c3695adda555fabbee1410f79893c942a84c" Dec 03 08:26:49 crc kubenswrapper[4475]: I1203 08:26:49.175038 4475 scope.go:117] "RemoveContainer" containerID="7572f88b0f9682923a4b976dfc8959b766f4981d33f75df8126945e272e1f098" Dec 03 08:26:49 crc kubenswrapper[4475]: I1203 08:26:49.203736 4475 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-b6926"] Dec 03 08:26:49 crc kubenswrapper[4475]: I1203 08:26:49.210196 4475 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-b6926"] Dec 03 08:26:49 crc kubenswrapper[4475]: I1203 08:26:49.232493 4475 scope.go:117] "RemoveContainer" 
containerID="934c3d7e878cdf4e0b64667f994fd0080ed48c5a7d8bad5d617bb2ab92ea34da" Dec 03 08:26:49 crc kubenswrapper[4475]: E1203 08:26:49.248506 4475 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"934c3d7e878cdf4e0b64667f994fd0080ed48c5a7d8bad5d617bb2ab92ea34da\": container with ID starting with 934c3d7e878cdf4e0b64667f994fd0080ed48c5a7d8bad5d617bb2ab92ea34da not found: ID does not exist" containerID="934c3d7e878cdf4e0b64667f994fd0080ed48c5a7d8bad5d617bb2ab92ea34da" Dec 03 08:26:49 crc kubenswrapper[4475]: I1203 08:26:49.249559 4475 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"934c3d7e878cdf4e0b64667f994fd0080ed48c5a7d8bad5d617bb2ab92ea34da"} err="failed to get container status \"934c3d7e878cdf4e0b64667f994fd0080ed48c5a7d8bad5d617bb2ab92ea34da\": rpc error: code = NotFound desc = could not find container \"934c3d7e878cdf4e0b64667f994fd0080ed48c5a7d8bad5d617bb2ab92ea34da\": container with ID starting with 934c3d7e878cdf4e0b64667f994fd0080ed48c5a7d8bad5d617bb2ab92ea34da not found: ID does not exist" Dec 03 08:26:49 crc kubenswrapper[4475]: I1203 08:26:49.249612 4475 scope.go:117] "RemoveContainer" containerID="51addcfe215d88eb4044b8017a09c3695adda555fabbee1410f79893c942a84c" Dec 03 08:26:49 crc kubenswrapper[4475]: E1203 08:26:49.250781 4475 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"51addcfe215d88eb4044b8017a09c3695adda555fabbee1410f79893c942a84c\": container with ID starting with 51addcfe215d88eb4044b8017a09c3695adda555fabbee1410f79893c942a84c not found: ID does not exist" containerID="51addcfe215d88eb4044b8017a09c3695adda555fabbee1410f79893c942a84c" Dec 03 08:26:49 crc kubenswrapper[4475]: I1203 08:26:49.250856 4475 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"51addcfe215d88eb4044b8017a09c3695adda555fabbee1410f79893c942a84c"} err="failed to get container status \"51addcfe215d88eb4044b8017a09c3695adda555fabbee1410f79893c942a84c\": rpc error: code = NotFound desc = could not find container \"51addcfe215d88eb4044b8017a09c3695adda555fabbee1410f79893c942a84c\": container with ID starting with 51addcfe215d88eb4044b8017a09c3695adda555fabbee1410f79893c942a84c not found: ID does not exist" Dec 03 08:26:49 crc kubenswrapper[4475]: I1203 08:26:49.250889 4475 scope.go:117] "RemoveContainer" containerID="7572f88b0f9682923a4b976dfc8959b766f4981d33f75df8126945e272e1f098" Dec 03 08:26:49 crc kubenswrapper[4475]: E1203 08:26:49.251234 4475 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7572f88b0f9682923a4b976dfc8959b766f4981d33f75df8126945e272e1f098\": container with ID starting with 7572f88b0f9682923a4b976dfc8959b766f4981d33f75df8126945e272e1f098 not found: ID does not exist" containerID="7572f88b0f9682923a4b976dfc8959b766f4981d33f75df8126945e272e1f098" Dec 03 08:26:49 crc kubenswrapper[4475]: I1203 08:26:49.251276 4475 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7572f88b0f9682923a4b976dfc8959b766f4981d33f75df8126945e272e1f098"} err="failed to get container status \"7572f88b0f9682923a4b976dfc8959b766f4981d33f75df8126945e272e1f098\": rpc error: code = NotFound desc = could not find container \"7572f88b0f9682923a4b976dfc8959b766f4981d33f75df8126945e272e1f098\": container with ID starting with 7572f88b0f9682923a4b976dfc8959b766f4981d33f75df8126945e272e1f098 not found: ID does not exist" Dec 03 08:26:49 crc kubenswrapper[4475]: E1203 08:26:49.383787 4475 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podad7e6cbb_5c4f_4b23_8649_5b240b7b54fe.slice\": RecentStats: 
unable to find data in memory cache]" Dec 03 08:26:49 crc kubenswrapper[4475]: I1203 08:26:49.502715 4475 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ad7e6cbb-5c4f-4b23-8649-5b240b7b54fe" path="/var/lib/kubelet/pods/ad7e6cbb-5c4f-4b23-8649-5b240b7b54fe/volumes" Dec 03 08:26:49 crc kubenswrapper[4475]: I1203 08:26:49.809763 4475 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-wgq5f" podUID="044b0d1b-a43f-477d-b7c9-a229a91712ac" containerName="registry-server" probeResult="failure" output=< Dec 03 08:26:49 crc kubenswrapper[4475]: timeout: failed to connect service ":50051" within 1s Dec 03 08:26:49 crc kubenswrapper[4475]: > Dec 03 08:26:50 crc kubenswrapper[4475]: I1203 08:26:50.124276 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-v6ts9" event={"ID":"c42070f7-24a1-4368-86a5-a2916f36fad7","Type":"ContainerStarted","Data":"075a001d49e8f1574894217fed991605bec45a8a970620920da278ae2b9de6f1"} Dec 03 08:26:55 crc kubenswrapper[4475]: I1203 08:26:55.352543 4475 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-v6ts9" Dec 03 08:26:55 crc kubenswrapper[4475]: I1203 08:26:55.353248 4475 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-v6ts9" Dec 03 08:26:55 crc kubenswrapper[4475]: I1203 08:26:55.395072 4475 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-v6ts9" Dec 03 08:26:55 crc kubenswrapper[4475]: I1203 08:26:55.416283 4475 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-v6ts9" podStartSLOduration=8.770317514 podStartE2EDuration="11.416261196s" podCreationTimestamp="2025-12-03 08:26:44 +0000 UTC" firstStartedPulling="2025-12-03 08:26:47.01191834 +0000 UTC m=+6091.816816674" 
lastFinishedPulling="2025-12-03 08:26:49.657862022 +0000 UTC m=+6094.462760356" observedRunningTime="2025-12-03 08:26:50.146959295 +0000 UTC m=+6094.951857618" watchObservedRunningTime="2025-12-03 08:26:55.416261196 +0000 UTC m=+6100.221159529" Dec 03 08:26:56 crc kubenswrapper[4475]: I1203 08:26:56.224933 4475 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-v6ts9" Dec 03 08:26:57 crc kubenswrapper[4475]: I1203 08:26:57.638591 4475 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-v6ts9"] Dec 03 08:26:58 crc kubenswrapper[4475]: I1203 08:26:58.216222 4475 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-v6ts9" podUID="c42070f7-24a1-4368-86a5-a2916f36fad7" containerName="registry-server" containerID="cri-o://075a001d49e8f1574894217fed991605bec45a8a970620920da278ae2b9de6f1" gracePeriod=2 Dec 03 08:26:58 crc kubenswrapper[4475]: I1203 08:26:58.721597 4475 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-v6ts9" Dec 03 08:26:58 crc kubenswrapper[4475]: I1203 08:26:58.807245 4475 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-wgq5f" Dec 03 08:26:58 crc kubenswrapper[4475]: I1203 08:26:58.846730 4475 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-wgq5f" Dec 03 08:26:58 crc kubenswrapper[4475]: I1203 08:26:58.907277 4475 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c42070f7-24a1-4368-86a5-a2916f36fad7-utilities\") pod \"c42070f7-24a1-4368-86a5-a2916f36fad7\" (UID: \"c42070f7-24a1-4368-86a5-a2916f36fad7\") " Dec 03 08:26:58 crc kubenswrapper[4475]: I1203 08:26:58.907382 4475 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c42070f7-24a1-4368-86a5-a2916f36fad7-catalog-content\") pod \"c42070f7-24a1-4368-86a5-a2916f36fad7\" (UID: \"c42070f7-24a1-4368-86a5-a2916f36fad7\") " Dec 03 08:26:58 crc kubenswrapper[4475]: I1203 08:26:58.907442 4475 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lbcgv\" (UniqueName: \"kubernetes.io/projected/c42070f7-24a1-4368-86a5-a2916f36fad7-kube-api-access-lbcgv\") pod \"c42070f7-24a1-4368-86a5-a2916f36fad7\" (UID: \"c42070f7-24a1-4368-86a5-a2916f36fad7\") " Dec 03 08:26:58 crc kubenswrapper[4475]: I1203 08:26:58.908131 4475 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c42070f7-24a1-4368-86a5-a2916f36fad7-utilities" (OuterVolumeSpecName: "utilities") pod "c42070f7-24a1-4368-86a5-a2916f36fad7" (UID: "c42070f7-24a1-4368-86a5-a2916f36fad7"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 08:26:58 crc kubenswrapper[4475]: I1203 08:26:58.908408 4475 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c42070f7-24a1-4368-86a5-a2916f36fad7-utilities\") on node \"crc\" DevicePath \"\"" Dec 03 08:26:58 crc kubenswrapper[4475]: I1203 08:26:58.920581 4475 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c42070f7-24a1-4368-86a5-a2916f36fad7-kube-api-access-lbcgv" (OuterVolumeSpecName: "kube-api-access-lbcgv") pod "c42070f7-24a1-4368-86a5-a2916f36fad7" (UID: "c42070f7-24a1-4368-86a5-a2916f36fad7"). InnerVolumeSpecName "kube-api-access-lbcgv". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 08:26:58 crc kubenswrapper[4475]: I1203 08:26:58.946618 4475 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c42070f7-24a1-4368-86a5-a2916f36fad7-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "c42070f7-24a1-4368-86a5-a2916f36fad7" (UID: "c42070f7-24a1-4368-86a5-a2916f36fad7"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 08:26:59 crc kubenswrapper[4475]: I1203 08:26:59.011089 4475 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c42070f7-24a1-4368-86a5-a2916f36fad7-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 03 08:26:59 crc kubenswrapper[4475]: I1203 08:26:59.011121 4475 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lbcgv\" (UniqueName: \"kubernetes.io/projected/c42070f7-24a1-4368-86a5-a2916f36fad7-kube-api-access-lbcgv\") on node \"crc\" DevicePath \"\"" Dec 03 08:26:59 crc kubenswrapper[4475]: I1203 08:26:59.230194 4475 generic.go:334] "Generic (PLEG): container finished" podID="c42070f7-24a1-4368-86a5-a2916f36fad7" containerID="075a001d49e8f1574894217fed991605bec45a8a970620920da278ae2b9de6f1" exitCode=0 Dec 03 08:26:59 crc kubenswrapper[4475]: I1203 08:26:59.230299 4475 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-v6ts9" Dec 03 08:26:59 crc kubenswrapper[4475]: I1203 08:26:59.230413 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-v6ts9" event={"ID":"c42070f7-24a1-4368-86a5-a2916f36fad7","Type":"ContainerDied","Data":"075a001d49e8f1574894217fed991605bec45a8a970620920da278ae2b9de6f1"} Dec 03 08:26:59 crc kubenswrapper[4475]: I1203 08:26:59.230472 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-v6ts9" event={"ID":"c42070f7-24a1-4368-86a5-a2916f36fad7","Type":"ContainerDied","Data":"445360d90d05f7968d063e07cc25661d21f4864a39574214c486aa6f6eda61cf"} Dec 03 08:26:59 crc kubenswrapper[4475]: I1203 08:26:59.230500 4475 scope.go:117] "RemoveContainer" containerID="075a001d49e8f1574894217fed991605bec45a8a970620920da278ae2b9de6f1" Dec 03 08:26:59 crc kubenswrapper[4475]: I1203 08:26:59.269173 4475 scope.go:117] "RemoveContainer" 
containerID="0d8fcc8699723537068665eac5a0fe54fbb180d2665dc4923b0363d112bbb1c7" Dec 03 08:26:59 crc kubenswrapper[4475]: I1203 08:26:59.275256 4475 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-v6ts9"] Dec 03 08:26:59 crc kubenswrapper[4475]: I1203 08:26:59.285212 4475 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-v6ts9"] Dec 03 08:26:59 crc kubenswrapper[4475]: I1203 08:26:59.289190 4475 scope.go:117] "RemoveContainer" containerID="1c37d92430c112d3c0cbe6c9c911c5e2822056ea38d36949e452f74b38521676" Dec 03 08:26:59 crc kubenswrapper[4475]: I1203 08:26:59.326606 4475 scope.go:117] "RemoveContainer" containerID="075a001d49e8f1574894217fed991605bec45a8a970620920da278ae2b9de6f1" Dec 03 08:26:59 crc kubenswrapper[4475]: E1203 08:26:59.327417 4475 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"075a001d49e8f1574894217fed991605bec45a8a970620920da278ae2b9de6f1\": container with ID starting with 075a001d49e8f1574894217fed991605bec45a8a970620920da278ae2b9de6f1 not found: ID does not exist" containerID="075a001d49e8f1574894217fed991605bec45a8a970620920da278ae2b9de6f1" Dec 03 08:26:59 crc kubenswrapper[4475]: I1203 08:26:59.327447 4475 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"075a001d49e8f1574894217fed991605bec45a8a970620920da278ae2b9de6f1"} err="failed to get container status \"075a001d49e8f1574894217fed991605bec45a8a970620920da278ae2b9de6f1\": rpc error: code = NotFound desc = could not find container \"075a001d49e8f1574894217fed991605bec45a8a970620920da278ae2b9de6f1\": container with ID starting with 075a001d49e8f1574894217fed991605bec45a8a970620920da278ae2b9de6f1 not found: ID does not exist" Dec 03 08:26:59 crc kubenswrapper[4475]: I1203 08:26:59.327482 4475 scope.go:117] "RemoveContainer" 
containerID="0d8fcc8699723537068665eac5a0fe54fbb180d2665dc4923b0363d112bbb1c7" Dec 03 08:26:59 crc kubenswrapper[4475]: E1203 08:26:59.327801 4475 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0d8fcc8699723537068665eac5a0fe54fbb180d2665dc4923b0363d112bbb1c7\": container with ID starting with 0d8fcc8699723537068665eac5a0fe54fbb180d2665dc4923b0363d112bbb1c7 not found: ID does not exist" containerID="0d8fcc8699723537068665eac5a0fe54fbb180d2665dc4923b0363d112bbb1c7" Dec 03 08:26:59 crc kubenswrapper[4475]: I1203 08:26:59.327823 4475 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0d8fcc8699723537068665eac5a0fe54fbb180d2665dc4923b0363d112bbb1c7"} err="failed to get container status \"0d8fcc8699723537068665eac5a0fe54fbb180d2665dc4923b0363d112bbb1c7\": rpc error: code = NotFound desc = could not find container \"0d8fcc8699723537068665eac5a0fe54fbb180d2665dc4923b0363d112bbb1c7\": container with ID starting with 0d8fcc8699723537068665eac5a0fe54fbb180d2665dc4923b0363d112bbb1c7 not found: ID does not exist" Dec 03 08:26:59 crc kubenswrapper[4475]: I1203 08:26:59.327836 4475 scope.go:117] "RemoveContainer" containerID="1c37d92430c112d3c0cbe6c9c911c5e2822056ea38d36949e452f74b38521676" Dec 03 08:26:59 crc kubenswrapper[4475]: E1203 08:26:59.328099 4475 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1c37d92430c112d3c0cbe6c9c911c5e2822056ea38d36949e452f74b38521676\": container with ID starting with 1c37d92430c112d3c0cbe6c9c911c5e2822056ea38d36949e452f74b38521676 not found: ID does not exist" containerID="1c37d92430c112d3c0cbe6c9c911c5e2822056ea38d36949e452f74b38521676" Dec 03 08:26:59 crc kubenswrapper[4475]: I1203 08:26:59.328124 4475 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"1c37d92430c112d3c0cbe6c9c911c5e2822056ea38d36949e452f74b38521676"} err="failed to get container status \"1c37d92430c112d3c0cbe6c9c911c5e2822056ea38d36949e452f74b38521676\": rpc error: code = NotFound desc = could not find container \"1c37d92430c112d3c0cbe6c9c911c5e2822056ea38d36949e452f74b38521676\": container with ID starting with 1c37d92430c112d3c0cbe6c9c911c5e2822056ea38d36949e452f74b38521676 not found: ID does not exist" Dec 03 08:26:59 crc kubenswrapper[4475]: I1203 08:26:59.507306 4475 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c42070f7-24a1-4368-86a5-a2916f36fad7" path="/var/lib/kubelet/pods/c42070f7-24a1-4368-86a5-a2916f36fad7/volumes" Dec 03 08:27:01 crc kubenswrapper[4475]: I1203 08:27:01.031372 4475 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-wgq5f"] Dec 03 08:27:01 crc kubenswrapper[4475]: I1203 08:27:01.031846 4475 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-wgq5f" podUID="044b0d1b-a43f-477d-b7c9-a229a91712ac" containerName="registry-server" containerID="cri-o://8b13e025852ea1f3d4562fee0817f8707c2ff8c56da79c24e28823b7f419174d" gracePeriod=2 Dec 03 08:27:01 crc kubenswrapper[4475]: I1203 08:27:01.260773 4475 generic.go:334] "Generic (PLEG): container finished" podID="044b0d1b-a43f-477d-b7c9-a229a91712ac" containerID="8b13e025852ea1f3d4562fee0817f8707c2ff8c56da79c24e28823b7f419174d" exitCode=0 Dec 03 08:27:01 crc kubenswrapper[4475]: I1203 08:27:01.260818 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-wgq5f" event={"ID":"044b0d1b-a43f-477d-b7c9-a229a91712ac","Type":"ContainerDied","Data":"8b13e025852ea1f3d4562fee0817f8707c2ff8c56da79c24e28823b7f419174d"} Dec 03 08:27:01 crc kubenswrapper[4475]: I1203 08:27:01.491788 4475 scope.go:117] "RemoveContainer" 
containerID="ead2b8ca071f1b77f440239ac4c143069a67fc366c29018f3071d24754d8703b" Dec 03 08:27:01 crc kubenswrapper[4475]: E1203 08:27:01.492692 4475 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-tjbzg_openshift-machine-config-operator(91aee7be-4a52-4598-803f-2deebe0674de)\"" pod="openshift-machine-config-operator/machine-config-daemon-tjbzg" podUID="91aee7be-4a52-4598-803f-2deebe0674de" Dec 03 08:27:01 crc kubenswrapper[4475]: I1203 08:27:01.504611 4475 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-wgq5f" Dec 03 08:27:01 crc kubenswrapper[4475]: I1203 08:27:01.577789 4475 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-m56nw\" (UniqueName: \"kubernetes.io/projected/044b0d1b-a43f-477d-b7c9-a229a91712ac-kube-api-access-m56nw\") pod \"044b0d1b-a43f-477d-b7c9-a229a91712ac\" (UID: \"044b0d1b-a43f-477d-b7c9-a229a91712ac\") " Dec 03 08:27:01 crc kubenswrapper[4475]: I1203 08:27:01.577856 4475 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/044b0d1b-a43f-477d-b7c9-a229a91712ac-utilities\") pod \"044b0d1b-a43f-477d-b7c9-a229a91712ac\" (UID: \"044b0d1b-a43f-477d-b7c9-a229a91712ac\") " Dec 03 08:27:01 crc kubenswrapper[4475]: I1203 08:27:01.578266 4475 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/044b0d1b-a43f-477d-b7c9-a229a91712ac-catalog-content\") pod \"044b0d1b-a43f-477d-b7c9-a229a91712ac\" (UID: \"044b0d1b-a43f-477d-b7c9-a229a91712ac\") " Dec 03 08:27:01 crc kubenswrapper[4475]: I1203 08:27:01.579131 4475 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/empty-dir/044b0d1b-a43f-477d-b7c9-a229a91712ac-utilities" (OuterVolumeSpecName: "utilities") pod "044b0d1b-a43f-477d-b7c9-a229a91712ac" (UID: "044b0d1b-a43f-477d-b7c9-a229a91712ac"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 08:27:01 crc kubenswrapper[4475]: I1203 08:27:01.580025 4475 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/044b0d1b-a43f-477d-b7c9-a229a91712ac-utilities\") on node \"crc\" DevicePath \"\"" Dec 03 08:27:01 crc kubenswrapper[4475]: I1203 08:27:01.589881 4475 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/044b0d1b-a43f-477d-b7c9-a229a91712ac-kube-api-access-m56nw" (OuterVolumeSpecName: "kube-api-access-m56nw") pod "044b0d1b-a43f-477d-b7c9-a229a91712ac" (UID: "044b0d1b-a43f-477d-b7c9-a229a91712ac"). InnerVolumeSpecName "kube-api-access-m56nw". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 08:27:01 crc kubenswrapper[4475]: I1203 08:27:01.681485 4475 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/044b0d1b-a43f-477d-b7c9-a229a91712ac-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "044b0d1b-a43f-477d-b7c9-a229a91712ac" (UID: "044b0d1b-a43f-477d-b7c9-a229a91712ac"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 08:27:01 crc kubenswrapper[4475]: I1203 08:27:01.681734 4475 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/044b0d1b-a43f-477d-b7c9-a229a91712ac-catalog-content\") pod \"044b0d1b-a43f-477d-b7c9-a229a91712ac\" (UID: \"044b0d1b-a43f-477d-b7c9-a229a91712ac\") " Dec 03 08:27:01 crc kubenswrapper[4475]: W1203 08:27:01.682419 4475 empty_dir.go:500] Warning: Unmount skipped because path does not exist: /var/lib/kubelet/pods/044b0d1b-a43f-477d-b7c9-a229a91712ac/volumes/kubernetes.io~empty-dir/catalog-content Dec 03 08:27:01 crc kubenswrapper[4475]: I1203 08:27:01.682520 4475 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/044b0d1b-a43f-477d-b7c9-a229a91712ac-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "044b0d1b-a43f-477d-b7c9-a229a91712ac" (UID: "044b0d1b-a43f-477d-b7c9-a229a91712ac"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 08:27:01 crc kubenswrapper[4475]: I1203 08:27:01.682562 4475 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-m56nw\" (UniqueName: \"kubernetes.io/projected/044b0d1b-a43f-477d-b7c9-a229a91712ac-kube-api-access-m56nw\") on node \"crc\" DevicePath \"\"" Dec 03 08:27:01 crc kubenswrapper[4475]: I1203 08:27:01.784361 4475 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/044b0d1b-a43f-477d-b7c9-a229a91712ac-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 03 08:27:02 crc kubenswrapper[4475]: I1203 08:27:02.274618 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-wgq5f" event={"ID":"044b0d1b-a43f-477d-b7c9-a229a91712ac","Type":"ContainerDied","Data":"58002f14c2ef7ed28c1db19f179585a54d0d203dd81997a870dab8a8e1709426"} Dec 03 08:27:02 crc kubenswrapper[4475]: I1203 08:27:02.275256 4475 scope.go:117] "RemoveContainer" containerID="8b13e025852ea1f3d4562fee0817f8707c2ff8c56da79c24e28823b7f419174d" Dec 03 08:27:02 crc kubenswrapper[4475]: I1203 08:27:02.274998 4475 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-wgq5f" Dec 03 08:27:02 crc kubenswrapper[4475]: I1203 08:27:02.304885 4475 scope.go:117] "RemoveContainer" containerID="510faa6da5efc0c1d6a522cc1828ae2e805e6cb679b80b1948656ce2baf9328f" Dec 03 08:27:02 crc kubenswrapper[4475]: I1203 08:27:02.313040 4475 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-wgq5f"] Dec 03 08:27:02 crc kubenswrapper[4475]: I1203 08:27:02.322185 4475 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-wgq5f"] Dec 03 08:27:02 crc kubenswrapper[4475]: I1203 08:27:02.326064 4475 scope.go:117] "RemoveContainer" containerID="dbaeebfd2fcd25185b57bcb498e0103f1c1f6e5ae78bde3ab5a9f0bc1ae9aac9" Dec 03 08:27:03 crc kubenswrapper[4475]: I1203 08:27:03.502901 4475 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="044b0d1b-a43f-477d-b7c9-a229a91712ac" path="/var/lib/kubelet/pods/044b0d1b-a43f-477d-b7c9-a229a91712ac/volumes" Dec 03 08:27:13 crc kubenswrapper[4475]: I1203 08:27:13.492230 4475 scope.go:117] "RemoveContainer" containerID="ead2b8ca071f1b77f440239ac4c143069a67fc366c29018f3071d24754d8703b" Dec 03 08:27:13 crc kubenswrapper[4475]: E1203 08:27:13.493248 4475 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-tjbzg_openshift-machine-config-operator(91aee7be-4a52-4598-803f-2deebe0674de)\"" pod="openshift-machine-config-operator/machine-config-daemon-tjbzg" podUID="91aee7be-4a52-4598-803f-2deebe0674de" Dec 03 08:27:27 crc kubenswrapper[4475]: I1203 08:27:27.492429 4475 scope.go:117] "RemoveContainer" containerID="ead2b8ca071f1b77f440239ac4c143069a67fc366c29018f3071d24754d8703b" Dec 03 08:27:27 crc kubenswrapper[4475]: E1203 08:27:27.494992 4475 pod_workers.go:1301] "Error syncing pod, skipping" 
err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-tjbzg_openshift-machine-config-operator(91aee7be-4a52-4598-803f-2deebe0674de)\"" pod="openshift-machine-config-operator/machine-config-daemon-tjbzg" podUID="91aee7be-4a52-4598-803f-2deebe0674de" Dec 03 08:27:39 crc kubenswrapper[4475]: I1203 08:27:39.491897 4475 scope.go:117] "RemoveContainer" containerID="ead2b8ca071f1b77f440239ac4c143069a67fc366c29018f3071d24754d8703b" Dec 03 08:27:39 crc kubenswrapper[4475]: E1203 08:27:39.492781 4475 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-tjbzg_openshift-machine-config-operator(91aee7be-4a52-4598-803f-2deebe0674de)\"" pod="openshift-machine-config-operator/machine-config-daemon-tjbzg" podUID="91aee7be-4a52-4598-803f-2deebe0674de" Dec 03 08:27:52 crc kubenswrapper[4475]: I1203 08:27:52.491641 4475 scope.go:117] "RemoveContainer" containerID="ead2b8ca071f1b77f440239ac4c143069a67fc366c29018f3071d24754d8703b" Dec 03 08:27:52 crc kubenswrapper[4475]: E1203 08:27:52.492635 4475 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-tjbzg_openshift-machine-config-operator(91aee7be-4a52-4598-803f-2deebe0674de)\"" pod="openshift-machine-config-operator/machine-config-daemon-tjbzg" podUID="91aee7be-4a52-4598-803f-2deebe0674de" Dec 03 08:28:05 crc kubenswrapper[4475]: I1203 08:28:05.499215 4475 scope.go:117] "RemoveContainer" containerID="ead2b8ca071f1b77f440239ac4c143069a67fc366c29018f3071d24754d8703b" Dec 03 08:28:05 crc kubenswrapper[4475]: E1203 08:28:05.502407 4475 pod_workers.go:1301] "Error 
syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-tjbzg_openshift-machine-config-operator(91aee7be-4a52-4598-803f-2deebe0674de)\"" pod="openshift-machine-config-operator/machine-config-daemon-tjbzg" podUID="91aee7be-4a52-4598-803f-2deebe0674de" Dec 03 08:28:18 crc kubenswrapper[4475]: I1203 08:28:18.491655 4475 scope.go:117] "RemoveContainer" containerID="ead2b8ca071f1b77f440239ac4c143069a67fc366c29018f3071d24754d8703b" Dec 03 08:28:18 crc kubenswrapper[4475]: E1203 08:28:18.492512 4475 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-tjbzg_openshift-machine-config-operator(91aee7be-4a52-4598-803f-2deebe0674de)\"" pod="openshift-machine-config-operator/machine-config-daemon-tjbzg" podUID="91aee7be-4a52-4598-803f-2deebe0674de" Dec 03 08:28:32 crc kubenswrapper[4475]: I1203 08:28:32.491646 4475 scope.go:117] "RemoveContainer" containerID="ead2b8ca071f1b77f440239ac4c143069a67fc366c29018f3071d24754d8703b" Dec 03 08:28:32 crc kubenswrapper[4475]: E1203 08:28:32.492399 4475 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-tjbzg_openshift-machine-config-operator(91aee7be-4a52-4598-803f-2deebe0674de)\"" pod="openshift-machine-config-operator/machine-config-daemon-tjbzg" podUID="91aee7be-4a52-4598-803f-2deebe0674de" Dec 03 08:28:43 crc kubenswrapper[4475]: I1203 08:28:43.491655 4475 scope.go:117] "RemoveContainer" containerID="ead2b8ca071f1b77f440239ac4c143069a67fc366c29018f3071d24754d8703b" Dec 03 08:28:43 crc kubenswrapper[4475]: E1203 08:28:43.492567 4475 
pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-tjbzg_openshift-machine-config-operator(91aee7be-4a52-4598-803f-2deebe0674de)\"" pod="openshift-machine-config-operator/machine-config-daemon-tjbzg" podUID="91aee7be-4a52-4598-803f-2deebe0674de" Dec 03 08:28:56 crc kubenswrapper[4475]: I1203 08:28:56.492235 4475 scope.go:117] "RemoveContainer" containerID="ead2b8ca071f1b77f440239ac4c143069a67fc366c29018f3071d24754d8703b" Dec 03 08:28:56 crc kubenswrapper[4475]: E1203 08:28:56.493244 4475 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-tjbzg_openshift-machine-config-operator(91aee7be-4a52-4598-803f-2deebe0674de)\"" pod="openshift-machine-config-operator/machine-config-daemon-tjbzg" podUID="91aee7be-4a52-4598-803f-2deebe0674de" Dec 03 08:29:08 crc kubenswrapper[4475]: I1203 08:29:08.491855 4475 scope.go:117] "RemoveContainer" containerID="ead2b8ca071f1b77f440239ac4c143069a67fc366c29018f3071d24754d8703b" Dec 03 08:29:08 crc kubenswrapper[4475]: E1203 08:29:08.492539 4475 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-tjbzg_openshift-machine-config-operator(91aee7be-4a52-4598-803f-2deebe0674de)\"" pod="openshift-machine-config-operator/machine-config-daemon-tjbzg" podUID="91aee7be-4a52-4598-803f-2deebe0674de" Dec 03 08:29:19 crc kubenswrapper[4475]: I1203 08:29:19.491284 4475 scope.go:117] "RemoveContainer" containerID="ead2b8ca071f1b77f440239ac4c143069a67fc366c29018f3071d24754d8703b" Dec 03 08:29:19 crc kubenswrapper[4475]: E1203 
08:29:19.491870 4475 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-tjbzg_openshift-machine-config-operator(91aee7be-4a52-4598-803f-2deebe0674de)\"" pod="openshift-machine-config-operator/machine-config-daemon-tjbzg" podUID="91aee7be-4a52-4598-803f-2deebe0674de" Dec 03 08:29:33 crc kubenswrapper[4475]: I1203 08:29:33.492665 4475 scope.go:117] "RemoveContainer" containerID="ead2b8ca071f1b77f440239ac4c143069a67fc366c29018f3071d24754d8703b" Dec 03 08:29:33 crc kubenswrapper[4475]: E1203 08:29:33.493592 4475 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-tjbzg_openshift-machine-config-operator(91aee7be-4a52-4598-803f-2deebe0674de)\"" pod="openshift-machine-config-operator/machine-config-daemon-tjbzg" podUID="91aee7be-4a52-4598-803f-2deebe0674de" Dec 03 08:29:46 crc kubenswrapper[4475]: I1203 08:29:46.491050 4475 scope.go:117] "RemoveContainer" containerID="ead2b8ca071f1b77f440239ac4c143069a67fc366c29018f3071d24754d8703b" Dec 03 08:29:46 crc kubenswrapper[4475]: E1203 08:29:46.491823 4475 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-tjbzg_openshift-machine-config-operator(91aee7be-4a52-4598-803f-2deebe0674de)\"" pod="openshift-machine-config-operator/machine-config-daemon-tjbzg" podUID="91aee7be-4a52-4598-803f-2deebe0674de" Dec 03 08:29:59 crc kubenswrapper[4475]: I1203 08:29:59.490951 4475 scope.go:117] "RemoveContainer" containerID="ead2b8ca071f1b77f440239ac4c143069a67fc366c29018f3071d24754d8703b" Dec 03 08:29:59 crc 
kubenswrapper[4475]: E1203 08:29:59.491584 4475 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-tjbzg_openshift-machine-config-operator(91aee7be-4a52-4598-803f-2deebe0674de)\"" pod="openshift-machine-config-operator/machine-config-daemon-tjbzg" podUID="91aee7be-4a52-4598-803f-2deebe0674de" Dec 03 08:30:00 crc kubenswrapper[4475]: I1203 08:30:00.144058 4475 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29412510-5blq8"] Dec 03 08:30:00 crc kubenswrapper[4475]: E1203 08:30:00.144656 4475 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ad7e6cbb-5c4f-4b23-8649-5b240b7b54fe" containerName="extract-content" Dec 03 08:30:00 crc kubenswrapper[4475]: I1203 08:30:00.144684 4475 state_mem.go:107] "Deleted CPUSet assignment" podUID="ad7e6cbb-5c4f-4b23-8649-5b240b7b54fe" containerName="extract-content" Dec 03 08:30:00 crc kubenswrapper[4475]: E1203 08:30:00.144694 4475 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="044b0d1b-a43f-477d-b7c9-a229a91712ac" containerName="extract-content" Dec 03 08:30:00 crc kubenswrapper[4475]: I1203 08:30:00.144699 4475 state_mem.go:107] "Deleted CPUSet assignment" podUID="044b0d1b-a43f-477d-b7c9-a229a91712ac" containerName="extract-content" Dec 03 08:30:00 crc kubenswrapper[4475]: E1203 08:30:00.144719 4475 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c42070f7-24a1-4368-86a5-a2916f36fad7" containerName="extract-utilities" Dec 03 08:30:00 crc kubenswrapper[4475]: I1203 08:30:00.144725 4475 state_mem.go:107] "Deleted CPUSet assignment" podUID="c42070f7-24a1-4368-86a5-a2916f36fad7" containerName="extract-utilities" Dec 03 08:30:00 crc kubenswrapper[4475]: E1203 08:30:00.144742 4475 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="044b0d1b-a43f-477d-b7c9-a229a91712ac" containerName="extract-utilities" Dec 03 08:30:00 crc kubenswrapper[4475]: I1203 08:30:00.144748 4475 state_mem.go:107] "Deleted CPUSet assignment" podUID="044b0d1b-a43f-477d-b7c9-a229a91712ac" containerName="extract-utilities" Dec 03 08:30:00 crc kubenswrapper[4475]: E1203 08:30:00.144754 4475 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c42070f7-24a1-4368-86a5-a2916f36fad7" containerName="extract-content" Dec 03 08:30:00 crc kubenswrapper[4475]: I1203 08:30:00.144759 4475 state_mem.go:107] "Deleted CPUSet assignment" podUID="c42070f7-24a1-4368-86a5-a2916f36fad7" containerName="extract-content" Dec 03 08:30:00 crc kubenswrapper[4475]: E1203 08:30:00.144773 4475 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="044b0d1b-a43f-477d-b7c9-a229a91712ac" containerName="registry-server" Dec 03 08:30:00 crc kubenswrapper[4475]: I1203 08:30:00.144778 4475 state_mem.go:107] "Deleted CPUSet assignment" podUID="044b0d1b-a43f-477d-b7c9-a229a91712ac" containerName="registry-server" Dec 03 08:30:00 crc kubenswrapper[4475]: E1203 08:30:00.144790 4475 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ad7e6cbb-5c4f-4b23-8649-5b240b7b54fe" containerName="extract-utilities" Dec 03 08:30:00 crc kubenswrapper[4475]: I1203 08:30:00.144795 4475 state_mem.go:107] "Deleted CPUSet assignment" podUID="ad7e6cbb-5c4f-4b23-8649-5b240b7b54fe" containerName="extract-utilities" Dec 03 08:30:00 crc kubenswrapper[4475]: E1203 08:30:00.144804 4475 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c42070f7-24a1-4368-86a5-a2916f36fad7" containerName="registry-server" Dec 03 08:30:00 crc kubenswrapper[4475]: I1203 08:30:00.144810 4475 state_mem.go:107] "Deleted CPUSet assignment" podUID="c42070f7-24a1-4368-86a5-a2916f36fad7" containerName="registry-server" Dec 03 08:30:00 crc kubenswrapper[4475]: E1203 08:30:00.144817 4475 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="ad7e6cbb-5c4f-4b23-8649-5b240b7b54fe" containerName="registry-server" Dec 03 08:30:00 crc kubenswrapper[4475]: I1203 08:30:00.144822 4475 state_mem.go:107] "Deleted CPUSet assignment" podUID="ad7e6cbb-5c4f-4b23-8649-5b240b7b54fe" containerName="registry-server" Dec 03 08:30:00 crc kubenswrapper[4475]: I1203 08:30:00.145018 4475 memory_manager.go:354] "RemoveStaleState removing state" podUID="044b0d1b-a43f-477d-b7c9-a229a91712ac" containerName="registry-server" Dec 03 08:30:00 crc kubenswrapper[4475]: I1203 08:30:00.145045 4475 memory_manager.go:354] "RemoveStaleState removing state" podUID="c42070f7-24a1-4368-86a5-a2916f36fad7" containerName="registry-server" Dec 03 08:30:00 crc kubenswrapper[4475]: I1203 08:30:00.145055 4475 memory_manager.go:354] "RemoveStaleState removing state" podUID="ad7e6cbb-5c4f-4b23-8649-5b240b7b54fe" containerName="registry-server" Dec 03 08:30:00 crc kubenswrapper[4475]: I1203 08:30:00.145642 4475 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29412510-5blq8" Dec 03 08:30:00 crc kubenswrapper[4475]: I1203 08:30:00.150994 4475 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29412510-5blq8"] Dec 03 08:30:00 crc kubenswrapper[4475]: I1203 08:30:00.152442 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/102ffa42-f64f-4310-aa9d-a379c7b239c3-secret-volume\") pod \"collect-profiles-29412510-5blq8\" (UID: \"102ffa42-f64f-4310-aa9d-a379c7b239c3\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29412510-5blq8" Dec 03 08:30:00 crc kubenswrapper[4475]: I1203 08:30:00.152566 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2g5nl\" (UniqueName: 
\"kubernetes.io/projected/102ffa42-f64f-4310-aa9d-a379c7b239c3-kube-api-access-2g5nl\") pod \"collect-profiles-29412510-5blq8\" (UID: \"102ffa42-f64f-4310-aa9d-a379c7b239c3\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29412510-5blq8" Dec 03 08:30:00 crc kubenswrapper[4475]: I1203 08:30:00.152599 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/102ffa42-f64f-4310-aa9d-a379c7b239c3-config-volume\") pod \"collect-profiles-29412510-5blq8\" (UID: \"102ffa42-f64f-4310-aa9d-a379c7b239c3\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29412510-5blq8" Dec 03 08:30:00 crc kubenswrapper[4475]: I1203 08:30:00.152661 4475 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Dec 03 08:30:00 crc kubenswrapper[4475]: I1203 08:30:00.152667 4475 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Dec 03 08:30:00 crc kubenswrapper[4475]: I1203 08:30:00.254301 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/102ffa42-f64f-4310-aa9d-a379c7b239c3-secret-volume\") pod \"collect-profiles-29412510-5blq8\" (UID: \"102ffa42-f64f-4310-aa9d-a379c7b239c3\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29412510-5blq8" Dec 03 08:30:00 crc kubenswrapper[4475]: I1203 08:30:00.254431 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2g5nl\" (UniqueName: \"kubernetes.io/projected/102ffa42-f64f-4310-aa9d-a379c7b239c3-kube-api-access-2g5nl\") pod \"collect-profiles-29412510-5blq8\" (UID: \"102ffa42-f64f-4310-aa9d-a379c7b239c3\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29412510-5blq8" Dec 03 08:30:00 crc kubenswrapper[4475]: I1203 
08:30:00.254477 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/102ffa42-f64f-4310-aa9d-a379c7b239c3-config-volume\") pod \"collect-profiles-29412510-5blq8\" (UID: \"102ffa42-f64f-4310-aa9d-a379c7b239c3\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29412510-5blq8" Dec 03 08:30:00 crc kubenswrapper[4475]: I1203 08:30:00.255200 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/102ffa42-f64f-4310-aa9d-a379c7b239c3-config-volume\") pod \"collect-profiles-29412510-5blq8\" (UID: \"102ffa42-f64f-4310-aa9d-a379c7b239c3\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29412510-5blq8" Dec 03 08:30:00 crc kubenswrapper[4475]: I1203 08:30:00.259295 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/102ffa42-f64f-4310-aa9d-a379c7b239c3-secret-volume\") pod \"collect-profiles-29412510-5blq8\" (UID: \"102ffa42-f64f-4310-aa9d-a379c7b239c3\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29412510-5blq8" Dec 03 08:30:00 crc kubenswrapper[4475]: I1203 08:30:00.268833 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2g5nl\" (UniqueName: \"kubernetes.io/projected/102ffa42-f64f-4310-aa9d-a379c7b239c3-kube-api-access-2g5nl\") pod \"collect-profiles-29412510-5blq8\" (UID: \"102ffa42-f64f-4310-aa9d-a379c7b239c3\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29412510-5blq8" Dec 03 08:30:00 crc kubenswrapper[4475]: I1203 08:30:00.461586 4475 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29412510-5blq8" Dec 03 08:30:00 crc kubenswrapper[4475]: I1203 08:30:00.882386 4475 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29412510-5blq8"] Dec 03 08:30:01 crc kubenswrapper[4475]: I1203 08:30:01.715206 4475 generic.go:334] "Generic (PLEG): container finished" podID="102ffa42-f64f-4310-aa9d-a379c7b239c3" containerID="5cc0f044e8bbf5a9cb4daadb48db50b5864597342a0761b9de48cd66e5a39920" exitCode=0 Dec 03 08:30:01 crc kubenswrapper[4475]: I1203 08:30:01.715298 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29412510-5blq8" event={"ID":"102ffa42-f64f-4310-aa9d-a379c7b239c3","Type":"ContainerDied","Data":"5cc0f044e8bbf5a9cb4daadb48db50b5864597342a0761b9de48cd66e5a39920"} Dec 03 08:30:01 crc kubenswrapper[4475]: I1203 08:30:01.715607 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29412510-5blq8" event={"ID":"102ffa42-f64f-4310-aa9d-a379c7b239c3","Type":"ContainerStarted","Data":"a25b7e4f42f92e7d32e28327129a187bfba368cd8de8e97eda8dbc7623165ce0"} Dec 03 08:30:02 crc kubenswrapper[4475]: I1203 08:30:02.991948 4475 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29412510-5blq8" Dec 03 08:30:02 crc kubenswrapper[4475]: I1203 08:30:02.995089 4475 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2g5nl\" (UniqueName: \"kubernetes.io/projected/102ffa42-f64f-4310-aa9d-a379c7b239c3-kube-api-access-2g5nl\") pod \"102ffa42-f64f-4310-aa9d-a379c7b239c3\" (UID: \"102ffa42-f64f-4310-aa9d-a379c7b239c3\") " Dec 03 08:30:02 crc kubenswrapper[4475]: I1203 08:30:02.995159 4475 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/102ffa42-f64f-4310-aa9d-a379c7b239c3-secret-volume\") pod \"102ffa42-f64f-4310-aa9d-a379c7b239c3\" (UID: \"102ffa42-f64f-4310-aa9d-a379c7b239c3\") " Dec 03 08:30:02 crc kubenswrapper[4475]: I1203 08:30:02.995211 4475 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/102ffa42-f64f-4310-aa9d-a379c7b239c3-config-volume\") pod \"102ffa42-f64f-4310-aa9d-a379c7b239c3\" (UID: \"102ffa42-f64f-4310-aa9d-a379c7b239c3\") " Dec 03 08:30:02 crc kubenswrapper[4475]: I1203 08:30:02.995709 4475 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/102ffa42-f64f-4310-aa9d-a379c7b239c3-config-volume" (OuterVolumeSpecName: "config-volume") pod "102ffa42-f64f-4310-aa9d-a379c7b239c3" (UID: "102ffa42-f64f-4310-aa9d-a379c7b239c3"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 08:30:03 crc kubenswrapper[4475]: I1203 08:30:03.000197 4475 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/102ffa42-f64f-4310-aa9d-a379c7b239c3-kube-api-access-2g5nl" (OuterVolumeSpecName: "kube-api-access-2g5nl") pod "102ffa42-f64f-4310-aa9d-a379c7b239c3" (UID: "102ffa42-f64f-4310-aa9d-a379c7b239c3"). 
InnerVolumeSpecName "kube-api-access-2g5nl". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 08:30:03 crc kubenswrapper[4475]: I1203 08:30:03.000346 4475 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/102ffa42-f64f-4310-aa9d-a379c7b239c3-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "102ffa42-f64f-4310-aa9d-a379c7b239c3" (UID: "102ffa42-f64f-4310-aa9d-a379c7b239c3"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 08:30:03 crc kubenswrapper[4475]: I1203 08:30:03.096781 4475 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2g5nl\" (UniqueName: \"kubernetes.io/projected/102ffa42-f64f-4310-aa9d-a379c7b239c3-kube-api-access-2g5nl\") on node \"crc\" DevicePath \"\"" Dec 03 08:30:03 crc kubenswrapper[4475]: I1203 08:30:03.096984 4475 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/102ffa42-f64f-4310-aa9d-a379c7b239c3-secret-volume\") on node \"crc\" DevicePath \"\"" Dec 03 08:30:03 crc kubenswrapper[4475]: I1203 08:30:03.096993 4475 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/102ffa42-f64f-4310-aa9d-a379c7b239c3-config-volume\") on node \"crc\" DevicePath \"\"" Dec 03 08:30:03 crc kubenswrapper[4475]: E1203 08:30:03.576607 4475 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod102ffa42_f64f_4310_aa9d_a379c7b239c3.slice\": RecentStats: unable to find data in memory cache]" Dec 03 08:30:03 crc kubenswrapper[4475]: I1203 08:30:03.729146 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29412510-5blq8" 
event={"ID":"102ffa42-f64f-4310-aa9d-a379c7b239c3","Type":"ContainerDied","Data":"a25b7e4f42f92e7d32e28327129a187bfba368cd8de8e97eda8dbc7623165ce0"} Dec 03 08:30:03 crc kubenswrapper[4475]: I1203 08:30:03.729180 4475 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a25b7e4f42f92e7d32e28327129a187bfba368cd8de8e97eda8dbc7623165ce0" Dec 03 08:30:03 crc kubenswrapper[4475]: I1203 08:30:03.729197 4475 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29412510-5blq8" Dec 03 08:30:04 crc kubenswrapper[4475]: I1203 08:30:04.066925 4475 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29412465-q87w9"] Dec 03 08:30:04 crc kubenswrapper[4475]: I1203 08:30:04.073950 4475 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29412465-q87w9"] Dec 03 08:30:05 crc kubenswrapper[4475]: I1203 08:30:05.501617 4475 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0de99f57-7c36-4c39-8f80-20bd559c9757" path="/var/lib/kubelet/pods/0de99f57-7c36-4c39-8f80-20bd559c9757/volumes" Dec 03 08:30:11 crc kubenswrapper[4475]: I1203 08:30:11.492685 4475 scope.go:117] "RemoveContainer" containerID="ead2b8ca071f1b77f440239ac4c143069a67fc366c29018f3071d24754d8703b" Dec 03 08:30:11 crc kubenswrapper[4475]: E1203 08:30:11.493551 4475 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-tjbzg_openshift-machine-config-operator(91aee7be-4a52-4598-803f-2deebe0674de)\"" pod="openshift-machine-config-operator/machine-config-daemon-tjbzg" podUID="91aee7be-4a52-4598-803f-2deebe0674de" Dec 03 08:30:26 crc kubenswrapper[4475]: I1203 08:30:26.492066 4475 scope.go:117] "RemoveContainer" 
containerID="ead2b8ca071f1b77f440239ac4c143069a67fc366c29018f3071d24754d8703b" Dec 03 08:30:26 crc kubenswrapper[4475]: E1203 08:30:26.493034 4475 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-tjbzg_openshift-machine-config-operator(91aee7be-4a52-4598-803f-2deebe0674de)\"" pod="openshift-machine-config-operator/machine-config-daemon-tjbzg" podUID="91aee7be-4a52-4598-803f-2deebe0674de" Dec 03 08:30:30 crc kubenswrapper[4475]: I1203 08:30:30.736972 4475 scope.go:117] "RemoveContainer" containerID="536148c65f7868734a70346ed943fb935f828740cd0ddff963a8d1b3fae66624" Dec 03 08:30:41 crc kubenswrapper[4475]: I1203 08:30:41.491895 4475 scope.go:117] "RemoveContainer" containerID="ead2b8ca071f1b77f440239ac4c143069a67fc366c29018f3071d24754d8703b" Dec 03 08:30:42 crc kubenswrapper[4475]: I1203 08:30:42.085313 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-tjbzg" event={"ID":"91aee7be-4a52-4598-803f-2deebe0674de","Type":"ContainerStarted","Data":"5e4bdfeb75b6a6f33140859fefdf48e5e0055c78f6936aca52f1be6e00372076"} Dec 03 08:32:58 crc kubenswrapper[4475]: I1203 08:32:58.933085 4475 patch_prober.go:28] interesting pod/machine-config-daemon-tjbzg container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 03 08:32:58 crc kubenswrapper[4475]: I1203 08:32:58.934047 4475 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-tjbzg" podUID="91aee7be-4a52-4598-803f-2deebe0674de" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: 
connection refused" Dec 03 08:33:28 crc kubenswrapper[4475]: I1203 08:33:28.933279 4475 patch_prober.go:28] interesting pod/machine-config-daemon-tjbzg container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 03 08:33:28 crc kubenswrapper[4475]: I1203 08:33:28.933967 4475 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-tjbzg" podUID="91aee7be-4a52-4598-803f-2deebe0674de" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 03 08:33:31 crc kubenswrapper[4475]: I1203 08:33:31.000005 4475 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-bs85v"] Dec 03 08:33:31 crc kubenswrapper[4475]: E1203 08:33:31.001376 4475 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="102ffa42-f64f-4310-aa9d-a379c7b239c3" containerName="collect-profiles" Dec 03 08:33:31 crc kubenswrapper[4475]: I1203 08:33:31.001690 4475 state_mem.go:107] "Deleted CPUSet assignment" podUID="102ffa42-f64f-4310-aa9d-a379c7b239c3" containerName="collect-profiles" Dec 03 08:33:31 crc kubenswrapper[4475]: I1203 08:33:31.002233 4475 memory_manager.go:354] "RemoveStaleState removing state" podUID="102ffa42-f64f-4310-aa9d-a379c7b239c3" containerName="collect-profiles" Dec 03 08:33:31 crc kubenswrapper[4475]: I1203 08:33:31.005735 4475 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-bs85v" Dec 03 08:33:31 crc kubenswrapper[4475]: I1203 08:33:31.012682 4475 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-bs85v"] Dec 03 08:33:31 crc kubenswrapper[4475]: I1203 08:33:31.106116 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dkhqp\" (UniqueName: \"kubernetes.io/projected/94deb36e-cfba-4666-858e-1bb5cb928980-kube-api-access-dkhqp\") pod \"redhat-marketplace-bs85v\" (UID: \"94deb36e-cfba-4666-858e-1bb5cb928980\") " pod="openshift-marketplace/redhat-marketplace-bs85v" Dec 03 08:33:31 crc kubenswrapper[4475]: I1203 08:33:31.106250 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/94deb36e-cfba-4666-858e-1bb5cb928980-catalog-content\") pod \"redhat-marketplace-bs85v\" (UID: \"94deb36e-cfba-4666-858e-1bb5cb928980\") " pod="openshift-marketplace/redhat-marketplace-bs85v" Dec 03 08:33:31 crc kubenswrapper[4475]: I1203 08:33:31.106342 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/94deb36e-cfba-4666-858e-1bb5cb928980-utilities\") pod \"redhat-marketplace-bs85v\" (UID: \"94deb36e-cfba-4666-858e-1bb5cb928980\") " pod="openshift-marketplace/redhat-marketplace-bs85v" Dec 03 08:33:31 crc kubenswrapper[4475]: I1203 08:33:31.208719 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dkhqp\" (UniqueName: \"kubernetes.io/projected/94deb36e-cfba-4666-858e-1bb5cb928980-kube-api-access-dkhqp\") pod \"redhat-marketplace-bs85v\" (UID: \"94deb36e-cfba-4666-858e-1bb5cb928980\") " pod="openshift-marketplace/redhat-marketplace-bs85v" Dec 03 08:33:31 crc kubenswrapper[4475]: I1203 08:33:31.208775 4475 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/94deb36e-cfba-4666-858e-1bb5cb928980-catalog-content\") pod \"redhat-marketplace-bs85v\" (UID: \"94deb36e-cfba-4666-858e-1bb5cb928980\") " pod="openshift-marketplace/redhat-marketplace-bs85v" Dec 03 08:33:31 crc kubenswrapper[4475]: I1203 08:33:31.208829 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/94deb36e-cfba-4666-858e-1bb5cb928980-utilities\") pod \"redhat-marketplace-bs85v\" (UID: \"94deb36e-cfba-4666-858e-1bb5cb928980\") " pod="openshift-marketplace/redhat-marketplace-bs85v" Dec 03 08:33:31 crc kubenswrapper[4475]: I1203 08:33:31.209372 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/94deb36e-cfba-4666-858e-1bb5cb928980-utilities\") pod \"redhat-marketplace-bs85v\" (UID: \"94deb36e-cfba-4666-858e-1bb5cb928980\") " pod="openshift-marketplace/redhat-marketplace-bs85v" Dec 03 08:33:31 crc kubenswrapper[4475]: I1203 08:33:31.209432 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/94deb36e-cfba-4666-858e-1bb5cb928980-catalog-content\") pod \"redhat-marketplace-bs85v\" (UID: \"94deb36e-cfba-4666-858e-1bb5cb928980\") " pod="openshift-marketplace/redhat-marketplace-bs85v" Dec 03 08:33:31 crc kubenswrapper[4475]: I1203 08:33:31.227306 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dkhqp\" (UniqueName: \"kubernetes.io/projected/94deb36e-cfba-4666-858e-1bb5cb928980-kube-api-access-dkhqp\") pod \"redhat-marketplace-bs85v\" (UID: \"94deb36e-cfba-4666-858e-1bb5cb928980\") " pod="openshift-marketplace/redhat-marketplace-bs85v" Dec 03 08:33:31 crc kubenswrapper[4475]: I1203 08:33:31.333133 4475 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-bs85v" Dec 03 08:33:31 crc kubenswrapper[4475]: I1203 08:33:31.787183 4475 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-bs85v"] Dec 03 08:33:32 crc kubenswrapper[4475]: I1203 08:33:32.618908 4475 generic.go:334] "Generic (PLEG): container finished" podID="94deb36e-cfba-4666-858e-1bb5cb928980" containerID="b52dd364c744398e187d168756c55c52ffbf4a0f2b2391787f510b73e957efd5" exitCode=0 Dec 03 08:33:32 crc kubenswrapper[4475]: I1203 08:33:32.618971 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-bs85v" event={"ID":"94deb36e-cfba-4666-858e-1bb5cb928980","Type":"ContainerDied","Data":"b52dd364c744398e187d168756c55c52ffbf4a0f2b2391787f510b73e957efd5"} Dec 03 08:33:32 crc kubenswrapper[4475]: I1203 08:33:32.619004 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-bs85v" event={"ID":"94deb36e-cfba-4666-858e-1bb5cb928980","Type":"ContainerStarted","Data":"8e3c0d15a4320847c2192aa614f3b0e110947ace00340ab8088fca2841728706"} Dec 03 08:33:32 crc kubenswrapper[4475]: I1203 08:33:32.623442 4475 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Dec 03 08:33:33 crc kubenswrapper[4475]: I1203 08:33:33.629981 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-bs85v" event={"ID":"94deb36e-cfba-4666-858e-1bb5cb928980","Type":"ContainerStarted","Data":"8f734d6d54f8d2501cb5b49514fe9b8dda0f47ce08c7931d5566f739bcb9daea"} Dec 03 08:33:34 crc kubenswrapper[4475]: I1203 08:33:34.639033 4475 generic.go:334] "Generic (PLEG): container finished" podID="94deb36e-cfba-4666-858e-1bb5cb928980" containerID="8f734d6d54f8d2501cb5b49514fe9b8dda0f47ce08c7931d5566f739bcb9daea" exitCode=0 Dec 03 08:33:34 crc kubenswrapper[4475]: I1203 08:33:34.639139 4475 kubelet.go:2453] "SyncLoop (PLEG): 
event for pod" pod="openshift-marketplace/redhat-marketplace-bs85v" event={"ID":"94deb36e-cfba-4666-858e-1bb5cb928980","Type":"ContainerDied","Data":"8f734d6d54f8d2501cb5b49514fe9b8dda0f47ce08c7931d5566f739bcb9daea"} Dec 03 08:33:35 crc kubenswrapper[4475]: I1203 08:33:35.648773 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-bs85v" event={"ID":"94deb36e-cfba-4666-858e-1bb5cb928980","Type":"ContainerStarted","Data":"8d0829e7c917e8bad10ec50e846dc513b8ae513865763df85f11b0454e564d93"} Dec 03 08:33:35 crc kubenswrapper[4475]: I1203 08:33:35.670198 4475 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-bs85v" podStartSLOduration=3.085496231 podStartE2EDuration="5.670181685s" podCreationTimestamp="2025-12-03 08:33:30 +0000 UTC" firstStartedPulling="2025-12-03 08:33:32.623177687 +0000 UTC m=+6497.428076021" lastFinishedPulling="2025-12-03 08:33:35.207863141 +0000 UTC m=+6500.012761475" observedRunningTime="2025-12-03 08:33:35.665371711 +0000 UTC m=+6500.470270046" watchObservedRunningTime="2025-12-03 08:33:35.670181685 +0000 UTC m=+6500.475080020" Dec 03 08:33:41 crc kubenswrapper[4475]: I1203 08:33:41.333778 4475 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-bs85v" Dec 03 08:33:41 crc kubenswrapper[4475]: I1203 08:33:41.334137 4475 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-bs85v" Dec 03 08:33:41 crc kubenswrapper[4475]: I1203 08:33:41.369602 4475 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-bs85v" Dec 03 08:33:41 crc kubenswrapper[4475]: I1203 08:33:41.731130 4475 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-bs85v" Dec 03 08:33:41 crc kubenswrapper[4475]: I1203 08:33:41.770742 4475 
kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-bs85v"] Dec 03 08:33:43 crc kubenswrapper[4475]: I1203 08:33:43.712536 4475 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-bs85v" podUID="94deb36e-cfba-4666-858e-1bb5cb928980" containerName="registry-server" containerID="cri-o://8d0829e7c917e8bad10ec50e846dc513b8ae513865763df85f11b0454e564d93" gracePeriod=2 Dec 03 08:33:44 crc kubenswrapper[4475]: I1203 08:33:44.149127 4475 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-bs85v" Dec 03 08:33:44 crc kubenswrapper[4475]: I1203 08:33:44.344548 4475 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dkhqp\" (UniqueName: \"kubernetes.io/projected/94deb36e-cfba-4666-858e-1bb5cb928980-kube-api-access-dkhqp\") pod \"94deb36e-cfba-4666-858e-1bb5cb928980\" (UID: \"94deb36e-cfba-4666-858e-1bb5cb928980\") " Dec 03 08:33:44 crc kubenswrapper[4475]: I1203 08:33:44.344654 4475 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/94deb36e-cfba-4666-858e-1bb5cb928980-utilities\") pod \"94deb36e-cfba-4666-858e-1bb5cb928980\" (UID: \"94deb36e-cfba-4666-858e-1bb5cb928980\") " Dec 03 08:33:44 crc kubenswrapper[4475]: I1203 08:33:44.344764 4475 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/94deb36e-cfba-4666-858e-1bb5cb928980-catalog-content\") pod \"94deb36e-cfba-4666-858e-1bb5cb928980\" (UID: \"94deb36e-cfba-4666-858e-1bb5cb928980\") " Dec 03 08:33:44 crc kubenswrapper[4475]: I1203 08:33:44.345305 4475 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/94deb36e-cfba-4666-858e-1bb5cb928980-utilities" (OuterVolumeSpecName: "utilities") pod 
"94deb36e-cfba-4666-858e-1bb5cb928980" (UID: "94deb36e-cfba-4666-858e-1bb5cb928980"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 08:33:44 crc kubenswrapper[4475]: I1203 08:33:44.350719 4475 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/94deb36e-cfba-4666-858e-1bb5cb928980-kube-api-access-dkhqp" (OuterVolumeSpecName: "kube-api-access-dkhqp") pod "94deb36e-cfba-4666-858e-1bb5cb928980" (UID: "94deb36e-cfba-4666-858e-1bb5cb928980"). InnerVolumeSpecName "kube-api-access-dkhqp". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 08:33:44 crc kubenswrapper[4475]: I1203 08:33:44.361394 4475 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/94deb36e-cfba-4666-858e-1bb5cb928980-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "94deb36e-cfba-4666-858e-1bb5cb928980" (UID: "94deb36e-cfba-4666-858e-1bb5cb928980"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 08:33:44 crc kubenswrapper[4475]: I1203 08:33:44.447081 4475 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/94deb36e-cfba-4666-858e-1bb5cb928980-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 03 08:33:44 crc kubenswrapper[4475]: I1203 08:33:44.447211 4475 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dkhqp\" (UniqueName: \"kubernetes.io/projected/94deb36e-cfba-4666-858e-1bb5cb928980-kube-api-access-dkhqp\") on node \"crc\" DevicePath \"\"" Dec 03 08:33:44 crc kubenswrapper[4475]: I1203 08:33:44.447269 4475 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/94deb36e-cfba-4666-858e-1bb5cb928980-utilities\") on node \"crc\" DevicePath \"\"" Dec 03 08:33:44 crc kubenswrapper[4475]: I1203 08:33:44.723263 4475 generic.go:334] "Generic (PLEG): container finished" podID="94deb36e-cfba-4666-858e-1bb5cb928980" containerID="8d0829e7c917e8bad10ec50e846dc513b8ae513865763df85f11b0454e564d93" exitCode=0 Dec 03 08:33:44 crc kubenswrapper[4475]: I1203 08:33:44.723321 4475 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-bs85v" Dec 03 08:33:44 crc kubenswrapper[4475]: I1203 08:33:44.723351 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-bs85v" event={"ID":"94deb36e-cfba-4666-858e-1bb5cb928980","Type":"ContainerDied","Data":"8d0829e7c917e8bad10ec50e846dc513b8ae513865763df85f11b0454e564d93"} Dec 03 08:33:44 crc kubenswrapper[4475]: I1203 08:33:44.723979 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-bs85v" event={"ID":"94deb36e-cfba-4666-858e-1bb5cb928980","Type":"ContainerDied","Data":"8e3c0d15a4320847c2192aa614f3b0e110947ace00340ab8088fca2841728706"} Dec 03 08:33:44 crc kubenswrapper[4475]: I1203 08:33:44.723999 4475 scope.go:117] "RemoveContainer" containerID="8d0829e7c917e8bad10ec50e846dc513b8ae513865763df85f11b0454e564d93" Dec 03 08:33:44 crc kubenswrapper[4475]: I1203 08:33:44.748134 4475 scope.go:117] "RemoveContainer" containerID="8f734d6d54f8d2501cb5b49514fe9b8dda0f47ce08c7931d5566f739bcb9daea" Dec 03 08:33:44 crc kubenswrapper[4475]: I1203 08:33:44.753696 4475 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-bs85v"] Dec 03 08:33:44 crc kubenswrapper[4475]: I1203 08:33:44.759230 4475 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-bs85v"] Dec 03 08:33:44 crc kubenswrapper[4475]: I1203 08:33:44.767000 4475 scope.go:117] "RemoveContainer" containerID="b52dd364c744398e187d168756c55c52ffbf4a0f2b2391787f510b73e957efd5" Dec 03 08:33:44 crc kubenswrapper[4475]: I1203 08:33:44.805902 4475 scope.go:117] "RemoveContainer" containerID="8d0829e7c917e8bad10ec50e846dc513b8ae513865763df85f11b0454e564d93" Dec 03 08:33:44 crc kubenswrapper[4475]: E1203 08:33:44.806213 4475 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"8d0829e7c917e8bad10ec50e846dc513b8ae513865763df85f11b0454e564d93\": container with ID starting with 8d0829e7c917e8bad10ec50e846dc513b8ae513865763df85f11b0454e564d93 not found: ID does not exist" containerID="8d0829e7c917e8bad10ec50e846dc513b8ae513865763df85f11b0454e564d93" Dec 03 08:33:44 crc kubenswrapper[4475]: I1203 08:33:44.806255 4475 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8d0829e7c917e8bad10ec50e846dc513b8ae513865763df85f11b0454e564d93"} err="failed to get container status \"8d0829e7c917e8bad10ec50e846dc513b8ae513865763df85f11b0454e564d93\": rpc error: code = NotFound desc = could not find container \"8d0829e7c917e8bad10ec50e846dc513b8ae513865763df85f11b0454e564d93\": container with ID starting with 8d0829e7c917e8bad10ec50e846dc513b8ae513865763df85f11b0454e564d93 not found: ID does not exist" Dec 03 08:33:44 crc kubenswrapper[4475]: I1203 08:33:44.806281 4475 scope.go:117] "RemoveContainer" containerID="8f734d6d54f8d2501cb5b49514fe9b8dda0f47ce08c7931d5566f739bcb9daea" Dec 03 08:33:44 crc kubenswrapper[4475]: E1203 08:33:44.806772 4475 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8f734d6d54f8d2501cb5b49514fe9b8dda0f47ce08c7931d5566f739bcb9daea\": container with ID starting with 8f734d6d54f8d2501cb5b49514fe9b8dda0f47ce08c7931d5566f739bcb9daea not found: ID does not exist" containerID="8f734d6d54f8d2501cb5b49514fe9b8dda0f47ce08c7931d5566f739bcb9daea" Dec 03 08:33:44 crc kubenswrapper[4475]: I1203 08:33:44.806941 4475 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8f734d6d54f8d2501cb5b49514fe9b8dda0f47ce08c7931d5566f739bcb9daea"} err="failed to get container status \"8f734d6d54f8d2501cb5b49514fe9b8dda0f47ce08c7931d5566f739bcb9daea\": rpc error: code = NotFound desc = could not find container \"8f734d6d54f8d2501cb5b49514fe9b8dda0f47ce08c7931d5566f739bcb9daea\": container with ID 
starting with 8f734d6d54f8d2501cb5b49514fe9b8dda0f47ce08c7931d5566f739bcb9daea not found: ID does not exist" Dec 03 08:33:44 crc kubenswrapper[4475]: I1203 08:33:44.807023 4475 scope.go:117] "RemoveContainer" containerID="b52dd364c744398e187d168756c55c52ffbf4a0f2b2391787f510b73e957efd5" Dec 03 08:33:44 crc kubenswrapper[4475]: E1203 08:33:44.807550 4475 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b52dd364c744398e187d168756c55c52ffbf4a0f2b2391787f510b73e957efd5\": container with ID starting with b52dd364c744398e187d168756c55c52ffbf4a0f2b2391787f510b73e957efd5 not found: ID does not exist" containerID="b52dd364c744398e187d168756c55c52ffbf4a0f2b2391787f510b73e957efd5" Dec 03 08:33:44 crc kubenswrapper[4475]: I1203 08:33:44.807583 4475 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b52dd364c744398e187d168756c55c52ffbf4a0f2b2391787f510b73e957efd5"} err="failed to get container status \"b52dd364c744398e187d168756c55c52ffbf4a0f2b2391787f510b73e957efd5\": rpc error: code = NotFound desc = could not find container \"b52dd364c744398e187d168756c55c52ffbf4a0f2b2391787f510b73e957efd5\": container with ID starting with b52dd364c744398e187d168756c55c52ffbf4a0f2b2391787f510b73e957efd5 not found: ID does not exist" Dec 03 08:33:45 crc kubenswrapper[4475]: I1203 08:33:45.499546 4475 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="94deb36e-cfba-4666-858e-1bb5cb928980" path="/var/lib/kubelet/pods/94deb36e-cfba-4666-858e-1bb5cb928980/volumes" Dec 03 08:33:58 crc kubenswrapper[4475]: I1203 08:33:58.933581 4475 patch_prober.go:28] interesting pod/machine-config-daemon-tjbzg container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 03 08:33:58 crc kubenswrapper[4475]: I1203 
08:33:58.934035 4475 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-tjbzg" podUID="91aee7be-4a52-4598-803f-2deebe0674de" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 03 08:33:58 crc kubenswrapper[4475]: I1203 08:33:58.934078 4475 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-tjbzg" Dec 03 08:33:58 crc kubenswrapper[4475]: I1203 08:33:58.934580 4475 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"5e4bdfeb75b6a6f33140859fefdf48e5e0055c78f6936aca52f1be6e00372076"} pod="openshift-machine-config-operator/machine-config-daemon-tjbzg" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Dec 03 08:33:58 crc kubenswrapper[4475]: I1203 08:33:58.934631 4475 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-tjbzg" podUID="91aee7be-4a52-4598-803f-2deebe0674de" containerName="machine-config-daemon" containerID="cri-o://5e4bdfeb75b6a6f33140859fefdf48e5e0055c78f6936aca52f1be6e00372076" gracePeriod=600 Dec 03 08:33:59 crc kubenswrapper[4475]: I1203 08:33:59.837734 4475 generic.go:334] "Generic (PLEG): container finished" podID="91aee7be-4a52-4598-803f-2deebe0674de" containerID="5e4bdfeb75b6a6f33140859fefdf48e5e0055c78f6936aca52f1be6e00372076" exitCode=0 Dec 03 08:33:59 crc kubenswrapper[4475]: I1203 08:33:59.837802 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-tjbzg" event={"ID":"91aee7be-4a52-4598-803f-2deebe0674de","Type":"ContainerDied","Data":"5e4bdfeb75b6a6f33140859fefdf48e5e0055c78f6936aca52f1be6e00372076"} Dec 03 08:33:59 crc 
kubenswrapper[4475]: I1203 08:33:59.838137 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-tjbzg" event={"ID":"91aee7be-4a52-4598-803f-2deebe0674de","Type":"ContainerStarted","Data":"83104d69007c1efee0ad3ffc002e16539780626460002ad78672b69cf7a2e1ec"} Dec 03 08:33:59 crc kubenswrapper[4475]: I1203 08:33:59.838158 4475 scope.go:117] "RemoveContainer" containerID="ead2b8ca071f1b77f440239ac4c143069a67fc366c29018f3071d24754d8703b" Dec 03 08:36:28 crc kubenswrapper[4475]: I1203 08:36:28.933218 4475 patch_prober.go:28] interesting pod/machine-config-daemon-tjbzg container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 03 08:36:28 crc kubenswrapper[4475]: I1203 08:36:28.933654 4475 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-tjbzg" podUID="91aee7be-4a52-4598-803f-2deebe0674de" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 03 08:36:31 crc kubenswrapper[4475]: I1203 08:36:31.505634 4475 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-58bfk"] Dec 03 08:36:31 crc kubenswrapper[4475]: E1203 08:36:31.506165 4475 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="94deb36e-cfba-4666-858e-1bb5cb928980" containerName="extract-content" Dec 03 08:36:31 crc kubenswrapper[4475]: I1203 08:36:31.506178 4475 state_mem.go:107] "Deleted CPUSet assignment" podUID="94deb36e-cfba-4666-858e-1bb5cb928980" containerName="extract-content" Dec 03 08:36:31 crc kubenswrapper[4475]: E1203 08:36:31.506191 4475 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="94deb36e-cfba-4666-858e-1bb5cb928980" 
containerName="registry-server" Dec 03 08:36:31 crc kubenswrapper[4475]: I1203 08:36:31.506197 4475 state_mem.go:107] "Deleted CPUSet assignment" podUID="94deb36e-cfba-4666-858e-1bb5cb928980" containerName="registry-server" Dec 03 08:36:31 crc kubenswrapper[4475]: E1203 08:36:31.506224 4475 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="94deb36e-cfba-4666-858e-1bb5cb928980" containerName="extract-utilities" Dec 03 08:36:31 crc kubenswrapper[4475]: I1203 08:36:31.506230 4475 state_mem.go:107] "Deleted CPUSet assignment" podUID="94deb36e-cfba-4666-858e-1bb5cb928980" containerName="extract-utilities" Dec 03 08:36:31 crc kubenswrapper[4475]: I1203 08:36:31.506417 4475 memory_manager.go:354] "RemoveStaleState removing state" podUID="94deb36e-cfba-4666-858e-1bb5cb928980" containerName="registry-server" Dec 03 08:36:31 crc kubenswrapper[4475]: I1203 08:36:31.507675 4475 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-58bfk" Dec 03 08:36:31 crc kubenswrapper[4475]: I1203 08:36:31.514138 4475 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-58bfk"] Dec 03 08:36:31 crc kubenswrapper[4475]: I1203 08:36:31.604306 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fa596412-0693-4cfb-baf7-8c8dc0c56616-utilities\") pod \"certified-operators-58bfk\" (UID: \"fa596412-0693-4cfb-baf7-8c8dc0c56616\") " pod="openshift-marketplace/certified-operators-58bfk" Dec 03 08:36:31 crc kubenswrapper[4475]: I1203 08:36:31.604393 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z7nz8\" (UniqueName: \"kubernetes.io/projected/fa596412-0693-4cfb-baf7-8c8dc0c56616-kube-api-access-z7nz8\") pod \"certified-operators-58bfk\" (UID: \"fa596412-0693-4cfb-baf7-8c8dc0c56616\") " 
pod="openshift-marketplace/certified-operators-58bfk" Dec 03 08:36:31 crc kubenswrapper[4475]: I1203 08:36:31.604531 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fa596412-0693-4cfb-baf7-8c8dc0c56616-catalog-content\") pod \"certified-operators-58bfk\" (UID: \"fa596412-0693-4cfb-baf7-8c8dc0c56616\") " pod="openshift-marketplace/certified-operators-58bfk" Dec 03 08:36:31 crc kubenswrapper[4475]: I1203 08:36:31.706197 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fa596412-0693-4cfb-baf7-8c8dc0c56616-catalog-content\") pod \"certified-operators-58bfk\" (UID: \"fa596412-0693-4cfb-baf7-8c8dc0c56616\") " pod="openshift-marketplace/certified-operators-58bfk" Dec 03 08:36:31 crc kubenswrapper[4475]: I1203 08:36:31.706406 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fa596412-0693-4cfb-baf7-8c8dc0c56616-utilities\") pod \"certified-operators-58bfk\" (UID: \"fa596412-0693-4cfb-baf7-8c8dc0c56616\") " pod="openshift-marketplace/certified-operators-58bfk" Dec 03 08:36:31 crc kubenswrapper[4475]: I1203 08:36:31.706504 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-z7nz8\" (UniqueName: \"kubernetes.io/projected/fa596412-0693-4cfb-baf7-8c8dc0c56616-kube-api-access-z7nz8\") pod \"certified-operators-58bfk\" (UID: \"fa596412-0693-4cfb-baf7-8c8dc0c56616\") " pod="openshift-marketplace/certified-operators-58bfk" Dec 03 08:36:31 crc kubenswrapper[4475]: I1203 08:36:31.707879 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fa596412-0693-4cfb-baf7-8c8dc0c56616-utilities\") pod \"certified-operators-58bfk\" (UID: \"fa596412-0693-4cfb-baf7-8c8dc0c56616\") " 
pod="openshift-marketplace/certified-operators-58bfk" Dec 03 08:36:31 crc kubenswrapper[4475]: I1203 08:36:31.708487 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fa596412-0693-4cfb-baf7-8c8dc0c56616-catalog-content\") pod \"certified-operators-58bfk\" (UID: \"fa596412-0693-4cfb-baf7-8c8dc0c56616\") " pod="openshift-marketplace/certified-operators-58bfk" Dec 03 08:36:31 crc kubenswrapper[4475]: I1203 08:36:31.724516 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-z7nz8\" (UniqueName: \"kubernetes.io/projected/fa596412-0693-4cfb-baf7-8c8dc0c56616-kube-api-access-z7nz8\") pod \"certified-operators-58bfk\" (UID: \"fa596412-0693-4cfb-baf7-8c8dc0c56616\") " pod="openshift-marketplace/certified-operators-58bfk" Dec 03 08:36:31 crc kubenswrapper[4475]: I1203 08:36:31.824973 4475 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-58bfk" Dec 03 08:36:32 crc kubenswrapper[4475]: I1203 08:36:32.329523 4475 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-58bfk"] Dec 03 08:36:33 crc kubenswrapper[4475]: I1203 08:36:33.108542 4475 generic.go:334] "Generic (PLEG): container finished" podID="fa596412-0693-4cfb-baf7-8c8dc0c56616" containerID="b4f831d1f8ed0d060f5bf101b125b4bd56a264c058eb907d0200eb19d3eb8e3b" exitCode=0 Dec 03 08:36:33 crc kubenswrapper[4475]: I1203 08:36:33.108592 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-58bfk" event={"ID":"fa596412-0693-4cfb-baf7-8c8dc0c56616","Type":"ContainerDied","Data":"b4f831d1f8ed0d060f5bf101b125b4bd56a264c058eb907d0200eb19d3eb8e3b"} Dec 03 08:36:33 crc kubenswrapper[4475]: I1203 08:36:33.108759 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-58bfk" 
event={"ID":"fa596412-0693-4cfb-baf7-8c8dc0c56616","Type":"ContainerStarted","Data":"222beaffc7c101dcce33786805cb14a487f4fa643d0eea4a06d7a6be08421997"} Dec 03 08:36:34 crc kubenswrapper[4475]: I1203 08:36:34.117966 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-58bfk" event={"ID":"fa596412-0693-4cfb-baf7-8c8dc0c56616","Type":"ContainerStarted","Data":"0d9be14f01b51857f6d150fb144f1662b8235bb2a425003dd96675a714ca52d0"} Dec 03 08:36:35 crc kubenswrapper[4475]: I1203 08:36:35.125800 4475 generic.go:334] "Generic (PLEG): container finished" podID="fa596412-0693-4cfb-baf7-8c8dc0c56616" containerID="0d9be14f01b51857f6d150fb144f1662b8235bb2a425003dd96675a714ca52d0" exitCode=0 Dec 03 08:36:35 crc kubenswrapper[4475]: I1203 08:36:35.125895 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-58bfk" event={"ID":"fa596412-0693-4cfb-baf7-8c8dc0c56616","Type":"ContainerDied","Data":"0d9be14f01b51857f6d150fb144f1662b8235bb2a425003dd96675a714ca52d0"} Dec 03 08:36:36 crc kubenswrapper[4475]: I1203 08:36:36.136296 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-58bfk" event={"ID":"fa596412-0693-4cfb-baf7-8c8dc0c56616","Type":"ContainerStarted","Data":"a52d2349e17f8239b884c60188eed53e5e8a96dccab629bd49003e7dbe27e3a5"} Dec 03 08:36:36 crc kubenswrapper[4475]: I1203 08:36:36.154197 4475 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-58bfk" podStartSLOduration=2.6827870110000003 podStartE2EDuration="5.154179843s" podCreationTimestamp="2025-12-03 08:36:31 +0000 UTC" firstStartedPulling="2025-12-03 08:36:33.111104231 +0000 UTC m=+6677.916002566" lastFinishedPulling="2025-12-03 08:36:35.582497064 +0000 UTC m=+6680.387395398" observedRunningTime="2025-12-03 08:36:36.149303986 +0000 UTC m=+6680.954202320" watchObservedRunningTime="2025-12-03 08:36:36.154179843 +0000 UTC 
m=+6680.959078178" Dec 03 08:36:41 crc kubenswrapper[4475]: I1203 08:36:41.826088 4475 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-58bfk" Dec 03 08:36:41 crc kubenswrapper[4475]: I1203 08:36:41.826479 4475 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-58bfk" Dec 03 08:36:41 crc kubenswrapper[4475]: I1203 08:36:41.864327 4475 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-58bfk" Dec 03 08:36:42 crc kubenswrapper[4475]: I1203 08:36:42.226427 4475 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-58bfk" Dec 03 08:36:42 crc kubenswrapper[4475]: I1203 08:36:42.274949 4475 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-58bfk"] Dec 03 08:36:44 crc kubenswrapper[4475]: I1203 08:36:44.200948 4475 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-58bfk" podUID="fa596412-0693-4cfb-baf7-8c8dc0c56616" containerName="registry-server" containerID="cri-o://a52d2349e17f8239b884c60188eed53e5e8a96dccab629bd49003e7dbe27e3a5" gracePeriod=2 Dec 03 08:36:44 crc kubenswrapper[4475]: I1203 08:36:44.726239 4475 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-58bfk" Dec 03 08:36:44 crc kubenswrapper[4475]: I1203 08:36:44.827469 4475 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fa596412-0693-4cfb-baf7-8c8dc0c56616-utilities\") pod \"fa596412-0693-4cfb-baf7-8c8dc0c56616\" (UID: \"fa596412-0693-4cfb-baf7-8c8dc0c56616\") " Dec 03 08:36:44 crc kubenswrapper[4475]: I1203 08:36:44.827606 4475 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fa596412-0693-4cfb-baf7-8c8dc0c56616-catalog-content\") pod \"fa596412-0693-4cfb-baf7-8c8dc0c56616\" (UID: \"fa596412-0693-4cfb-baf7-8c8dc0c56616\") " Dec 03 08:36:44 crc kubenswrapper[4475]: I1203 08:36:44.827874 4475 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/fa596412-0693-4cfb-baf7-8c8dc0c56616-utilities" (OuterVolumeSpecName: "utilities") pod "fa596412-0693-4cfb-baf7-8c8dc0c56616" (UID: "fa596412-0693-4cfb-baf7-8c8dc0c56616"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 08:36:44 crc kubenswrapper[4475]: I1203 08:36:44.827940 4475 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-z7nz8\" (UniqueName: \"kubernetes.io/projected/fa596412-0693-4cfb-baf7-8c8dc0c56616-kube-api-access-z7nz8\") pod \"fa596412-0693-4cfb-baf7-8c8dc0c56616\" (UID: \"fa596412-0693-4cfb-baf7-8c8dc0c56616\") " Dec 03 08:36:44 crc kubenswrapper[4475]: I1203 08:36:44.829904 4475 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fa596412-0693-4cfb-baf7-8c8dc0c56616-utilities\") on node \"crc\" DevicePath \"\"" Dec 03 08:36:44 crc kubenswrapper[4475]: I1203 08:36:44.846037 4475 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fa596412-0693-4cfb-baf7-8c8dc0c56616-kube-api-access-z7nz8" (OuterVolumeSpecName: "kube-api-access-z7nz8") pod "fa596412-0693-4cfb-baf7-8c8dc0c56616" (UID: "fa596412-0693-4cfb-baf7-8c8dc0c56616"). InnerVolumeSpecName "kube-api-access-z7nz8". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 08:36:44 crc kubenswrapper[4475]: I1203 08:36:44.873903 4475 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/fa596412-0693-4cfb-baf7-8c8dc0c56616-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "fa596412-0693-4cfb-baf7-8c8dc0c56616" (UID: "fa596412-0693-4cfb-baf7-8c8dc0c56616"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 08:36:44 crc kubenswrapper[4475]: I1203 08:36:44.931596 4475 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fa596412-0693-4cfb-baf7-8c8dc0c56616-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 03 08:36:44 crc kubenswrapper[4475]: I1203 08:36:44.931625 4475 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-z7nz8\" (UniqueName: \"kubernetes.io/projected/fa596412-0693-4cfb-baf7-8c8dc0c56616-kube-api-access-z7nz8\") on node \"crc\" DevicePath \"\"" Dec 03 08:36:45 crc kubenswrapper[4475]: I1203 08:36:45.210222 4475 generic.go:334] "Generic (PLEG): container finished" podID="fa596412-0693-4cfb-baf7-8c8dc0c56616" containerID="a52d2349e17f8239b884c60188eed53e5e8a96dccab629bd49003e7dbe27e3a5" exitCode=0 Dec 03 08:36:45 crc kubenswrapper[4475]: I1203 08:36:45.210272 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-58bfk" event={"ID":"fa596412-0693-4cfb-baf7-8c8dc0c56616","Type":"ContainerDied","Data":"a52d2349e17f8239b884c60188eed53e5e8a96dccab629bd49003e7dbe27e3a5"} Dec 03 08:36:45 crc kubenswrapper[4475]: I1203 08:36:45.210304 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-58bfk" event={"ID":"fa596412-0693-4cfb-baf7-8c8dc0c56616","Type":"ContainerDied","Data":"222beaffc7c101dcce33786805cb14a487f4fa643d0eea4a06d7a6be08421997"} Dec 03 08:36:45 crc kubenswrapper[4475]: I1203 08:36:45.210323 4475 scope.go:117] "RemoveContainer" containerID="a52d2349e17f8239b884c60188eed53e5e8a96dccab629bd49003e7dbe27e3a5" Dec 03 08:36:45 crc kubenswrapper[4475]: I1203 08:36:45.210478 4475 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-58bfk" Dec 03 08:36:45 crc kubenswrapper[4475]: I1203 08:36:45.238259 4475 scope.go:117] "RemoveContainer" containerID="0d9be14f01b51857f6d150fb144f1662b8235bb2a425003dd96675a714ca52d0" Dec 03 08:36:45 crc kubenswrapper[4475]: I1203 08:36:45.250369 4475 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-58bfk"] Dec 03 08:36:45 crc kubenswrapper[4475]: I1203 08:36:45.260201 4475 scope.go:117] "RemoveContainer" containerID="b4f831d1f8ed0d060f5bf101b125b4bd56a264c058eb907d0200eb19d3eb8e3b" Dec 03 08:36:45 crc kubenswrapper[4475]: I1203 08:36:45.263681 4475 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-58bfk"] Dec 03 08:36:45 crc kubenswrapper[4475]: I1203 08:36:45.294407 4475 scope.go:117] "RemoveContainer" containerID="a52d2349e17f8239b884c60188eed53e5e8a96dccab629bd49003e7dbe27e3a5" Dec 03 08:36:45 crc kubenswrapper[4475]: E1203 08:36:45.294741 4475 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a52d2349e17f8239b884c60188eed53e5e8a96dccab629bd49003e7dbe27e3a5\": container with ID starting with a52d2349e17f8239b884c60188eed53e5e8a96dccab629bd49003e7dbe27e3a5 not found: ID does not exist" containerID="a52d2349e17f8239b884c60188eed53e5e8a96dccab629bd49003e7dbe27e3a5" Dec 03 08:36:45 crc kubenswrapper[4475]: I1203 08:36:45.294785 4475 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a52d2349e17f8239b884c60188eed53e5e8a96dccab629bd49003e7dbe27e3a5"} err="failed to get container status \"a52d2349e17f8239b884c60188eed53e5e8a96dccab629bd49003e7dbe27e3a5\": rpc error: code = NotFound desc = could not find container \"a52d2349e17f8239b884c60188eed53e5e8a96dccab629bd49003e7dbe27e3a5\": container with ID starting with a52d2349e17f8239b884c60188eed53e5e8a96dccab629bd49003e7dbe27e3a5 not 
found: ID does not exist" Dec 03 08:36:45 crc kubenswrapper[4475]: I1203 08:36:45.294809 4475 scope.go:117] "RemoveContainer" containerID="0d9be14f01b51857f6d150fb144f1662b8235bb2a425003dd96675a714ca52d0" Dec 03 08:36:45 crc kubenswrapper[4475]: E1203 08:36:45.295050 4475 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0d9be14f01b51857f6d150fb144f1662b8235bb2a425003dd96675a714ca52d0\": container with ID starting with 0d9be14f01b51857f6d150fb144f1662b8235bb2a425003dd96675a714ca52d0 not found: ID does not exist" containerID="0d9be14f01b51857f6d150fb144f1662b8235bb2a425003dd96675a714ca52d0" Dec 03 08:36:45 crc kubenswrapper[4475]: I1203 08:36:45.295070 4475 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0d9be14f01b51857f6d150fb144f1662b8235bb2a425003dd96675a714ca52d0"} err="failed to get container status \"0d9be14f01b51857f6d150fb144f1662b8235bb2a425003dd96675a714ca52d0\": rpc error: code = NotFound desc = could not find container \"0d9be14f01b51857f6d150fb144f1662b8235bb2a425003dd96675a714ca52d0\": container with ID starting with 0d9be14f01b51857f6d150fb144f1662b8235bb2a425003dd96675a714ca52d0 not found: ID does not exist" Dec 03 08:36:45 crc kubenswrapper[4475]: I1203 08:36:45.295085 4475 scope.go:117] "RemoveContainer" containerID="b4f831d1f8ed0d060f5bf101b125b4bd56a264c058eb907d0200eb19d3eb8e3b" Dec 03 08:36:45 crc kubenswrapper[4475]: E1203 08:36:45.295301 4475 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b4f831d1f8ed0d060f5bf101b125b4bd56a264c058eb907d0200eb19d3eb8e3b\": container with ID starting with b4f831d1f8ed0d060f5bf101b125b4bd56a264c058eb907d0200eb19d3eb8e3b not found: ID does not exist" containerID="b4f831d1f8ed0d060f5bf101b125b4bd56a264c058eb907d0200eb19d3eb8e3b" Dec 03 08:36:45 crc kubenswrapper[4475]: I1203 08:36:45.295320 4475 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b4f831d1f8ed0d060f5bf101b125b4bd56a264c058eb907d0200eb19d3eb8e3b"} err="failed to get container status \"b4f831d1f8ed0d060f5bf101b125b4bd56a264c058eb907d0200eb19d3eb8e3b\": rpc error: code = NotFound desc = could not find container \"b4f831d1f8ed0d060f5bf101b125b4bd56a264c058eb907d0200eb19d3eb8e3b\": container with ID starting with b4f831d1f8ed0d060f5bf101b125b4bd56a264c058eb907d0200eb19d3eb8e3b not found: ID does not exist" Dec 03 08:36:45 crc kubenswrapper[4475]: I1203 08:36:45.501008 4475 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fa596412-0693-4cfb-baf7-8c8dc0c56616" path="/var/lib/kubelet/pods/fa596412-0693-4cfb-baf7-8c8dc0c56616/volumes" Dec 03 08:36:58 crc kubenswrapper[4475]: I1203 08:36:58.933828 4475 patch_prober.go:28] interesting pod/machine-config-daemon-tjbzg container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 03 08:36:58 crc kubenswrapper[4475]: I1203 08:36:58.934576 4475 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-tjbzg" podUID="91aee7be-4a52-4598-803f-2deebe0674de" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 03 08:37:00 crc kubenswrapper[4475]: I1203 08:37:00.408226 4475 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-p778x"] Dec 03 08:37:00 crc kubenswrapper[4475]: E1203 08:37:00.409040 4475 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fa596412-0693-4cfb-baf7-8c8dc0c56616" containerName="extract-content" Dec 03 08:37:00 crc kubenswrapper[4475]: I1203 08:37:00.409058 4475 state_mem.go:107] "Deleted CPUSet 
assignment" podUID="fa596412-0693-4cfb-baf7-8c8dc0c56616" containerName="extract-content" Dec 03 08:37:00 crc kubenswrapper[4475]: E1203 08:37:00.409084 4475 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fa596412-0693-4cfb-baf7-8c8dc0c56616" containerName="registry-server" Dec 03 08:37:00 crc kubenswrapper[4475]: I1203 08:37:00.409092 4475 state_mem.go:107] "Deleted CPUSet assignment" podUID="fa596412-0693-4cfb-baf7-8c8dc0c56616" containerName="registry-server" Dec 03 08:37:00 crc kubenswrapper[4475]: E1203 08:37:00.409112 4475 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fa596412-0693-4cfb-baf7-8c8dc0c56616" containerName="extract-utilities" Dec 03 08:37:00 crc kubenswrapper[4475]: I1203 08:37:00.409121 4475 state_mem.go:107] "Deleted CPUSet assignment" podUID="fa596412-0693-4cfb-baf7-8c8dc0c56616" containerName="extract-utilities" Dec 03 08:37:00 crc kubenswrapper[4475]: I1203 08:37:00.409424 4475 memory_manager.go:354] "RemoveStaleState removing state" podUID="fa596412-0693-4cfb-baf7-8c8dc0c56616" containerName="registry-server" Dec 03 08:37:00 crc kubenswrapper[4475]: I1203 08:37:00.411947 4475 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-p778x" Dec 03 08:37:00 crc kubenswrapper[4475]: I1203 08:37:00.424369 4475 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-p778x"] Dec 03 08:37:00 crc kubenswrapper[4475]: I1203 08:37:00.450428 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/77bb03f4-4465-41a4-a321-2d8eef22438b-utilities\") pod \"community-operators-p778x\" (UID: \"77bb03f4-4465-41a4-a321-2d8eef22438b\") " pod="openshift-marketplace/community-operators-p778x" Dec 03 08:37:00 crc kubenswrapper[4475]: I1203 08:37:00.450711 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/77bb03f4-4465-41a4-a321-2d8eef22438b-catalog-content\") pod \"community-operators-p778x\" (UID: \"77bb03f4-4465-41a4-a321-2d8eef22438b\") " pod="openshift-marketplace/community-operators-p778x" Dec 03 08:37:00 crc kubenswrapper[4475]: I1203 08:37:00.451093 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sc4r5\" (UniqueName: \"kubernetes.io/projected/77bb03f4-4465-41a4-a321-2d8eef22438b-kube-api-access-sc4r5\") pod \"community-operators-p778x\" (UID: \"77bb03f4-4465-41a4-a321-2d8eef22438b\") " pod="openshift-marketplace/community-operators-p778x" Dec 03 08:37:00 crc kubenswrapper[4475]: I1203 08:37:00.553333 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/77bb03f4-4465-41a4-a321-2d8eef22438b-utilities\") pod \"community-operators-p778x\" (UID: \"77bb03f4-4465-41a4-a321-2d8eef22438b\") " pod="openshift-marketplace/community-operators-p778x" Dec 03 08:37:00 crc kubenswrapper[4475]: I1203 08:37:00.553624 4475 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/77bb03f4-4465-41a4-a321-2d8eef22438b-catalog-content\") pod \"community-operators-p778x\" (UID: \"77bb03f4-4465-41a4-a321-2d8eef22438b\") " pod="openshift-marketplace/community-operators-p778x" Dec 03 08:37:00 crc kubenswrapper[4475]: I1203 08:37:00.553978 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sc4r5\" (UniqueName: \"kubernetes.io/projected/77bb03f4-4465-41a4-a321-2d8eef22438b-kube-api-access-sc4r5\") pod \"community-operators-p778x\" (UID: \"77bb03f4-4465-41a4-a321-2d8eef22438b\") " pod="openshift-marketplace/community-operators-p778x" Dec 03 08:37:00 crc kubenswrapper[4475]: I1203 08:37:00.553959 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/77bb03f4-4465-41a4-a321-2d8eef22438b-utilities\") pod \"community-operators-p778x\" (UID: \"77bb03f4-4465-41a4-a321-2d8eef22438b\") " pod="openshift-marketplace/community-operators-p778x" Dec 03 08:37:00 crc kubenswrapper[4475]: I1203 08:37:00.554538 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/77bb03f4-4465-41a4-a321-2d8eef22438b-catalog-content\") pod \"community-operators-p778x\" (UID: \"77bb03f4-4465-41a4-a321-2d8eef22438b\") " pod="openshift-marketplace/community-operators-p778x" Dec 03 08:37:00 crc kubenswrapper[4475]: I1203 08:37:00.574469 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sc4r5\" (UniqueName: \"kubernetes.io/projected/77bb03f4-4465-41a4-a321-2d8eef22438b-kube-api-access-sc4r5\") pod \"community-operators-p778x\" (UID: \"77bb03f4-4465-41a4-a321-2d8eef22438b\") " pod="openshift-marketplace/community-operators-p778x" Dec 03 08:37:00 crc kubenswrapper[4475]: I1203 08:37:00.735686 4475 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-p778x" Dec 03 08:37:01 crc kubenswrapper[4475]: I1203 08:37:01.240872 4475 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-p778x"] Dec 03 08:37:01 crc kubenswrapper[4475]: I1203 08:37:01.346015 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-p778x" event={"ID":"77bb03f4-4465-41a4-a321-2d8eef22438b","Type":"ContainerStarted","Data":"cf9fa173cf86c45dcb4de3133a25b556dad24968f196f12efd8984f19fc3582a"} Dec 03 08:37:02 crc kubenswrapper[4475]: I1203 08:37:02.357930 4475 generic.go:334] "Generic (PLEG): container finished" podID="77bb03f4-4465-41a4-a321-2d8eef22438b" containerID="c5fe221b1488356250b3408fbdbb053f965b6b64acab58231149a5b914848558" exitCode=0 Dec 03 08:37:02 crc kubenswrapper[4475]: I1203 08:37:02.357988 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-p778x" event={"ID":"77bb03f4-4465-41a4-a321-2d8eef22438b","Type":"ContainerDied","Data":"c5fe221b1488356250b3408fbdbb053f965b6b64acab58231149a5b914848558"} Dec 03 08:37:03 crc kubenswrapper[4475]: I1203 08:37:03.370366 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-p778x" event={"ID":"77bb03f4-4465-41a4-a321-2d8eef22438b","Type":"ContainerStarted","Data":"43728eb882b8ecb5cef67fce04d7f85b8e94aa07dc1aff06a2edd9b5a0de7021"} Dec 03 08:37:04 crc kubenswrapper[4475]: I1203 08:37:04.381432 4475 generic.go:334] "Generic (PLEG): container finished" podID="77bb03f4-4465-41a4-a321-2d8eef22438b" containerID="43728eb882b8ecb5cef67fce04d7f85b8e94aa07dc1aff06a2edd9b5a0de7021" exitCode=0 Dec 03 08:37:04 crc kubenswrapper[4475]: I1203 08:37:04.381577 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-p778x" 
event={"ID":"77bb03f4-4465-41a4-a321-2d8eef22438b","Type":"ContainerDied","Data":"43728eb882b8ecb5cef67fce04d7f85b8e94aa07dc1aff06a2edd9b5a0de7021"} Dec 03 08:37:05 crc kubenswrapper[4475]: I1203 08:37:05.394826 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-p778x" event={"ID":"77bb03f4-4465-41a4-a321-2d8eef22438b","Type":"ContainerStarted","Data":"158a19a7dea32f3d80a23b7f59eb49cde5dfd527085b9a8df823464033e05db8"} Dec 03 08:37:05 crc kubenswrapper[4475]: I1203 08:37:05.411669 4475 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-p778x" podStartSLOduration=2.698448375 podStartE2EDuration="5.411643591s" podCreationTimestamp="2025-12-03 08:37:00 +0000 UTC" firstStartedPulling="2025-12-03 08:37:02.361705266 +0000 UTC m=+6707.166603600" lastFinishedPulling="2025-12-03 08:37:05.074900482 +0000 UTC m=+6709.879798816" observedRunningTime="2025-12-03 08:37:05.41020739 +0000 UTC m=+6710.215105724" watchObservedRunningTime="2025-12-03 08:37:05.411643591 +0000 UTC m=+6710.216541925" Dec 03 08:37:10 crc kubenswrapper[4475]: I1203 08:37:10.736823 4475 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-p778x" Dec 03 08:37:10 crc kubenswrapper[4475]: I1203 08:37:10.737733 4475 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-p778x" Dec 03 08:37:10 crc kubenswrapper[4475]: I1203 08:37:10.777676 4475 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-p778x" Dec 03 08:37:11 crc kubenswrapper[4475]: I1203 08:37:11.488922 4475 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-p778x" Dec 03 08:37:11 crc kubenswrapper[4475]: I1203 08:37:11.538736 4475 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openshift-marketplace/community-operators-p778x"] Dec 03 08:37:13 crc kubenswrapper[4475]: I1203 08:37:13.467073 4475 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-p778x" podUID="77bb03f4-4465-41a4-a321-2d8eef22438b" containerName="registry-server" containerID="cri-o://158a19a7dea32f3d80a23b7f59eb49cde5dfd527085b9a8df823464033e05db8" gracePeriod=2 Dec 03 08:37:13 crc kubenswrapper[4475]: I1203 08:37:13.916237 4475 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-p778x" Dec 03 08:37:14 crc kubenswrapper[4475]: I1203 08:37:14.029260 4475 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/77bb03f4-4465-41a4-a321-2d8eef22438b-catalog-content\") pod \"77bb03f4-4465-41a4-a321-2d8eef22438b\" (UID: \"77bb03f4-4465-41a4-a321-2d8eef22438b\") " Dec 03 08:37:14 crc kubenswrapper[4475]: I1203 08:37:14.029679 4475 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sc4r5\" (UniqueName: \"kubernetes.io/projected/77bb03f4-4465-41a4-a321-2d8eef22438b-kube-api-access-sc4r5\") pod \"77bb03f4-4465-41a4-a321-2d8eef22438b\" (UID: \"77bb03f4-4465-41a4-a321-2d8eef22438b\") " Dec 03 08:37:14 crc kubenswrapper[4475]: I1203 08:37:14.029830 4475 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/77bb03f4-4465-41a4-a321-2d8eef22438b-utilities\") pod \"77bb03f4-4465-41a4-a321-2d8eef22438b\" (UID: \"77bb03f4-4465-41a4-a321-2d8eef22438b\") " Dec 03 08:37:14 crc kubenswrapper[4475]: I1203 08:37:14.030896 4475 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/77bb03f4-4465-41a4-a321-2d8eef22438b-utilities" (OuterVolumeSpecName: "utilities") pod "77bb03f4-4465-41a4-a321-2d8eef22438b" (UID: 
"77bb03f4-4465-41a4-a321-2d8eef22438b"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 08:37:14 crc kubenswrapper[4475]: I1203 08:37:14.042127 4475 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/77bb03f4-4465-41a4-a321-2d8eef22438b-kube-api-access-sc4r5" (OuterVolumeSpecName: "kube-api-access-sc4r5") pod "77bb03f4-4465-41a4-a321-2d8eef22438b" (UID: "77bb03f4-4465-41a4-a321-2d8eef22438b"). InnerVolumeSpecName "kube-api-access-sc4r5". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 08:37:14 crc kubenswrapper[4475]: I1203 08:37:14.073011 4475 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/77bb03f4-4465-41a4-a321-2d8eef22438b-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "77bb03f4-4465-41a4-a321-2d8eef22438b" (UID: "77bb03f4-4465-41a4-a321-2d8eef22438b"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 08:37:14 crc kubenswrapper[4475]: I1203 08:37:14.131729 4475 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sc4r5\" (UniqueName: \"kubernetes.io/projected/77bb03f4-4465-41a4-a321-2d8eef22438b-kube-api-access-sc4r5\") on node \"crc\" DevicePath \"\"" Dec 03 08:37:14 crc kubenswrapper[4475]: I1203 08:37:14.131762 4475 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/77bb03f4-4465-41a4-a321-2d8eef22438b-utilities\") on node \"crc\" DevicePath \"\"" Dec 03 08:37:14 crc kubenswrapper[4475]: I1203 08:37:14.131772 4475 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/77bb03f4-4465-41a4-a321-2d8eef22438b-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 03 08:37:14 crc kubenswrapper[4475]: I1203 08:37:14.475194 4475 generic.go:334] "Generic (PLEG): container finished" 
podID="77bb03f4-4465-41a4-a321-2d8eef22438b" containerID="158a19a7dea32f3d80a23b7f59eb49cde5dfd527085b9a8df823464033e05db8" exitCode=0 Dec 03 08:37:14 crc kubenswrapper[4475]: I1203 08:37:14.475253 4475 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-p778x" Dec 03 08:37:14 crc kubenswrapper[4475]: I1203 08:37:14.475245 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-p778x" event={"ID":"77bb03f4-4465-41a4-a321-2d8eef22438b","Type":"ContainerDied","Data":"158a19a7dea32f3d80a23b7f59eb49cde5dfd527085b9a8df823464033e05db8"} Dec 03 08:37:14 crc kubenswrapper[4475]: I1203 08:37:14.476202 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-p778x" event={"ID":"77bb03f4-4465-41a4-a321-2d8eef22438b","Type":"ContainerDied","Data":"cf9fa173cf86c45dcb4de3133a25b556dad24968f196f12efd8984f19fc3582a"} Dec 03 08:37:14 crc kubenswrapper[4475]: I1203 08:37:14.476295 4475 scope.go:117] "RemoveContainer" containerID="158a19a7dea32f3d80a23b7f59eb49cde5dfd527085b9a8df823464033e05db8" Dec 03 08:37:14 crc kubenswrapper[4475]: I1203 08:37:14.500144 4475 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-p778x"] Dec 03 08:37:14 crc kubenswrapper[4475]: I1203 08:37:14.504228 4475 scope.go:117] "RemoveContainer" containerID="43728eb882b8ecb5cef67fce04d7f85b8e94aa07dc1aff06a2edd9b5a0de7021" Dec 03 08:37:14 crc kubenswrapper[4475]: I1203 08:37:14.506960 4475 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-p778x"] Dec 03 08:37:14 crc kubenswrapper[4475]: I1203 08:37:14.521390 4475 scope.go:117] "RemoveContainer" containerID="c5fe221b1488356250b3408fbdbb053f965b6b64acab58231149a5b914848558" Dec 03 08:37:14 crc kubenswrapper[4475]: I1203 08:37:14.558849 4475 scope.go:117] "RemoveContainer" 
containerID="158a19a7dea32f3d80a23b7f59eb49cde5dfd527085b9a8df823464033e05db8" Dec 03 08:37:14 crc kubenswrapper[4475]: E1203 08:37:14.559331 4475 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"158a19a7dea32f3d80a23b7f59eb49cde5dfd527085b9a8df823464033e05db8\": container with ID starting with 158a19a7dea32f3d80a23b7f59eb49cde5dfd527085b9a8df823464033e05db8 not found: ID does not exist" containerID="158a19a7dea32f3d80a23b7f59eb49cde5dfd527085b9a8df823464033e05db8" Dec 03 08:37:14 crc kubenswrapper[4475]: I1203 08:37:14.559371 4475 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"158a19a7dea32f3d80a23b7f59eb49cde5dfd527085b9a8df823464033e05db8"} err="failed to get container status \"158a19a7dea32f3d80a23b7f59eb49cde5dfd527085b9a8df823464033e05db8\": rpc error: code = NotFound desc = could not find container \"158a19a7dea32f3d80a23b7f59eb49cde5dfd527085b9a8df823464033e05db8\": container with ID starting with 158a19a7dea32f3d80a23b7f59eb49cde5dfd527085b9a8df823464033e05db8 not found: ID does not exist" Dec 03 08:37:14 crc kubenswrapper[4475]: I1203 08:37:14.559418 4475 scope.go:117] "RemoveContainer" containerID="43728eb882b8ecb5cef67fce04d7f85b8e94aa07dc1aff06a2edd9b5a0de7021" Dec 03 08:37:14 crc kubenswrapper[4475]: E1203 08:37:14.559851 4475 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"43728eb882b8ecb5cef67fce04d7f85b8e94aa07dc1aff06a2edd9b5a0de7021\": container with ID starting with 43728eb882b8ecb5cef67fce04d7f85b8e94aa07dc1aff06a2edd9b5a0de7021 not found: ID does not exist" containerID="43728eb882b8ecb5cef67fce04d7f85b8e94aa07dc1aff06a2edd9b5a0de7021" Dec 03 08:37:14 crc kubenswrapper[4475]: I1203 08:37:14.559892 4475 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"43728eb882b8ecb5cef67fce04d7f85b8e94aa07dc1aff06a2edd9b5a0de7021"} err="failed to get container status \"43728eb882b8ecb5cef67fce04d7f85b8e94aa07dc1aff06a2edd9b5a0de7021\": rpc error: code = NotFound desc = could not find container \"43728eb882b8ecb5cef67fce04d7f85b8e94aa07dc1aff06a2edd9b5a0de7021\": container with ID starting with 43728eb882b8ecb5cef67fce04d7f85b8e94aa07dc1aff06a2edd9b5a0de7021 not found: ID does not exist" Dec 03 08:37:14 crc kubenswrapper[4475]: I1203 08:37:14.559907 4475 scope.go:117] "RemoveContainer" containerID="c5fe221b1488356250b3408fbdbb053f965b6b64acab58231149a5b914848558" Dec 03 08:37:14 crc kubenswrapper[4475]: E1203 08:37:14.560220 4475 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c5fe221b1488356250b3408fbdbb053f965b6b64acab58231149a5b914848558\": container with ID starting with c5fe221b1488356250b3408fbdbb053f965b6b64acab58231149a5b914848558 not found: ID does not exist" containerID="c5fe221b1488356250b3408fbdbb053f965b6b64acab58231149a5b914848558" Dec 03 08:37:14 crc kubenswrapper[4475]: I1203 08:37:14.560261 4475 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c5fe221b1488356250b3408fbdbb053f965b6b64acab58231149a5b914848558"} err="failed to get container status \"c5fe221b1488356250b3408fbdbb053f965b6b64acab58231149a5b914848558\": rpc error: code = NotFound desc = could not find container \"c5fe221b1488356250b3408fbdbb053f965b6b64acab58231149a5b914848558\": container with ID starting with c5fe221b1488356250b3408fbdbb053f965b6b64acab58231149a5b914848558 not found: ID does not exist" Dec 03 08:37:15 crc kubenswrapper[4475]: I1203 08:37:15.504233 4475 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="77bb03f4-4465-41a4-a321-2d8eef22438b" path="/var/lib/kubelet/pods/77bb03f4-4465-41a4-a321-2d8eef22438b/volumes" Dec 03 08:37:20 crc kubenswrapper[4475]: I1203 
08:37:20.886700 4475 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-pxvkn"] Dec 03 08:37:20 crc kubenswrapper[4475]: E1203 08:37:20.887491 4475 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="77bb03f4-4465-41a4-a321-2d8eef22438b" containerName="extract-utilities" Dec 03 08:37:20 crc kubenswrapper[4475]: I1203 08:37:20.887505 4475 state_mem.go:107] "Deleted CPUSet assignment" podUID="77bb03f4-4465-41a4-a321-2d8eef22438b" containerName="extract-utilities" Dec 03 08:37:20 crc kubenswrapper[4475]: E1203 08:37:20.887531 4475 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="77bb03f4-4465-41a4-a321-2d8eef22438b" containerName="extract-content" Dec 03 08:37:20 crc kubenswrapper[4475]: I1203 08:37:20.887536 4475 state_mem.go:107] "Deleted CPUSet assignment" podUID="77bb03f4-4465-41a4-a321-2d8eef22438b" containerName="extract-content" Dec 03 08:37:20 crc kubenswrapper[4475]: E1203 08:37:20.887547 4475 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="77bb03f4-4465-41a4-a321-2d8eef22438b" containerName="registry-server" Dec 03 08:37:20 crc kubenswrapper[4475]: I1203 08:37:20.887552 4475 state_mem.go:107] "Deleted CPUSet assignment" podUID="77bb03f4-4465-41a4-a321-2d8eef22438b" containerName="registry-server" Dec 03 08:37:20 crc kubenswrapper[4475]: I1203 08:37:20.887716 4475 memory_manager.go:354] "RemoveStaleState removing state" podUID="77bb03f4-4465-41a4-a321-2d8eef22438b" containerName="registry-server" Dec 03 08:37:20 crc kubenswrapper[4475]: I1203 08:37:20.888939 4475 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-pxvkn" Dec 03 08:37:20 crc kubenswrapper[4475]: I1203 08:37:20.904852 4475 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-pxvkn"] Dec 03 08:37:20 crc kubenswrapper[4475]: I1203 08:37:20.979892 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zlbfb\" (UniqueName: \"kubernetes.io/projected/b89878fb-3274-4a9d-97de-3a9cf657d97b-kube-api-access-zlbfb\") pod \"redhat-operators-pxvkn\" (UID: \"b89878fb-3274-4a9d-97de-3a9cf657d97b\") " pod="openshift-marketplace/redhat-operators-pxvkn" Dec 03 08:37:20 crc kubenswrapper[4475]: I1203 08:37:20.980131 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b89878fb-3274-4a9d-97de-3a9cf657d97b-catalog-content\") pod \"redhat-operators-pxvkn\" (UID: \"b89878fb-3274-4a9d-97de-3a9cf657d97b\") " pod="openshift-marketplace/redhat-operators-pxvkn" Dec 03 08:37:20 crc kubenswrapper[4475]: I1203 08:37:20.980356 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b89878fb-3274-4a9d-97de-3a9cf657d97b-utilities\") pod \"redhat-operators-pxvkn\" (UID: \"b89878fb-3274-4a9d-97de-3a9cf657d97b\") " pod="openshift-marketplace/redhat-operators-pxvkn" Dec 03 08:37:21 crc kubenswrapper[4475]: I1203 08:37:21.081854 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b89878fb-3274-4a9d-97de-3a9cf657d97b-catalog-content\") pod \"redhat-operators-pxvkn\" (UID: \"b89878fb-3274-4a9d-97de-3a9cf657d97b\") " pod="openshift-marketplace/redhat-operators-pxvkn" Dec 03 08:37:21 crc kubenswrapper[4475]: I1203 08:37:21.082021 4475 reconciler_common.go:218] "operationExecutor.MountVolume started 
for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b89878fb-3274-4a9d-97de-3a9cf657d97b-utilities\") pod \"redhat-operators-pxvkn\" (UID: \"b89878fb-3274-4a9d-97de-3a9cf657d97b\") " pod="openshift-marketplace/redhat-operators-pxvkn" Dec 03 08:37:21 crc kubenswrapper[4475]: I1203 08:37:21.082096 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zlbfb\" (UniqueName: \"kubernetes.io/projected/b89878fb-3274-4a9d-97de-3a9cf657d97b-kube-api-access-zlbfb\") pod \"redhat-operators-pxvkn\" (UID: \"b89878fb-3274-4a9d-97de-3a9cf657d97b\") " pod="openshift-marketplace/redhat-operators-pxvkn" Dec 03 08:37:21 crc kubenswrapper[4475]: I1203 08:37:21.082237 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b89878fb-3274-4a9d-97de-3a9cf657d97b-catalog-content\") pod \"redhat-operators-pxvkn\" (UID: \"b89878fb-3274-4a9d-97de-3a9cf657d97b\") " pod="openshift-marketplace/redhat-operators-pxvkn" Dec 03 08:37:21 crc kubenswrapper[4475]: I1203 08:37:21.082421 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b89878fb-3274-4a9d-97de-3a9cf657d97b-utilities\") pod \"redhat-operators-pxvkn\" (UID: \"b89878fb-3274-4a9d-97de-3a9cf657d97b\") " pod="openshift-marketplace/redhat-operators-pxvkn" Dec 03 08:37:21 crc kubenswrapper[4475]: I1203 08:37:21.101634 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zlbfb\" (UniqueName: \"kubernetes.io/projected/b89878fb-3274-4a9d-97de-3a9cf657d97b-kube-api-access-zlbfb\") pod \"redhat-operators-pxvkn\" (UID: \"b89878fb-3274-4a9d-97de-3a9cf657d97b\") " pod="openshift-marketplace/redhat-operators-pxvkn" Dec 03 08:37:21 crc kubenswrapper[4475]: I1203 08:37:21.204237 4475 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-pxvkn" Dec 03 08:37:21 crc kubenswrapper[4475]: I1203 08:37:21.782942 4475 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-pxvkn"] Dec 03 08:37:22 crc kubenswrapper[4475]: I1203 08:37:22.546022 4475 generic.go:334] "Generic (PLEG): container finished" podID="b89878fb-3274-4a9d-97de-3a9cf657d97b" containerID="f8006e7b82ca31f9b532cb8c0468dfdbb7854fd7badc57a10032f088da814e45" exitCode=0 Dec 03 08:37:22 crc kubenswrapper[4475]: I1203 08:37:22.546127 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-pxvkn" event={"ID":"b89878fb-3274-4a9d-97de-3a9cf657d97b","Type":"ContainerDied","Data":"f8006e7b82ca31f9b532cb8c0468dfdbb7854fd7badc57a10032f088da814e45"} Dec 03 08:37:22 crc kubenswrapper[4475]: I1203 08:37:22.546391 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-pxvkn" event={"ID":"b89878fb-3274-4a9d-97de-3a9cf657d97b","Type":"ContainerStarted","Data":"1ccf10ffb7f7bfaa0882759968b3f7026a99aabbe377d9749ec4c81f225b15d8"} Dec 03 08:37:23 crc kubenswrapper[4475]: I1203 08:37:23.554241 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-pxvkn" event={"ID":"b89878fb-3274-4a9d-97de-3a9cf657d97b","Type":"ContainerStarted","Data":"e381e334db7b1da330cd0e61719ce4fcf260e5f2c515df9a310f427ea877702a"} Dec 03 08:37:25 crc kubenswrapper[4475]: I1203 08:37:25.570087 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-pxvkn" event={"ID":"b89878fb-3274-4a9d-97de-3a9cf657d97b","Type":"ContainerDied","Data":"e381e334db7b1da330cd0e61719ce4fcf260e5f2c515df9a310f427ea877702a"} Dec 03 08:37:25 crc kubenswrapper[4475]: I1203 08:37:25.570367 4475 generic.go:334] "Generic (PLEG): container finished" podID="b89878fb-3274-4a9d-97de-3a9cf657d97b" 
containerID="e381e334db7b1da330cd0e61719ce4fcf260e5f2c515df9a310f427ea877702a" exitCode=0 Dec 03 08:37:26 crc kubenswrapper[4475]: I1203 08:37:26.579248 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-pxvkn" event={"ID":"b89878fb-3274-4a9d-97de-3a9cf657d97b","Type":"ContainerStarted","Data":"ae1aed1d02607be9a78bcafaa568277f5c32fe9c302e6ca0f8679ce6b44d14e5"} Dec 03 08:37:26 crc kubenswrapper[4475]: I1203 08:37:26.597499 4475 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-pxvkn" podStartSLOduration=3.065811508 podStartE2EDuration="6.597483284s" podCreationTimestamp="2025-12-03 08:37:20 +0000 UTC" firstStartedPulling="2025-12-03 08:37:22.548585493 +0000 UTC m=+6727.353483827" lastFinishedPulling="2025-12-03 08:37:26.080257278 +0000 UTC m=+6730.885155603" observedRunningTime="2025-12-03 08:37:26.59324188 +0000 UTC m=+6731.398140224" watchObservedRunningTime="2025-12-03 08:37:26.597483284 +0000 UTC m=+6731.402381619" Dec 03 08:37:28 crc kubenswrapper[4475]: I1203 08:37:28.933805 4475 patch_prober.go:28] interesting pod/machine-config-daemon-tjbzg container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 03 08:37:28 crc kubenswrapper[4475]: I1203 08:37:28.934049 4475 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-tjbzg" podUID="91aee7be-4a52-4598-803f-2deebe0674de" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 03 08:37:28 crc kubenswrapper[4475]: I1203 08:37:28.934085 4475 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-tjbzg" 
Dec 03 08:37:28 crc kubenswrapper[4475]: I1203 08:37:28.934615 4475 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"83104d69007c1efee0ad3ffc002e16539780626460002ad78672b69cf7a2e1ec"} pod="openshift-machine-config-operator/machine-config-daemon-tjbzg" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Dec 03 08:37:28 crc kubenswrapper[4475]: I1203 08:37:28.934663 4475 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-tjbzg" podUID="91aee7be-4a52-4598-803f-2deebe0674de" containerName="machine-config-daemon" containerID="cri-o://83104d69007c1efee0ad3ffc002e16539780626460002ad78672b69cf7a2e1ec" gracePeriod=600 Dec 03 08:37:29 crc kubenswrapper[4475]: E1203 08:37:29.069621 4475 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-tjbzg_openshift-machine-config-operator(91aee7be-4a52-4598-803f-2deebe0674de)\"" pod="openshift-machine-config-operator/machine-config-daemon-tjbzg" podUID="91aee7be-4a52-4598-803f-2deebe0674de" Dec 03 08:37:29 crc kubenswrapper[4475]: I1203 08:37:29.602540 4475 generic.go:334] "Generic (PLEG): container finished" podID="91aee7be-4a52-4598-803f-2deebe0674de" containerID="83104d69007c1efee0ad3ffc002e16539780626460002ad78672b69cf7a2e1ec" exitCode=0 Dec 03 08:37:29 crc kubenswrapper[4475]: I1203 08:37:29.602588 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-tjbzg" event={"ID":"91aee7be-4a52-4598-803f-2deebe0674de","Type":"ContainerDied","Data":"83104d69007c1efee0ad3ffc002e16539780626460002ad78672b69cf7a2e1ec"} Dec 03 08:37:29 crc kubenswrapper[4475]: I1203 08:37:29.602632 4475 scope.go:117] 
"RemoveContainer" containerID="5e4bdfeb75b6a6f33140859fefdf48e5e0055c78f6936aca52f1be6e00372076" Dec 03 08:37:29 crc kubenswrapper[4475]: I1203 08:37:29.603150 4475 scope.go:117] "RemoveContainer" containerID="83104d69007c1efee0ad3ffc002e16539780626460002ad78672b69cf7a2e1ec" Dec 03 08:37:29 crc kubenswrapper[4475]: E1203 08:37:29.603480 4475 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-tjbzg_openshift-machine-config-operator(91aee7be-4a52-4598-803f-2deebe0674de)\"" pod="openshift-machine-config-operator/machine-config-daemon-tjbzg" podUID="91aee7be-4a52-4598-803f-2deebe0674de" Dec 03 08:37:31 crc kubenswrapper[4475]: I1203 08:37:31.205171 4475 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-pxvkn" Dec 03 08:37:31 crc kubenswrapper[4475]: I1203 08:37:31.205393 4475 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-pxvkn" Dec 03 08:37:32 crc kubenswrapper[4475]: I1203 08:37:32.239412 4475 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-pxvkn" podUID="b89878fb-3274-4a9d-97de-3a9cf657d97b" containerName="registry-server" probeResult="failure" output=< Dec 03 08:37:32 crc kubenswrapper[4475]: timeout: failed to connect service ":50051" within 1s Dec 03 08:37:32 crc kubenswrapper[4475]: > Dec 03 08:37:41 crc kubenswrapper[4475]: I1203 08:37:41.247003 4475 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-pxvkn" Dec 03 08:37:41 crc kubenswrapper[4475]: I1203 08:37:41.282942 4475 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-pxvkn" Dec 03 08:37:41 crc kubenswrapper[4475]: I1203 08:37:41.475959 
4475 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-pxvkn"] Dec 03 08:37:42 crc kubenswrapper[4475]: I1203 08:37:42.700890 4475 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-pxvkn" podUID="b89878fb-3274-4a9d-97de-3a9cf657d97b" containerName="registry-server" containerID="cri-o://ae1aed1d02607be9a78bcafaa568277f5c32fe9c302e6ca0f8679ce6b44d14e5" gracePeriod=2 Dec 03 08:37:43 crc kubenswrapper[4475]: I1203 08:37:43.217599 4475 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-pxvkn" Dec 03 08:37:43 crc kubenswrapper[4475]: I1203 08:37:43.374491 4475 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b89878fb-3274-4a9d-97de-3a9cf657d97b-utilities\") pod \"b89878fb-3274-4a9d-97de-3a9cf657d97b\" (UID: \"b89878fb-3274-4a9d-97de-3a9cf657d97b\") " Dec 03 08:37:43 crc kubenswrapper[4475]: I1203 08:37:43.374676 4475 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zlbfb\" (UniqueName: \"kubernetes.io/projected/b89878fb-3274-4a9d-97de-3a9cf657d97b-kube-api-access-zlbfb\") pod \"b89878fb-3274-4a9d-97de-3a9cf657d97b\" (UID: \"b89878fb-3274-4a9d-97de-3a9cf657d97b\") " Dec 03 08:37:43 crc kubenswrapper[4475]: I1203 08:37:43.374701 4475 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b89878fb-3274-4a9d-97de-3a9cf657d97b-catalog-content\") pod \"b89878fb-3274-4a9d-97de-3a9cf657d97b\" (UID: \"b89878fb-3274-4a9d-97de-3a9cf657d97b\") " Dec 03 08:37:43 crc kubenswrapper[4475]: I1203 08:37:43.374942 4475 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b89878fb-3274-4a9d-97de-3a9cf657d97b-utilities" (OuterVolumeSpecName: "utilities") pod 
"b89878fb-3274-4a9d-97de-3a9cf657d97b" (UID: "b89878fb-3274-4a9d-97de-3a9cf657d97b"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 08:37:43 crc kubenswrapper[4475]: I1203 08:37:43.375297 4475 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b89878fb-3274-4a9d-97de-3a9cf657d97b-utilities\") on node \"crc\" DevicePath \"\"" Dec 03 08:37:43 crc kubenswrapper[4475]: I1203 08:37:43.390962 4475 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b89878fb-3274-4a9d-97de-3a9cf657d97b-kube-api-access-zlbfb" (OuterVolumeSpecName: "kube-api-access-zlbfb") pod "b89878fb-3274-4a9d-97de-3a9cf657d97b" (UID: "b89878fb-3274-4a9d-97de-3a9cf657d97b"). InnerVolumeSpecName "kube-api-access-zlbfb". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 08:37:43 crc kubenswrapper[4475]: I1203 08:37:43.458033 4475 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b89878fb-3274-4a9d-97de-3a9cf657d97b-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "b89878fb-3274-4a9d-97de-3a9cf657d97b" (UID: "b89878fb-3274-4a9d-97de-3a9cf657d97b"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 08:37:43 crc kubenswrapper[4475]: I1203 08:37:43.477618 4475 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zlbfb\" (UniqueName: \"kubernetes.io/projected/b89878fb-3274-4a9d-97de-3a9cf657d97b-kube-api-access-zlbfb\") on node \"crc\" DevicePath \"\"" Dec 03 08:37:43 crc kubenswrapper[4475]: I1203 08:37:43.477641 4475 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b89878fb-3274-4a9d-97de-3a9cf657d97b-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 03 08:37:43 crc kubenswrapper[4475]: I1203 08:37:43.492565 4475 scope.go:117] "RemoveContainer" containerID="83104d69007c1efee0ad3ffc002e16539780626460002ad78672b69cf7a2e1ec" Dec 03 08:37:43 crc kubenswrapper[4475]: E1203 08:37:43.492783 4475 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-tjbzg_openshift-machine-config-operator(91aee7be-4a52-4598-803f-2deebe0674de)\"" pod="openshift-machine-config-operator/machine-config-daemon-tjbzg" podUID="91aee7be-4a52-4598-803f-2deebe0674de" Dec 03 08:37:43 crc kubenswrapper[4475]: I1203 08:37:43.710517 4475 generic.go:334] "Generic (PLEG): container finished" podID="b89878fb-3274-4a9d-97de-3a9cf657d97b" containerID="ae1aed1d02607be9a78bcafaa568277f5c32fe9c302e6ca0f8679ce6b44d14e5" exitCode=0 Dec 03 08:37:43 crc kubenswrapper[4475]: I1203 08:37:43.710561 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-pxvkn" event={"ID":"b89878fb-3274-4a9d-97de-3a9cf657d97b","Type":"ContainerDied","Data":"ae1aed1d02607be9a78bcafaa568277f5c32fe9c302e6ca0f8679ce6b44d14e5"} Dec 03 08:37:43 crc kubenswrapper[4475]: I1203 08:37:43.710588 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/redhat-operators-pxvkn" event={"ID":"b89878fb-3274-4a9d-97de-3a9cf657d97b","Type":"ContainerDied","Data":"1ccf10ffb7f7bfaa0882759968b3f7026a99aabbe377d9749ec4c81f225b15d8"} Dec 03 08:37:43 crc kubenswrapper[4475]: I1203 08:37:43.710591 4475 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-pxvkn" Dec 03 08:37:43 crc kubenswrapper[4475]: I1203 08:37:43.710604 4475 scope.go:117] "RemoveContainer" containerID="ae1aed1d02607be9a78bcafaa568277f5c32fe9c302e6ca0f8679ce6b44d14e5" Dec 03 08:37:43 crc kubenswrapper[4475]: I1203 08:37:43.734965 4475 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-pxvkn"] Dec 03 08:37:43 crc kubenswrapper[4475]: I1203 08:37:43.735632 4475 scope.go:117] "RemoveContainer" containerID="e381e334db7b1da330cd0e61719ce4fcf260e5f2c515df9a310f427ea877702a" Dec 03 08:37:43 crc kubenswrapper[4475]: I1203 08:37:43.747602 4475 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-pxvkn"] Dec 03 08:37:43 crc kubenswrapper[4475]: I1203 08:37:43.759107 4475 scope.go:117] "RemoveContainer" containerID="f8006e7b82ca31f9b532cb8c0468dfdbb7854fd7badc57a10032f088da814e45" Dec 03 08:37:43 crc kubenswrapper[4475]: I1203 08:37:43.792395 4475 scope.go:117] "RemoveContainer" containerID="ae1aed1d02607be9a78bcafaa568277f5c32fe9c302e6ca0f8679ce6b44d14e5" Dec 03 08:37:43 crc kubenswrapper[4475]: E1203 08:37:43.792784 4475 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ae1aed1d02607be9a78bcafaa568277f5c32fe9c302e6ca0f8679ce6b44d14e5\": container with ID starting with ae1aed1d02607be9a78bcafaa568277f5c32fe9c302e6ca0f8679ce6b44d14e5 not found: ID does not exist" containerID="ae1aed1d02607be9a78bcafaa568277f5c32fe9c302e6ca0f8679ce6b44d14e5" Dec 03 08:37:43 crc kubenswrapper[4475]: I1203 08:37:43.792814 4475 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ae1aed1d02607be9a78bcafaa568277f5c32fe9c302e6ca0f8679ce6b44d14e5"} err="failed to get container status \"ae1aed1d02607be9a78bcafaa568277f5c32fe9c302e6ca0f8679ce6b44d14e5\": rpc error: code = NotFound desc = could not find container \"ae1aed1d02607be9a78bcafaa568277f5c32fe9c302e6ca0f8679ce6b44d14e5\": container with ID starting with ae1aed1d02607be9a78bcafaa568277f5c32fe9c302e6ca0f8679ce6b44d14e5 not found: ID does not exist" Dec 03 08:37:43 crc kubenswrapper[4475]: I1203 08:37:43.792839 4475 scope.go:117] "RemoveContainer" containerID="e381e334db7b1da330cd0e61719ce4fcf260e5f2c515df9a310f427ea877702a" Dec 03 08:37:43 crc kubenswrapper[4475]: E1203 08:37:43.793813 4475 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e381e334db7b1da330cd0e61719ce4fcf260e5f2c515df9a310f427ea877702a\": container with ID starting with e381e334db7b1da330cd0e61719ce4fcf260e5f2c515df9a310f427ea877702a not found: ID does not exist" containerID="e381e334db7b1da330cd0e61719ce4fcf260e5f2c515df9a310f427ea877702a" Dec 03 08:37:43 crc kubenswrapper[4475]: I1203 08:37:43.793835 4475 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e381e334db7b1da330cd0e61719ce4fcf260e5f2c515df9a310f427ea877702a"} err="failed to get container status \"e381e334db7b1da330cd0e61719ce4fcf260e5f2c515df9a310f427ea877702a\": rpc error: code = NotFound desc = could not find container \"e381e334db7b1da330cd0e61719ce4fcf260e5f2c515df9a310f427ea877702a\": container with ID starting with e381e334db7b1da330cd0e61719ce4fcf260e5f2c515df9a310f427ea877702a not found: ID does not exist" Dec 03 08:37:43 crc kubenswrapper[4475]: I1203 08:37:43.793854 4475 scope.go:117] "RemoveContainer" containerID="f8006e7b82ca31f9b532cb8c0468dfdbb7854fd7badc57a10032f088da814e45" Dec 03 08:37:43 crc kubenswrapper[4475]: E1203 
08:37:43.794064 4475 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f8006e7b82ca31f9b532cb8c0468dfdbb7854fd7badc57a10032f088da814e45\": container with ID starting with f8006e7b82ca31f9b532cb8c0468dfdbb7854fd7badc57a10032f088da814e45 not found: ID does not exist" containerID="f8006e7b82ca31f9b532cb8c0468dfdbb7854fd7badc57a10032f088da814e45" Dec 03 08:37:43 crc kubenswrapper[4475]: I1203 08:37:43.794080 4475 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f8006e7b82ca31f9b532cb8c0468dfdbb7854fd7badc57a10032f088da814e45"} err="failed to get container status \"f8006e7b82ca31f9b532cb8c0468dfdbb7854fd7badc57a10032f088da814e45\": rpc error: code = NotFound desc = could not find container \"f8006e7b82ca31f9b532cb8c0468dfdbb7854fd7badc57a10032f088da814e45\": container with ID starting with f8006e7b82ca31f9b532cb8c0468dfdbb7854fd7badc57a10032f088da814e45 not found: ID does not exist" Dec 03 08:37:45 crc kubenswrapper[4475]: I1203 08:37:45.499157 4475 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b89878fb-3274-4a9d-97de-3a9cf657d97b" path="/var/lib/kubelet/pods/b89878fb-3274-4a9d-97de-3a9cf657d97b/volumes" Dec 03 08:37:58 crc kubenswrapper[4475]: I1203 08:37:58.491998 4475 scope.go:117] "RemoveContainer" containerID="83104d69007c1efee0ad3ffc002e16539780626460002ad78672b69cf7a2e1ec" Dec 03 08:37:58 crc kubenswrapper[4475]: E1203 08:37:58.492746 4475 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-tjbzg_openshift-machine-config-operator(91aee7be-4a52-4598-803f-2deebe0674de)\"" pod="openshift-machine-config-operator/machine-config-daemon-tjbzg" podUID="91aee7be-4a52-4598-803f-2deebe0674de" Dec 03 08:38:09 crc kubenswrapper[4475]: I1203 08:38:09.491210 
4475 scope.go:117] "RemoveContainer" containerID="83104d69007c1efee0ad3ffc002e16539780626460002ad78672b69cf7a2e1ec" Dec 03 08:38:09 crc kubenswrapper[4475]: E1203 08:38:09.492075 4475 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-tjbzg_openshift-machine-config-operator(91aee7be-4a52-4598-803f-2deebe0674de)\"" pod="openshift-machine-config-operator/machine-config-daemon-tjbzg" podUID="91aee7be-4a52-4598-803f-2deebe0674de" Dec 03 08:38:20 crc kubenswrapper[4475]: I1203 08:38:20.492575 4475 scope.go:117] "RemoveContainer" containerID="83104d69007c1efee0ad3ffc002e16539780626460002ad78672b69cf7a2e1ec" Dec 03 08:38:20 crc kubenswrapper[4475]: E1203 08:38:20.494335 4475 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-tjbzg_openshift-machine-config-operator(91aee7be-4a52-4598-803f-2deebe0674de)\"" pod="openshift-machine-config-operator/machine-config-daemon-tjbzg" podUID="91aee7be-4a52-4598-803f-2deebe0674de" Dec 03 08:38:32 crc kubenswrapper[4475]: I1203 08:38:32.492159 4475 scope.go:117] "RemoveContainer" containerID="83104d69007c1efee0ad3ffc002e16539780626460002ad78672b69cf7a2e1ec" Dec 03 08:38:32 crc kubenswrapper[4475]: E1203 08:38:32.493202 4475 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-tjbzg_openshift-machine-config-operator(91aee7be-4a52-4598-803f-2deebe0674de)\"" pod="openshift-machine-config-operator/machine-config-daemon-tjbzg" podUID="91aee7be-4a52-4598-803f-2deebe0674de" Dec 03 08:38:47 crc kubenswrapper[4475]: I1203 
08:38:47.492142 4475 scope.go:117] "RemoveContainer" containerID="83104d69007c1efee0ad3ffc002e16539780626460002ad78672b69cf7a2e1ec" Dec 03 08:38:47 crc kubenswrapper[4475]: E1203 08:38:47.492971 4475 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-tjbzg_openshift-machine-config-operator(91aee7be-4a52-4598-803f-2deebe0674de)\"" pod="openshift-machine-config-operator/machine-config-daemon-tjbzg" podUID="91aee7be-4a52-4598-803f-2deebe0674de" Dec 03 08:39:02 crc kubenswrapper[4475]: I1203 08:39:02.492441 4475 scope.go:117] "RemoveContainer" containerID="83104d69007c1efee0ad3ffc002e16539780626460002ad78672b69cf7a2e1ec" Dec 03 08:39:02 crc kubenswrapper[4475]: E1203 08:39:02.493497 4475 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-tjbzg_openshift-machine-config-operator(91aee7be-4a52-4598-803f-2deebe0674de)\"" pod="openshift-machine-config-operator/machine-config-daemon-tjbzg" podUID="91aee7be-4a52-4598-803f-2deebe0674de" Dec 03 08:39:13 crc kubenswrapper[4475]: I1203 08:39:13.492040 4475 scope.go:117] "RemoveContainer" containerID="83104d69007c1efee0ad3ffc002e16539780626460002ad78672b69cf7a2e1ec" Dec 03 08:39:13 crc kubenswrapper[4475]: E1203 08:39:13.493137 4475 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-tjbzg_openshift-machine-config-operator(91aee7be-4a52-4598-803f-2deebe0674de)\"" pod="openshift-machine-config-operator/machine-config-daemon-tjbzg" podUID="91aee7be-4a52-4598-803f-2deebe0674de" Dec 03 08:39:24 crc 
kubenswrapper[4475]: I1203 08:39:24.491495 4475 scope.go:117] "RemoveContainer" containerID="83104d69007c1efee0ad3ffc002e16539780626460002ad78672b69cf7a2e1ec" Dec 03 08:39:24 crc kubenswrapper[4475]: E1203 08:39:24.493507 4475 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-tjbzg_openshift-machine-config-operator(91aee7be-4a52-4598-803f-2deebe0674de)\"" pod="openshift-machine-config-operator/machine-config-daemon-tjbzg" podUID="91aee7be-4a52-4598-803f-2deebe0674de" Dec 03 08:39:39 crc kubenswrapper[4475]: I1203 08:39:39.491119 4475 scope.go:117] "RemoveContainer" containerID="83104d69007c1efee0ad3ffc002e16539780626460002ad78672b69cf7a2e1ec" Dec 03 08:39:39 crc kubenswrapper[4475]: E1203 08:39:39.492101 4475 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-tjbzg_openshift-machine-config-operator(91aee7be-4a52-4598-803f-2deebe0674de)\"" pod="openshift-machine-config-operator/machine-config-daemon-tjbzg" podUID="91aee7be-4a52-4598-803f-2deebe0674de" Dec 03 08:39:51 crc kubenswrapper[4475]: I1203 08:39:51.490906 4475 scope.go:117] "RemoveContainer" containerID="83104d69007c1efee0ad3ffc002e16539780626460002ad78672b69cf7a2e1ec" Dec 03 08:39:51 crc kubenswrapper[4475]: E1203 08:39:51.491530 4475 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-tjbzg_openshift-machine-config-operator(91aee7be-4a52-4598-803f-2deebe0674de)\"" pod="openshift-machine-config-operator/machine-config-daemon-tjbzg" podUID="91aee7be-4a52-4598-803f-2deebe0674de" Dec 
03 08:40:03 crc kubenswrapper[4475]: I1203 08:40:03.492242 4475 scope.go:117] "RemoveContainer" containerID="83104d69007c1efee0ad3ffc002e16539780626460002ad78672b69cf7a2e1ec" Dec 03 08:40:03 crc kubenswrapper[4475]: E1203 08:40:03.493295 4475 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-tjbzg_openshift-machine-config-operator(91aee7be-4a52-4598-803f-2deebe0674de)\"" pod="openshift-machine-config-operator/machine-config-daemon-tjbzg" podUID="91aee7be-4a52-4598-803f-2deebe0674de" Dec 03 08:40:18 crc kubenswrapper[4475]: I1203 08:40:18.493524 4475 scope.go:117] "RemoveContainer" containerID="83104d69007c1efee0ad3ffc002e16539780626460002ad78672b69cf7a2e1ec" Dec 03 08:40:18 crc kubenswrapper[4475]: E1203 08:40:18.494577 4475 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-tjbzg_openshift-machine-config-operator(91aee7be-4a52-4598-803f-2deebe0674de)\"" pod="openshift-machine-config-operator/machine-config-daemon-tjbzg" podUID="91aee7be-4a52-4598-803f-2deebe0674de" Dec 03 08:40:29 crc kubenswrapper[4475]: I1203 08:40:29.492111 4475 scope.go:117] "RemoveContainer" containerID="83104d69007c1efee0ad3ffc002e16539780626460002ad78672b69cf7a2e1ec" Dec 03 08:40:29 crc kubenswrapper[4475]: E1203 08:40:29.494395 4475 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-tjbzg_openshift-machine-config-operator(91aee7be-4a52-4598-803f-2deebe0674de)\"" pod="openshift-machine-config-operator/machine-config-daemon-tjbzg" 
podUID="91aee7be-4a52-4598-803f-2deebe0674de" Dec 03 08:40:43 crc kubenswrapper[4475]: I1203 08:40:43.491697 4475 scope.go:117] "RemoveContainer" containerID="83104d69007c1efee0ad3ffc002e16539780626460002ad78672b69cf7a2e1ec" Dec 03 08:40:43 crc kubenswrapper[4475]: E1203 08:40:43.492619 4475 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-tjbzg_openshift-machine-config-operator(91aee7be-4a52-4598-803f-2deebe0674de)\"" pod="openshift-machine-config-operator/machine-config-daemon-tjbzg" podUID="91aee7be-4a52-4598-803f-2deebe0674de" Dec 03 08:40:54 crc kubenswrapper[4475]: I1203 08:40:54.491013 4475 scope.go:117] "RemoveContainer" containerID="83104d69007c1efee0ad3ffc002e16539780626460002ad78672b69cf7a2e1ec" Dec 03 08:40:54 crc kubenswrapper[4475]: E1203 08:40:54.491972 4475 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-tjbzg_openshift-machine-config-operator(91aee7be-4a52-4598-803f-2deebe0674de)\"" pod="openshift-machine-config-operator/machine-config-daemon-tjbzg" podUID="91aee7be-4a52-4598-803f-2deebe0674de" Dec 03 08:41:08 crc kubenswrapper[4475]: I1203 08:41:08.491726 4475 scope.go:117] "RemoveContainer" containerID="83104d69007c1efee0ad3ffc002e16539780626460002ad78672b69cf7a2e1ec" Dec 03 08:41:08 crc kubenswrapper[4475]: E1203 08:41:08.493031 4475 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-tjbzg_openshift-machine-config-operator(91aee7be-4a52-4598-803f-2deebe0674de)\"" 
pod="openshift-machine-config-operator/machine-config-daemon-tjbzg" podUID="91aee7be-4a52-4598-803f-2deebe0674de" Dec 03 08:41:20 crc kubenswrapper[4475]: I1203 08:41:20.491738 4475 scope.go:117] "RemoveContainer" containerID="83104d69007c1efee0ad3ffc002e16539780626460002ad78672b69cf7a2e1ec" Dec 03 08:41:20 crc kubenswrapper[4475]: E1203 08:41:20.492821 4475 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-tjbzg_openshift-machine-config-operator(91aee7be-4a52-4598-803f-2deebe0674de)\"" pod="openshift-machine-config-operator/machine-config-daemon-tjbzg" podUID="91aee7be-4a52-4598-803f-2deebe0674de" Dec 03 08:41:31 crc kubenswrapper[4475]: I1203 08:41:31.491534 4475 scope.go:117] "RemoveContainer" containerID="83104d69007c1efee0ad3ffc002e16539780626460002ad78672b69cf7a2e1ec" Dec 03 08:41:31 crc kubenswrapper[4475]: E1203 08:41:31.492491 4475 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-tjbzg_openshift-machine-config-operator(91aee7be-4a52-4598-803f-2deebe0674de)\"" pod="openshift-machine-config-operator/machine-config-daemon-tjbzg" podUID="91aee7be-4a52-4598-803f-2deebe0674de" Dec 03 08:41:43 crc kubenswrapper[4475]: I1203 08:41:43.491426 4475 scope.go:117] "RemoveContainer" containerID="83104d69007c1efee0ad3ffc002e16539780626460002ad78672b69cf7a2e1ec" Dec 03 08:41:43 crc kubenswrapper[4475]: E1203 08:41:43.492381 4475 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-tjbzg_openshift-machine-config-operator(91aee7be-4a52-4598-803f-2deebe0674de)\"" pod="openshift-machine-config-operator/machine-config-daemon-tjbzg" podUID="91aee7be-4a52-4598-803f-2deebe0674de" Dec 03 08:41:54 crc kubenswrapper[4475]: I1203 08:41:54.491636 4475 scope.go:117] "RemoveContainer" containerID="83104d69007c1efee0ad3ffc002e16539780626460002ad78672b69cf7a2e1ec" Dec 03 08:41:54 crc kubenswrapper[4475]: E1203 08:41:54.493610 4475 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-tjbzg_openshift-machine-config-operator(91aee7be-4a52-4598-803f-2deebe0674de)\"" pod="openshift-machine-config-operator/machine-config-daemon-tjbzg" podUID="91aee7be-4a52-4598-803f-2deebe0674de" Dec 03 08:42:09 crc kubenswrapper[4475]: I1203 08:42:09.492635 4475 scope.go:117] "RemoveContainer" containerID="83104d69007c1efee0ad3ffc002e16539780626460002ad78672b69cf7a2e1ec" Dec 03 08:42:09 crc kubenswrapper[4475]: E1203 08:42:09.493645 4475 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-tjbzg_openshift-machine-config-operator(91aee7be-4a52-4598-803f-2deebe0674de)\"" pod="openshift-machine-config-operator/machine-config-daemon-tjbzg" podUID="91aee7be-4a52-4598-803f-2deebe0674de" Dec 03 08:42:22 crc kubenswrapper[4475]: I1203 08:42:22.492143 4475 scope.go:117] "RemoveContainer" containerID="83104d69007c1efee0ad3ffc002e16539780626460002ad78672b69cf7a2e1ec" Dec 03 08:42:22 crc kubenswrapper[4475]: E1203 08:42:22.493153 4475 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed 
container=machine-config-daemon pod=machine-config-daemon-tjbzg_openshift-machine-config-operator(91aee7be-4a52-4598-803f-2deebe0674de)\"" pod="openshift-machine-config-operator/machine-config-daemon-tjbzg" podUID="91aee7be-4a52-4598-803f-2deebe0674de" Dec 03 08:42:35 crc kubenswrapper[4475]: I1203 08:42:35.498885 4475 scope.go:117] "RemoveContainer" containerID="83104d69007c1efee0ad3ffc002e16539780626460002ad78672b69cf7a2e1ec" Dec 03 08:42:36 crc kubenswrapper[4475]: I1203 08:42:36.347481 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-tjbzg" event={"ID":"91aee7be-4a52-4598-803f-2deebe0674de","Type":"ContainerStarted","Data":"14097464da453782e9a13b91bd60d4c37dba9b7e953f3de436197834349ae001"} Dec 03 08:44:58 crc kubenswrapper[4475]: I1203 08:44:58.710041 4475 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-jjmtm"] Dec 03 08:44:58 crc kubenswrapper[4475]: E1203 08:44:58.717438 4475 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b89878fb-3274-4a9d-97de-3a9cf657d97b" containerName="extract-content" Dec 03 08:44:58 crc kubenswrapper[4475]: I1203 08:44:58.717525 4475 state_mem.go:107] "Deleted CPUSet assignment" podUID="b89878fb-3274-4a9d-97de-3a9cf657d97b" containerName="extract-content" Dec 03 08:44:58 crc kubenswrapper[4475]: E1203 08:44:58.717568 4475 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b89878fb-3274-4a9d-97de-3a9cf657d97b" containerName="extract-utilities" Dec 03 08:44:58 crc kubenswrapper[4475]: I1203 08:44:58.717576 4475 state_mem.go:107] "Deleted CPUSet assignment" podUID="b89878fb-3274-4a9d-97de-3a9cf657d97b" containerName="extract-utilities" Dec 03 08:44:58 crc kubenswrapper[4475]: E1203 08:44:58.717627 4475 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b89878fb-3274-4a9d-97de-3a9cf657d97b" containerName="registry-server" Dec 03 08:44:58 crc kubenswrapper[4475]: I1203 
08:44:58.717634 4475 state_mem.go:107] "Deleted CPUSet assignment" podUID="b89878fb-3274-4a9d-97de-3a9cf657d97b" containerName="registry-server" Dec 03 08:44:58 crc kubenswrapper[4475]: I1203 08:44:58.722078 4475 memory_manager.go:354] "RemoveStaleState removing state" podUID="b89878fb-3274-4a9d-97de-3a9cf657d97b" containerName="registry-server" Dec 03 08:44:58 crc kubenswrapper[4475]: I1203 08:44:58.726348 4475 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-jjmtm" Dec 03 08:44:58 crc kubenswrapper[4475]: I1203 08:44:58.751803 4475 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-jjmtm"] Dec 03 08:44:58 crc kubenswrapper[4475]: I1203 08:44:58.833747 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9da9f3f1-6eff-43b8-bc20-34f3a15630ed-catalog-content\") pod \"redhat-marketplace-jjmtm\" (UID: \"9da9f3f1-6eff-43b8-bc20-34f3a15630ed\") " pod="openshift-marketplace/redhat-marketplace-jjmtm" Dec 03 08:44:58 crc kubenswrapper[4475]: I1203 08:44:58.833833 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9da9f3f1-6eff-43b8-bc20-34f3a15630ed-utilities\") pod \"redhat-marketplace-jjmtm\" (UID: \"9da9f3f1-6eff-43b8-bc20-34f3a15630ed\") " pod="openshift-marketplace/redhat-marketplace-jjmtm" Dec 03 08:44:58 crc kubenswrapper[4475]: I1203 08:44:58.834049 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tmbgb\" (UniqueName: \"kubernetes.io/projected/9da9f3f1-6eff-43b8-bc20-34f3a15630ed-kube-api-access-tmbgb\") pod \"redhat-marketplace-jjmtm\" (UID: \"9da9f3f1-6eff-43b8-bc20-34f3a15630ed\") " pod="openshift-marketplace/redhat-marketplace-jjmtm" Dec 03 08:44:58 crc kubenswrapper[4475]: 
I1203 08:44:58.934052 4475 patch_prober.go:28] interesting pod/machine-config-daemon-tjbzg container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 03 08:44:58 crc kubenswrapper[4475]: I1203 08:44:58.934572 4475 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-tjbzg" podUID="91aee7be-4a52-4598-803f-2deebe0674de" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 03 08:44:58 crc kubenswrapper[4475]: I1203 08:44:58.935975 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9da9f3f1-6eff-43b8-bc20-34f3a15630ed-catalog-content\") pod \"redhat-marketplace-jjmtm\" (UID: \"9da9f3f1-6eff-43b8-bc20-34f3a15630ed\") " pod="openshift-marketplace/redhat-marketplace-jjmtm" Dec 03 08:44:58 crc kubenswrapper[4475]: I1203 08:44:58.936045 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9da9f3f1-6eff-43b8-bc20-34f3a15630ed-utilities\") pod \"redhat-marketplace-jjmtm\" (UID: \"9da9f3f1-6eff-43b8-bc20-34f3a15630ed\") " pod="openshift-marketplace/redhat-marketplace-jjmtm" Dec 03 08:44:58 crc kubenswrapper[4475]: I1203 08:44:58.936275 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tmbgb\" (UniqueName: \"kubernetes.io/projected/9da9f3f1-6eff-43b8-bc20-34f3a15630ed-kube-api-access-tmbgb\") pod \"redhat-marketplace-jjmtm\" (UID: \"9da9f3f1-6eff-43b8-bc20-34f3a15630ed\") " pod="openshift-marketplace/redhat-marketplace-jjmtm" Dec 03 08:44:58 crc kubenswrapper[4475]: I1203 08:44:58.938076 4475 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9da9f3f1-6eff-43b8-bc20-34f3a15630ed-catalog-content\") pod \"redhat-marketplace-jjmtm\" (UID: \"9da9f3f1-6eff-43b8-bc20-34f3a15630ed\") " pod="openshift-marketplace/redhat-marketplace-jjmtm" Dec 03 08:44:58 crc kubenswrapper[4475]: I1203 08:44:58.938160 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9da9f3f1-6eff-43b8-bc20-34f3a15630ed-utilities\") pod \"redhat-marketplace-jjmtm\" (UID: \"9da9f3f1-6eff-43b8-bc20-34f3a15630ed\") " pod="openshift-marketplace/redhat-marketplace-jjmtm" Dec 03 08:44:58 crc kubenswrapper[4475]: I1203 08:44:58.968253 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tmbgb\" (UniqueName: \"kubernetes.io/projected/9da9f3f1-6eff-43b8-bc20-34f3a15630ed-kube-api-access-tmbgb\") pod \"redhat-marketplace-jjmtm\" (UID: \"9da9f3f1-6eff-43b8-bc20-34f3a15630ed\") " pod="openshift-marketplace/redhat-marketplace-jjmtm" Dec 03 08:44:59 crc kubenswrapper[4475]: I1203 08:44:59.097345 4475 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-jjmtm" Dec 03 08:44:59 crc kubenswrapper[4475]: I1203 08:44:59.884105 4475 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-jjmtm"] Dec 03 08:45:00 crc kubenswrapper[4475]: I1203 08:45:00.191478 4475 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29412525-th4nh"] Dec 03 08:45:00 crc kubenswrapper[4475]: I1203 08:45:00.193178 4475 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29412525-th4nh" Dec 03 08:45:00 crc kubenswrapper[4475]: I1203 08:45:00.208956 4475 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29412525-th4nh"] Dec 03 08:45:00 crc kubenswrapper[4475]: I1203 08:45:00.212206 4475 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Dec 03 08:45:00 crc kubenswrapper[4475]: I1203 08:45:00.212212 4475 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Dec 03 08:45:00 crc kubenswrapper[4475]: I1203 08:45:00.278385 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/be454d29-1abe-487c-a609-d6c2e5563726-config-volume\") pod \"collect-profiles-29412525-th4nh\" (UID: \"be454d29-1abe-487c-a609-d6c2e5563726\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29412525-th4nh" Dec 03 08:45:00 crc kubenswrapper[4475]: I1203 08:45:00.278806 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/be454d29-1abe-487c-a609-d6c2e5563726-secret-volume\") pod \"collect-profiles-29412525-th4nh\" (UID: \"be454d29-1abe-487c-a609-d6c2e5563726\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29412525-th4nh" Dec 03 08:45:00 crc kubenswrapper[4475]: I1203 08:45:00.278890 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m866p\" (UniqueName: \"kubernetes.io/projected/be454d29-1abe-487c-a609-d6c2e5563726-kube-api-access-m866p\") pod \"collect-profiles-29412525-th4nh\" (UID: \"be454d29-1abe-487c-a609-d6c2e5563726\") " 
pod="openshift-operator-lifecycle-manager/collect-profiles-29412525-th4nh" Dec 03 08:45:00 crc kubenswrapper[4475]: I1203 08:45:00.382519 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/be454d29-1abe-487c-a609-d6c2e5563726-secret-volume\") pod \"collect-profiles-29412525-th4nh\" (UID: \"be454d29-1abe-487c-a609-d6c2e5563726\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29412525-th4nh" Dec 03 08:45:00 crc kubenswrapper[4475]: I1203 08:45:00.382565 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-m866p\" (UniqueName: \"kubernetes.io/projected/be454d29-1abe-487c-a609-d6c2e5563726-kube-api-access-m866p\") pod \"collect-profiles-29412525-th4nh\" (UID: \"be454d29-1abe-487c-a609-d6c2e5563726\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29412525-th4nh" Dec 03 08:45:00 crc kubenswrapper[4475]: I1203 08:45:00.382907 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/be454d29-1abe-487c-a609-d6c2e5563726-config-volume\") pod \"collect-profiles-29412525-th4nh\" (UID: \"be454d29-1abe-487c-a609-d6c2e5563726\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29412525-th4nh" Dec 03 08:45:00 crc kubenswrapper[4475]: I1203 08:45:00.386090 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/be454d29-1abe-487c-a609-d6c2e5563726-config-volume\") pod \"collect-profiles-29412525-th4nh\" (UID: \"be454d29-1abe-487c-a609-d6c2e5563726\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29412525-th4nh" Dec 03 08:45:00 crc kubenswrapper[4475]: I1203 08:45:00.391081 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: 
\"kubernetes.io/secret/be454d29-1abe-487c-a609-d6c2e5563726-secret-volume\") pod \"collect-profiles-29412525-th4nh\" (UID: \"be454d29-1abe-487c-a609-d6c2e5563726\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29412525-th4nh" Dec 03 08:45:00 crc kubenswrapper[4475]: I1203 08:45:00.403749 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-m866p\" (UniqueName: \"kubernetes.io/projected/be454d29-1abe-487c-a609-d6c2e5563726-kube-api-access-m866p\") pod \"collect-profiles-29412525-th4nh\" (UID: \"be454d29-1abe-487c-a609-d6c2e5563726\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29412525-th4nh" Dec 03 08:45:00 crc kubenswrapper[4475]: I1203 08:45:00.517309 4475 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29412525-th4nh" Dec 03 08:45:00 crc kubenswrapper[4475]: I1203 08:45:00.757392 4475 generic.go:334] "Generic (PLEG): container finished" podID="9da9f3f1-6eff-43b8-bc20-34f3a15630ed" containerID="d06fa44d686518232d7f08c85984c89caaabb9e2720972dd63912e97db9a8061" exitCode=0 Dec 03 08:45:00 crc kubenswrapper[4475]: I1203 08:45:00.757716 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-jjmtm" event={"ID":"9da9f3f1-6eff-43b8-bc20-34f3a15630ed","Type":"ContainerDied","Data":"d06fa44d686518232d7f08c85984c89caaabb9e2720972dd63912e97db9a8061"} Dec 03 08:45:00 crc kubenswrapper[4475]: I1203 08:45:00.757807 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-jjmtm" event={"ID":"9da9f3f1-6eff-43b8-bc20-34f3a15630ed","Type":"ContainerStarted","Data":"e3843fcdaebdf5bdd994400d0e00ab52ffb3cee52293abc41d485c373bcc059e"} Dec 03 08:45:00 crc kubenswrapper[4475]: I1203 08:45:00.762077 4475 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Dec 03 08:45:00 crc kubenswrapper[4475]: I1203 
08:45:00.968013 4475 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29412525-th4nh"] Dec 03 08:45:01 crc kubenswrapper[4475]: I1203 08:45:01.768918 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-jjmtm" event={"ID":"9da9f3f1-6eff-43b8-bc20-34f3a15630ed","Type":"ContainerStarted","Data":"319b3486ec945b0b593fb09d20b636bb89cb68df76b0dce7fbcf5359c5d80898"} Dec 03 08:45:01 crc kubenswrapper[4475]: I1203 08:45:01.771591 4475 generic.go:334] "Generic (PLEG): container finished" podID="be454d29-1abe-487c-a609-d6c2e5563726" containerID="60cca535a82d608a7be85a0e0abc61ce27c5ee7e26a10bd3cbf6983502f16b1e" exitCode=0 Dec 03 08:45:01 crc kubenswrapper[4475]: I1203 08:45:01.771628 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29412525-th4nh" event={"ID":"be454d29-1abe-487c-a609-d6c2e5563726","Type":"ContainerDied","Data":"60cca535a82d608a7be85a0e0abc61ce27c5ee7e26a10bd3cbf6983502f16b1e"} Dec 03 08:45:01 crc kubenswrapper[4475]: I1203 08:45:01.771648 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29412525-th4nh" event={"ID":"be454d29-1abe-487c-a609-d6c2e5563726","Type":"ContainerStarted","Data":"8d78d39df527d5b5e15c62901fd4a05ab1235bfc7a311951423339fa9b1c26c3"} Dec 03 08:45:02 crc kubenswrapper[4475]: I1203 08:45:02.781439 4475 generic.go:334] "Generic (PLEG): container finished" podID="9da9f3f1-6eff-43b8-bc20-34f3a15630ed" containerID="319b3486ec945b0b593fb09d20b636bb89cb68df76b0dce7fbcf5359c5d80898" exitCode=0 Dec 03 08:45:02 crc kubenswrapper[4475]: I1203 08:45:02.781556 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-jjmtm" event={"ID":"9da9f3f1-6eff-43b8-bc20-34f3a15630ed","Type":"ContainerDied","Data":"319b3486ec945b0b593fb09d20b636bb89cb68df76b0dce7fbcf5359c5d80898"} Dec 03 
08:45:03 crc kubenswrapper[4475]: I1203 08:45:03.117024 4475 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29412525-th4nh" Dec 03 08:45:03 crc kubenswrapper[4475]: I1203 08:45:03.260727 4475 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-m866p\" (UniqueName: \"kubernetes.io/projected/be454d29-1abe-487c-a609-d6c2e5563726-kube-api-access-m866p\") pod \"be454d29-1abe-487c-a609-d6c2e5563726\" (UID: \"be454d29-1abe-487c-a609-d6c2e5563726\") " Dec 03 08:45:03 crc kubenswrapper[4475]: I1203 08:45:03.260912 4475 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/be454d29-1abe-487c-a609-d6c2e5563726-secret-volume\") pod \"be454d29-1abe-487c-a609-d6c2e5563726\" (UID: \"be454d29-1abe-487c-a609-d6c2e5563726\") " Dec 03 08:45:03 crc kubenswrapper[4475]: I1203 08:45:03.261030 4475 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/be454d29-1abe-487c-a609-d6c2e5563726-config-volume\") pod \"be454d29-1abe-487c-a609-d6c2e5563726\" (UID: \"be454d29-1abe-487c-a609-d6c2e5563726\") " Dec 03 08:45:03 crc kubenswrapper[4475]: I1203 08:45:03.263989 4475 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/be454d29-1abe-487c-a609-d6c2e5563726-config-volume" (OuterVolumeSpecName: "config-volume") pod "be454d29-1abe-487c-a609-d6c2e5563726" (UID: "be454d29-1abe-487c-a609-d6c2e5563726"). InnerVolumeSpecName "config-volume". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 08:45:03 crc kubenswrapper[4475]: I1203 08:45:03.271692 4475 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/be454d29-1abe-487c-a609-d6c2e5563726-kube-api-access-m866p" (OuterVolumeSpecName: "kube-api-access-m866p") pod "be454d29-1abe-487c-a609-d6c2e5563726" (UID: "be454d29-1abe-487c-a609-d6c2e5563726"). InnerVolumeSpecName "kube-api-access-m866p". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 08:45:03 crc kubenswrapper[4475]: I1203 08:45:03.272566 4475 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/be454d29-1abe-487c-a609-d6c2e5563726-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "be454d29-1abe-487c-a609-d6c2e5563726" (UID: "be454d29-1abe-487c-a609-d6c2e5563726"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 08:45:03 crc kubenswrapper[4475]: I1203 08:45:03.365591 4475 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-m866p\" (UniqueName: \"kubernetes.io/projected/be454d29-1abe-487c-a609-d6c2e5563726-kube-api-access-m866p\") on node \"crc\" DevicePath \"\"" Dec 03 08:45:03 crc kubenswrapper[4475]: I1203 08:45:03.365832 4475 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/be454d29-1abe-487c-a609-d6c2e5563726-secret-volume\") on node \"crc\" DevicePath \"\"" Dec 03 08:45:03 crc kubenswrapper[4475]: I1203 08:45:03.365845 4475 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/be454d29-1abe-487c-a609-d6c2e5563726-config-volume\") on node \"crc\" DevicePath \"\"" Dec 03 08:45:03 crc kubenswrapper[4475]: I1203 08:45:03.793808 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-jjmtm" 
event={"ID":"9da9f3f1-6eff-43b8-bc20-34f3a15630ed","Type":"ContainerStarted","Data":"207412309d03d76bc5478f95d7cc1a7691441879a326d8c549f80c940ddf1d9e"} Dec 03 08:45:03 crc kubenswrapper[4475]: I1203 08:45:03.795628 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29412525-th4nh" event={"ID":"be454d29-1abe-487c-a609-d6c2e5563726","Type":"ContainerDied","Data":"8d78d39df527d5b5e15c62901fd4a05ab1235bfc7a311951423339fa9b1c26c3"} Dec 03 08:45:03 crc kubenswrapper[4475]: I1203 08:45:03.795655 4475 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29412525-th4nh" Dec 03 08:45:03 crc kubenswrapper[4475]: I1203 08:45:03.795689 4475 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="8d78d39df527d5b5e15c62901fd4a05ab1235bfc7a311951423339fa9b1c26c3" Dec 03 08:45:03 crc kubenswrapper[4475]: I1203 08:45:03.817949 4475 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-jjmtm" podStartSLOduration=3.324385518 podStartE2EDuration="5.817927277s" podCreationTimestamp="2025-12-03 08:44:58 +0000 UTC" firstStartedPulling="2025-12-03 08:45:00.760730039 +0000 UTC m=+7185.565628374" lastFinishedPulling="2025-12-03 08:45:03.254271799 +0000 UTC m=+7188.059170133" observedRunningTime="2025-12-03 08:45:03.811181935 +0000 UTC m=+7188.616080268" watchObservedRunningTime="2025-12-03 08:45:03.817927277 +0000 UTC m=+7188.622825612" Dec 03 08:45:04 crc kubenswrapper[4475]: I1203 08:45:04.235567 4475 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29412480-5ctgl"] Dec 03 08:45:04 crc kubenswrapper[4475]: I1203 08:45:04.236740 4475 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29412480-5ctgl"] Dec 03 08:45:05 crc kubenswrapper[4475]: 
I1203 08:45:05.501984 4475 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c2ec5724-5d40-47a7-b078-cba9149cd04d" path="/var/lib/kubelet/pods/c2ec5724-5d40-47a7-b078-cba9149cd04d/volumes" Dec 03 08:45:09 crc kubenswrapper[4475]: I1203 08:45:09.098442 4475 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-jjmtm" Dec 03 08:45:09 crc kubenswrapper[4475]: I1203 08:45:09.098948 4475 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-jjmtm" Dec 03 08:45:09 crc kubenswrapper[4475]: I1203 08:45:09.133911 4475 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-jjmtm" Dec 03 08:45:09 crc kubenswrapper[4475]: I1203 08:45:09.893204 4475 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-jjmtm" Dec 03 08:45:09 crc kubenswrapper[4475]: I1203 08:45:09.953201 4475 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-jjmtm"] Dec 03 08:45:11 crc kubenswrapper[4475]: I1203 08:45:11.866046 4475 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-jjmtm" podUID="9da9f3f1-6eff-43b8-bc20-34f3a15630ed" containerName="registry-server" containerID="cri-o://207412309d03d76bc5478f95d7cc1a7691441879a326d8c549f80c940ddf1d9e" gracePeriod=2 Dec 03 08:45:12 crc kubenswrapper[4475]: I1203 08:45:12.360022 4475 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-jjmtm" Dec 03 08:45:12 crc kubenswrapper[4475]: I1203 08:45:12.478664 4475 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9da9f3f1-6eff-43b8-bc20-34f3a15630ed-catalog-content\") pod \"9da9f3f1-6eff-43b8-bc20-34f3a15630ed\" (UID: \"9da9f3f1-6eff-43b8-bc20-34f3a15630ed\") " Dec 03 08:45:12 crc kubenswrapper[4475]: I1203 08:45:12.478789 4475 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tmbgb\" (UniqueName: \"kubernetes.io/projected/9da9f3f1-6eff-43b8-bc20-34f3a15630ed-kube-api-access-tmbgb\") pod \"9da9f3f1-6eff-43b8-bc20-34f3a15630ed\" (UID: \"9da9f3f1-6eff-43b8-bc20-34f3a15630ed\") " Dec 03 08:45:12 crc kubenswrapper[4475]: I1203 08:45:12.478950 4475 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9da9f3f1-6eff-43b8-bc20-34f3a15630ed-utilities\") pod \"9da9f3f1-6eff-43b8-bc20-34f3a15630ed\" (UID: \"9da9f3f1-6eff-43b8-bc20-34f3a15630ed\") " Dec 03 08:45:12 crc kubenswrapper[4475]: I1203 08:45:12.479638 4475 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9da9f3f1-6eff-43b8-bc20-34f3a15630ed-utilities" (OuterVolumeSpecName: "utilities") pod "9da9f3f1-6eff-43b8-bc20-34f3a15630ed" (UID: "9da9f3f1-6eff-43b8-bc20-34f3a15630ed"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 08:45:12 crc kubenswrapper[4475]: I1203 08:45:12.480240 4475 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9da9f3f1-6eff-43b8-bc20-34f3a15630ed-utilities\") on node \"crc\" DevicePath \"\"" Dec 03 08:45:12 crc kubenswrapper[4475]: I1203 08:45:12.494884 4475 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9da9f3f1-6eff-43b8-bc20-34f3a15630ed-kube-api-access-tmbgb" (OuterVolumeSpecName: "kube-api-access-tmbgb") pod "9da9f3f1-6eff-43b8-bc20-34f3a15630ed" (UID: "9da9f3f1-6eff-43b8-bc20-34f3a15630ed"). InnerVolumeSpecName "kube-api-access-tmbgb". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 08:45:12 crc kubenswrapper[4475]: I1203 08:45:12.496941 4475 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9da9f3f1-6eff-43b8-bc20-34f3a15630ed-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "9da9f3f1-6eff-43b8-bc20-34f3a15630ed" (UID: "9da9f3f1-6eff-43b8-bc20-34f3a15630ed"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 08:45:12 crc kubenswrapper[4475]: I1203 08:45:12.581614 4475 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9da9f3f1-6eff-43b8-bc20-34f3a15630ed-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 03 08:45:12 crc kubenswrapper[4475]: I1203 08:45:12.581649 4475 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tmbgb\" (UniqueName: \"kubernetes.io/projected/9da9f3f1-6eff-43b8-bc20-34f3a15630ed-kube-api-access-tmbgb\") on node \"crc\" DevicePath \"\"" Dec 03 08:45:12 crc kubenswrapper[4475]: I1203 08:45:12.881658 4475 generic.go:334] "Generic (PLEG): container finished" podID="9da9f3f1-6eff-43b8-bc20-34f3a15630ed" containerID="207412309d03d76bc5478f95d7cc1a7691441879a326d8c549f80c940ddf1d9e" exitCode=0 Dec 03 08:45:12 crc kubenswrapper[4475]: I1203 08:45:12.881798 4475 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-jjmtm" Dec 03 08:45:12 crc kubenswrapper[4475]: I1203 08:45:12.881799 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-jjmtm" event={"ID":"9da9f3f1-6eff-43b8-bc20-34f3a15630ed","Type":"ContainerDied","Data":"207412309d03d76bc5478f95d7cc1a7691441879a326d8c549f80c940ddf1d9e"} Dec 03 08:45:12 crc kubenswrapper[4475]: I1203 08:45:12.883334 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-jjmtm" event={"ID":"9da9f3f1-6eff-43b8-bc20-34f3a15630ed","Type":"ContainerDied","Data":"e3843fcdaebdf5bdd994400d0e00ab52ffb3cee52293abc41d485c373bcc059e"} Dec 03 08:45:12 crc kubenswrapper[4475]: I1203 08:45:12.883369 4475 scope.go:117] "RemoveContainer" containerID="207412309d03d76bc5478f95d7cc1a7691441879a326d8c549f80c940ddf1d9e" Dec 03 08:45:12 crc kubenswrapper[4475]: I1203 08:45:12.927375 4475 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openshift-marketplace/redhat-marketplace-jjmtm"] Dec 03 08:45:12 crc kubenswrapper[4475]: I1203 08:45:12.934035 4475 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-jjmtm"] Dec 03 08:45:12 crc kubenswrapper[4475]: I1203 08:45:12.936489 4475 scope.go:117] "RemoveContainer" containerID="319b3486ec945b0b593fb09d20b636bb89cb68df76b0dce7fbcf5359c5d80898" Dec 03 08:45:12 crc kubenswrapper[4475]: I1203 08:45:12.959094 4475 scope.go:117] "RemoveContainer" containerID="d06fa44d686518232d7f08c85984c89caaabb9e2720972dd63912e97db9a8061" Dec 03 08:45:12 crc kubenswrapper[4475]: I1203 08:45:12.991557 4475 scope.go:117] "RemoveContainer" containerID="207412309d03d76bc5478f95d7cc1a7691441879a326d8c549f80c940ddf1d9e" Dec 03 08:45:12 crc kubenswrapper[4475]: E1203 08:45:12.992037 4475 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"207412309d03d76bc5478f95d7cc1a7691441879a326d8c549f80c940ddf1d9e\": container with ID starting with 207412309d03d76bc5478f95d7cc1a7691441879a326d8c549f80c940ddf1d9e not found: ID does not exist" containerID="207412309d03d76bc5478f95d7cc1a7691441879a326d8c549f80c940ddf1d9e" Dec 03 08:45:12 crc kubenswrapper[4475]: I1203 08:45:12.992095 4475 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"207412309d03d76bc5478f95d7cc1a7691441879a326d8c549f80c940ddf1d9e"} err="failed to get container status \"207412309d03d76bc5478f95d7cc1a7691441879a326d8c549f80c940ddf1d9e\": rpc error: code = NotFound desc = could not find container \"207412309d03d76bc5478f95d7cc1a7691441879a326d8c549f80c940ddf1d9e\": container with ID starting with 207412309d03d76bc5478f95d7cc1a7691441879a326d8c549f80c940ddf1d9e not found: ID does not exist" Dec 03 08:45:12 crc kubenswrapper[4475]: I1203 08:45:12.992131 4475 scope.go:117] "RemoveContainer" 
containerID="319b3486ec945b0b593fb09d20b636bb89cb68df76b0dce7fbcf5359c5d80898" Dec 03 08:45:12 crc kubenswrapper[4475]: E1203 08:45:12.992674 4475 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"319b3486ec945b0b593fb09d20b636bb89cb68df76b0dce7fbcf5359c5d80898\": container with ID starting with 319b3486ec945b0b593fb09d20b636bb89cb68df76b0dce7fbcf5359c5d80898 not found: ID does not exist" containerID="319b3486ec945b0b593fb09d20b636bb89cb68df76b0dce7fbcf5359c5d80898" Dec 03 08:45:12 crc kubenswrapper[4475]: I1203 08:45:12.992705 4475 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"319b3486ec945b0b593fb09d20b636bb89cb68df76b0dce7fbcf5359c5d80898"} err="failed to get container status \"319b3486ec945b0b593fb09d20b636bb89cb68df76b0dce7fbcf5359c5d80898\": rpc error: code = NotFound desc = could not find container \"319b3486ec945b0b593fb09d20b636bb89cb68df76b0dce7fbcf5359c5d80898\": container with ID starting with 319b3486ec945b0b593fb09d20b636bb89cb68df76b0dce7fbcf5359c5d80898 not found: ID does not exist" Dec 03 08:45:12 crc kubenswrapper[4475]: I1203 08:45:12.992726 4475 scope.go:117] "RemoveContainer" containerID="d06fa44d686518232d7f08c85984c89caaabb9e2720972dd63912e97db9a8061" Dec 03 08:45:12 crc kubenswrapper[4475]: E1203 08:45:12.993136 4475 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d06fa44d686518232d7f08c85984c89caaabb9e2720972dd63912e97db9a8061\": container with ID starting with d06fa44d686518232d7f08c85984c89caaabb9e2720972dd63912e97db9a8061 not found: ID does not exist" containerID="d06fa44d686518232d7f08c85984c89caaabb9e2720972dd63912e97db9a8061" Dec 03 08:45:12 crc kubenswrapper[4475]: I1203 08:45:12.993178 4475 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"d06fa44d686518232d7f08c85984c89caaabb9e2720972dd63912e97db9a8061"} err="failed to get container status \"d06fa44d686518232d7f08c85984c89caaabb9e2720972dd63912e97db9a8061\": rpc error: code = NotFound desc = could not find container \"d06fa44d686518232d7f08c85984c89caaabb9e2720972dd63912e97db9a8061\": container with ID starting with d06fa44d686518232d7f08c85984c89caaabb9e2720972dd63912e97db9a8061 not found: ID does not exist" Dec 03 08:45:13 crc kubenswrapper[4475]: I1203 08:45:13.500846 4475 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9da9f3f1-6eff-43b8-bc20-34f3a15630ed" path="/var/lib/kubelet/pods/9da9f3f1-6eff-43b8-bc20-34f3a15630ed/volumes" Dec 03 08:45:28 crc kubenswrapper[4475]: I1203 08:45:28.934102 4475 patch_prober.go:28] interesting pod/machine-config-daemon-tjbzg container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 03 08:45:28 crc kubenswrapper[4475]: I1203 08:45:28.934735 4475 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-tjbzg" podUID="91aee7be-4a52-4598-803f-2deebe0674de" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 03 08:45:31 crc kubenswrapper[4475]: I1203 08:45:31.126582 4475 scope.go:117] "RemoveContainer" containerID="6e59452431e50843d5c96a8044c8f897bc8c3f8f85a1bd135150d1d548e6d7e2" Dec 03 08:45:58 crc kubenswrapper[4475]: I1203 08:45:58.933380 4475 patch_prober.go:28] interesting pod/machine-config-daemon-tjbzg container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 03 
08:45:58 crc kubenswrapper[4475]: I1203 08:45:58.935138 4475 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-tjbzg" podUID="91aee7be-4a52-4598-803f-2deebe0674de" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 03 08:45:58 crc kubenswrapper[4475]: I1203 08:45:58.935309 4475 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-tjbzg" Dec 03 08:45:58 crc kubenswrapper[4475]: I1203 08:45:58.936555 4475 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"14097464da453782e9a13b91bd60d4c37dba9b7e953f3de436197834349ae001"} pod="openshift-machine-config-operator/machine-config-daemon-tjbzg" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Dec 03 08:45:58 crc kubenswrapper[4475]: I1203 08:45:58.936694 4475 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-tjbzg" podUID="91aee7be-4a52-4598-803f-2deebe0674de" containerName="machine-config-daemon" containerID="cri-o://14097464da453782e9a13b91bd60d4c37dba9b7e953f3de436197834349ae001" gracePeriod=600 Dec 03 08:45:59 crc kubenswrapper[4475]: I1203 08:45:59.386754 4475 generic.go:334] "Generic (PLEG): container finished" podID="91aee7be-4a52-4598-803f-2deebe0674de" containerID="14097464da453782e9a13b91bd60d4c37dba9b7e953f3de436197834349ae001" exitCode=0 Dec 03 08:45:59 crc kubenswrapper[4475]: I1203 08:45:59.386844 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-tjbzg" 
event={"ID":"91aee7be-4a52-4598-803f-2deebe0674de","Type":"ContainerDied","Data":"14097464da453782e9a13b91bd60d4c37dba9b7e953f3de436197834349ae001"} Dec 03 08:45:59 crc kubenswrapper[4475]: I1203 08:45:59.387133 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-tjbzg" event={"ID":"91aee7be-4a52-4598-803f-2deebe0674de","Type":"ContainerStarted","Data":"a64a4fdb8dce1938b2d1fffd387c463e3afd8518ef83ae41c0e745f161742479"} Dec 03 08:45:59 crc kubenswrapper[4475]: I1203 08:45:59.387161 4475 scope.go:117] "RemoveContainer" containerID="83104d69007c1efee0ad3ffc002e16539780626460002ad78672b69cf7a2e1ec" Dec 03 08:47:10 crc kubenswrapper[4475]: E1203 08:47:10.080512 4475 upgradeaware.go:427] Error proxying data from client to backend: readfrom tcp 192.168.25.177:44990->192.168.25.177:40263: write tcp 192.168.25.177:44990->192.168.25.177:40263: write: broken pipe Dec 03 08:47:21 crc kubenswrapper[4475]: I1203 08:47:21.191397 4475 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-7494f8796f-mnl82"] Dec 03 08:47:21 crc kubenswrapper[4475]: E1203 08:47:21.192601 4475 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9da9f3f1-6eff-43b8-bc20-34f3a15630ed" containerName="extract-utilities" Dec 03 08:47:21 crc kubenswrapper[4475]: I1203 08:47:21.192619 4475 state_mem.go:107] "Deleted CPUSet assignment" podUID="9da9f3f1-6eff-43b8-bc20-34f3a15630ed" containerName="extract-utilities" Dec 03 08:47:21 crc kubenswrapper[4475]: E1203 08:47:21.192647 4475 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9da9f3f1-6eff-43b8-bc20-34f3a15630ed" containerName="extract-content" Dec 03 08:47:21 crc kubenswrapper[4475]: I1203 08:47:21.192653 4475 state_mem.go:107] "Deleted CPUSet assignment" podUID="9da9f3f1-6eff-43b8-bc20-34f3a15630ed" containerName="extract-content" Dec 03 08:47:21 crc kubenswrapper[4475]: E1203 08:47:21.192669 4475 cpu_manager.go:410] "RemoveStaleState: removing 
container" podUID="be454d29-1abe-487c-a609-d6c2e5563726" containerName="collect-profiles" Dec 03 08:47:21 crc kubenswrapper[4475]: I1203 08:47:21.192675 4475 state_mem.go:107] "Deleted CPUSet assignment" podUID="be454d29-1abe-487c-a609-d6c2e5563726" containerName="collect-profiles" Dec 03 08:47:21 crc kubenswrapper[4475]: E1203 08:47:21.192686 4475 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9da9f3f1-6eff-43b8-bc20-34f3a15630ed" containerName="registry-server" Dec 03 08:47:21 crc kubenswrapper[4475]: I1203 08:47:21.192693 4475 state_mem.go:107] "Deleted CPUSet assignment" podUID="9da9f3f1-6eff-43b8-bc20-34f3a15630ed" containerName="registry-server" Dec 03 08:47:21 crc kubenswrapper[4475]: I1203 08:47:21.192923 4475 memory_manager.go:354] "RemoveStaleState removing state" podUID="9da9f3f1-6eff-43b8-bc20-34f3a15630ed" containerName="registry-server" Dec 03 08:47:21 crc kubenswrapper[4475]: I1203 08:47:21.192944 4475 memory_manager.go:354] "RemoveStaleState removing state" podUID="be454d29-1abe-487c-a609-d6c2e5563726" containerName="collect-profiles" Dec 03 08:47:21 crc kubenswrapper[4475]: I1203 08:47:21.194264 4475 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-7494f8796f-mnl82" Dec 03 08:47:21 crc kubenswrapper[4475]: I1203 08:47:21.219061 4475 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-7494f8796f-mnl82"] Dec 03 08:47:21 crc kubenswrapper[4475]: I1203 08:47:21.295331 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/af078208-3ee6-4330-8601-a7b588056fb9-combined-ca-bundle\") pod \"neutron-7494f8796f-mnl82\" (UID: \"af078208-3ee6-4330-8601-a7b588056fb9\") " pod="openstack/neutron-7494f8796f-mnl82" Dec 03 08:47:21 crc kubenswrapper[4475]: I1203 08:47:21.295492 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/af078208-3ee6-4330-8601-a7b588056fb9-internal-tls-certs\") pod \"neutron-7494f8796f-mnl82\" (UID: \"af078208-3ee6-4330-8601-a7b588056fb9\") " pod="openstack/neutron-7494f8796f-mnl82" Dec 03 08:47:21 crc kubenswrapper[4475]: I1203 08:47:21.295912 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/af078208-3ee6-4330-8601-a7b588056fb9-config\") pod \"neutron-7494f8796f-mnl82\" (UID: \"af078208-3ee6-4330-8601-a7b588056fb9\") " pod="openstack/neutron-7494f8796f-mnl82" Dec 03 08:47:21 crc kubenswrapper[4475]: I1203 08:47:21.296031 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/af078208-3ee6-4330-8601-a7b588056fb9-httpd-config\") pod \"neutron-7494f8796f-mnl82\" (UID: \"af078208-3ee6-4330-8601-a7b588056fb9\") " pod="openstack/neutron-7494f8796f-mnl82" Dec 03 08:47:21 crc kubenswrapper[4475]: I1203 08:47:21.296114 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovndb-tls-certs\" 
(UniqueName: \"kubernetes.io/secret/af078208-3ee6-4330-8601-a7b588056fb9-ovndb-tls-certs\") pod \"neutron-7494f8796f-mnl82\" (UID: \"af078208-3ee6-4330-8601-a7b588056fb9\") " pod="openstack/neutron-7494f8796f-mnl82" Dec 03 08:47:21 crc kubenswrapper[4475]: I1203 08:47:21.296218 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7v8bc\" (UniqueName: \"kubernetes.io/projected/af078208-3ee6-4330-8601-a7b588056fb9-kube-api-access-7v8bc\") pod \"neutron-7494f8796f-mnl82\" (UID: \"af078208-3ee6-4330-8601-a7b588056fb9\") " pod="openstack/neutron-7494f8796f-mnl82" Dec 03 08:47:21 crc kubenswrapper[4475]: I1203 08:47:21.296270 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/af078208-3ee6-4330-8601-a7b588056fb9-public-tls-certs\") pod \"neutron-7494f8796f-mnl82\" (UID: \"af078208-3ee6-4330-8601-a7b588056fb9\") " pod="openstack/neutron-7494f8796f-mnl82" Dec 03 08:47:21 crc kubenswrapper[4475]: I1203 08:47:21.397426 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7v8bc\" (UniqueName: \"kubernetes.io/projected/af078208-3ee6-4330-8601-a7b588056fb9-kube-api-access-7v8bc\") pod \"neutron-7494f8796f-mnl82\" (UID: \"af078208-3ee6-4330-8601-a7b588056fb9\") " pod="openstack/neutron-7494f8796f-mnl82" Dec 03 08:47:21 crc kubenswrapper[4475]: I1203 08:47:21.397486 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/af078208-3ee6-4330-8601-a7b588056fb9-public-tls-certs\") pod \"neutron-7494f8796f-mnl82\" (UID: \"af078208-3ee6-4330-8601-a7b588056fb9\") " pod="openstack/neutron-7494f8796f-mnl82" Dec 03 08:47:21 crc kubenswrapper[4475]: I1203 08:47:21.397514 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/af078208-3ee6-4330-8601-a7b588056fb9-combined-ca-bundle\") pod \"neutron-7494f8796f-mnl82\" (UID: \"af078208-3ee6-4330-8601-a7b588056fb9\") " pod="openstack/neutron-7494f8796f-mnl82" Dec 03 08:47:21 crc kubenswrapper[4475]: I1203 08:47:21.397564 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/af078208-3ee6-4330-8601-a7b588056fb9-internal-tls-certs\") pod \"neutron-7494f8796f-mnl82\" (UID: \"af078208-3ee6-4330-8601-a7b588056fb9\") " pod="openstack/neutron-7494f8796f-mnl82" Dec 03 08:47:21 crc kubenswrapper[4475]: I1203 08:47:21.397637 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/af078208-3ee6-4330-8601-a7b588056fb9-config\") pod \"neutron-7494f8796f-mnl82\" (UID: \"af078208-3ee6-4330-8601-a7b588056fb9\") " pod="openstack/neutron-7494f8796f-mnl82" Dec 03 08:47:21 crc kubenswrapper[4475]: I1203 08:47:21.397662 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/af078208-3ee6-4330-8601-a7b588056fb9-httpd-config\") pod \"neutron-7494f8796f-mnl82\" (UID: \"af078208-3ee6-4330-8601-a7b588056fb9\") " pod="openstack/neutron-7494f8796f-mnl82" Dec 03 08:47:21 crc kubenswrapper[4475]: I1203 08:47:21.397697 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/af078208-3ee6-4330-8601-a7b588056fb9-ovndb-tls-certs\") pod \"neutron-7494f8796f-mnl82\" (UID: \"af078208-3ee6-4330-8601-a7b588056fb9\") " pod="openstack/neutron-7494f8796f-mnl82" Dec 03 08:47:21 crc kubenswrapper[4475]: I1203 08:47:21.406525 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/af078208-3ee6-4330-8601-a7b588056fb9-ovndb-tls-certs\") pod \"neutron-7494f8796f-mnl82\" (UID: 
\"af078208-3ee6-4330-8601-a7b588056fb9\") " pod="openstack/neutron-7494f8796f-mnl82" Dec 03 08:47:21 crc kubenswrapper[4475]: I1203 08:47:21.407042 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/af078208-3ee6-4330-8601-a7b588056fb9-internal-tls-certs\") pod \"neutron-7494f8796f-mnl82\" (UID: \"af078208-3ee6-4330-8601-a7b588056fb9\") " pod="openstack/neutron-7494f8796f-mnl82" Dec 03 08:47:21 crc kubenswrapper[4475]: I1203 08:47:21.409976 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/af078208-3ee6-4330-8601-a7b588056fb9-config\") pod \"neutron-7494f8796f-mnl82\" (UID: \"af078208-3ee6-4330-8601-a7b588056fb9\") " pod="openstack/neutron-7494f8796f-mnl82" Dec 03 08:47:21 crc kubenswrapper[4475]: I1203 08:47:21.416522 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/af078208-3ee6-4330-8601-a7b588056fb9-combined-ca-bundle\") pod \"neutron-7494f8796f-mnl82\" (UID: \"af078208-3ee6-4330-8601-a7b588056fb9\") " pod="openstack/neutron-7494f8796f-mnl82" Dec 03 08:47:21 crc kubenswrapper[4475]: I1203 08:47:21.420410 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/af078208-3ee6-4330-8601-a7b588056fb9-public-tls-certs\") pod \"neutron-7494f8796f-mnl82\" (UID: \"af078208-3ee6-4330-8601-a7b588056fb9\") " pod="openstack/neutron-7494f8796f-mnl82" Dec 03 08:47:21 crc kubenswrapper[4475]: I1203 08:47:21.423894 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/af078208-3ee6-4330-8601-a7b588056fb9-httpd-config\") pod \"neutron-7494f8796f-mnl82\" (UID: \"af078208-3ee6-4330-8601-a7b588056fb9\") " pod="openstack/neutron-7494f8796f-mnl82" Dec 03 08:47:21 crc kubenswrapper[4475]: I1203 08:47:21.424577 
4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7v8bc\" (UniqueName: \"kubernetes.io/projected/af078208-3ee6-4330-8601-a7b588056fb9-kube-api-access-7v8bc\") pod \"neutron-7494f8796f-mnl82\" (UID: \"af078208-3ee6-4330-8601-a7b588056fb9\") " pod="openstack/neutron-7494f8796f-mnl82" Dec 03 08:47:21 crc kubenswrapper[4475]: I1203 08:47:21.530855 4475 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-7494f8796f-mnl82" Dec 03 08:47:22 crc kubenswrapper[4475]: I1203 08:47:22.277509 4475 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-7494f8796f-mnl82"] Dec 03 08:47:23 crc kubenswrapper[4475]: I1203 08:47:23.197321 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-7494f8796f-mnl82" event={"ID":"af078208-3ee6-4330-8601-a7b588056fb9","Type":"ContainerStarted","Data":"9787499dba6e96298bbfeed7777b90123f0f9f27db03aa5abbe9b275c1b59fee"} Dec 03 08:47:23 crc kubenswrapper[4475]: I1203 08:47:23.199542 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-7494f8796f-mnl82" event={"ID":"af078208-3ee6-4330-8601-a7b588056fb9","Type":"ContainerStarted","Data":"5669748b10587fcdd13dcf7c141403a8cf25fe57386dd470812febe28df7dc62"} Dec 03 08:47:23 crc kubenswrapper[4475]: I1203 08:47:23.199721 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-7494f8796f-mnl82" event={"ID":"af078208-3ee6-4330-8601-a7b588056fb9","Type":"ContainerStarted","Data":"88df09e87ec014df8b6fcc13d45a46ea7ca67943934b1566db6ed81b317c90bd"} Dec 03 08:47:23 crc kubenswrapper[4475]: I1203 08:47:23.199859 4475 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/neutron-7494f8796f-mnl82" Dec 03 08:47:23 crc kubenswrapper[4475]: I1203 08:47:23.234615 4475 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-7494f8796f-mnl82" podStartSLOduration=2.234602873 
podStartE2EDuration="2.234602873s" podCreationTimestamp="2025-12-03 08:47:21 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 08:47:23.217869479 +0000 UTC m=+7328.022767814" watchObservedRunningTime="2025-12-03 08:47:23.234602873 +0000 UTC m=+7328.039501207" Dec 03 08:47:30 crc kubenswrapper[4475]: I1203 08:47:30.235693 4475 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-5ckd4"] Dec 03 08:47:30 crc kubenswrapper[4475]: I1203 08:47:30.239424 4475 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-5ckd4" Dec 03 08:47:30 crc kubenswrapper[4475]: I1203 08:47:30.247567 4475 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-5ckd4"] Dec 03 08:47:30 crc kubenswrapper[4475]: I1203 08:47:30.320942 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0d8db607-baef-4cf3-ab67-2e6ea6b392ed-catalog-content\") pod \"certified-operators-5ckd4\" (UID: \"0d8db607-baef-4cf3-ab67-2e6ea6b392ed\") " pod="openshift-marketplace/certified-operators-5ckd4" Dec 03 08:47:30 crc kubenswrapper[4475]: I1203 08:47:30.321119 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0d8db607-baef-4cf3-ab67-2e6ea6b392ed-utilities\") pod \"certified-operators-5ckd4\" (UID: \"0d8db607-baef-4cf3-ab67-2e6ea6b392ed\") " pod="openshift-marketplace/certified-operators-5ckd4" Dec 03 08:47:30 crc kubenswrapper[4475]: I1203 08:47:30.321475 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vr755\" (UniqueName: 
\"kubernetes.io/projected/0d8db607-baef-4cf3-ab67-2e6ea6b392ed-kube-api-access-vr755\") pod \"certified-operators-5ckd4\" (UID: \"0d8db607-baef-4cf3-ab67-2e6ea6b392ed\") " pod="openshift-marketplace/certified-operators-5ckd4" Dec 03 08:47:30 crc kubenswrapper[4475]: I1203 08:47:30.423506 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0d8db607-baef-4cf3-ab67-2e6ea6b392ed-catalog-content\") pod \"certified-operators-5ckd4\" (UID: \"0d8db607-baef-4cf3-ab67-2e6ea6b392ed\") " pod="openshift-marketplace/certified-operators-5ckd4" Dec 03 08:47:30 crc kubenswrapper[4475]: I1203 08:47:30.423765 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0d8db607-baef-4cf3-ab67-2e6ea6b392ed-utilities\") pod \"certified-operators-5ckd4\" (UID: \"0d8db607-baef-4cf3-ab67-2e6ea6b392ed\") " pod="openshift-marketplace/certified-operators-5ckd4" Dec 03 08:47:30 crc kubenswrapper[4475]: I1203 08:47:30.424238 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vr755\" (UniqueName: \"kubernetes.io/projected/0d8db607-baef-4cf3-ab67-2e6ea6b392ed-kube-api-access-vr755\") pod \"certified-operators-5ckd4\" (UID: \"0d8db607-baef-4cf3-ab67-2e6ea6b392ed\") " pod="openshift-marketplace/certified-operators-5ckd4" Dec 03 08:47:30 crc kubenswrapper[4475]: I1203 08:47:30.424527 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0d8db607-baef-4cf3-ab67-2e6ea6b392ed-catalog-content\") pod \"certified-operators-5ckd4\" (UID: \"0d8db607-baef-4cf3-ab67-2e6ea6b392ed\") " pod="openshift-marketplace/certified-operators-5ckd4" Dec 03 08:47:30 crc kubenswrapper[4475]: I1203 08:47:30.425108 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: 
\"kubernetes.io/empty-dir/0d8db607-baef-4cf3-ab67-2e6ea6b392ed-utilities\") pod \"certified-operators-5ckd4\" (UID: \"0d8db607-baef-4cf3-ab67-2e6ea6b392ed\") " pod="openshift-marketplace/certified-operators-5ckd4" Dec 03 08:47:30 crc kubenswrapper[4475]: I1203 08:47:30.455379 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vr755\" (UniqueName: \"kubernetes.io/projected/0d8db607-baef-4cf3-ab67-2e6ea6b392ed-kube-api-access-vr755\") pod \"certified-operators-5ckd4\" (UID: \"0d8db607-baef-4cf3-ab67-2e6ea6b392ed\") " pod="openshift-marketplace/certified-operators-5ckd4" Dec 03 08:47:30 crc kubenswrapper[4475]: I1203 08:47:30.564200 4475 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-5ckd4" Dec 03 08:47:31 crc kubenswrapper[4475]: I1203 08:47:31.218630 4475 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-5ckd4"] Dec 03 08:47:31 crc kubenswrapper[4475]: I1203 08:47:31.273609 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-5ckd4" event={"ID":"0d8db607-baef-4cf3-ab67-2e6ea6b392ed","Type":"ContainerStarted","Data":"1a476addec7248f1639221918a5cab49ad47bef84b2a2a82d152eada6a7ebaf9"} Dec 03 08:47:32 crc kubenswrapper[4475]: I1203 08:47:32.283222 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-5ckd4" event={"ID":"0d8db607-baef-4cf3-ab67-2e6ea6b392ed","Type":"ContainerDied","Data":"ba66f50016ddef317b5d1e3bd465d0614c10836fe251901a9d0e847e3870e015"} Dec 03 08:47:32 crc kubenswrapper[4475]: I1203 08:47:32.283291 4475 generic.go:334] "Generic (PLEG): container finished" podID="0d8db607-baef-4cf3-ab67-2e6ea6b392ed" containerID="ba66f50016ddef317b5d1e3bd465d0614c10836fe251901a9d0e847e3870e015" exitCode=0 Dec 03 08:47:34 crc kubenswrapper[4475]: I1203 08:47:34.303991 4475 generic.go:334] "Generic (PLEG): container 
finished" podID="0d8db607-baef-4cf3-ab67-2e6ea6b392ed" containerID="bfa747459d0a7473145923adae753b97425887077fc3884b4e06d3bf581b2959" exitCode=0 Dec 03 08:47:34 crc kubenswrapper[4475]: I1203 08:47:34.304496 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-5ckd4" event={"ID":"0d8db607-baef-4cf3-ab67-2e6ea6b392ed","Type":"ContainerDied","Data":"bfa747459d0a7473145923adae753b97425887077fc3884b4e06d3bf581b2959"} Dec 03 08:47:35 crc kubenswrapper[4475]: I1203 08:47:35.317143 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-5ckd4" event={"ID":"0d8db607-baef-4cf3-ab67-2e6ea6b392ed","Type":"ContainerStarted","Data":"d58e428bbd44b6a9f96236af7380af860645b81cd4f8b447fcfb3c95a9df90b6"} Dec 03 08:47:35 crc kubenswrapper[4475]: I1203 08:47:35.346523 4475 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-5ckd4" podStartSLOduration=2.786954299 podStartE2EDuration="5.34454133s" podCreationTimestamp="2025-12-03 08:47:30 +0000 UTC" firstStartedPulling="2025-12-03 08:47:32.284999863 +0000 UTC m=+7337.089898197" lastFinishedPulling="2025-12-03 08:47:34.842586895 +0000 UTC m=+7339.647485228" observedRunningTime="2025-12-03 08:47:35.33532436 +0000 UTC m=+7340.140222693" watchObservedRunningTime="2025-12-03 08:47:35.34454133 +0000 UTC m=+7340.149439664" Dec 03 08:47:40 crc kubenswrapper[4475]: I1203 08:47:40.566034 4475 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-5ckd4" Dec 03 08:47:40 crc kubenswrapper[4475]: I1203 08:47:40.567412 4475 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-5ckd4" Dec 03 08:47:40 crc kubenswrapper[4475]: I1203 08:47:40.617192 4475 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-5ckd4" Dec 03 
08:47:41 crc kubenswrapper[4475]: I1203 08:47:41.412381 4475 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-5ckd4" Dec 03 08:47:41 crc kubenswrapper[4475]: I1203 08:47:41.456524 4475 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-5ckd4"] Dec 03 08:47:43 crc kubenswrapper[4475]: I1203 08:47:43.397665 4475 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-5ckd4" podUID="0d8db607-baef-4cf3-ab67-2e6ea6b392ed" containerName="registry-server" containerID="cri-o://d58e428bbd44b6a9f96236af7380af860645b81cd4f8b447fcfb3c95a9df90b6" gracePeriod=2 Dec 03 08:47:44 crc kubenswrapper[4475]: I1203 08:47:44.089035 4475 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-5ckd4" Dec 03 08:47:44 crc kubenswrapper[4475]: I1203 08:47:44.248703 4475 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0d8db607-baef-4cf3-ab67-2e6ea6b392ed-utilities\") pod \"0d8db607-baef-4cf3-ab67-2e6ea6b392ed\" (UID: \"0d8db607-baef-4cf3-ab67-2e6ea6b392ed\") " Dec 03 08:47:44 crc kubenswrapper[4475]: I1203 08:47:44.248853 4475 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0d8db607-baef-4cf3-ab67-2e6ea6b392ed-catalog-content\") pod \"0d8db607-baef-4cf3-ab67-2e6ea6b392ed\" (UID: \"0d8db607-baef-4cf3-ab67-2e6ea6b392ed\") " Dec 03 08:47:44 crc kubenswrapper[4475]: I1203 08:47:44.248908 4475 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vr755\" (UniqueName: \"kubernetes.io/projected/0d8db607-baef-4cf3-ab67-2e6ea6b392ed-kube-api-access-vr755\") pod \"0d8db607-baef-4cf3-ab67-2e6ea6b392ed\" (UID: \"0d8db607-baef-4cf3-ab67-2e6ea6b392ed\") " Dec 
03 08:47:44 crc kubenswrapper[4475]: I1203 08:47:44.251167 4475 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0d8db607-baef-4cf3-ab67-2e6ea6b392ed-utilities" (OuterVolumeSpecName: "utilities") pod "0d8db607-baef-4cf3-ab67-2e6ea6b392ed" (UID: "0d8db607-baef-4cf3-ab67-2e6ea6b392ed"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 08:47:44 crc kubenswrapper[4475]: I1203 08:47:44.261947 4475 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0d8db607-baef-4cf3-ab67-2e6ea6b392ed-kube-api-access-vr755" (OuterVolumeSpecName: "kube-api-access-vr755") pod "0d8db607-baef-4cf3-ab67-2e6ea6b392ed" (UID: "0d8db607-baef-4cf3-ab67-2e6ea6b392ed"). InnerVolumeSpecName "kube-api-access-vr755". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 08:47:44 crc kubenswrapper[4475]: I1203 08:47:44.289280 4475 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0d8db607-baef-4cf3-ab67-2e6ea6b392ed-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "0d8db607-baef-4cf3-ab67-2e6ea6b392ed" (UID: "0d8db607-baef-4cf3-ab67-2e6ea6b392ed"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 08:47:44 crc kubenswrapper[4475]: I1203 08:47:44.352860 4475 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0d8db607-baef-4cf3-ab67-2e6ea6b392ed-utilities\") on node \"crc\" DevicePath \"\"" Dec 03 08:47:44 crc kubenswrapper[4475]: I1203 08:47:44.352971 4475 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0d8db607-baef-4cf3-ab67-2e6ea6b392ed-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 03 08:47:44 crc kubenswrapper[4475]: I1203 08:47:44.353039 4475 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vr755\" (UniqueName: \"kubernetes.io/projected/0d8db607-baef-4cf3-ab67-2e6ea6b392ed-kube-api-access-vr755\") on node \"crc\" DevicePath \"\"" Dec 03 08:47:44 crc kubenswrapper[4475]: I1203 08:47:44.404781 4475 generic.go:334] "Generic (PLEG): container finished" podID="0d8db607-baef-4cf3-ab67-2e6ea6b392ed" containerID="d58e428bbd44b6a9f96236af7380af860645b81cd4f8b447fcfb3c95a9df90b6" exitCode=0 Dec 03 08:47:44 crc kubenswrapper[4475]: I1203 08:47:44.404871 4475 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-5ckd4" Dec 03 08:47:44 crc kubenswrapper[4475]: I1203 08:47:44.404882 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-5ckd4" event={"ID":"0d8db607-baef-4cf3-ab67-2e6ea6b392ed","Type":"ContainerDied","Data":"d58e428bbd44b6a9f96236af7380af860645b81cd4f8b447fcfb3c95a9df90b6"} Dec 03 08:47:44 crc kubenswrapper[4475]: I1203 08:47:44.405962 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-5ckd4" event={"ID":"0d8db607-baef-4cf3-ab67-2e6ea6b392ed","Type":"ContainerDied","Data":"1a476addec7248f1639221918a5cab49ad47bef84b2a2a82d152eada6a7ebaf9"} Dec 03 08:47:44 crc kubenswrapper[4475]: I1203 08:47:44.406398 4475 scope.go:117] "RemoveContainer" containerID="d58e428bbd44b6a9f96236af7380af860645b81cd4f8b447fcfb3c95a9df90b6" Dec 03 08:47:44 crc kubenswrapper[4475]: I1203 08:47:44.437148 4475 scope.go:117] "RemoveContainer" containerID="bfa747459d0a7473145923adae753b97425887077fc3884b4e06d3bf581b2959" Dec 03 08:47:44 crc kubenswrapper[4475]: I1203 08:47:44.445907 4475 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-5ckd4"] Dec 03 08:47:44 crc kubenswrapper[4475]: I1203 08:47:44.459128 4475 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-5ckd4"] Dec 03 08:47:44 crc kubenswrapper[4475]: I1203 08:47:44.481204 4475 scope.go:117] "RemoveContainer" containerID="ba66f50016ddef317b5d1e3bd465d0614c10836fe251901a9d0e847e3870e015" Dec 03 08:47:44 crc kubenswrapper[4475]: I1203 08:47:44.515602 4475 scope.go:117] "RemoveContainer" containerID="d58e428bbd44b6a9f96236af7380af860645b81cd4f8b447fcfb3c95a9df90b6" Dec 03 08:47:44 crc kubenswrapper[4475]: E1203 08:47:44.518492 4475 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"d58e428bbd44b6a9f96236af7380af860645b81cd4f8b447fcfb3c95a9df90b6\": container with ID starting with d58e428bbd44b6a9f96236af7380af860645b81cd4f8b447fcfb3c95a9df90b6 not found: ID does not exist" containerID="d58e428bbd44b6a9f96236af7380af860645b81cd4f8b447fcfb3c95a9df90b6" Dec 03 08:47:44 crc kubenswrapper[4475]: I1203 08:47:44.519018 4475 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d58e428bbd44b6a9f96236af7380af860645b81cd4f8b447fcfb3c95a9df90b6"} err="failed to get container status \"d58e428bbd44b6a9f96236af7380af860645b81cd4f8b447fcfb3c95a9df90b6\": rpc error: code = NotFound desc = could not find container \"d58e428bbd44b6a9f96236af7380af860645b81cd4f8b447fcfb3c95a9df90b6\": container with ID starting with d58e428bbd44b6a9f96236af7380af860645b81cd4f8b447fcfb3c95a9df90b6 not found: ID does not exist" Dec 03 08:47:44 crc kubenswrapper[4475]: I1203 08:47:44.519055 4475 scope.go:117] "RemoveContainer" containerID="bfa747459d0a7473145923adae753b97425887077fc3884b4e06d3bf581b2959" Dec 03 08:47:44 crc kubenswrapper[4475]: E1203 08:47:44.519405 4475 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"bfa747459d0a7473145923adae753b97425887077fc3884b4e06d3bf581b2959\": container with ID starting with bfa747459d0a7473145923adae753b97425887077fc3884b4e06d3bf581b2959 not found: ID does not exist" containerID="bfa747459d0a7473145923adae753b97425887077fc3884b4e06d3bf581b2959" Dec 03 08:47:44 crc kubenswrapper[4475]: I1203 08:47:44.519439 4475 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bfa747459d0a7473145923adae753b97425887077fc3884b4e06d3bf581b2959"} err="failed to get container status \"bfa747459d0a7473145923adae753b97425887077fc3884b4e06d3bf581b2959\": rpc error: code = NotFound desc = could not find container \"bfa747459d0a7473145923adae753b97425887077fc3884b4e06d3bf581b2959\": container with ID 
starting with bfa747459d0a7473145923adae753b97425887077fc3884b4e06d3bf581b2959 not found: ID does not exist" Dec 03 08:47:44 crc kubenswrapper[4475]: I1203 08:47:44.519510 4475 scope.go:117] "RemoveContainer" containerID="ba66f50016ddef317b5d1e3bd465d0614c10836fe251901a9d0e847e3870e015" Dec 03 08:47:44 crc kubenswrapper[4475]: E1203 08:47:44.520014 4475 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ba66f50016ddef317b5d1e3bd465d0614c10836fe251901a9d0e847e3870e015\": container with ID starting with ba66f50016ddef317b5d1e3bd465d0614c10836fe251901a9d0e847e3870e015 not found: ID does not exist" containerID="ba66f50016ddef317b5d1e3bd465d0614c10836fe251901a9d0e847e3870e015" Dec 03 08:47:44 crc kubenswrapper[4475]: I1203 08:47:44.520043 4475 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ba66f50016ddef317b5d1e3bd465d0614c10836fe251901a9d0e847e3870e015"} err="failed to get container status \"ba66f50016ddef317b5d1e3bd465d0614c10836fe251901a9d0e847e3870e015\": rpc error: code = NotFound desc = could not find container \"ba66f50016ddef317b5d1e3bd465d0614c10836fe251901a9d0e847e3870e015\": container with ID starting with ba66f50016ddef317b5d1e3bd465d0614c10836fe251901a9d0e847e3870e015 not found: ID does not exist" Dec 03 08:47:45 crc kubenswrapper[4475]: I1203 08:47:45.501185 4475 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0d8db607-baef-4cf3-ab67-2e6ea6b392ed" path="/var/lib/kubelet/pods/0d8db607-baef-4cf3-ab67-2e6ea6b392ed/volumes" Dec 03 08:47:51 crc kubenswrapper[4475]: I1203 08:47:51.548886 4475 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/neutron-7494f8796f-mnl82" Dec 03 08:47:51 crc kubenswrapper[4475]: I1203 08:47:51.629934 4475 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-6764df74d9-l26gg"] Dec 03 08:47:51 crc kubenswrapper[4475]: I1203 08:47:51.630905 
4475 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/neutron-6764df74d9-l26gg" podUID="356706b5-1a23-42f9-bbad-78dcc26dbddd" containerName="neutron-httpd" containerID="cri-o://76f0165daae81d1d200744fdd06739e5b0a60a8e39e871564e858064ad07d503" gracePeriod=30 Dec 03 08:47:51 crc kubenswrapper[4475]: I1203 08:47:51.630852 4475 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/neutron-6764df74d9-l26gg" podUID="356706b5-1a23-42f9-bbad-78dcc26dbddd" containerName="neutron-api" containerID="cri-o://3b4a8d0acbe337d3cc99d3dcd0afa6d9c1a8719f74cff3ee84b5aef4f73aff1a" gracePeriod=30 Dec 03 08:47:52 crc kubenswrapper[4475]: I1203 08:47:52.495171 4475 generic.go:334] "Generic (PLEG): container finished" podID="356706b5-1a23-42f9-bbad-78dcc26dbddd" containerID="76f0165daae81d1d200744fdd06739e5b0a60a8e39e871564e858064ad07d503" exitCode=0 Dec 03 08:47:52 crc kubenswrapper[4475]: I1203 08:47:52.495251 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-6764df74d9-l26gg" event={"ID":"356706b5-1a23-42f9-bbad-78dcc26dbddd","Type":"ContainerDied","Data":"76f0165daae81d1d200744fdd06739e5b0a60a8e39e871564e858064ad07d503"} Dec 03 08:48:00 crc kubenswrapper[4475]: I1203 08:48:00.169113 4475 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-6764df74d9-l26gg" Dec 03 08:48:00 crc kubenswrapper[4475]: I1203 08:48:00.354700 4475 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/356706b5-1a23-42f9-bbad-78dcc26dbddd-ovndb-tls-certs\") pod \"356706b5-1a23-42f9-bbad-78dcc26dbddd\" (UID: \"356706b5-1a23-42f9-bbad-78dcc26dbddd\") " Dec 03 08:48:00 crc kubenswrapper[4475]: I1203 08:48:00.355061 4475 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/356706b5-1a23-42f9-bbad-78dcc26dbddd-public-tls-certs\") pod \"356706b5-1a23-42f9-bbad-78dcc26dbddd\" (UID: \"356706b5-1a23-42f9-bbad-78dcc26dbddd\") " Dec 03 08:48:00 crc kubenswrapper[4475]: I1203 08:48:00.355305 4475 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/356706b5-1a23-42f9-bbad-78dcc26dbddd-config\") pod \"356706b5-1a23-42f9-bbad-78dcc26dbddd\" (UID: \"356706b5-1a23-42f9-bbad-78dcc26dbddd\") " Dec 03 08:48:00 crc kubenswrapper[4475]: I1203 08:48:00.355377 4475 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bzfnc\" (UniqueName: \"kubernetes.io/projected/356706b5-1a23-42f9-bbad-78dcc26dbddd-kube-api-access-bzfnc\") pod \"356706b5-1a23-42f9-bbad-78dcc26dbddd\" (UID: \"356706b5-1a23-42f9-bbad-78dcc26dbddd\") " Dec 03 08:48:00 crc kubenswrapper[4475]: I1203 08:48:00.355500 4475 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/356706b5-1a23-42f9-bbad-78dcc26dbddd-internal-tls-certs\") pod \"356706b5-1a23-42f9-bbad-78dcc26dbddd\" (UID: \"356706b5-1a23-42f9-bbad-78dcc26dbddd\") " Dec 03 08:48:00 crc kubenswrapper[4475]: I1203 08:48:00.355624 4475 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-config\" 
(UniqueName: \"kubernetes.io/secret/356706b5-1a23-42f9-bbad-78dcc26dbddd-httpd-config\") pod \"356706b5-1a23-42f9-bbad-78dcc26dbddd\" (UID: \"356706b5-1a23-42f9-bbad-78dcc26dbddd\") " Dec 03 08:48:00 crc kubenswrapper[4475]: I1203 08:48:00.355713 4475 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/356706b5-1a23-42f9-bbad-78dcc26dbddd-combined-ca-bundle\") pod \"356706b5-1a23-42f9-bbad-78dcc26dbddd\" (UID: \"356706b5-1a23-42f9-bbad-78dcc26dbddd\") " Dec 03 08:48:00 crc kubenswrapper[4475]: I1203 08:48:00.380910 4475 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/356706b5-1a23-42f9-bbad-78dcc26dbddd-kube-api-access-bzfnc" (OuterVolumeSpecName: "kube-api-access-bzfnc") pod "356706b5-1a23-42f9-bbad-78dcc26dbddd" (UID: "356706b5-1a23-42f9-bbad-78dcc26dbddd"). InnerVolumeSpecName "kube-api-access-bzfnc". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 08:48:00 crc kubenswrapper[4475]: I1203 08:48:00.383324 4475 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/356706b5-1a23-42f9-bbad-78dcc26dbddd-httpd-config" (OuterVolumeSpecName: "httpd-config") pod "356706b5-1a23-42f9-bbad-78dcc26dbddd" (UID: "356706b5-1a23-42f9-bbad-78dcc26dbddd"). InnerVolumeSpecName "httpd-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 08:48:00 crc kubenswrapper[4475]: I1203 08:48:00.410136 4475 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/356706b5-1a23-42f9-bbad-78dcc26dbddd-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "356706b5-1a23-42f9-bbad-78dcc26dbddd" (UID: "356706b5-1a23-42f9-bbad-78dcc26dbddd"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 08:48:00 crc kubenswrapper[4475]: I1203 08:48:00.410848 4475 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/356706b5-1a23-42f9-bbad-78dcc26dbddd-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "356706b5-1a23-42f9-bbad-78dcc26dbddd" (UID: "356706b5-1a23-42f9-bbad-78dcc26dbddd"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 08:48:00 crc kubenswrapper[4475]: I1203 08:48:00.415186 4475 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/356706b5-1a23-42f9-bbad-78dcc26dbddd-config" (OuterVolumeSpecName: "config") pod "356706b5-1a23-42f9-bbad-78dcc26dbddd" (UID: "356706b5-1a23-42f9-bbad-78dcc26dbddd"). InnerVolumeSpecName "config". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 08:48:00 crc kubenswrapper[4475]: I1203 08:48:00.431170 4475 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/356706b5-1a23-42f9-bbad-78dcc26dbddd-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "356706b5-1a23-42f9-bbad-78dcc26dbddd" (UID: "356706b5-1a23-42f9-bbad-78dcc26dbddd"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 08:48:00 crc kubenswrapper[4475]: I1203 08:48:00.433708 4475 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/356706b5-1a23-42f9-bbad-78dcc26dbddd-ovndb-tls-certs" (OuterVolumeSpecName: "ovndb-tls-certs") pod "356706b5-1a23-42f9-bbad-78dcc26dbddd" (UID: "356706b5-1a23-42f9-bbad-78dcc26dbddd"). InnerVolumeSpecName "ovndb-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 08:48:00 crc kubenswrapper[4475]: I1203 08:48:00.459048 4475 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/356706b5-1a23-42f9-bbad-78dcc26dbddd-public-tls-certs\") on node \"crc\" DevicePath \"\"" Dec 03 08:48:00 crc kubenswrapper[4475]: I1203 08:48:00.459074 4475 reconciler_common.go:293] "Volume detached for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/356706b5-1a23-42f9-bbad-78dcc26dbddd-ovndb-tls-certs\") on node \"crc\" DevicePath \"\"" Dec 03 08:48:00 crc kubenswrapper[4475]: I1203 08:48:00.459084 4475 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/356706b5-1a23-42f9-bbad-78dcc26dbddd-config\") on node \"crc\" DevicePath \"\"" Dec 03 08:48:00 crc kubenswrapper[4475]: I1203 08:48:00.459097 4475 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bzfnc\" (UniqueName: \"kubernetes.io/projected/356706b5-1a23-42f9-bbad-78dcc26dbddd-kube-api-access-bzfnc\") on node \"crc\" DevicePath \"\"" Dec 03 08:48:00 crc kubenswrapper[4475]: I1203 08:48:00.459108 4475 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/356706b5-1a23-42f9-bbad-78dcc26dbddd-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Dec 03 08:48:00 crc kubenswrapper[4475]: I1203 08:48:00.459117 4475 reconciler_common.go:293] "Volume detached for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/356706b5-1a23-42f9-bbad-78dcc26dbddd-httpd-config\") on node \"crc\" DevicePath \"\"" Dec 03 08:48:00 crc kubenswrapper[4475]: I1203 08:48:00.459126 4475 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/356706b5-1a23-42f9-bbad-78dcc26dbddd-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 03 08:48:00 crc kubenswrapper[4475]: I1203 08:48:00.575914 4475 
generic.go:334] "Generic (PLEG): container finished" podID="356706b5-1a23-42f9-bbad-78dcc26dbddd" containerID="3b4a8d0acbe337d3cc99d3dcd0afa6d9c1a8719f74cff3ee84b5aef4f73aff1a" exitCode=0 Dec 03 08:48:00 crc kubenswrapper[4475]: I1203 08:48:00.575968 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-6764df74d9-l26gg" event={"ID":"356706b5-1a23-42f9-bbad-78dcc26dbddd","Type":"ContainerDied","Data":"3b4a8d0acbe337d3cc99d3dcd0afa6d9c1a8719f74cff3ee84b5aef4f73aff1a"} Dec 03 08:48:00 crc kubenswrapper[4475]: I1203 08:48:00.576003 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-6764df74d9-l26gg" event={"ID":"356706b5-1a23-42f9-bbad-78dcc26dbddd","Type":"ContainerDied","Data":"25216b2f8bb18f50773fada67aa203dce5bb55e3be4b70440318054660945fa0"} Dec 03 08:48:00 crc kubenswrapper[4475]: I1203 08:48:00.576021 4475 scope.go:117] "RemoveContainer" containerID="76f0165daae81d1d200744fdd06739e5b0a60a8e39e871564e858064ad07d503" Dec 03 08:48:00 crc kubenswrapper[4475]: I1203 08:48:00.576159 4475 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-6764df74d9-l26gg" Dec 03 08:48:00 crc kubenswrapper[4475]: I1203 08:48:00.602656 4475 scope.go:117] "RemoveContainer" containerID="3b4a8d0acbe337d3cc99d3dcd0afa6d9c1a8719f74cff3ee84b5aef4f73aff1a" Dec 03 08:48:00 crc kubenswrapper[4475]: I1203 08:48:00.611564 4475 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-6764df74d9-l26gg"] Dec 03 08:48:00 crc kubenswrapper[4475]: I1203 08:48:00.617039 4475 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-6764df74d9-l26gg"] Dec 03 08:48:00 crc kubenswrapper[4475]: I1203 08:48:00.632880 4475 scope.go:117] "RemoveContainer" containerID="76f0165daae81d1d200744fdd06739e5b0a60a8e39e871564e858064ad07d503" Dec 03 08:48:00 crc kubenswrapper[4475]: E1203 08:48:00.633358 4475 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"76f0165daae81d1d200744fdd06739e5b0a60a8e39e871564e858064ad07d503\": container with ID starting with 76f0165daae81d1d200744fdd06739e5b0a60a8e39e871564e858064ad07d503 not found: ID does not exist" containerID="76f0165daae81d1d200744fdd06739e5b0a60a8e39e871564e858064ad07d503" Dec 03 08:48:00 crc kubenswrapper[4475]: I1203 08:48:00.633401 4475 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"76f0165daae81d1d200744fdd06739e5b0a60a8e39e871564e858064ad07d503"} err="failed to get container status \"76f0165daae81d1d200744fdd06739e5b0a60a8e39e871564e858064ad07d503\": rpc error: code = NotFound desc = could not find container \"76f0165daae81d1d200744fdd06739e5b0a60a8e39e871564e858064ad07d503\": container with ID starting with 76f0165daae81d1d200744fdd06739e5b0a60a8e39e871564e858064ad07d503 not found: ID does not exist" Dec 03 08:48:00 crc kubenswrapper[4475]: I1203 08:48:00.633446 4475 scope.go:117] "RemoveContainer" containerID="3b4a8d0acbe337d3cc99d3dcd0afa6d9c1a8719f74cff3ee84b5aef4f73aff1a" Dec 03 08:48:00 
crc kubenswrapper[4475]: E1203 08:48:00.633798 4475 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3b4a8d0acbe337d3cc99d3dcd0afa6d9c1a8719f74cff3ee84b5aef4f73aff1a\": container with ID starting with 3b4a8d0acbe337d3cc99d3dcd0afa6d9c1a8719f74cff3ee84b5aef4f73aff1a not found: ID does not exist" containerID="3b4a8d0acbe337d3cc99d3dcd0afa6d9c1a8719f74cff3ee84b5aef4f73aff1a" Dec 03 08:48:00 crc kubenswrapper[4475]: I1203 08:48:00.633831 4475 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3b4a8d0acbe337d3cc99d3dcd0afa6d9c1a8719f74cff3ee84b5aef4f73aff1a"} err="failed to get container status \"3b4a8d0acbe337d3cc99d3dcd0afa6d9c1a8719f74cff3ee84b5aef4f73aff1a\": rpc error: code = NotFound desc = could not find container \"3b4a8d0acbe337d3cc99d3dcd0afa6d9c1a8719f74cff3ee84b5aef4f73aff1a\": container with ID starting with 3b4a8d0acbe337d3cc99d3dcd0afa6d9c1a8719f74cff3ee84b5aef4f73aff1a not found: ID does not exist" Dec 03 08:48:01 crc kubenswrapper[4475]: I1203 08:48:01.502061 4475 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="356706b5-1a23-42f9-bbad-78dcc26dbddd" path="/var/lib/kubelet/pods/356706b5-1a23-42f9-bbad-78dcc26dbddd/volumes" Dec 03 08:48:25 crc kubenswrapper[4475]: I1203 08:48:25.912078 4475 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-vrhq8"] Dec 03 08:48:25 crc kubenswrapper[4475]: E1203 08:48:25.913236 4475 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0d8db607-baef-4cf3-ab67-2e6ea6b392ed" containerName="extract-utilities" Dec 03 08:48:25 crc kubenswrapper[4475]: I1203 08:48:25.913255 4475 state_mem.go:107] "Deleted CPUSet assignment" podUID="0d8db607-baef-4cf3-ab67-2e6ea6b392ed" containerName="extract-utilities" Dec 03 08:48:25 crc kubenswrapper[4475]: E1203 08:48:25.913276 4475 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="356706b5-1a23-42f9-bbad-78dcc26dbddd" containerName="neutron-httpd" Dec 03 08:48:25 crc kubenswrapper[4475]: I1203 08:48:25.913282 4475 state_mem.go:107] "Deleted CPUSet assignment" podUID="356706b5-1a23-42f9-bbad-78dcc26dbddd" containerName="neutron-httpd" Dec 03 08:48:25 crc kubenswrapper[4475]: E1203 08:48:25.913306 4475 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0d8db607-baef-4cf3-ab67-2e6ea6b392ed" containerName="registry-server" Dec 03 08:48:25 crc kubenswrapper[4475]: I1203 08:48:25.913312 4475 state_mem.go:107] "Deleted CPUSet assignment" podUID="0d8db607-baef-4cf3-ab67-2e6ea6b392ed" containerName="registry-server" Dec 03 08:48:25 crc kubenswrapper[4475]: E1203 08:48:25.913327 4475 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0d8db607-baef-4cf3-ab67-2e6ea6b392ed" containerName="extract-content" Dec 03 08:48:25 crc kubenswrapper[4475]: I1203 08:48:25.913333 4475 state_mem.go:107] "Deleted CPUSet assignment" podUID="0d8db607-baef-4cf3-ab67-2e6ea6b392ed" containerName="extract-content" Dec 03 08:48:25 crc kubenswrapper[4475]: E1203 08:48:25.913347 4475 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="356706b5-1a23-42f9-bbad-78dcc26dbddd" containerName="neutron-api" Dec 03 08:48:25 crc kubenswrapper[4475]: I1203 08:48:25.913353 4475 state_mem.go:107] "Deleted CPUSet assignment" podUID="356706b5-1a23-42f9-bbad-78dcc26dbddd" containerName="neutron-api" Dec 03 08:48:25 crc kubenswrapper[4475]: I1203 08:48:25.913569 4475 memory_manager.go:354] "RemoveStaleState removing state" podUID="356706b5-1a23-42f9-bbad-78dcc26dbddd" containerName="neutron-httpd" Dec 03 08:48:25 crc kubenswrapper[4475]: I1203 08:48:25.913581 4475 memory_manager.go:354] "RemoveStaleState removing state" podUID="356706b5-1a23-42f9-bbad-78dcc26dbddd" containerName="neutron-api" Dec 03 08:48:25 crc kubenswrapper[4475]: I1203 08:48:25.913589 4475 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="0d8db607-baef-4cf3-ab67-2e6ea6b392ed" containerName="registry-server" Dec 03 08:48:25 crc kubenswrapper[4475]: I1203 08:48:25.915117 4475 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-vrhq8" Dec 03 08:48:25 crc kubenswrapper[4475]: I1203 08:48:25.927394 4475 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-vrhq8"] Dec 03 08:48:25 crc kubenswrapper[4475]: I1203 08:48:25.989050 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/86bea7dc-8747-49ec-8d56-55f3d25af692-utilities\") pod \"community-operators-vrhq8\" (UID: \"86bea7dc-8747-49ec-8d56-55f3d25af692\") " pod="openshift-marketplace/community-operators-vrhq8" Dec 03 08:48:25 crc kubenswrapper[4475]: I1203 08:48:25.989188 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/86bea7dc-8747-49ec-8d56-55f3d25af692-catalog-content\") pod \"community-operators-vrhq8\" (UID: \"86bea7dc-8747-49ec-8d56-55f3d25af692\") " pod="openshift-marketplace/community-operators-vrhq8" Dec 03 08:48:25 crc kubenswrapper[4475]: I1203 08:48:25.989237 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wcbhq\" (UniqueName: \"kubernetes.io/projected/86bea7dc-8747-49ec-8d56-55f3d25af692-kube-api-access-wcbhq\") pod \"community-operators-vrhq8\" (UID: \"86bea7dc-8747-49ec-8d56-55f3d25af692\") " pod="openshift-marketplace/community-operators-vrhq8" Dec 03 08:48:26 crc kubenswrapper[4475]: I1203 08:48:26.091583 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/86bea7dc-8747-49ec-8d56-55f3d25af692-utilities\") pod \"community-operators-vrhq8\" (UID: 
\"86bea7dc-8747-49ec-8d56-55f3d25af692\") " pod="openshift-marketplace/community-operators-vrhq8" Dec 03 08:48:26 crc kubenswrapper[4475]: I1203 08:48:26.092001 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/86bea7dc-8747-49ec-8d56-55f3d25af692-catalog-content\") pod \"community-operators-vrhq8\" (UID: \"86bea7dc-8747-49ec-8d56-55f3d25af692\") " pod="openshift-marketplace/community-operators-vrhq8" Dec 03 08:48:26 crc kubenswrapper[4475]: I1203 08:48:26.092119 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wcbhq\" (UniqueName: \"kubernetes.io/projected/86bea7dc-8747-49ec-8d56-55f3d25af692-kube-api-access-wcbhq\") pod \"community-operators-vrhq8\" (UID: \"86bea7dc-8747-49ec-8d56-55f3d25af692\") " pod="openshift-marketplace/community-operators-vrhq8" Dec 03 08:48:26 crc kubenswrapper[4475]: I1203 08:48:26.092125 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/86bea7dc-8747-49ec-8d56-55f3d25af692-utilities\") pod \"community-operators-vrhq8\" (UID: \"86bea7dc-8747-49ec-8d56-55f3d25af692\") " pod="openshift-marketplace/community-operators-vrhq8" Dec 03 08:48:26 crc kubenswrapper[4475]: I1203 08:48:26.092356 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/86bea7dc-8747-49ec-8d56-55f3d25af692-catalog-content\") pod \"community-operators-vrhq8\" (UID: \"86bea7dc-8747-49ec-8d56-55f3d25af692\") " pod="openshift-marketplace/community-operators-vrhq8" Dec 03 08:48:26 crc kubenswrapper[4475]: I1203 08:48:26.115328 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wcbhq\" (UniqueName: \"kubernetes.io/projected/86bea7dc-8747-49ec-8d56-55f3d25af692-kube-api-access-wcbhq\") pod \"community-operators-vrhq8\" (UID: 
\"86bea7dc-8747-49ec-8d56-55f3d25af692\") " pod="openshift-marketplace/community-operators-vrhq8" Dec 03 08:48:26 crc kubenswrapper[4475]: I1203 08:48:26.245418 4475 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-vrhq8" Dec 03 08:48:27 crc kubenswrapper[4475]: I1203 08:48:27.009466 4475 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-vrhq8"] Dec 03 08:48:27 crc kubenswrapper[4475]: I1203 08:48:27.827879 4475 generic.go:334] "Generic (PLEG): container finished" podID="86bea7dc-8747-49ec-8d56-55f3d25af692" containerID="11a89ef7465c4a5cee9c5046d85c08cbcbd95866c1f42bcc347d4728ca1dc4f1" exitCode=0 Dec 03 08:48:27 crc kubenswrapper[4475]: I1203 08:48:27.828046 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-vrhq8" event={"ID":"86bea7dc-8747-49ec-8d56-55f3d25af692","Type":"ContainerDied","Data":"11a89ef7465c4a5cee9c5046d85c08cbcbd95866c1f42bcc347d4728ca1dc4f1"} Dec 03 08:48:27 crc kubenswrapper[4475]: I1203 08:48:27.828230 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-vrhq8" event={"ID":"86bea7dc-8747-49ec-8d56-55f3d25af692","Type":"ContainerStarted","Data":"4f3fdbbab00f7c05e760cd9ec603b06d7a86d99bcee7e26ba0ef732746a84893"} Dec 03 08:48:28 crc kubenswrapper[4475]: I1203 08:48:28.933720 4475 patch_prober.go:28] interesting pod/machine-config-daemon-tjbzg container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 03 08:48:28 crc kubenswrapper[4475]: I1203 08:48:28.934766 4475 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-tjbzg" podUID="91aee7be-4a52-4598-803f-2deebe0674de" containerName="machine-config-daemon" 
probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 03 08:48:29 crc kubenswrapper[4475]: I1203 08:48:29.851104 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-vrhq8" event={"ID":"86bea7dc-8747-49ec-8d56-55f3d25af692","Type":"ContainerStarted","Data":"7807812cd021e951f7dcfe13e4bfe0ff40293e18b2d81596412da80cc7496ee7"} Dec 03 08:48:30 crc kubenswrapper[4475]: I1203 08:48:30.861360 4475 generic.go:334] "Generic (PLEG): container finished" podID="86bea7dc-8747-49ec-8d56-55f3d25af692" containerID="7807812cd021e951f7dcfe13e4bfe0ff40293e18b2d81596412da80cc7496ee7" exitCode=0 Dec 03 08:48:30 crc kubenswrapper[4475]: I1203 08:48:30.861413 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-vrhq8" event={"ID":"86bea7dc-8747-49ec-8d56-55f3d25af692","Type":"ContainerDied","Data":"7807812cd021e951f7dcfe13e4bfe0ff40293e18b2d81596412da80cc7496ee7"} Dec 03 08:48:31 crc kubenswrapper[4475]: I1203 08:48:31.873489 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-vrhq8" event={"ID":"86bea7dc-8747-49ec-8d56-55f3d25af692","Type":"ContainerStarted","Data":"f08e400a0a9dd9ac3f84dd9e85643ab2117c8c03e14b0fd2a4521bd3cfec7afa"} Dec 03 08:48:31 crc kubenswrapper[4475]: I1203 08:48:31.894763 4475 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-vrhq8" podStartSLOduration=3.353114862 podStartE2EDuration="6.894731386s" podCreationTimestamp="2025-12-03 08:48:25 +0000 UTC" firstStartedPulling="2025-12-03 08:48:27.830091161 +0000 UTC m=+7392.634989495" lastFinishedPulling="2025-12-03 08:48:31.371707685 +0000 UTC m=+7396.176606019" observedRunningTime="2025-12-03 08:48:31.890180258 +0000 UTC m=+7396.695078592" watchObservedRunningTime="2025-12-03 08:48:31.894731386 +0000 UTC m=+7396.699629709" Dec 03 08:48:36 
crc kubenswrapper[4475]: I1203 08:48:36.246011 4475 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-vrhq8" Dec 03 08:48:36 crc kubenswrapper[4475]: I1203 08:48:36.246687 4475 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-vrhq8" Dec 03 08:48:36 crc kubenswrapper[4475]: I1203 08:48:36.302819 4475 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-vrhq8" Dec 03 08:48:36 crc kubenswrapper[4475]: I1203 08:48:36.973123 4475 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-vrhq8" Dec 03 08:48:37 crc kubenswrapper[4475]: I1203 08:48:37.028261 4475 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-vrhq8"] Dec 03 08:48:38 crc kubenswrapper[4475]: I1203 08:48:38.947634 4475 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-vrhq8" podUID="86bea7dc-8747-49ec-8d56-55f3d25af692" containerName="registry-server" containerID="cri-o://f08e400a0a9dd9ac3f84dd9e85643ab2117c8c03e14b0fd2a4521bd3cfec7afa" gracePeriod=2 Dec 03 08:48:39 crc kubenswrapper[4475]: I1203 08:48:39.449061 4475 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-vrhq8" Dec 03 08:48:39 crc kubenswrapper[4475]: I1203 08:48:39.532911 4475 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wcbhq\" (UniqueName: \"kubernetes.io/projected/86bea7dc-8747-49ec-8d56-55f3d25af692-kube-api-access-wcbhq\") pod \"86bea7dc-8747-49ec-8d56-55f3d25af692\" (UID: \"86bea7dc-8747-49ec-8d56-55f3d25af692\") " Dec 03 08:48:39 crc kubenswrapper[4475]: I1203 08:48:39.533396 4475 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/86bea7dc-8747-49ec-8d56-55f3d25af692-utilities\") pod \"86bea7dc-8747-49ec-8d56-55f3d25af692\" (UID: \"86bea7dc-8747-49ec-8d56-55f3d25af692\") " Dec 03 08:48:39 crc kubenswrapper[4475]: I1203 08:48:39.533520 4475 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/86bea7dc-8747-49ec-8d56-55f3d25af692-catalog-content\") pod \"86bea7dc-8747-49ec-8d56-55f3d25af692\" (UID: \"86bea7dc-8747-49ec-8d56-55f3d25af692\") " Dec 03 08:48:39 crc kubenswrapper[4475]: I1203 08:48:39.534350 4475 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/86bea7dc-8747-49ec-8d56-55f3d25af692-utilities" (OuterVolumeSpecName: "utilities") pod "86bea7dc-8747-49ec-8d56-55f3d25af692" (UID: "86bea7dc-8747-49ec-8d56-55f3d25af692"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 08:48:39 crc kubenswrapper[4475]: I1203 08:48:39.534971 4475 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/86bea7dc-8747-49ec-8d56-55f3d25af692-utilities\") on node \"crc\" DevicePath \"\"" Dec 03 08:48:39 crc kubenswrapper[4475]: I1203 08:48:39.546177 4475 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/86bea7dc-8747-49ec-8d56-55f3d25af692-kube-api-access-wcbhq" (OuterVolumeSpecName: "kube-api-access-wcbhq") pod "86bea7dc-8747-49ec-8d56-55f3d25af692" (UID: "86bea7dc-8747-49ec-8d56-55f3d25af692"). InnerVolumeSpecName "kube-api-access-wcbhq". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 08:48:39 crc kubenswrapper[4475]: I1203 08:48:39.608640 4475 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/86bea7dc-8747-49ec-8d56-55f3d25af692-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "86bea7dc-8747-49ec-8d56-55f3d25af692" (UID: "86bea7dc-8747-49ec-8d56-55f3d25af692"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 08:48:39 crc kubenswrapper[4475]: I1203 08:48:39.636566 4475 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wcbhq\" (UniqueName: \"kubernetes.io/projected/86bea7dc-8747-49ec-8d56-55f3d25af692-kube-api-access-wcbhq\") on node \"crc\" DevicePath \"\"" Dec 03 08:48:39 crc kubenswrapper[4475]: I1203 08:48:39.636608 4475 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/86bea7dc-8747-49ec-8d56-55f3d25af692-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 03 08:48:39 crc kubenswrapper[4475]: I1203 08:48:39.962614 4475 generic.go:334] "Generic (PLEG): container finished" podID="86bea7dc-8747-49ec-8d56-55f3d25af692" containerID="f08e400a0a9dd9ac3f84dd9e85643ab2117c8c03e14b0fd2a4521bd3cfec7afa" exitCode=0 Dec 03 08:48:39 crc kubenswrapper[4475]: I1203 08:48:39.962721 4475 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-vrhq8" Dec 03 08:48:39 crc kubenswrapper[4475]: I1203 08:48:39.962754 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-vrhq8" event={"ID":"86bea7dc-8747-49ec-8d56-55f3d25af692","Type":"ContainerDied","Data":"f08e400a0a9dd9ac3f84dd9e85643ab2117c8c03e14b0fd2a4521bd3cfec7afa"} Dec 03 08:48:39 crc kubenswrapper[4475]: I1203 08:48:39.963201 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-vrhq8" event={"ID":"86bea7dc-8747-49ec-8d56-55f3d25af692","Type":"ContainerDied","Data":"4f3fdbbab00f7c05e760cd9ec603b06d7a86d99bcee7e26ba0ef732746a84893"} Dec 03 08:48:39 crc kubenswrapper[4475]: I1203 08:48:39.963226 4475 scope.go:117] "RemoveContainer" containerID="f08e400a0a9dd9ac3f84dd9e85643ab2117c8c03e14b0fd2a4521bd3cfec7afa" Dec 03 08:48:39 crc kubenswrapper[4475]: I1203 08:48:39.994480 4475 scope.go:117] "RemoveContainer" 
containerID="7807812cd021e951f7dcfe13e4bfe0ff40293e18b2d81596412da80cc7496ee7" Dec 03 08:48:40 crc kubenswrapper[4475]: I1203 08:48:40.012517 4475 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-vrhq8"] Dec 03 08:48:40 crc kubenswrapper[4475]: I1203 08:48:40.018350 4475 scope.go:117] "RemoveContainer" containerID="11a89ef7465c4a5cee9c5046d85c08cbcbd95866c1f42bcc347d4728ca1dc4f1" Dec 03 08:48:40 crc kubenswrapper[4475]: I1203 08:48:40.022377 4475 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-vrhq8"] Dec 03 08:48:40 crc kubenswrapper[4475]: I1203 08:48:40.064249 4475 scope.go:117] "RemoveContainer" containerID="f08e400a0a9dd9ac3f84dd9e85643ab2117c8c03e14b0fd2a4521bd3cfec7afa" Dec 03 08:48:40 crc kubenswrapper[4475]: E1203 08:48:40.067490 4475 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f08e400a0a9dd9ac3f84dd9e85643ab2117c8c03e14b0fd2a4521bd3cfec7afa\": container with ID starting with f08e400a0a9dd9ac3f84dd9e85643ab2117c8c03e14b0fd2a4521bd3cfec7afa not found: ID does not exist" containerID="f08e400a0a9dd9ac3f84dd9e85643ab2117c8c03e14b0fd2a4521bd3cfec7afa" Dec 03 08:48:40 crc kubenswrapper[4475]: I1203 08:48:40.067622 4475 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f08e400a0a9dd9ac3f84dd9e85643ab2117c8c03e14b0fd2a4521bd3cfec7afa"} err="failed to get container status \"f08e400a0a9dd9ac3f84dd9e85643ab2117c8c03e14b0fd2a4521bd3cfec7afa\": rpc error: code = NotFound desc = could not find container \"f08e400a0a9dd9ac3f84dd9e85643ab2117c8c03e14b0fd2a4521bd3cfec7afa\": container with ID starting with f08e400a0a9dd9ac3f84dd9e85643ab2117c8c03e14b0fd2a4521bd3cfec7afa not found: ID does not exist" Dec 03 08:48:40 crc kubenswrapper[4475]: I1203 08:48:40.067732 4475 scope.go:117] "RemoveContainer" 
containerID="7807812cd021e951f7dcfe13e4bfe0ff40293e18b2d81596412da80cc7496ee7" Dec 03 08:48:40 crc kubenswrapper[4475]: E1203 08:48:40.070575 4475 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7807812cd021e951f7dcfe13e4bfe0ff40293e18b2d81596412da80cc7496ee7\": container with ID starting with 7807812cd021e951f7dcfe13e4bfe0ff40293e18b2d81596412da80cc7496ee7 not found: ID does not exist" containerID="7807812cd021e951f7dcfe13e4bfe0ff40293e18b2d81596412da80cc7496ee7" Dec 03 08:48:40 crc kubenswrapper[4475]: I1203 08:48:40.070629 4475 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7807812cd021e951f7dcfe13e4bfe0ff40293e18b2d81596412da80cc7496ee7"} err="failed to get container status \"7807812cd021e951f7dcfe13e4bfe0ff40293e18b2d81596412da80cc7496ee7\": rpc error: code = NotFound desc = could not find container \"7807812cd021e951f7dcfe13e4bfe0ff40293e18b2d81596412da80cc7496ee7\": container with ID starting with 7807812cd021e951f7dcfe13e4bfe0ff40293e18b2d81596412da80cc7496ee7 not found: ID does not exist" Dec 03 08:48:40 crc kubenswrapper[4475]: I1203 08:48:40.070663 4475 scope.go:117] "RemoveContainer" containerID="11a89ef7465c4a5cee9c5046d85c08cbcbd95866c1f42bcc347d4728ca1dc4f1" Dec 03 08:48:40 crc kubenswrapper[4475]: E1203 08:48:40.071359 4475 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"11a89ef7465c4a5cee9c5046d85c08cbcbd95866c1f42bcc347d4728ca1dc4f1\": container with ID starting with 11a89ef7465c4a5cee9c5046d85c08cbcbd95866c1f42bcc347d4728ca1dc4f1 not found: ID does not exist" containerID="11a89ef7465c4a5cee9c5046d85c08cbcbd95866c1f42bcc347d4728ca1dc4f1" Dec 03 08:48:40 crc kubenswrapper[4475]: I1203 08:48:40.071537 4475 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"11a89ef7465c4a5cee9c5046d85c08cbcbd95866c1f42bcc347d4728ca1dc4f1"} err="failed to get container status \"11a89ef7465c4a5cee9c5046d85c08cbcbd95866c1f42bcc347d4728ca1dc4f1\": rpc error: code = NotFound desc = could not find container \"11a89ef7465c4a5cee9c5046d85c08cbcbd95866c1f42bcc347d4728ca1dc4f1\": container with ID starting with 11a89ef7465c4a5cee9c5046d85c08cbcbd95866c1f42bcc347d4728ca1dc4f1 not found: ID does not exist" Dec 03 08:48:41 crc kubenswrapper[4475]: I1203 08:48:41.506655 4475 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="86bea7dc-8747-49ec-8d56-55f3d25af692" path="/var/lib/kubelet/pods/86bea7dc-8747-49ec-8d56-55f3d25af692/volumes" Dec 03 08:48:58 crc kubenswrapper[4475]: I1203 08:48:58.933774 4475 patch_prober.go:28] interesting pod/machine-config-daemon-tjbzg container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 03 08:48:58 crc kubenswrapper[4475]: I1203 08:48:58.934480 4475 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-tjbzg" podUID="91aee7be-4a52-4598-803f-2deebe0674de" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 03 08:49:28 crc kubenswrapper[4475]: I1203 08:49:28.933855 4475 patch_prober.go:28] interesting pod/machine-config-daemon-tjbzg container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 03 08:49:28 crc kubenswrapper[4475]: I1203 08:49:28.934600 4475 prober.go:107] "Probe failed" probeType="Liveness" 
pod="openshift-machine-config-operator/machine-config-daemon-tjbzg" podUID="91aee7be-4a52-4598-803f-2deebe0674de" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 03 08:49:28 crc kubenswrapper[4475]: I1203 08:49:28.934655 4475 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-tjbzg" Dec 03 08:49:28 crc kubenswrapper[4475]: I1203 08:49:28.936191 4475 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"a64a4fdb8dce1938b2d1fffd387c463e3afd8518ef83ae41c0e745f161742479"} pod="openshift-machine-config-operator/machine-config-daemon-tjbzg" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Dec 03 08:49:28 crc kubenswrapper[4475]: I1203 08:49:28.936278 4475 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-tjbzg" podUID="91aee7be-4a52-4598-803f-2deebe0674de" containerName="machine-config-daemon" containerID="cri-o://a64a4fdb8dce1938b2d1fffd387c463e3afd8518ef83ae41c0e745f161742479" gracePeriod=600 Dec 03 08:49:29 crc kubenswrapper[4475]: E1203 08:49:29.082084 4475 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-tjbzg_openshift-machine-config-operator(91aee7be-4a52-4598-803f-2deebe0674de)\"" pod="openshift-machine-config-operator/machine-config-daemon-tjbzg" podUID="91aee7be-4a52-4598-803f-2deebe0674de" Dec 03 08:49:29 crc kubenswrapper[4475]: I1203 08:49:29.442971 4475 generic.go:334] "Generic (PLEG): container finished" podID="91aee7be-4a52-4598-803f-2deebe0674de" 
containerID="a64a4fdb8dce1938b2d1fffd387c463e3afd8518ef83ae41c0e745f161742479" exitCode=0 Dec 03 08:49:29 crc kubenswrapper[4475]: I1203 08:49:29.443074 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-tjbzg" event={"ID":"91aee7be-4a52-4598-803f-2deebe0674de","Type":"ContainerDied","Data":"a64a4fdb8dce1938b2d1fffd387c463e3afd8518ef83ae41c0e745f161742479"} Dec 03 08:49:29 crc kubenswrapper[4475]: I1203 08:49:29.443324 4475 scope.go:117] "RemoveContainer" containerID="14097464da453782e9a13b91bd60d4c37dba9b7e953f3de436197834349ae001" Dec 03 08:49:29 crc kubenswrapper[4475]: I1203 08:49:29.444491 4475 scope.go:117] "RemoveContainer" containerID="a64a4fdb8dce1938b2d1fffd387c463e3afd8518ef83ae41c0e745f161742479" Dec 03 08:49:29 crc kubenswrapper[4475]: E1203 08:49:29.445056 4475 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-tjbzg_openshift-machine-config-operator(91aee7be-4a52-4598-803f-2deebe0674de)\"" pod="openshift-machine-config-operator/machine-config-daemon-tjbzg" podUID="91aee7be-4a52-4598-803f-2deebe0674de" Dec 03 08:49:42 crc kubenswrapper[4475]: I1203 08:49:42.492079 4475 scope.go:117] "RemoveContainer" containerID="a64a4fdb8dce1938b2d1fffd387c463e3afd8518ef83ae41c0e745f161742479" Dec 03 08:49:42 crc kubenswrapper[4475]: E1203 08:49:42.493013 4475 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-tjbzg_openshift-machine-config-operator(91aee7be-4a52-4598-803f-2deebe0674de)\"" pod="openshift-machine-config-operator/machine-config-daemon-tjbzg" podUID="91aee7be-4a52-4598-803f-2deebe0674de" Dec 03 08:49:57 crc kubenswrapper[4475]: I1203 
08:49:57.491629 4475 scope.go:117] "RemoveContainer" containerID="a64a4fdb8dce1938b2d1fffd387c463e3afd8518ef83ae41c0e745f161742479" Dec 03 08:49:57 crc kubenswrapper[4475]: E1203 08:49:57.492515 4475 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-tjbzg_openshift-machine-config-operator(91aee7be-4a52-4598-803f-2deebe0674de)\"" pod="openshift-machine-config-operator/machine-config-daemon-tjbzg" podUID="91aee7be-4a52-4598-803f-2deebe0674de" Dec 03 08:50:12 crc kubenswrapper[4475]: I1203 08:50:12.491371 4475 scope.go:117] "RemoveContainer" containerID="a64a4fdb8dce1938b2d1fffd387c463e3afd8518ef83ae41c0e745f161742479" Dec 03 08:50:12 crc kubenswrapper[4475]: E1203 08:50:12.492343 4475 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-tjbzg_openshift-machine-config-operator(91aee7be-4a52-4598-803f-2deebe0674de)\"" pod="openshift-machine-config-operator/machine-config-daemon-tjbzg" podUID="91aee7be-4a52-4598-803f-2deebe0674de" Dec 03 08:50:23 crc kubenswrapper[4475]: I1203 08:50:23.491182 4475 scope.go:117] "RemoveContainer" containerID="a64a4fdb8dce1938b2d1fffd387c463e3afd8518ef83ae41c0e745f161742479" Dec 03 08:50:23 crc kubenswrapper[4475]: E1203 08:50:23.492264 4475 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-tjbzg_openshift-machine-config-operator(91aee7be-4a52-4598-803f-2deebe0674de)\"" pod="openshift-machine-config-operator/machine-config-daemon-tjbzg" podUID="91aee7be-4a52-4598-803f-2deebe0674de" Dec 03 08:50:28 crc 
kubenswrapper[4475]: I1203 08:50:28.905469 4475 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-5rpt6"] Dec 03 08:50:28 crc kubenswrapper[4475]: E1203 08:50:28.909125 4475 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="86bea7dc-8747-49ec-8d56-55f3d25af692" containerName="registry-server" Dec 03 08:50:28 crc kubenswrapper[4475]: I1203 08:50:28.909236 4475 state_mem.go:107] "Deleted CPUSet assignment" podUID="86bea7dc-8747-49ec-8d56-55f3d25af692" containerName="registry-server" Dec 03 08:50:28 crc kubenswrapper[4475]: E1203 08:50:28.909365 4475 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="86bea7dc-8747-49ec-8d56-55f3d25af692" containerName="extract-content" Dec 03 08:50:28 crc kubenswrapper[4475]: I1203 08:50:28.909378 4475 state_mem.go:107] "Deleted CPUSet assignment" podUID="86bea7dc-8747-49ec-8d56-55f3d25af692" containerName="extract-content" Dec 03 08:50:28 crc kubenswrapper[4475]: E1203 08:50:28.909521 4475 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="86bea7dc-8747-49ec-8d56-55f3d25af692" containerName="extract-utilities" Dec 03 08:50:28 crc kubenswrapper[4475]: I1203 08:50:28.909532 4475 state_mem.go:107] "Deleted CPUSet assignment" podUID="86bea7dc-8747-49ec-8d56-55f3d25af692" containerName="extract-utilities" Dec 03 08:50:28 crc kubenswrapper[4475]: I1203 08:50:28.910004 4475 memory_manager.go:354] "RemoveStaleState removing state" podUID="86bea7dc-8747-49ec-8d56-55f3d25af692" containerName="registry-server" Dec 03 08:50:28 crc kubenswrapper[4475]: I1203 08:50:28.913251 4475 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-5rpt6" Dec 03 08:50:28 crc kubenswrapper[4475]: I1203 08:50:28.917219 4475 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-5rpt6"] Dec 03 08:50:29 crc kubenswrapper[4475]: I1203 08:50:29.034682 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3ae7195f-7a46-46fc-80e4-442917745ffe-utilities\") pod \"redhat-operators-5rpt6\" (UID: \"3ae7195f-7a46-46fc-80e4-442917745ffe\") " pod="openshift-marketplace/redhat-operators-5rpt6" Dec 03 08:50:29 crc kubenswrapper[4475]: I1203 08:50:29.034997 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-54pmd\" (UniqueName: \"kubernetes.io/projected/3ae7195f-7a46-46fc-80e4-442917745ffe-kube-api-access-54pmd\") pod \"redhat-operators-5rpt6\" (UID: \"3ae7195f-7a46-46fc-80e4-442917745ffe\") " pod="openshift-marketplace/redhat-operators-5rpt6" Dec 03 08:50:29 crc kubenswrapper[4475]: I1203 08:50:29.035075 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3ae7195f-7a46-46fc-80e4-442917745ffe-catalog-content\") pod \"redhat-operators-5rpt6\" (UID: \"3ae7195f-7a46-46fc-80e4-442917745ffe\") " pod="openshift-marketplace/redhat-operators-5rpt6" Dec 03 08:50:29 crc kubenswrapper[4475]: I1203 08:50:29.138611 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3ae7195f-7a46-46fc-80e4-442917745ffe-utilities\") pod \"redhat-operators-5rpt6\" (UID: \"3ae7195f-7a46-46fc-80e4-442917745ffe\") " pod="openshift-marketplace/redhat-operators-5rpt6" Dec 03 08:50:29 crc kubenswrapper[4475]: I1203 08:50:29.138706 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access-54pmd\" (UniqueName: \"kubernetes.io/projected/3ae7195f-7a46-46fc-80e4-442917745ffe-kube-api-access-54pmd\") pod \"redhat-operators-5rpt6\" (UID: \"3ae7195f-7a46-46fc-80e4-442917745ffe\") " pod="openshift-marketplace/redhat-operators-5rpt6" Dec 03 08:50:29 crc kubenswrapper[4475]: I1203 08:50:29.138734 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3ae7195f-7a46-46fc-80e4-442917745ffe-catalog-content\") pod \"redhat-operators-5rpt6\" (UID: \"3ae7195f-7a46-46fc-80e4-442917745ffe\") " pod="openshift-marketplace/redhat-operators-5rpt6" Dec 03 08:50:29 crc kubenswrapper[4475]: I1203 08:50:29.139872 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3ae7195f-7a46-46fc-80e4-442917745ffe-catalog-content\") pod \"redhat-operators-5rpt6\" (UID: \"3ae7195f-7a46-46fc-80e4-442917745ffe\") " pod="openshift-marketplace/redhat-operators-5rpt6" Dec 03 08:50:29 crc kubenswrapper[4475]: I1203 08:50:29.140081 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3ae7195f-7a46-46fc-80e4-442917745ffe-utilities\") pod \"redhat-operators-5rpt6\" (UID: \"3ae7195f-7a46-46fc-80e4-442917745ffe\") " pod="openshift-marketplace/redhat-operators-5rpt6" Dec 03 08:50:29 crc kubenswrapper[4475]: I1203 08:50:29.160371 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-54pmd\" (UniqueName: \"kubernetes.io/projected/3ae7195f-7a46-46fc-80e4-442917745ffe-kube-api-access-54pmd\") pod \"redhat-operators-5rpt6\" (UID: \"3ae7195f-7a46-46fc-80e4-442917745ffe\") " pod="openshift-marketplace/redhat-operators-5rpt6" Dec 03 08:50:29 crc kubenswrapper[4475]: I1203 08:50:29.235825 4475 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-5rpt6" Dec 03 08:50:29 crc kubenswrapper[4475]: I1203 08:50:29.782654 4475 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-5rpt6"] Dec 03 08:50:30 crc kubenswrapper[4475]: I1203 08:50:30.050787 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-5rpt6" event={"ID":"3ae7195f-7a46-46fc-80e4-442917745ffe","Type":"ContainerStarted","Data":"1c095a499b9e84f39e6f40b0a6c3e542e67a942942907f5b1cf05bba32c527f3"} Dec 03 08:50:31 crc kubenswrapper[4475]: I1203 08:50:31.061953 4475 generic.go:334] "Generic (PLEG): container finished" podID="3ae7195f-7a46-46fc-80e4-442917745ffe" containerID="ea3f833031001bca8f8e69085f7f40da8661395db07e0efb25ee29470336fb88" exitCode=0 Dec 03 08:50:31 crc kubenswrapper[4475]: I1203 08:50:31.062060 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-5rpt6" event={"ID":"3ae7195f-7a46-46fc-80e4-442917745ffe","Type":"ContainerDied","Data":"ea3f833031001bca8f8e69085f7f40da8661395db07e0efb25ee29470336fb88"} Dec 03 08:50:31 crc kubenswrapper[4475]: I1203 08:50:31.066353 4475 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Dec 03 08:50:33 crc kubenswrapper[4475]: I1203 08:50:33.093716 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-5rpt6" event={"ID":"3ae7195f-7a46-46fc-80e4-442917745ffe","Type":"ContainerStarted","Data":"f773fdfe7ec0ed7cf7c8f9db6c4b3e04aaf81f9d04342934dc6e1e220ee72e81"} Dec 03 08:50:36 crc kubenswrapper[4475]: I1203 08:50:36.149252 4475 generic.go:334] "Generic (PLEG): container finished" podID="3ae7195f-7a46-46fc-80e4-442917745ffe" containerID="f773fdfe7ec0ed7cf7c8f9db6c4b3e04aaf81f9d04342934dc6e1e220ee72e81" exitCode=0 Dec 03 08:50:36 crc kubenswrapper[4475]: I1203 08:50:36.150165 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/redhat-operators-5rpt6" event={"ID":"3ae7195f-7a46-46fc-80e4-442917745ffe","Type":"ContainerDied","Data":"f773fdfe7ec0ed7cf7c8f9db6c4b3e04aaf81f9d04342934dc6e1e220ee72e81"} Dec 03 08:50:36 crc kubenswrapper[4475]: I1203 08:50:36.492147 4475 scope.go:117] "RemoveContainer" containerID="a64a4fdb8dce1938b2d1fffd387c463e3afd8518ef83ae41c0e745f161742479" Dec 03 08:50:36 crc kubenswrapper[4475]: E1203 08:50:36.492598 4475 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-tjbzg_openshift-machine-config-operator(91aee7be-4a52-4598-803f-2deebe0674de)\"" pod="openshift-machine-config-operator/machine-config-daemon-tjbzg" podUID="91aee7be-4a52-4598-803f-2deebe0674de" Dec 03 08:50:37 crc kubenswrapper[4475]: I1203 08:50:37.160954 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-5rpt6" event={"ID":"3ae7195f-7a46-46fc-80e4-442917745ffe","Type":"ContainerStarted","Data":"4677426b2572ad3a43822bd38e07c984caa9a038fae8c74321619c3f291ca1ef"} Dec 03 08:50:37 crc kubenswrapper[4475]: I1203 08:50:37.206399 4475 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-5rpt6" podStartSLOduration=3.567745128 podStartE2EDuration="9.205760108s" podCreationTimestamp="2025-12-03 08:50:28 +0000 UTC" firstStartedPulling="2025-12-03 08:50:31.065595576 +0000 UTC m=+7515.870493910" lastFinishedPulling="2025-12-03 08:50:36.703610555 +0000 UTC m=+7521.508508890" observedRunningTime="2025-12-03 08:50:37.19811334 +0000 UTC m=+7522.003011694" watchObservedRunningTime="2025-12-03 08:50:37.205760108 +0000 UTC m=+7522.010658442" Dec 03 08:50:39 crc kubenswrapper[4475]: I1203 08:50:39.237324 4475 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openshift-marketplace/redhat-operators-5rpt6" Dec 03 08:50:39 crc kubenswrapper[4475]: I1203 08:50:39.237795 4475 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-5rpt6" Dec 03 08:50:40 crc kubenswrapper[4475]: I1203 08:50:40.293053 4475 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-5rpt6" podUID="3ae7195f-7a46-46fc-80e4-442917745ffe" containerName="registry-server" probeResult="failure" output=< Dec 03 08:50:40 crc kubenswrapper[4475]: timeout: failed to connect service ":50051" within 1s Dec 03 08:50:40 crc kubenswrapper[4475]: > Dec 03 08:50:49 crc kubenswrapper[4475]: I1203 08:50:49.491545 4475 scope.go:117] "RemoveContainer" containerID="a64a4fdb8dce1938b2d1fffd387c463e3afd8518ef83ae41c0e745f161742479" Dec 03 08:50:49 crc kubenswrapper[4475]: E1203 08:50:49.492505 4475 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-tjbzg_openshift-machine-config-operator(91aee7be-4a52-4598-803f-2deebe0674de)\"" pod="openshift-machine-config-operator/machine-config-daemon-tjbzg" podUID="91aee7be-4a52-4598-803f-2deebe0674de" Dec 03 08:50:50 crc kubenswrapper[4475]: I1203 08:50:50.275712 4475 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-5rpt6" podUID="3ae7195f-7a46-46fc-80e4-442917745ffe" containerName="registry-server" probeResult="failure" output=< Dec 03 08:50:50 crc kubenswrapper[4475]: timeout: failed to connect service ":50051" within 1s Dec 03 08:50:50 crc kubenswrapper[4475]: > Dec 03 08:50:59 crc kubenswrapper[4475]: I1203 08:50:59.281042 4475 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-5rpt6" Dec 03 08:50:59 crc kubenswrapper[4475]: I1203 08:50:59.327497 
4475 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-5rpt6" Dec 03 08:51:00 crc kubenswrapper[4475]: I1203 08:51:00.105244 4475 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-5rpt6"] Dec 03 08:51:00 crc kubenswrapper[4475]: I1203 08:51:00.381663 4475 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-5rpt6" podUID="3ae7195f-7a46-46fc-80e4-442917745ffe" containerName="registry-server" containerID="cri-o://4677426b2572ad3a43822bd38e07c984caa9a038fae8c74321619c3f291ca1ef" gracePeriod=2 Dec 03 08:51:01 crc kubenswrapper[4475]: I1203 08:51:01.156533 4475 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-5rpt6" Dec 03 08:51:01 crc kubenswrapper[4475]: I1203 08:51:01.292504 4475 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-54pmd\" (UniqueName: \"kubernetes.io/projected/3ae7195f-7a46-46fc-80e4-442917745ffe-kube-api-access-54pmd\") pod \"3ae7195f-7a46-46fc-80e4-442917745ffe\" (UID: \"3ae7195f-7a46-46fc-80e4-442917745ffe\") " Dec 03 08:51:01 crc kubenswrapper[4475]: I1203 08:51:01.293000 4475 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3ae7195f-7a46-46fc-80e4-442917745ffe-utilities\") pod \"3ae7195f-7a46-46fc-80e4-442917745ffe\" (UID: \"3ae7195f-7a46-46fc-80e4-442917745ffe\") " Dec 03 08:51:01 crc kubenswrapper[4475]: I1203 08:51:01.293214 4475 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3ae7195f-7a46-46fc-80e4-442917745ffe-catalog-content\") pod \"3ae7195f-7a46-46fc-80e4-442917745ffe\" (UID: \"3ae7195f-7a46-46fc-80e4-442917745ffe\") " Dec 03 08:51:01 crc kubenswrapper[4475]: I1203 08:51:01.297679 4475 
operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3ae7195f-7a46-46fc-80e4-442917745ffe-utilities" (OuterVolumeSpecName: "utilities") pod "3ae7195f-7a46-46fc-80e4-442917745ffe" (UID: "3ae7195f-7a46-46fc-80e4-442917745ffe"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 08:51:01 crc kubenswrapper[4475]: I1203 08:51:01.307103 4475 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3ae7195f-7a46-46fc-80e4-442917745ffe-kube-api-access-54pmd" (OuterVolumeSpecName: "kube-api-access-54pmd") pod "3ae7195f-7a46-46fc-80e4-442917745ffe" (UID: "3ae7195f-7a46-46fc-80e4-442917745ffe"). InnerVolumeSpecName "kube-api-access-54pmd". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 08:51:01 crc kubenswrapper[4475]: I1203 08:51:01.371853 4475 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3ae7195f-7a46-46fc-80e4-442917745ffe-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "3ae7195f-7a46-46fc-80e4-442917745ffe" (UID: "3ae7195f-7a46-46fc-80e4-442917745ffe"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 08:51:01 crc kubenswrapper[4475]: I1203 08:51:01.391808 4475 generic.go:334] "Generic (PLEG): container finished" podID="3ae7195f-7a46-46fc-80e4-442917745ffe" containerID="4677426b2572ad3a43822bd38e07c984caa9a038fae8c74321619c3f291ca1ef" exitCode=0 Dec 03 08:51:01 crc kubenswrapper[4475]: I1203 08:51:01.391847 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-5rpt6" event={"ID":"3ae7195f-7a46-46fc-80e4-442917745ffe","Type":"ContainerDied","Data":"4677426b2572ad3a43822bd38e07c984caa9a038fae8c74321619c3f291ca1ef"} Dec 03 08:51:01 crc kubenswrapper[4475]: I1203 08:51:01.391876 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-5rpt6" event={"ID":"3ae7195f-7a46-46fc-80e4-442917745ffe","Type":"ContainerDied","Data":"1c095a499b9e84f39e6f40b0a6c3e542e67a942942907f5b1cf05bba32c527f3"} Dec 03 08:51:01 crc kubenswrapper[4475]: I1203 08:51:01.391893 4475 scope.go:117] "RemoveContainer" containerID="4677426b2572ad3a43822bd38e07c984caa9a038fae8c74321619c3f291ca1ef" Dec 03 08:51:01 crc kubenswrapper[4475]: I1203 08:51:01.391887 4475 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-5rpt6" Dec 03 08:51:01 crc kubenswrapper[4475]: I1203 08:51:01.394987 4475 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3ae7195f-7a46-46fc-80e4-442917745ffe-utilities\") on node \"crc\" DevicePath \"\"" Dec 03 08:51:01 crc kubenswrapper[4475]: I1203 08:51:01.395025 4475 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3ae7195f-7a46-46fc-80e4-442917745ffe-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 03 08:51:01 crc kubenswrapper[4475]: I1203 08:51:01.395036 4475 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-54pmd\" (UniqueName: \"kubernetes.io/projected/3ae7195f-7a46-46fc-80e4-442917745ffe-kube-api-access-54pmd\") on node \"crc\" DevicePath \"\"" Dec 03 08:51:01 crc kubenswrapper[4475]: I1203 08:51:01.425188 4475 scope.go:117] "RemoveContainer" containerID="f773fdfe7ec0ed7cf7c8f9db6c4b3e04aaf81f9d04342934dc6e1e220ee72e81" Dec 03 08:51:01 crc kubenswrapper[4475]: I1203 08:51:01.430792 4475 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-5rpt6"] Dec 03 08:51:01 crc kubenswrapper[4475]: I1203 08:51:01.438255 4475 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-5rpt6"] Dec 03 08:51:01 crc kubenswrapper[4475]: I1203 08:51:01.452000 4475 scope.go:117] "RemoveContainer" containerID="ea3f833031001bca8f8e69085f7f40da8661395db07e0efb25ee29470336fb88" Dec 03 08:51:01 crc kubenswrapper[4475]: I1203 08:51:01.489833 4475 scope.go:117] "RemoveContainer" containerID="4677426b2572ad3a43822bd38e07c984caa9a038fae8c74321619c3f291ca1ef" Dec 03 08:51:01 crc kubenswrapper[4475]: I1203 08:51:01.491626 4475 scope.go:117] "RemoveContainer" containerID="a64a4fdb8dce1938b2d1fffd387c463e3afd8518ef83ae41c0e745f161742479" Dec 03 08:51:01 crc kubenswrapper[4475]: E1203 
08:51:01.491883 4475 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-tjbzg_openshift-machine-config-operator(91aee7be-4a52-4598-803f-2deebe0674de)\"" pod="openshift-machine-config-operator/machine-config-daemon-tjbzg" podUID="91aee7be-4a52-4598-803f-2deebe0674de" Dec 03 08:51:01 crc kubenswrapper[4475]: E1203 08:51:01.494072 4475 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4677426b2572ad3a43822bd38e07c984caa9a038fae8c74321619c3f291ca1ef\": container with ID starting with 4677426b2572ad3a43822bd38e07c984caa9a038fae8c74321619c3f291ca1ef not found: ID does not exist" containerID="4677426b2572ad3a43822bd38e07c984caa9a038fae8c74321619c3f291ca1ef" Dec 03 08:51:01 crc kubenswrapper[4475]: I1203 08:51:01.494445 4475 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4677426b2572ad3a43822bd38e07c984caa9a038fae8c74321619c3f291ca1ef"} err="failed to get container status \"4677426b2572ad3a43822bd38e07c984caa9a038fae8c74321619c3f291ca1ef\": rpc error: code = NotFound desc = could not find container \"4677426b2572ad3a43822bd38e07c984caa9a038fae8c74321619c3f291ca1ef\": container with ID starting with 4677426b2572ad3a43822bd38e07c984caa9a038fae8c74321619c3f291ca1ef not found: ID does not exist" Dec 03 08:51:01 crc kubenswrapper[4475]: I1203 08:51:01.494492 4475 scope.go:117] "RemoveContainer" containerID="f773fdfe7ec0ed7cf7c8f9db6c4b3e04aaf81f9d04342934dc6e1e220ee72e81" Dec 03 08:51:01 crc kubenswrapper[4475]: E1203 08:51:01.495021 4475 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f773fdfe7ec0ed7cf7c8f9db6c4b3e04aaf81f9d04342934dc6e1e220ee72e81\": container with ID starting with 
f773fdfe7ec0ed7cf7c8f9db6c4b3e04aaf81f9d04342934dc6e1e220ee72e81 not found: ID does not exist" containerID="f773fdfe7ec0ed7cf7c8f9db6c4b3e04aaf81f9d04342934dc6e1e220ee72e81" Dec 03 08:51:01 crc kubenswrapper[4475]: I1203 08:51:01.495046 4475 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f773fdfe7ec0ed7cf7c8f9db6c4b3e04aaf81f9d04342934dc6e1e220ee72e81"} err="failed to get container status \"f773fdfe7ec0ed7cf7c8f9db6c4b3e04aaf81f9d04342934dc6e1e220ee72e81\": rpc error: code = NotFound desc = could not find container \"f773fdfe7ec0ed7cf7c8f9db6c4b3e04aaf81f9d04342934dc6e1e220ee72e81\": container with ID starting with f773fdfe7ec0ed7cf7c8f9db6c4b3e04aaf81f9d04342934dc6e1e220ee72e81 not found: ID does not exist" Dec 03 08:51:01 crc kubenswrapper[4475]: I1203 08:51:01.495061 4475 scope.go:117] "RemoveContainer" containerID="ea3f833031001bca8f8e69085f7f40da8661395db07e0efb25ee29470336fb88" Dec 03 08:51:01 crc kubenswrapper[4475]: E1203 08:51:01.495505 4475 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ea3f833031001bca8f8e69085f7f40da8661395db07e0efb25ee29470336fb88\": container with ID starting with ea3f833031001bca8f8e69085f7f40da8661395db07e0efb25ee29470336fb88 not found: ID does not exist" containerID="ea3f833031001bca8f8e69085f7f40da8661395db07e0efb25ee29470336fb88" Dec 03 08:51:01 crc kubenswrapper[4475]: I1203 08:51:01.495528 4475 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ea3f833031001bca8f8e69085f7f40da8661395db07e0efb25ee29470336fb88"} err="failed to get container status \"ea3f833031001bca8f8e69085f7f40da8661395db07e0efb25ee29470336fb88\": rpc error: code = NotFound desc = could not find container \"ea3f833031001bca8f8e69085f7f40da8661395db07e0efb25ee29470336fb88\": container with ID starting with ea3f833031001bca8f8e69085f7f40da8661395db07e0efb25ee29470336fb88 not found: ID does not 
exist" Dec 03 08:51:01 crc kubenswrapper[4475]: I1203 08:51:01.507636 4475 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3ae7195f-7a46-46fc-80e4-442917745ffe" path="/var/lib/kubelet/pods/3ae7195f-7a46-46fc-80e4-442917745ffe/volumes" Dec 03 08:51:15 crc kubenswrapper[4475]: I1203 08:51:15.498228 4475 scope.go:117] "RemoveContainer" containerID="a64a4fdb8dce1938b2d1fffd387c463e3afd8518ef83ae41c0e745f161742479" Dec 03 08:51:15 crc kubenswrapper[4475]: E1203 08:51:15.499057 4475 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-tjbzg_openshift-machine-config-operator(91aee7be-4a52-4598-803f-2deebe0674de)\"" pod="openshift-machine-config-operator/machine-config-daemon-tjbzg" podUID="91aee7be-4a52-4598-803f-2deebe0674de" Dec 03 08:51:27 crc kubenswrapper[4475]: I1203 08:51:27.490831 4475 scope.go:117] "RemoveContainer" containerID="a64a4fdb8dce1938b2d1fffd387c463e3afd8518ef83ae41c0e745f161742479" Dec 03 08:51:27 crc kubenswrapper[4475]: E1203 08:51:27.491952 4475 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-tjbzg_openshift-machine-config-operator(91aee7be-4a52-4598-803f-2deebe0674de)\"" pod="openshift-machine-config-operator/machine-config-daemon-tjbzg" podUID="91aee7be-4a52-4598-803f-2deebe0674de" Dec 03 08:51:38 crc kubenswrapper[4475]: I1203 08:51:38.491823 4475 scope.go:117] "RemoveContainer" containerID="a64a4fdb8dce1938b2d1fffd387c463e3afd8518ef83ae41c0e745f161742479" Dec 03 08:51:38 crc kubenswrapper[4475]: E1203 08:51:38.493050 4475 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 
5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-tjbzg_openshift-machine-config-operator(91aee7be-4a52-4598-803f-2deebe0674de)\"" pod="openshift-machine-config-operator/machine-config-daemon-tjbzg" podUID="91aee7be-4a52-4598-803f-2deebe0674de" Dec 03 08:51:50 crc kubenswrapper[4475]: I1203 08:51:50.491733 4475 scope.go:117] "RemoveContainer" containerID="a64a4fdb8dce1938b2d1fffd387c463e3afd8518ef83ae41c0e745f161742479" Dec 03 08:51:50 crc kubenswrapper[4475]: E1203 08:51:50.492783 4475 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-tjbzg_openshift-machine-config-operator(91aee7be-4a52-4598-803f-2deebe0674de)\"" pod="openshift-machine-config-operator/machine-config-daemon-tjbzg" podUID="91aee7be-4a52-4598-803f-2deebe0674de" Dec 03 08:52:03 crc kubenswrapper[4475]: I1203 08:52:03.491188 4475 scope.go:117] "RemoveContainer" containerID="a64a4fdb8dce1938b2d1fffd387c463e3afd8518ef83ae41c0e745f161742479" Dec 03 08:52:03 crc kubenswrapper[4475]: E1203 08:52:03.491906 4475 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-tjbzg_openshift-machine-config-operator(91aee7be-4a52-4598-803f-2deebe0674de)\"" pod="openshift-machine-config-operator/machine-config-daemon-tjbzg" podUID="91aee7be-4a52-4598-803f-2deebe0674de" Dec 03 08:52:14 crc kubenswrapper[4475]: I1203 08:52:14.491061 4475 scope.go:117] "RemoveContainer" containerID="a64a4fdb8dce1938b2d1fffd387c463e3afd8518ef83ae41c0e745f161742479" Dec 03 08:52:14 crc kubenswrapper[4475]: E1203 08:52:14.492009 4475 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with 
CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-tjbzg_openshift-machine-config-operator(91aee7be-4a52-4598-803f-2deebe0674de)\"" pod="openshift-machine-config-operator/machine-config-daemon-tjbzg" podUID="91aee7be-4a52-4598-803f-2deebe0674de" Dec 03 08:52:28 crc kubenswrapper[4475]: I1203 08:52:28.491091 4475 scope.go:117] "RemoveContainer" containerID="a64a4fdb8dce1938b2d1fffd387c463e3afd8518ef83ae41c0e745f161742479" Dec 03 08:52:28 crc kubenswrapper[4475]: E1203 08:52:28.492044 4475 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-tjbzg_openshift-machine-config-operator(91aee7be-4a52-4598-803f-2deebe0674de)\"" pod="openshift-machine-config-operator/machine-config-daemon-tjbzg" podUID="91aee7be-4a52-4598-803f-2deebe0674de" Dec 03 08:52:41 crc kubenswrapper[4475]: I1203 08:52:41.492219 4475 scope.go:117] "RemoveContainer" containerID="a64a4fdb8dce1938b2d1fffd387c463e3afd8518ef83ae41c0e745f161742479" Dec 03 08:52:41 crc kubenswrapper[4475]: E1203 08:52:41.493279 4475 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-tjbzg_openshift-machine-config-operator(91aee7be-4a52-4598-803f-2deebe0674de)\"" pod="openshift-machine-config-operator/machine-config-daemon-tjbzg" podUID="91aee7be-4a52-4598-803f-2deebe0674de" Dec 03 08:52:53 crc kubenswrapper[4475]: I1203 08:52:53.491990 4475 scope.go:117] "RemoveContainer" containerID="a64a4fdb8dce1938b2d1fffd387c463e3afd8518ef83ae41c0e745f161742479" Dec 03 08:52:53 crc kubenswrapper[4475]: E1203 08:52:53.493155 4475 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for 
\"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-tjbzg_openshift-machine-config-operator(91aee7be-4a52-4598-803f-2deebe0674de)\"" pod="openshift-machine-config-operator/machine-config-daemon-tjbzg" podUID="91aee7be-4a52-4598-803f-2deebe0674de" Dec 03 08:53:05 crc kubenswrapper[4475]: I1203 08:53:05.499220 4475 scope.go:117] "RemoveContainer" containerID="a64a4fdb8dce1938b2d1fffd387c463e3afd8518ef83ae41c0e745f161742479" Dec 03 08:53:05 crc kubenswrapper[4475]: E1203 08:53:05.500268 4475 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-tjbzg_openshift-machine-config-operator(91aee7be-4a52-4598-803f-2deebe0674de)\"" pod="openshift-machine-config-operator/machine-config-daemon-tjbzg" podUID="91aee7be-4a52-4598-803f-2deebe0674de" Dec 03 08:53:20 crc kubenswrapper[4475]: I1203 08:53:20.492474 4475 scope.go:117] "RemoveContainer" containerID="a64a4fdb8dce1938b2d1fffd387c463e3afd8518ef83ae41c0e745f161742479" Dec 03 08:53:20 crc kubenswrapper[4475]: E1203 08:53:20.493856 4475 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-tjbzg_openshift-machine-config-operator(91aee7be-4a52-4598-803f-2deebe0674de)\"" pod="openshift-machine-config-operator/machine-config-daemon-tjbzg" podUID="91aee7be-4a52-4598-803f-2deebe0674de" Dec 03 08:53:34 crc kubenswrapper[4475]: I1203 08:53:34.491845 4475 scope.go:117] "RemoveContainer" containerID="a64a4fdb8dce1938b2d1fffd387c463e3afd8518ef83ae41c0e745f161742479" Dec 03 08:53:34 crc kubenswrapper[4475]: E1203 08:53:34.492950 4475 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to 
\"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-tjbzg_openshift-machine-config-operator(91aee7be-4a52-4598-803f-2deebe0674de)\"" pod="openshift-machine-config-operator/machine-config-daemon-tjbzg" podUID="91aee7be-4a52-4598-803f-2deebe0674de" Dec 03 08:53:46 crc kubenswrapper[4475]: I1203 08:53:46.491756 4475 scope.go:117] "RemoveContainer" containerID="a64a4fdb8dce1938b2d1fffd387c463e3afd8518ef83ae41c0e745f161742479" Dec 03 08:53:46 crc kubenswrapper[4475]: E1203 08:53:46.492725 4475 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-tjbzg_openshift-machine-config-operator(91aee7be-4a52-4598-803f-2deebe0674de)\"" pod="openshift-machine-config-operator/machine-config-daemon-tjbzg" podUID="91aee7be-4a52-4598-803f-2deebe0674de" Dec 03 08:53:59 crc kubenswrapper[4475]: I1203 08:53:59.491156 4475 scope.go:117] "RemoveContainer" containerID="a64a4fdb8dce1938b2d1fffd387c463e3afd8518ef83ae41c0e745f161742479" Dec 03 08:53:59 crc kubenswrapper[4475]: E1203 08:53:59.492908 4475 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-tjbzg_openshift-machine-config-operator(91aee7be-4a52-4598-803f-2deebe0674de)\"" pod="openshift-machine-config-operator/machine-config-daemon-tjbzg" podUID="91aee7be-4a52-4598-803f-2deebe0674de" Dec 03 08:54:11 crc kubenswrapper[4475]: I1203 08:54:11.491649 4475 scope.go:117] "RemoveContainer" containerID="a64a4fdb8dce1938b2d1fffd387c463e3afd8518ef83ae41c0e745f161742479" Dec 03 08:54:11 crc kubenswrapper[4475]: E1203 08:54:11.492616 4475 pod_workers.go:1301] "Error syncing pod, 
skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-tjbzg_openshift-machine-config-operator(91aee7be-4a52-4598-803f-2deebe0674de)\"" pod="openshift-machine-config-operator/machine-config-daemon-tjbzg" podUID="91aee7be-4a52-4598-803f-2deebe0674de" Dec 03 08:54:24 crc kubenswrapper[4475]: I1203 08:54:24.493846 4475 scope.go:117] "RemoveContainer" containerID="a64a4fdb8dce1938b2d1fffd387c463e3afd8518ef83ae41c0e745f161742479" Dec 03 08:54:24 crc kubenswrapper[4475]: E1203 08:54:24.494710 4475 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-tjbzg_openshift-machine-config-operator(91aee7be-4a52-4598-803f-2deebe0674de)\"" pod="openshift-machine-config-operator/machine-config-daemon-tjbzg" podUID="91aee7be-4a52-4598-803f-2deebe0674de" Dec 03 08:54:37 crc kubenswrapper[4475]: I1203 08:54:37.491364 4475 scope.go:117] "RemoveContainer" containerID="a64a4fdb8dce1938b2d1fffd387c463e3afd8518ef83ae41c0e745f161742479" Dec 03 08:54:38 crc kubenswrapper[4475]: I1203 08:54:38.468774 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-tjbzg" event={"ID":"91aee7be-4a52-4598-803f-2deebe0674de","Type":"ContainerStarted","Data":"e5895b56be7ae3741bf08c7dedf21b20f02efb3fd61a0868e509423412769f8f"} Dec 03 08:55:07 crc kubenswrapper[4475]: I1203 08:55:07.633604 4475 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-z6g5b"] Dec 03 08:55:07 crc kubenswrapper[4475]: E1203 08:55:07.638297 4475 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3ae7195f-7a46-46fc-80e4-442917745ffe" containerName="extract-content" Dec 03 08:55:07 crc kubenswrapper[4475]: I1203 
08:55:07.638335 4475 state_mem.go:107] "Deleted CPUSet assignment" podUID="3ae7195f-7a46-46fc-80e4-442917745ffe" containerName="extract-content" Dec 03 08:55:07 crc kubenswrapper[4475]: E1203 08:55:07.638354 4475 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3ae7195f-7a46-46fc-80e4-442917745ffe" containerName="registry-server" Dec 03 08:55:07 crc kubenswrapper[4475]: I1203 08:55:07.638360 4475 state_mem.go:107] "Deleted CPUSet assignment" podUID="3ae7195f-7a46-46fc-80e4-442917745ffe" containerName="registry-server" Dec 03 08:55:07 crc kubenswrapper[4475]: E1203 08:55:07.638378 4475 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3ae7195f-7a46-46fc-80e4-442917745ffe" containerName="extract-utilities" Dec 03 08:55:07 crc kubenswrapper[4475]: I1203 08:55:07.638386 4475 state_mem.go:107] "Deleted CPUSet assignment" podUID="3ae7195f-7a46-46fc-80e4-442917745ffe" containerName="extract-utilities" Dec 03 08:55:07 crc kubenswrapper[4475]: I1203 08:55:07.638754 4475 memory_manager.go:354] "RemoveStaleState removing state" podUID="3ae7195f-7a46-46fc-80e4-442917745ffe" containerName="registry-server" Dec 03 08:55:07 crc kubenswrapper[4475]: I1203 08:55:07.646903 4475 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-z6g5b"
Dec 03 08:55:07 crc kubenswrapper[4475]: I1203 08:55:07.741775 4475 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-z6g5b"]
Dec 03 08:55:07 crc kubenswrapper[4475]: I1203 08:55:07.797553 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/948d3a46-f287-4707-8d45-2cc460bfe8b5-catalog-content\") pod \"redhat-marketplace-z6g5b\" (UID: \"948d3a46-f287-4707-8d45-2cc460bfe8b5\") " pod="openshift-marketplace/redhat-marketplace-z6g5b"
Dec 03 08:55:07 crc kubenswrapper[4475]: I1203 08:55:07.797669 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/948d3a46-f287-4707-8d45-2cc460bfe8b5-utilities\") pod \"redhat-marketplace-z6g5b\" (UID: \"948d3a46-f287-4707-8d45-2cc460bfe8b5\") " pod="openshift-marketplace/redhat-marketplace-z6g5b"
Dec 03 08:55:07 crc kubenswrapper[4475]: I1203 08:55:07.797715 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ls7qq\" (UniqueName: \"kubernetes.io/projected/948d3a46-f287-4707-8d45-2cc460bfe8b5-kube-api-access-ls7qq\") pod \"redhat-marketplace-z6g5b\" (UID: \"948d3a46-f287-4707-8d45-2cc460bfe8b5\") " pod="openshift-marketplace/redhat-marketplace-z6g5b"
Dec 03 08:55:07 crc kubenswrapper[4475]: I1203 08:55:07.898445 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/948d3a46-f287-4707-8d45-2cc460bfe8b5-catalog-content\") pod \"redhat-marketplace-z6g5b\" (UID: \"948d3a46-f287-4707-8d45-2cc460bfe8b5\") " pod="openshift-marketplace/redhat-marketplace-z6g5b"
Dec 03 08:55:07 crc kubenswrapper[4475]: I1203 08:55:07.898613 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/948d3a46-f287-4707-8d45-2cc460bfe8b5-utilities\") pod \"redhat-marketplace-z6g5b\" (UID: \"948d3a46-f287-4707-8d45-2cc460bfe8b5\") " pod="openshift-marketplace/redhat-marketplace-z6g5b"
Dec 03 08:55:07 crc kubenswrapper[4475]: I1203 08:55:07.898682 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ls7qq\" (UniqueName: \"kubernetes.io/projected/948d3a46-f287-4707-8d45-2cc460bfe8b5-kube-api-access-ls7qq\") pod \"redhat-marketplace-z6g5b\" (UID: \"948d3a46-f287-4707-8d45-2cc460bfe8b5\") " pod="openshift-marketplace/redhat-marketplace-z6g5b"
Dec 03 08:55:07 crc kubenswrapper[4475]: I1203 08:55:07.900543 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/948d3a46-f287-4707-8d45-2cc460bfe8b5-utilities\") pod \"redhat-marketplace-z6g5b\" (UID: \"948d3a46-f287-4707-8d45-2cc460bfe8b5\") " pod="openshift-marketplace/redhat-marketplace-z6g5b"
Dec 03 08:55:07 crc kubenswrapper[4475]: I1203 08:55:07.900543 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/948d3a46-f287-4707-8d45-2cc460bfe8b5-catalog-content\") pod \"redhat-marketplace-z6g5b\" (UID: \"948d3a46-f287-4707-8d45-2cc460bfe8b5\") " pod="openshift-marketplace/redhat-marketplace-z6g5b"
Dec 03 08:55:07 crc kubenswrapper[4475]: I1203 08:55:07.924990 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ls7qq\" (UniqueName: \"kubernetes.io/projected/948d3a46-f287-4707-8d45-2cc460bfe8b5-kube-api-access-ls7qq\") pod \"redhat-marketplace-z6g5b\" (UID: \"948d3a46-f287-4707-8d45-2cc460bfe8b5\") " pod="openshift-marketplace/redhat-marketplace-z6g5b"
Dec 03 08:55:07 crc kubenswrapper[4475]: I1203 08:55:07.968950 4475 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-z6g5b"
Dec 03 08:55:08 crc kubenswrapper[4475]: I1203 08:55:08.786952 4475 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-z6g5b"]
Dec 03 08:55:09 crc kubenswrapper[4475]: I1203 08:55:09.769025 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-z6g5b" event={"ID":"948d3a46-f287-4707-8d45-2cc460bfe8b5","Type":"ContainerDied","Data":"5133eabcce80e0a5677ee899b1f46b32a060163ebed8d00ecda07a605dc3b381"}
Dec 03 08:55:09 crc kubenswrapper[4475]: I1203 08:55:09.768866 4475 generic.go:334] "Generic (PLEG): container finished" podID="948d3a46-f287-4707-8d45-2cc460bfe8b5" containerID="5133eabcce80e0a5677ee899b1f46b32a060163ebed8d00ecda07a605dc3b381" exitCode=0
Dec 03 08:55:09 crc kubenswrapper[4475]: I1203 08:55:09.769801 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-z6g5b" event={"ID":"948d3a46-f287-4707-8d45-2cc460bfe8b5","Type":"ContainerStarted","Data":"0a1ade7456944bfc3449cde7e1f8e7b9ea85d02eca306675e14a134f84c17746"}
Dec 03 08:55:10 crc kubenswrapper[4475]: I1203 08:55:10.780046 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-z6g5b" event={"ID":"948d3a46-f287-4707-8d45-2cc460bfe8b5","Type":"ContainerStarted","Data":"1a87dfc5a33be0063363100cac334317301b90da292c1addc56e817e789102c8"}
Dec 03 08:55:11 crc kubenswrapper[4475]: I1203 08:55:11.791397 4475 generic.go:334] "Generic (PLEG): container finished" podID="948d3a46-f287-4707-8d45-2cc460bfe8b5" containerID="1a87dfc5a33be0063363100cac334317301b90da292c1addc56e817e789102c8" exitCode=0
Dec 03 08:55:11 crc kubenswrapper[4475]: I1203 08:55:11.791489 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-z6g5b" event={"ID":"948d3a46-f287-4707-8d45-2cc460bfe8b5","Type":"ContainerDied","Data":"1a87dfc5a33be0063363100cac334317301b90da292c1addc56e817e789102c8"}
Dec 03 08:55:12 crc kubenswrapper[4475]: I1203 08:55:12.811606 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-z6g5b" event={"ID":"948d3a46-f287-4707-8d45-2cc460bfe8b5","Type":"ContainerStarted","Data":"df5947dcf2e3f869edd21a5209c86e23da331706761f1f1f3beda8871d60623b"}
Dec 03 08:55:12 crc kubenswrapper[4475]: I1203 08:55:12.840949 4475 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-z6g5b" podStartSLOduration=3.291468446 podStartE2EDuration="5.839646957s" podCreationTimestamp="2025-12-03 08:55:07 +0000 UTC" firstStartedPulling="2025-12-03 08:55:09.770738819 +0000 UTC m=+7794.575637153" lastFinishedPulling="2025-12-03 08:55:12.31891733 +0000 UTC m=+7797.123815664" observedRunningTime="2025-12-03 08:55:12.830110907 +0000 UTC m=+7797.635009231" watchObservedRunningTime="2025-12-03 08:55:12.839646957 +0000 UTC m=+7797.644545291"
Dec 03 08:55:17 crc kubenswrapper[4475]: I1203 08:55:17.970775 4475 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-z6g5b"
Dec 03 08:55:17 crc kubenswrapper[4475]: I1203 08:55:17.971177 4475 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-z6g5b"
Dec 03 08:55:18 crc kubenswrapper[4475]: I1203 08:55:18.012441 4475 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-z6g5b"
Dec 03 08:55:18 crc kubenswrapper[4475]: I1203 08:55:18.917606 4475 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-z6g5b"
Dec 03 08:55:18 crc kubenswrapper[4475]: I1203 08:55:18.971996 4475 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-z6g5b"]
Dec 03 08:55:20 crc kubenswrapper[4475]: I1203 08:55:20.894109 4475 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-z6g5b" podUID="948d3a46-f287-4707-8d45-2cc460bfe8b5" containerName="registry-server" containerID="cri-o://df5947dcf2e3f869edd21a5209c86e23da331706761f1f1f3beda8871d60623b" gracePeriod=2
Dec 03 08:55:21 crc kubenswrapper[4475]: I1203 08:55:21.339191 4475 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-z6g5b"
Dec 03 08:55:21 crc kubenswrapper[4475]: I1203 08:55:21.524622 4475 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/948d3a46-f287-4707-8d45-2cc460bfe8b5-utilities\") pod \"948d3a46-f287-4707-8d45-2cc460bfe8b5\" (UID: \"948d3a46-f287-4707-8d45-2cc460bfe8b5\") "
Dec 03 08:55:21 crc kubenswrapper[4475]: I1203 08:55:21.524908 4475 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/948d3a46-f287-4707-8d45-2cc460bfe8b5-catalog-content\") pod \"948d3a46-f287-4707-8d45-2cc460bfe8b5\" (UID: \"948d3a46-f287-4707-8d45-2cc460bfe8b5\") "
Dec 03 08:55:21 crc kubenswrapper[4475]: I1203 08:55:21.524943 4475 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ls7qq\" (UniqueName: \"kubernetes.io/projected/948d3a46-f287-4707-8d45-2cc460bfe8b5-kube-api-access-ls7qq\") pod \"948d3a46-f287-4707-8d45-2cc460bfe8b5\" (UID: \"948d3a46-f287-4707-8d45-2cc460bfe8b5\") "
Dec 03 08:55:21 crc kubenswrapper[4475]: I1203 08:55:21.527282 4475 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/948d3a46-f287-4707-8d45-2cc460bfe8b5-utilities" (OuterVolumeSpecName: "utilities") pod "948d3a46-f287-4707-8d45-2cc460bfe8b5" (UID: "948d3a46-f287-4707-8d45-2cc460bfe8b5"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Dec 03 08:55:21 crc kubenswrapper[4475]: I1203 08:55:21.534892 4475 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/948d3a46-f287-4707-8d45-2cc460bfe8b5-kube-api-access-ls7qq" (OuterVolumeSpecName: "kube-api-access-ls7qq") pod "948d3a46-f287-4707-8d45-2cc460bfe8b5" (UID: "948d3a46-f287-4707-8d45-2cc460bfe8b5"). InnerVolumeSpecName "kube-api-access-ls7qq". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 03 08:55:21 crc kubenswrapper[4475]: I1203 08:55:21.544104 4475 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/948d3a46-f287-4707-8d45-2cc460bfe8b5-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "948d3a46-f287-4707-8d45-2cc460bfe8b5" (UID: "948d3a46-f287-4707-8d45-2cc460bfe8b5"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Dec 03 08:55:21 crc kubenswrapper[4475]: I1203 08:55:21.628088 4475 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/948d3a46-f287-4707-8d45-2cc460bfe8b5-catalog-content\") on node \"crc\" DevicePath \"\""
Dec 03 08:55:21 crc kubenswrapper[4475]: I1203 08:55:21.628201 4475 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ls7qq\" (UniqueName: \"kubernetes.io/projected/948d3a46-f287-4707-8d45-2cc460bfe8b5-kube-api-access-ls7qq\") on node \"crc\" DevicePath \"\""
Dec 03 08:55:21 crc kubenswrapper[4475]: I1203 08:55:21.628265 4475 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/948d3a46-f287-4707-8d45-2cc460bfe8b5-utilities\") on node \"crc\" DevicePath \"\""
Dec 03 08:55:21 crc kubenswrapper[4475]: I1203 08:55:21.907385 4475 generic.go:334] "Generic (PLEG): container finished" podID="948d3a46-f287-4707-8d45-2cc460bfe8b5" containerID="df5947dcf2e3f869edd21a5209c86e23da331706761f1f1f3beda8871d60623b" exitCode=0
Dec 03 08:55:21 crc kubenswrapper[4475]: I1203 08:55:21.907458 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-z6g5b" event={"ID":"948d3a46-f287-4707-8d45-2cc460bfe8b5","Type":"ContainerDied","Data":"df5947dcf2e3f869edd21a5209c86e23da331706761f1f1f3beda8871d60623b"}
Dec 03 08:55:21 crc kubenswrapper[4475]: I1203 08:55:21.907505 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-z6g5b" event={"ID":"948d3a46-f287-4707-8d45-2cc460bfe8b5","Type":"ContainerDied","Data":"0a1ade7456944bfc3449cde7e1f8e7b9ea85d02eca306675e14a134f84c17746"}
Dec 03 08:55:21 crc kubenswrapper[4475]: I1203 08:55:21.907518 4475 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-z6g5b"
Dec 03 08:55:21 crc kubenswrapper[4475]: I1203 08:55:21.907527 4475 scope.go:117] "RemoveContainer" containerID="df5947dcf2e3f869edd21a5209c86e23da331706761f1f1f3beda8871d60623b"
Dec 03 08:55:21 crc kubenswrapper[4475]: I1203 08:55:21.933134 4475 scope.go:117] "RemoveContainer" containerID="1a87dfc5a33be0063363100cac334317301b90da292c1addc56e817e789102c8"
Dec 03 08:55:21 crc kubenswrapper[4475]: I1203 08:55:21.954627 4475 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-z6g5b"]
Dec 03 08:55:21 crc kubenswrapper[4475]: I1203 08:55:21.966497 4475 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-z6g5b"]
Dec 03 08:55:21 crc kubenswrapper[4475]: I1203 08:55:21.968802 4475 scope.go:117] "RemoveContainer" containerID="5133eabcce80e0a5677ee899b1f46b32a060163ebed8d00ecda07a605dc3b381"
Dec 03 08:55:22 crc kubenswrapper[4475]: I1203 08:55:22.012750 4475 scope.go:117] "RemoveContainer" containerID="df5947dcf2e3f869edd21a5209c86e23da331706761f1f1f3beda8871d60623b"
Dec 03 08:55:22 crc kubenswrapper[4475]: E1203 08:55:22.014898 4475 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"df5947dcf2e3f869edd21a5209c86e23da331706761f1f1f3beda8871d60623b\": container with ID starting with df5947dcf2e3f869edd21a5209c86e23da331706761f1f1f3beda8871d60623b not found: ID does not exist" containerID="df5947dcf2e3f869edd21a5209c86e23da331706761f1f1f3beda8871d60623b"
Dec 03 08:55:22 crc kubenswrapper[4475]: I1203 08:55:22.015278 4475 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"df5947dcf2e3f869edd21a5209c86e23da331706761f1f1f3beda8871d60623b"} err="failed to get container status \"df5947dcf2e3f869edd21a5209c86e23da331706761f1f1f3beda8871d60623b\": rpc error: code = NotFound desc = could not find container \"df5947dcf2e3f869edd21a5209c86e23da331706761f1f1f3beda8871d60623b\": container with ID starting with df5947dcf2e3f869edd21a5209c86e23da331706761f1f1f3beda8871d60623b not found: ID does not exist"
Dec 03 08:55:22 crc kubenswrapper[4475]: I1203 08:55:22.015317 4475 scope.go:117] "RemoveContainer" containerID="1a87dfc5a33be0063363100cac334317301b90da292c1addc56e817e789102c8"
Dec 03 08:55:22 crc kubenswrapper[4475]: E1203 08:55:22.015656 4475 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1a87dfc5a33be0063363100cac334317301b90da292c1addc56e817e789102c8\": container with ID starting with 1a87dfc5a33be0063363100cac334317301b90da292c1addc56e817e789102c8 not found: ID does not exist" containerID="1a87dfc5a33be0063363100cac334317301b90da292c1addc56e817e789102c8"
Dec 03 08:55:22 crc kubenswrapper[4475]: I1203 08:55:22.015692 4475 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1a87dfc5a33be0063363100cac334317301b90da292c1addc56e817e789102c8"} err="failed to get container status \"1a87dfc5a33be0063363100cac334317301b90da292c1addc56e817e789102c8\": rpc error: code = NotFound desc = could not find container \"1a87dfc5a33be0063363100cac334317301b90da292c1addc56e817e789102c8\": container with ID starting with 1a87dfc5a33be0063363100cac334317301b90da292c1addc56e817e789102c8 not found: ID does not exist"
Dec 03 08:55:22 crc kubenswrapper[4475]: I1203 08:55:22.015715 4475 scope.go:117] "RemoveContainer" containerID="5133eabcce80e0a5677ee899b1f46b32a060163ebed8d00ecda07a605dc3b381"
Dec 03 08:55:22 crc kubenswrapper[4475]: E1203 08:55:22.015936 4475 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5133eabcce80e0a5677ee899b1f46b32a060163ebed8d00ecda07a605dc3b381\": container with ID starting with 5133eabcce80e0a5677ee899b1f46b32a060163ebed8d00ecda07a605dc3b381 not found: ID does not exist" containerID="5133eabcce80e0a5677ee899b1f46b32a060163ebed8d00ecda07a605dc3b381"
Dec 03 08:55:22 crc kubenswrapper[4475]: I1203 08:55:22.015962 4475 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5133eabcce80e0a5677ee899b1f46b32a060163ebed8d00ecda07a605dc3b381"} err="failed to get container status \"5133eabcce80e0a5677ee899b1f46b32a060163ebed8d00ecda07a605dc3b381\": rpc error: code = NotFound desc = could not find container \"5133eabcce80e0a5677ee899b1f46b32a060163ebed8d00ecda07a605dc3b381\": container with ID starting with 5133eabcce80e0a5677ee899b1f46b32a060163ebed8d00ecda07a605dc3b381 not found: ID does not exist"
Dec 03 08:55:23 crc kubenswrapper[4475]: I1203 08:55:23.505417 4475 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="948d3a46-f287-4707-8d45-2cc460bfe8b5" path="/var/lib/kubelet/pods/948d3a46-f287-4707-8d45-2cc460bfe8b5/volumes"
Dec 03 08:56:58 crc kubenswrapper[4475]: I1203 08:56:58.933204 4475 patch_prober.go:28] interesting pod/machine-config-daemon-tjbzg container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Dec 03 08:56:58 crc kubenswrapper[4475]: I1203 08:56:58.935619 4475 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-tjbzg" podUID="91aee7be-4a52-4598-803f-2deebe0674de" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Dec 03 08:57:28 crc kubenswrapper[4475]: I1203 08:57:28.932876 4475 patch_prober.go:28] interesting pod/machine-config-daemon-tjbzg container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Dec 03 08:57:28 crc kubenswrapper[4475]: I1203 08:57:28.933587 4475 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-tjbzg" podUID="91aee7be-4a52-4598-803f-2deebe0674de" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Dec 03 08:57:58 crc kubenswrapper[4475]: I1203 08:57:58.933626 4475 patch_prober.go:28] interesting pod/machine-config-daemon-tjbzg container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Dec 03 08:57:58 crc kubenswrapper[4475]: I1203 08:57:58.934408 4475 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-tjbzg" podUID="91aee7be-4a52-4598-803f-2deebe0674de" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Dec 03 08:57:58 crc kubenswrapper[4475]: I1203 08:57:58.934510 4475 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-tjbzg"
Dec 03 08:57:58 crc kubenswrapper[4475]: I1203 08:57:58.935532 4475 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"e5895b56be7ae3741bf08c7dedf21b20f02efb3fd61a0868e509423412769f8f"} pod="openshift-machine-config-operator/machine-config-daemon-tjbzg" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted"
Dec 03 08:57:58 crc kubenswrapper[4475]: I1203 08:57:58.935617 4475 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-tjbzg" podUID="91aee7be-4a52-4598-803f-2deebe0674de" containerName="machine-config-daemon" containerID="cri-o://e5895b56be7ae3741bf08c7dedf21b20f02efb3fd61a0868e509423412769f8f" gracePeriod=600
Dec 03 08:57:59 crc kubenswrapper[4475]: I1203 08:57:59.352362 4475 generic.go:334] "Generic (PLEG): container finished" podID="91aee7be-4a52-4598-803f-2deebe0674de" containerID="e5895b56be7ae3741bf08c7dedf21b20f02efb3fd61a0868e509423412769f8f" exitCode=0
Dec 03 08:57:59 crc kubenswrapper[4475]: I1203 08:57:59.352477 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-tjbzg" event={"ID":"91aee7be-4a52-4598-803f-2deebe0674de","Type":"ContainerDied","Data":"e5895b56be7ae3741bf08c7dedf21b20f02efb3fd61a0868e509423412769f8f"}
Dec 03 08:57:59 crc kubenswrapper[4475]: I1203 08:57:59.352704 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-tjbzg" event={"ID":"91aee7be-4a52-4598-803f-2deebe0674de","Type":"ContainerStarted","Data":"2558f17c1a5d43f43a255acd776f2097a10dc7f4eba4c6f2066cd9710d69b61e"}
Dec 03 08:57:59 crc kubenswrapper[4475]: I1203 08:57:59.352734 4475 scope.go:117] "RemoveContainer" containerID="a64a4fdb8dce1938b2d1fffd387c463e3afd8518ef83ae41c0e745f161742479"
Dec 03 08:58:48 crc kubenswrapper[4475]: I1203 08:58:48.981680 4475 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-mncn5"]
Dec 03 08:58:48 crc kubenswrapper[4475]: E1203 08:58:48.982659 4475 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="948d3a46-f287-4707-8d45-2cc460bfe8b5" containerName="extract-utilities"
Dec 03 08:58:48 crc kubenswrapper[4475]: I1203 08:58:48.982676 4475 state_mem.go:107] "Deleted CPUSet assignment" podUID="948d3a46-f287-4707-8d45-2cc460bfe8b5" containerName="extract-utilities"
Dec 03 08:58:48 crc kubenswrapper[4475]: E1203 08:58:48.982693 4475 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="948d3a46-f287-4707-8d45-2cc460bfe8b5" containerName="registry-server"
Dec 03 08:58:48 crc kubenswrapper[4475]: I1203 08:58:48.982698 4475 state_mem.go:107] "Deleted CPUSet assignment" podUID="948d3a46-f287-4707-8d45-2cc460bfe8b5" containerName="registry-server"
Dec 03 08:58:48 crc kubenswrapper[4475]: E1203 08:58:48.982716 4475 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="948d3a46-f287-4707-8d45-2cc460bfe8b5" containerName="extract-content"
Dec 03 08:58:48 crc kubenswrapper[4475]: I1203 08:58:48.982722 4475 state_mem.go:107] "Deleted CPUSet assignment" podUID="948d3a46-f287-4707-8d45-2cc460bfe8b5" containerName="extract-content"
Dec 03 08:58:48 crc kubenswrapper[4475]: I1203 08:58:48.982911 4475 memory_manager.go:354] "RemoveStaleState removing state" podUID="948d3a46-f287-4707-8d45-2cc460bfe8b5" containerName="registry-server"
Dec 03 08:58:48 crc kubenswrapper[4475]: I1203 08:58:48.984287 4475 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-mncn5"
Dec 03 08:58:49 crc kubenswrapper[4475]: I1203 08:58:49.010231 4475 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-mncn5"]
Dec 03 08:58:49 crc kubenswrapper[4475]: I1203 08:58:49.145834 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-84668\" (UniqueName: \"kubernetes.io/projected/be0a6479-a0bd-4843-8cab-69317fe4b9e9-kube-api-access-84668\") pod \"certified-operators-mncn5\" (UID: \"be0a6479-a0bd-4843-8cab-69317fe4b9e9\") " pod="openshift-marketplace/certified-operators-mncn5"
Dec 03 08:58:49 crc kubenswrapper[4475]: I1203 08:58:49.146041 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/be0a6479-a0bd-4843-8cab-69317fe4b9e9-utilities\") pod \"certified-operators-mncn5\" (UID: \"be0a6479-a0bd-4843-8cab-69317fe4b9e9\") " pod="openshift-marketplace/certified-operators-mncn5"
Dec 03 08:58:49 crc kubenswrapper[4475]: I1203 08:58:49.146156 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/be0a6479-a0bd-4843-8cab-69317fe4b9e9-catalog-content\") pod \"certified-operators-mncn5\" (UID: \"be0a6479-a0bd-4843-8cab-69317fe4b9e9\") " pod="openshift-marketplace/certified-operators-mncn5"
Dec 03 08:58:49 crc kubenswrapper[4475]: I1203 08:58:49.249083 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-84668\" (UniqueName: \"kubernetes.io/projected/be0a6479-a0bd-4843-8cab-69317fe4b9e9-kube-api-access-84668\") pod \"certified-operators-mncn5\" (UID: \"be0a6479-a0bd-4843-8cab-69317fe4b9e9\") " pod="openshift-marketplace/certified-operators-mncn5"
Dec 03 08:58:49 crc kubenswrapper[4475]: I1203 08:58:49.249242 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/be0a6479-a0bd-4843-8cab-69317fe4b9e9-utilities\") pod \"certified-operators-mncn5\" (UID: \"be0a6479-a0bd-4843-8cab-69317fe4b9e9\") " pod="openshift-marketplace/certified-operators-mncn5"
Dec 03 08:58:49 crc kubenswrapper[4475]: I1203 08:58:49.249285 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/be0a6479-a0bd-4843-8cab-69317fe4b9e9-catalog-content\") pod \"certified-operators-mncn5\" (UID: \"be0a6479-a0bd-4843-8cab-69317fe4b9e9\") " pod="openshift-marketplace/certified-operators-mncn5"
Dec 03 08:58:49 crc kubenswrapper[4475]: I1203 08:58:49.249773 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/be0a6479-a0bd-4843-8cab-69317fe4b9e9-catalog-content\") pod \"certified-operators-mncn5\" (UID: \"be0a6479-a0bd-4843-8cab-69317fe4b9e9\") " pod="openshift-marketplace/certified-operators-mncn5"
Dec 03 08:58:49 crc kubenswrapper[4475]: I1203 08:58:49.249856 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/be0a6479-a0bd-4843-8cab-69317fe4b9e9-utilities\") pod \"certified-operators-mncn5\" (UID: \"be0a6479-a0bd-4843-8cab-69317fe4b9e9\") " pod="openshift-marketplace/certified-operators-mncn5"
Dec 03 08:58:49 crc kubenswrapper[4475]: I1203 08:58:49.271299 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-84668\" (UniqueName: \"kubernetes.io/projected/be0a6479-a0bd-4843-8cab-69317fe4b9e9-kube-api-access-84668\") pod \"certified-operators-mncn5\" (UID: \"be0a6479-a0bd-4843-8cab-69317fe4b9e9\") " pod="openshift-marketplace/certified-operators-mncn5"
Dec 03 08:58:49 crc kubenswrapper[4475]: I1203 08:58:49.316064 4475 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-mncn5"
Dec 03 08:58:49 crc kubenswrapper[4475]: I1203 08:58:49.922062 4475 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-mncn5"]
Dec 03 08:58:50 crc kubenswrapper[4475]: I1203 08:58:50.849977 4475 generic.go:334] "Generic (PLEG): container finished" podID="be0a6479-a0bd-4843-8cab-69317fe4b9e9" containerID="4345b10f67ac5e5cf3cee5a175979007f2a11cfcda248847090a33470996d7fc" exitCode=0
Dec 03 08:58:50 crc kubenswrapper[4475]: I1203 08:58:50.850084 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-mncn5" event={"ID":"be0a6479-a0bd-4843-8cab-69317fe4b9e9","Type":"ContainerDied","Data":"4345b10f67ac5e5cf3cee5a175979007f2a11cfcda248847090a33470996d7fc"}
Dec 03 08:58:50 crc kubenswrapper[4475]: I1203 08:58:50.850276 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-mncn5" event={"ID":"be0a6479-a0bd-4843-8cab-69317fe4b9e9","Type":"ContainerStarted","Data":"bc67e423569654894a1cc0c3d9ec5976045a997fe206152c64442ed85767efec"}
Dec 03 08:58:50 crc kubenswrapper[4475]: I1203 08:58:50.852270 4475 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Dec 03 08:58:52 crc kubenswrapper[4475]: I1203 08:58:52.871344 4475 generic.go:334] "Generic (PLEG): container finished" podID="be0a6479-a0bd-4843-8cab-69317fe4b9e9" containerID="8f5e2bd500cf8a1bbbba1b59618722e413a6874898ea3685e96d0b020c224706" exitCode=0
Dec 03 08:58:52 crc kubenswrapper[4475]: I1203 08:58:52.871892 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-mncn5" event={"ID":"be0a6479-a0bd-4843-8cab-69317fe4b9e9","Type":"ContainerDied","Data":"8f5e2bd500cf8a1bbbba1b59618722e413a6874898ea3685e96d0b020c224706"}
Dec 03 08:58:53 crc kubenswrapper[4475]: I1203 08:58:53.882926 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-mncn5" event={"ID":"be0a6479-a0bd-4843-8cab-69317fe4b9e9","Type":"ContainerStarted","Data":"9dd6665bf0f9931258dcab58f4b1ee5a1063a3776cbe72bc419d0d0af6c768be"}
Dec 03 08:58:59 crc kubenswrapper[4475]: I1203 08:58:59.316765 4475 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-mncn5"
Dec 03 08:58:59 crc kubenswrapper[4475]: I1203 08:58:59.317371 4475 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-mncn5"
Dec 03 08:58:59 crc kubenswrapper[4475]: I1203 08:58:59.359438 4475 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-mncn5"
Dec 03 08:58:59 crc kubenswrapper[4475]: I1203 08:58:59.378844 4475 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-mncn5" podStartSLOduration=8.890063552 podStartE2EDuration="11.378825906s" podCreationTimestamp="2025-12-03 08:58:48 +0000 UTC" firstStartedPulling="2025-12-03 08:58:50.851946856 +0000 UTC m=+8015.656845190" lastFinishedPulling="2025-12-03 08:58:53.340709209 +0000 UTC m=+8018.145607544" observedRunningTime="2025-12-03 08:58:53.939681822 +0000 UTC m=+8018.744580157" watchObservedRunningTime="2025-12-03 08:58:59.378825906 +0000 UTC m=+8024.183724241"
Dec 03 08:58:59 crc kubenswrapper[4475]: I1203 08:58:59.983283 4475 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-mncn5"
Dec 03 08:59:00 crc kubenswrapper[4475]: I1203 08:59:00.040162 4475 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-mncn5"]
Dec 03 08:59:01 crc kubenswrapper[4475]: I1203 08:59:01.956945 4475 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-mncn5" podUID="be0a6479-a0bd-4843-8cab-69317fe4b9e9" containerName="registry-server" containerID="cri-o://9dd6665bf0f9931258dcab58f4b1ee5a1063a3776cbe72bc419d0d0af6c768be" gracePeriod=2
Dec 03 08:59:02 crc kubenswrapper[4475]: I1203 08:59:02.396749 4475 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-mncn5"
Dec 03 08:59:02 crc kubenswrapper[4475]: I1203 08:59:02.466343 4475 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/be0a6479-a0bd-4843-8cab-69317fe4b9e9-utilities\") pod \"be0a6479-a0bd-4843-8cab-69317fe4b9e9\" (UID: \"be0a6479-a0bd-4843-8cab-69317fe4b9e9\") "
Dec 03 08:59:02 crc kubenswrapper[4475]: I1203 08:59:02.466416 4475 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-84668\" (UniqueName: \"kubernetes.io/projected/be0a6479-a0bd-4843-8cab-69317fe4b9e9-kube-api-access-84668\") pod \"be0a6479-a0bd-4843-8cab-69317fe4b9e9\" (UID: \"be0a6479-a0bd-4843-8cab-69317fe4b9e9\") "
Dec 03 08:59:02 crc kubenswrapper[4475]: I1203 08:59:02.466492 4475 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/be0a6479-a0bd-4843-8cab-69317fe4b9e9-catalog-content\") pod \"be0a6479-a0bd-4843-8cab-69317fe4b9e9\" (UID: \"be0a6479-a0bd-4843-8cab-69317fe4b9e9\") "
Dec 03 08:59:02 crc kubenswrapper[4475]: I1203 08:59:02.466961 4475 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/be0a6479-a0bd-4843-8cab-69317fe4b9e9-utilities" (OuterVolumeSpecName: "utilities") pod "be0a6479-a0bd-4843-8cab-69317fe4b9e9" (UID: "be0a6479-a0bd-4843-8cab-69317fe4b9e9"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Dec 03 08:59:02 crc kubenswrapper[4475]: I1203 08:59:02.467322 4475 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/be0a6479-a0bd-4843-8cab-69317fe4b9e9-utilities\") on node \"crc\" DevicePath \"\""
Dec 03 08:59:02 crc kubenswrapper[4475]: I1203 08:59:02.483649 4475 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/be0a6479-a0bd-4843-8cab-69317fe4b9e9-kube-api-access-84668" (OuterVolumeSpecName: "kube-api-access-84668") pod "be0a6479-a0bd-4843-8cab-69317fe4b9e9" (UID: "be0a6479-a0bd-4843-8cab-69317fe4b9e9"). InnerVolumeSpecName "kube-api-access-84668". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 03 08:59:02 crc kubenswrapper[4475]: I1203 08:59:02.502807 4475 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/be0a6479-a0bd-4843-8cab-69317fe4b9e9-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "be0a6479-a0bd-4843-8cab-69317fe4b9e9" (UID: "be0a6479-a0bd-4843-8cab-69317fe4b9e9"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Dec 03 08:59:02 crc kubenswrapper[4475]: I1203 08:59:02.569330 4475 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/be0a6479-a0bd-4843-8cab-69317fe4b9e9-catalog-content\") on node \"crc\" DevicePath \"\""
Dec 03 08:59:02 crc kubenswrapper[4475]: I1203 08:59:02.569353 4475 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-84668\" (UniqueName: \"kubernetes.io/projected/be0a6479-a0bd-4843-8cab-69317fe4b9e9-kube-api-access-84668\") on node \"crc\" DevicePath \"\""
Dec 03 08:59:02 crc kubenswrapper[4475]: I1203 08:59:02.965484 4475 generic.go:334] "Generic (PLEG): container finished" podID="be0a6479-a0bd-4843-8cab-69317fe4b9e9" containerID="9dd6665bf0f9931258dcab58f4b1ee5a1063a3776cbe72bc419d0d0af6c768be" exitCode=0
Dec 03 08:59:02 crc kubenswrapper[4475]: I1203 08:59:02.965553 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-mncn5" event={"ID":"be0a6479-a0bd-4843-8cab-69317fe4b9e9","Type":"ContainerDied","Data":"9dd6665bf0f9931258dcab58f4b1ee5a1063a3776cbe72bc419d0d0af6c768be"}
Dec 03 08:59:02 crc kubenswrapper[4475]: I1203 08:59:02.965780 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-mncn5" event={"ID":"be0a6479-a0bd-4843-8cab-69317fe4b9e9","Type":"ContainerDied","Data":"bc67e423569654894a1cc0c3d9ec5976045a997fe206152c64442ed85767efec"}
Dec 03 08:59:02 crc kubenswrapper[4475]: I1203 08:59:02.965804 4475 scope.go:117] "RemoveContainer" containerID="9dd6665bf0f9931258dcab58f4b1ee5a1063a3776cbe72bc419d0d0af6c768be"
Dec 03 08:59:02 crc kubenswrapper[4475]: I1203 08:59:02.965575 4475 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-mncn5"
Dec 03 08:59:02 crc kubenswrapper[4475]: I1203 08:59:02.991123 4475 scope.go:117] "RemoveContainer" containerID="8f5e2bd500cf8a1bbbba1b59618722e413a6874898ea3685e96d0b020c224706"
Dec 03 08:59:03 crc kubenswrapper[4475]: I1203 08:59:03.016438 4475 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-mncn5"]
Dec 03 08:59:03 crc kubenswrapper[4475]: I1203 08:59:03.026700 4475 scope.go:117] "RemoveContainer" containerID="4345b10f67ac5e5cf3cee5a175979007f2a11cfcda248847090a33470996d7fc"
Dec 03 08:59:03 crc kubenswrapper[4475]: I1203 08:59:03.029395 4475 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-mncn5"]
Dec 03 08:59:03 crc kubenswrapper[4475]: I1203 08:59:03.061749 4475 scope.go:117] "RemoveContainer" containerID="9dd6665bf0f9931258dcab58f4b1ee5a1063a3776cbe72bc419d0d0af6c768be"
Dec 03 08:59:03 crc kubenswrapper[4475]: E1203 08:59:03.062181 4475 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9dd6665bf0f9931258dcab58f4b1ee5a1063a3776cbe72bc419d0d0af6c768be\": container with ID starting with 9dd6665bf0f9931258dcab58f4b1ee5a1063a3776cbe72bc419d0d0af6c768be not found: ID does not exist" containerID="9dd6665bf0f9931258dcab58f4b1ee5a1063a3776cbe72bc419d0d0af6c768be"
Dec 03 08:59:03 crc kubenswrapper[4475]: I1203 08:59:03.062215 4475 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9dd6665bf0f9931258dcab58f4b1ee5a1063a3776cbe72bc419d0d0af6c768be"} err="failed to get container status \"9dd6665bf0f9931258dcab58f4b1ee5a1063a3776cbe72bc419d0d0af6c768be\": rpc error: code = NotFound desc = could not find container \"9dd6665bf0f9931258dcab58f4b1ee5a1063a3776cbe72bc419d0d0af6c768be\": container with ID starting with 9dd6665bf0f9931258dcab58f4b1ee5a1063a3776cbe72bc419d0d0af6c768be not found: ID does not exist"
Dec 03 08:59:03 crc kubenswrapper[4475]: I1203 08:59:03.062237 4475 scope.go:117] "RemoveContainer" containerID="8f5e2bd500cf8a1bbbba1b59618722e413a6874898ea3685e96d0b020c224706"
Dec 03 08:59:03 crc kubenswrapper[4475]: E1203 08:59:03.062625 4475 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8f5e2bd500cf8a1bbbba1b59618722e413a6874898ea3685e96d0b020c224706\": container with ID starting with 8f5e2bd500cf8a1bbbba1b59618722e413a6874898ea3685e96d0b020c224706 not found: ID does not exist" containerID="8f5e2bd500cf8a1bbbba1b59618722e413a6874898ea3685e96d0b020c224706"
Dec 03 08:59:03 crc kubenswrapper[4475]: I1203 08:59:03.062661 4475 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8f5e2bd500cf8a1bbbba1b59618722e413a6874898ea3685e96d0b020c224706"} err="failed to get container status \"8f5e2bd500cf8a1bbbba1b59618722e413a6874898ea3685e96d0b020c224706\": rpc error: code = NotFound desc = could not find container \"8f5e2bd500cf8a1bbbba1b59618722e413a6874898ea3685e96d0b020c224706\": container with ID starting with 8f5e2bd500cf8a1bbbba1b59618722e413a6874898ea3685e96d0b020c224706 not found: ID does not exist"
Dec 03 08:59:03 crc kubenswrapper[4475]: I1203 08:59:03.062690 4475 scope.go:117] "RemoveContainer" containerID="4345b10f67ac5e5cf3cee5a175979007f2a11cfcda248847090a33470996d7fc"
Dec 03 08:59:03 crc kubenswrapper[4475]: E1203 08:59:03.084607 4475 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4345b10f67ac5e5cf3cee5a175979007f2a11cfcda248847090a33470996d7fc\": container with ID starting with 4345b10f67ac5e5cf3cee5a175979007f2a11cfcda248847090a33470996d7fc not found: ID does not exist" containerID="4345b10f67ac5e5cf3cee5a175979007f2a11cfcda248847090a33470996d7fc"
Dec 03 08:59:03 crc kubenswrapper[4475]: I1203 08:59:03.084642 4475
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4345b10f67ac5e5cf3cee5a175979007f2a11cfcda248847090a33470996d7fc"} err="failed to get container status \"4345b10f67ac5e5cf3cee5a175979007f2a11cfcda248847090a33470996d7fc\": rpc error: code = NotFound desc = could not find container \"4345b10f67ac5e5cf3cee5a175979007f2a11cfcda248847090a33470996d7fc\": container with ID starting with 4345b10f67ac5e5cf3cee5a175979007f2a11cfcda248847090a33470996d7fc not found: ID does not exist" Dec 03 08:59:03 crc kubenswrapper[4475]: I1203 08:59:03.499933 4475 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="be0a6479-a0bd-4843-8cab-69317fe4b9e9" path="/var/lib/kubelet/pods/be0a6479-a0bd-4843-8cab-69317fe4b9e9/volumes" Dec 03 08:59:52 crc kubenswrapper[4475]: I1203 08:59:52.247358 4475 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-v5zfv"] Dec 03 08:59:52 crc kubenswrapper[4475]: E1203 08:59:52.248101 4475 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="be0a6479-a0bd-4843-8cab-69317fe4b9e9" containerName="registry-server" Dec 03 08:59:52 crc kubenswrapper[4475]: I1203 08:59:52.248115 4475 state_mem.go:107] "Deleted CPUSet assignment" podUID="be0a6479-a0bd-4843-8cab-69317fe4b9e9" containerName="registry-server" Dec 03 08:59:52 crc kubenswrapper[4475]: E1203 08:59:52.248146 4475 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="be0a6479-a0bd-4843-8cab-69317fe4b9e9" containerName="extract-utilities" Dec 03 08:59:52 crc kubenswrapper[4475]: I1203 08:59:52.248153 4475 state_mem.go:107] "Deleted CPUSet assignment" podUID="be0a6479-a0bd-4843-8cab-69317fe4b9e9" containerName="extract-utilities" Dec 03 08:59:52 crc kubenswrapper[4475]: E1203 08:59:52.248160 4475 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="be0a6479-a0bd-4843-8cab-69317fe4b9e9" containerName="extract-content" Dec 03 08:59:52 crc kubenswrapper[4475]: I1203 
08:59:52.248165 4475 state_mem.go:107] "Deleted CPUSet assignment" podUID="be0a6479-a0bd-4843-8cab-69317fe4b9e9" containerName="extract-content" Dec 03 08:59:52 crc kubenswrapper[4475]: I1203 08:59:52.248386 4475 memory_manager.go:354] "RemoveStaleState removing state" podUID="be0a6479-a0bd-4843-8cab-69317fe4b9e9" containerName="registry-server" Dec 03 08:59:52 crc kubenswrapper[4475]: I1203 08:59:52.249807 4475 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-v5zfv" Dec 03 08:59:52 crc kubenswrapper[4475]: I1203 08:59:52.259668 4475 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-v5zfv"] Dec 03 08:59:52 crc kubenswrapper[4475]: I1203 08:59:52.281825 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rrt7c\" (UniqueName: \"kubernetes.io/projected/79353e59-e700-4bdf-abbf-07d051fc8409-kube-api-access-rrt7c\") pod \"community-operators-v5zfv\" (UID: \"79353e59-e700-4bdf-abbf-07d051fc8409\") " pod="openshift-marketplace/community-operators-v5zfv" Dec 03 08:59:52 crc kubenswrapper[4475]: I1203 08:59:52.281880 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/79353e59-e700-4bdf-abbf-07d051fc8409-utilities\") pod \"community-operators-v5zfv\" (UID: \"79353e59-e700-4bdf-abbf-07d051fc8409\") " pod="openshift-marketplace/community-operators-v5zfv" Dec 03 08:59:52 crc kubenswrapper[4475]: I1203 08:59:52.281999 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/79353e59-e700-4bdf-abbf-07d051fc8409-catalog-content\") pod \"community-operators-v5zfv\" (UID: \"79353e59-e700-4bdf-abbf-07d051fc8409\") " pod="openshift-marketplace/community-operators-v5zfv" Dec 03 08:59:52 crc 
kubenswrapper[4475]: I1203 08:59:52.383322 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/79353e59-e700-4bdf-abbf-07d051fc8409-catalog-content\") pod \"community-operators-v5zfv\" (UID: \"79353e59-e700-4bdf-abbf-07d051fc8409\") " pod="openshift-marketplace/community-operators-v5zfv" Dec 03 08:59:52 crc kubenswrapper[4475]: I1203 08:59:52.383420 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rrt7c\" (UniqueName: \"kubernetes.io/projected/79353e59-e700-4bdf-abbf-07d051fc8409-kube-api-access-rrt7c\") pod \"community-operators-v5zfv\" (UID: \"79353e59-e700-4bdf-abbf-07d051fc8409\") " pod="openshift-marketplace/community-operators-v5zfv" Dec 03 08:59:52 crc kubenswrapper[4475]: I1203 08:59:52.383467 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/79353e59-e700-4bdf-abbf-07d051fc8409-utilities\") pod \"community-operators-v5zfv\" (UID: \"79353e59-e700-4bdf-abbf-07d051fc8409\") " pod="openshift-marketplace/community-operators-v5zfv" Dec 03 08:59:52 crc kubenswrapper[4475]: I1203 08:59:52.383761 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/79353e59-e700-4bdf-abbf-07d051fc8409-catalog-content\") pod \"community-operators-v5zfv\" (UID: \"79353e59-e700-4bdf-abbf-07d051fc8409\") " pod="openshift-marketplace/community-operators-v5zfv" Dec 03 08:59:52 crc kubenswrapper[4475]: I1203 08:59:52.383841 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/79353e59-e700-4bdf-abbf-07d051fc8409-utilities\") pod \"community-operators-v5zfv\" (UID: \"79353e59-e700-4bdf-abbf-07d051fc8409\") " pod="openshift-marketplace/community-operators-v5zfv" Dec 03 08:59:52 crc kubenswrapper[4475]: I1203 08:59:52.402422 
4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rrt7c\" (UniqueName: \"kubernetes.io/projected/79353e59-e700-4bdf-abbf-07d051fc8409-kube-api-access-rrt7c\") pod \"community-operators-v5zfv\" (UID: \"79353e59-e700-4bdf-abbf-07d051fc8409\") " pod="openshift-marketplace/community-operators-v5zfv" Dec 03 08:59:52 crc kubenswrapper[4475]: I1203 08:59:52.564710 4475 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-v5zfv" Dec 03 08:59:52 crc kubenswrapper[4475]: I1203 08:59:52.987369 4475 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-v5zfv"] Dec 03 08:59:53 crc kubenswrapper[4475]: I1203 08:59:53.396988 4475 generic.go:334] "Generic (PLEG): container finished" podID="79353e59-e700-4bdf-abbf-07d051fc8409" containerID="6db11d406a3562bc2720c791325a9a86902a4608a19b3653fa25d39dab98521e" exitCode=0 Dec 03 08:59:53 crc kubenswrapper[4475]: I1203 08:59:53.397196 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-v5zfv" event={"ID":"79353e59-e700-4bdf-abbf-07d051fc8409","Type":"ContainerDied","Data":"6db11d406a3562bc2720c791325a9a86902a4608a19b3653fa25d39dab98521e"} Dec 03 08:59:53 crc kubenswrapper[4475]: I1203 08:59:53.397661 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-v5zfv" event={"ID":"79353e59-e700-4bdf-abbf-07d051fc8409","Type":"ContainerStarted","Data":"924e8043e2eebf3d1693819fbfc6fd47928fd261ae10f49e775bd76ad887166c"} Dec 03 08:59:54 crc kubenswrapper[4475]: I1203 08:59:54.405161 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-v5zfv" event={"ID":"79353e59-e700-4bdf-abbf-07d051fc8409","Type":"ContainerStarted","Data":"9c1c639e04505ea02b4770a57b225ebdaec5ecf30375dfb1ddc5e84dbbfd9b3f"} Dec 03 08:59:55 crc kubenswrapper[4475]: I1203 08:59:55.418511 4475 
generic.go:334] "Generic (PLEG): container finished" podID="79353e59-e700-4bdf-abbf-07d051fc8409" containerID="9c1c639e04505ea02b4770a57b225ebdaec5ecf30375dfb1ddc5e84dbbfd9b3f" exitCode=0 Dec 03 08:59:55 crc kubenswrapper[4475]: I1203 08:59:55.418572 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-v5zfv" event={"ID":"79353e59-e700-4bdf-abbf-07d051fc8409","Type":"ContainerDied","Data":"9c1c639e04505ea02b4770a57b225ebdaec5ecf30375dfb1ddc5e84dbbfd9b3f"} Dec 03 08:59:56 crc kubenswrapper[4475]: I1203 08:59:56.429775 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-v5zfv" event={"ID":"79353e59-e700-4bdf-abbf-07d051fc8409","Type":"ContainerStarted","Data":"f9ee1299e7008f54d24eb69f6b93ff00774c360c14519c3c84ce64fb27896a75"} Dec 03 08:59:56 crc kubenswrapper[4475]: I1203 08:59:56.450033 4475 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-v5zfv" podStartSLOduration=1.936843509 podStartE2EDuration="4.45000527s" podCreationTimestamp="2025-12-03 08:59:52 +0000 UTC" firstStartedPulling="2025-12-03 08:59:53.406249474 +0000 UTC m=+8078.211147808" lastFinishedPulling="2025-12-03 08:59:55.919411236 +0000 UTC m=+8080.724309569" observedRunningTime="2025-12-03 08:59:56.449145253 +0000 UTC m=+8081.254043587" watchObservedRunningTime="2025-12-03 08:59:56.45000527 +0000 UTC m=+8081.254903604" Dec 03 09:00:00 crc kubenswrapper[4475]: I1203 09:00:00.197821 4475 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29412540-9mxxp"] Dec 03 09:00:00 crc kubenswrapper[4475]: I1203 09:00:00.200348 4475 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29412540-9mxxp" Dec 03 09:00:00 crc kubenswrapper[4475]: I1203 09:00:00.211506 4475 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29412540-9mxxp"] Dec 03 09:00:00 crc kubenswrapper[4475]: I1203 09:00:00.212531 4475 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Dec 03 09:00:00 crc kubenswrapper[4475]: I1203 09:00:00.212628 4475 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Dec 03 09:00:00 crc kubenswrapper[4475]: I1203 09:00:00.239577 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/6f080c66-b9b0-4d72-9af2-90c5ae5d8fff-secret-volume\") pod \"collect-profiles-29412540-9mxxp\" (UID: \"6f080c66-b9b0-4d72-9af2-90c5ae5d8fff\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29412540-9mxxp" Dec 03 09:00:00 crc kubenswrapper[4475]: I1203 09:00:00.239753 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-29brh\" (UniqueName: \"kubernetes.io/projected/6f080c66-b9b0-4d72-9af2-90c5ae5d8fff-kube-api-access-29brh\") pod \"collect-profiles-29412540-9mxxp\" (UID: \"6f080c66-b9b0-4d72-9af2-90c5ae5d8fff\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29412540-9mxxp" Dec 03 09:00:00 crc kubenswrapper[4475]: I1203 09:00:00.240077 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/6f080c66-b9b0-4d72-9af2-90c5ae5d8fff-config-volume\") pod \"collect-profiles-29412540-9mxxp\" (UID: \"6f080c66-b9b0-4d72-9af2-90c5ae5d8fff\") " 
pod="openshift-operator-lifecycle-manager/collect-profiles-29412540-9mxxp" Dec 03 09:00:00 crc kubenswrapper[4475]: I1203 09:00:00.341670 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/6f080c66-b9b0-4d72-9af2-90c5ae5d8fff-secret-volume\") pod \"collect-profiles-29412540-9mxxp\" (UID: \"6f080c66-b9b0-4d72-9af2-90c5ae5d8fff\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29412540-9mxxp" Dec 03 09:00:00 crc kubenswrapper[4475]: I1203 09:00:00.341764 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-29brh\" (UniqueName: \"kubernetes.io/projected/6f080c66-b9b0-4d72-9af2-90c5ae5d8fff-kube-api-access-29brh\") pod \"collect-profiles-29412540-9mxxp\" (UID: \"6f080c66-b9b0-4d72-9af2-90c5ae5d8fff\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29412540-9mxxp" Dec 03 09:00:00 crc kubenswrapper[4475]: I1203 09:00:00.341842 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/6f080c66-b9b0-4d72-9af2-90c5ae5d8fff-config-volume\") pod \"collect-profiles-29412540-9mxxp\" (UID: \"6f080c66-b9b0-4d72-9af2-90c5ae5d8fff\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29412540-9mxxp" Dec 03 09:00:00 crc kubenswrapper[4475]: I1203 09:00:00.342760 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/6f080c66-b9b0-4d72-9af2-90c5ae5d8fff-config-volume\") pod \"collect-profiles-29412540-9mxxp\" (UID: \"6f080c66-b9b0-4d72-9af2-90c5ae5d8fff\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29412540-9mxxp" Dec 03 09:00:00 crc kubenswrapper[4475]: I1203 09:00:00.352847 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: 
\"kubernetes.io/secret/6f080c66-b9b0-4d72-9af2-90c5ae5d8fff-secret-volume\") pod \"collect-profiles-29412540-9mxxp\" (UID: \"6f080c66-b9b0-4d72-9af2-90c5ae5d8fff\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29412540-9mxxp" Dec 03 09:00:00 crc kubenswrapper[4475]: I1203 09:00:00.356840 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-29brh\" (UniqueName: \"kubernetes.io/projected/6f080c66-b9b0-4d72-9af2-90c5ae5d8fff-kube-api-access-29brh\") pod \"collect-profiles-29412540-9mxxp\" (UID: \"6f080c66-b9b0-4d72-9af2-90c5ae5d8fff\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29412540-9mxxp" Dec 03 09:00:00 crc kubenswrapper[4475]: I1203 09:00:00.522689 4475 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29412540-9mxxp" Dec 03 09:00:01 crc kubenswrapper[4475]: I1203 09:00:01.041744 4475 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29412540-9mxxp"] Dec 03 09:00:01 crc kubenswrapper[4475]: W1203 09:00:01.047822 4475 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6f080c66_b9b0_4d72_9af2_90c5ae5d8fff.slice/crio-559a0cce8485b5b11aa20e4995fbfb9e10b7347be79e1d9d6e57f2631d99b79e WatchSource:0}: Error finding container 559a0cce8485b5b11aa20e4995fbfb9e10b7347be79e1d9d6e57f2631d99b79e: Status 404 returned error can't find the container with id 559a0cce8485b5b11aa20e4995fbfb9e10b7347be79e1d9d6e57f2631d99b79e Dec 03 09:00:01 crc kubenswrapper[4475]: I1203 09:00:01.476643 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29412540-9mxxp" event={"ID":"6f080c66-b9b0-4d72-9af2-90c5ae5d8fff","Type":"ContainerStarted","Data":"240dbb8b098b6b3ba1f349c48ccadf6daef0df668aebd4541b5f936e8a4d1df0"} Dec 03 09:00:01 crc 
kubenswrapper[4475]: I1203 09:00:01.476727 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29412540-9mxxp" event={"ID":"6f080c66-b9b0-4d72-9af2-90c5ae5d8fff","Type":"ContainerStarted","Data":"559a0cce8485b5b11aa20e4995fbfb9e10b7347be79e1d9d6e57f2631d99b79e"} Dec 03 09:00:02 crc kubenswrapper[4475]: I1203 09:00:02.485263 4475 generic.go:334] "Generic (PLEG): container finished" podID="6f080c66-b9b0-4d72-9af2-90c5ae5d8fff" containerID="240dbb8b098b6b3ba1f349c48ccadf6daef0df668aebd4541b5f936e8a4d1df0" exitCode=0 Dec 03 09:00:02 crc kubenswrapper[4475]: I1203 09:00:02.485465 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29412540-9mxxp" event={"ID":"6f080c66-b9b0-4d72-9af2-90c5ae5d8fff","Type":"ContainerDied","Data":"240dbb8b098b6b3ba1f349c48ccadf6daef0df668aebd4541b5f936e8a4d1df0"} Dec 03 09:00:02 crc kubenswrapper[4475]: I1203 09:00:02.565529 4475 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-v5zfv" Dec 03 09:00:02 crc kubenswrapper[4475]: I1203 09:00:02.565586 4475 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-v5zfv" Dec 03 09:00:02 crc kubenswrapper[4475]: I1203 09:00:02.607497 4475 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-v5zfv" Dec 03 09:00:03 crc kubenswrapper[4475]: I1203 09:00:03.530660 4475 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-v5zfv" Dec 03 09:00:03 crc kubenswrapper[4475]: I1203 09:00:03.566986 4475 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-v5zfv"] Dec 03 09:00:03 crc kubenswrapper[4475]: I1203 09:00:03.807062 4475 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29412540-9mxxp" Dec 03 09:00:03 crc kubenswrapper[4475]: I1203 09:00:03.927335 4475 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-29brh\" (UniqueName: \"kubernetes.io/projected/6f080c66-b9b0-4d72-9af2-90c5ae5d8fff-kube-api-access-29brh\") pod \"6f080c66-b9b0-4d72-9af2-90c5ae5d8fff\" (UID: \"6f080c66-b9b0-4d72-9af2-90c5ae5d8fff\") " Dec 03 09:00:03 crc kubenswrapper[4475]: I1203 09:00:03.927658 4475 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/6f080c66-b9b0-4d72-9af2-90c5ae5d8fff-secret-volume\") pod \"6f080c66-b9b0-4d72-9af2-90c5ae5d8fff\" (UID: \"6f080c66-b9b0-4d72-9af2-90c5ae5d8fff\") " Dec 03 09:00:03 crc kubenswrapper[4475]: I1203 09:00:03.927767 4475 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/6f080c66-b9b0-4d72-9af2-90c5ae5d8fff-config-volume\") pod \"6f080c66-b9b0-4d72-9af2-90c5ae5d8fff\" (UID: \"6f080c66-b9b0-4d72-9af2-90c5ae5d8fff\") " Dec 03 09:00:03 crc kubenswrapper[4475]: I1203 09:00:03.928699 4475 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6f080c66-b9b0-4d72-9af2-90c5ae5d8fff-config-volume" (OuterVolumeSpecName: "config-volume") pod "6f080c66-b9b0-4d72-9af2-90c5ae5d8fff" (UID: "6f080c66-b9b0-4d72-9af2-90c5ae5d8fff"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 09:00:03 crc kubenswrapper[4475]: I1203 09:00:03.934096 4475 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6f080c66-b9b0-4d72-9af2-90c5ae5d8fff-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "6f080c66-b9b0-4d72-9af2-90c5ae5d8fff" (UID: "6f080c66-b9b0-4d72-9af2-90c5ae5d8fff"). InnerVolumeSpecName "secret-volume". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 09:00:03 crc kubenswrapper[4475]: I1203 09:00:03.934939 4475 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6f080c66-b9b0-4d72-9af2-90c5ae5d8fff-kube-api-access-29brh" (OuterVolumeSpecName: "kube-api-access-29brh") pod "6f080c66-b9b0-4d72-9af2-90c5ae5d8fff" (UID: "6f080c66-b9b0-4d72-9af2-90c5ae5d8fff"). InnerVolumeSpecName "kube-api-access-29brh". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 09:00:04 crc kubenswrapper[4475]: I1203 09:00:04.030081 4475 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/6f080c66-b9b0-4d72-9af2-90c5ae5d8fff-config-volume\") on node \"crc\" DevicePath \"\"" Dec 03 09:00:04 crc kubenswrapper[4475]: I1203 09:00:04.030111 4475 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-29brh\" (UniqueName: \"kubernetes.io/projected/6f080c66-b9b0-4d72-9af2-90c5ae5d8fff-kube-api-access-29brh\") on node \"crc\" DevicePath \"\"" Dec 03 09:00:04 crc kubenswrapper[4475]: I1203 09:00:04.030125 4475 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/6f080c66-b9b0-4d72-9af2-90c5ae5d8fff-secret-volume\") on node \"crc\" DevicePath \"\"" Dec 03 09:00:04 crc kubenswrapper[4475]: I1203 09:00:04.502213 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29412540-9mxxp" event={"ID":"6f080c66-b9b0-4d72-9af2-90c5ae5d8fff","Type":"ContainerDied","Data":"559a0cce8485b5b11aa20e4995fbfb9e10b7347be79e1d9d6e57f2631d99b79e"} Dec 03 09:00:04 crc kubenswrapper[4475]: I1203 09:00:04.502400 4475 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="559a0cce8485b5b11aa20e4995fbfb9e10b7347be79e1d9d6e57f2631d99b79e" Dec 03 09:00:04 crc kubenswrapper[4475]: I1203 09:00:04.502245 4475 util.go:48] "No ready sandbox for pod can be 
found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29412540-9mxxp" Dec 03 09:00:04 crc kubenswrapper[4475]: I1203 09:00:04.580655 4475 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29412495-vfswv"] Dec 03 09:00:04 crc kubenswrapper[4475]: I1203 09:00:04.588152 4475 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29412495-vfswv"] Dec 03 09:00:05 crc kubenswrapper[4475]: I1203 09:00:05.501549 4475 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7eaed4ee-f926-4fce-a220-dd941a464f6c" path="/var/lib/kubelet/pods/7eaed4ee-f926-4fce-a220-dd941a464f6c/volumes" Dec 03 09:00:05 crc kubenswrapper[4475]: I1203 09:00:05.509733 4475 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-v5zfv" podUID="79353e59-e700-4bdf-abbf-07d051fc8409" containerName="registry-server" containerID="cri-o://f9ee1299e7008f54d24eb69f6b93ff00774c360c14519c3c84ce64fb27896a75" gracePeriod=2 Dec 03 09:00:05 crc kubenswrapper[4475]: I1203 09:00:05.883852 4475 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-v5zfv" Dec 03 09:00:06 crc kubenswrapper[4475]: I1203 09:00:06.076152 4475 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/79353e59-e700-4bdf-abbf-07d051fc8409-catalog-content\") pod \"79353e59-e700-4bdf-abbf-07d051fc8409\" (UID: \"79353e59-e700-4bdf-abbf-07d051fc8409\") " Dec 03 09:00:06 crc kubenswrapper[4475]: I1203 09:00:06.076893 4475 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rrt7c\" (UniqueName: \"kubernetes.io/projected/79353e59-e700-4bdf-abbf-07d051fc8409-kube-api-access-rrt7c\") pod \"79353e59-e700-4bdf-abbf-07d051fc8409\" (UID: \"79353e59-e700-4bdf-abbf-07d051fc8409\") " Dec 03 09:00:06 crc kubenswrapper[4475]: I1203 09:00:06.077091 4475 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/79353e59-e700-4bdf-abbf-07d051fc8409-utilities\") pod \"79353e59-e700-4bdf-abbf-07d051fc8409\" (UID: \"79353e59-e700-4bdf-abbf-07d051fc8409\") " Dec 03 09:00:06 crc kubenswrapper[4475]: I1203 09:00:06.077539 4475 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/79353e59-e700-4bdf-abbf-07d051fc8409-utilities" (OuterVolumeSpecName: "utilities") pod "79353e59-e700-4bdf-abbf-07d051fc8409" (UID: "79353e59-e700-4bdf-abbf-07d051fc8409"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 09:00:06 crc kubenswrapper[4475]: I1203 09:00:06.078714 4475 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/79353e59-e700-4bdf-abbf-07d051fc8409-utilities\") on node \"crc\" DevicePath \"\"" Dec 03 09:00:06 crc kubenswrapper[4475]: I1203 09:00:06.083941 4475 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/79353e59-e700-4bdf-abbf-07d051fc8409-kube-api-access-rrt7c" (OuterVolumeSpecName: "kube-api-access-rrt7c") pod "79353e59-e700-4bdf-abbf-07d051fc8409" (UID: "79353e59-e700-4bdf-abbf-07d051fc8409"). InnerVolumeSpecName "kube-api-access-rrt7c". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 09:00:06 crc kubenswrapper[4475]: I1203 09:00:06.130583 4475 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/79353e59-e700-4bdf-abbf-07d051fc8409-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "79353e59-e700-4bdf-abbf-07d051fc8409" (UID: "79353e59-e700-4bdf-abbf-07d051fc8409"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 09:00:06 crc kubenswrapper[4475]: I1203 09:00:06.180289 4475 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/79353e59-e700-4bdf-abbf-07d051fc8409-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 03 09:00:06 crc kubenswrapper[4475]: I1203 09:00:06.180324 4475 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rrt7c\" (UniqueName: \"kubernetes.io/projected/79353e59-e700-4bdf-abbf-07d051fc8409-kube-api-access-rrt7c\") on node \"crc\" DevicePath \"\"" Dec 03 09:00:06 crc kubenswrapper[4475]: I1203 09:00:06.519494 4475 generic.go:334] "Generic (PLEG): container finished" podID="79353e59-e700-4bdf-abbf-07d051fc8409" containerID="f9ee1299e7008f54d24eb69f6b93ff00774c360c14519c3c84ce64fb27896a75" exitCode=0 Dec 03 09:00:06 crc kubenswrapper[4475]: I1203 09:00:06.519541 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-v5zfv" event={"ID":"79353e59-e700-4bdf-abbf-07d051fc8409","Type":"ContainerDied","Data":"f9ee1299e7008f54d24eb69f6b93ff00774c360c14519c3c84ce64fb27896a75"} Dec 03 09:00:06 crc kubenswrapper[4475]: I1203 09:00:06.519555 4475 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-v5zfv" Dec 03 09:00:06 crc kubenswrapper[4475]: I1203 09:00:06.519581 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-v5zfv" event={"ID":"79353e59-e700-4bdf-abbf-07d051fc8409","Type":"ContainerDied","Data":"924e8043e2eebf3d1693819fbfc6fd47928fd261ae10f49e775bd76ad887166c"} Dec 03 09:00:06 crc kubenswrapper[4475]: I1203 09:00:06.519602 4475 scope.go:117] "RemoveContainer" containerID="f9ee1299e7008f54d24eb69f6b93ff00774c360c14519c3c84ce64fb27896a75" Dec 03 09:00:06 crc kubenswrapper[4475]: I1203 09:00:06.545571 4475 scope.go:117] "RemoveContainer" containerID="9c1c639e04505ea02b4770a57b225ebdaec5ecf30375dfb1ddc5e84dbbfd9b3f" Dec 03 09:00:06 crc kubenswrapper[4475]: I1203 09:00:06.553539 4475 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-v5zfv"] Dec 03 09:00:06 crc kubenswrapper[4475]: I1203 09:00:06.561834 4475 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-v5zfv"] Dec 03 09:00:06 crc kubenswrapper[4475]: I1203 09:00:06.587429 4475 scope.go:117] "RemoveContainer" containerID="6db11d406a3562bc2720c791325a9a86902a4608a19b3653fa25d39dab98521e" Dec 03 09:00:06 crc kubenswrapper[4475]: I1203 09:00:06.605137 4475 scope.go:117] "RemoveContainer" containerID="f9ee1299e7008f54d24eb69f6b93ff00774c360c14519c3c84ce64fb27896a75" Dec 03 09:00:06 crc kubenswrapper[4475]: E1203 09:00:06.605653 4475 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f9ee1299e7008f54d24eb69f6b93ff00774c360c14519c3c84ce64fb27896a75\": container with ID starting with f9ee1299e7008f54d24eb69f6b93ff00774c360c14519c3c84ce64fb27896a75 not found: ID does not exist" containerID="f9ee1299e7008f54d24eb69f6b93ff00774c360c14519c3c84ce64fb27896a75" Dec 03 09:00:06 crc kubenswrapper[4475]: I1203 09:00:06.605684 4475 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f9ee1299e7008f54d24eb69f6b93ff00774c360c14519c3c84ce64fb27896a75"} err="failed to get container status \"f9ee1299e7008f54d24eb69f6b93ff00774c360c14519c3c84ce64fb27896a75\": rpc error: code = NotFound desc = could not find container \"f9ee1299e7008f54d24eb69f6b93ff00774c360c14519c3c84ce64fb27896a75\": container with ID starting with f9ee1299e7008f54d24eb69f6b93ff00774c360c14519c3c84ce64fb27896a75 not found: ID does not exist" Dec 03 09:00:06 crc kubenswrapper[4475]: I1203 09:00:06.605706 4475 scope.go:117] "RemoveContainer" containerID="9c1c639e04505ea02b4770a57b225ebdaec5ecf30375dfb1ddc5e84dbbfd9b3f" Dec 03 09:00:06 crc kubenswrapper[4475]: E1203 09:00:06.606126 4475 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9c1c639e04505ea02b4770a57b225ebdaec5ecf30375dfb1ddc5e84dbbfd9b3f\": container with ID starting with 9c1c639e04505ea02b4770a57b225ebdaec5ecf30375dfb1ddc5e84dbbfd9b3f not found: ID does not exist" containerID="9c1c639e04505ea02b4770a57b225ebdaec5ecf30375dfb1ddc5e84dbbfd9b3f" Dec 03 09:00:06 crc kubenswrapper[4475]: I1203 09:00:06.606170 4475 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9c1c639e04505ea02b4770a57b225ebdaec5ecf30375dfb1ddc5e84dbbfd9b3f"} err="failed to get container status \"9c1c639e04505ea02b4770a57b225ebdaec5ecf30375dfb1ddc5e84dbbfd9b3f\": rpc error: code = NotFound desc = could not find container \"9c1c639e04505ea02b4770a57b225ebdaec5ecf30375dfb1ddc5e84dbbfd9b3f\": container with ID starting with 9c1c639e04505ea02b4770a57b225ebdaec5ecf30375dfb1ddc5e84dbbfd9b3f not found: ID does not exist" Dec 03 09:00:06 crc kubenswrapper[4475]: I1203 09:00:06.606196 4475 scope.go:117] "RemoveContainer" containerID="6db11d406a3562bc2720c791325a9a86902a4608a19b3653fa25d39dab98521e" Dec 03 09:00:06 crc kubenswrapper[4475]: E1203 
09:00:06.606565 4475 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6db11d406a3562bc2720c791325a9a86902a4608a19b3653fa25d39dab98521e\": container with ID starting with 6db11d406a3562bc2720c791325a9a86902a4608a19b3653fa25d39dab98521e not found: ID does not exist" containerID="6db11d406a3562bc2720c791325a9a86902a4608a19b3653fa25d39dab98521e" Dec 03 09:00:06 crc kubenswrapper[4475]: I1203 09:00:06.606594 4475 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6db11d406a3562bc2720c791325a9a86902a4608a19b3653fa25d39dab98521e"} err="failed to get container status \"6db11d406a3562bc2720c791325a9a86902a4608a19b3653fa25d39dab98521e\": rpc error: code = NotFound desc = could not find container \"6db11d406a3562bc2720c791325a9a86902a4608a19b3653fa25d39dab98521e\": container with ID starting with 6db11d406a3562bc2720c791325a9a86902a4608a19b3653fa25d39dab98521e not found: ID does not exist" Dec 03 09:00:07 crc kubenswrapper[4475]: I1203 09:00:07.500181 4475 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="79353e59-e700-4bdf-abbf-07d051fc8409" path="/var/lib/kubelet/pods/79353e59-e700-4bdf-abbf-07d051fc8409/volumes" Dec 03 09:00:28 crc kubenswrapper[4475]: I1203 09:00:28.933394 4475 patch_prober.go:28] interesting pod/machine-config-daemon-tjbzg container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 03 09:00:28 crc kubenswrapper[4475]: I1203 09:00:28.933816 4475 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-tjbzg" podUID="91aee7be-4a52-4598-803f-2deebe0674de" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection 
refused" Dec 03 09:00:31 crc kubenswrapper[4475]: I1203 09:00:31.634770 4475 scope.go:117] "RemoveContainer" containerID="8936dc26750955769a4c771137e77ad6db46e9b8fc4f0b236b839242ae8d2a4f" Dec 03 09:00:58 crc kubenswrapper[4475]: I1203 09:00:58.933322 4475 patch_prober.go:28] interesting pod/machine-config-daemon-tjbzg container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 03 09:00:58 crc kubenswrapper[4475]: I1203 09:00:58.933963 4475 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-tjbzg" podUID="91aee7be-4a52-4598-803f-2deebe0674de" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 03 09:01:00 crc kubenswrapper[4475]: I1203 09:01:00.155206 4475 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-cron-29412541-b5hgg"] Dec 03 09:01:00 crc kubenswrapper[4475]: E1203 09:01:00.155576 4475 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6f080c66-b9b0-4d72-9af2-90c5ae5d8fff" containerName="collect-profiles" Dec 03 09:01:00 crc kubenswrapper[4475]: I1203 09:01:00.155590 4475 state_mem.go:107] "Deleted CPUSet assignment" podUID="6f080c66-b9b0-4d72-9af2-90c5ae5d8fff" containerName="collect-profiles" Dec 03 09:01:00 crc kubenswrapper[4475]: E1203 09:01:00.155603 4475 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="79353e59-e700-4bdf-abbf-07d051fc8409" containerName="extract-content" Dec 03 09:01:00 crc kubenswrapper[4475]: I1203 09:01:00.155609 4475 state_mem.go:107] "Deleted CPUSet assignment" podUID="79353e59-e700-4bdf-abbf-07d051fc8409" containerName="extract-content" Dec 03 09:01:00 crc kubenswrapper[4475]: E1203 09:01:00.155627 4475 cpu_manager.go:410] "RemoveStaleState: 
removing container" podUID="79353e59-e700-4bdf-abbf-07d051fc8409" containerName="extract-utilities" Dec 03 09:01:00 crc kubenswrapper[4475]: I1203 09:01:00.155633 4475 state_mem.go:107] "Deleted CPUSet assignment" podUID="79353e59-e700-4bdf-abbf-07d051fc8409" containerName="extract-utilities" Dec 03 09:01:00 crc kubenswrapper[4475]: E1203 09:01:00.155653 4475 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="79353e59-e700-4bdf-abbf-07d051fc8409" containerName="registry-server" Dec 03 09:01:00 crc kubenswrapper[4475]: I1203 09:01:00.155659 4475 state_mem.go:107] "Deleted CPUSet assignment" podUID="79353e59-e700-4bdf-abbf-07d051fc8409" containerName="registry-server" Dec 03 09:01:00 crc kubenswrapper[4475]: I1203 09:01:00.155843 4475 memory_manager.go:354] "RemoveStaleState removing state" podUID="79353e59-e700-4bdf-abbf-07d051fc8409" containerName="registry-server" Dec 03 09:01:00 crc kubenswrapper[4475]: I1203 09:01:00.155860 4475 memory_manager.go:354] "RemoveStaleState removing state" podUID="6f080c66-b9b0-4d72-9af2-90c5ae5d8fff" containerName="collect-profiles" Dec 03 09:01:00 crc kubenswrapper[4475]: I1203 09:01:00.156522 4475 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-cron-29412541-b5hgg" Dec 03 09:01:00 crc kubenswrapper[4475]: I1203 09:01:00.165387 4475 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-cron-29412541-b5hgg"] Dec 03 09:01:00 crc kubenswrapper[4475]: I1203 09:01:00.345099 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/73d28f65-1b41-47d2-8f4d-0db7234ae4e9-combined-ca-bundle\") pod \"keystone-cron-29412541-b5hgg\" (UID: \"73d28f65-1b41-47d2-8f4d-0db7234ae4e9\") " pod="openstack/keystone-cron-29412541-b5hgg" Dec 03 09:01:00 crc kubenswrapper[4475]: I1203 09:01:00.345229 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j8sb6\" (UniqueName: \"kubernetes.io/projected/73d28f65-1b41-47d2-8f4d-0db7234ae4e9-kube-api-access-j8sb6\") pod \"keystone-cron-29412541-b5hgg\" (UID: \"73d28f65-1b41-47d2-8f4d-0db7234ae4e9\") " pod="openstack/keystone-cron-29412541-b5hgg" Dec 03 09:01:00 crc kubenswrapper[4475]: I1203 09:01:00.345284 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/73d28f65-1b41-47d2-8f4d-0db7234ae4e9-fernet-keys\") pod \"keystone-cron-29412541-b5hgg\" (UID: \"73d28f65-1b41-47d2-8f4d-0db7234ae4e9\") " pod="openstack/keystone-cron-29412541-b5hgg" Dec 03 09:01:00 crc kubenswrapper[4475]: I1203 09:01:00.345333 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/73d28f65-1b41-47d2-8f4d-0db7234ae4e9-config-data\") pod \"keystone-cron-29412541-b5hgg\" (UID: \"73d28f65-1b41-47d2-8f4d-0db7234ae4e9\") " pod="openstack/keystone-cron-29412541-b5hgg" Dec 03 09:01:00 crc kubenswrapper[4475]: I1203 09:01:00.446926 4475 reconciler_common.go:218] "operationExecutor.MountVolume started 
for volume \"kube-api-access-j8sb6\" (UniqueName: \"kubernetes.io/projected/73d28f65-1b41-47d2-8f4d-0db7234ae4e9-kube-api-access-j8sb6\") pod \"keystone-cron-29412541-b5hgg\" (UID: \"73d28f65-1b41-47d2-8f4d-0db7234ae4e9\") " pod="openstack/keystone-cron-29412541-b5hgg" Dec 03 09:01:00 crc kubenswrapper[4475]: I1203 09:01:00.446998 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/73d28f65-1b41-47d2-8f4d-0db7234ae4e9-fernet-keys\") pod \"keystone-cron-29412541-b5hgg\" (UID: \"73d28f65-1b41-47d2-8f4d-0db7234ae4e9\") " pod="openstack/keystone-cron-29412541-b5hgg" Dec 03 09:01:00 crc kubenswrapper[4475]: I1203 09:01:00.447022 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/73d28f65-1b41-47d2-8f4d-0db7234ae4e9-config-data\") pod \"keystone-cron-29412541-b5hgg\" (UID: \"73d28f65-1b41-47d2-8f4d-0db7234ae4e9\") " pod="openstack/keystone-cron-29412541-b5hgg" Dec 03 09:01:00 crc kubenswrapper[4475]: I1203 09:01:00.447159 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/73d28f65-1b41-47d2-8f4d-0db7234ae4e9-combined-ca-bundle\") pod \"keystone-cron-29412541-b5hgg\" (UID: \"73d28f65-1b41-47d2-8f4d-0db7234ae4e9\") " pod="openstack/keystone-cron-29412541-b5hgg" Dec 03 09:01:00 crc kubenswrapper[4475]: I1203 09:01:00.455389 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/73d28f65-1b41-47d2-8f4d-0db7234ae4e9-combined-ca-bundle\") pod \"keystone-cron-29412541-b5hgg\" (UID: \"73d28f65-1b41-47d2-8f4d-0db7234ae4e9\") " pod="openstack/keystone-cron-29412541-b5hgg" Dec 03 09:01:00 crc kubenswrapper[4475]: I1203 09:01:00.455511 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: 
\"kubernetes.io/secret/73d28f65-1b41-47d2-8f4d-0db7234ae4e9-fernet-keys\") pod \"keystone-cron-29412541-b5hgg\" (UID: \"73d28f65-1b41-47d2-8f4d-0db7234ae4e9\") " pod="openstack/keystone-cron-29412541-b5hgg" Dec 03 09:01:00 crc kubenswrapper[4475]: I1203 09:01:00.456681 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/73d28f65-1b41-47d2-8f4d-0db7234ae4e9-config-data\") pod \"keystone-cron-29412541-b5hgg\" (UID: \"73d28f65-1b41-47d2-8f4d-0db7234ae4e9\") " pod="openstack/keystone-cron-29412541-b5hgg" Dec 03 09:01:00 crc kubenswrapper[4475]: I1203 09:01:00.462210 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-j8sb6\" (UniqueName: \"kubernetes.io/projected/73d28f65-1b41-47d2-8f4d-0db7234ae4e9-kube-api-access-j8sb6\") pod \"keystone-cron-29412541-b5hgg\" (UID: \"73d28f65-1b41-47d2-8f4d-0db7234ae4e9\") " pod="openstack/keystone-cron-29412541-b5hgg" Dec 03 09:01:00 crc kubenswrapper[4475]: I1203 09:01:00.480133 4475 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-cron-29412541-b5hgg" Dec 03 09:01:00 crc kubenswrapper[4475]: I1203 09:01:00.901389 4475 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-cron-29412541-b5hgg"] Dec 03 09:01:00 crc kubenswrapper[4475]: I1203 09:01:00.935058 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29412541-b5hgg" event={"ID":"73d28f65-1b41-47d2-8f4d-0db7234ae4e9","Type":"ContainerStarted","Data":"57929b5bcba6ee046358853c569e8c3cefc2326d1cdbdfceba1764e6e36e46e9"} Dec 03 09:01:01 crc kubenswrapper[4475]: I1203 09:01:01.956029 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29412541-b5hgg" event={"ID":"73d28f65-1b41-47d2-8f4d-0db7234ae4e9","Type":"ContainerStarted","Data":"29c3a6a496725f8ffbbeb85e9723a7294add5d6c59c0be1cb84ebce15be13412"} Dec 03 09:01:01 crc kubenswrapper[4475]: I1203 09:01:01.979723 4475 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-cron-29412541-b5hgg" podStartSLOduration=1.979701081 podStartE2EDuration="1.979701081s" podCreationTimestamp="2025-12-03 09:01:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 09:01:01.977425953 +0000 UTC m=+8146.782324297" watchObservedRunningTime="2025-12-03 09:01:01.979701081 +0000 UTC m=+8146.784599415" Dec 03 09:01:03 crc kubenswrapper[4475]: I1203 09:01:03.975377 4475 generic.go:334] "Generic (PLEG): container finished" podID="73d28f65-1b41-47d2-8f4d-0db7234ae4e9" containerID="29c3a6a496725f8ffbbeb85e9723a7294add5d6c59c0be1cb84ebce15be13412" exitCode=0 Dec 03 09:01:03 crc kubenswrapper[4475]: I1203 09:01:03.975478 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29412541-b5hgg" 
event={"ID":"73d28f65-1b41-47d2-8f4d-0db7234ae4e9","Type":"ContainerDied","Data":"29c3a6a496725f8ffbbeb85e9723a7294add5d6c59c0be1cb84ebce15be13412"} Dec 03 09:01:05 crc kubenswrapper[4475]: I1203 09:01:05.334336 4475 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-cron-29412541-b5hgg" Dec 03 09:01:05 crc kubenswrapper[4475]: I1203 09:01:05.352162 4475 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/73d28f65-1b41-47d2-8f4d-0db7234ae4e9-fernet-keys\") pod \"73d28f65-1b41-47d2-8f4d-0db7234ae4e9\" (UID: \"73d28f65-1b41-47d2-8f4d-0db7234ae4e9\") " Dec 03 09:01:05 crc kubenswrapper[4475]: I1203 09:01:05.359128 4475 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/73d28f65-1b41-47d2-8f4d-0db7234ae4e9-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "73d28f65-1b41-47d2-8f4d-0db7234ae4e9" (UID: "73d28f65-1b41-47d2-8f4d-0db7234ae4e9"). InnerVolumeSpecName "fernet-keys". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 09:01:05 crc kubenswrapper[4475]: I1203 09:01:05.453702 4475 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-j8sb6\" (UniqueName: \"kubernetes.io/projected/73d28f65-1b41-47d2-8f4d-0db7234ae4e9-kube-api-access-j8sb6\") pod \"73d28f65-1b41-47d2-8f4d-0db7234ae4e9\" (UID: \"73d28f65-1b41-47d2-8f4d-0db7234ae4e9\") " Dec 03 09:01:05 crc kubenswrapper[4475]: I1203 09:01:05.454041 4475 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/73d28f65-1b41-47d2-8f4d-0db7234ae4e9-combined-ca-bundle\") pod \"73d28f65-1b41-47d2-8f4d-0db7234ae4e9\" (UID: \"73d28f65-1b41-47d2-8f4d-0db7234ae4e9\") " Dec 03 09:01:05 crc kubenswrapper[4475]: I1203 09:01:05.454169 4475 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/73d28f65-1b41-47d2-8f4d-0db7234ae4e9-config-data\") pod \"73d28f65-1b41-47d2-8f4d-0db7234ae4e9\" (UID: \"73d28f65-1b41-47d2-8f4d-0db7234ae4e9\") " Dec 03 09:01:05 crc kubenswrapper[4475]: I1203 09:01:05.454527 4475 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/73d28f65-1b41-47d2-8f4d-0db7234ae4e9-fernet-keys\") on node \"crc\" DevicePath \"\"" Dec 03 09:01:05 crc kubenswrapper[4475]: I1203 09:01:05.457215 4475 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/73d28f65-1b41-47d2-8f4d-0db7234ae4e9-kube-api-access-j8sb6" (OuterVolumeSpecName: "kube-api-access-j8sb6") pod "73d28f65-1b41-47d2-8f4d-0db7234ae4e9" (UID: "73d28f65-1b41-47d2-8f4d-0db7234ae4e9"). InnerVolumeSpecName "kube-api-access-j8sb6". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 09:01:05 crc kubenswrapper[4475]: I1203 09:01:05.475999 4475 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/73d28f65-1b41-47d2-8f4d-0db7234ae4e9-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "73d28f65-1b41-47d2-8f4d-0db7234ae4e9" (UID: "73d28f65-1b41-47d2-8f4d-0db7234ae4e9"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 09:01:05 crc kubenswrapper[4475]: I1203 09:01:05.490126 4475 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/73d28f65-1b41-47d2-8f4d-0db7234ae4e9-config-data" (OuterVolumeSpecName: "config-data") pod "73d28f65-1b41-47d2-8f4d-0db7234ae4e9" (UID: "73d28f65-1b41-47d2-8f4d-0db7234ae4e9"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 09:01:05 crc kubenswrapper[4475]: I1203 09:01:05.557409 4475 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-j8sb6\" (UniqueName: \"kubernetes.io/projected/73d28f65-1b41-47d2-8f4d-0db7234ae4e9-kube-api-access-j8sb6\") on node \"crc\" DevicePath \"\"" Dec 03 09:01:05 crc kubenswrapper[4475]: I1203 09:01:05.557493 4475 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/73d28f65-1b41-47d2-8f4d-0db7234ae4e9-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 03 09:01:05 crc kubenswrapper[4475]: I1203 09:01:05.557506 4475 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/73d28f65-1b41-47d2-8f4d-0db7234ae4e9-config-data\") on node \"crc\" DevicePath \"\"" Dec 03 09:01:05 crc kubenswrapper[4475]: I1203 09:01:05.992380 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29412541-b5hgg" 
event={"ID":"73d28f65-1b41-47d2-8f4d-0db7234ae4e9","Type":"ContainerDied","Data":"57929b5bcba6ee046358853c569e8c3cefc2326d1cdbdfceba1764e6e36e46e9"} Dec 03 09:01:05 crc kubenswrapper[4475]: I1203 09:01:05.992425 4475 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-cron-29412541-b5hgg" Dec 03 09:01:05 crc kubenswrapper[4475]: I1203 09:01:05.992430 4475 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="57929b5bcba6ee046358853c569e8c3cefc2326d1cdbdfceba1764e6e36e46e9" Dec 03 09:01:24 crc kubenswrapper[4475]: I1203 09:01:24.701971 4475 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-7g4zd"] Dec 03 09:01:24 crc kubenswrapper[4475]: E1203 09:01:24.703334 4475 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="73d28f65-1b41-47d2-8f4d-0db7234ae4e9" containerName="keystone-cron" Dec 03 09:01:24 crc kubenswrapper[4475]: I1203 09:01:24.703352 4475 state_mem.go:107] "Deleted CPUSet assignment" podUID="73d28f65-1b41-47d2-8f4d-0db7234ae4e9" containerName="keystone-cron" Dec 03 09:01:24 crc kubenswrapper[4475]: I1203 09:01:24.703654 4475 memory_manager.go:354] "RemoveStaleState removing state" podUID="73d28f65-1b41-47d2-8f4d-0db7234ae4e9" containerName="keystone-cron" Dec 03 09:01:24 crc kubenswrapper[4475]: I1203 09:01:24.706324 4475 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-7g4zd" Dec 03 09:01:24 crc kubenswrapper[4475]: I1203 09:01:24.712151 4475 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-7g4zd"] Dec 03 09:01:24 crc kubenswrapper[4475]: I1203 09:01:24.771923 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/cbeeabad-22f8-4623-b997-6336ccc2344b-utilities\") pod \"redhat-operators-7g4zd\" (UID: \"cbeeabad-22f8-4623-b997-6336ccc2344b\") " pod="openshift-marketplace/redhat-operators-7g4zd" Dec 03 09:01:24 crc kubenswrapper[4475]: I1203 09:01:24.772066 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/cbeeabad-22f8-4623-b997-6336ccc2344b-catalog-content\") pod \"redhat-operators-7g4zd\" (UID: \"cbeeabad-22f8-4623-b997-6336ccc2344b\") " pod="openshift-marketplace/redhat-operators-7g4zd" Dec 03 09:01:24 crc kubenswrapper[4475]: I1203 09:01:24.772153 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lfc7m\" (UniqueName: \"kubernetes.io/projected/cbeeabad-22f8-4623-b997-6336ccc2344b-kube-api-access-lfc7m\") pod \"redhat-operators-7g4zd\" (UID: \"cbeeabad-22f8-4623-b997-6336ccc2344b\") " pod="openshift-marketplace/redhat-operators-7g4zd" Dec 03 09:01:24 crc kubenswrapper[4475]: I1203 09:01:24.873689 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/cbeeabad-22f8-4623-b997-6336ccc2344b-utilities\") pod \"redhat-operators-7g4zd\" (UID: \"cbeeabad-22f8-4623-b997-6336ccc2344b\") " pod="openshift-marketplace/redhat-operators-7g4zd" Dec 03 09:01:24 crc kubenswrapper[4475]: I1203 09:01:24.873894 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/cbeeabad-22f8-4623-b997-6336ccc2344b-catalog-content\") pod \"redhat-operators-7g4zd\" (UID: \"cbeeabad-22f8-4623-b997-6336ccc2344b\") " pod="openshift-marketplace/redhat-operators-7g4zd" Dec 03 09:01:24 crc kubenswrapper[4475]: I1203 09:01:24.874026 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lfc7m\" (UniqueName: \"kubernetes.io/projected/cbeeabad-22f8-4623-b997-6336ccc2344b-kube-api-access-lfc7m\") pod \"redhat-operators-7g4zd\" (UID: \"cbeeabad-22f8-4623-b997-6336ccc2344b\") " pod="openshift-marketplace/redhat-operators-7g4zd" Dec 03 09:01:24 crc kubenswrapper[4475]: I1203 09:01:24.874217 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/cbeeabad-22f8-4623-b997-6336ccc2344b-utilities\") pod \"redhat-operators-7g4zd\" (UID: \"cbeeabad-22f8-4623-b997-6336ccc2344b\") " pod="openshift-marketplace/redhat-operators-7g4zd" Dec 03 09:01:24 crc kubenswrapper[4475]: I1203 09:01:24.874388 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/cbeeabad-22f8-4623-b997-6336ccc2344b-catalog-content\") pod \"redhat-operators-7g4zd\" (UID: \"cbeeabad-22f8-4623-b997-6336ccc2344b\") " pod="openshift-marketplace/redhat-operators-7g4zd" Dec 03 09:01:24 crc kubenswrapper[4475]: I1203 09:01:24.898392 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lfc7m\" (UniqueName: \"kubernetes.io/projected/cbeeabad-22f8-4623-b997-6336ccc2344b-kube-api-access-lfc7m\") pod \"redhat-operators-7g4zd\" (UID: \"cbeeabad-22f8-4623-b997-6336ccc2344b\") " pod="openshift-marketplace/redhat-operators-7g4zd" Dec 03 09:01:25 crc kubenswrapper[4475]: I1203 09:01:25.034033 4475 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-7g4zd" Dec 03 09:01:25 crc kubenswrapper[4475]: W1203 09:01:25.592147 4475 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podcbeeabad_22f8_4623_b997_6336ccc2344b.slice/crio-9313e05b5ba18102b86c30e50379d2515585ac847b9747bedde5e9721265ce7d WatchSource:0}: Error finding container 9313e05b5ba18102b86c30e50379d2515585ac847b9747bedde5e9721265ce7d: Status 404 returned error can't find the container with id 9313e05b5ba18102b86c30e50379d2515585ac847b9747bedde5e9721265ce7d Dec 03 09:01:25 crc kubenswrapper[4475]: I1203 09:01:25.595332 4475 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-7g4zd"] Dec 03 09:01:26 crc kubenswrapper[4475]: I1203 09:01:26.176377 4475 generic.go:334] "Generic (PLEG): container finished" podID="cbeeabad-22f8-4623-b997-6336ccc2344b" containerID="7bd79a346de469eafb26f96639e5b73e17144d77e907dc4fb9c6f129fa25cf56" exitCode=0 Dec 03 09:01:26 crc kubenswrapper[4475]: I1203 09:01:26.176476 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-7g4zd" event={"ID":"cbeeabad-22f8-4623-b997-6336ccc2344b","Type":"ContainerDied","Data":"7bd79a346de469eafb26f96639e5b73e17144d77e907dc4fb9c6f129fa25cf56"} Dec 03 09:01:26 crc kubenswrapper[4475]: I1203 09:01:26.176690 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-7g4zd" event={"ID":"cbeeabad-22f8-4623-b997-6336ccc2344b","Type":"ContainerStarted","Data":"9313e05b5ba18102b86c30e50379d2515585ac847b9747bedde5e9721265ce7d"} Dec 03 09:01:27 crc kubenswrapper[4475]: I1203 09:01:27.189910 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-7g4zd" 
event={"ID":"cbeeabad-22f8-4623-b997-6336ccc2344b","Type":"ContainerStarted","Data":"ba8f9a6123ea956c427801f4fab899a5dd825e0a6039db51dec88be0dcb63526"} Dec 03 09:01:28 crc kubenswrapper[4475]: I1203 09:01:28.933734 4475 patch_prober.go:28] interesting pod/machine-config-daemon-tjbzg container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 03 09:01:28 crc kubenswrapper[4475]: I1203 09:01:28.934577 4475 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-tjbzg" podUID="91aee7be-4a52-4598-803f-2deebe0674de" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 03 09:01:28 crc kubenswrapper[4475]: I1203 09:01:28.934663 4475 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-tjbzg" Dec 03 09:01:28 crc kubenswrapper[4475]: I1203 09:01:28.936129 4475 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"2558f17c1a5d43f43a255acd776f2097a10dc7f4eba4c6f2066cd9710d69b61e"} pod="openshift-machine-config-operator/machine-config-daemon-tjbzg" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Dec 03 09:01:28 crc kubenswrapper[4475]: I1203 09:01:28.936227 4475 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-tjbzg" podUID="91aee7be-4a52-4598-803f-2deebe0674de" containerName="machine-config-daemon" containerID="cri-o://2558f17c1a5d43f43a255acd776f2097a10dc7f4eba4c6f2066cd9710d69b61e" gracePeriod=600 Dec 03 09:01:29 crc kubenswrapper[4475]: E1203 09:01:29.070129 
4475 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-tjbzg_openshift-machine-config-operator(91aee7be-4a52-4598-803f-2deebe0674de)\"" pod="openshift-machine-config-operator/machine-config-daemon-tjbzg" podUID="91aee7be-4a52-4598-803f-2deebe0674de" Dec 03 09:01:29 crc kubenswrapper[4475]: I1203 09:01:29.230068 4475 generic.go:334] "Generic (PLEG): container finished" podID="91aee7be-4a52-4598-803f-2deebe0674de" containerID="2558f17c1a5d43f43a255acd776f2097a10dc7f4eba4c6f2066cd9710d69b61e" exitCode=0 Dec 03 09:01:29 crc kubenswrapper[4475]: I1203 09:01:29.230252 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-tjbzg" event={"ID":"91aee7be-4a52-4598-803f-2deebe0674de","Type":"ContainerDied","Data":"2558f17c1a5d43f43a255acd776f2097a10dc7f4eba4c6f2066cd9710d69b61e"} Dec 03 09:01:29 crc kubenswrapper[4475]: I1203 09:01:29.230357 4475 scope.go:117] "RemoveContainer" containerID="e5895b56be7ae3741bf08c7dedf21b20f02efb3fd61a0868e509423412769f8f" Dec 03 09:01:29 crc kubenswrapper[4475]: I1203 09:01:29.231328 4475 scope.go:117] "RemoveContainer" containerID="2558f17c1a5d43f43a255acd776f2097a10dc7f4eba4c6f2066cd9710d69b61e" Dec 03 09:01:29 crc kubenswrapper[4475]: E1203 09:01:29.231897 4475 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-tjbzg_openshift-machine-config-operator(91aee7be-4a52-4598-803f-2deebe0674de)\"" pod="openshift-machine-config-operator/machine-config-daemon-tjbzg" podUID="91aee7be-4a52-4598-803f-2deebe0674de" Dec 03 09:01:30 crc kubenswrapper[4475]: I1203 09:01:30.245205 4475 generic.go:334] "Generic (PLEG): container finished" 
podID="cbeeabad-22f8-4623-b997-6336ccc2344b" containerID="ba8f9a6123ea956c427801f4fab899a5dd825e0a6039db51dec88be0dcb63526" exitCode=0 Dec 03 09:01:30 crc kubenswrapper[4475]: I1203 09:01:30.245289 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-7g4zd" event={"ID":"cbeeabad-22f8-4623-b997-6336ccc2344b","Type":"ContainerDied","Data":"ba8f9a6123ea956c427801f4fab899a5dd825e0a6039db51dec88be0dcb63526"} Dec 03 09:01:31 crc kubenswrapper[4475]: I1203 09:01:31.260322 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-7g4zd" event={"ID":"cbeeabad-22f8-4623-b997-6336ccc2344b","Type":"ContainerStarted","Data":"ae2d417968e491de6a85e199b329a6da6468e7af7dbaea3fbb2823b47e9095c4"} Dec 03 09:01:31 crc kubenswrapper[4475]: I1203 09:01:31.289730 4475 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-7g4zd" podStartSLOduration=2.694920551 podStartE2EDuration="7.289708329s" podCreationTimestamp="2025-12-03 09:01:24 +0000 UTC" firstStartedPulling="2025-12-03 09:01:26.178161277 +0000 UTC m=+8170.983059611" lastFinishedPulling="2025-12-03 09:01:30.772949056 +0000 UTC m=+8175.577847389" observedRunningTime="2025-12-03 09:01:31.283095535 +0000 UTC m=+8176.087993869" watchObservedRunningTime="2025-12-03 09:01:31.289708329 +0000 UTC m=+8176.094606662" Dec 03 09:01:35 crc kubenswrapper[4475]: I1203 09:01:35.034277 4475 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-7g4zd" Dec 03 09:01:35 crc kubenswrapper[4475]: I1203 09:01:35.035094 4475 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-7g4zd" Dec 03 09:01:36 crc kubenswrapper[4475]: I1203 09:01:36.074145 4475 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-7g4zd" podUID="cbeeabad-22f8-4623-b997-6336ccc2344b" 
containerName="registry-server" probeResult="failure" output=< Dec 03 09:01:36 crc kubenswrapper[4475]: timeout: failed to connect service ":50051" within 1s Dec 03 09:01:36 crc kubenswrapper[4475]: > Dec 03 09:01:40 crc kubenswrapper[4475]: I1203 09:01:40.491149 4475 scope.go:117] "RemoveContainer" containerID="2558f17c1a5d43f43a255acd776f2097a10dc7f4eba4c6f2066cd9710d69b61e" Dec 03 09:01:40 crc kubenswrapper[4475]: E1203 09:01:40.491903 4475 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-tjbzg_openshift-machine-config-operator(91aee7be-4a52-4598-803f-2deebe0674de)\"" pod="openshift-machine-config-operator/machine-config-daemon-tjbzg" podUID="91aee7be-4a52-4598-803f-2deebe0674de" Dec 03 09:01:45 crc kubenswrapper[4475]: I1203 09:01:45.077111 4475 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-7g4zd" Dec 03 09:01:45 crc kubenswrapper[4475]: I1203 09:01:45.127276 4475 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-7g4zd" Dec 03 09:01:45 crc kubenswrapper[4475]: I1203 09:01:45.324296 4475 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-7g4zd"] Dec 03 09:01:46 crc kubenswrapper[4475]: I1203 09:01:46.411849 4475 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-7g4zd" podUID="cbeeabad-22f8-4623-b997-6336ccc2344b" containerName="registry-server" containerID="cri-o://ae2d417968e491de6a85e199b329a6da6468e7af7dbaea3fbb2823b47e9095c4" gracePeriod=2 Dec 03 09:01:47 crc kubenswrapper[4475]: I1203 09:01:47.000592 4475 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-7g4zd" Dec 03 09:01:47 crc kubenswrapper[4475]: I1203 09:01:47.003829 4475 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lfc7m\" (UniqueName: \"kubernetes.io/projected/cbeeabad-22f8-4623-b997-6336ccc2344b-kube-api-access-lfc7m\") pod \"cbeeabad-22f8-4623-b997-6336ccc2344b\" (UID: \"cbeeabad-22f8-4623-b997-6336ccc2344b\") " Dec 03 09:01:47 crc kubenswrapper[4475]: I1203 09:01:47.003944 4475 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/cbeeabad-22f8-4623-b997-6336ccc2344b-utilities\") pod \"cbeeabad-22f8-4623-b997-6336ccc2344b\" (UID: \"cbeeabad-22f8-4623-b997-6336ccc2344b\") " Dec 03 09:01:47 crc kubenswrapper[4475]: I1203 09:01:47.003976 4475 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/cbeeabad-22f8-4623-b997-6336ccc2344b-catalog-content\") pod \"cbeeabad-22f8-4623-b997-6336ccc2344b\" (UID: \"cbeeabad-22f8-4623-b997-6336ccc2344b\") " Dec 03 09:01:47 crc kubenswrapper[4475]: I1203 09:01:47.005724 4475 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/cbeeabad-22f8-4623-b997-6336ccc2344b-utilities" (OuterVolumeSpecName: "utilities") pod "cbeeabad-22f8-4623-b997-6336ccc2344b" (UID: "cbeeabad-22f8-4623-b997-6336ccc2344b"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 09:01:47 crc kubenswrapper[4475]: I1203 09:01:47.040278 4475 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cbeeabad-22f8-4623-b997-6336ccc2344b-kube-api-access-lfc7m" (OuterVolumeSpecName: "kube-api-access-lfc7m") pod "cbeeabad-22f8-4623-b997-6336ccc2344b" (UID: "cbeeabad-22f8-4623-b997-6336ccc2344b"). InnerVolumeSpecName "kube-api-access-lfc7m". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 09:01:47 crc kubenswrapper[4475]: I1203 09:01:47.106653 4475 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lfc7m\" (UniqueName: \"kubernetes.io/projected/cbeeabad-22f8-4623-b997-6336ccc2344b-kube-api-access-lfc7m\") on node \"crc\" DevicePath \"\"" Dec 03 09:01:47 crc kubenswrapper[4475]: I1203 09:01:47.106918 4475 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/cbeeabad-22f8-4623-b997-6336ccc2344b-utilities\") on node \"crc\" DevicePath \"\"" Dec 03 09:01:47 crc kubenswrapper[4475]: I1203 09:01:47.109596 4475 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/cbeeabad-22f8-4623-b997-6336ccc2344b-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "cbeeabad-22f8-4623-b997-6336ccc2344b" (UID: "cbeeabad-22f8-4623-b997-6336ccc2344b"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 09:01:47 crc kubenswrapper[4475]: I1203 09:01:47.208812 4475 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/cbeeabad-22f8-4623-b997-6336ccc2344b-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 03 09:01:47 crc kubenswrapper[4475]: I1203 09:01:47.426868 4475 generic.go:334] "Generic (PLEG): container finished" podID="cbeeabad-22f8-4623-b997-6336ccc2344b" containerID="ae2d417968e491de6a85e199b329a6da6468e7af7dbaea3fbb2823b47e9095c4" exitCode=0 Dec 03 09:01:47 crc kubenswrapper[4475]: I1203 09:01:47.426949 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-7g4zd" event={"ID":"cbeeabad-22f8-4623-b997-6336ccc2344b","Type":"ContainerDied","Data":"ae2d417968e491de6a85e199b329a6da6468e7af7dbaea3fbb2823b47e9095c4"} Dec 03 09:01:47 crc kubenswrapper[4475]: I1203 09:01:47.426996 4475 kubelet.go:2453] "SyncLoop (PLEG): event for 
pod" pod="openshift-marketplace/redhat-operators-7g4zd" event={"ID":"cbeeabad-22f8-4623-b997-6336ccc2344b","Type":"ContainerDied","Data":"9313e05b5ba18102b86c30e50379d2515585ac847b9747bedde5e9721265ce7d"} Dec 03 09:01:47 crc kubenswrapper[4475]: I1203 09:01:47.427023 4475 scope.go:117] "RemoveContainer" containerID="ae2d417968e491de6a85e199b329a6da6468e7af7dbaea3fbb2823b47e9095c4" Dec 03 09:01:47 crc kubenswrapper[4475]: I1203 09:01:47.427209 4475 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-7g4zd" Dec 03 09:01:47 crc kubenswrapper[4475]: I1203 09:01:47.472461 4475 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-7g4zd"] Dec 03 09:01:47 crc kubenswrapper[4475]: I1203 09:01:47.481830 4475 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-7g4zd"] Dec 03 09:01:47 crc kubenswrapper[4475]: I1203 09:01:47.487123 4475 scope.go:117] "RemoveContainer" containerID="ba8f9a6123ea956c427801f4fab899a5dd825e0a6039db51dec88be0dcb63526" Dec 03 09:01:47 crc kubenswrapper[4475]: I1203 09:01:47.504029 4475 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cbeeabad-22f8-4623-b997-6336ccc2344b" path="/var/lib/kubelet/pods/cbeeabad-22f8-4623-b997-6336ccc2344b/volumes" Dec 03 09:01:47 crc kubenswrapper[4475]: I1203 09:01:47.515483 4475 scope.go:117] "RemoveContainer" containerID="7bd79a346de469eafb26f96639e5b73e17144d77e907dc4fb9c6f129fa25cf56" Dec 03 09:01:47 crc kubenswrapper[4475]: I1203 09:01:47.553150 4475 scope.go:117] "RemoveContainer" containerID="ae2d417968e491de6a85e199b329a6da6468e7af7dbaea3fbb2823b47e9095c4" Dec 03 09:01:47 crc kubenswrapper[4475]: E1203 09:01:47.553859 4475 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ae2d417968e491de6a85e199b329a6da6468e7af7dbaea3fbb2823b47e9095c4\": container with ID starting with 
ae2d417968e491de6a85e199b329a6da6468e7af7dbaea3fbb2823b47e9095c4 not found: ID does not exist" containerID="ae2d417968e491de6a85e199b329a6da6468e7af7dbaea3fbb2823b47e9095c4" Dec 03 09:01:47 crc kubenswrapper[4475]: I1203 09:01:47.553916 4475 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ae2d417968e491de6a85e199b329a6da6468e7af7dbaea3fbb2823b47e9095c4"} err="failed to get container status \"ae2d417968e491de6a85e199b329a6da6468e7af7dbaea3fbb2823b47e9095c4\": rpc error: code = NotFound desc = could not find container \"ae2d417968e491de6a85e199b329a6da6468e7af7dbaea3fbb2823b47e9095c4\": container with ID starting with ae2d417968e491de6a85e199b329a6da6468e7af7dbaea3fbb2823b47e9095c4 not found: ID does not exist" Dec 03 09:01:47 crc kubenswrapper[4475]: I1203 09:01:47.553959 4475 scope.go:117] "RemoveContainer" containerID="ba8f9a6123ea956c427801f4fab899a5dd825e0a6039db51dec88be0dcb63526" Dec 03 09:01:47 crc kubenswrapper[4475]: E1203 09:01:47.554284 4475 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ba8f9a6123ea956c427801f4fab899a5dd825e0a6039db51dec88be0dcb63526\": container with ID starting with ba8f9a6123ea956c427801f4fab899a5dd825e0a6039db51dec88be0dcb63526 not found: ID does not exist" containerID="ba8f9a6123ea956c427801f4fab899a5dd825e0a6039db51dec88be0dcb63526" Dec 03 09:01:47 crc kubenswrapper[4475]: I1203 09:01:47.554307 4475 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ba8f9a6123ea956c427801f4fab899a5dd825e0a6039db51dec88be0dcb63526"} err="failed to get container status \"ba8f9a6123ea956c427801f4fab899a5dd825e0a6039db51dec88be0dcb63526\": rpc error: code = NotFound desc = could not find container \"ba8f9a6123ea956c427801f4fab899a5dd825e0a6039db51dec88be0dcb63526\": container with ID starting with ba8f9a6123ea956c427801f4fab899a5dd825e0a6039db51dec88be0dcb63526 not found: ID does not 
exist" Dec 03 09:01:47 crc kubenswrapper[4475]: I1203 09:01:47.554326 4475 scope.go:117] "RemoveContainer" containerID="7bd79a346de469eafb26f96639e5b73e17144d77e907dc4fb9c6f129fa25cf56" Dec 03 09:01:47 crc kubenswrapper[4475]: E1203 09:01:47.554961 4475 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7bd79a346de469eafb26f96639e5b73e17144d77e907dc4fb9c6f129fa25cf56\": container with ID starting with 7bd79a346de469eafb26f96639e5b73e17144d77e907dc4fb9c6f129fa25cf56 not found: ID does not exist" containerID="7bd79a346de469eafb26f96639e5b73e17144d77e907dc4fb9c6f129fa25cf56" Dec 03 09:01:47 crc kubenswrapper[4475]: I1203 09:01:47.555153 4475 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7bd79a346de469eafb26f96639e5b73e17144d77e907dc4fb9c6f129fa25cf56"} err="failed to get container status \"7bd79a346de469eafb26f96639e5b73e17144d77e907dc4fb9c6f129fa25cf56\": rpc error: code = NotFound desc = could not find container \"7bd79a346de469eafb26f96639e5b73e17144d77e907dc4fb9c6f129fa25cf56\": container with ID starting with 7bd79a346de469eafb26f96639e5b73e17144d77e907dc4fb9c6f129fa25cf56 not found: ID does not exist" Dec 03 09:01:54 crc kubenswrapper[4475]: I1203 09:01:54.492588 4475 scope.go:117] "RemoveContainer" containerID="2558f17c1a5d43f43a255acd776f2097a10dc7f4eba4c6f2066cd9710d69b61e" Dec 03 09:01:54 crc kubenswrapper[4475]: E1203 09:01:54.493575 4475 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-tjbzg_openshift-machine-config-operator(91aee7be-4a52-4598-803f-2deebe0674de)\"" pod="openshift-machine-config-operator/machine-config-daemon-tjbzg" podUID="91aee7be-4a52-4598-803f-2deebe0674de" Dec 03 09:02:05 crc kubenswrapper[4475]: I1203 09:02:05.500207 4475 scope.go:117] 
"RemoveContainer" containerID="2558f17c1a5d43f43a255acd776f2097a10dc7f4eba4c6f2066cd9710d69b61e" Dec 03 09:02:05 crc kubenswrapper[4475]: E1203 09:02:05.501136 4475 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-tjbzg_openshift-machine-config-operator(91aee7be-4a52-4598-803f-2deebe0674de)\"" pod="openshift-machine-config-operator/machine-config-daemon-tjbzg" podUID="91aee7be-4a52-4598-803f-2deebe0674de" Dec 03 09:02:17 crc kubenswrapper[4475]: I1203 09:02:17.491759 4475 scope.go:117] "RemoveContainer" containerID="2558f17c1a5d43f43a255acd776f2097a10dc7f4eba4c6f2066cd9710d69b61e" Dec 03 09:02:17 crc kubenswrapper[4475]: E1203 09:02:17.492718 4475 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-tjbzg_openshift-machine-config-operator(91aee7be-4a52-4598-803f-2deebe0674de)\"" pod="openshift-machine-config-operator/machine-config-daemon-tjbzg" podUID="91aee7be-4a52-4598-803f-2deebe0674de" Dec 03 09:02:32 crc kubenswrapper[4475]: I1203 09:02:32.491388 4475 scope.go:117] "RemoveContainer" containerID="2558f17c1a5d43f43a255acd776f2097a10dc7f4eba4c6f2066cd9710d69b61e" Dec 03 09:02:32 crc kubenswrapper[4475]: E1203 09:02:32.492311 4475 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-tjbzg_openshift-machine-config-operator(91aee7be-4a52-4598-803f-2deebe0674de)\"" pod="openshift-machine-config-operator/machine-config-daemon-tjbzg" podUID="91aee7be-4a52-4598-803f-2deebe0674de" Dec 03 09:02:43 crc kubenswrapper[4475]: I1203 09:02:43.491296 
4475 scope.go:117] "RemoveContainer" containerID="2558f17c1a5d43f43a255acd776f2097a10dc7f4eba4c6f2066cd9710d69b61e" Dec 03 09:02:43 crc kubenswrapper[4475]: E1203 09:02:43.492288 4475 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-tjbzg_openshift-machine-config-operator(91aee7be-4a52-4598-803f-2deebe0674de)\"" pod="openshift-machine-config-operator/machine-config-daemon-tjbzg" podUID="91aee7be-4a52-4598-803f-2deebe0674de" Dec 03 09:02:54 crc kubenswrapper[4475]: I1203 09:02:54.491970 4475 scope.go:117] "RemoveContainer" containerID="2558f17c1a5d43f43a255acd776f2097a10dc7f4eba4c6f2066cd9710d69b61e" Dec 03 09:02:54 crc kubenswrapper[4475]: E1203 09:02:54.493050 4475 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-tjbzg_openshift-machine-config-operator(91aee7be-4a52-4598-803f-2deebe0674de)\"" pod="openshift-machine-config-operator/machine-config-daemon-tjbzg" podUID="91aee7be-4a52-4598-803f-2deebe0674de" Dec 03 09:03:07 crc kubenswrapper[4475]: I1203 09:03:07.492250 4475 scope.go:117] "RemoveContainer" containerID="2558f17c1a5d43f43a255acd776f2097a10dc7f4eba4c6f2066cd9710d69b61e" Dec 03 09:03:07 crc kubenswrapper[4475]: E1203 09:03:07.493403 4475 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-tjbzg_openshift-machine-config-operator(91aee7be-4a52-4598-803f-2deebe0674de)\"" pod="openshift-machine-config-operator/machine-config-daemon-tjbzg" podUID="91aee7be-4a52-4598-803f-2deebe0674de" Dec 03 09:03:21 crc kubenswrapper[4475]: I1203 
09:03:21.491148 4475 scope.go:117] "RemoveContainer" containerID="2558f17c1a5d43f43a255acd776f2097a10dc7f4eba4c6f2066cd9710d69b61e" Dec 03 09:03:21 crc kubenswrapper[4475]: E1203 09:03:21.492178 4475 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-tjbzg_openshift-machine-config-operator(91aee7be-4a52-4598-803f-2deebe0674de)\"" pod="openshift-machine-config-operator/machine-config-daemon-tjbzg" podUID="91aee7be-4a52-4598-803f-2deebe0674de" Dec 03 09:03:36 crc kubenswrapper[4475]: I1203 09:03:36.491570 4475 scope.go:117] "RemoveContainer" containerID="2558f17c1a5d43f43a255acd776f2097a10dc7f4eba4c6f2066cd9710d69b61e" Dec 03 09:03:36 crc kubenswrapper[4475]: E1203 09:03:36.492937 4475 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-tjbzg_openshift-machine-config-operator(91aee7be-4a52-4598-803f-2deebe0674de)\"" pod="openshift-machine-config-operator/machine-config-daemon-tjbzg" podUID="91aee7be-4a52-4598-803f-2deebe0674de" Dec 03 09:03:49 crc kubenswrapper[4475]: I1203 09:03:49.491793 4475 scope.go:117] "RemoveContainer" containerID="2558f17c1a5d43f43a255acd776f2097a10dc7f4eba4c6f2066cd9710d69b61e" Dec 03 09:03:49 crc kubenswrapper[4475]: E1203 09:03:49.492885 4475 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-tjbzg_openshift-machine-config-operator(91aee7be-4a52-4598-803f-2deebe0674de)\"" pod="openshift-machine-config-operator/machine-config-daemon-tjbzg" podUID="91aee7be-4a52-4598-803f-2deebe0674de" Dec 03 09:04:00 crc 
kubenswrapper[4475]: I1203 09:04:00.491321 4475 scope.go:117] "RemoveContainer" containerID="2558f17c1a5d43f43a255acd776f2097a10dc7f4eba4c6f2066cd9710d69b61e" Dec 03 09:04:00 crc kubenswrapper[4475]: E1203 09:04:00.492148 4475 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-tjbzg_openshift-machine-config-operator(91aee7be-4a52-4598-803f-2deebe0674de)\"" pod="openshift-machine-config-operator/machine-config-daemon-tjbzg" podUID="91aee7be-4a52-4598-803f-2deebe0674de" Dec 03 09:04:14 crc kubenswrapper[4475]: I1203 09:04:14.491965 4475 scope.go:117] "RemoveContainer" containerID="2558f17c1a5d43f43a255acd776f2097a10dc7f4eba4c6f2066cd9710d69b61e" Dec 03 09:04:14 crc kubenswrapper[4475]: E1203 09:04:14.493039 4475 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-tjbzg_openshift-machine-config-operator(91aee7be-4a52-4598-803f-2deebe0674de)\"" pod="openshift-machine-config-operator/machine-config-daemon-tjbzg" podUID="91aee7be-4a52-4598-803f-2deebe0674de" Dec 03 09:04:28 crc kubenswrapper[4475]: I1203 09:04:28.493195 4475 scope.go:117] "RemoveContainer" containerID="2558f17c1a5d43f43a255acd776f2097a10dc7f4eba4c6f2066cd9710d69b61e" Dec 03 09:04:28 crc kubenswrapper[4475]: E1203 09:04:28.494279 4475 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-tjbzg_openshift-machine-config-operator(91aee7be-4a52-4598-803f-2deebe0674de)\"" pod="openshift-machine-config-operator/machine-config-daemon-tjbzg" podUID="91aee7be-4a52-4598-803f-2deebe0674de" Dec 
03 09:04:42 crc kubenswrapper[4475]: I1203 09:04:42.490924 4475 scope.go:117] "RemoveContainer" containerID="2558f17c1a5d43f43a255acd776f2097a10dc7f4eba4c6f2066cd9710d69b61e" Dec 03 09:04:42 crc kubenswrapper[4475]: E1203 09:04:42.491846 4475 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-tjbzg_openshift-machine-config-operator(91aee7be-4a52-4598-803f-2deebe0674de)\"" pod="openshift-machine-config-operator/machine-config-daemon-tjbzg" podUID="91aee7be-4a52-4598-803f-2deebe0674de" Dec 03 09:04:56 crc kubenswrapper[4475]: I1203 09:04:56.491811 4475 scope.go:117] "RemoveContainer" containerID="2558f17c1a5d43f43a255acd776f2097a10dc7f4eba4c6f2066cd9710d69b61e" Dec 03 09:04:56 crc kubenswrapper[4475]: E1203 09:04:56.492646 4475 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-tjbzg_openshift-machine-config-operator(91aee7be-4a52-4598-803f-2deebe0674de)\"" pod="openshift-machine-config-operator/machine-config-daemon-tjbzg" podUID="91aee7be-4a52-4598-803f-2deebe0674de" Dec 03 09:05:07 crc kubenswrapper[4475]: I1203 09:05:07.491375 4475 scope.go:117] "RemoveContainer" containerID="2558f17c1a5d43f43a255acd776f2097a10dc7f4eba4c6f2066cd9710d69b61e" Dec 03 09:05:07 crc kubenswrapper[4475]: E1203 09:05:07.492561 4475 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-tjbzg_openshift-machine-config-operator(91aee7be-4a52-4598-803f-2deebe0674de)\"" pod="openshift-machine-config-operator/machine-config-daemon-tjbzg" 
podUID="91aee7be-4a52-4598-803f-2deebe0674de" Dec 03 09:05:18 crc kubenswrapper[4475]: I1203 09:05:18.492252 4475 scope.go:117] "RemoveContainer" containerID="2558f17c1a5d43f43a255acd776f2097a10dc7f4eba4c6f2066cd9710d69b61e" Dec 03 09:05:18 crc kubenswrapper[4475]: E1203 09:05:18.493331 4475 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-tjbzg_openshift-machine-config-operator(91aee7be-4a52-4598-803f-2deebe0674de)\"" pod="openshift-machine-config-operator/machine-config-daemon-tjbzg" podUID="91aee7be-4a52-4598-803f-2deebe0674de" Dec 03 09:05:30 crc kubenswrapper[4475]: I1203 09:05:30.492346 4475 scope.go:117] "RemoveContainer" containerID="2558f17c1a5d43f43a255acd776f2097a10dc7f4eba4c6f2066cd9710d69b61e" Dec 03 09:05:30 crc kubenswrapper[4475]: E1203 09:05:30.493627 4475 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-tjbzg_openshift-machine-config-operator(91aee7be-4a52-4598-803f-2deebe0674de)\"" pod="openshift-machine-config-operator/machine-config-daemon-tjbzg" podUID="91aee7be-4a52-4598-803f-2deebe0674de" Dec 03 09:05:37 crc kubenswrapper[4475]: I1203 09:05:37.433235 4475 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-dll7c"] Dec 03 09:05:37 crc kubenswrapper[4475]: E1203 09:05:37.434812 4475 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cbeeabad-22f8-4623-b997-6336ccc2344b" containerName="registry-server" Dec 03 09:05:37 crc kubenswrapper[4475]: I1203 09:05:37.434835 4475 state_mem.go:107] "Deleted CPUSet assignment" podUID="cbeeabad-22f8-4623-b997-6336ccc2344b" containerName="registry-server" Dec 03 09:05:37 crc kubenswrapper[4475]: 
E1203 09:05:37.434866 4475 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cbeeabad-22f8-4623-b997-6336ccc2344b" containerName="extract-content" Dec 03 09:05:37 crc kubenswrapper[4475]: I1203 09:05:37.434873 4475 state_mem.go:107] "Deleted CPUSet assignment" podUID="cbeeabad-22f8-4623-b997-6336ccc2344b" containerName="extract-content" Dec 03 09:05:37 crc kubenswrapper[4475]: E1203 09:05:37.434886 4475 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cbeeabad-22f8-4623-b997-6336ccc2344b" containerName="extract-utilities" Dec 03 09:05:37 crc kubenswrapper[4475]: I1203 09:05:37.434893 4475 state_mem.go:107] "Deleted CPUSet assignment" podUID="cbeeabad-22f8-4623-b997-6336ccc2344b" containerName="extract-utilities" Dec 03 09:05:37 crc kubenswrapper[4475]: I1203 09:05:37.435078 4475 memory_manager.go:354] "RemoveStaleState removing state" podUID="cbeeabad-22f8-4623-b997-6336ccc2344b" containerName="registry-server" Dec 03 09:05:37 crc kubenswrapper[4475]: I1203 09:05:37.438091 4475 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-dll7c" Dec 03 09:05:37 crc kubenswrapper[4475]: I1203 09:05:37.448522 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/776e38aa-4bc4-408a-9ea4-7a7ce83d510f-utilities\") pod \"redhat-marketplace-dll7c\" (UID: \"776e38aa-4bc4-408a-9ea4-7a7ce83d510f\") " pod="openshift-marketplace/redhat-marketplace-dll7c" Dec 03 09:05:37 crc kubenswrapper[4475]: I1203 09:05:37.448559 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/776e38aa-4bc4-408a-9ea4-7a7ce83d510f-catalog-content\") pod \"redhat-marketplace-dll7c\" (UID: \"776e38aa-4bc4-408a-9ea4-7a7ce83d510f\") " pod="openshift-marketplace/redhat-marketplace-dll7c" Dec 03 09:05:37 crc kubenswrapper[4475]: I1203 09:05:37.448671 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8xf8g\" (UniqueName: \"kubernetes.io/projected/776e38aa-4bc4-408a-9ea4-7a7ce83d510f-kube-api-access-8xf8g\") pod \"redhat-marketplace-dll7c\" (UID: \"776e38aa-4bc4-408a-9ea4-7a7ce83d510f\") " pod="openshift-marketplace/redhat-marketplace-dll7c" Dec 03 09:05:37 crc kubenswrapper[4475]: I1203 09:05:37.462330 4475 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-dll7c"] Dec 03 09:05:37 crc kubenswrapper[4475]: I1203 09:05:37.550549 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8xf8g\" (UniqueName: \"kubernetes.io/projected/776e38aa-4bc4-408a-9ea4-7a7ce83d510f-kube-api-access-8xf8g\") pod \"redhat-marketplace-dll7c\" (UID: \"776e38aa-4bc4-408a-9ea4-7a7ce83d510f\") " pod="openshift-marketplace/redhat-marketplace-dll7c" Dec 03 09:05:37 crc kubenswrapper[4475]: I1203 09:05:37.550966 4475 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/776e38aa-4bc4-408a-9ea4-7a7ce83d510f-utilities\") pod \"redhat-marketplace-dll7c\" (UID: \"776e38aa-4bc4-408a-9ea4-7a7ce83d510f\") " pod="openshift-marketplace/redhat-marketplace-dll7c" Dec 03 09:05:37 crc kubenswrapper[4475]: I1203 09:05:37.551002 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/776e38aa-4bc4-408a-9ea4-7a7ce83d510f-catalog-content\") pod \"redhat-marketplace-dll7c\" (UID: \"776e38aa-4bc4-408a-9ea4-7a7ce83d510f\") " pod="openshift-marketplace/redhat-marketplace-dll7c" Dec 03 09:05:37 crc kubenswrapper[4475]: I1203 09:05:37.551492 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/776e38aa-4bc4-408a-9ea4-7a7ce83d510f-utilities\") pod \"redhat-marketplace-dll7c\" (UID: \"776e38aa-4bc4-408a-9ea4-7a7ce83d510f\") " pod="openshift-marketplace/redhat-marketplace-dll7c" Dec 03 09:05:37 crc kubenswrapper[4475]: I1203 09:05:37.551607 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/776e38aa-4bc4-408a-9ea4-7a7ce83d510f-catalog-content\") pod \"redhat-marketplace-dll7c\" (UID: \"776e38aa-4bc4-408a-9ea4-7a7ce83d510f\") " pod="openshift-marketplace/redhat-marketplace-dll7c" Dec 03 09:05:37 crc kubenswrapper[4475]: I1203 09:05:37.579708 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8xf8g\" (UniqueName: \"kubernetes.io/projected/776e38aa-4bc4-408a-9ea4-7a7ce83d510f-kube-api-access-8xf8g\") pod \"redhat-marketplace-dll7c\" (UID: \"776e38aa-4bc4-408a-9ea4-7a7ce83d510f\") " pod="openshift-marketplace/redhat-marketplace-dll7c" Dec 03 09:05:37 crc kubenswrapper[4475]: I1203 09:05:37.763625 4475 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-dll7c" Dec 03 09:05:38 crc kubenswrapper[4475]: I1203 09:05:38.337574 4475 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-dll7c"] Dec 03 09:05:38 crc kubenswrapper[4475]: I1203 09:05:38.651000 4475 generic.go:334] "Generic (PLEG): container finished" podID="776e38aa-4bc4-408a-9ea4-7a7ce83d510f" containerID="ed7a8e1c5541504d960e5868122c949607b0ef8217d435f52305c29de070d0bb" exitCode=0 Dec 03 09:05:38 crc kubenswrapper[4475]: I1203 09:05:38.651101 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-dll7c" event={"ID":"776e38aa-4bc4-408a-9ea4-7a7ce83d510f","Type":"ContainerDied","Data":"ed7a8e1c5541504d960e5868122c949607b0ef8217d435f52305c29de070d0bb"} Dec 03 09:05:38 crc kubenswrapper[4475]: I1203 09:05:38.651606 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-dll7c" event={"ID":"776e38aa-4bc4-408a-9ea4-7a7ce83d510f","Type":"ContainerStarted","Data":"f9c58e20cc21cd1b6244a0b652e3c1a25f98122e4952388fa53ddadbd6e5b0c7"} Dec 03 09:05:38 crc kubenswrapper[4475]: I1203 09:05:38.656342 4475 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Dec 03 09:05:40 crc kubenswrapper[4475]: I1203 09:05:40.673178 4475 generic.go:334] "Generic (PLEG): container finished" podID="776e38aa-4bc4-408a-9ea4-7a7ce83d510f" containerID="84ab05b1672d8124bf62f0d4d792736acb1034ebe42c4b7fd362dfbf9f94076c" exitCode=0 Dec 03 09:05:40 crc kubenswrapper[4475]: I1203 09:05:40.673245 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-dll7c" event={"ID":"776e38aa-4bc4-408a-9ea4-7a7ce83d510f","Type":"ContainerDied","Data":"84ab05b1672d8124bf62f0d4d792736acb1034ebe42c4b7fd362dfbf9f94076c"} Dec 03 09:05:41 crc kubenswrapper[4475]: I1203 09:05:41.688312 4475 kubelet.go:2453] "SyncLoop (PLEG): event 
for pod" pod="openshift-marketplace/redhat-marketplace-dll7c" event={"ID":"776e38aa-4bc4-408a-9ea4-7a7ce83d510f","Type":"ContainerStarted","Data":"20e0f567fed7fd11ed8d2578f8d785d5ab189eec819b14fb035994858a6a5aa4"} Dec 03 09:05:41 crc kubenswrapper[4475]: I1203 09:05:41.727266 4475 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-dll7c" podStartSLOduration=2.099103349 podStartE2EDuration="4.726434433s" podCreationTimestamp="2025-12-03 09:05:37 +0000 UTC" firstStartedPulling="2025-12-03 09:05:38.653504931 +0000 UTC m=+8423.458403266" lastFinishedPulling="2025-12-03 09:05:41.280836026 +0000 UTC m=+8426.085734350" observedRunningTime="2025-12-03 09:05:41.714545448 +0000 UTC m=+8426.519443792" watchObservedRunningTime="2025-12-03 09:05:41.726434433 +0000 UTC m=+8426.531332766" Dec 03 09:05:45 crc kubenswrapper[4475]: I1203 09:05:45.501926 4475 scope.go:117] "RemoveContainer" containerID="2558f17c1a5d43f43a255acd776f2097a10dc7f4eba4c6f2066cd9710d69b61e" Dec 03 09:05:45 crc kubenswrapper[4475]: E1203 09:05:45.503109 4475 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-tjbzg_openshift-machine-config-operator(91aee7be-4a52-4598-803f-2deebe0674de)\"" pod="openshift-machine-config-operator/machine-config-daemon-tjbzg" podUID="91aee7be-4a52-4598-803f-2deebe0674de" Dec 03 09:05:47 crc kubenswrapper[4475]: I1203 09:05:47.764217 4475 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-dll7c" Dec 03 09:05:47 crc kubenswrapper[4475]: I1203 09:05:47.764915 4475 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-dll7c" Dec 03 09:05:47 crc kubenswrapper[4475]: I1203 09:05:47.805557 4475 kubelet.go:2542] "SyncLoop 
(probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-dll7c" Dec 03 09:05:48 crc kubenswrapper[4475]: I1203 09:05:48.820610 4475 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-dll7c" Dec 03 09:05:48 crc kubenswrapper[4475]: I1203 09:05:48.866665 4475 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-dll7c"] Dec 03 09:05:50 crc kubenswrapper[4475]: I1203 09:05:50.800557 4475 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-dll7c" podUID="776e38aa-4bc4-408a-9ea4-7a7ce83d510f" containerName="registry-server" containerID="cri-o://20e0f567fed7fd11ed8d2578f8d785d5ab189eec819b14fb035994858a6a5aa4" gracePeriod=2 Dec 03 09:05:51 crc kubenswrapper[4475]: I1203 09:05:51.300184 4475 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-dll7c" Dec 03 09:05:51 crc kubenswrapper[4475]: I1203 09:05:51.412122 4475 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/776e38aa-4bc4-408a-9ea4-7a7ce83d510f-catalog-content\") pod \"776e38aa-4bc4-408a-9ea4-7a7ce83d510f\" (UID: \"776e38aa-4bc4-408a-9ea4-7a7ce83d510f\") " Dec 03 09:05:51 crc kubenswrapper[4475]: I1203 09:05:51.412384 4475 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/776e38aa-4bc4-408a-9ea4-7a7ce83d510f-utilities\") pod \"776e38aa-4bc4-408a-9ea4-7a7ce83d510f\" (UID: \"776e38aa-4bc4-408a-9ea4-7a7ce83d510f\") " Dec 03 09:05:51 crc kubenswrapper[4475]: I1203 09:05:51.412548 4475 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8xf8g\" (UniqueName: \"kubernetes.io/projected/776e38aa-4bc4-408a-9ea4-7a7ce83d510f-kube-api-access-8xf8g\") 
pod \"776e38aa-4bc4-408a-9ea4-7a7ce83d510f\" (UID: \"776e38aa-4bc4-408a-9ea4-7a7ce83d510f\") " Dec 03 09:05:51 crc kubenswrapper[4475]: I1203 09:05:51.412970 4475 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/776e38aa-4bc4-408a-9ea4-7a7ce83d510f-utilities" (OuterVolumeSpecName: "utilities") pod "776e38aa-4bc4-408a-9ea4-7a7ce83d510f" (UID: "776e38aa-4bc4-408a-9ea4-7a7ce83d510f"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 09:05:51 crc kubenswrapper[4475]: I1203 09:05:51.414035 4475 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/776e38aa-4bc4-408a-9ea4-7a7ce83d510f-utilities\") on node \"crc\" DevicePath \"\"" Dec 03 09:05:51 crc kubenswrapper[4475]: I1203 09:05:51.423314 4475 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/776e38aa-4bc4-408a-9ea4-7a7ce83d510f-kube-api-access-8xf8g" (OuterVolumeSpecName: "kube-api-access-8xf8g") pod "776e38aa-4bc4-408a-9ea4-7a7ce83d510f" (UID: "776e38aa-4bc4-408a-9ea4-7a7ce83d510f"). InnerVolumeSpecName "kube-api-access-8xf8g". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 09:05:51 crc kubenswrapper[4475]: I1203 09:05:51.430515 4475 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/776e38aa-4bc4-408a-9ea4-7a7ce83d510f-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "776e38aa-4bc4-408a-9ea4-7a7ce83d510f" (UID: "776e38aa-4bc4-408a-9ea4-7a7ce83d510f"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 09:05:51 crc kubenswrapper[4475]: I1203 09:05:51.516532 4475 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/776e38aa-4bc4-408a-9ea4-7a7ce83d510f-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 03 09:05:51 crc kubenswrapper[4475]: I1203 09:05:51.516623 4475 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8xf8g\" (UniqueName: \"kubernetes.io/projected/776e38aa-4bc4-408a-9ea4-7a7ce83d510f-kube-api-access-8xf8g\") on node \"crc\" DevicePath \"\"" Dec 03 09:05:51 crc kubenswrapper[4475]: I1203 09:05:51.815300 4475 generic.go:334] "Generic (PLEG): container finished" podID="776e38aa-4bc4-408a-9ea4-7a7ce83d510f" containerID="20e0f567fed7fd11ed8d2578f8d785d5ab189eec819b14fb035994858a6a5aa4" exitCode=0 Dec 03 09:05:51 crc kubenswrapper[4475]: I1203 09:05:51.815395 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-dll7c" event={"ID":"776e38aa-4bc4-408a-9ea4-7a7ce83d510f","Type":"ContainerDied","Data":"20e0f567fed7fd11ed8d2578f8d785d5ab189eec819b14fb035994858a6a5aa4"} Dec 03 09:05:51 crc kubenswrapper[4475]: I1203 09:05:51.815469 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-dll7c" event={"ID":"776e38aa-4bc4-408a-9ea4-7a7ce83d510f","Type":"ContainerDied","Data":"f9c58e20cc21cd1b6244a0b652e3c1a25f98122e4952388fa53ddadbd6e5b0c7"} Dec 03 09:05:51 crc kubenswrapper[4475]: I1203 09:05:51.815498 4475 scope.go:117] "RemoveContainer" containerID="20e0f567fed7fd11ed8d2578f8d785d5ab189eec819b14fb035994858a6a5aa4" Dec 03 09:05:51 crc kubenswrapper[4475]: I1203 09:05:51.815739 4475 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-dll7c" Dec 03 09:05:51 crc kubenswrapper[4475]: I1203 09:05:51.849018 4475 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-dll7c"] Dec 03 09:05:51 crc kubenswrapper[4475]: I1203 09:05:51.850412 4475 scope.go:117] "RemoveContainer" containerID="84ab05b1672d8124bf62f0d4d792736acb1034ebe42c4b7fd362dfbf9f94076c" Dec 03 09:05:51 crc kubenswrapper[4475]: I1203 09:05:51.857606 4475 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-dll7c"] Dec 03 09:05:51 crc kubenswrapper[4475]: I1203 09:05:51.883490 4475 scope.go:117] "RemoveContainer" containerID="ed7a8e1c5541504d960e5868122c949607b0ef8217d435f52305c29de070d0bb" Dec 03 09:05:51 crc kubenswrapper[4475]: I1203 09:05:51.913123 4475 scope.go:117] "RemoveContainer" containerID="20e0f567fed7fd11ed8d2578f8d785d5ab189eec819b14fb035994858a6a5aa4" Dec 03 09:05:51 crc kubenswrapper[4475]: E1203 09:05:51.913941 4475 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"20e0f567fed7fd11ed8d2578f8d785d5ab189eec819b14fb035994858a6a5aa4\": container with ID starting with 20e0f567fed7fd11ed8d2578f8d785d5ab189eec819b14fb035994858a6a5aa4 not found: ID does not exist" containerID="20e0f567fed7fd11ed8d2578f8d785d5ab189eec819b14fb035994858a6a5aa4" Dec 03 09:05:51 crc kubenswrapper[4475]: I1203 09:05:51.914032 4475 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"20e0f567fed7fd11ed8d2578f8d785d5ab189eec819b14fb035994858a6a5aa4"} err="failed to get container status \"20e0f567fed7fd11ed8d2578f8d785d5ab189eec819b14fb035994858a6a5aa4\": rpc error: code = NotFound desc = could not find container \"20e0f567fed7fd11ed8d2578f8d785d5ab189eec819b14fb035994858a6a5aa4\": container with ID starting with 20e0f567fed7fd11ed8d2578f8d785d5ab189eec819b14fb035994858a6a5aa4 not found: 
ID does not exist" Dec 03 09:05:51 crc kubenswrapper[4475]: I1203 09:05:51.914059 4475 scope.go:117] "RemoveContainer" containerID="84ab05b1672d8124bf62f0d4d792736acb1034ebe42c4b7fd362dfbf9f94076c" Dec 03 09:05:51 crc kubenswrapper[4475]: E1203 09:05:51.914575 4475 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"84ab05b1672d8124bf62f0d4d792736acb1034ebe42c4b7fd362dfbf9f94076c\": container with ID starting with 84ab05b1672d8124bf62f0d4d792736acb1034ebe42c4b7fd362dfbf9f94076c not found: ID does not exist" containerID="84ab05b1672d8124bf62f0d4d792736acb1034ebe42c4b7fd362dfbf9f94076c" Dec 03 09:05:51 crc kubenswrapper[4475]: I1203 09:05:51.914599 4475 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"84ab05b1672d8124bf62f0d4d792736acb1034ebe42c4b7fd362dfbf9f94076c"} err="failed to get container status \"84ab05b1672d8124bf62f0d4d792736acb1034ebe42c4b7fd362dfbf9f94076c\": rpc error: code = NotFound desc = could not find container \"84ab05b1672d8124bf62f0d4d792736acb1034ebe42c4b7fd362dfbf9f94076c\": container with ID starting with 84ab05b1672d8124bf62f0d4d792736acb1034ebe42c4b7fd362dfbf9f94076c not found: ID does not exist" Dec 03 09:05:51 crc kubenswrapper[4475]: I1203 09:05:51.914618 4475 scope.go:117] "RemoveContainer" containerID="ed7a8e1c5541504d960e5868122c949607b0ef8217d435f52305c29de070d0bb" Dec 03 09:05:51 crc kubenswrapper[4475]: E1203 09:05:51.914905 4475 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ed7a8e1c5541504d960e5868122c949607b0ef8217d435f52305c29de070d0bb\": container with ID starting with ed7a8e1c5541504d960e5868122c949607b0ef8217d435f52305c29de070d0bb not found: ID does not exist" containerID="ed7a8e1c5541504d960e5868122c949607b0ef8217d435f52305c29de070d0bb" Dec 03 09:05:51 crc kubenswrapper[4475]: I1203 09:05:51.914970 4475 pod_container_deletor.go:53] 
"DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ed7a8e1c5541504d960e5868122c949607b0ef8217d435f52305c29de070d0bb"} err="failed to get container status \"ed7a8e1c5541504d960e5868122c949607b0ef8217d435f52305c29de070d0bb\": rpc error: code = NotFound desc = could not find container \"ed7a8e1c5541504d960e5868122c949607b0ef8217d435f52305c29de070d0bb\": container with ID starting with ed7a8e1c5541504d960e5868122c949607b0ef8217d435f52305c29de070d0bb not found: ID does not exist" Dec 03 09:05:53 crc kubenswrapper[4475]: I1203 09:05:53.504019 4475 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="776e38aa-4bc4-408a-9ea4-7a7ce83d510f" path="/var/lib/kubelet/pods/776e38aa-4bc4-408a-9ea4-7a7ce83d510f/volumes" Dec 03 09:05:59 crc kubenswrapper[4475]: I1203 09:05:59.491580 4475 scope.go:117] "RemoveContainer" containerID="2558f17c1a5d43f43a255acd776f2097a10dc7f4eba4c6f2066cd9710d69b61e" Dec 03 09:05:59 crc kubenswrapper[4475]: E1203 09:05:59.492649 4475 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-tjbzg_openshift-machine-config-operator(91aee7be-4a52-4598-803f-2deebe0674de)\"" pod="openshift-machine-config-operator/machine-config-daemon-tjbzg" podUID="91aee7be-4a52-4598-803f-2deebe0674de" Dec 03 09:06:14 crc kubenswrapper[4475]: I1203 09:06:14.491639 4475 scope.go:117] "RemoveContainer" containerID="2558f17c1a5d43f43a255acd776f2097a10dc7f4eba4c6f2066cd9710d69b61e" Dec 03 09:06:14 crc kubenswrapper[4475]: E1203 09:06:14.492625 4475 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-tjbzg_openshift-machine-config-operator(91aee7be-4a52-4598-803f-2deebe0674de)\"" 
pod="openshift-machine-config-operator/machine-config-daemon-tjbzg" podUID="91aee7be-4a52-4598-803f-2deebe0674de" Dec 03 09:06:25 crc kubenswrapper[4475]: I1203 09:06:25.497988 4475 scope.go:117] "RemoveContainer" containerID="2558f17c1a5d43f43a255acd776f2097a10dc7f4eba4c6f2066cd9710d69b61e" Dec 03 09:06:25 crc kubenswrapper[4475]: E1203 09:06:25.499045 4475 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-tjbzg_openshift-machine-config-operator(91aee7be-4a52-4598-803f-2deebe0674de)\"" pod="openshift-machine-config-operator/machine-config-daemon-tjbzg" podUID="91aee7be-4a52-4598-803f-2deebe0674de" Dec 03 09:06:36 crc kubenswrapper[4475]: I1203 09:06:36.491692 4475 scope.go:117] "RemoveContainer" containerID="2558f17c1a5d43f43a255acd776f2097a10dc7f4eba4c6f2066cd9710d69b61e" Dec 03 09:06:37 crc kubenswrapper[4475]: I1203 09:06:37.283963 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-tjbzg" event={"ID":"91aee7be-4a52-4598-803f-2deebe0674de","Type":"ContainerStarted","Data":"da32e18d3701de940011b11f6bd99935758f9c3d1bb12cd0c1556f570ad33969"} Dec 03 09:08:58 crc kubenswrapper[4475]: I1203 09:08:58.934653 4475 patch_prober.go:28] interesting pod/machine-config-daemon-tjbzg container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 03 09:08:58 crc kubenswrapper[4475]: I1203 09:08:58.937709 4475 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-tjbzg" podUID="91aee7be-4a52-4598-803f-2deebe0674de" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": 
dial tcp 127.0.0.1:8798: connect: connection refused" Dec 03 09:09:14 crc kubenswrapper[4475]: I1203 09:09:14.777270 4475 generic.go:334] "Generic (PLEG): container finished" podID="8c152b07-dbe2-44d8-a333-ceff4d6e7f9a" containerID="e1b55e8417cc886ea0d3ab3d515efa2894df64e4a2ff1f75649d997f40d2e875" exitCode=0 Dec 03 09:09:14 crc kubenswrapper[4475]: I1203 09:09:14.777376 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/tempest-tests-tempest-s01-single-thread-testing" event={"ID":"8c152b07-dbe2-44d8-a333-ceff4d6e7f9a","Type":"ContainerDied","Data":"e1b55e8417cc886ea0d3ab3d515efa2894df64e4a2ff1f75649d997f40d2e875"} Dec 03 09:09:16 crc kubenswrapper[4475]: I1203 09:09:16.617633 4475 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/tempest-tests-tempest-s01-single-thread-testing" Dec 03 09:09:16 crc kubenswrapper[4475]: I1203 09:09:16.797630 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/tempest-tests-tempest-s01-single-thread-testing" event={"ID":"8c152b07-dbe2-44d8-a333-ceff4d6e7f9a","Type":"ContainerDied","Data":"d8699d2f7a543defdc669dd6390b09abe40b457bf887363c0f50fc6094f28eb2"} Dec 03 09:09:16 crc kubenswrapper[4475]: I1203 09:09:16.797694 4475 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/tempest-tests-tempest-s01-single-thread-testing" Dec 03 09:09:16 crc kubenswrapper[4475]: I1203 09:09:16.797700 4475 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d8699d2f7a543defdc669dd6390b09abe40b457bf887363c0f50fc6094f28eb2" Dec 03 09:09:16 crc kubenswrapper[4475]: I1203 09:09:16.810977 4475 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/secret/8c152b07-dbe2-44d8-a333-ceff4d6e7f9a-ca-certs\") pod \"8c152b07-dbe2-44d8-a333-ceff4d6e7f9a\" (UID: \"8c152b07-dbe2-44d8-a333-ceff4d6e7f9a\") " Dec 03 09:09:16 crc kubenswrapper[4475]: I1203 09:09:16.811174 4475 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/8c152b07-dbe2-44d8-a333-ceff4d6e7f9a-ssh-key\") pod \"8c152b07-dbe2-44d8-a333-ceff4d6e7f9a\" (UID: \"8c152b07-dbe2-44d8-a333-ceff4d6e7f9a\") " Dec 03 09:09:16 crc kubenswrapper[4475]: I1203 09:09:16.811209 4475 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7qs8h\" (UniqueName: \"kubernetes.io/projected/8c152b07-dbe2-44d8-a333-ceff4d6e7f9a-kube-api-access-7qs8h\") pod \"8c152b07-dbe2-44d8-a333-ceff4d6e7f9a\" (UID: \"8c152b07-dbe2-44d8-a333-ceff4d6e7f9a\") " Dec 03 09:09:16 crc kubenswrapper[4475]: I1203 09:09:16.811688 4475 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"test-operator-ephemeral-temporary\" (UniqueName: \"kubernetes.io/empty-dir/8c152b07-dbe2-44d8-a333-ceff4d6e7f9a-test-operator-ephemeral-temporary\") pod \"8c152b07-dbe2-44d8-a333-ceff4d6e7f9a\" (UID: \"8c152b07-dbe2-44d8-a333-ceff4d6e7f9a\") " Dec 03 09:09:16 crc kubenswrapper[4475]: I1203 09:09:16.811748 4475 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-config-secret\" (UniqueName: 
\"kubernetes.io/secret/8c152b07-dbe2-44d8-a333-ceff4d6e7f9a-openstack-config-secret\") pod \"8c152b07-dbe2-44d8-a333-ceff4d6e7f9a\" (UID: \"8c152b07-dbe2-44d8-a333-ceff4d6e7f9a\") " Dec 03 09:09:16 crc kubenswrapper[4475]: I1203 09:09:16.811819 4475 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"test-operator-ephemeral-workdir\" (UniqueName: \"kubernetes.io/empty-dir/8c152b07-dbe2-44d8-a333-ceff4d6e7f9a-test-operator-ephemeral-workdir\") pod \"8c152b07-dbe2-44d8-a333-ceff4d6e7f9a\" (UID: \"8c152b07-dbe2-44d8-a333-ceff4d6e7f9a\") " Dec 03 09:09:16 crc kubenswrapper[4475]: I1203 09:09:16.811864 4475 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/8c152b07-dbe2-44d8-a333-ceff4d6e7f9a-config-data\") pod \"8c152b07-dbe2-44d8-a333-ceff4d6e7f9a\" (UID: \"8c152b07-dbe2-44d8-a333-ceff4d6e7f9a\") " Dec 03 09:09:16 crc kubenswrapper[4475]: I1203 09:09:16.811893 4475 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/8c152b07-dbe2-44d8-a333-ceff4d6e7f9a-openstack-config\") pod \"8c152b07-dbe2-44d8-a333-ceff4d6e7f9a\" (UID: \"8c152b07-dbe2-44d8-a333-ceff4d6e7f9a\") " Dec 03 09:09:16 crc kubenswrapper[4475]: I1203 09:09:16.811943 4475 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"test-operator-logs\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"8c152b07-dbe2-44d8-a333-ceff4d6e7f9a\" (UID: \"8c152b07-dbe2-44d8-a333-ceff4d6e7f9a\") " Dec 03 09:09:16 crc kubenswrapper[4475]: I1203 09:09:16.812346 4475 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8c152b07-dbe2-44d8-a333-ceff4d6e7f9a-test-operator-ephemeral-temporary" (OuterVolumeSpecName: "test-operator-ephemeral-temporary") pod "8c152b07-dbe2-44d8-a333-ceff4d6e7f9a" (UID: "8c152b07-dbe2-44d8-a333-ceff4d6e7f9a"). 
InnerVolumeSpecName "test-operator-ephemeral-temporary". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 09:09:16 crc kubenswrapper[4475]: I1203 09:09:16.812825 4475 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8c152b07-dbe2-44d8-a333-ceff4d6e7f9a-config-data" (OuterVolumeSpecName: "config-data") pod "8c152b07-dbe2-44d8-a333-ceff4d6e7f9a" (UID: "8c152b07-dbe2-44d8-a333-ceff4d6e7f9a"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 09:09:16 crc kubenswrapper[4475]: I1203 09:09:16.814785 4475 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8c152b07-dbe2-44d8-a333-ceff4d6e7f9a-test-operator-ephemeral-workdir" (OuterVolumeSpecName: "test-operator-ephemeral-workdir") pod "8c152b07-dbe2-44d8-a333-ceff4d6e7f9a" (UID: "8c152b07-dbe2-44d8-a333-ceff4d6e7f9a"). InnerVolumeSpecName "test-operator-ephemeral-workdir". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 09:09:16 crc kubenswrapper[4475]: I1203 09:09:16.830647 4475 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8c152b07-dbe2-44d8-a333-ceff4d6e7f9a-kube-api-access-7qs8h" (OuterVolumeSpecName: "kube-api-access-7qs8h") pod "8c152b07-dbe2-44d8-a333-ceff4d6e7f9a" (UID: "8c152b07-dbe2-44d8-a333-ceff4d6e7f9a"). InnerVolumeSpecName "kube-api-access-7qs8h". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 09:09:16 crc kubenswrapper[4475]: I1203 09:09:16.840312 4475 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage10-crc" (OuterVolumeSpecName: "test-operator-logs") pod "8c152b07-dbe2-44d8-a333-ceff4d6e7f9a" (UID: "8c152b07-dbe2-44d8-a333-ceff4d6e7f9a"). InnerVolumeSpecName "local-storage10-crc". 
PluginName "kubernetes.io/local-volume", VolumeGidValue "" Dec 03 09:09:16 crc kubenswrapper[4475]: I1203 09:09:16.852744 4475 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8c152b07-dbe2-44d8-a333-ceff4d6e7f9a-openstack-config-secret" (OuterVolumeSpecName: "openstack-config-secret") pod "8c152b07-dbe2-44d8-a333-ceff4d6e7f9a" (UID: "8c152b07-dbe2-44d8-a333-ceff4d6e7f9a"). InnerVolumeSpecName "openstack-config-secret". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 09:09:16 crc kubenswrapper[4475]: I1203 09:09:16.864755 4475 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8c152b07-dbe2-44d8-a333-ceff4d6e7f9a-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "8c152b07-dbe2-44d8-a333-ceff4d6e7f9a" (UID: "8c152b07-dbe2-44d8-a333-ceff4d6e7f9a"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 09:09:16 crc kubenswrapper[4475]: I1203 09:09:16.870040 4475 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8c152b07-dbe2-44d8-a333-ceff4d6e7f9a-ca-certs" (OuterVolumeSpecName: "ca-certs") pod "8c152b07-dbe2-44d8-a333-ceff4d6e7f9a" (UID: "8c152b07-dbe2-44d8-a333-ceff4d6e7f9a"). InnerVolumeSpecName "ca-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 09:09:16 crc kubenswrapper[4475]: I1203 09:09:16.876816 4475 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8c152b07-dbe2-44d8-a333-ceff4d6e7f9a-openstack-config" (OuterVolumeSpecName: "openstack-config") pod "8c152b07-dbe2-44d8-a333-ceff4d6e7f9a" (UID: "8c152b07-dbe2-44d8-a333-ceff4d6e7f9a"). InnerVolumeSpecName "openstack-config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 09:09:16 crc kubenswrapper[4475]: I1203 09:09:16.915351 4475 reconciler_common.go:293] "Volume detached for volume \"test-operator-ephemeral-temporary\" (UniqueName: \"kubernetes.io/empty-dir/8c152b07-dbe2-44d8-a333-ceff4d6e7f9a-test-operator-ephemeral-temporary\") on node \"crc\" DevicePath \"\"" Dec 03 09:09:16 crc kubenswrapper[4475]: I1203 09:09:16.915387 4475 reconciler_common.go:293] "Volume detached for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/8c152b07-dbe2-44d8-a333-ceff4d6e7f9a-openstack-config-secret\") on node \"crc\" DevicePath \"\"" Dec 03 09:09:16 crc kubenswrapper[4475]: I1203 09:09:16.915400 4475 reconciler_common.go:293] "Volume detached for volume \"test-operator-ephemeral-workdir\" (UniqueName: \"kubernetes.io/empty-dir/8c152b07-dbe2-44d8-a333-ceff4d6e7f9a-test-operator-ephemeral-workdir\") on node \"crc\" DevicePath \"\"" Dec 03 09:09:16 crc kubenswrapper[4475]: I1203 09:09:16.915412 4475 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/8c152b07-dbe2-44d8-a333-ceff4d6e7f9a-config-data\") on node \"crc\" DevicePath \"\"" Dec 03 09:09:16 crc kubenswrapper[4475]: I1203 09:09:16.915425 4475 reconciler_common.go:293] "Volume detached for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/8c152b07-dbe2-44d8-a333-ceff4d6e7f9a-openstack-config\") on node \"crc\" DevicePath \"\"" Dec 03 09:09:16 crc kubenswrapper[4475]: I1203 09:09:16.916078 4475 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") on node \"crc\" " Dec 03 09:09:16 crc kubenswrapper[4475]: I1203 09:09:16.916094 4475 reconciler_common.go:293] "Volume detached for volume \"ca-certs\" (UniqueName: \"kubernetes.io/secret/8c152b07-dbe2-44d8-a333-ceff4d6e7f9a-ca-certs\") on node \"crc\" DevicePath \"\"" Dec 03 
09:09:16 crc kubenswrapper[4475]: I1203 09:09:16.916105 4475 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/8c152b07-dbe2-44d8-a333-ceff4d6e7f9a-ssh-key\") on node \"crc\" DevicePath \"\"" Dec 03 09:09:16 crc kubenswrapper[4475]: I1203 09:09:16.916117 4475 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7qs8h\" (UniqueName: \"kubernetes.io/projected/8c152b07-dbe2-44d8-a333-ceff4d6e7f9a-kube-api-access-7qs8h\") on node \"crc\" DevicePath \"\"" Dec 03 09:09:16 crc kubenswrapper[4475]: I1203 09:09:16.934527 4475 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage10-crc" (UniqueName: "kubernetes.io/local-volume/local-storage10-crc") on node "crc" Dec 03 09:09:17 crc kubenswrapper[4475]: I1203 09:09:17.018011 4475 reconciler_common.go:293] "Volume detached for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") on node \"crc\" DevicePath \"\"" Dec 03 09:09:18 crc kubenswrapper[4475]: I1203 09:09:18.880809 4475 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-55k65"] Dec 03 09:09:18 crc kubenswrapper[4475]: E1203 09:09:18.881579 4475 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="776e38aa-4bc4-408a-9ea4-7a7ce83d510f" containerName="registry-server" Dec 03 09:09:18 crc kubenswrapper[4475]: I1203 09:09:18.881600 4475 state_mem.go:107] "Deleted CPUSet assignment" podUID="776e38aa-4bc4-408a-9ea4-7a7ce83d510f" containerName="registry-server" Dec 03 09:09:18 crc kubenswrapper[4475]: E1203 09:09:18.881627 4475 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8c152b07-dbe2-44d8-a333-ceff4d6e7f9a" containerName="tempest-tests-tempest-tests-runner" Dec 03 09:09:18 crc kubenswrapper[4475]: I1203 09:09:18.881633 4475 state_mem.go:107] "Deleted CPUSet assignment" podUID="8c152b07-dbe2-44d8-a333-ceff4d6e7f9a" 
containerName="tempest-tests-tempest-tests-runner" Dec 03 09:09:18 crc kubenswrapper[4475]: E1203 09:09:18.881672 4475 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="776e38aa-4bc4-408a-9ea4-7a7ce83d510f" containerName="extract-utilities" Dec 03 09:09:18 crc kubenswrapper[4475]: I1203 09:09:18.881680 4475 state_mem.go:107] "Deleted CPUSet assignment" podUID="776e38aa-4bc4-408a-9ea4-7a7ce83d510f" containerName="extract-utilities" Dec 03 09:09:18 crc kubenswrapper[4475]: E1203 09:09:18.881695 4475 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="776e38aa-4bc4-408a-9ea4-7a7ce83d510f" containerName="extract-content" Dec 03 09:09:18 crc kubenswrapper[4475]: I1203 09:09:18.881703 4475 state_mem.go:107] "Deleted CPUSet assignment" podUID="776e38aa-4bc4-408a-9ea4-7a7ce83d510f" containerName="extract-content" Dec 03 09:09:18 crc kubenswrapper[4475]: I1203 09:09:18.881910 4475 memory_manager.go:354] "RemoveStaleState removing state" podUID="776e38aa-4bc4-408a-9ea4-7a7ce83d510f" containerName="registry-server" Dec 03 09:09:18 crc kubenswrapper[4475]: I1203 09:09:18.881934 4475 memory_manager.go:354] "RemoveStaleState removing state" podUID="8c152b07-dbe2-44d8-a333-ceff4d6e7f9a" containerName="tempest-tests-tempest-tests-runner" Dec 03 09:09:18 crc kubenswrapper[4475]: I1203 09:09:18.883914 4475 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-55k65" Dec 03 09:09:18 crc kubenswrapper[4475]: I1203 09:09:18.901351 4475 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-55k65"] Dec 03 09:09:19 crc kubenswrapper[4475]: I1203 09:09:19.070038 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c2ed683e-fc50-41a3-8a74-f24b0a3ef6c5-utilities\") pod \"certified-operators-55k65\" (UID: \"c2ed683e-fc50-41a3-8a74-f24b0a3ef6c5\") " pod="openshift-marketplace/certified-operators-55k65" Dec 03 09:09:19 crc kubenswrapper[4475]: I1203 09:09:19.070134 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c2ed683e-fc50-41a3-8a74-f24b0a3ef6c5-catalog-content\") pod \"certified-operators-55k65\" (UID: \"c2ed683e-fc50-41a3-8a74-f24b0a3ef6c5\") " pod="openshift-marketplace/certified-operators-55k65" Dec 03 09:09:19 crc kubenswrapper[4475]: I1203 09:09:19.070296 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lt56l\" (UniqueName: \"kubernetes.io/projected/c2ed683e-fc50-41a3-8a74-f24b0a3ef6c5-kube-api-access-lt56l\") pod \"certified-operators-55k65\" (UID: \"c2ed683e-fc50-41a3-8a74-f24b0a3ef6c5\") " pod="openshift-marketplace/certified-operators-55k65" Dec 03 09:09:19 crc kubenswrapper[4475]: I1203 09:09:19.173445 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c2ed683e-fc50-41a3-8a74-f24b0a3ef6c5-utilities\") pod \"certified-operators-55k65\" (UID: \"c2ed683e-fc50-41a3-8a74-f24b0a3ef6c5\") " pod="openshift-marketplace/certified-operators-55k65" Dec 03 09:09:19 crc kubenswrapper[4475]: I1203 09:09:19.173565 4475 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c2ed683e-fc50-41a3-8a74-f24b0a3ef6c5-catalog-content\") pod \"certified-operators-55k65\" (UID: \"c2ed683e-fc50-41a3-8a74-f24b0a3ef6c5\") " pod="openshift-marketplace/certified-operators-55k65" Dec 03 09:09:19 crc kubenswrapper[4475]: I1203 09:09:19.173673 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lt56l\" (UniqueName: \"kubernetes.io/projected/c2ed683e-fc50-41a3-8a74-f24b0a3ef6c5-kube-api-access-lt56l\") pod \"certified-operators-55k65\" (UID: \"c2ed683e-fc50-41a3-8a74-f24b0a3ef6c5\") " pod="openshift-marketplace/certified-operators-55k65" Dec 03 09:09:19 crc kubenswrapper[4475]: I1203 09:09:19.174099 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c2ed683e-fc50-41a3-8a74-f24b0a3ef6c5-catalog-content\") pod \"certified-operators-55k65\" (UID: \"c2ed683e-fc50-41a3-8a74-f24b0a3ef6c5\") " pod="openshift-marketplace/certified-operators-55k65" Dec 03 09:09:19 crc kubenswrapper[4475]: I1203 09:09:19.174207 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c2ed683e-fc50-41a3-8a74-f24b0a3ef6c5-utilities\") pod \"certified-operators-55k65\" (UID: \"c2ed683e-fc50-41a3-8a74-f24b0a3ef6c5\") " pod="openshift-marketplace/certified-operators-55k65" Dec 03 09:09:19 crc kubenswrapper[4475]: I1203 09:09:19.192373 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lt56l\" (UniqueName: \"kubernetes.io/projected/c2ed683e-fc50-41a3-8a74-f24b0a3ef6c5-kube-api-access-lt56l\") pod \"certified-operators-55k65\" (UID: \"c2ed683e-fc50-41a3-8a74-f24b0a3ef6c5\") " pod="openshift-marketplace/certified-operators-55k65" Dec 03 09:09:19 crc kubenswrapper[4475]: I1203 09:09:19.201954 4475 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-55k65" Dec 03 09:09:19 crc kubenswrapper[4475]: I1203 09:09:19.676379 4475 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-55k65"] Dec 03 09:09:19 crc kubenswrapper[4475]: I1203 09:09:19.825419 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-55k65" event={"ID":"c2ed683e-fc50-41a3-8a74-f24b0a3ef6c5","Type":"ContainerStarted","Data":"d0a32bc23d2694f5aa154aec29c1d3ba7b33d628044a3ae8f0da4631bbb86e02"} Dec 03 09:09:20 crc kubenswrapper[4475]: I1203 09:09:20.837838 4475 generic.go:334] "Generic (PLEG): container finished" podID="c2ed683e-fc50-41a3-8a74-f24b0a3ef6c5" containerID="ca03de5bc3731a9236ff5c66280193c15787d97f4c81d68cdb83cc781590c6a5" exitCode=0 Dec 03 09:09:20 crc kubenswrapper[4475]: I1203 09:09:20.837947 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-55k65" event={"ID":"c2ed683e-fc50-41a3-8a74-f24b0a3ef6c5","Type":"ContainerDied","Data":"ca03de5bc3731a9236ff5c66280193c15787d97f4c81d68cdb83cc781590c6a5"} Dec 03 09:09:21 crc kubenswrapper[4475]: I1203 09:09:21.850297 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-55k65" event={"ID":"c2ed683e-fc50-41a3-8a74-f24b0a3ef6c5","Type":"ContainerStarted","Data":"efe9e49317e79b05ce99d8cb0f4fd5a0314a93889819d0cc1abd10111e35e7de"} Dec 03 09:09:22 crc kubenswrapper[4475]: I1203 09:09:22.859777 4475 generic.go:334] "Generic (PLEG): container finished" podID="c2ed683e-fc50-41a3-8a74-f24b0a3ef6c5" containerID="efe9e49317e79b05ce99d8cb0f4fd5a0314a93889819d0cc1abd10111e35e7de" exitCode=0 Dec 03 09:09:22 crc kubenswrapper[4475]: I1203 09:09:22.859827 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-55k65" 
event={"ID":"c2ed683e-fc50-41a3-8a74-f24b0a3ef6c5","Type":"ContainerDied","Data":"efe9e49317e79b05ce99d8cb0f4fd5a0314a93889819d0cc1abd10111e35e7de"} Dec 03 09:09:23 crc kubenswrapper[4475]: I1203 09:09:23.874889 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-55k65" event={"ID":"c2ed683e-fc50-41a3-8a74-f24b0a3ef6c5","Type":"ContainerStarted","Data":"26bb1d0f6f07fce5b583e8b2249caf8307b812e89163a1ef50a70e8ab42e14cb"} Dec 03 09:09:23 crc kubenswrapper[4475]: I1203 09:09:23.901710 4475 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-55k65" podStartSLOduration=3.227232919 podStartE2EDuration="5.901647764s" podCreationTimestamp="2025-12-03 09:09:18 +0000 UTC" firstStartedPulling="2025-12-03 09:09:20.841163502 +0000 UTC m=+8645.646061837" lastFinishedPulling="2025-12-03 09:09:23.515578348 +0000 UTC m=+8648.320476682" observedRunningTime="2025-12-03 09:09:23.896123547 +0000 UTC m=+8648.701021881" watchObservedRunningTime="2025-12-03 09:09:23.901647764 +0000 UTC m=+8648.706546098" Dec 03 09:09:26 crc kubenswrapper[4475]: I1203 09:09:26.729153 4475 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/test-operator-logs-pod-tempest-tempest-tests-tempest"] Dec 03 09:09:26 crc kubenswrapper[4475]: I1203 09:09:26.731850 4475 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Dec 03 09:09:26 crc kubenswrapper[4475]: I1203 09:09:26.740781 4475 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"default-dockercfg-dclp7" Dec 03 09:09:26 crc kubenswrapper[4475]: I1203 09:09:26.740856 4475 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/test-operator-logs-pod-tempest-tempest-tests-tempest"] Dec 03 09:09:26 crc kubenswrapper[4475]: I1203 09:09:26.851298 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xg8zp\" (UniqueName: \"kubernetes.io/projected/030ad765-df8f-4f56-8964-b0a2cc4bfc7d-kube-api-access-xg8zp\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"030ad765-df8f-4f56-8964-b0a2cc4bfc7d\") " pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Dec 03 09:09:26 crc kubenswrapper[4475]: I1203 09:09:26.851404 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"030ad765-df8f-4f56-8964-b0a2cc4bfc7d\") " pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Dec 03 09:09:26 crc kubenswrapper[4475]: I1203 09:09:26.954524 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"030ad765-df8f-4f56-8964-b0a2cc4bfc7d\") " pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Dec 03 09:09:26 crc kubenswrapper[4475]: I1203 09:09:26.955114 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xg8zp\" (UniqueName: 
\"kubernetes.io/projected/030ad765-df8f-4f56-8964-b0a2cc4bfc7d-kube-api-access-xg8zp\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"030ad765-df8f-4f56-8964-b0a2cc4bfc7d\") " pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Dec 03 09:09:26 crc kubenswrapper[4475]: I1203 09:09:26.956995 4475 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"030ad765-df8f-4f56-8964-b0a2cc4bfc7d\") device mount path \"/mnt/openstack/pv10\"" pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Dec 03 09:09:26 crc kubenswrapper[4475]: I1203 09:09:26.979757 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xg8zp\" (UniqueName: \"kubernetes.io/projected/030ad765-df8f-4f56-8964-b0a2cc4bfc7d-kube-api-access-xg8zp\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"030ad765-df8f-4f56-8964-b0a2cc4bfc7d\") " pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Dec 03 09:09:26 crc kubenswrapper[4475]: I1203 09:09:26.986130 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"030ad765-df8f-4f56-8964-b0a2cc4bfc7d\") " pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Dec 03 09:09:27 crc kubenswrapper[4475]: I1203 09:09:27.052786 4475 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Dec 03 09:09:27 crc kubenswrapper[4475]: I1203 09:09:27.505774 4475 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/test-operator-logs-pod-tempest-tempest-tests-tempest"] Dec 03 09:09:27 crc kubenswrapper[4475]: W1203 09:09:27.518396 4475 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod030ad765_df8f_4f56_8964_b0a2cc4bfc7d.slice/crio-417318bd34630e0625d524125d4b191789db02c985065e9dc070d24b12788f9b WatchSource:0}: Error finding container 417318bd34630e0625d524125d4b191789db02c985065e9dc070d24b12788f9b: Status 404 returned error can't find the container with id 417318bd34630e0625d524125d4b191789db02c985065e9dc070d24b12788f9b Dec 03 09:09:27 crc kubenswrapper[4475]: I1203 09:09:27.920691 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" event={"ID":"030ad765-df8f-4f56-8964-b0a2cc4bfc7d","Type":"ContainerStarted","Data":"417318bd34630e0625d524125d4b191789db02c985065e9dc070d24b12788f9b"} Dec 03 09:09:28 crc kubenswrapper[4475]: I1203 09:09:28.931366 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" event={"ID":"030ad765-df8f-4f56-8964-b0a2cc4bfc7d","Type":"ContainerStarted","Data":"b5e3e2d11ef1263e01ea75c8b1134cf4eda46643c624cbe75e9d5080083033c4"} Dec 03 09:09:28 crc kubenswrapper[4475]: I1203 09:09:28.933622 4475 patch_prober.go:28] interesting pod/machine-config-daemon-tjbzg container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 03 09:09:28 crc kubenswrapper[4475]: I1203 09:09:28.933670 4475 prober.go:107] "Probe failed" probeType="Liveness" 
pod="openshift-machine-config-operator/machine-config-daemon-tjbzg" podUID="91aee7be-4a52-4598-803f-2deebe0674de" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 03 09:09:28 crc kubenswrapper[4475]: I1203 09:09:28.952941 4475 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" podStartSLOduration=1.964902071 podStartE2EDuration="2.952918628s" podCreationTimestamp="2025-12-03 09:09:26 +0000 UTC" firstStartedPulling="2025-12-03 09:09:27.52267808 +0000 UTC m=+8652.327576404" lastFinishedPulling="2025-12-03 09:09:28.510694628 +0000 UTC m=+8653.315592961" observedRunningTime="2025-12-03 09:09:28.944940117 +0000 UTC m=+8653.749838450" watchObservedRunningTime="2025-12-03 09:09:28.952918628 +0000 UTC m=+8653.757816962" Dec 03 09:09:29 crc kubenswrapper[4475]: I1203 09:09:29.202872 4475 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-55k65" Dec 03 09:09:29 crc kubenswrapper[4475]: I1203 09:09:29.203136 4475 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-55k65" Dec 03 09:09:29 crc kubenswrapper[4475]: I1203 09:09:29.253196 4475 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-55k65" Dec 03 09:09:29 crc kubenswrapper[4475]: I1203 09:09:29.988184 4475 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-55k65" Dec 03 09:09:30 crc kubenswrapper[4475]: I1203 09:09:30.051070 4475 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-55k65"] Dec 03 09:09:31 crc kubenswrapper[4475]: I1203 09:09:31.957956 4475 kuberuntime_container.go:808] "Killing container with a grace period" 
pod="openshift-marketplace/certified-operators-55k65" podUID="c2ed683e-fc50-41a3-8a74-f24b0a3ef6c5" containerName="registry-server" containerID="cri-o://26bb1d0f6f07fce5b583e8b2249caf8307b812e89163a1ef50a70e8ab42e14cb" gracePeriod=2 Dec 03 09:09:32 crc kubenswrapper[4475]: I1203 09:09:32.925345 4475 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-55k65" Dec 03 09:09:32 crc kubenswrapper[4475]: I1203 09:09:32.978475 4475 generic.go:334] "Generic (PLEG): container finished" podID="c2ed683e-fc50-41a3-8a74-f24b0a3ef6c5" containerID="26bb1d0f6f07fce5b583e8b2249caf8307b812e89163a1ef50a70e8ab42e14cb" exitCode=0 Dec 03 09:09:32 crc kubenswrapper[4475]: I1203 09:09:32.978523 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-55k65" event={"ID":"c2ed683e-fc50-41a3-8a74-f24b0a3ef6c5","Type":"ContainerDied","Data":"26bb1d0f6f07fce5b583e8b2249caf8307b812e89163a1ef50a70e8ab42e14cb"} Dec 03 09:09:32 crc kubenswrapper[4475]: I1203 09:09:32.978554 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-55k65" event={"ID":"c2ed683e-fc50-41a3-8a74-f24b0a3ef6c5","Type":"ContainerDied","Data":"d0a32bc23d2694f5aa154aec29c1d3ba7b33d628044a3ae8f0da4631bbb86e02"} Dec 03 09:09:32 crc kubenswrapper[4475]: I1203 09:09:32.978574 4475 scope.go:117] "RemoveContainer" containerID="26bb1d0f6f07fce5b583e8b2249caf8307b812e89163a1ef50a70e8ab42e14cb" Dec 03 09:09:32 crc kubenswrapper[4475]: I1203 09:09:32.978731 4475 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-55k65" Dec 03 09:09:33 crc kubenswrapper[4475]: I1203 09:09:33.001334 4475 scope.go:117] "RemoveContainer" containerID="efe9e49317e79b05ce99d8cb0f4fd5a0314a93889819d0cc1abd10111e35e7de" Dec 03 09:09:33 crc kubenswrapper[4475]: I1203 09:09:33.009693 4475 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c2ed683e-fc50-41a3-8a74-f24b0a3ef6c5-catalog-content\") pod \"c2ed683e-fc50-41a3-8a74-f24b0a3ef6c5\" (UID: \"c2ed683e-fc50-41a3-8a74-f24b0a3ef6c5\") " Dec 03 09:09:33 crc kubenswrapper[4475]: I1203 09:09:33.009843 4475 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c2ed683e-fc50-41a3-8a74-f24b0a3ef6c5-utilities\") pod \"c2ed683e-fc50-41a3-8a74-f24b0a3ef6c5\" (UID: \"c2ed683e-fc50-41a3-8a74-f24b0a3ef6c5\") " Dec 03 09:09:33 crc kubenswrapper[4475]: I1203 09:09:33.010109 4475 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lt56l\" (UniqueName: \"kubernetes.io/projected/c2ed683e-fc50-41a3-8a74-f24b0a3ef6c5-kube-api-access-lt56l\") pod \"c2ed683e-fc50-41a3-8a74-f24b0a3ef6c5\" (UID: \"c2ed683e-fc50-41a3-8a74-f24b0a3ef6c5\") " Dec 03 09:09:33 crc kubenswrapper[4475]: I1203 09:09:33.011960 4475 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c2ed683e-fc50-41a3-8a74-f24b0a3ef6c5-utilities" (OuterVolumeSpecName: "utilities") pod "c2ed683e-fc50-41a3-8a74-f24b0a3ef6c5" (UID: "c2ed683e-fc50-41a3-8a74-f24b0a3ef6c5"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 09:09:33 crc kubenswrapper[4475]: I1203 09:09:33.017368 4475 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c2ed683e-fc50-41a3-8a74-f24b0a3ef6c5-kube-api-access-lt56l" (OuterVolumeSpecName: "kube-api-access-lt56l") pod "c2ed683e-fc50-41a3-8a74-f24b0a3ef6c5" (UID: "c2ed683e-fc50-41a3-8a74-f24b0a3ef6c5"). InnerVolumeSpecName "kube-api-access-lt56l". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 09:09:33 crc kubenswrapper[4475]: I1203 09:09:33.021096 4475 scope.go:117] "RemoveContainer" containerID="ca03de5bc3731a9236ff5c66280193c15787d97f4c81d68cdb83cc781590c6a5" Dec 03 09:09:33 crc kubenswrapper[4475]: I1203 09:09:33.055037 4475 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c2ed683e-fc50-41a3-8a74-f24b0a3ef6c5-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "c2ed683e-fc50-41a3-8a74-f24b0a3ef6c5" (UID: "c2ed683e-fc50-41a3-8a74-f24b0a3ef6c5"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 09:09:33 crc kubenswrapper[4475]: I1203 09:09:33.087086 4475 scope.go:117] "RemoveContainer" containerID="26bb1d0f6f07fce5b583e8b2249caf8307b812e89163a1ef50a70e8ab42e14cb" Dec 03 09:09:33 crc kubenswrapper[4475]: E1203 09:09:33.087786 4475 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"26bb1d0f6f07fce5b583e8b2249caf8307b812e89163a1ef50a70e8ab42e14cb\": container with ID starting with 26bb1d0f6f07fce5b583e8b2249caf8307b812e89163a1ef50a70e8ab42e14cb not found: ID does not exist" containerID="26bb1d0f6f07fce5b583e8b2249caf8307b812e89163a1ef50a70e8ab42e14cb" Dec 03 09:09:33 crc kubenswrapper[4475]: I1203 09:09:33.087821 4475 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"26bb1d0f6f07fce5b583e8b2249caf8307b812e89163a1ef50a70e8ab42e14cb"} err="failed to get container status \"26bb1d0f6f07fce5b583e8b2249caf8307b812e89163a1ef50a70e8ab42e14cb\": rpc error: code = NotFound desc = could not find container \"26bb1d0f6f07fce5b583e8b2249caf8307b812e89163a1ef50a70e8ab42e14cb\": container with ID starting with 26bb1d0f6f07fce5b583e8b2249caf8307b812e89163a1ef50a70e8ab42e14cb not found: ID does not exist" Dec 03 09:09:33 crc kubenswrapper[4475]: I1203 09:09:33.087847 4475 scope.go:117] "RemoveContainer" containerID="efe9e49317e79b05ce99d8cb0f4fd5a0314a93889819d0cc1abd10111e35e7de" Dec 03 09:09:33 crc kubenswrapper[4475]: E1203 09:09:33.088251 4475 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"efe9e49317e79b05ce99d8cb0f4fd5a0314a93889819d0cc1abd10111e35e7de\": container with ID starting with efe9e49317e79b05ce99d8cb0f4fd5a0314a93889819d0cc1abd10111e35e7de not found: ID does not exist" containerID="efe9e49317e79b05ce99d8cb0f4fd5a0314a93889819d0cc1abd10111e35e7de" Dec 03 09:09:33 crc kubenswrapper[4475]: I1203 09:09:33.088355 
4475 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"efe9e49317e79b05ce99d8cb0f4fd5a0314a93889819d0cc1abd10111e35e7de"} err="failed to get container status \"efe9e49317e79b05ce99d8cb0f4fd5a0314a93889819d0cc1abd10111e35e7de\": rpc error: code = NotFound desc = could not find container \"efe9e49317e79b05ce99d8cb0f4fd5a0314a93889819d0cc1abd10111e35e7de\": container with ID starting with efe9e49317e79b05ce99d8cb0f4fd5a0314a93889819d0cc1abd10111e35e7de not found: ID does not exist" Dec 03 09:09:33 crc kubenswrapper[4475]: I1203 09:09:33.088429 4475 scope.go:117] "RemoveContainer" containerID="ca03de5bc3731a9236ff5c66280193c15787d97f4c81d68cdb83cc781590c6a5" Dec 03 09:09:33 crc kubenswrapper[4475]: E1203 09:09:33.088882 4475 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ca03de5bc3731a9236ff5c66280193c15787d97f4c81d68cdb83cc781590c6a5\": container with ID starting with ca03de5bc3731a9236ff5c66280193c15787d97f4c81d68cdb83cc781590c6a5 not found: ID does not exist" containerID="ca03de5bc3731a9236ff5c66280193c15787d97f4c81d68cdb83cc781590c6a5" Dec 03 09:09:33 crc kubenswrapper[4475]: I1203 09:09:33.088938 4475 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ca03de5bc3731a9236ff5c66280193c15787d97f4c81d68cdb83cc781590c6a5"} err="failed to get container status \"ca03de5bc3731a9236ff5c66280193c15787d97f4c81d68cdb83cc781590c6a5\": rpc error: code = NotFound desc = could not find container \"ca03de5bc3731a9236ff5c66280193c15787d97f4c81d68cdb83cc781590c6a5\": container with ID starting with ca03de5bc3731a9236ff5c66280193c15787d97f4c81d68cdb83cc781590c6a5 not found: ID does not exist" Dec 03 09:09:33 crc kubenswrapper[4475]: I1203 09:09:33.114244 4475 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c2ed683e-fc50-41a3-8a74-f24b0a3ef6c5-utilities\") on node 
\"crc\" DevicePath \"\"" Dec 03 09:09:33 crc kubenswrapper[4475]: I1203 09:09:33.114274 4475 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lt56l\" (UniqueName: \"kubernetes.io/projected/c2ed683e-fc50-41a3-8a74-f24b0a3ef6c5-kube-api-access-lt56l\") on node \"crc\" DevicePath \"\"" Dec 03 09:09:33 crc kubenswrapper[4475]: I1203 09:09:33.114286 4475 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c2ed683e-fc50-41a3-8a74-f24b0a3ef6c5-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 03 09:09:33 crc kubenswrapper[4475]: I1203 09:09:33.311867 4475 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-55k65"] Dec 03 09:09:33 crc kubenswrapper[4475]: I1203 09:09:33.323115 4475 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-55k65"] Dec 03 09:09:33 crc kubenswrapper[4475]: I1203 09:09:33.505748 4475 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c2ed683e-fc50-41a3-8a74-f24b0a3ef6c5" path="/var/lib/kubelet/pods/c2ed683e-fc50-41a3-8a74-f24b0a3ef6c5/volumes" Dec 03 09:09:58 crc kubenswrapper[4475]: I1203 09:09:58.933890 4475 patch_prober.go:28] interesting pod/machine-config-daemon-tjbzg container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 03 09:09:58 crc kubenswrapper[4475]: I1203 09:09:58.934564 4475 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-tjbzg" podUID="91aee7be-4a52-4598-803f-2deebe0674de" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 03 09:09:58 crc kubenswrapper[4475]: I1203 09:09:58.934628 4475 
kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-tjbzg" Dec 03 09:09:58 crc kubenswrapper[4475]: I1203 09:09:58.935897 4475 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"da32e18d3701de940011b11f6bd99935758f9c3d1bb12cd0c1556f570ad33969"} pod="openshift-machine-config-operator/machine-config-daemon-tjbzg" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Dec 03 09:09:58 crc kubenswrapper[4475]: I1203 09:09:58.935979 4475 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-tjbzg" podUID="91aee7be-4a52-4598-803f-2deebe0674de" containerName="machine-config-daemon" containerID="cri-o://da32e18d3701de940011b11f6bd99935758f9c3d1bb12cd0c1556f570ad33969" gracePeriod=600 Dec 03 09:09:59 crc kubenswrapper[4475]: I1203 09:09:59.293314 4475 generic.go:334] "Generic (PLEG): container finished" podID="91aee7be-4a52-4598-803f-2deebe0674de" containerID="da32e18d3701de940011b11f6bd99935758f9c3d1bb12cd0c1556f570ad33969" exitCode=0 Dec 03 09:09:59 crc kubenswrapper[4475]: I1203 09:09:59.293372 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-tjbzg" event={"ID":"91aee7be-4a52-4598-803f-2deebe0674de","Type":"ContainerDied","Data":"da32e18d3701de940011b11f6bd99935758f9c3d1bb12cd0c1556f570ad33969"} Dec 03 09:09:59 crc kubenswrapper[4475]: I1203 09:09:59.293635 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-tjbzg" event={"ID":"91aee7be-4a52-4598-803f-2deebe0674de","Type":"ContainerStarted","Data":"898a67abd1f55732668b71ffe83895682d6d2a3b34dbf582f1c1f4d4663f5f6f"} Dec 03 09:09:59 crc kubenswrapper[4475]: I1203 09:09:59.293678 4475 scope.go:117] "RemoveContainer" 
containerID="2558f17c1a5d43f43a255acd776f2097a10dc7f4eba4c6f2066cd9710d69b61e" Dec 03 09:10:31 crc kubenswrapper[4475]: I1203 09:10:31.253009 4475 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-jpwg8"] Dec 03 09:10:31 crc kubenswrapper[4475]: E1203 09:10:31.254386 4475 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c2ed683e-fc50-41a3-8a74-f24b0a3ef6c5" containerName="extract-content" Dec 03 09:10:31 crc kubenswrapper[4475]: I1203 09:10:31.254403 4475 state_mem.go:107] "Deleted CPUSet assignment" podUID="c2ed683e-fc50-41a3-8a74-f24b0a3ef6c5" containerName="extract-content" Dec 03 09:10:31 crc kubenswrapper[4475]: E1203 09:10:31.254424 4475 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c2ed683e-fc50-41a3-8a74-f24b0a3ef6c5" containerName="extract-utilities" Dec 03 09:10:31 crc kubenswrapper[4475]: I1203 09:10:31.254430 4475 state_mem.go:107] "Deleted CPUSet assignment" podUID="c2ed683e-fc50-41a3-8a74-f24b0a3ef6c5" containerName="extract-utilities" Dec 03 09:10:31 crc kubenswrapper[4475]: E1203 09:10:31.254441 4475 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c2ed683e-fc50-41a3-8a74-f24b0a3ef6c5" containerName="registry-server" Dec 03 09:10:31 crc kubenswrapper[4475]: I1203 09:10:31.254461 4475 state_mem.go:107] "Deleted CPUSet assignment" podUID="c2ed683e-fc50-41a3-8a74-f24b0a3ef6c5" containerName="registry-server" Dec 03 09:10:31 crc kubenswrapper[4475]: I1203 09:10:31.254801 4475 memory_manager.go:354] "RemoveStaleState removing state" podUID="c2ed683e-fc50-41a3-8a74-f24b0a3ef6c5" containerName="registry-server" Dec 03 09:10:31 crc kubenswrapper[4475]: I1203 09:10:31.257236 4475 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-jpwg8" Dec 03 09:10:31 crc kubenswrapper[4475]: I1203 09:10:31.273257 4475 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-jpwg8"] Dec 03 09:10:31 crc kubenswrapper[4475]: I1203 09:10:31.403622 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d1b69d25-8842-4587-9a13-6663bc9fa3a2-catalog-content\") pod \"community-operators-jpwg8\" (UID: \"d1b69d25-8842-4587-9a13-6663bc9fa3a2\") " pod="openshift-marketplace/community-operators-jpwg8" Dec 03 09:10:31 crc kubenswrapper[4475]: I1203 09:10:31.403689 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7vsq2\" (UniqueName: \"kubernetes.io/projected/d1b69d25-8842-4587-9a13-6663bc9fa3a2-kube-api-access-7vsq2\") pod \"community-operators-jpwg8\" (UID: \"d1b69d25-8842-4587-9a13-6663bc9fa3a2\") " pod="openshift-marketplace/community-operators-jpwg8" Dec 03 09:10:31 crc kubenswrapper[4475]: I1203 09:10:31.403738 4475 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d1b69d25-8842-4587-9a13-6663bc9fa3a2-utilities\") pod \"community-operators-jpwg8\" (UID: \"d1b69d25-8842-4587-9a13-6663bc9fa3a2\") " pod="openshift-marketplace/community-operators-jpwg8" Dec 03 09:10:31 crc kubenswrapper[4475]: I1203 09:10:31.506558 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d1b69d25-8842-4587-9a13-6663bc9fa3a2-catalog-content\") pod \"community-operators-jpwg8\" (UID: \"d1b69d25-8842-4587-9a13-6663bc9fa3a2\") " pod="openshift-marketplace/community-operators-jpwg8" Dec 03 09:10:31 crc kubenswrapper[4475]: I1203 09:10:31.506624 4475 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-7vsq2\" (UniqueName: \"kubernetes.io/projected/d1b69d25-8842-4587-9a13-6663bc9fa3a2-kube-api-access-7vsq2\") pod \"community-operators-jpwg8\" (UID: \"d1b69d25-8842-4587-9a13-6663bc9fa3a2\") " pod="openshift-marketplace/community-operators-jpwg8" Dec 03 09:10:31 crc kubenswrapper[4475]: I1203 09:10:31.506709 4475 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d1b69d25-8842-4587-9a13-6663bc9fa3a2-utilities\") pod \"community-operators-jpwg8\" (UID: \"d1b69d25-8842-4587-9a13-6663bc9fa3a2\") " pod="openshift-marketplace/community-operators-jpwg8" Dec 03 09:10:31 crc kubenswrapper[4475]: I1203 09:10:31.507092 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d1b69d25-8842-4587-9a13-6663bc9fa3a2-catalog-content\") pod \"community-operators-jpwg8\" (UID: \"d1b69d25-8842-4587-9a13-6663bc9fa3a2\") " pod="openshift-marketplace/community-operators-jpwg8" Dec 03 09:10:31 crc kubenswrapper[4475]: I1203 09:10:31.507238 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d1b69d25-8842-4587-9a13-6663bc9fa3a2-utilities\") pod \"community-operators-jpwg8\" (UID: \"d1b69d25-8842-4587-9a13-6663bc9fa3a2\") " pod="openshift-marketplace/community-operators-jpwg8" Dec 03 09:10:31 crc kubenswrapper[4475]: I1203 09:10:31.540211 4475 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7vsq2\" (UniqueName: \"kubernetes.io/projected/d1b69d25-8842-4587-9a13-6663bc9fa3a2-kube-api-access-7vsq2\") pod \"community-operators-jpwg8\" (UID: \"d1b69d25-8842-4587-9a13-6663bc9fa3a2\") " pod="openshift-marketplace/community-operators-jpwg8" Dec 03 09:10:31 crc kubenswrapper[4475]: I1203 09:10:31.593081 4475 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-jpwg8" Dec 03 09:10:32 crc kubenswrapper[4475]: I1203 09:10:32.081942 4475 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-jpwg8"] Dec 03 09:10:32 crc kubenswrapper[4475]: I1203 09:10:32.639485 4475 generic.go:334] "Generic (PLEG): container finished" podID="d1b69d25-8842-4587-9a13-6663bc9fa3a2" containerID="ed73e265482a1df0a0843f01b2864bdfaf6500b9ecd584a3ce0c8f60924ba5e9" exitCode=0 Dec 03 09:10:32 crc kubenswrapper[4475]: I1203 09:10:32.639580 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-jpwg8" event={"ID":"d1b69d25-8842-4587-9a13-6663bc9fa3a2","Type":"ContainerDied","Data":"ed73e265482a1df0a0843f01b2864bdfaf6500b9ecd584a3ce0c8f60924ba5e9"} Dec 03 09:10:32 crc kubenswrapper[4475]: I1203 09:10:32.639821 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-jpwg8" event={"ID":"d1b69d25-8842-4587-9a13-6663bc9fa3a2","Type":"ContainerStarted","Data":"92c301127325a1e6b9f106028f29f493915a116cea4125e8726239d2adfbf3df"} Dec 03 09:10:33 crc kubenswrapper[4475]: I1203 09:10:33.653853 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-jpwg8" event={"ID":"d1b69d25-8842-4587-9a13-6663bc9fa3a2","Type":"ContainerStarted","Data":"0b9609d17cdc3e54b304d45fa037eeffbd07acfc5c7e65e4b53dfdc4f2af0eae"} Dec 03 09:10:34 crc kubenswrapper[4475]: I1203 09:10:34.668009 4475 generic.go:334] "Generic (PLEG): container finished" podID="d1b69d25-8842-4587-9a13-6663bc9fa3a2" containerID="0b9609d17cdc3e54b304d45fa037eeffbd07acfc5c7e65e4b53dfdc4f2af0eae" exitCode=0 Dec 03 09:10:34 crc kubenswrapper[4475]: I1203 09:10:34.668100 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-jpwg8" 
event={"ID":"d1b69d25-8842-4587-9a13-6663bc9fa3a2","Type":"ContainerDied","Data":"0b9609d17cdc3e54b304d45fa037eeffbd07acfc5c7e65e4b53dfdc4f2af0eae"}
Dec 03 09:10:35 crc kubenswrapper[4475]: I1203 09:10:35.683703 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-jpwg8" event={"ID":"d1b69d25-8842-4587-9a13-6663bc9fa3a2","Type":"ContainerStarted","Data":"8da2501b0900c9b2f84a821cf9b9babd9d148fbb780d914875363401b76d08ff"}
Dec 03 09:10:35 crc kubenswrapper[4475]: I1203 09:10:35.705399 4475 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-jpwg8" podStartSLOduration=2.150939392 podStartE2EDuration="4.705373774s" podCreationTimestamp="2025-12-03 09:10:31 +0000 UTC" firstStartedPulling="2025-12-03 09:10:32.64240955 +0000 UTC m=+8717.447307874" lastFinishedPulling="2025-12-03 09:10:35.196843922 +0000 UTC m=+8720.001742256" observedRunningTime="2025-12-03 09:10:35.701106251 +0000 UTC m=+8720.506004584" watchObservedRunningTime="2025-12-03 09:10:35.705373774 +0000 UTC m=+8720.510272108"
Dec 03 09:10:41 crc kubenswrapper[4475]: I1203 09:10:41.593961 4475 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-jpwg8"
Dec 03 09:10:41 crc kubenswrapper[4475]: I1203 09:10:41.594636 4475 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-jpwg8"
Dec 03 09:10:41 crc kubenswrapper[4475]: I1203 09:10:41.638965 4475 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-jpwg8"
Dec 03 09:10:41 crc kubenswrapper[4475]: I1203 09:10:41.783621 4475 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-jpwg8"
Dec 03 09:10:41 crc kubenswrapper[4475]: I1203 09:10:41.880129 4475 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-jpwg8"]
Dec 03 09:10:43 crc kubenswrapper[4475]: I1203 09:10:43.762226 4475 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-jpwg8" podUID="d1b69d25-8842-4587-9a13-6663bc9fa3a2" containerName="registry-server" containerID="cri-o://8da2501b0900c9b2f84a821cf9b9babd9d148fbb780d914875363401b76d08ff" gracePeriod=2
Dec 03 09:10:44 crc kubenswrapper[4475]: I1203 09:10:44.228175 4475 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-jpwg8"
Dec 03 09:10:44 crc kubenswrapper[4475]: I1203 09:10:44.346502 4475 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d1b69d25-8842-4587-9a13-6663bc9fa3a2-utilities\") pod \"d1b69d25-8842-4587-9a13-6663bc9fa3a2\" (UID: \"d1b69d25-8842-4587-9a13-6663bc9fa3a2\") "
Dec 03 09:10:44 crc kubenswrapper[4475]: I1203 09:10:44.346673 4475 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d1b69d25-8842-4587-9a13-6663bc9fa3a2-catalog-content\") pod \"d1b69d25-8842-4587-9a13-6663bc9fa3a2\" (UID: \"d1b69d25-8842-4587-9a13-6663bc9fa3a2\") "
Dec 03 09:10:44 crc kubenswrapper[4475]: I1203 09:10:44.347158 4475 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7vsq2\" (UniqueName: \"kubernetes.io/projected/d1b69d25-8842-4587-9a13-6663bc9fa3a2-kube-api-access-7vsq2\") pod \"d1b69d25-8842-4587-9a13-6663bc9fa3a2\" (UID: \"d1b69d25-8842-4587-9a13-6663bc9fa3a2\") "
Dec 03 09:10:44 crc kubenswrapper[4475]: I1203 09:10:44.347162 4475 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d1b69d25-8842-4587-9a13-6663bc9fa3a2-utilities" (OuterVolumeSpecName: "utilities") pod "d1b69d25-8842-4587-9a13-6663bc9fa3a2" (UID: "d1b69d25-8842-4587-9a13-6663bc9fa3a2"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Dec 03 09:10:44 crc kubenswrapper[4475]: I1203 09:10:44.348342 4475 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d1b69d25-8842-4587-9a13-6663bc9fa3a2-utilities\") on node \"crc\" DevicePath \"\""
Dec 03 09:10:44 crc kubenswrapper[4475]: I1203 09:10:44.389780 4475 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d1b69d25-8842-4587-9a13-6663bc9fa3a2-kube-api-access-7vsq2" (OuterVolumeSpecName: "kube-api-access-7vsq2") pod "d1b69d25-8842-4587-9a13-6663bc9fa3a2" (UID: "d1b69d25-8842-4587-9a13-6663bc9fa3a2"). InnerVolumeSpecName "kube-api-access-7vsq2". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 03 09:10:44 crc kubenswrapper[4475]: I1203 09:10:44.394204 4475 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d1b69d25-8842-4587-9a13-6663bc9fa3a2-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "d1b69d25-8842-4587-9a13-6663bc9fa3a2" (UID: "d1b69d25-8842-4587-9a13-6663bc9fa3a2"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Dec 03 09:10:44 crc kubenswrapper[4475]: I1203 09:10:44.450951 4475 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d1b69d25-8842-4587-9a13-6663bc9fa3a2-catalog-content\") on node \"crc\" DevicePath \"\""
Dec 03 09:10:44 crc kubenswrapper[4475]: I1203 09:10:44.450984 4475 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7vsq2\" (UniqueName: \"kubernetes.io/projected/d1b69d25-8842-4587-9a13-6663bc9fa3a2-kube-api-access-7vsq2\") on node \"crc\" DevicePath \"\""
Dec 03 09:10:44 crc kubenswrapper[4475]: I1203 09:10:44.775330 4475 generic.go:334] "Generic (PLEG): container finished" podID="d1b69d25-8842-4587-9a13-6663bc9fa3a2" containerID="8da2501b0900c9b2f84a821cf9b9babd9d148fbb780d914875363401b76d08ff" exitCode=0
Dec 03 09:10:44 crc kubenswrapper[4475]: I1203 09:10:44.775439 4475 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-jpwg8"
Dec 03 09:10:44 crc kubenswrapper[4475]: I1203 09:10:44.775435 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-jpwg8" event={"ID":"d1b69d25-8842-4587-9a13-6663bc9fa3a2","Type":"ContainerDied","Data":"8da2501b0900c9b2f84a821cf9b9babd9d148fbb780d914875363401b76d08ff"}
Dec 03 09:10:44 crc kubenswrapper[4475]: I1203 09:10:44.775964 4475 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-jpwg8" event={"ID":"d1b69d25-8842-4587-9a13-6663bc9fa3a2","Type":"ContainerDied","Data":"92c301127325a1e6b9f106028f29f493915a116cea4125e8726239d2adfbf3df"}
Dec 03 09:10:44 crc kubenswrapper[4475]: I1203 09:10:44.775992 4475 scope.go:117] "RemoveContainer" containerID="8da2501b0900c9b2f84a821cf9b9babd9d148fbb780d914875363401b76d08ff"
Dec 03 09:10:44 crc kubenswrapper[4475]: I1203 09:10:44.818175 4475 scope.go:117] "RemoveContainer" containerID="0b9609d17cdc3e54b304d45fa037eeffbd07acfc5c7e65e4b53dfdc4f2af0eae"
Dec 03 09:10:44 crc kubenswrapper[4475]: I1203 09:10:44.819011 4475 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-jpwg8"]
Dec 03 09:10:44 crc kubenswrapper[4475]: I1203 09:10:44.829778 4475 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-jpwg8"]
Dec 03 09:10:44 crc kubenswrapper[4475]: I1203 09:10:44.842993 4475 scope.go:117] "RemoveContainer" containerID="ed73e265482a1df0a0843f01b2864bdfaf6500b9ecd584a3ce0c8f60924ba5e9"
Dec 03 09:10:44 crc kubenswrapper[4475]: I1203 09:10:44.870843 4475 scope.go:117] "RemoveContainer" containerID="8da2501b0900c9b2f84a821cf9b9babd9d148fbb780d914875363401b76d08ff"
Dec 03 09:10:44 crc kubenswrapper[4475]: E1203 09:10:44.871689 4475 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8da2501b0900c9b2f84a821cf9b9babd9d148fbb780d914875363401b76d08ff\": container with ID starting with 8da2501b0900c9b2f84a821cf9b9babd9d148fbb780d914875363401b76d08ff not found: ID does not exist" containerID="8da2501b0900c9b2f84a821cf9b9babd9d148fbb780d914875363401b76d08ff"
Dec 03 09:10:44 crc kubenswrapper[4475]: I1203 09:10:44.871764 4475 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8da2501b0900c9b2f84a821cf9b9babd9d148fbb780d914875363401b76d08ff"} err="failed to get container status \"8da2501b0900c9b2f84a821cf9b9babd9d148fbb780d914875363401b76d08ff\": rpc error: code = NotFound desc = could not find container \"8da2501b0900c9b2f84a821cf9b9babd9d148fbb780d914875363401b76d08ff\": container with ID starting with 8da2501b0900c9b2f84a821cf9b9babd9d148fbb780d914875363401b76d08ff not found: ID does not exist"
Dec 03 09:10:44 crc kubenswrapper[4475]: I1203 09:10:44.871797 4475 scope.go:117] "RemoveContainer" containerID="0b9609d17cdc3e54b304d45fa037eeffbd07acfc5c7e65e4b53dfdc4f2af0eae"
Dec 03 09:10:44 crc kubenswrapper[4475]: E1203 09:10:44.872222 4475 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0b9609d17cdc3e54b304d45fa037eeffbd07acfc5c7e65e4b53dfdc4f2af0eae\": container with ID starting with 0b9609d17cdc3e54b304d45fa037eeffbd07acfc5c7e65e4b53dfdc4f2af0eae not found: ID does not exist" containerID="0b9609d17cdc3e54b304d45fa037eeffbd07acfc5c7e65e4b53dfdc4f2af0eae"
Dec 03 09:10:44 crc kubenswrapper[4475]: I1203 09:10:44.872257 4475 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0b9609d17cdc3e54b304d45fa037eeffbd07acfc5c7e65e4b53dfdc4f2af0eae"} err="failed to get container status \"0b9609d17cdc3e54b304d45fa037eeffbd07acfc5c7e65e4b53dfdc4f2af0eae\": rpc error: code = NotFound desc = could not find container \"0b9609d17cdc3e54b304d45fa037eeffbd07acfc5c7e65e4b53dfdc4f2af0eae\": container with ID starting with 0b9609d17cdc3e54b304d45fa037eeffbd07acfc5c7e65e4b53dfdc4f2af0eae not found: ID does not exist"
Dec 03 09:10:44 crc kubenswrapper[4475]: I1203 09:10:44.872280 4475 scope.go:117] "RemoveContainer" containerID="ed73e265482a1df0a0843f01b2864bdfaf6500b9ecd584a3ce0c8f60924ba5e9"
Dec 03 09:10:44 crc kubenswrapper[4475]: E1203 09:10:44.872526 4475 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ed73e265482a1df0a0843f01b2864bdfaf6500b9ecd584a3ce0c8f60924ba5e9\": container with ID starting with ed73e265482a1df0a0843f01b2864bdfaf6500b9ecd584a3ce0c8f60924ba5e9 not found: ID does not exist" containerID="ed73e265482a1df0a0843f01b2864bdfaf6500b9ecd584a3ce0c8f60924ba5e9"
Dec 03 09:10:44 crc kubenswrapper[4475]: I1203 09:10:44.872554 4475 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ed73e265482a1df0a0843f01b2864bdfaf6500b9ecd584a3ce0c8f60924ba5e9"} err="failed to get container status \"ed73e265482a1df0a0843f01b2864bdfaf6500b9ecd584a3ce0c8f60924ba5e9\": rpc error: code = NotFound desc = could not find container \"ed73e265482a1df0a0843f01b2864bdfaf6500b9ecd584a3ce0c8f60924ba5e9\": container with ID starting with ed73e265482a1df0a0843f01b2864bdfaf6500b9ecd584a3ce0c8f60924ba5e9 not found: ID does not exist"
Dec 03 09:10:45 crc kubenswrapper[4475]: I1203 09:10:45.503135 4475 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d1b69d25-8842-4587-9a13-6663bc9fa3a2" path="/var/lib/kubelet/pods/d1b69d25-8842-4587-9a13-6663bc9fa3a2/volumes"